One thing that struck me when reading up on Cassandra is the community's very strong focus on linear scalability, and therefore on primary-key-based data models. De-normalizing your data, for example by using materialized views, is considered a best practice.
However, de-normalization has some challenges of its own. Both Cassandra-managed materialized views and any application-side managed denormalization run the risk of becoming inconsistent. And of course it means you're multiplying your database size.
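As a sketch of what such denormalization looks like in practice, here is a hypothetical CQL example (the table and column names are my own illustration, not from the post) where a materialized view re-partitions a users table by email so it can be queried by a different key:

```sql
-- Base table, partitioned by user_id
CREATE TABLE users (
    user_id uuid PRIMARY KEY,
    email   text,
    name    text
);

-- Cassandra-managed denormalized copy, partitioned by email.
-- Cassandra maintains this copy automatically on writes to the base table,
-- but the extra copy still costs storage and write amplification.
CREATE MATERIALIZED VIEW users_by_email AS
    SELECT user_id, email, name
    FROM users
    WHERE email IS NOT NULL AND user_id IS NOT NULL
    PRIMARY KEY (email, user_id);
```

The application-side alternative would be to maintain a second regular table, `users_by_email`, and write to both tables on every update, which is exactly where inconsistency can creep in.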
The Open Source Initiative recently organized its first conference: State of the Source 2020. I presented a talk, "In Defense of Extreme Copyleft", where I explored the boundaries of current network copyleft licenses and the potential need for further, carefully deliberated, expansion of copyleft licensing.
The paper stating the RUM conjecture was published by a group of Harvard DASLab researchers in 2016. They have also created a more easily digestible RUM conjecture home page with graphics. Yet, in this blog post I try to describe the idea in even simpler terms than that page.
An engineer I work with asked me for tips on what to read about database benchmarking. I told him I've learned a lot from reading Mark Callaghan's blog. Now that I think about it, articles and conference talks from Baron Schwartz were equally, or even more, fundamental early on when I was getting started.
When I choose technologies to use, or employers to work for, my system is based on sticking with a few things I believe in. DataStax happens to tick quite a few of those boxes:
© 2006-2021 Henrik Ingo.
The content on this site is published under the Creative Commons Attribution License.
That means you are free to copy, reuse, and redistribute the book, blog posts, and other original content you find on this site.
Non-original content will be clearly attributed with its respective copyright terms.