What’s Hot, What’s Not: How Data Temperature Helps Manage Very Big Data

I recently had the opportunity to speak at Stanford’s XLDB 2015, a conference for people interested in the problems surrounding extremely large databases (XLDB). In my talk, I gave a reality check about putting all data in memory. It’s nice to imagine a world with infinite budgets where we can keep all our data in memory, but it is not economically rational to do so – especially for “big” data. To exploit the tsunami of data that’s coming, we need multi-temperature data management.

I’ll explain what that means, but let me also point out why you should care, even if you don’t go to geek conferences at Stanford in your spare time. People will try to sell you on putting all your data in memory, and it’s simply not reasonable from a cost perspective given the amount of big data most organizations will want to get value from – and this becomes even more obvious when you examine the access patterns against very large data sets: a small slice of the data is touched constantly, while the vast majority is rarely read at all.
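To make the idea concrete, here is a minimal sketch of temperature-based tiering: items are classified as hot, warm, or cold by how often they are accessed, and placed on correspondingly priced storage. The function name, thresholds, and tier labels are invented for illustration – real systems track temperature continuously and migrate data automatically.

```python
from collections import Counter

def assign_tiers(access_counts, hot_threshold=100, warm_threshold=10):
    """Map each item to a storage tier based on its access frequency.

    Hypothetical thresholds: items accessed >= hot_threshold times go
    to memory, >= warm_threshold to SSD, everything else to disk.
    """
    tiers = {}
    for item, count in access_counts.items():
        if count >= hot_threshold:
            tiers[item] = "memory"   # hot: keep in RAM for fast access
        elif count >= warm_threshold:
            tiers[item] = "ssd"      # warm: flash, cheaper than RAM
        else:
            tiers[item] = "disk"     # cold: cheapest per terabyte
    return tiers

# Simulated access log: recent data dominates, the long tail is cold.
accesses = Counter({"orders_2015": 500, "orders_2014": 40, "orders_2009": 2})
print(assign_tiers(accesses))
```

The economics follow directly: only the small hot slice pays the RAM premium, while the cold bulk sits on storage that costs orders of magnitude less per terabyte.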

from Forbes – Tech http://ift.tt/1frPa1n
via IFTTT