Big Data Is A Matter Of Speed, Not Size
Finally, the market is getting over its initial Big Data fixation. Unfortunately, in the process we may be inclined to throw away the Big Data signal in an attempt to rid ourselves of all the noise.
The Guardian's John Burn-Murdoch highlights this today, asserting that "'small data' - or data of the volumes most regular analysts, researchers and statisticians are used to dealing with - is actually both more relevant and more useful to the vast majority of organisations than its big cousin." He concludes, "[I]t is speed, not size that is increasingly driving desire for software and hardware improvements at data-processing organisations."
While we talk about Big Data, the reality is that there is a much more important trend going on in data, generally, as Rufus Pollock, Founder and Co-Director of the Open Knowledge Foundation, captures:
[W]e risk overlooking the much more important story here, the real revolution, which is the mass democratisation of the means of access, storage and processing of data. This story isn't about large organisations running parallel software on tens of thousands of servers, but about more people than ever being able to collaborate effectively around a distributed ecosystem of information, an ecosystem of small data.
Now if only we could get everyone else to recognize this essential truth. Then we could stop admiring how very big all our data is and instead focus on putting it to work while it is still timely enough to be useful.