What will be the future of Big Data? How will we store data in the future?

Big Data is largely the successor of traditional databases: a collection of tools that handle huge volumes of data. Most high-traffic sites already use it at their backend. I am wondering how we will store terabytes upon terabytes of data in the future. It is easy to believe that big applications dump terabytes of data every single day. Where will we store all this data if it keeps growing? A lot of money has to be invested in hardware, and proper space has to be allocated for all that hardware. I am thinking of some way to reduce this cost in the future.

How can we do this? You may already be thinking of the term compressing (I don't want to bold it, since that might tempt your eyes to skip straight to this line ;-) ) data to reduce its size. Yes, compression is already used today, but how do we compress a GB of data down to mega bytes, kilo bytes, or even bytes (I am crazy)? Is that possible? If we make it possible, here is one way it could work.

Current compression algorithms give you some flexibility to shave around 100MB, or maybe 200MB, off a GB of data, but we need to get that GB down to 100MB or even lower.
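Just to ground those numbers, here is a minimal sketch of how you could measure today's compression ratio yourself, using Python's standard zlib module; "party.mp4" is only a placeholder name, point it at any file you have.

```python
import zlib

def compression_ratio(path):
    """Compress a file's bytes with zlib and report original vs. compressed size."""
    with open(path, "rb") as f:
        original = f.read()
    compressed = zlib.compress(original, 9)  # 9 = strongest standard zlib setting
    return len(original), len(compressed)

# "party.mp4" is just a placeholder; any file works.
orig_size, comp_size = compression_ratio("party.mp4")
print(f"original:   {orig_size / 1e6:.1f} MB")
print(f"compressed: {comp_size / 1e6:.1f} MB")
print(f"saved:      {100 * (1 - comp_size / orig_size):.1f}%")
```

If you try this on a video, it will barely shrink at all, because video formats are already compressed, and that gap is exactly what this post is talking about.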

I am thinking of a way to achieve it.

Just an illustration:

Think of uploading and sharing your birthday party video (feel free to read "birthday party" as whatever you like to upload ;-) ).

Upload a file:
  1. Upload the 100MB video to an application.
  2. Compress it to, say, 10MB with the future algorithm.
  3. Save it to some storage.
Download a file: yes, it should be the reverse of the upload (a rough sketch of both flows follows after these steps).

  1. Decompress it back to the 100MB video.
  2. Download the 100MB file.
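Here is a minimal sketch of that server-side flow, assuming zlib stands in for the imagined future algorithm and a plain dictionary stands in for real storage:

```python
import zlib

STORAGE = {}  # stand-in for real storage (a disk, a database, S3, ...)

def handle_upload(name, data):
    """Server side: receive the full file, shrink it, keep only the small copy."""
    small = zlib.compress(data, 9)  # zlib stands in for the 'future algorithm'
    STORAGE[name] = small
    return len(small)

def handle_download(name):
    """Server side: load the small copy and expand it back before sending."""
    return zlib.decompress(STORAGE[name])

# Usage: store a pretend video and get the exact same bytes back.
video = b"frame-data " * 1_000_000            # fake "video" payload (~11MB)
handle_upload("party.mp4", video)
assert handle_download("party.mp4") == video  # the original is fully recovered
```

The fake repetitive payload here compresses extremely well; a real video would not, which is why a new kind of algorithm would be needed for this idea to pay off.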
Or even better:

How about giving clients (in client-server terminology) the additional work of decompressing? A sketch follows after these steps.
  1. Download the 10MB file. (Sounds awesome, doesn't it? You will save data for more movies and music :-))
  2. Decompress it into the 100MB file.
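A sketch of that client-side variant, again with zlib as the stand-in algorithm; fetch_compressed is a hypothetical callable representing whatever the app uses to pull the stored blob over the network (an HTTP GET, for example):

```python
import zlib

def client_download(fetch_compressed, name):
    """Client side: pull only the small blob, then rebuild the full file locally."""
    small_blob = fetch_compressed(name)       # only ~10MB crosses the network
    full_file = zlib.decompress(small_blob)   # the client's CPU does the expanding
    return full_file

# Quick demo with an in-memory "server" in place of a real network call:
fake_server = {"party.mp4": zlib.compress(b"frame-data " * 1_000_000, 9)}
restored = client_download(lambda name: fake_server[name], "party.mp4")
print(len(restored))  # full size again, without the server doing the work
```

The trade-off is clear: you save bandwidth and server CPU, but the client spends its own CPU (and battery) rebuilding the file.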


Yes, of course there will be security problems, and we should fix them.






