Intro
My last entry about big data got me thinking... How much information do we generate per day, per month, per year? Could all that information saturate the servers? Are we doomed? Where will the lolcats live? And so on.
Data
Researching, I found that in 2003 humanity generated 5 exabytes of data; by 2011 we were generating that same amount every 2 days, and by 2013 the same amount would be produced every 10 minutes... 10 MINUTES..!!!!
Imagine that in 2013, with a world population of 7,000,000,000, each person would generate about 102.86 GB of information per day, or roughly 37,540 GB per year. WTF?! The servers will break, or better yet, commit suicide.
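Those per-person figures can be checked with a quick back-of-the-envelope calculation. This is just a sketch using the post's own assumptions (5 exabytes every 10 minutes, 7 billion people, decimal units where 1 EB = 10^9 GB):

```python
# Back-of-the-envelope check of the figures above.
# All inputs are the post's assumptions, not measured data.
EXABYTE_IN_GB = 1_000_000_000        # 1 EB = 10^9 GB (decimal units)

data_per_10_min_gb = 5 * EXABYTE_IN_GB
population = 7_000_000_000

slots_per_day = 24 * 60 // 10        # 144 ten-minute slots in a day
per_day_gb = data_per_10_min_gb * slots_per_day
per_person_day_gb = per_day_gb / population
per_person_year_gb = per_person_day_gb * 365

print(f"{per_person_day_gb:.2f} GB per person per day")    # ~102.86
print(f"{per_person_year_gb:.2f} GB per person per year")  # ~37542.86
```

So the numbers above hold up: 5 EB every 10 minutes really does come out to about 103 GB per person, per day.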
Don't worry
But most engineers trust in Moore's Law, which says that the transistor density of computers doubles every 18 months, making machines and storage ever smaller.
Solutions:
Currently, some companies are proposing different ideas to prevent this apocalypse: for example, limiting how much information users can save, charging for the services, making information temporary, and others.
Bibliography:
http://www.google.com.mx/publicdata/explore?ds=d5bncppjof8f9_&met_y=sp_pop_totl&tdim=true&dl=es&hl=es&q=poblacion+mundial
http://www.businessweek.com/articles/2012-05-02/the-case-against-digital-sprawl
http://www.cloudconnectevent.com/santaclara/cloud-computing-conference/big-data-processing-and-parallel-computing.php