Amazon recently raised the S3 object size limit from 5 GB to 5 TB, a significant increase. Following this, GigaOM speculated in a blog post that the move may be aimed at a company like Netflix, which may need much larger cloud storage for its growing online rich-media business. I believe this raising of the storage limit may also be correlated with the recent addition of Amazon Cluster Compute instances, which are meant for processing huge data sets in scientific simulation scenarios. For example, it is a perfect fit if we consider Amazon Cluster GPU instances being used for heavy HD video transcoding or bioinformatics jobs, which may require feeding in hundreds of GB of raw input data and then moving the finished output to durable storage with enough capacity to hold such rendered files.
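Worth noting: objects this large cannot be uploaded in a single PUT (which S3 caps at 5 GB); they go through S3's multipart upload API, which at the time allowed up to 10,000 parts of 5 MiB to 5 GiB each. As a rough sketch of the arithmetic involved, here is a hypothetical helper (`choose_part_size` is my own illustration, not part of any AWS SDK) that picks a part size keeping an upload within those limits:

```python
# Sketch of part-size selection for an S3 multipart upload.
# Limits per the S3 documentation of the time: parts of 5 MiB - 5 GiB,
# at most 10,000 parts per upload, objects up to 5 TiB.
# `choose_part_size` is a hypothetical helper for illustration only.
import math

MIN_PART = 5 * 1024**2    # 5 MiB minimum part size (except the last part)
MAX_PART = 5 * 1024**3    # 5 GiB maximum part size
MAX_PARTS = 10_000        # at most 10,000 parts per upload
MAX_OBJECT = 5 * 1024**4  # 5 TiB object size cap

def choose_part_size(object_size: int) -> int:
    """Return the smallest part size that fits the object in <= 10,000 parts."""
    if object_size > MAX_OBJECT:
        raise ValueError("object exceeds the 5 TiB S3 limit")
    part = max(MIN_PART, math.ceil(object_size / MAX_PARTS))
    return min(part, MAX_PART)

five_tib = 5 * 1024**4
part = choose_part_size(five_tib)
print(part, math.ceil(five_tib / part))  # ~524 MiB parts, 10,000 of them
```

A full 5 TiB object thus needs parts of roughly half a gigabyte each, which also hints at the kind of parallel upload bandwidth these new workloads would consume.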
There are also other scenarios, such as managed data backup ISVs like Nasuni, which use Amazon S3 as the cloud storage layer in their products and might need a lot more capacity going forward.
Also, if you look at the way the cloud computing landscape is evolving, particularly at Amazon and specifically at services like CloudFront, I personally think it may not be long before Amazon becomes a formidable CDN, similar to EdgeCast, Limelight, or Akamai. At that juncture, I foresee Amazon perhaps offering a 10 or 25 TB S3 object limit, from which HD videos could be seamlessly delivered over CloudFront.
Essentially, what we are seeing today as 5 TB for S3 may be only the tip of the iceberg.