Google will host large scientific datasets. That is, if you have a dataset that is requested constantly, you can now ‘open-source’ it and let Google take the server load. Wired has this covered.
For those who don’t see the point of having open, portable data, this presentation (Making Massive Datasets Universally Accessible and Useful) is a good explanation.
How do you ship a large dataset to Google? Well, they send you hard drives in a suitcase:
(Google people) are providing a 3TB drive array (Linux RAID5). The array comes in a “suitcase” and is shipped to anyone who wants to send their data to Google. Anyone interested gives Google the file tree, and they slurp the data off the drives. I believe they can extend this to a larger array (my memory says 20TB).
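Shipping drives sounds quaint, but a quick back-of-envelope calculation shows why it beats uploading. The link speed below is an illustrative assumption (the post doesn’t mention one), only the 3TB figure comes from the quote:

```python
# Sketch: how long would uploading the 3TB array take over a network link?
# The 10 Mbit/s uplink is an assumed, illustrative figure.

def transfer_days(size_tb, mbit_per_s):
    """Days needed to move size_tb terabytes over a link of mbit_per_s Mbit/s."""
    bits = size_tb * 1e12 * 8            # terabytes (decimal) -> bits
    seconds = bits / (mbit_per_s * 1e6)  # Mbit/s -> bit/s
    return seconds / 86400               # seconds -> days

# 3 TB over a 10 Mbit/s uplink:
print(f"{transfer_days(3, 10):.0f} days")  # -> 28 days
```

Roughly a month of saturating a 10 Mbit/s uplink, versus a couple of days for a courier. The old line about never underestimating the bandwidth of a station wagon full of tapes still holds.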