In a press release issued earlier this month, Amazon announced its 'Public Data Sets on Amazon Web Services' initiative, providing a free home for potentially massive public data sets and free use of that data by developers hosting their applications in the company's data centres.
Larry Dignan covered the story for ZDNet at the time, but the Semantic Web angle arose in mailing-list discussion amongst members of the Linking Open Data community project, which is supported by the World Wide Web Consortium.
"Please see: http://aws.amazon.com/publicdatasets/ ; potentially the final destination of all published RDF archives from the [Linked Open Data] cloud.
Once the data sets are available from Amazon, database construction costs will be significantly alleviated.
We have DBpedia reconstruction down to 1.5 hrs (or less) based on Virtuoso's in-built integration with Amazon S3 for backup and restoration etc. We could get the reconstruction of the entire LOD cloud down to some interesting numbers once all the data is situated in an Amazon data center." (my links)
As I note in a blog post here, data bundled up inside the 'Elastic Block Stores' that Amazon offers isn't a fully fledged participant in the open data web, but developers already comfortable with Amazon's Web Services certainly do gain free and easy access to incredibly large bodies of data.
Making the data sets already collected by the Linked Open Data community easily available to the (different) community of Amazon Web Services developers would go a long way toward educating them about Linked Data and its potential... even if the resulting applications aren't necessarily reliant upon Amazon infrastructure.