Amazon switches on DynamoDB cloud database service

The scalable managed NoSQL database service moves Amazon Web Services further into the platform-as-a-service arena, where it will compete with Salesforce, Oracle and others to attract developers of large web applications
Written by Jack Clark, Contributor

Amazon Web Services has launched a hosted database service called DynamoDB, stepping into competition with platform-as-a-service providers such as Salesforce.com, Oracle and others.

Amazon Web Services chief technology officer Werner Vogels has introduced the company's hosted database service DynamoDB. Image credit: Dan Farber/CNET News

The service builds on Dynamo, the NoSQL database technology Amazon has used in-house, turning it into a cloud-based service for external customers. A beta programme for DynamoDB began in the US on Wednesday and is likely to expand to other locales in the future.

"Amazon DynamoDB is the result of everything we've learned for building large non-relational databases for Amazon.com, and from building scalable, high-reliability cloud services for Amazon Web Services," the company's chief technology officer Werner Vogels said in a webcast announcing the technology.

In the past, AWS has focused on providing infrastructure-as-a-service (IaaS), but the addition of DynamoDB sees it move further into platform-as-a-service (PaaS), as it is now offering to manage the back-end technology as well as rent it out. This puts it into competition with database-management specialists such as Oracle, and with Salesforce's Database.com, Google App Engine and Microsoft Windows Azure.

"This is Amazon taking the fight to others," Redmonk chief analyst James Governor told ZDNet UK. "It is right in the heart of platform-as-a-service."

DynamoDB features

DynamoDB is tailored to handling large web applications that need to process vast amounts of data, then deliver the results in a predictable timeframe, according to AWS. To do this, it has been designed to cope with unpredictable spikes in demand while guaranteeing reliable input/output rates.

In addition, AWS will automatically replicate data from DynamoDB across multiple IT stacks, or 'availability zones', within each region, in order to guard against hardware hiccups.

"DynamoDB is not database software, it is a database service," Vogels said. "[AWS] handle all the muck that's required behind the scenes to make sure our customers' databases are consistently fast and secure."

Customers' data will be stored on solid-state drives (SSDs) to allow for fast data throughput rates, AWS said. Typical requests to the database will take milliseconds to complete, Vogels wrote in a blog post.

"With Amazon DynamoDB, developers scaling cloud-based applications can start small with just the capacity they need, and then increase the request capacity of a given table as their app grows in popularity," he said. "Behind the scenes, Amazon DynamoDB automatically spreads the data and traffic for a table over a sufficient number of servers to meet the request capacity specified by the customer."

The service can integrate with Amazon's Elastic MapReduce data analytics engine — a hosted version of Hadoop — so customers can link multiple DynamoDB tables together and run analytics on top of them. It can also hook into Amazon's cloud-based Simple Storage Service (S3). This lets people put frequently accessed data on the SSDs that power DynamoDB, while leaving less frequently accessed data on the slower — but cheaper — S3 storage.

Technically complex

DynamoDB can be seen as a more expansive, technically complex version of AWS's existing non-relational database service, SimpleDB, according to Vogels. A key difference is that the new product has no size limit on its tables, compared with SimpleDB's 10GB limit.

The new database service has a feature called Provisioned Throughput, which lets developers specify the throughput capacity they require for specific tables within their DynamoDB database. In consequence, it can deliver predictable performance at any scale — a crucial feature for online gaming, advertising and real-time analytics applications, Vogels said.
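
To illustrate how Provisioned Throughput is meant to be used, below is a minimal sketch written against the modern boto3 Python SDK, which postdates this article; the table name, key and capacity figures are illustrative assumptions rather than anything AWS has published.

    # A sketch only: the boto3 SDK postdates this article, and the table name,
    # key and capacity figures are hypothetical.
    import boto3

    dynamodb = boto3.client("dynamodb", region_name="us-east-1")

    # Start small: create a table with a modest provisioned read/write capacity.
    dynamodb.create_table(
        TableName="GameScores",
        AttributeDefinitions=[{"AttributeName": "PlayerId", "AttributeType": "S"}],
        KeySchema=[{"AttributeName": "PlayerId", "KeyType": "HASH"}],
        ProvisionedThroughput={"ReadCapacityUnits": 10, "WriteCapacityUnits": 5},
    )

    # As the app grows in popularity, raise the table's request capacity in
    # place; behind the scenes AWS spreads the data and traffic over enough
    # servers to meet the new figures.
    dynamodb.update_table(
        TableName="GameScores",
        ProvisionedThroughput={"ReadCapacityUnits": 500, "WriteCapacityUnits": 250},
    )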

In addition, the database service can scale up to 10,000 reads and 10,000 writes per second across 256 tables, though AWS is "ready, willing and able to increase any of these values" for an undisclosed price on a developer-by-developer basis, cloud-computing senior manager Jeff Barr wrote in a blog post.

DynamoDB came into being in much the same way as Hadoop, the data analytics platform developed by Yahoo and the open-source community after Google outlined the logical structure of the technology in white papers. Amazon built the original Dynamo technology internally and used it to power its S3 storage cloud, among other things. In 2007, it outlined the software in a paper that prompted the open-source community to develop variants such as Dynomite.

The company asked AWS developers to test Dynamo, but they found it difficult to use and preferred SimpleDB, Vogels said. After that, Amazon made the major decision to develop Dynamo into something it could provision 'as a service'. DynamoDB is already being used by websites such as IMDb, Formspring and Tapjoy, according to Vogels.

Potential concern

One potential concern could be the reliability of the company's delivery infrastructure, which suffered a number of hiccups in 2011. One glitch saw AWS's cloud delete customer data held in its Elastic Block Store (EBS) service.

Initially, DynamoDB is available via the US East (us-east-1) region, one of AWS's seven global infrastructure hubs. In April 2011, connectivity errors led to slower responses on EBS volumes in this region, even though AWS insists its use of availability zones prevents cascading failures.

People can try the service for free using Amazon Web Services' Free Usage Tier. However, this only allows 100MB of storage, with five writes per second and 10 reads per second. The paid-for service lets developers reserve the capacity they need and pay $1 (£0.64) per GB per month, along with $0.01 per hour for every 10 units of write capacity and $0.01 per hour for every 50 units of read capacity.
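
As a rough back-of-the-envelope illustration of those prices, the short Python sketch below estimates a monthly bill from the figures quoted above; the workload numbers and the 720-hour month are assumptions made for the example, not AWS's billing formula.

    # Illustrative only: prices are those quoted in this article; the workload
    # figures and 720-hour month are assumptions.
    STORAGE_PER_GB_MONTH = 1.00   # $1 per GB stored per month
    WRITE_BLOCK_PER_HOUR = 0.01   # $0.01/hour for every 10 units of write capacity
    READ_BLOCK_PER_HOUR = 0.01    # $0.01/hour for every 50 units of read capacity
    HOURS_PER_MONTH = 720         # assumed 30-day month

    def monthly_cost(storage_gb, write_units, read_units):
        storage = storage_gb * STORAGE_PER_GB_MONTH
        writes = (write_units / 10) * WRITE_BLOCK_PER_HOUR * HOURS_PER_MONTH
        reads = (read_units / 50) * READ_BLOCK_PER_HOUR * HOURS_PER_MONTH
        return storage + writes + reads

    # Example: 20GB of data with 10 write units and 50 read units reserved works
    # out at roughly $20 + $7.20 + $7.20, or about $34.40 a month.
    print(monthly_cost(20, 10, 50))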

