Google has updated its big data analytics offering BigQuery, and said it will cut the cost of storing data in the service next month.
BigQuery is Google's cloud-based big data analysis service, which allows businesses to rapidly analyse terabytes of their own data using SQL, without having to invest in their own infrastructure.
Google said BigQuery will now support larger results, so users can run queries that return "arbitrarily large numbers of rows" and save them as a new table for follow-up analysis. Previously, the maximum size for a query result was 128MB.
It also said window functions now allow users to take advantage of built-in functions such as 'Rank' and 'Partition' to create sophisticated statistical analyses with far simpler SQL than before. Query caching, meanwhile, means that recently run queries return a cached result when the underlying table is unchanged, providing more cost-effective analysis.
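To illustrate the kind of query this enables, the sketch below uses Python's built-in sqlite3 module, since BigQuery itself is a hosted service. The table and data are hypothetical, but the `RANK() OVER (PARTITION BY ...)` pattern is the same style of window-function SQL the announcement describes: ranking rows within groups in one declarative query, with no self-joins or application-side sorting.

```python
import sqlite3

# Hypothetical sales table, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, rep TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("north", "alice", 500), ("north", "bob", 300),
     ("south", "carol", 700), ("south", "dave", 900)],
)

# Rank each sales rep within their own region by amount.
rows = conn.execute("""
    SELECT region, rep, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()

for region, rep, amount, rnk in rows:
    print(region, rep, amount, rnk)
```

Without window functions, the same per-group ranking would typically require a correlated subquery or a join of the table against itself, which is exactly the "far simpler SQL" saving being claimed.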
Google said BigQuery's user interface has also had a refresh, including the ability to validate a query and estimate costs prior to running it, and to save frequently used queries.
In a posting on the Google Enterprise blog, BigQuery product manager Ju-kay Kwek said: "We know that today, more than ever, businesses need ways to store and rapidly analyze vast amounts of data and are looking for ways to accomplish this without huge infrastructure investments."
Like other cloud providers, Google is also driving down the cost of its services. From 1 July, Google is cutting the cost of BigQuery data storage from $0.12 per gigabyte per month to $0.08 per gigabyte per month.
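The per-gigabyte rates above come from the announcement; the 1,000GB dataset below is a hypothetical workload used only to put the cut in concrete terms:

```python
# Storage rates from the announcement, in dollars per gigabyte per month.
OLD_RATE = 0.12
NEW_RATE = 0.08

def monthly_storage_cost(gigabytes: float, rate: float) -> float:
    """Flat per-gigabyte monthly storage cost."""
    return gigabytes * rate

# Hypothetical 1,000GB (roughly 1TB) dataset.
old_cost = monthly_storage_cost(1000, OLD_RATE)
new_cost = monthly_storage_cost(1000, NEW_RATE)
print(f"saving: ${old_cost - new_cost:.2f} per month")  # prints "saving: $40.00 per month"
```

At these rates, a customer storing 1,000GB would see their monthly storage bill fall by a third, from around $120 to around $80.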
Customers can opt for Google's standard rate for interactive queries, or those with high volumes of queries can choose tiered query pricing, which the search giant said will provide a "more economical and predictable cost" for interactive queries.
Earlier this week, HP launched its own big data platform, dubbed HAVEn, in a bid to bundle its analytics software, hardware and services into one easy-to-consume package.