At its first user conference, AWS re:Invent, Amazon Web Services today launched its newest cloud-based service, called Redshift. Billed as a petabyte-scale data warehouse, Redshift, AWS officials say, lets businesses cut their data warehousing costs to roughly one-tenth of those of on-premises systems.
Redshift has launched in a limited beta and is expected to be fully rolled out next year.
Redshift is aimed directly at on-premises data warehousing systems, most notably from Oracle, Microsoft (SQL Server) and Greenplum. "All these guys are suddenly confronted with a new full-featured SQL interface completely in the cloud with dramatically lower pricing," says Merv Adrian, a Gartner analyst who tracks big data. "This is a potentially massive disruption for traditional data warehousing vendors, but it can't come as a surprise."
In addition to Redshift, AWS also announced another reduction of its prices, this time for its Simple Storage Service (S3) in what amounts to an average 25% price cut for most S3 services.
Redshift is meant to be a cloud-based system where users can set up, manage, monitor and back up large-scale data warehouses. It's optimized for columnar data storage and structured data and takes advantage of advanced compression techniques to leverage high input/output functionality. AWS also says a variety of partnering business analytics tools are integrated to work on top of Redshift, taking advantage of parallel queries across clusters of some of the cheapest virtual machine instances AWS offers. Initial analytics partners include Jaspersoft and MicroStrategy.
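Columnar storage compresses well because values within a single column tend to be similar, so simple encodings collapse long runs of repeats. The sketch below is purely illustrative (the column names and run-length encoding are hypothetical examples, not Redshift's actual storage format or compression encodings):

```python
# Illustrative sketch: why column-oriented layouts compress well.
# Run-length encoding is one simple scheme; Redshift's real encodings differ.

def run_length_encode(values):
    """Collapse consecutive repeated values into [value, count] pairs."""
    encoded = []
    for v in values:
        if encoded and encoded[-1][0] == v:
            encoded[-1][1] += 1      # extend the current run
        else:
            encoded.append([v, 1])   # start a new run
    return encoded

# Row-oriented storage interleaves heterogeneous fields per record...
rows = [("2012-11-28", "us-east-1", 1),
        ("2012-11-28", "us-east-1", 1),
        ("2012-11-28", "us-west-2", 1)]

# ...whereas a columnar layout groups each field together, exposing runs:
region_column = [r[1] for r in rows]
print(run_length_encode(region_column))
# → [['us-east-1', 2], ['us-west-2', 1]]
```

Reading only the columns a query touches, and decompressing runs instead of raw values, is what drives the high input/output efficiency the article describes.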
Redshift is offered in two node types, 2TB and 16TB, with on-demand pricing starting at $0.85 per hour and lower rates for longer-term reserved-instance commitments. AWS says traditional on-premises data warehousing tools can run between $19,000 and $25,000 per TB per year, while Redshift can cost as little as $1,000 per TB per year.
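A back-of-the-envelope check shows how those figures relate. Assuming the $0.85 hourly rate applies to the smallest (2TB) node running around the clock (reserved-instance discounts, not quoted here, are what bring the cost toward the $1,000 figure):

```python
# Back-of-the-envelope Redshift on-demand cost, per the quoted pricing.
# Assumption: the $0.85/hour rate applies to the 2TB node, running 24x7.

HOURS_PER_YEAR = 24 * 365      # 8,760 hours

on_demand_rate = 0.85          # dollars per hour
node_capacity_tb = 2           # smallest node type

annual_cost = on_demand_rate * HOURS_PER_YEAR
per_tb_year = annual_cost / node_capacity_tb

print(f"On-demand: ${annual_cost:,.0f}/year, ${per_tb_year:,.0f} per TB-year")
# → On-demand: $7,446/year, $3,723 per TB-year
```

Even at the full on-demand rate of roughly $3,700 per TB per year, Redshift sits well under the $19,000 to $25,000 per TB per year cited for on-premises tools; reserved commitments would close the remaining gap to the $1,000 figure.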
Redshift will fit in with AWS's other data storage products, the most popular of which is its Simple Storage Service (S3). AWS launched Glacier earlier this year as a long-term, inexpensive storage option. AWS also already has DynamoDB, an SSD-backed managed NoSQL database that is highly scalable and fault tolerant. It also has Elastic MapReduce, a Hadoop-based analytics platform that lets users spin up scalable Hadoop clusters on Amazon's Elastic Compute Cloud (EC2) and S3. It's used for web indexing, data warehousing, machine learning, data mining and log file analysis, among other tasks, Amazon says.
There are a variety of third-party tools already on the market that provide big data analytics. AWS recently launched a "Big Data" section on its Marketplace, a catalog of applications designed to run on Amazon's cloud. SAP's HANA One, Sumo Logic, Metamarkets and Splunk Storm are among the big data analysis apps already on the Marketplace. AWS also partners with companies like Think Big Analytics, MarketShare and MapR for its Elastic MapReduce service.