Cloud-based Amazon data warehouse now available
Redshift data warehouse service costs from 85 cents per hour on-demand
IDG News Service - Amazon Web Services has made its cloud-based data warehouse, Redshift, available to all users, pitching it as a lower-cost alternative to on-premises deployments.
Amazon Redshift was first announced as a limited preview at the AWS re:Invent conference, but has now been made generally available from Amazon's US East (Northern Virginia) data center and will be rolled out to other regions in the coming months.
As with its other cloud-based offerings, Amazon hopes to attract enterprises to Redshift with the speed and low cost of setting up a data warehouse in its cloud.
"Traditional data warehouse solutions are really expensive and complicated to manage," Amazon Web Services' Andy Jassy said when the product was launched. Redshift, by contrast, costs about a tenth as much and also automates deployment and administration, according to Jassy.
Using the AWS Management Console or the Amazon Redshift APIs, users can provision anything from a single 2TB data warehouse up to, by default, a cluster of 16 nodes with 2TB or 16TB of storage each.
The nodes are called High Storage Extra Large (XL) and High Storage Eight Extra Large (8XL). In addition to 2TB or 16TB of storage, they have 15GB or 120GB of RAM, respectively.
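For a sense of what API-based provisioning looks like, here is a minimal sketch using the boto3 Python SDK. The cluster identifier, credentials and node count are hypothetical placeholders, and "dw.hs1.xlarge" is assumed to be the launch-era API name for the XL node type.

```python
# Illustrative sketch: provisioning a Redshift cluster via the API.
# All identifiers and credentials below are hypothetical placeholders.
import boto3

redshift = boto3.client("redshift", region_name="us-east-1")

response = redshift.create_cluster(
    ClusterIdentifier="example-warehouse",  # hypothetical name
    NodeType="dw.hs1.xlarge",               # assumed API name for the 2TB XL node
    ClusterType="multi-node",
    NumberOfNodes=4,                        # anywhere up to 16 by default
    MasterUsername="admin",
    MasterUserPassword="ChangeMe123",       # placeholder
)
print(response["Cluster"]["ClusterStatus"])
```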
On-demand pricing starts at 85 cents per hour for an XL node and $6.80 per hour for an 8XL node. Reserved instance pricing lowers the effective price to $0.228 per hour, or under $1,000 per terabyte per year, according to Amazon.
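As a quick sanity check on that claim, a short calculation (assuming the effective reserved rate applies to a 2TB XL node running around the clock) reproduces Amazon's per-terabyte figure:

```python
# Back-of-the-envelope check of the reserved-instance figure:
# $0.228/hour on a 2TB XL node is just under $1,000 per TB per year.
hourly_rate = 0.228        # USD per hour, effective reserved price
node_storage_tb = 2        # XL node capacity
hours_per_year = 24 * 365

per_node_year = hourly_rate * hours_per_year    # ~$1,997 per node per year
per_tb_year = per_node_year / node_storage_tb   # ~$999 per TB per year
print(f"${per_node_year:,.0f}/node/year, ${per_tb_year:,.0f}/TB/year")
```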
Good security and reliability are key ingredients for any data warehouse. Data written to a node in a Redshift cluster is automatically replicated to other nodes within the cluster and all data is continuously backed up to Amazon's Simple Storage Service (S3), according to Amazon.
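On top of the continuous backups, point-in-time snapshots can also be triggered on demand through the API. A hedged sketch with boto3, using hypothetical identifiers:

```python
# Illustrative sketch: taking a manual snapshot of a running cluster.
# Both identifiers are hypothetical placeholders.
import boto3

redshift = boto3.client("redshift", region_name="us-east-1")

redshift.create_cluster_snapshot(
    SnapshotIdentifier="example-warehouse-snap-1",
    ClusterIdentifier="example-warehouse",
)
```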
To secure data, Redshift can use SSL for data in transit and hardware-accelerated AES-256 encryption for primary storage as well as backed-up data. Using Amazon Virtual Private Cloud, Redshift can be connected to an enterprise's existing data center over encrypted VPN tunnels.
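Because Redshift exposes a PostgreSQL-compatible interface, a standard driver such as psycopg2 can open an SSL-protected connection. A minimal sketch; the endpoint, database name and credentials are hypothetical placeholders:

```python
# Illustrative sketch: connecting to a Redshift cluster over SSL with
# a PostgreSQL driver. Endpoint and credentials are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="example-warehouse.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,               # Redshift's default port
    dbname="dev",
    user="admin",
    password="ChangeMe123",  # placeholder
    sslmode="require",       # force SSL for data in transit
)
cur = conn.cursor()
cur.execute("SELECT current_user")
print(cur.fetchone())
conn.close()
```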
A number of vendors have been testing their big data and analytics products with the SQL-based Redshift. Users can buy Jaspersoft's Reporting and Analytics for AWS on Amazon's Marketplace, for example. The service can also be integrated with business intelligence tools from the likes of Business Objects and Cognos, according to Amazon.
Data can be uploaded to Redshift in a number of ways. Companies with large volumes of data can use AWS Direct Connect to set up a private network connection at 1Gbps or 10Gbps between their data center and Amazon's cloud. They can also use AWS Import/Export to ship data on portable storage devices.
Redshift can also use AWS Data Pipeline to import data or load data directly from services such as S3 and DynamoDB.
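For loads from S3, the workhorse is Redshift's COPY command, which the cluster executes itself, pulling files from the bucket in parallel. A minimal sketch; the table, bucket and credential values are placeholders:

```python
# Illustrative sketch: bulk-loading data from S3 with Redshift's COPY
# command. Table, bucket and credential values are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="example-warehouse.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="dev", user="admin", password="ChangeMe123",
    sslmode="require",
)
cur = conn.cursor()
cur.execute("""
    COPY sales
    FROM 's3://example-bucket/sales/'
    CREDENTIALS 'aws_access_key_id=<key>;aws_secret_access_key=<secret>'
    DELIMITER '|' GZIP;
""")
conn.commit()
conn.close()
```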