Amazon improves performance and cuts cost of data warehouse service
Amazon Web Services has introduced a new node type with SSD storage and faster processors
IDG News Service - Amazon Web Services has improved the performance of its Redshift data warehouse with new SSD-based nodes, which can also lower the cost of the service for users whose storage needs are modest.
As with many of its other hosted services, Amazon contends that Redshift lowers the bar for implementing and managing, in this case, a data warehouse. AWS takes care of the work needed to set up and operate a data warehouse, including provisioning the infrastructure and automating tasks such as backups and patching.
Redshift data warehouses are made up of clusters of Dense Storage nodes or the new SSD-based Dense Compute nodes. The storage nodes allow enterprises to create very large data warehouses using hard disk drives for a low price per gigabyte, while the compute nodes allow enterprises to build high-performance data warehouses using faster CPUs, large amounts of RAM and SSD storage.
The compute nodes are ideal for enterprises that have less than 500GB of data in their warehouse or whose primary focus is performance. The storage nodes are a better fit when performance isn't as critical and storage demands are high but the budget isn't.
On-demand prices for a single Large Dense Compute node start at $0.25 per hour. For that price, users get 160GB of SSD storage, two Intel Xeon E5-2670v2 virtual cores (based on Ivy Bridge) and 15GB of RAM. A single Extra Large Dense Storage node starts at $0.85 per hour but comes with 2TB of storage, along with two Intel Xeon E5-2650 virtual cores (based on Sandy Bridge) and 15GB of RAM.
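The quoted prices illustrate the trade-off between the two node families. A rough back-of-the-envelope comparison of cost per gigabyte of node-local storage, based only on the on-demand prices above (treating 2TB as 2,048GB; actual AWS pricing varies by region and over time), might look like this:

```python
def price_per_gb_hour(hourly_price, storage_gb):
    """Hourly on-demand cost per GB of node-local storage."""
    return hourly_price / storage_gb

# SSD-backed Large Dense Compute node: $0.25/hour, 160GB
dense_compute = price_per_gb_hour(0.25, 160)

# HDD-backed Extra Large Dense Storage node: $0.85/hour, 2TB
dense_storage = price_per_gb_hour(0.85, 2 * 1024)

print(f"Dense Compute: ${dense_compute:.5f} per GB-hour")
print(f"Dense Storage: ${dense_storage:.5f} per GB-hour")
```

On these figures the storage node works out to roughly a quarter of the per-gigabyte cost of the compute node, which is why it suits large, less performance-sensitive warehouses.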
For users who want better performance and more storage per node, Amazon also offers Eight Extra Large nodes in both Compute and Storage variants. Warehouses built from those larger nodes can contain up to 100 nodes, while ones based on the smaller nodes can contain up to 32 nodes.
Scaling a cluster up and down or switching between node types is done using API calls or the AWS Management Console.
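As a hypothetical sketch of what such an API call might look like, the snippet below builds the parameters for a cluster resize. The boto3 library, the cluster identifier and the node type string are illustrative assumptions, not details from the article:

```python
def resize_request(cluster_id, node_type, node_count):
    """Build keyword arguments for a Redshift cluster-resize API call."""
    return {
        "ClusterIdentifier": cluster_id,
        "NodeType": node_type,       # e.g. a Dense Compute node type
        "NumberOfNodes": node_count,
    }

# Hypothetical example: grow a cluster to four Dense Compute nodes.
params = resize_request("my-warehouse", "dw2.large", 4)
print(params)

# With valid AWS credentials, the request could then be submitted via
# boto3, e.g.:
#   import boto3
#   boto3.client("redshift").modify_cluster(**params)
```

The same change could be made interactively through the AWS Management Console, as the article notes.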
Since its launch in February last year, Redshift -- which is still in beta -- has been adopted by companies such as Fender, Financial Times, Nasdaq OMX, Nokia, and Pinterest, Amazon said.