Amazon improves performance and cuts cost of data warehouse service
Amazon Web Services has introduced a new node type with SSD storage and faster processors
IDG News Service - Amazon Web Services has improved the performance of its Redshift data warehouse with new SSD-based nodes, which can also lower the cost of the service for customers whose storage needs are modest.
Just like with many of its other hosted services, Amazon contends that Redshift lowers the bar for implementing and managing, in this case, a data warehouse. The service takes care of the work needed to set up and operate a data warehouse, including provisioning the infrastructure and automating tasks such as backups and patching.
Redshift data warehouses are made up of clusters of Dense Storage nodes or the new SSD-based Dense Compute nodes. The storage nodes allow enterprises to create very large data warehouses using hard disk drives at a low price per gigabyte, while the compute nodes allow enterprises to build high-performance data warehouses using faster CPUs, large amounts of RAM and SSD storage.
The compute nodes are ideal for enterprises that have less than 500GB of data in their warehouse or whose primary focus is performance. The storage nodes are a better fit when performance isn't as critical and storage demands are high but the budget is limited.
On-demand prices for a single Large Dense Compute node start at $0.25 per hour. For that, users get 160GB of SSD storage, two Intel Xeon E5-2670v2 virtual cores (based on Ivy Bridge) and 15GB of RAM. A single Extra Large Dense Storage node starts at $0.85 per hour, but it comes with 2TB of storage, along with two Intel Xeon E5-2650 virtual cores (based on Sandy Bridge) and 15GB of RAM.
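A back-of-the-envelope comparison using the figures above shows the tradeoff: the Dense Compute node is the cheaper node-hour, while the Dense Storage node is several times cheaper per gigabyte stored. A minimal sketch in Python, using only the prices and capacities quoted in this article:

    # Cost comparison built from the on-demand prices and capacities above.
    nodes = {
        "Dense Compute (Large, SSD)": {"usd_per_hour": 0.25, "storage_gb": 160},
        "Dense Storage (Extra Large, HDD)": {"usd_per_hour": 0.85, "storage_gb": 2000},
    }

    for name, spec in nodes.items():
        per_gb_hour = spec["usd_per_hour"] / spec["storage_gb"]
        print(f"{name}: ${spec['usd_per_hour']:.2f}/hour, "
              f"~${per_gb_hour:.5f} per GB-hour of storage")

    # Output:
    # Dense Compute (Large, SSD): $0.25/hour, ~$0.00156 per GB-hour of storage
    # Dense Storage (Extra Large, HDD): $0.85/hour, ~$0.00043 per GB-hour of storage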
For users who want better performance and more storage per node, Amazon also offers Eight Extra Large nodes in both Dense Compute and Dense Storage variants. Warehouses based on those larger nodes can contain up to 100 nodes, while ones based on the smaller nodes can contain up to 32.
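For readers who want to see what provisioning looks like in practice, here is a minimal sketch using the boto3 AWS SDK for Python. The cluster identifier, region, database name, node count and credentials are placeholder values, and "dw2.large" was the API name Amazon assigned to the Large Dense Compute node type at launch:

    import boto3  # AWS SDK for Python; the AWS Management Console works too

    redshift = boto3.client("redshift", region_name="us-east-1")

    # Provision a 4-node Dense Compute cluster. All identifiers and
    # credentials below are placeholders, not values from the article.
    redshift.create_cluster(
        ClusterIdentifier="example-warehouse",
        NodeType="dw2.large",          # Large Dense Compute (SSD) node
        ClusterType="multi-node",
        NumberOfNodes=4,
        DBName="analytics",
        MasterUsername="admin",
        MasterUserPassword="ChangeMe123!",
    )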
Scaling a cluster up and down or switching between node types is done using API calls or the AWS Management Console.
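For example, resizing the hypothetical cluster created above, or moving it between node types, could be done with a single ModifyCluster API call; sketched again with boto3:

    import boto3

    redshift = boto3.client("redshift", region_name="us-east-1")

    # Grow the placeholder cluster to 8 nodes; the same call can change
    # NodeType to switch between Dense Compute and Dense Storage nodes.
    redshift.modify_cluster(
        ClusterIdentifier="example-warehouse",
        NodeType="dw2.large",
        NumberOfNodes=8,
    )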
Since its launch in February last year, Redshift has been adopted by companies such as Fender, Financial Times, Nasdaq OMX, Nokia, and Pinterest, Amazon said.