Can't afford a Database Machine? Oracle pushes compression as less lavish scale-up method
At OpenWorld, Oracle puts spotlight on 11g database's Advanced Compression feature
Computerworld - Oracle Corp.'s powerful new HP Oracle Database Machine comes with 168TB of storage, a new method of retrieving data more quickly and intelligently, and — wait for it — a $2.33 million price tag.
It's the turbocharged option for the database administrator with money to burn and a need for speed.
But most administrators don't get to drive in the fast lane — especially not with IT budgets the way they are. So as a less lavish option for enterprise users, Oracle is touting another approach.
That one involves data compression, which has long been a popular way to save storage space and money. Traditionally, though, the trade-off has been steep: compressing data and writing it to disk typically demands large amounts of memory and processing power, and extracting the information later demands even more.
Now Oracle claims to have solved this thorny problem with a feature it first introduced in its Oracle 11g database, which was released last year.
By using the Advanced Compression option in 11g, Oracle says, administrators can shrink database sizes by as much as 75% and boost read/write speeds threefold to fourfold, whether they're running a data warehouse or a transaction-processing database, all while incurring only a small processor-utilization penalty.
Oracle claims the storage and speed gains are so dramatic that companies using Advanced Compression will no longer need to move old or seldom-used data to archives. Instead, they can keep it all in the same production database, even as the amount of data stored there grows into the hundreds of terabytes or even the petabyte range.
"This works completely transparently to your applications," Juan Loaiza, Oracle's senior vice president of systems technologies, said during a session at the company's OpenWorld conference in San Francisco last week. "It increases CPU usage by just 5%, while cutting your [database] table sizes by half."
Oracle says it's responding to the demands of enterprise customers with fast-growing databases. "The envelope is always being pushed," Loaiza said. "Unstructured data is growing very quickly. We expect someone to be running a 1 petabyte, 1,000-CPU-core database by 2010."
It's also responding to the fact that storage technology, one of the keys to database performance, has made little progress from a speed standpoint, according to Loaiza. "Disks are getting bigger, but they're not getting a whole lot faster," he said.
Taking data compression down to the block level
Oracle has offered simple index-level compression since the 8i version of its database was introduced in 1999. That improved several years later with the introduction of table-level compression in Oracle 9i Release 2, which helped data warehousing users compress data for faster bulk loads, according to Sushil Kumar, senior director of product management for database manageability, high availability and performance at Oracle.
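In practice, the block-level compression described above is enabled table by table. A minimal sketch of what that looks like in 11g Release 1, using a hypothetical table name (`orders` is not from the article, and Advanced Compression is a separately licensed option):

```sql
-- Hypothetical example table with OLTP-style compression enabled.
-- In 11g Release 1 the clause is COMPRESS FOR ALL OPERATIONS;
-- 11g Release 2 later renamed it COMPRESS FOR OLTP.
CREATE TABLE orders (
  order_id   NUMBER PRIMARY KEY,
  customer   VARCHAR2(100),
  order_date DATE
) COMPRESS FOR ALL OPERATIONS;

-- Existing tables can also be switched over; note that only blocks
-- written after the ALTER are compressed until the table is rebuilt
-- (e.g., with ALTER TABLE ... MOVE).
ALTER TABLE orders COMPRESS FOR ALL OPERATIONS;
```

Because the compression happens at the block level inside the database engine, applications issue the same SQL as before, which is what Loaiza means by the feature being transparent to applications.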