Microsoft adding in-memory technology to SQL Server
Code-named Hekaton, Microsoft's in-memory technology will allow an entire database to be run from memory
IDG News Service - In a move to speed online transaction processing (OLTP), Microsoft is adding in-memory capabilities to its SQL Server relational database management system.
The next version of SQL Server will feature the ability to host database tables, or even entire databases, within a server's working memory. "Any application that is throttled by the I/O of a hard disk would benefit by [having its data] moved into memory," said Doug Leland, a Microsoft general manager. Microsoft is currently testing the in-memory technology, code-named Hekaton, with a number of customers.
By holding a database table in memory, a server can execute transactions against that table much more quickly, because the database server doesn't have to immediately read data from, or write data to, a disk. Microsoft predicts that its in-memory technology can run transactions 50 times faster than a standard SQL Server setup.
The Hekaton technology is geared for OLTP workloads, such as online banking systems, ERP (enterprise resource planning) systems, and other heavily used disk-bound transactional systems, Leland said. Hekaton runs on a single server and imposes no hard limit on memory use, so it can scale up to however much RAM the machine can hold.
Hekaton maintains all the ACID (atomicity, consistency, isolation, durability) properties required of relational databases. It writes serialized transaction logs in-memory and then periodically writes these logs to disk, Leland said. The software uses a number of compression algorithms to fit more data in memory. For instance, the content is stored in a column store, so like data types are clustered together.
To help database administrators deploy the in-memory technology, the next version of SQL Server will include a tool to easily designate databases or individual database tables to be run in memory. No changes will be required of the applications that use these databases. Additionally, Hekaton will be able to compile stored procedures so that they run in memory as well. "You can compile your stored procedures and run them as native machine code," Leland said.
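To make the idea concrete, the designation Leland describes might look something like the following T-SQL. This is a hypothetical sketch only: Microsoft had not published Hekaton's syntax at the time of the announcement, so every table, option, and procedure name here is illustrative.

```sql
-- Hypothetical: mark a table as memory-resident. The option name and
-- table schema are illustrative, not announced syntax.
CREATE TABLE dbo.Orders (
    OrderId  INT    NOT NULL PRIMARY KEY NONCLUSTERED,
    Amount   MONEY  NOT NULL
) WITH (MEMORY_OPTIMIZED = ON);  -- table lives in RAM; logs still flushed to disk

-- Hypothetical: a stored procedure compiled to native machine code,
-- per Leland's "compile your stored procedures" remark.
CREATE PROCEDURE dbo.InsertOrder @id INT, @amt MONEY
WITH NATIVE_COMPILATION, SCHEMABINDING
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    INSERT INTO dbo.Orders (OrderId, Amount) VALUES (@id, @amt);
END;
```

The key point from the announcement is that the in-memory designation is declarative, applied per table or per database, so applications issuing ordinary queries against the table would not need to change.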
In-memory technologies have grown more popular among enterprises that want to process data more quickly. Oracle's Exadata and SAP's HANA both address this market as well. Grafting the in-memory technology into SQL Server itself will simplify the customer's IT architecture because it eliminates the need to procure and maintain separate stand-alone in-memory technology, Leland argued.
Leland pointed out that this is not Microsoft's first foray into in-memory technology. Both PowerPivot and Power View use in-memory techniques to allow users to quickly manipulate large swaths of data within Excel.
Microsoft announced the new technology at the Professional Association for SQL Server (PASS) Summit, being held this week in Seattle. Microsoft unveiled a few other new products at the conference as well.
The company also announced that it would soon release the next version of its data warehouse appliance, SQL Server 2012 Parallel Data Warehouse (PDW). Using a new data processing engine called PolyBase, this version of PDW will be able to run queries against both relational data and non-relational data managed by Apache Hadoop. Hadoop queries will be channeled through the Apache Hive data warehouse software.
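A hypothetical sketch of what such a cross-source query might look like: the announcement did not include PolyBase syntax, so the external-table definition and all names below are illustrative assumptions, not documented PDW features.

```sql
-- Hypothetical: expose a Hadoop-resident data set (queried via Hive)
-- to PDW as an external table. LOCATION and column names are invented.
CREATE EXTERNAL TABLE dbo.WebClicks (
    UserId  INT,
    Url     VARCHAR(400)
) WITH (LOCATION = '/logs/clicks/');

-- A single query could then join relational and non-relational data:
SELECT c.CustomerName, COUNT(*) AS Clicks
FROM   dbo.Customers AS c
JOIN   dbo.WebClicks AS w ON w.UserId = c.CustomerId
GROUP BY c.CustomerName;
```

The design point is that the Hadoop side stays where it is; PolyBase channels that portion of the query through Hive rather than requiring the data to be loaded into the warehouse first.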
The company has also released SQL Server 2012 SP1 (Service Pack 1), which, among other features, includes the ability for Excel 2013 users to work directly with SQL Server data.
Microsoft did not divulge any details about when the next version of SQL Server would be released.