Microsoft adding in-memory technology to SQL Server
Code-named Hekaton, Microsoft's in-memory technology will allow an entire database to be run from memory
IDG News Service - In a move to speed online transaction processing (OLTP), Microsoft is adding in-memory capabilities into its SQL Server relational database management system.
The next version of SQL Server will feature the ability to host database tables or even entire databases within a server's working memory. "Any application that is throttled by the I/O of a hard disk would benefit by [having its data] moved into memory," said Doug Leland, Microsoft general manager. Currently Microsoft is testing this in-memory technology, codenamed Hekaton, with a number of customers.
By holding a database table in memory, a server can execute transactions against that table much more quickly, because the database server doesn't have to immediately read data from, or write data to, disk. Microsoft predicts that its in-memory technology can run transactions 50 times faster than a standard SQL Server setup.
The Hekaton technology is geared for OLTP workloads, such as online banking systems, ERP (enterprise resource planning) systems, and other heavily used disk-bound transactional systems, Leland said. Hekaton runs only on a single server, but it imposes no hard memory limit of its own, so it can scale up to however much RAM that server can hold.
Hekaton maintains all the ACID (atomicity, consistency, isolation, durability) properties required of relational databases. It writes serialized transaction logs in memory and then periodically writes these logs to disk, Leland said. The software uses a number of compression algorithms to fit more data in memory. For instance, content is stored in a column store, so values of the same data type are clustered together, which aids compression.
To help database administrators deploy the in-memory technology, the next version of SQL Server will include a tool to easily designate databases or individual database tables that can be run in memory. No changes will be required of the applications that use these databases. Additionally, Hekaton will be able to compile stored procedures so that they run in-memory as well. "You can compile your stored procedures and run them as native machine code," Leland said.
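To make the idea concrete, the following sketch uses the in-memory OLTP syntax that eventually shipped in SQL Server 2014; the table and procedure names here are hypothetical, and the exact syntax was not public at the time of this announcement.

```sql
-- Illustrative sketch (SQL Server 2014-era syntax); names are hypothetical.
-- A memory-optimized table: rows live in RAM, with durability to disk.
CREATE TABLE dbo.Accounts (
    AccountId INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    Balance   MONEY NOT NULL
) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
GO

-- A natively compiled stored procedure: the T-SQL body is compiled
-- to machine code at creation time rather than interpreted per call.
CREATE PROCEDURE dbo.Deposit
    @AccountId INT, @Amount MONEY
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS BEGIN ATOMIC WITH
    (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    UPDATE dbo.Accounts
    SET Balance = Balance + @Amount
    WHERE AccountId = @AccountId;
END;
GO
```

The DURABILITY option reflects the log-to-disk behavior Leland describes: the table's data is kept in memory but transaction logs are persisted so ACID guarantees hold.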
In-memory technologies have grown more popular among enterprises that want to process data more quickly. Oracle's Exadata and SAP's HANA both address this market as well. Grafting the in-memory technology into SQL Server itself will simplify the customer's IT architecture because it eliminates the need to procure and maintain separate stand-alone in-memory technology, Leland argued.
Leland points out this is not Microsoft's first foray into in-memory technology. Both PowerPivot and Power View use in-memory technologies to allow users to quickly manipulate large swaths of data within Excel.
Microsoft announced the new technology at the Professional Association for SQL Server (PASS) Summit, being held this week in Seattle. Microsoft unveiled a few other new products at the conference as well.
The company also announced that it would soon release the next version of its data warehouse appliance, SQL Server 2012 Parallel Data Warehouse (PDW). Using a new data processing engine called PolyBase, this version of PDW will be able to run queries against both relational data and non-relational data managed by Apache Hadoop. Hadoop queries will be channeled through the Apache Hive data warehouse software.
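As a rough illustration of the kind of unified query PolyBase enables, the sketch below uses the external-table syntax that later appeared in SQL Server; the object names, file layout, and Hadoop address are all hypothetical.

```sql
-- Illustrative sketch of the PolyBase external-table pattern;
-- all names and the Hadoop location below are hypothetical.
CREATE EXTERNAL DATA SOURCE HadoopCluster WITH (
    TYPE = HADOOP,
    LOCATION = 'hdfs://namenode:8020'
);

CREATE EXTERNAL FILE FORMAT TextDelimited WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',')
);

-- Hadoop-resident data exposed as a relational table.
CREATE EXTERNAL TABLE dbo.WebClicks (
    UserId INT,
    Url    NVARCHAR(400)
) WITH (
    LOCATION    = '/logs/clicks/',
    DATA_SOURCE = HadoopCluster,
    FILE_FORMAT = TextDelimited
);

-- A single T-SQL query can then join relational and Hadoop data.
SELECT c.Url, u.Name
FROM dbo.WebClicks AS c
JOIN dbo.Users     AS u ON u.UserId = c.UserId;
```

The point of the pattern is that the Hadoop data never has to be imported first; the query engine pushes work out to the cluster and combines the results with SQL Server's own tables.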
The company has also released SQL Server 2012 SP1 (Service Pack 1), which, among other features, includes the ability for Excel 2013 users to work directly with SQL Server data.
Microsoft did not divulge any details about when the next version of SQL Server would be released.