
Technology
The modern database ... amazing

Setting The Record Straight - SAP HANA v. Exadata X3

Burgess COMMENTARY
I stumbled on this article courtesy of 'Twitter', and became totally absorbed by the possibilities that such processing power presents.

I was in charge of a commercial computer installation in 1967 that had 4K of main memory ... about 500 HP of air conditioning power ... and used a tractor trailer's worth of punched cards every week. But even with this rather modest computing capacity we were able to do some fairly amazing things compared to what went before. Our efforts were written up as Harvard Business School cases back in those times.

The computation power that now exists should be enabling amazing analysis and, in turn, amazing conclusions and predictions. In the field of weather forecasting, massive computer modelling is giving very good results ... though it has not yet been able to prove that global warming is caused by human activity, which I might possibly argue is something that can never be proved even if it is actually so. This is actually important, because the causation around many important matters cannot be proved, but even so important decisions still need to be made.

The possibility that scientists associated with SAP could actually take something like TrueValueMetrics and use it in a computational environment that has the power to model economic progress and performance in any community on the planet, and then aggregate these results so that the progress and performance of countries, regions and eventually society as a whole can be optimized, becomes probable ... not just possible, or a silly dream.
Peter Burgess

Setting The Record Straight - SAP HANA v. Exadata X3

At the Oracle Open World keynote this week, Oracle repeated what Hasso showed years ago - 'Everything is in memory…Disk drives are becoming passé.' We are, of course, glad they realized this. With SAP HANA, our customers have been benefiting from this reality for more than 18 months now.

And yet Oracle made statements that are clear distortions and misrepresentations of HANA. It has become something of a recurring theme to mis-state and distort things. As industry leaders, we must do better. It behooves us to tell the truth to our customers, our partners and our employees. We do not serve our stakeholders well by mis-statements and omissions of key things we know to be true. They deserve better. History deserves better. It is true that HANA represents a fundamentally new, rethought database paradigm, and is seeing tremendous success in the market. Perhaps it is its disruptive nature that threatens the status quo of database incumbents. Perhaps it is some other reason. Regardless, I find myself once again setting the record straight.

The statement Mr. Ellison made about HANA while announcing a new Exadata machine with 4TB of DRAM and 22TB of SSD is false. He referred to HANA as 'a small machine' with 0.5TB of memory. He also said his machine has 26TB of memory, which is wrong: SSD is not DRAM and does not count as memory, and HANA servers also use SSDs for persistence.

Here is the truth:

  • HANA systems range from the very small to extremely large-scale systems. HANA’s architecture, with full exploitation of the massive parallelism of multi-core systems and native use of memory via new, totally redesigned data structures, enables nearly unlimited scalability.
  • For the last several months we have been shipping certified 16-node HANA hardware made by four vendors: IBM, HP, Fujitsu and Cisco. These systems are available with 16TB of DRAM, so they are already 4 times bigger than Oracle's machine, and they have been in the market since spring of this year. The machines can take up to 32TB of DRAM within their current configurations; in IBM's case, with the Max5 configuration, they can go up to 40TB.
  • The largest HANA system deployed so far is the 100-node system built earlier this year with Intel & IBM (see picture below). It has 100TB of DRAM and 4,000 CPU cores. Mr. Ellison is welcome to come to Santa Clara and see the system working for himself, with workloads from several customers. We shared this information in front of tens of thousands of customers at our SAPPHIRE NOW event earlier this year. Already today this system can go up to 250TB of DRAM (and with HANA's compression, can hold multiple petabytes of data entirely in-memory). Our partner Steve Mills, Senior Vice President and Group Executive of IBM’s Software & Systems, whose team helped build this system, had this to say in support of this open innovation: “IBM and SAP have partnered to demonstrate an SAP HANA system at 100 TB, making it the largest in-memory database system in the world. That system, running on 100 IBM X5 nodes, can now scale to 250 TB.”
With the processor and memory roadmaps from our partners, these systems will double in capacity by this fall/early 2013 (so multiply the numbers above by 2). And we don't have to release new versions of hardware to take advantage of this innovation.
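The scale figures quoted above are easy to sanity-check with a little arithmetic. Here is a minimal Python sketch of that check; the per-node memory and core counts and the 10x compression ratio are illustrative assumptions inferred from the totals in the text, not HANA specifications.

    # Back-of-the-envelope check of the scale-out figures quoted above.
    # Per-node values are assumptions chosen to match the stated totals;
    # the compression ratio is a hypothetical illustration, not a HANA spec.
    nodes = 100                # the 100-node Intel/IBM system
    dram_per_node_tb = 1.0     # assumed: 100 TB total / 100 nodes
    cores_per_node = 40        # assumed: 4,000 cores total / 100 nodes

    total_dram_tb = nodes * dram_per_node_tb   # 100 TB of DRAM
    total_cores = nodes * cores_per_node       # 4,000 CPU cores

    # The text says the same design can reach 250 TB of DRAM; with a
    # hypothetical ~10x columnar compression ratio that would correspond
    # to roughly 2.5 PB of raw data held entirely in memory.
    max_dram_tb = 250
    assumed_compression = 10
    raw_data_pb = max_dram_tb * assumed_compression / 1000

    print(f"{total_dram_tb:.0f} TB DRAM, {total_cores} CPU cores")
    print(f"~{raw_data_pb:.1f} PB raw data at {assumed_compression}x compression")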

HANA is built on a simple notion: advances in hardware, and deep research into the nature of modern enterprise software, enable us to rethink the database. And we did. I treated this notion as a design principle for HANA’s construction. Others are trying to protect their database systems that were designed in the past. The use of new SSD access technology, which accelerates access to flash and demonstrates performance improvements, simply reinforces this point. HANA also benefits from this technology, for reading logs, for restart performance, etc., as do our ASE and IQ databases.

But HANA is built on a basic principle that Hasso had articulated many years ago: when we run everything in-memory, we can get predictable response times on even the most complex queries and algorithms, and everything can execute with massive parallelism. This power gives us the freedom to rethink enterprise software. To renew existing systems without disruption: from OLTP apps (such as our Business One product that we released on HANA last week), to Analytics, from structured data processing, to unstructured. To rethink systems to run 1000s of times faster, and to eliminate batch jobs everywhere.

It also gives us the ability to build completely new applications, unprecedented solutions. To help software simplify the world, and connect it better, in real-time: from genome sequencing to energy exploration, from real-time customer intimacy, to inclusive banking. To liberate us from the confines and limitations of systems of the past, and enable us to be limited only by our imagination.
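To make the "everything in-memory, everything in parallel" principle concrete, here is a minimal, generic Python sketch: a single flat "column" held entirely in RAM is scanned and aggregated chunk by chunk across all available CPU cores. This is an illustration of the general idea only, not HANA code, and it says nothing about how HANA itself is implemented; the function and variable names are invented for the example.

    # Generic illustration of an in-memory, parallel column scan: the whole
    # "column" lives in RAM and is aggregated in chunks across CPU cores.
    # This sketches the general principle only, not HANA internals.
    from concurrent.futures import ProcessPoolExecutor
    import os

    def chunk_sum(chunk):
        """Aggregate one in-memory chunk of the column."""
        return sum(chunk)

    def parallel_sum(column, workers=None):
        """Split the column into chunks and aggregate them across cores."""
        workers = workers or os.cpu_count()
        size = max(1, len(column) // workers)
        chunks = [column[i:i + size] for i in range(0, len(column), size)]
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return sum(pool.map(chunk_sum, chunks))

    if __name__ == "__main__":
        revenue = list(range(1_000_000))              # the whole column in memory
        print(parallel_sum(revenue) == sum(revenue))  # same answer, scanned in parallel

In a real columnar engine the chunks would typically be compressed column fragments partitioned across nodes, with the aggregation pushed down to each core's local data, but the shape of the computation is the same: no disk seeks, just parallel scans over memory.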

As Alan Kay always told me, the future does not just have to be an increment of the past. We choose to focus on the future, on building a highly desirable, feasible and viable future, with our hands, with our customers and partners, instead of focusing on incrementing the limited systems of the past with temporary technologies. And we think there is no room for lies in that world. The truth of a HANA-based landscape, and its unmistakable success, is open to all, and it is ours to take and build on.

Happy HANA.

Best,
Vishal
