benchmarkingblog

Elisabeth Stahl on Benchmarking and IT Optimization

Archive for the ‘SPARC’ Category

Oracle Meets That ’70s Show


Last week I made the annual spring break pilgrimage to my childhood home in the shadows of the cherry blossoms.

What always strikes me when I visit — and you’ve probably had the same experience — is how nothing, almost nothing, has changed since I lived there four decades ago. Yes, there’s a huge TV with cable now. And a cell phone, though not so smart yet. And an iPad that always needs something done to it. But other than these few new features, the general layout and beauty of the interior is essentially the same.

Which I love. Why get new kitchen cabinets when you can take the beautiful solid wood ones and have them refinished? Why buy new cheap chairs when ’50s Danish Modern is built so well and gorgeous to boot?

But one of the best examples of this retro environment, hands down, has to be the downstairs bathroom. When you enter, you are transported to the time of Nixon and Sonny and Cher. The colors are tremendous: bright, bright yellows and oranges. Big plaid wallpaper. And wicker accessories. A ’70s dream of a bathroom. And you know what? It still looks great. The glamour of everything from the ’70s has returned in full force in this one tiny room.

But some things are not meant to come back. And that includes the way some vendors compare systems and benchmarks.

I recently saw a comparison from Oracle of the SPARC T7-1 vs. the IBM Power System S824. It brought me right back to when I started blogging almost ten years ago, when we were all inundated with benchmark flaws. Let’s take a look at some of the details:

  • The tool Oracle used to compare the systems is NOT an industry standard benchmark audited by a third party. It is a tool that can be used by anyone. Oracle ran all tests themselves.
  • The tool used is adapted from the TPC-C benchmark, which Oracle itself has said in the past is dated.
  • The disks used in the two systems were not the same type – HDD vs. SAS.
  • The logs and database files for the IBM test were not run on the IBM system – they were run on a different Oracle system.
  • Solaris 11.3 was used for the logs and database file systems on the Oracle side; Solaris 11.2 was used for the IBM configuration.
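The pattern in the list above can be checked mechanically. Here is a minimal Python sketch of a configuration-parity check that flags the dimensions on which two benchmark setups differ; all field names and values are hypothetical illustrations, not taken from any published configuration disclosure:

```python
# Hypothetical parity check: before treating a head-to-head result as
# meaningful, verify the two setups match on the dimensions that can
# affect the outcome. Field names here are illustrative, not official.

PARITY_FIELDS = ["disk_type", "os_version", "storage_host", "auditor"]

def config_mismatches(config_a, config_b, fields=PARITY_FIELDS):
    """Return the fields on which two benchmark configurations differ."""
    return [f for f in fields if config_a.get(f) != config_b.get(f)]

# Invented stand-ins for the two setups described above.
oracle_setup = {"disk_type": "SAS", "os_version": "Solaris 11.3",
                "storage_host": "local", "auditor": "vendor"}
ibm_setup = {"disk_type": "HDD", "os_version": "Solaris 11.2",
             "storage_host": "separate Oracle system", "auditor": "vendor"}

print(config_mismatches(oracle_setup, ibm_setup))
# a fair comparison would print []
```

A third-party-audited benchmark effectively enforces this check for you, which is exactly what was missing here.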


A photo of my childhood downstairs bathroom was Instagrammed recently. It received 35 likes, over half of them from students at the best design school in the country. That makes sense. Oracle’s benchmark comparisons don’t.


************************************************

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.
TPC-C, TPC-H, and TPC-E are trademarks of the Transaction Processing Performance Council (TPC).

The postings on this site solely reflect the personal views of the author and do not necessarily represent the views, positions, strategies or opinions of IBM or IBM management.


Written by benchmarkingblog

March 23, 2016 at 10:07 am

Posted in Oracle, POWER8, SPARC


Oracle’s SPARC Enhancements: Construction or Wind?


Two nights ago I spent a lovely 6 hours in the airport. Flight cancelled, next plane delayed for incoming aircraft, no runways to be had in one of the largest airports in the country. Announcement 1: There was only one runway because the others were under construction. Announcement 2: There was only one runway that could be used because the wind patterns were strange.

All you want is to get home to your couch and your dog. At the same time, it would be great to get the real story on what is happening; not for any practical reason, just because you want it to make sense.

And that’s exactly how I was feeling again as I read one of Oracle’s recent press releases on the Fujitsu SPARC M10 “enhancements.” The claim was for “15 world records.” I decided to take a look at each one just to know: was it the construction or the wind?

1. Oracle needed 2.5x more cores/memory than IBM. The IBM result was from 4 years ago.
2. Oracle needed 2x more cores/memory than IBM. The IBM result was from 4 years ago.
3. Oracle compared themselves with themselves.
4. Oracle compared themselves with themselves.
5. Oracle needed 2x more cores than SGI.
6. Oracle compared themselves with themselves.
7. Oracle needed 2x more cores than IBM.
8. Oracle compared themselves with themselves.
9. Oracle needed 4x more cores than IBM.
10. Oracle compared themselves with themselves.
11. Oracle picked on little x86.
12. Oracle compared themselves with themselves.
13. Oracle needed 16x more cores than IBM. The IBM result was from 6 years ago.
14. Oracle needed 8x more cores than IBM. The IBM result was from 6 years ago.
15. Oracle needed 8x more cores than IBM. The IBM result was from 6 years ago.

Also note that there are really only 4 distinct benchmarks among these 15 claims. And notably, all but 2 of the 15 are in the technical computing space, using simple component-level benchmarks.
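The core-count disparities in the list above are easy to sanity-check yourself. Here is a toy Python sketch; the scores and core counts are invented for illustration and are not taken from any actual published result:

```python
# Toy illustration: raw throughput says little when core counts differ
# by 2x-16x. Dividing by cores is the minimal sanity check. All numbers
# below are hypothetical.

def per_core(result, cores):
    """Normalize a raw benchmark score by the number of cores used."""
    return result / cores

vendor_a = {"score": 1_600_000, "cores": 64}   # "world record" claim
vendor_b = {"score": 1_000_000, "cores": 16}   # older, smaller system

a = per_core(vendor_a["score"], vendor_a["cores"])  # 25,000 per core
b = per_core(vendor_b["score"], vendor_b["cores"])  # 62,500 per core
print(a < b)  # prints True: the "record" holder is weaker per core
```

Per-core normalization is itself a simplification (it ignores memory, clock speed, and software stack), but when a record requires 16x the cores of a six-year-old result, even this crude check tells the story.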

So that’s the real story. The other real story is that if I had driven the 500 miles I would have been home much faster.

************************************************

The postings on this site solely reflect the personal views of the author and do not necessarily represent the views, positions, strategies or opinions of IBM or IBM management.


Written by benchmarkingblog

April 11, 2014 at 2:49 pm

Posted in SPARC

