In the 1970s, high-performance computing (HPC) was a major trend but, in recent years, it has been overshadowed by the personal computer and the world wide web. Indeed, for a while it seemed that HPC’s destiny was to provide the basis for the Deep Thought computer in Douglas Adams’ satire, The Hitchhiker’s Guide to the Galaxy (HG2G), which was designed to provide the answer to life, the universe and everything (which we now know to be 42, of course!).
In reality, HPC never went away, and the technology has continued to improve thanks to innovation and investment by Fujitsu and others (indeed, IBM named one of its chess-playing computers Deep Thought, in reference to the HG2G computer).
Last week I wrote about the Lyons Electronic Office (LEO) as an important part of Fujitsu’s heritage and I referenced the Fujitsu K supercomputer. We normally avoid talking about Fujitsu products on this blog but creating the world’s fastest supercomputer is a truly impressive feat of technological engineering – and, besides, in a recent CIO forum I was asked “so why do you bother given only a few of these installations will ever be sold?”. That is a fair question and one to which I think I gave a reasonable answer at the time, but it also merits further exposition – hence this post.
What I didn’t say in answer to the CIO who asked me that question was that the K supercomputer has over 22,032 blade servers fitted into 864 server racks, delivering 705,024 cores for parallel computation jobs. It is based on the Fujitsu eight-core “Venus” Sparc64-VIIIfx processor running at 2GHz, delivering 128 gigaflops per chip. However, I confess that I did say that it has achieved the world’s best LINPACK benchmark performance of 10.51 petaflops (quadrillion floating-point operations per second) with a computing efficiency ratio of 93.2%; softening the geeky answer by explaining that the name “K” comes from the Japanese Kanji character “Kei”, which means 10 peta (10^16) and, in its original sense, expresses a large gateway – it is hoped that the system will be a new gateway to computational science. More detail on the testing process and league table can be found in the TOP500 project’s presentation.
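For the curious, those figures hang together arithmetically. A quick back-of-the-envelope check (a sketch using only the numbers quoted above – core count, cores per chip and gigaflops per chip) shows how the peak performance and the 93.2% efficiency ratio are derived:

```python
# Back-of-the-envelope check of the K supercomputer figures quoted above.
cores = 705_024                # total cores for parallel computation
cores_per_chip = 8             # eight-core Sparc64-VIIIfx
gflops_per_chip = 128          # peak gigaflops per chip
linpack_pflops = 10.51         # measured LINPACK result, in petaflops

chips = cores // cores_per_chip               # number of processors
peak_pflops = chips * gflops_per_chip / 1e6   # gigaflops -> petaflops
efficiency = linpack_pflops / peak_pflops     # measured vs theoretical peak

print(f"{chips:,} chips, {peak_pflops:.2f} petaflops peak, {efficiency:.1%} efficiency")
# -> 88,128 chips, 11.28 petaflops peak, 93.2% efficiency
```

In other words, the machine sustains 10.51 of a theoretical 11.28 petaflops on the benchmark, which is where the 93.2% figure comes from.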
The importance of the K supercomputer is both what it enables through its staggering computational power and how the technological advances it represents are cascaded through our future product portfolio. Of course, we’re not the only company that takes supercomputing seriously and IBM Watson is another example, even competing in TV gameshows!
Our digital world is growing exponentially and, if we want to enrich it through technology and bring the Internet of Things to life, then, alongside storage capacity, we need compute power. As we get closer to using sensors to drive or manage real-time events, we need ever-faster computation. However, that compute power needs to be available at a sensible cost, and market forces are busy at work here in the cloud computing context.
Interpreting the world in real-time has led some to ponder how soon we will have computer processing power to rival that of the human brain. I’ve seen some articles asserting that 10 petaflops is the processing power of the human brain, although I think the general informed consensus is that it is in reality at least 100 petaflops, and perhaps an order of magnitude higher than that. IBM have apparently forecast the existence of a brain-rivalling real-time supercomputer by 2020, although how it would be powered and the space required to hold it may limit applicability! Inspired by the K supercomputer, Gary Marshall asks if technology could make our brains redundant but it’s worth noting that no computer has yet passed the famous Turing Test (i.e. we can still tell the difference between a machine response and a human response).
Advances in supercomputing are bringing new capabilities into the digital world as what was once restricted and unique becomes pervasive technological capability. Once we are able to connect sensors and actuators with sufficient processing power to enable their connection to be meaningful then we can enrich the quality of human life. This concept is at the heart of the Fujitsu vision, delivering human-centric intelligent society.
I hope I’ve shown the criticality of the K supercomputer and our drive to commercialise those technological advances through various models, including cloud computing. It lies at the heart of our vision of how technology will continue to evolve to enrich our lives, not just in enabling high-performance computation for research simulations but in delivering solutions that will touch our everyday lives, as shown in the Discovery Channel’s video about the K supercomputer. As for the addressable market, there is a commercial variant of the K, called the PRIMEHPC FX10.
I do hope you will forgive me an atypically Fujitsu-centric post. The question the CIO asked me was a good one: it made me think about how to give context to something I’d come to assume was obvious.
At the head of the post, I mentioned Douglas Adams’ Deep Thought computer… if I think back to the reasons we built K and the type of workloads it will process (medical research, astronomy, physics, etc.), maybe it really is computing an answer to life, the universe and everything.