Computing an answer to life, the universe and everything

In the 1970s, high performance computing (HPC) was a major trend but, in recent years, it has fallen into the shadow cast by the personal computer and the world wide web. Indeed, for a while it seemed that HPC’s destiny was to provide the basis for the Deep Thought computer in Douglas Adams’ satire, The Hitchhiker’s Guide to the Galaxy (HG2G), which was designed to provide the answer to life, the universe and everything (which we now know to be 42, of course!).

In reality, HPC never went away, and the technology has kept improving as Fujitsu (and others) have continued to innovate and invest (indeed, IBM named one of their chess-playing computers Deep Thought, in reference to the HG2G computer).

Last week I wrote about the Lyons Electronic Office (LEO) as an important part of Fujitsu’s heritage and I referenced the Fujitsu K supercomputer. We normally avoid talking about Fujitsu products on this blog but creating the world’s fastest supercomputer is a truly impressive feat of technological engineering – and, besides, at a recent CIO forum I was asked “so why do you bother given only a few of these installations will ever be sold?”. That’s a fair question and one to which I think I gave a reasonable answer at the time, but it also merits further exposition – hence this post.

What I didn’t say in answer to the CIO who asked me that question was that the K supercomputer has 22,032 blade servers fitted into 864 server racks, delivering 705,024 cores for parallel computation jobs. It is based on the Fujitsu eight-core “Venus” Sparc64-VIIIfx processor running at 2GHz and delivering 128 gigaflops per chip. However, I confess that I did say that it has achieved the world’s best LINPACK benchmark performance of 10.51 petaflops (quadrillion floating point operations per second) with a computing efficiency ratio of 93.2%; softening the geeky answer by explaining that the name “K” comes from the Japanese Kanji character “Kei”, which means 10 peta (10^16) and, in its original sense, expresses a large gateway – the hope being that the system will be a new gateway to computational science. More detail on the testing process and league table can be found in the TOP500 project’s presentation.
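For anyone who wants to check that those published figures hang together, here is a minimal back-of-the-envelope sketch in Python, using only the numbers quoted above (the variable names are mine, purely for illustration):

```python
# Sanity-check of the K supercomputer figures quoted above.
cores = 705_024                   # total cores across the system
cores_per_chip = 8                # eight-core Sparc64-VIIIfx
chips = cores // cores_per_chip   # 88,128 processors

peak_gflops_per_chip = 128
peak_pflops = chips * peak_gflops_per_chip / 1e6  # theoretical peak

linpack_pflops = 10.51            # measured LINPACK result
efficiency = linpack_pflops / peak_pflops

print(f"Peak: {peak_pflops:.2f} petaflops")   # ~11.28
print(f"Efficiency: {efficiency:.1%}")        # ~93.2%
```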

The importance of the K supercomputer lies both in what it enables through its staggering computational power and in how the technological advances it represents cascade through our future product portfolio. Of course, we’re not the only company that takes supercomputing seriously – IBM Watson is another example, even competing in TV gameshows!

Our digital world is growing exponentially and, if we want to enrich it through technology and bring the Internet of Things to life, then along with the storage capacity we need compute power. As we get closer to using sensors to drive or manage real-time events, we need to deploy faster computational power. However, that compute power needs to be available at a sensible cost level, and market forces are busy at work here in the cloud computing context.

Interpreting the world in real time has led some to ponder how soon we will have computer processing power to rival that of the human brain. I’ve seen some articles asserting that 10 petaflops is the processing power of the human brain, although I think the general informed consensus is that it is in reality at least 100 petaflops, and perhaps ten times higher than that. IBM have apparently forecast the existence of a brain-rivalling real-time supercomputer by 2020, although how it would be powered and the space required to hold it may limit applicability! Inspired by the K supercomputer, Gary Marshall asks if technology could make our brains redundant, but it’s worth noting that no computer has yet passed the famous Turing Test (i.e. we can still tell the difference between a machine response and a human response).

Advances in supercomputing are bringing new capabilities into the digital world as what was once restricted and unique becomes pervasive technological capability. Once we are able to connect sensors and actuators with sufficient processing power to make their connection meaningful, we can enrich the quality of human life. This concept is at the heart of the Fujitsu vision of delivering a human-centric intelligent society.

Fujitsu PRIMEHPC FX10

I hope I’ve shown the criticality of the K supercomputer and our drive to commercialise those technological advances through various models, including cloud computing. It lies at the heart of our vision of how technology will continue to evolve to enrich our lives, not just in enabling high performance computation for research simulations but in delivering solutions that will touch our everyday lives, as shown in the Discovery Channel’s video about the K supercomputer. As for the addressable market, there is a commercial variant of the K, called the PRIMEHPC FX10.

I do hope you will forgive me an atypically Fujitsu-centric post. The question the CIO asked me was a good one; it made me think about how to give context to something I’d come to assume was obvious.

At the head of the post, I mentioned Douglas Adams’ Deep Thought computer… if I think back to the reasons we built K and the type of workloads it will process (medical research, astronomy, physics, etc.), maybe it really is computing an answer to life, the universe and everything.

Some thoughts on the “Internet of Things”

The “Internet of Things” has become a commonly used phrase and I think it’s quite a good one: we’ve some idea what the “Things” are but no idea where it will lead (although Hollywood has tried a few times over the years). One thing we cannot do is absolve ourselves of the management responsibility, as there will always need to be humans somewhere in the system to avoid the “Skynet” scenario from the Terminator films.

More positively, the Internet of Things has the potential to make the digital world a very pervasive aspect of our daily lives in the physical world, supporting and enhancing many of the positive aspects of society and the aspirations we have for living together.

Eventually, people will have as many sensors as a Formula One racing car (well, quite a few, anyway!), sending lots of data into the cloud. Not quite as wired up as the people involved with the measured life movement, but they are leading the charge. At some point, our human-centric devices will become patches (electronic tattoos), powered by energy harvested from our bodies (thermal or kinetic), and that’s when things get really exciting. We can expect to see mobile phones being used as a proxy device to pass telemetry to the cloud, as in the sketch below. You can see why the health industry wants this technology (although by then we’re not talking about health but “well-being”) and we might need far fewer trips to visit a doctor as a result.
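To make the phone-as-proxy idea concrete, here is a minimal sketch. Everything in it is hypothetical – the endpoint URL, the field names and the simulated sensor readings – it simply shows the shape of the relay: read from a body-worn sensor, then forward the reading to a cloud service as JSON.

```python
import json
import random
import time
import urllib.request

# Hypothetical endpoint, standing in for whatever cloud telemetry
# service the phone would relay readings to.
TELEMETRY_URL = "https://example.com/api/telemetry"

def read_body_sensors():
    """Simulated wearable-patch readings (random values, for illustration)."""
    return {
        "timestamp": time.time(),
        "heart_rate_bpm": random.randint(55, 100),
        "skin_temp_c": round(random.uniform(32.0, 36.0), 1),
    }

def relay_to_cloud(reading):
    """The phone acting as a proxy: forward one reading as JSON."""
    request = urllib.request.Request(
        TELEMETRY_URL,
        data=json.dumps(reading).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status

if __name__ == "__main__":
    print(relay_to_cloud(read_body_sensors()))
```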

Intelligent Device Hierarchy Potential, by Harbor Research

Now, with the number of “things” feeding the Internet, the potential to manage and change the way we do things is an exciting prospect – and we’re not just looking at health – examples include energy management, traffic management, alert and monitoring systems – the list goes on.

One example reaching commercial introduction is the Boeing-Fujitsu partnership with RFID tags, where the tags contain the service history of a component and, using handheld scanners, maintenance staff can determine which parts need to be serviced – how long before this can be managed in-flight, with the parts waiting on the ground, ready to intercept the aircraft on a just-in-time basis?
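As a sketch of what a tag’s service record might look like in software – the part numbers, fields and service-interval rule here are all invented for illustration, not the actual Boeing or Fujitsu schema:

```python
from dataclasses import dataclass

@dataclass
class ComponentTag:
    """Illustrative stand-in for the service record held on an RFID tag."""
    part_number: str
    hours_since_service: float
    service_interval_hours: float

    def needs_service(self) -> bool:
        return self.hours_since_service >= self.service_interval_hours

# A maintenance walk-round: scan the tags, flag the parts that are due.
scanned = [
    ComponentTag("OXY-GEN-01", 480.0, 500.0),
    ComponentTag("LIFE-VEST-12", 720.0, 600.0),
]
due = [tag.part_number for tag in scanned if tag.needs_service()]
print(due)  # ['LIFE-VEST-12']
```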

Another aspect of the Internet of Things will be our ability to make smart decisions based on the large volumes of data we will have to hand. This “big data”, along with the associated analytics tools, can be used to spot patterns, with examples including traffic, energy consumption and weather. Imagine a world of connected systems where the weatherman might not only predict the “Barbeque Summer” with a little more accuracy, but we won’t all get stuck in traffic as we rush to the beach!
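Pattern-spotting can be surprisingly simple at heart. As a toy illustration (the counts below are made up), here is one way to flag an unusual spike in hourly traffic data against a simple moving-average baseline:

```python
# Toy example: flag an unusual spike in hourly traffic counts by
# comparing each reading against a moving average of recent hours.
counts = [120, 130, 125, 128, 131, 290, 127]  # made-up hourly vehicle counts
window = 3

for i in range(window, len(counts)):
    baseline = sum(counts[i - window:i]) / window
    if counts[i] > 1.5 * baseline:
        print(f"hour {i}: {counts[i]} vehicles vs baseline {baseline:.0f} -- spike")
```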

Image credit: Harbor Research