There was some serious brainpower assembled at the recent Multicore World 2013 conference in Wellington.
Some of the subject matter was way beyond the once-over-lightly understanding that I have of computers and computing. But the clear message was that there's currently a big gap between what multicore computers (many cores on one chip) are capable of, and the programs written to run on them.
As an analogy, it is as if a car engine has eight cylinders, but there's only fuel getting through to one or two of them, vastly decreasing its possible performance.
Put another way, multicore hardware is way, way ahead of multicore software. (When you consider that 64+ cores on a chip are now being manufactured, it is obvious, as has often been stated, that hardware capability is no longer an issue.) How this clear gap is resolved is very much a problem in search of an answer.
It would've been good to see a heavier concentration of government and corporate IT heavyweights at the two-day conference held at the Wellington Town Hall.
The line-up of speakers, looking over the horizon at where the actual bits and bytes of computing are heading, would grace any northern hemisphere conference (and no doubt pull in hundreds of attendees).
In other words, as opposed to the frothy apps and gee-whiz retail end of things, this conference was about where all the hard work of computers, memory, transactions and data crunching takes place.
One of the underlying themes of the conference, put together by Oamaru-based Nicolas Erdody (Open Parallel), is that NZ Inc has an opportunity as the world grapples with how to utilise the huge amount of computing power that is available but not yet being accessed.
The (parallel) programming required to take advantage of multicore, where the instructions to and from each core have to inform and be informed by every other core, is not easy.
As Poul-Henning Kamp, a Danish software writer and inventor of Varnish, commented: "parallelism is hard... really, really hard."
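To give a non-specialist a feel for why, here is a minimal sketch in Go (my own illustration, not something presented at the conference): eight workers, one per imagined core, all updating the same counter. Leave out the lock that coordinates them and the total comes out wrong, and differently wrong on each run.

```go
// A toy example of shared-state parallelism: eight goroutines
// (one per imagined core) each add to one shared counter.
package main

import (
	"fmt"
	"sync"
)

func main() {
	var (
		counter int
		mu      sync.Mutex
		wg      sync.WaitGroup
	)

	for i := 0; i < 8; i++ { // one worker per core
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := 0; j < 100000; j++ {
				mu.Lock() // remove this lock and the cores trample each other's updates
				counter++
				mu.Unlock()
			}
		}()
	}

	wg.Wait()
	fmt.Println("total:", counter) // 800000 only because the updates are coordinated
}
```

Scale that coordination problem up from one counter to an entire program's data, and the "really, really hard" becomes apparent.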
And which computer language is best suited to writing parallel programs is still unclear; indeed, a number of languages could evolve.
New Zealand has the opportunity to be a niche operator and software supplier in this emerging world, providing answers where others find it too difficult.
Expatriate Kiwi Dr Ian Foster (originally from Wellington, and these days, among other roles, Professor of Computer Science at the University of Chicago) helped frame some of the already apparent and emerging possibilities opened up by multicore in his keynote address.
(His and the other presenters' talks can be found here.)
He described the exponentialism that multicores potentially provide as offering new paradigms that "can bring about huge transformations".
The areas where he sees the grunt of multicore having near-future effects are:
- Digital visual effects
- Digital fabrication (additive manufacturing)
- Industrial internet (heavy industrial internet)
- Data analytics (big data)
Foster says multicore is a hugely disruptive technology; New Zealand has an opportunity to ride its wave, or (especially if the country doesn't build a second fibre optic cable linking us to the rest of the world) be left behind.
Disclaimer:
I helped write some of the publicity and press releases around Multicore World 2013. The thoughts above are mine alone, however.
sticK is by Peter Kerr, a writer for hire. I have a broad science and technology background and interest, with an original degree in agricultural science. My writing speciality is making the complex understandable. I am available for outside consultancy work, and for general discussions of converting a good idea into something positive.