New Zealand to Become the Switzerland of Secure Data Storage?
Press Release – Wednesday, 9 January 2013
Multicore World 2013
If New Zealand gets its IT act together, it could become the Switzerland of data storage says the founder of Multicore World 2013.
Nicolas Erdody says New Zealand has a number of factors in its favour in harnessing the power and opportunity represented in multicore’s ability to pack many processors onto one computer chip.
“Firstly and most obviously, we need a second fibre optic cable in and out of our country,” Erdody says.
“Once that is signalled, though, there’s every incentive and prospect for New Zealand to really start touting its location and its ability to be a trusted data storage and crunching resource for the entire world.”
Again, predicated on a second cable, New Zealand would instantly be known as a safe country in which to keep data legally secure, he says.
“We’re known for the transparency and rule of law of our legal system, and continue to rank highly as an honest and easy country in which to do business,” Erdody says.
“At the same time, any server farms – which could be massive areas covering several hectares – could be efficiently cooled through renewable energy. Our ability to provide such a green tick for power-hungry data storage and data crunching would be viewed most favourably by the likes of Google and Facebook, which already stores some of its data in Sweden near the Arctic. That is, provided we have a second cable to provide backup and redundancy.”
The opportunity of multicore computing and parallel programming can also provide the hub of new entrepreneurial businesses based on the technology. Oracle’s CEO Larry Ellison now insists on the importance of a parallel architecture strategy, with all software development in the multicore era needing to be hardware-aware.
“Innovation occurs at the fringe, and it is pretty difficult to be more edgy than New Zealand,” says Erdody.
“It is not at all outlandish to envisage global entities looking to partner with clever Kiwi companies to solve their multicore challenges. Imagine the possibilities if we keep some of our IT talent onshore, delivering answers other countries can’t.”
The wider debate about what is required to build multicore-oriented competence and services out of New Zealand is to be discussed at Multicore World 2013. “There’s no other forum that addresses this key component for our IT future,” Erdody says.
What is multicore?
The ability of computers to process massive amounts of data has been growing ever since they were invented.
As computer power has increased, the speed of processing has reached a physical barrier: a single processor cannot be made to run faster without overheating.
The problem has been solved by putting more processors onto a single chip, creating multicore chips. These multicore chips entered the mainstream market a few years ago, and all vendors currently sell them. They are now standard kit in all laptops, desktops and smartphones.
Multicore chips are also more power efficient, and the number of cores able to be added is theoretically virtually unlimited.
Previously impossible computational tasks can now be achieved, and processes which previously took days or even weeks to perform can now be done swiftly.
But while this new processing power enables computers to do things faster, it also adds new challenges.
Before multicore, computer software was written for a single central processing unit (CPU) on a chip. To exploit the potential of multicore chips, software now needs to be written to run in parallel.
But parallel programming is different from traditional sequential programming, and so far few programmers have experience of it.
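As an illustration (not part of the original release), the shift described above can be sketched in a few lines of Python. The example is hypothetical: it shows the same simple computation written the traditional sequential way and then spread across several cores with the standard-library multiprocessing module.

```python
# Hypothetical sketch: sequential versus parallel versions of one task.
from multiprocessing import Pool

def square(n):
    # A stand-in for any CPU-heavy, independent piece of work.
    return n * n

def sequential_sum(numbers):
    # Traditional style: one core processes every item in turn.
    return sum(square(n) for n in numbers)

def parallel_sum(numbers, workers=4):
    # Parallel style: the items are divided among several worker
    # processes (cores), and the partial results are combined.
    with Pool(processes=workers) as pool:
        return sum(pool.map(square, numbers))

if __name__ == "__main__":
    data = range(1_000)
    # Both styles must give the same answer; only the execution differs.
    assert sequential_sum(data) == parallel_sum(data)
```

The parallel version only pays off when each piece of work is genuinely independent and large enough to outweigh the cost of coordinating the workers – which is exactly the kind of thinking sequential programmers have not needed until now.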
Multicore is now mainstream technology, but the expertise to exploit it is (as yet) niche.
In the next 10-15 years, there will be huge opportunities to translate sequential programming (‘traditional’) legacy code, and to create new software that takes full advantage of thousands of cores in the next generation of chips.
Around the world, parallel computing is currently used to process the vast quantities of data produced by the internet and the “big data” originating from social networks and millions of intelligent data-recording devices attached to the internet.
Here in NZ it is also used in the biggest CGI rendering facility in the world at Wellington’s Weta Digital.
And soon it will be a key component of the information processing required to handle the data produced by the Square Kilometre Array radio telescope – a global scientific project that New Zealand is a part of.
In addition, there is a wide range of services, solutions and systems-integration challenges in connecting the two worlds together.