Are Quantum Computers On The Horizon?
Researchers at the Massachusetts Institute of Technology (MIT) and Austria’s University of Innsbruck claim to have put together a working quantum computer capable of solving a simple mathematical problem.
The architecture they have devised ought to be relatively easy to scale, and could therefore form the basis of workable quantum computers in the future – with a bit of “engineering effort” and “an enormous amount of money”, according to Isaac Chuang, professor of physics, electrical engineering and computer science at MIT.
Chuang’s team has put together a prototype comprising the first five quantum bits (or qubits) of a quantum computer. This is being tested on mathematical factoring problems, which could have implications for applications that use factoring as the basis for encryption to keep information, including credit card details, secure.
The proof-of-concept has been applied only to the number 15, but the researchers claim that theirs is the “first scalable implementation” of Shor’s algorithm, a quantum algorithm that can quickly calculate the prime factors of large numbers.
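Shor’s algorithm gets its speed-up from a quantum order-finding step; the surrounding number theory is classical and easy to demonstrate. The minimal Python sketch below (not the MIT team’s code) factors 15 the way Shor’s algorithm does, but finds the order by brute force where a quantum computer would do it exponentially faster.

```python
# Minimal classical sketch of the number theory behind Shor's algorithm,
# applied to N = 15 as in the experiment. Only the order-finding loop is
# what a quantum computer accelerates; the rest is classical processing.
from math import gcd

def shor_classical(N: int, a: int):
    # Find the order r: the smallest r > 0 with a**r = 1 (mod N).
    r = 1
    while pow(a, r, N) != 1:
        r += 1
    if r % 2:
        raise ValueError("odd order; retry with a different base a")
    # gcd(a**(r/2) +/- 1, N) then yields nontrivial factors of N.
    x = pow(a, r // 2, N)
    return gcd(x - 1, N), gcd(x + 1, N)

print(shor_classical(15, 7))  # (3, 5), the prime factors of 15
```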
“The team was able to keep the quantum system stable by holding the atoms in an ion trap, where they removed an electron from each atom, thereby charging it. They then held each atom in place with an electric field,” explained MIT.
Chuang added: “That way, we know exactly where that atom is in space. Then we do that with another atom, a few microns away – [a distance] about 100th the width of a human hair.
“By having a number of these atoms together, they can still interact with each other because they’re charged. That interaction lets us perform logic gates, which allow us to realise the primitives of the Shor factoring algorithm. The gates we perform can work on any of these kinds of atoms, no matter how large we make the system.”
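To make the idea of a logic gate concrete, here is a small state-vector simulation in Python of what one two-qubit gate does. It is the mathematics only, not a model of the trapped-ion hardware, and the gates shown (a Hadamard followed by a CNOT) are standard textbook primitives rather than the team’s specific gate set.

```python
# State-vector view of two-qubit logic gates. Basis order: |00>, |01>, |10>, |11>.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard on a single qubit
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # flips the second qubit
                 [0, 1, 0, 0],                 # when the first qubit is 1
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1.0, 0, 0, 0])               # start in |00>
state = np.kron(H, I) @ state                  # superpose the first qubit
state = CNOT @ state                           # entangle the pair
print(np.round(state, 3))                      # [0.707 0. 0. 0.707] = (|00>+|11>)/sqrt(2)
```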
Chuang is a pioneer in the field of quantum computing. He designed a quantum computer in 2001 based on one molecule that could be held in ‘superposition’ and manipulated with nuclear magnetic resonance to factor the number 15.
The results represented the first experimental realisation of Shor’s algorithm. But the system wasn’t scalable: it became progressively more difficult to control as more atoms were added.
However, the architecture that Chuang and his team have put together is, he believes, highly scalable, and will enable the team to build quantum computing devices capable of factoring much larger numbers.
“It might still cost an enormous amount of money to build, [and] you won’t be building a quantum computer and putting it on your desktop anytime soon, but now it’s much more an engineering effort and not a basic physics question,” said Chuang.
In other quantum computing news this week, the UK government has promised £200m to support engineering and physical sciences PhD students and fuel UK research into quantum technologies, although most of the cash will be spent on Doctoral Training Partnerships rather than trying to build workable quantum computing prototypes.
Courtesy-TheInq
Will A.I. Create The Next Industrial Revolution?
Artificial Intelligence will be responsible for the next industrial revolution, experts in the field have claimed, as intelligent computer systems replace certain human-operated jobs.
Four computer science experts talked about how advances in AI could lead to a “hollowing out” of middle-income jobs during a panel debate hosted by ClickSoftware about the future of technology.
“It’s really important that we take AI seriously. It will lead to the fourth industrial revolution and will change the world in ways we cannot predict now,” said AI architect and author George Zarkadakis.
His mention of the “fourth industrial revolution” refers to the computerization of the manufacturing industry.
If the first industrial revolution was the mechanisation of production using water and steam power, and the second introduced mass production with the help of electric power, then the third is what we are currently experiencing: the digital revolution, and the use of electronics and IT to further automate production.
The fourth industrial revolution, which is sometimes referred to as Industry 4.0, is the vision of the ‘smart factory’, where cyber-physical systems monitor physical processes, create a virtual copy of the physical world and make decentralized decisions.
These cyber-physical systems communicate and cooperate with each other and humans in real time over the Internet of Things.
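As a rough illustration, the toy Python sketch below shows sensors updating a virtual copy of a factory and each machine acting on that shared state; the class, names and threshold are invented for the example.

```python
# Toy sketch of a cyber-physical system: sensors feed a 'digital twin'
# of the factory, and decisions are made per machine from its state.
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    readings: dict = field(default_factory=dict)  # virtual copy: latest reading per machine

    def update(self, machine_id: str, temp_c: float) -> None:
        self.readings[machine_id] = temp_c

    def decide(self, machine_id: str, limit_c: float = 80.0) -> str:
        # Decentralized decision: each machine acts on its own data.
        return "halt" if self.readings[machine_id] > limit_c else "run"

twin = DigitalTwin()
twin.update("press-01", 71.4)   # sensors report over the network
twin.update("press-02", 85.2)
for machine in twin.readings:
    print(machine, twin.decide(machine))  # press-01 run, press-02 halt
```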
Dan O’Hara, professor of cognitive computing at Goldsmiths, University of London, explained that this fourth industrial revolution will not be the same kind of “hollowing out” of jobs that we saw during the last one.
“It [won’t be] manual labour replaced by automation, but it’ll be the hollowing out of middle-income jobs, medium-skilled jobs,” he said.
“The industries that will be affected the most by replacement with automation are construction, accounts and transport. But the biggest [industry] of all, remembering this is [with] respect to the US, is retail and sales.”
O’Hara added that many large organisations’ biggest expense is people, who already work alongside intelligent computer systems, and this area is most likely to be affected as companies look to reduce costs.
“Anything that’s working on an AI-based system is bound to be very vulnerable to replacement by AI, as it’s easily automated already,” he said.
However, while AI developments in the retail space could lead to the replacement of jobs, the technology also holds considerable promise for businesses.
Mark Bishop, professor of cognitive computing at Goldsmiths, highlighted that AI could save businesses money if it becomes smart enough to detect price variations in company spending: scanning through years of an organisation’s invoice database, for example, to find the cheapest price paid for each item and thus cut outgoings.
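A crude sketch of that kind of invoice scan might look like the following; the data and column names are invented for illustration.

```python
# Hypothetical invoice scan: find the cheapest unit price ever paid per
# item and estimate how much was overspent against it.
import pandas as pd

invoices = pd.DataFrame({
    "item":       ["toner", "toner", "paper", "paper", "paper"],
    "unit_price": [42.00,   35.50,   4.20,    3.80,    4.50],
    "quantity":   [10,      12,      100,     80,      120],
})

best = invoices.groupby("item")["unit_price"].min().rename("best_price")
scan = invoices.join(best, on="item")
scan["overspend"] = (scan["unit_price"] - scan["best_price"]) * scan["quantity"]
print(scan.groupby("item")["overspend"].sum())  # paper 124.0, toner 65.0
```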
While some worry that AI will take over jobs, others have warned that it will replace humans altogether.
John Lewis IT chief Paul Coby said earlier this year that the blending of AI and the IoT in the future could signal the end of civilisation as we know it.
Coby explained that the possibilities are already with us in terms of AI and that we ought to think about how “playing with the demons” could be detrimental to our future.
Apple co-founder Steve Wozniak added to previous comments from Stephen Hawking and Elon Musk with claims that “computers are going to take over from humans”.
Woz made his feelings on AI known during an interview with the Australian Financial Review, and agreed with Hawking and Musk that its potential to surpass humans is worrying.
“Computers are going to take over from humans, no question. Like people including Stephen Hawking and Elon Musk have predicted, I agree that the future is scary and very bad for people,” he said.
NOAA Supercomputer Goes Live
The National Oceanic and Atmospheric Administration has rolled out two new supercomputers that are expected to improve weather forecasts and perhaps help better prepare us for hurricanes.
The two IBM systems, which are identical, will be used by NOAA’s National Weather Service to produce forecast data that’s used in the U.S. and around the world.
One of the supercomputers is in Reston, Va.; the other is in Orlando. The NWS can switch between the two in about six minutes.
Each is a 213-teraflop system running a Linux operating system on Intel processors. The federal government is paying about $20 million a year to operate the leased systems.
“These are the systems that are the origin of all the weather forecasts you see,” said Ben Kyger, director of central operations at the National Centers for Environmental Prediction.
NOAA had previously used identical four-year-old 74-teraflop IBM supercomputers that ran on IBM’s AIX operating system and Power 6 chips.
Before it could activate the new systems, the NWS had to ensure that they produced scientifically accurate results. It had been running the old and new systems in parallel for months, comparing their output.
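In outline, that validation amounts to checking that the two systems’ output fields agree within a numerical tolerance, since different hardware and operating systems rarely produce bit-identical floating-point results. The sketch below is illustrative only, not NOAA’s actual acceptance test.

```python
# Illustrative side-by-side check of old vs new forecast output.
import numpy as np

def outputs_match(old_run: np.ndarray, new_run: np.ndarray,
                  rel_tol: float = 1e-5) -> bool:
    # Accept the new system if every grid point agrees within tolerance.
    return np.allclose(old_run, new_run, rtol=rel_tol)

old_forecast = np.array([1013.2, 1009.8, 1005.1])  # e.g. surface pressure (hPa)
new_forecast = np.array([1013.2, 1009.8, 1005.1])
print(outputs_match(old_forecast, new_forecast))   # True
```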
The NWS has a new hurricane model, which is 15% more accurate at day five of a forecast of a storm’s track and intensity. That model is now operational and running on the new systems. That’s important, because the U.S. is expecting a busy hurricane season.
Big Blue Still The Patent King
As technology companies stockpile patents so that they can see off their rivals, IFI Claims Patent Services, a company that maintains global patent databases, has clocked the outfits with the most weapons in any patent war.
A total of 224,505 utility patents were awarded in the U.S. last year, jumping two percent over the previous year’s record-breaking tally of 219,614 patents. IBM has always had the most patents, probably because it has been around the longest. The company was granted 6,180 utility patents, up nearly five percent from 2010. Samsung was number two with 4,894 patents, followed by Canon with 2,821, Panasonic with 2,559 and Toshiba with 2,483.
Microsoft, which held on to the third spot in 2010, is in sixth place with 2,311 utility patents granted last year. According to IFI CEO Mike Baycroft, global companies, and especially Asian ones, are collecting U.S. patents at a dizzying pace, and Asian firms now hold eight of the top 10 slots in the 2011 ranking.
IBM’s Watson Shows Up For Work
IBM’s Watson supercomputer is about to start work evaluating evidence-based cancer treatment options that can be delivered to doctors in a matter of seconds for assessment.
IBM and WellPoint, which is Blue Cross Blue Shield’s largest health plan, are developing applications that will essentially turn the Watson computer into an adviser for oncologists at Cedars-Sinai’s Samuel Oschin Comprehensive Cancer Institute in Los Angeles, according to Steve Gold, director of worldwide marketing for IBM Watson Solutions.
Cedars-Sinai’s historical data on cancer as well as its current clinical records will be ingested into an iteration of IBM’s Watson that will reside at WellPoint’s headquarters. The computer will act as a medical data repository on multiple types of cancer. WellPoint will then work with Cedars-Sinai physicians to design and develop applications as well as validate their capabilities.
Dr. M. William Audeh, medical director of the cancer institute, will work closely with WellPoint’s clinical experts to provide advice on how Watson may best be used in clinical practice to support increased understanding of the evolving body of knowledge on cancer, including emerging therapies not widely known by physicians.
IBM announced earlier this year that healthcare would be the first commercial application for the computer, which defeated two human champions on the popular television game show Jeopardy! in February.
Japan Takes 1st Place On Supercomputer List
A Japanese computer has earned the number one spot on the Top 500 supercomputer list, ending China’s short reign of just six months. At 8.16 petaflops (quadrillion floating-point calculations per second), the K computer is more powerful than the next five systems combined.
The K computer’s performance was measured using 68,544 SPARC64 VIIIfx CPUs, each with eight cores, for a total of 548,352 cores, almost twice as many as any other system on the Top500 list. The computer is still being put together, and when it enters service in November 2012 it will have more than 80,000 SPARC64 VIIIfx CPUs, according to its manufacturer, Fujitsu.
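A quick back-of-envelope check of those figures, with linear scaling assumed for the planned build-out:

```python
# Sanity-check the quoted core count and project the full configuration.
cpus, cores_per_cpu = 68_544, 8
cores = cpus * cores_per_cpu
print(cores)                                  # 548352, as reported

per_core_gflops = 8.16e6 / cores              # 8.16 PFLOPS in GFLOPS
print(round(per_core_gflops, 1))              # ~14.9 GFLOPS per core

# If performance scaled linearly to 80,000 CPUs:
print(round(per_core_gflops * 80_000 * cores_per_cpu / 1e6, 2))  # ~9.52 PFLOPS
```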
Japan’s ascension to the top means that the Chinese Tianhe-1A supercomputer, which took the number one position in November last year, is now in second place with 2.57 petaflops. But China continues to grow its presence on the list, up from 42 to 62 systems. The change at the top also means that Jaguar, built for the U.S. Department of Energy (DOE), is bumped down to third place.