Nvidia Teams Up With Volvo For Self-Driving Car Computer
January 15, 2016 by admin
Filed under Around The Net
Comments Off on Nvidia Teams Up With Volvo For Self-Driving Car Computer
Nvidia Corp. took the wraps off a new, lunchbox-sized supercomputer for self-driving cars and announced that Volvo Car Group will be the device’s first customer.
Volvo, of Sweden, is owned by China’s Geely Automotive Holdings.
Nvidia made the announcement at the start of the Consumer Electronics Show in Las Vegas. Calls to Volvo’s spokesman in China were not immediately answered.
The new Drive PX 2, said company CEO Jen-Hsun Huang, has computing power equivalent to 150 MacBook Pro computers and can deliver up to 24 trillion “deep learning” operations per second, allowing the computer to use artificial intelligence to program itself to recognize driving situations.
Partnerships between automakers and Silicon Valley companies on self-driving technologies are taking center stage at this year’s show.
Also on Monday, General Motors Co. announced a $500 million investment in ride-sharing service Lyft.
Huang didn’t offer revenue projections for Drive PX 2, but automotive is the fastest-growing business segment for Nvidia, whose largest revenue source is video games.
Source-http://www.thegurureview.net/aroundnet-category/nvidia-teams-up-with-volvo-for-self-driving-car-computer.html
IPv6 Turns 20, Did You Notice?
IPv6 has turned 20 years old, and the milestone has been marked by the protocol reaching 10 percent adoption worldwide for the first time.
The idea that IPv6 remains so far behind its saturated incumbent, IPv4, is horrifying given that three continents ran out of IPv4 addresses in 2015. Unfortunately, because IPv4 isn’t officially ‘end of life’, most internet providers have been working on an ‘if it’s not broken, don’t fix it’ basis.
But 2016 looks to be the year when IPv6 makes its great leap to the mainstream, in Britain at least. BT, the UK’s biggest broadband provider, has already committed to switching on IPv6 support by the end of the year, and most premises will be IPv6-capable by April. Most rival providers use the same physical lines, but it will be up to each individual supplier to make the switch. Plusnet, a part of BT, is a likely second.
IPv6 has a number of advantages over IPv4, most notably a vastly larger address space (2^128 addresses against IPv4’s roughly 4.3 billion), meaning that the capacity problems the current network is facing shouldn’t come back to haunt us again. It will also pave the way for ever faster, more secure networks.
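To put “virtually infinite” into perspective, here is a quick back-of-the-envelope comparison using Python’s standard ipaddress module. The prefix 2001:db8::/64 is the reserved documentation range, used purely for illustration:

```python
import ipaddress

# Total address counts for each protocol.
ipv4_total = 2 ** 32     # about 4.3 billion
ipv6_total = 2 ** 128    # about 3.4 x 10^38

print(f"IPv6 space is {ipv6_total // ipv4_total:,} times larger than IPv4")

# A single standard /64 subnet -- the size routinely handed to one
# home connection -- already holds 2^64 addresses, vastly more than
# the entire IPv4 address space.
subnet = ipaddress.ip_network("2001:db8::/64")
print(f"Addresses in one /64 subnet: {subnet.num_addresses:,}")
```

The ratio works out to 2^96, which is why exhaustion of the kind IPv4 hit in 2015 is not a realistic concern for IPv6.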
Some private corporate networks have already made the switch. Before Christmas we reported that the UK Ministry of Defence was already using the protocol, leaving thousands of unused IPv4 addresses lying idle in its wake.
IPv6 is also incredibly adaptable for the Internet of Things. Version 4.2 of the Bluetooth protocol includes IPv6 connectivity as standard, making it a lot easier for tiny nodes to make up a larger internet-connected grid.
Google’s latest figures suggest that more than 10 percent of users are running IPv6 connections at the weekend, while the number drops to eight percent on weekdays. This suggests that the majority of movement towards IPv6 is happening in the residential broadband market.
That said, it is imperative that businesses begin to make the leap. As Infoblox IPv6 evangelist Tom Coffeen told us last year, it could start to affect the speed at which you are able to trade.
“If someone surfs onto your site and it’s only available in IPv4, but they are using IPv6, there has to be some translation, which puts your site at a disadvantage. If I’ve not made my site available in IPv6, I’m no longer in control over where that translation occurs.”
In other words, if you don’t catch up, you will soon get left behind. It was ever thus.
Courtesy-TheInq
Twitter To Revive Archived Deleted Tweets
January 11, 2016 by admin
Filed under Around The Net
Just ahead of the first U.S. presidential primaries and caucuses, Twitter has resurrected a popular archive of sometimes-misguided or embarrassing tweets that had been deleted by politicians and their staff.
Politwoops had been a popular social media destination for political junkies and others looking to unearth social media gaffes by politicians.
But in a move widely lambasted by open-government advocates, Twitter effectively shuttered Politwoops last summer when it revoked interface access for the Sunlight Foundation, the government accountability watchdog that had developed the tool and had been publishing the tweets.
On Thursday, Twitter said it had reached a deal with Sunlight and another organization, the Open State Foundation, to restore the tool.
“Politwoops is an important tool for holding our public officials, including candidates and elected or appointed public officials, accountable for the statements they make, and we’re glad that we’ve been able to reach an agreement with Twitter to bring it back online both in the U.S. and internationally,” said Jenn Topper, communications director for The Sunlight Foundation.
While the announcement was a victory for government-transparency advocates, it could prove to be a setback for politicians hoping to avoid the social media rumpus that can accompany an ill-timed tweet or misconstrued online musing.
The deal comes as the clock ticks closer to the first vote casting in the 2016 U.S. presidential campaign. The Iowa caucuses will take place on Feb. 1, followed by the first primary in New Hampshire on Feb. 9.
Source-http://www.thegurureview.net/aroundnet-category/twitter-to-revived-archived-deleted-tweets-of-politicians.html
AMD Goes Full Steam To Open-Source
AMD and its newly formed Radeon Technologies Group (RTG) are engaged in a major push to open-source GPU resources.
According to Ars Technica, under the banner “GPUOpen” AMD is releasing a slew of open-source software and tools to give developers of games, heterogeneous applications and HPC applications deeper access to the GPU and its resources.
In a statement AMD said that as a continuation of the strategy it started with Mantle, it is giving even more control of the GPU to developers.
“As console developers have benefited from low-level access to the GPU, AMD wants to continue to bring this level of access to the PC space.”
The AMD GPUOpen initiative is meant to give developers the ability to use assets they’ve already made for console development. They will have direct access to GPU hardware, as well as access to a large collection of open source effects, tools, libraries and SDKs, which are being made available on GitHub under an MIT open-source license.
AMD hopes GPUOpen will enable console-style development for PC games through this open-source software initiative. It also includes an end-to-end open-source compute infrastructure for cluster-based computing and a new Linux software and driver strategy.
All this ties in with AMD’s Boltzmann Initiative and an HSA (Heterogeneous System Architecture) software suite that includes an HCC compiler for C++ development, intended to widen the pool of programmers who can use HSA. The new HCC C++ compiler is designed to let developers more easily use discrete GPU hardware in heterogeneous systems.
It also allows developers to convert CUDA code to portable C++. According to AMD, internal testing shows that in many cases 90 percent or more of CUDA code can be converted automatically, with the remaining portion ported by hand. An early access program for the Boltzmann Initiative tools is planned for Q1 2016.
AMD GPUOpen includes a new Linux driver model and runtime targeted at HPC cluster-class computing. The headless Linux driver is intended to handle high-performance computing needs, with low-latency compute dispatch and PCI Express data transfers, peer-to-peer GPU support, Remote Direct Memory Access (RDMA) over InfiniBand directly into GPU memory, and large single memory allocation support.
Courtesy-Fud
Will Facebook Go Open-Source?
December 29, 2015 by admin
Filed under Around The Net
Facebook has unveiled its next-generation GPU-based systems for training neural networks, Open Rack-compatible hardware code-named “Big Sur” which it plans to open source.
The social media giant’s latest machine learning system has been designed for artificial intelligence (AI) computing at large scale and, for the most part, has been built with Nvidia hardware.
Big Sur comprises eight high-performance GPUs of up to 300 watts each, with the flexibility to configure multiple PCI-e topologies. It makes use of Nvidia’s Tesla Accelerated Computing Platform and, as a result, is twice as fast as Facebook’s previous-generation rack.
“This means we can train twice as fast and explore networks twice as large,” said the firm in its engineering blog. “And distributing training across eight GPUs allows us to scale the size and speed of our networks by another factor of two.”
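The factor-of-two scaling Facebook describes is the standard data-parallel training recipe: each GPU computes gradients on its own shard of a batch, and the results are averaged before the weight update. The sketch below simulates that pattern in plain Python on a toy 1-D linear model; the eight “workers”, the model and every number in it are illustrative assumptions, not Facebook’s code:

```python
import random

NUM_WORKERS = 8  # stands in for Big Sur's eight GPUs

def local_gradient(w, shard):
    # Mean-squared-error gradient of a toy 1-D linear model y = w * x,
    # computed on one worker's shard of the batch.
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def data_parallel_step(w, batch, lr=0.1):
    # Split the batch into one shard per worker, compute gradients
    # independently, then average them (the "all-reduce"). With equal
    # shards the average equals the full-batch gradient, so eight
    # workers behave like one device with eight times the batch.
    size = len(batch) // NUM_WORKERS
    shards = [batch[i * size:(i + 1) * size] for i in range(NUM_WORKERS)]
    avg_grad = sum(local_gradient(w, s) for s in shards) / NUM_WORKERS
    return w - lr * avg_grad

random.seed(0)
data = [(x, 3.0 * x) for x in (random.uniform(-1, 1) for _ in range(64))]
w = 0.0
for _ in range(200):
    w = data_parallel_step(w, data)
print(f"learned weight: {w:.3f}")  # converges toward the true slope 3.0
```

In a real system the shards live on different devices and the averaging is a hardware all-reduce, but the arithmetic is the same: more workers means a larger effective batch per step, which is the “scale the size and speed by another factor of two” in Facebook’s claim.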
Facebook claims that as well as better performance, Big Sur is also far more versatile and efficient than the off-the-shelf solutions in its previous generation.
“While many high-performance computing systems require special cooling and other unique infrastructure to operate, we have optimised these new servers for thermal and power efficiency, allowing us to operate them even in our own free-air cooled, Open Compute standard data centres,” explained the company.
We spoke to Nvidia’s senior product manager for GPU Computing, Will Ramey, ahead of the launch, who has been working on the Big Sur project alongside Facebook for some time.
“The project is the first time that a complete computing system that is designed for machine learning and AI will be released as an open source solution,” said Ramey. “By taking the purpose-built design spec that Facebook has designed for their own machine learning apps and open sourcing them, people will benefit from and contribute to the project so it can move the entire industry forward.”
While Big Sur was built with Nvidia’s new Tesla M40 hyperscale accelerator in mind, it can support a wide range of PCI-e cards, which Facebook believes could improve production and manufacturing efficiency and deliver more computational power for every penny it invests.
“Servers can also require maintenance and hefty operational resources, so, like the other hardware in our data centres, Big Sur was designed around operational efficiency and serviceability,” Facebook said. “We’ve removed the components that don’t get used very much, and components that fail relatively frequently – such as hard drives and DIMMs – can now be removed and replaced in a few seconds.”
Perhaps the most interesting aspect of the Big Sur announcement is Facebook’s plans to open-source it and submit the design materials to the Open Compute Project. This is a bid to make it easier for AI researchers to share techniques and technologies.
“As with all hardware systems that are released into the open, it’s our hope that others will be able to work with us to improve it,” Facebook said, adding that it believes open collaboration will help foster innovation for future designs, and put us closer to building complex AI systems that will probably take over the world and kill us all.
Nvidia released its end-to-end hyperscale data centre platform last month claiming that it will let web services companies accelerate their machine learning workloads and power advanced artificial intelligence applications.
Consisting of two accelerators, Nvidia’s latest hyperscale line aims to let researchers design new deep neural networks more quickly for the increasing number of applications they want to power with AI. It also is designed to deploy these networks across the data centre. The line also includes a suite of GPU-accelerated libraries.
Courtesy-TheInq
TSMC Goes Fan-Out Wafers
TSMC is scheduled to move its integrated fan-out (InFO) wafer-level packaging technology to volume production in the second quarter of 2016.
Apparently the fruity cargo cult Apple has already signed up to adopt the technology, which means that the rest of the world’s press will probably notice.
According to the Commercial Times, TSMC will have 85,000-100,000 wafers fabricated with the foundry’s in-house developed InFO packaging technology in the second quarter of 2016.
TSMC has disclosed that its InFO packaging technology will be ready for mass production in 2016. Company president and co-CEO CC Wei remarked at an October 15 investors meeting that TSMC has completed construction of a new facility in Longtan, northern Taiwan, and that InFO will be ready for volume production in the second quarter of 2016.
TSMC president and co-CEO Mark Liu disclosed the company is working on the second generation of its InFO technology for several projects on 10nm and 7nm process nodes.
Source-http://www.thegurureview.net/computing-category/tsmc-goes-fan-out-wafers.html
Samsung Goes Auto
December 22, 2015 by admin
Filed under Around The Net, Internet
Samsung has announced it will begin manufacturing electronics parts for the automotive industry, with a primary focus on autonomous vehicles.
The South Korean electronics giant is the latest tech firm to make a somewhat belated push into the automotive industry, as vehicle computer systems and sensors become more sophisticated.
In October, General Motors announced a strategic partnership with South Korea’s LG Electronics. LG will supply a majority of the key components for GM’s upcoming electric vehicle (EV), the Chevrolet Bolt. LG has also been building computer modules for GM’s OnStar telecommunications system for years.
Apple and Google have also developed APIs, which automakers are slowly embedding, that allow smartphones to connect natively and display on vehicles’ infotainment screens. Those APIs led to the rollout in several vehicles this year of Apple’s CarPlay and Android Auto.
Having formerly balked at the automotive electronics market as too small, consumer computer chipmakers are now entering the space with fervor.
Dutch semiconductor maker NXP is closing an $11.8 billion deal to buy Austin-based Freescale, which makes automotive microprocessors. The combined companies would displace Japan’s Renesas as the world’s largest vehicle chipmaker.
German semiconductor maker Infineon Technology has reportedly begun talks to buy a stake in Renesas.
Adding to growth in automotive electronics are regulations mandating technology such as backup cameras in the U.S. and “eCall” in Europe, which automatically dials emergency services in the event of an accident.
According to a report published by Thomson Reuters, Samsung and its tech affiliates are ramping up research and development for auto technology, with two-thirds of their combined 1,804 U.S. patent filings since 2010 related to electric vehicles and electric components for cars.
The combined automotive software, services and components market is worth around $500 billion, according to ABI Research.
Source-http://www.thegurureview.net/consumer-category/samsung-announces-entry-into-auto-industry.html
Pawn Storm Hacking Group Develops New Tools For Cyberespionage
A Russian cyberespionage group known as Pawn Storm, also tracked as Sofacy, has been using new tools in an ongoing attack campaign against defense contractors, with the goal of defeating network isolation policies.
Since August, the group has been engaged in an attack campaign focused on defense contractors, according to security researchers from Kaspersky Lab.
During this operation, the group has used a new version of a backdoor program called AZZY and a new set of data-stealing modules. One of those modules monitors for USB storage devices plugged into the computer and steals files from them based on rules defined by the attackers.
The Kaspersky Lab researchers believe that this module’s goal is to defeat so-called network air gaps, network segments where sensitive data is stored and which are not connected to the Internet to limit their risk of compromise.
However, it’s fairly common for employees in organizations that use such network isolation policies to move data from air-gapped computers to their workstations using USB thumb drives.
Pawn Storm joins other sophisticated cyberespionage groups, like Equation and Flame, that are known to have used malware designed to defeat network air gaps.
“Over the last year, the Sofacy group has increased its activity almost tenfold when compared to previous years, becoming one of the most prolific, agile and dynamic threat actors in the arena,” the Kaspersky researchers said in a blog post. “This activity spiked in July 2015, when the group dropped two completely new exploits, an Office and Java zero-day.”
Source- http://www.thegurureview.net/aroundnet-category/pawn-storm-hacking-group-develops-new-tools-for-cyberespionage.html
AI Assistant on The Way
December 15, 2015 by admin
Filed under Around The Net
Researchers at Carnegie Mellon University are working on artificial intelligence software that could one day become a personal assistant, whispering directions to a restaurant or instructions to put together a bookshelf or repair a manufacturing machine.
The software is named Gabriel, after the angel that serves as God’s messenger, and is designed to be used in a wearable vision system – something similar to Google Glass or another head-mounted system. Tapping into information held in the cloud, the system is set up to feed or “whisper” information to the user as needed.
At this point, the project is focused on the software and is not connected to a particular hardware device.
“Ten years ago, people thought of this as science fiction,” said Mahadev Satyanarayanan, professor of computer science and the principal investigator for the Gabriel project, at Carnegie Mellon. “But now it’s on the verge of reality.”
The project, which has been funded by a $2.8 million grant from the National Science Foundation, has been in the works for the past five years.
“This will enable us to approach, with much higher confidence, tasks, such as putting a kit together,” said Satyanarayanan. “For example, assembling a furniture kit from IKEA can be complex and you may make mistakes. Our research makes it possible to create an app that is specific to this task and which guides you step-by-step and detects mistakes immediately.”
He called Gabriel a “huge leap in technology” that uses mobile computing, wireless networking, computer vision, human-computer interaction and artificial intelligence.
Satyanarayanan said he and his team are not in talks with device makers about getting the software in use, but he hopes it’s just a few years away from commercialization.
“The experience is much like a driver using a GPS navigation system,” Satyanarayanan said. “It gives you instructions when you need them, corrects you when you make a mistake and, most of the time, shuts up so it doesn’t bug you.”
One of the key technologies being used in the Gabriel project is called a “cloudlet.” Developed by Satyanarayanan, a cloudlet is a small, cloud-supported data center placed close to the mobile users it serves, keeping network round trips short enough for real-time assistance.
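The case for cloudlets is essentially a latency budget: a head-mounted assistant must see, think and whisper within a tight deadline, and the wide-area round trip to a distant data center can blow that budget on its own. The toy calculation below illustrates the argument; every number in it is an illustrative assumption, not a measurement from the Gabriel project:

```python
# Toy latency model for one "see -> process -> whisper" cycle of a
# wearable assistant. All figures are illustrative assumptions.

FRAME_DEADLINE_MS = 100   # assumed budget for one guidance cycle
PROCESSING_MS = 40        # assumed vision/AI time on the offload target

def cycle_time(rtt_ms):
    # One offload cycle: ship the camera frame over the network,
    # process it remotely, and receive the instruction back.
    return rtt_ms + PROCESSING_MS

cloudlet_rtt = 10    # assumed one-hop wireless RTT to a nearby cloudlet
cloud_rtt = 120      # assumed WAN RTT to a distant data center

for name, rtt in [("cloudlet", cloudlet_rtt), ("distant cloud", cloud_rtt)]:
    total = cycle_time(rtt)
    verdict = "meets" if total <= FRAME_DEADLINE_MS else "misses"
    print(f"{name}: {total} ms -> {verdict} the {FRAME_DEADLINE_MS} ms deadline")
```

Under these assumed numbers only the nearby cloudlet stays inside the deadline, even though the remote processing time is identical in both cases, which is exactly the point of putting the data center close to its users.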
Source- http://www.thegurureview.net/consumer-category/want-an-ai-based-whispering-personal-assistant.html
Deutsche Bank Taking A Deeper Dive Into ‘Big Data’
December 14, 2015 by admin
Filed under Around The Net
Deutsche Bank is undertaking a major computer systems overhaul that will help it to make greater use of so-called “big data” to provide a detailed picture of how, when and where customers interact with it, the bank’s chief data officer said in an interview.
JP Rangaswami, who joined Deutsche Bank in January as its first-ever chief data officer, said better and cheaper metadata was allowing the bank to analyze previously inaccessible information.
“We are able to see patterns that we could not see beforehand, allowing us to gain insights we couldn’t gain before,” Rangaswami told Reuters in an interview.
Upgrading the technical infrastructure Deutsche Bank needs to get the most out of this data is a priority for Chief Executive John Cryan. He is trying to improve the performance of Germany’s biggest bank, which is struggling to adapt to the tougher climate for banks since the financial crisis.
Cryan, who unveiled a big overhaul at Deutsche on Oct. 29, said at the time that imposing standards on Deutsche’s IT infrastructure was key to improving controls and reducing overheads.
The CEO said in the October presentation that IT design had occurred in silos with the application of little or no common standards. “Our systems are disjointed, cumbersome and far too often just plain incompatible.”
An annual global survey of more than 200 senior bankers, published last week by banking software firm Temenos, found that “IT modernization” is now the top priority, displacing earlier investment objectives such as regulation and customer-friendly mobile apps. IT modernization ranked only fourth among major priorities in last year’s survey.
The shift toward technology as a priority shows the extent of the challenge facing banks to modernize infrastructure to analyze internal customer data and try to fend off competition from new financial technology companies.
Rangaswami, who was chief scientist at Silicon Valley marketing software giant Salesforce from 2010 until 2014, said the data would allow Deutsche to tailor services to customers’ needs and to identify bottlenecks and regional implications faster and solve problems more quickly.
Source- http://www.thegurureview.net/aroundnet-category/deutsche-bank-taking-a-deeper-dive-into-big-data.html