nVidia Updates Its Grid Platform
Nvidia has updated its Grid software platform with deeper performance profiling and analytics tools for planning, deployment, and support of virtual GPU users.
According to the company, the improved management tools address both host (server) management and virtual client monitoring. Nvidia says that with the new Grid software, admins will be able to see how many virtual graphics instances are in use and how many more they can potentially create.
They can also see usage information for the stream processors on board each card, the percentage of the card’s frame buffer that’s in use, and the load on each card’s dedicated video encode and decode hardware.
Each guest vGPU instance will report its encoder and decoder usage, frame buffer occupancy, and overall vGPU utilization. Nvidia adds that this takes the guesswork out of vGPU provisioning, and that the data it exposes about vGPU usage will let system administrators tailor their virtual user profiles more precisely.
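Nvidia has not published the internals of the new dashboards, but the same classes of metrics are already exposed through NVML. Below is a minimal sketch using the pynvml bindings that reads per-GPU core utilization, frame buffer occupancy, and encoder/decoder load; the vGPU-specific counters the Grid tools add require a Grid-licensed driver and are not queried here.

```python
# A minimal NVML sketch (via the pynvml bindings) that reads the same classes
# of per-GPU metrics the Grid tools surface: core utilization, frame buffer
# occupancy, and dedicated encoder/decoder load. The vGPU-specific counters
# need a Grid-licensed driver and are not queried here.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)      # % GPU/memory busy
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)              # frame buffer bytes
        enc, _ = pynvml.nvmlDeviceGetEncoderUtilization(handle)   # % encoder busy
        dec, _ = pynvml.nvmlDeviceGetDecoderUtilization(handle)   # % decoder busy
        print(f"GPU {i} ({name}): core {util.gpu}%, "
              f"frame buffer {mem.used / mem.total:.0%}, "
              f"encoder {enc}%, decoder {dec}%")
finally:
    pynvml.nvmlShutdown()
```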
All this should help admins avoid handing too much processing power to accounts that do not need it when, for example, the graphics team does. Nvidia thinks those operational improvements will also help lower costs. The August 2016 Grid software update should be available immediately.
Courtesy-Fud
nVidia NVLINK 2.0 Going In IBM Servers
On Monday, PCWorld reported that the first servers expected to use Nvidia’s second-generation NVLINK 2.0 technology will be arriving sometime next year using IBM’s upcoming Power9 chip family.
IBM launched its Power8 lineup of superscalar symmetric multiprocessors back in August 2013 at the Hot Chips conference, and the first systems became available in August 2014. The announcement was significant because it signaled the beginning of a continuing partnership between IBM and Nvidia to develop GPU-accelerated IBM server systems, beginning with the Tesla K40 GPU.
The result was an HPC “tag-team” in which IBM’s Power8 architecture, a 12-core chip with 96MB of embedded memory, would eventually go on to host Nvidia’s next-generation Pascal GPUs, which debuted in April 2016 at the company’s GPU Technology Conference.
NVLINK, first announced in March 2014, uses a proprietary High-Speed Signaling interconnect (NVHS) developed by Nvidia. The company says NVHS transmits data over a differential pair running at up to 20Gbps, so eight of these differential 20Gbps connections will form a 160Gbps “Sub-Link” that sends data in one direction. Two sub-links—one for each direction—will form a 320Gbps, or 40GB/s bi-directional “Link” that connects processors together in a mesh framework (GPU-to-GPU or GPU-to-CPU).
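As a quick sanity check on those figures, the sketch below uses only the lane rates quoted in this article to reproduce the sub-link, link, and four-port aggregate numbers, for both the original 20Gbps lanes and the 25Gbps lanes of NVLINK 2.0 discussed below.

```python
# Back-of-the-envelope check of the NVLink figures quoted in the article:
# lanes -> one-directional sub-link -> bi-directional link -> multi-port total.
def nvlink_bandwidth(lane_gbps, lanes_per_sublink=8, ports=1):
    sublink_gbps = lane_gbps * lanes_per_sublink   # one direction
    link_gbps = 2 * sublink_gbps                   # both directions
    return sublink_gbps, link_gbps, ports * link_gbps / 8   # Gbps -> GB/s

for gen, lane_rate in (("NVLink 1.0", 20), ("NVLink 2.0", 25)):
    sub, link, total_gbs = nvlink_bandwidth(lane_rate, ports=4)
    print(f"{gen}: sub-link {sub} Gbps, link {link} Gbps ({link // 8} GB/s), "
          f"four-port total {total_gbs:.0f} GB/s")

# NVLink 1.0: sub-link 160 Gbps, link 320 Gbps (40 GB/s), four-port total 160 GB/s
# NVLink 2.0: sub-link 200 Gbps, link 400 Gbps (50 GB/s), four-port total 200 GB/s
```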
NVLINK lanes upgrade from 20Gbps to 25Gbps
IBM is projecting its Power9 servers to be available beginning in the middle of 2017, with PCWorld reporting that the new processor lineup will include support for NVLINK 2.0 technology. Each NVLINK lane will communicate at 25Gbps, up from 20Gbps in the first iteration. With eight differential lanes, this translates to a 400Gbps (50GB/s) bi-directional link between CPUs and GPUs, or about 25 percent more performance if the information is correct.
NVLINK 2.0 capable servers arriving next year
Meanwhile, Nvidia has yet to release any NVLINK 2.0-capable GPUs, but a company presentation slide in Korean suggests that the technology will first appear in Volta GPUs, which are also scheduled for release sometime next year. We were originally under the impression that the new GPU architecture would arrive in 2018, as per Nvidia’s roadmap. But a source hinted last month that Volta would be getting the 16nm FinFET treatment and may show up in roughly the same timeframe as AMD’s HBM 2.0-powered Vega, sometime in 2017. After all, it is easier for Nvidia to launch sooner if the new architecture is built on the same node as the Pascal lineup.
Still ahead of PCI-Express 4.0
Nvidia claims that PCI-Express 3.0 (32GB/s with x16 bandwidth) significantly limits a GPU’s ability to access a CPU’s memory system and is about “four to five times slower” than its proprietary standard. Even PCI-Express 4.0, releasing later in 2017, is limited to 64GB/s on a slot with x16 bandwidth.
To put this in perspective, Nvidia’s Tesla P100 Accelerator uses four 40GB/s NVLINK ports to connect clusters of GPUs and CPUs, for a total of 160GB/s of bandwidth.
With a generational NVLINK upgrade from 40GB/s to 50GB/s bi-directional links, the company could release a future Volta-based GPU with four 50GB/s NVLINK ports totaling 200GB/s of bandwidth, well beyond the specifications of the new PCI-Express standard.
Courtesy-Fud
MIT Researchers Triple Wireless Speeds
August 29, 2016
MIT researchers have uncovered a way to transfer wireless data using a smartphone at a speed about three times faster and twice as far as existing technology.
The researchers developed a technique to coordinate multiple wireless transmitters by synchronizing their wave phases, according to a statement from MIT on Tuesday. Multiple independent transmitters will be able to send data over the same wireless channel to multiple independent receivers without interfering with each other.
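The researchers’ actual system handles multiple receivers on real radio hardware, but the core benefit of phase synchronization can be illustrated with a toy simulation: two transmitters whose signals are pre-rotated to arrive in phase add coherently at the receiver, while uncoordinated transmitters may partially cancel. A minimal numpy sketch, purely illustrative and not the MegaMIMO implementation:

```python
# Toy illustration (not the MegaMIMO code) of why phase-synchronized
# transmitters help: signals pre-rotated to cancel their channel phase add
# coherently at the receiver, while uncoordinated ones may partially cancel.
import numpy as np

rng = np.random.default_rng(0)
symbol = 1.0 + 0.0j                                      # unit-power data symbol
channels = np.exp(1j * rng.uniform(0, 2 * np.pi, 2))     # random phase per transmitter

uncoordinated = np.sum(channels * symbol)                  # phases add arbitrarily
aligned = np.sum(channels * np.conj(channels) * symbol)    # pre-rotation cancels phase

print(f"received power, uncoordinated : {abs(uncoordinated) ** 2:.2f}")
print(f"received power, phase-aligned : {abs(aligned) ** 2:.2f}")  # 4.0 for 2 transmitters
```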
Since wireless spectrum is scarce, and network congestion is only expected to grow, the technology could have important implications.
The researchers called the approach MegaMIMO 2.0 (MIMO: Multiple Input, Multiple Output).
For their experiments, the researchers set up four laptops in a conference room setting, allowing signals to roam over 802.11 a/g/n Wi-Fi. The speed and distance improvements are expected to apply to cellular networks as well. A video describes the technology, as does a technical paper (registration required) that was presented this week to the Association for Computing Machinery’s Special Interest Group on Data Communications (SIGCOMM 16).
The researchers, from MIT’s Computer Science and Artificial Intelligence Lab, are: Ezzeldin Hamed, Hariharan Rahul, Mohammed Abdelghany and Dina Katabi.
Courtesy-http://www.thegurureview.net/mobile-category/mit-researchers-develop-technique-to-triple-wireless-speeds.html
Apple Jumps On The AR Bandwagon
August 26, 2016
Apple is trying to convince the world it is “coming up with something new” by talking a lot about augmented reality.
It is a fairly logical development: the company has long operated a reality distortion field to create an alternative universe where its products are new, revolutionary, and light years ahead of everyone else’s. It will be interesting to see how Apple integrates its reality with the real world, given that it already has a problem with that.
Apple CEO Tim Cook has been doing his best to convince the world that Apple really is working on something. He needs to do this as the iPhone cash cow starts to dry up and Jobs Mob appears to have no products to replace it.
In an interview with The Washington Post published Sunday, Cook said Apple is “doing a lot of things” with augmented reality (AR), the technology that puts digital images on top of the real world.
He said:
“I think AR is extremely interesting and sort of a core technology. So, yes, it’s something we’re doing a lot of things on behind that curtain we talked about.”
However, Apple is light years behind work being done by Microsoft with its HoloLens headset and by the startup Magic Leap, whose so-called cinematic reality is in development now.
Cook appears to retreat to AR whenever he is under pressure. But so far he has never actually said that the company is developing any.
Apple has also snapped up several companies and experts in the AR space. And in January, the Financial Times claimed that the company has a division of hundreds of people researching the technology.
But it would be hard for Apple to ship an AR product that fits its ethos, and certainly not one for years. Meanwhile, it is unlikely we will see anything new before Microsoft and Google get their products out.
Courtesy-Fud
Intel To Acquire Deep Learning Company Nervana
Intel is acquiring deep-learning startup Nervana Systems in a deal that could help it make up for lost ground in the increasingly hot area of artificial intelligence.
Founded in 2014, California-based Nervana offers a hosted platform for deep learning that’s optimized “from algorithms down to silicon” to solve machine-learning problems, the startup says.
Businesses can use its Nervana cloud service to build and deploy applications that make use of deep learning, a branch of AI used for tasks like image recognition and uncovering patterns in large amounts of data.
Also of interest to Intel, Nervana is developing a specialty processor, an ASIC (application-specific integrated circuit) custom built for deep learning.
Financial terms of the deal were not disclosed, but one estimate put the value above $350 million.
“We will apply Nervana’s software expertise to further optimize the Intel Math Kernel Library and its integration into industry standard frameworks,” Diane Bryant, head of Intel’s Data Center Group, said in a blog post. Nervana’s expertise “will advance Intel’s AI portfolio and enhance the deep-learning performance and TCO of our Intel Xeon and Intel Xeon Phi processors.”
Though Intel also acquired AI firm Saffron late last year, the Nervana acquisition “clearly defines the start of Intel’s AI portfolio,” said Paul Teich, principal analyst with Tirias Research.
“Intel has been chasing high-performance computing very effectively, but their hardware-design teams missed the convolutional neural network transition a few years ago,” Teich said. CNNs are what’s fueling the current surge in artificial intelligence, deep learning and machine learning.
As part of Intel, Nervana will continue to operate out of its San Diego headquarters, cofounder and CEO Naveen Rao said in a blog post.
The startup’s 48-person team will join Intel’s Data Center Group after the deal’s close, which is expected “very soon,” Intel said.
Source- http://www.thegurureview.net/aroundnet-category/intel-to-acquire-deep-learning-company-nervana.html
Is nVidia’s Auto Venture Paying Off?
August 17, 2016
The driverless car market is expected to grow to $42 billion by 2025 and Nvidia has a cunning plan to grab as much of that market as possible with its current automotive partnerships.
The company has started to take in more cash from its car business recently, earning $113 million from its automotive segment in fiscal Q1 2017. While that is not much, it represents a 47 percent increase over the year before. Automotive revenue is now up to about 8.6 percent of total revenue, and it is set to go higher.
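For context, a quick back-of-the-envelope check of those figures implies roughly $77 million of automotive revenue a year earlier and total quarterly revenue of about $1.3 billion; the inputs come from the article, the derived values are simple arithmetic rather than Nvidia disclosures.

```python
# Quick check of the figures quoted above; inputs come from the article,
# the derived values are simple arithmetic rather than Nvidia disclosures.
auto_revenue = 113e6        # automotive revenue, fiscal Q1 2017 (USD)
yoy_growth = 0.47           # 47% year-over-year increase
share_of_total = 0.086      # ~8.6% of total revenue

prior_year_auto = auto_revenue / (1 + yoy_growth)
implied_total = auto_revenue / share_of_total

print(f"year-ago automotive revenue: ~${prior_year_auto / 1e6:.0f}M")    # ~$77M
print(f"implied total quarterly revenue: ~${implied_total / 1e9:.2f}B")  # ~$1.31B
```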
BMW, Tesla, Honda and Volkswagen are all using Nvidia gear in one way or another.
BMW has been using Nvidia infotainment systems for years and seems to have been Nvidia’s way into the industry. Tesla’s 17-inch touchscreen display is powered by Nvidia, and its all-digital 12.3-inch instrument cluster also uses Nvidia GPUs. Honda uses Tegra processors for its Honda Connect infotainment system.
But rumors are that Nvidia is hoping to make a killing from the move to driverless cars. The company is already on the second version of its Drive PX self-driving platform. Nvidia claims that Drive PX recently learned how to navigate 3,000 miles of road in just 72 hours.
BMW, Ford, and Daimler are testing Drive PX, and Audi has used Nvidia’s GPUs to help pilot some of its self-driving vehicles in the past. In fact, Audi has claimed the platform can also be used to assist ordinary driving.
It said that the deep learning capabilities of Drive PX allowed its vehicles to learn certain self-driving capabilities in four hours instead of the two years that it took on competing systems.
According to Automotive News Europe Nvidia is working closely with Audi as its primary brand for Drive PX but then it will move to Volkswagen, Seat, Skoda, Lamborghini, and Bentley.
Tesla also appears to think that Nvidia is a key element of driverless car technology. At the 2015 GPU Technology Conference, the company said that Tegra GPUs will prove “really important for self-driving in the future.” Tesla does not use the Drive PX system yet, but it could go that way.
Courtesy-Fud
Amazon Goes Droning
August 5, 2016
Amazon.com Inc announced that it has entered into a partnership with the British government to hasten the process of allowing small drones to make deliveries.
The world’s biggest online retailer, which has laid out plans to start using drones for deliveries by 2017, said a cross-government team supported by the UK Civil Aviation Authority had provided it with the permissions necessary to explore the process.
Amazon unveiled a video last year showcasing how an unmanned drone could deliver packages, narrated by former Top Gear TV host Jeremy Clarkson.
The U.S. Federal Aviation Administration said last month the use of drones for deliveries will require separate regulation from their general use.
Wal-Mart Stores Inc said last month it was six to nine months from beginning to use drones to check warehouse inventories in the United States.
Source-http://www.thegurureview.net/aroundnet-category/u-k-regulators-give-amazon-permission-to-explore-drone-deliveries.html
NFC For ATM Transactions Catching On
August 3, 2016
Several of the nation’s biggest banks in the U.S. now support the use of a smartphone to withdraw cash from an ATM — many by way of Near Field Communication (NFC) technology — instead of requiring customers to use a bank card.
One of the early adopters, Bank of America, said this week that it currently supports cardless technology at 2,800 of its ATMs, a number that will reach 8,000 ATMs relying on NFC and other technology by year’s end. Bank of America, which has about 15,000 ATMs nationwide, created a video to show how a smartphone loaded with the bank’s mobile app can now be used to withdraw cash from some ATMs.
Wells Fargo said it has a “handful” of NFC-ready ATMs working to deliver cash and other transactions, and it plans to reach 5,000 by the end of 2016, with a total of 12,000 ATMs enabled in 2017.
JPMorgan Chase said it also will have many cardless ATMs available this year, but didn’t specify how many or when. Initially, Chase customers will show up at an ATM and type in a numerical code obtained through the Chase smartphone app to get their cash. That numerical code verification will be an early step in rolling out cardless technology at the bank’s nearly 15,000 ATMs.
In addition to using NFC or a numerical code to authenticate a transaction, some bank ATMs are expected to rely on scanning a QR code displayed on a phone.
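None of the banks has published its protocol, but the Chase-style flow described above amounts to issuing a short-lived one-time code to the app and redeeming it once at the ATM. A toy sketch of that idea follows; every name and parameter here is illustrative, not any bank’s actual API.

```python
# Toy sketch of a short-lived, single-use numeric withdrawal code, along the
# lines of the flow described above. Everything here is illustrative: real
# deployments add customer authentication, server-side checks, HSM-backed
# storage, and audit logging.
import secrets
import time

CODE_TTL_SECONDS = 120                 # hypothetical validity window
_pending = {}                          # code -> (account_id, amount, expiry)

def issue_code(account_id: str, amount: int) -> str:
    """Mobile-banking backend: stage a withdrawal and hand the app a code."""
    code = f"{secrets.randbelow(10**7):07d}"         # 7-digit one-time code
    _pending[code] = (account_id, amount, time.time() + CODE_TTL_SECONDS)
    return code

def redeem_code(code: str):
    """ATM side: a code is valid exactly once, and only before it expires."""
    entry = _pending.pop(code, None)
    if entry is None:
        return None
    account_id, amount, expiry = entry
    return (account_id, amount) if time.time() <= expiry else None

code = issue_code("acct-123", 60)
print("dispense:", redeem_code(code))   # ('acct-123', 60)
print("replay:  ", redeem_code(code))   # None - single use
```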
The number of ATMs supporting cardless cash remains a small portion of the estimated 500,000 ATMs in the U.S. Crone Consulting, which monitors the mobile payment industry, recently said it expects about 95,000 ATMs in the U.S. to support cardless cash by year’s end.
Courtesy-http://www.thegurureview.net/mobile-category/nfc-for-atm-transactions-catching-on.html
Office 365 Subscription Growth Slows Significantly
August 1, 2016
Microsoft said that consumer subscriptions to Office 365 topped 23 million, a figure that shows the segment’s once-rapid year-over-year growth has slowed significantly.
The Redmond, Wash. company regularly talks up the latest subscription numbers for the consumer-grade Office 365 plans — the $100 a year Home and the $70 Personal — and did so again this week during an earnings call with Wall Street analysts.
“We also see momentum amongst consumers, with now more than 23 million Office 365 subscribers,” CEO Satya Nadella said Tuesday.
But analysis of Microsoft’s consumer Office 365 numbers showed that the rate of growth — or as Nadella put it, “momentum” — has slowed.
For the June quarter, the 23.1 million cited by Microsoft in its filing with the U.S. Securities & Exchange Commission (SEC) represented a 52% increase over the same period the year prior. Although most companies would give their eye teeth — or maybe a few executives — to boast of a rate of increase that size, it was the smallest since Microsoft began providing subscription data in early 2013.
A year before, the June 2015 quarter sported a consumer Office 365 subscription growth rate of 171% over the same three-month span in 2014.
The subscription increase also was small in absolute terms: Microsoft added approximately 900,000 to the rolls during the June quarter, down from 2.8 million the year before and also less than the 1.6 million accumulated in 2016’s March quarter.
The 900,000 additional subscribers added in the June quarter were the smallest number in more than two years.
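A quick back-of-the-envelope look at those figures, using only the numbers quoted above, shows the same picture.

```python
# Back-of-the-envelope look at the subscriber figures quoted above; the inputs
# are from the article, the derived base is simple arithmetic.
june_2016_subs = 23.1e6      # consumer Office 365 subscribers, June 2016 quarter
yoy_growth = 0.52            # 52% year-over-year growth

june_2015_subs = june_2016_subs / (1 + yoy_growth)
print(f"implied June 2015 base: ~{june_2015_subs / 1e6:.1f}M")   # ~15.2M

# Net additions cited in the article (millions), showing the slowdown.
for quarter, adds in (("June 2015", 2.8), ("March 2016", 1.6), ("June 2016", 0.9)):
    print(f"{quarter} quarter: +{adds}M subscribers")
```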
While Microsoft did not directly address the slowing of growth in the consumer Office 365 market, it did attribute a similar trend among corporate subscriptions to the difficulty of maintaining huge year-over-year percentage gains as the raw numbers of subscriptions increased.
Courtesy-http://www.thegurureview.net/aroundnet-category/microsofts-office-365-subscription-slows-signficantly.html
Tech Firms Form OTrP To Support IoT Security
A bunch of tech firms including ARM and Symantec have joined forces to create a security protocol designed to protect Internet of Things (IoT) devices.
The group, which also includes Intercede and Solacia, has created The Open Trust Protocol (OTrP) that is now available for download for prototyping and testing from the IETF website.
The OTrP is designed to bring a system-level root of trust to devices, using a secure architecture and trusted code management, much as apps containing sensitive information on smartphones and tablets are kept separate from the main OS.
This will allow IoT manufacturers to incorporate the technology into devices, ensuring that they are protected without having to give full access to a device OS.
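The announcement does not include OTrP’s message formats, but the underlying idea of root-of-trust code management can be illustrated with a toy install gate that only accepts code whose tag verifies against a key held by the device’s trusted environment. The sketch below uses a symmetric HMAC purely to stay self-contained; OTrP itself is built on asymmetric keys, certificates, and a defined protocol between service providers and devices.

```python
# Toy illustration of a root-of-trust install gate: code is only accepted if
# its tag verifies against a key provisioned in the device's trusted
# environment. A symmetric HMAC keeps the sketch self-contained; OTrP itself
# relies on asymmetric keys and certificates rather than a shared secret.
import hashlib
import hmac
import secrets

DEVICE_ROOT_KEY = secrets.token_bytes(32)    # would live in secure hardware

def sign_blob(blob: bytes, key: bytes = DEVICE_ROOT_KEY) -> bytes:
    """What a trusted service manager would do before shipping code."""
    return hmac.new(key, blob, hashlib.sha256).digest()

def install_trusted_app(blob: bytes, tag: bytes) -> bool:
    """Device-side check: refuse anything the root key did not vouch for."""
    expected = hmac.new(DEVICE_ROOT_KEY, blob, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

applet = b"trusted-applet-v1"
tag = sign_blob(applet)
print(install_trusted_app(applet, tag))                  # True  - accepted
print(install_trusted_app(b"tampered-applet", tag))      # False - rejected
```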
Marc Canel, vice president of security systems at ARM, explained that the OTrP will put security and trust at the core of the IoT.
“In an internet-connected world it is imperative to establish trust between all devices and service providers,” he said.
“Operators need to trust devices their systems interact with and OTrP achieves this in a simple way. It brings e-commerce trust architectures together with a high-level protocol that can be easily integrated with any existing platform.”
Brian Witten, senior director of IoT security at Symantec, echoed this sentiment. “The IoT and smart mobile technologies are moving into a range of diverse applications and it is important to create an open protocol to ease and accelerate adoption of hardware-backed security that is designed to protect onboard encryption keys,” he said.
The next stage is for the OTrP to be further developed by a standards-defining organisation after feedback from the wider technology community, so that it can become a fully interoperable standard suitable for mass adoption.
Courtesy-TheInq