Qualcomm Jumps Into VR
Qualcomm has thrown its hat into the virtual reality (VR) ring with the launch of the Snapdragon VR SDK for Snapdragon-based smartphones and VR headsets.
The SDK gives developers access to advanced VR features, according to Qualcomm, allowing them to simplify development and attain improved performance and power efficiency with Qualcomm’s Snapdragon 820 processor, found in Android smartphones such as the Galaxy S7 and tipped to feature in upcoming VR headsets.
In terms of features, the development kit offers tools such as digital signal processing (DSP) sensor fusion, which allows devs to use the “full breadth” of technologies built into the Snapdragon 820 chip to create more responsive and immersive experiences.
It will help developers combine high-frequency inertial data from gyroscopes and accelerometers, and there is what the company calls “predictive head position processing” based on its Hexagon DSP. Qualcomm’s Symphony System Manager, meanwhile, provides easier access to power and performance management for more stable frame rates in VR applications running on less powerful devices.
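Qualcomm has not published the SDK's internals, but the underlying idea of fusing fast-but-drifting gyroscope readings with noisy-but-gravity-referenced accelerometer readings is standard practice in head tracking. A minimal complementary-filter sketch, with made-up helper names, sample rates and latency figures rather than anything from the Snapdragon VR SDK, shows the principle:

```python
import math

# Minimal complementary-filter sketch for tracking head pitch.
# Sample rate, blend weight and latency are illustrative values,
# not figures from Qualcomm's SDK.

DT = 1.0 / 800.0   # assumed 800 Hz IMU sample period
ALPHA = 0.98       # trust the fast-but-drifting gyro this much

def fuse_pitch(pitch, gyro_rate_y, accel_x, accel_z):
    """Blend high-frequency gyro data with a low-frequency gravity reference."""
    gyro_pitch = pitch + gyro_rate_y * DT        # integrate gyro (drifts over time)
    accel_pitch = math.atan2(accel_x, accel_z)   # gravity-based estimate (noisy)
    return ALPHA * gyro_pitch + (1 - ALPHA) * accel_pitch

def predict_pitch(pitch, gyro_rate_y, latency_s=0.018):
    """Crude 'predictive head position': extrapolate by the expected display latency."""
    return pitch + gyro_rate_y * latency_s
```

Each IMU sample nudges the fused angle toward the accelerometer's gravity reference while trusting the gyro over short timescales, and the prediction step extrapolates the pose by the expected motion-to-photon latency so a frame is rendered for where the head will be rather than where it was.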
Fast motion-to-photon support offers single-buffer rendering to reduce latency by up to 50 percent, while stereoscopic rendering with lens correction supports 3D binocular vision with color correction and barrel distortion, improving the visual quality of graphics and video and enhancing the overall VR experience.
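Lens correction of this sort typically pre-warps each eye's image with a radial polynomial so the headset's lens optically undoes the distortion, and applying slightly different coefficients per color channel approximates the color correction. A toy sketch with purely illustrative coefficients, not values from the SDK:

```python
# Toy radial (barrel) distortion of normalised image coordinates.
# k1/k2 are example coefficients, not values from the Snapdragon VR SDK.

def barrel_distort(x, y, k1=0.22, k2=0.24):
    """Pre-warp a point so the headset lens cancels the distortion."""
    r2 = x * x + y * y                       # squared distance from the lens centre
    scale = 1.0 + k1 * r2 + k2 * r2 * r2     # radial polynomial
    return x * scale, y * scale

def distort_rgb(x, y):
    """Warp each colour channel slightly differently to mimic chromatic correction."""
    return {
        "r": barrel_distort(x, y, 0.21, 0.23),
        "g": barrel_distort(x, y, 0.22, 0.24),
        "b": barrel_distort(x, y, 0.23, 0.25),
    }
```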
Rounding off the features is VR layering, which improves overlays in a virtual world to reduce distortion.
David Durnil, senior director of engineering at Qualcomm, said: “We’re providing advanced tools and technologies to help developers significantly improve the virtual reality experience for applications like games, 360 degree VR videos and a variety of interactive education and entertainment applications.
“VR represents a new paradigm for how we interact with the world, and we’re excited to help mobile VR developers more efficiently deliver compelling and high-quality experiences on upcoming Snapdragon 820 VR-capable Android smartphones and headsets.”
The Snapdragon VR SDK will be available to developers in the second quarter through the Qualcomm Developer Network.
The launch of Qualcomm’s VR SDK comes just moments after AMD also entered the VR arena with the launch of the Sulon Q, a VR-ready wearable Windows 10 PC.
Courtesy-TheInq
Cisco Fixes Major Flaw
Cisco has patched high-impact vulnerabilities in several of its cable modem and residential gateway devices, the kind of kit ISPs commonly distribute to their customers.
The embedded Web server in the Cisco Cable Modem with Digital Voice models DPC2203 and EPC2203 contains a buffer overflow vulnerability that can be exploited remotely without authentication. Apparently all an attacker needs to do is send a crafted HTTP request to the Web server to achieve arbitrary code execution.
Cisco said that its customers should contact their service providers to ensure that the software version installed on their devices includes the patch for this issue.
The Web-based administration interfaces of the Cisco DPC3941 Wireless Residential Gateway with Digital Voice and Cisco DPC3939B Wireless Residential Voice Gateway are affected by a vulnerability that could lead to information disclosure. An unauthenticated, remote attacker could exploit the flaw by sending a specially crafted HTTP request to an affected device in order to obtain sensitive information from it.
The Cisco Model DPQ3925 8×4 DOCSIS 3.0 Wireless Residential Gateway with EDVA is affected by a separate vulnerability, also triggered by malicious HTTP requests, that could lead to a denial-of-service attack.
Hackers have been hitting modems, routers and other gateway devices hard lately, especially those distributed by ISPs to their customers. By compromising such devices, attackers can snoop on, hijack or disrupt network traffic, or attack other devices inside local networks.
Courtesy-Fud
Microsoft Goes Quantum Computing
Software giant Microsoft is focusing a lot of its R&D money on quantum computing.
Peter Lee, corporate vice president of Microsoft Research, said that quantum computing is “stupendously exciting right now.”
Apparently it is Microsoft Research’s largest area of investment and Lee is pretty certain it is on the verge of some major scientific achievements.
“There’s just hope and optimism those scientific achievements will lead to practical outcomes. It’s hard to know when and where,” Lee said.
This is the first we have heard about Redmond’s quantum ambitions for a while. In 2014 the company revealed its “Station Q” group located on the University of California, Santa Barbara, campus, which has focused on quantum computing since its establishment a decade ago.
We had sort of assumed that Microsoft would not get much work done on quantum states because, faced with the choice, most cats would rather die in a box than listen to Steve Ballmer. But we guess that with a more cat-friendly CEO it is moving ahead.
Lee said that he has explained quantum computing research to Microsoft chief executive Satya Nadella by comparing it with speech processing. In that field, Microsoft researchers worked “so hard for a decade with no practical improvement,” he said. Then deep learning brought about considerable leaps forward in speech recognition and Microsoft was in on the ground floor.
“With quantum, we’ve made just gigantic advancements making semiconductor interfacing, allowing semiconductor materials to operate as though they were superconducting. What that means is the possibility of semiconductors that can operate at extremely high clock rates with very, very little or no heat dissipation. It’s just really spectacular.”
Courtesy-Fud
Intel Putting RealSense Into VR
Intel is adapting its RealSense depth camera into an augmented reality headset design which it may license to other manufacturers.
The plan is not official yet but appears to have been leaked to the Wall Street Journal. Achin Bhowmik, who oversees RealSense as vice president and general manager of Intel’s perceptual computing group, declined to discuss unannounced development efforts.
But he said Intel has a tradition of creating prototypes for products like laptop computers to help persuade customers to use its components. “We have to build the entire experience ourselves before we can convince the ecosystem,” Bhowmik said.
Intel already appeared to be working on an augmented-reality headset when it teamed up with IonVR on a device that could work with a variety of operating systems, including Android and iOS. Naturally, it had a front-facing RealSense camera.
The RealSense depth camera has been in development for several years and was shown as a viable product technology at the Consumer Electronics Show in 2014. Since then not much has happened, and Microsoft’s Kinect sensor technology, used for Windows Hello in the Surface Pro 4 and Surface Book, has knocked it aside.
Intel’s biggest issue is that it is talking about making a consumer product, something it has never quite got the hang of.
RealSense technology is very good at translating real-world objects into virtual space. In fact, it is a lot better at this than HoloLens because it can scan the user’s hands and turn them into virtual objects that can manipulate other virtual objects.
Courtesy-Fud
IBM Goes After Groupon
IBM has filed suit against online deals marketplace Groupon for infringing four of its patents, including two that emerged from Prodigy, the online service launched by IBM and partners ahead of the World Wide Web.
Groupon has built its business model on the use of IBM’s patents, according to the complaint filed Wednesday in the federal court for the District of Delaware. “Despite IBM’s repeated attempts to negotiate, Groupon refuses to take a license, but continues to use IBM’s property,” according to the computing giant, which is asking the court to order Groupon to halt further infringement and pay damages.
IBM alleges that websites under Groupon’s control and its mobile applications use the technology claimed by the patents-in-suit for online local commerce marketplaces to connect merchants to consumers by offering goods and services at a discount.
About a year ago, IBM filed a similar lawsuit around the same patents against online travel company Priceline and three subsidiaries.
To develop the Prodigy online service that IBM launched with partners in the 1980s, the inventors of U.S. patents 5,796,967 and 7,072,849 developed new methods for presenting applications and advertisements in an interactive service that would take advantage of the computing power of each user’s PC and reduce demand on host servers, such as those used by Prodigy, IBM said in its complaint against Groupon.
“The inventors recognized that if applications were structured to be comprised of ‘objects’ of data and program code capable of being processed by a user’s PC, the Prodigy system would be more efficient than conventional systems,” it added.
Groupon is also accused of infringing U.S. Patent No. 5,961,601, which describes a better way of preserving state information in Internet communications, such as between an online merchant and a customer, according to IBM. Online merchants can use the state information to keep track of a client’s product and service selections while the client is shopping and then use that information when the client decides to make a purchase, something that stateless Internet communications protocols like HTTP cannot offer on their own, it added.
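The territory the ’601 patent covers is familiar from any shopping-cart implementation: because HTTP itself is stateless, the merchant has to carry state between requests, commonly via a session identifier in a cookie that keys server-side data. A generic sketch of that pattern, purely illustrative and not taken from the patent or from Groupon’s code:

```python
# Generic sketch of keeping shopping state across stateless HTTP requests:
# the server mints a session ID, returns it in a cookie, and keys the cart
# off that ID on later requests. Illustrative only; not the patented method.

import uuid

sessions = {}  # session_id -> list of selected items (in-memory store)

def handle_request(cookie_session_id, add_item=None):
    session_id = cookie_session_id
    if session_id not in sessions:
        session_id = str(uuid.uuid4())         # new visitor: mint an ID
        sessions[session_id] = []
    if add_item:
        sessions[session_id].append(add_item)  # remember the selection
    # The Set-Cookie header sends the ID back so the next request carries it.
    return {"Set-Cookie": f"session={session_id}"}, sessions[session_id]

# Two requests from the same shopper, linked only by the cookie value.
headers, cart = handle_request(None, add_item="half-price spa day")
session_id = headers["Set-Cookie"].split("=", 1)[1]
_, cart = handle_request(session_id, add_item="pizza deal")
print(cart)  # ['half-price spa day', 'pizza deal']
```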
Source- http://www.thegurureview.net/aroundnet-category/ibm-files-patent-infringement-lawsuit-against-groupon.html
Toshiba Announces New Line Of SSDs
Toshiba has announced its newest line of consumer grade SSDs based on 15nm TLC NAND, the Toshiba SG5 SSD series.
The new Toshiba SG5 SSD series will be available in 128GB, 256GB, 512GB and 1TB capacities as well as a couple of different form-factors, standard 2.5-inch and two different M.2 form-factors.
As noted, the Toshiba SG5 SSD series is based on 15nm TLC NAND paired with an as-yet-undetailed controller, and will offer sequential performance of up to 545MB/s for reads and up to 388MB/s for writes.
The 2.5-inch version of the Toshiba SG5 SSD series will be available in all aforementioned capacities, the M.2 2280-S2 (single side) form-factor version will be available in 128GB, 256GB and 512GB capacities and the M.2 2280-D2 (double side) version will only come in 1TB capacity.
The rest of the features are pretty standard for a consumer-grade SSD, so you are looking at power consumption of 4.5W to 5.6W under load and 0.65W at idle, plus Toshiba’s QSBC (Quadruple Swing-By Code) error correction technology.
Unfortunately, Toshiba did not unveil any details regarding the actual price of the new SG5 series SSDs but did say that it should be available sometime during this quarter.
Courtesy-Fud
Will Intel Release Its 10nm Processors By 2017?
Intel has said that a job advert which implied that it would not be using the 10nm process for two years was inaccurate and confirmed that it is on track for a 2017 release.
The advert, which was spotted by the Motley Fool and has since been taken down, said the company’s 10-nanometer chip manufacturing technology would begin mass production “approximately two years” from the posting date.
Intel has said that the advert was wrong and confirmed that its “first 10-nanometer product is planned for the second half of 2017.”
Intel is not expected to roll out 10-nanometer server chips in 2017. At the moment the plan appears to be to introduce its second-generation 14-nanometer server chip family in early to mid-2017, while the 10-nanometer process is ramped to high yields on the PC market so that 10-nanometer server processors will be ready for the first half of 2018.
This follows Intel’s traditional pattern of having a few parts released as it experiments with the new tech. This is what happened in the first year of Intel’s 14-nanometer availability.
Courtesy-Fud
U.S. Wants To Help Supercomputer Makers
Five of the top 12 high performance computing systems in the world are owned by U.S. national labs, but they are beyond reach, financially and technically, for many companies in the computing industry, even larger ones.
That’s according to U.S. Department of Energy (DOE) officials, who run the national labs. A new program aims to connect manufacturers with supercomputers and the expertise to use them.
This program provides $3 million, initially, for 10 industry projects, the DOE has announced. Whether the program extends into future fiscal years may well depend on Congress.
The projects are all designed to improve efficiency, product development and energy use.
For instance, Procter & Gamble will get help to reduce the paper pulp in products by 20%, “which could result in significant cost and energy savings” in this energy-intensive industry, according to the project description.
Another firm, ZoomEssence, which produces “powder ingredients that capture all the key sensory components of a liquid,” will work to optimize the design of a new drying method using HPC simulations, according to the award description.
Some of the other projects in the initial implementation of what is being called HPC4Mfg (HPC for Manufacturing) include an effort to help GlobalFoundries optimize transistor design.
In another, the Ohio Supercomputer Center and the Edison Welding Institute will develop a welding simulation tool.
The national labs not only have the hardware; “more importantly the labs have deep expertise in using HPC to help solve complex problems,” said Donna Crawford, the associate director of computation at Lawrence Livermore National Laboratory, in a conference call. They have the applications as well, she said.
HPC can be used to virtually design and prototype products that might otherwise require physical prototypes. These systems can run simulations and visualizations to discover, for instance, new energy-efficient manufacturing methods.
Source-http://www.thegurureview.net/computing-category/u-s-wants-to-help-supercomputer-makers.html
Is Microsoft A Risk?
Hewlett Packard Enterprise (HPE) has shone a light on what it believes to be the biggest risks facing enterprises, and included on that list is Microsoft.
We ain’t surprised, but it is quite a shocking and naked fact when you consider it. The naming and resulting shaming happens in the HPE Cyber Risk Report 2016, which HPE said “identifies the top security threats plaguing enterprises”.
Enterprises, it seems, have myriad problems, of which Microsoft is just one.
“In 2015, we saw attackers infiltrate networks at an alarming rate, leading to some of the largest data breaches to date, but now is not the time to take the foot off the gas and put the enterprise on lockdown,” said Sue Barsamian, senior vice president and general manager for security products at HPE.
“We must learn from these incidents, understand and monitor the risk environment, and build security into the fabric of the organisation to better mitigate known and unknown threats, which will enable companies to fearlessly innovate and accelerate business growth.”
Microsoft earned its place in the enterprise nightmare probably because of its ubiquity. Applications, malware and vulnerabilities are a real problem, and it is Windows that provides the platform for this havoc.
“Software vulnerability exploitation continues to be a primary vector for attack, with mobile exploits gaining traction. Similar to 2014, the top 10 vulnerabilities exploited in 2015 were more than one-year-old, with 68 percent being three years old or more,” explained the report.
“In 2015, Microsoft Windows represented the most targeted software platform, with 42 percent of the top 20 discovered exploits directed at Microsoft platforms and applications.”
It is not all bad news for Redmond, as the Google-operated Android is also put forward as a professional pain in the butt. So is iOS, before Apple users get any ideas.
“Malware has evolved from being simply disruptive to a revenue-generating activity for attackers. While the overall number of newly discovered malware samples declined 3.6 percent year over year, the attack targets shifted notably in line with evolving enterprise trends and focused heavily on monetisation,” added the firm.
“As the number of connected mobile devices expands, malware is diversifying to target the most popular mobile operating platforms. The number of Android threats, malware and potentially unwanted applications have grown to more than 10,000 new threats discovered daily, reaching a total year-over-year increase of 153 percent.
“Apple iOS represented the greatest growth rate with a malware sample increase of more than 230 percent.”
Courtesy-TheInq
Microsoft Goes Underwater
Technology giants are finding some of the strangest places for data centers these days.
Facebook, for example, built a data center in Lulea in Sweden because the icy cold temperatures there would help cut the energy required for cooling. A proposed Facebook data center in Clonee, Ireland, will rely heavily on locally available wind energy. Google’s data center in Hamina in Finland uses sea water from the Bay of Finland for cooling.
Now, Microsoft is looking at locating data centers under the sea.
The company is testing underwater data centers with an eye to reducing data latency for the many users who live close to the sea and also to enable rapid deployment of a data center.
Microsoft designed, built, and deployed its own subsea data center in the ocean in the space of about a year, having started work on the project in late 2014, a year after Microsoft employee Sean James, who served on a U.S. Navy submarine, submitted a paper on the concept.
A prototype vessel, named the Leona Philpot after an Xbox game character, operated on the seafloor about 1 kilometer from the Pacific coast of the U.S. from August to November 2015, according to a Microsoft page on the project.
The subsea data center experiment, called Project Natick after a town in Massachusetts, is in the research stage and Microsoft warns it is “still early days” to evaluate whether the concept could be adopted by the company and other cloud service providers.
“Project Natick reflects Microsoft’s ongoing quest for cloud datacenter solutions that offer rapid provisioning, lower costs, high responsiveness, and are more environmentally sustainable,” the company said.
Undersea data centers help because they can serve the 50 percent of people who live within 200 kilometers of the ocean. Microsoft said in an FAQ that deployment in deep water offers “ready access to cooling, renewable power sources, and a controlled environment.” Moreover, a data center can be deployed from start to finish in 90 days.
Courtesy- http://www.thegurureview.net/aroundnet-category/microsoft-goes-deep-with-underwater-data-center.html