Microsoft Goes Quantum Computing
Software giant Microsoft is focusing a lot of its R&D money on quantum computing.
Peter Lee, corporate vice president of Microsoft Research, said that quantum computing is “stupendously exciting right now.”
It is apparently Microsoft Research’s largest area of investment, and Lee is confident it is on the verge of some major scientific achievements.
“There’s just hope and optimism those scientific achievements will lead to practical outcomes. It’s hard to know when and where,” Lee said.
This is the first we have heard about Redmond’s quantum ambitions for a while. In 2014 the company revealed its “Station Q” group located on the University of California, Santa Barbara, campus, which has focused on quantum computing since its establishment a decade ago.
We sort of assumed that Microsoft would not get much work done on quantum states because, faced with the choice, most cats would rather die in a box than listen to Steve Ballmer. But we guess that with a more cat-friendly CEO it is moving ahead.
Lee said that he has explained quantum computing research to Microsoft chief executive Satya Nadella by comparing it with speech processing. In that field, Microsoft researchers worked “so hard for a decade with no practical improvement,” he said. Then deep learning brought about considerable leaps forward in speech recognition and Microsoft was in on the ground floor.
“With quantum, we’ve made just gigantic advancements making semiconductor interfacing, allowing semiconductor materials to operate as though they were superconducting. What that means is the possibility of semiconductors that can operate at extremely high clock rates with very, very little or no heat dissipation. It’s just really spectacular.”
Courtesy-Fud
Intel Putting RealSense Into VR
March 16, 2016 by admin
Intel is adapting its RealSense depth camera into an augmented reality headset design that it might license to other manufacturers.
The plan is not official yet but appears to have been leaked to the Wall Street Journal. Achin Bhowmik, who oversees RealSense as vice president and general manager of Intel’s perceptual computing group, declined to discuss unannounced development efforts.
But he said Intel has a tradition of creating prototypes for products like laptop computers to help persuade customers to use its components. “We have to build the entire experience ourselves before we can convince the ecosystem,” Bhowmik said.
Intel’s augmented-reality ambitions first surfaced when it teamed up with IonVR to work on a headset that could work with a variety of operating systems, including Android and iOS. Naturally, it had a front-facing RealSense camera.
The RealSense depth camera has been in development for several years and was shown as a viable product technology at the Consumer Electronics Show in 2014. Since then, little has happened, and Microsoft’s Kinect sensor technology, used with Windows Hello in the Surface Pro 4 and Surface Book, knocked it aside.
Intel’s biggest issue is that it is talking about making a consumer product which is something that it never got the hang of.
RealSense technology is really good at translating real-world objects into virtual space, in fact a lot better than the HoloLens, because it can scan the user’s hands and turn them into virtual objects that can manipulate other virtual objects.
Courtesy-Fud
Samsung Brings 15TB SSD To Market
Samsung has now officially announced and started to ship its new Samsung PM1633a line of solid state drives for enterprise storage systems, which includes the highest capacity SSD ever made by Samsung, the 15.36TB PM1633a model.
Revealed back at the Flash Memory Summit in August 2015, the now available Samsung PM1633a enterprise SSD series is based on a standard 2.5-inch form factor and features a 12Gbps Serial Attached SCSI (SAS) interface. It also uses Samsung’s new controller as well as Samsung’s own 3rd generation 256Gb 48-layer TLC V-NAND.
As noted, the Samsung PM1633a lineup is based on Samsung’s 256Gb V-NAND flash chips. The 256Gb dies are stacked in 16 layers to form a single 512GB package and by adding up a total of 32 NAND packages, you get the 15.36TB model. According to Samsung, the 3rd generation 256Gb V-NAND will provide both significant performance as well as reliability improvements compared to the PM1633 drive which used 2nd generation 32-layer 128Gb V-NAND flash.
The controller has also been upgraded to concurrently access large amounts of high-density NAND flash and the PM1633a 15.36TB model comes with no less than 16GB of cache.
When it comes to performance, the Samsung PM1633a provides sequential read and write speeds of up to 1,200MB/s, while random 4K performance is rated at up to 200,000 IOPS for reads and up to 32,000 IOPS for writes. The new Samsung PM1633a enterprise SSD also offers high reliability, rated at 1 DWPD (drive write per day), meaning the full 15.36TB can be written every day without failure, which is quite important in the enterprise market.
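The capacity and endurance figures quoted above follow from straightforward arithmetic; here is a quick sketch (the five-year warranty period is our assumption for illustration, as Samsung did not state one):

```python
# Capacity: 256 Gb dies stacked 16-high form one 512 GB package;
# 32 such packages make up the drive.
die_gbit = 256
dies_per_package = 16
packages = 32

package_gbyte = die_gbit * dies_per_package / 8   # 512 GB per package
raw_tbyte = package_gbyte * packages / 1000       # 16.384 TB raw
print(f"raw capacity: {raw_tbyte:.3f} TB")        # over-provisioning leaves 15.36 TB usable

# Endurance: 1 DWPD means the full usable capacity can be written daily.
capacity_tb = 15.36
dwpd = 1
years = 5  # assumed warranty period, for illustration only
total_writes_pb = capacity_tb * dwpd * 365 * years / 1000
print(f"total writes over {years} years: {total_writes_pb:.1f} PB")
```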
While the 15.36TB model of the Samsung PM1633a is already shipping to select enterprise customers, Samsung is also promising a wide range of capacities, including 480GB, 960GB, 1.92TB, 3.84TB and 7.68TB. According to Samsung, enterprise managers can now fit twice as many drives in a standard 19-inch 2U rack compared to a 3.5-inch storage drive.
Unfortunately, Samsung did not reveal any details regarding the price but we doubt that such high capacity and performance will have a low price tag.
Courtesy-Fud
IBM Goes After Groupon
March 14, 2016 by admin
IBM has filed suit against online deals marketplace Groupon for infringing four of its patents, including two that emerged from Prodigy, the online service launched by IBM and partners ahead of the World Wide Web.
Groupon has built its business model on the use of IBM’s patents, according to the complaint filed Wednesday in the federal court for the District of Delaware. “Despite IBM’s repeated attempts to negotiate, Groupon refuses to take a license, but continues to use IBM’s property,” according to the computing giant, which is asking the court to order Groupon to halt further infringement and pay damages.
IBM alleges that websites under Groupon’s control and its mobile applications use the technology claimed by the patents-in-suit for online local commerce marketplaces to connect merchants to consumers by offering goods and services at a discount.
About a year ago, IBM filed a similar lawsuit around the same patents against online travel company Priceline and three subsidiaries.
To develop the Prodigy online service that IBM launched with partners in the 1980s, the inventors of U.S. patents 5,796,967 and 7,072,849 developed new methods for presenting applications and advertisements in an interactive service that would take advantage of the computing power of each user’s PC and reduce demand on host servers, such as those used by Prodigy, IBM said in its complaint against Groupon.
“The inventors recognized that if applications were structured to be comprised of ‘objects’ of data and program code capable of being processed by a user’s PC, the Prodigy system would be more efficient than conventional systems,” it added.
Groupon is also accused of infringing U.S. Patent No.5,961,601, which was developed to find a better way of preserving state information in Internet communications, such as between an online merchant and a customer, according to IBM. Online merchants can use the state information to keep track of a client’s product and service selections while the client is shopping and then use that information when the client decides to make a purchase, something that stateless Internet communications protocols like HTTP cannot offer, it added.
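The general idea behind preserving state over a stateless protocol is that the merchant attaches an identifier to each response and keeps the actual shopping state server-side. A minimal sketch of that general pattern (our illustration only, not the method the ’601 patent claims; all names here are hypothetical):

```python
import uuid

# Server-side session store: session id -> list of selected products.
sessions = {}

def handle_request(session_id=None, add_item=None):
    """Simulate one stateless request; state lives in the server-side store."""
    if session_id is None or session_id not in sessions:
        session_id = str(uuid.uuid4())   # issue a new identifier, cookie-style
        sessions[session_id] = []
    if add_item is not None:
        sessions[session_id].append(add_item)
    return session_id, list(sessions[session_id])

# Three independent requests behave like one continuous shopping session.
sid, cart = handle_request(add_item="laptop")
sid, cart = handle_request(sid, add_item="mouse")
sid, cart = handle_request(sid)  # checkout: the cart survives across requests
print(cart)  # ['laptop', 'mouse']
```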
Source- http://www.thegurureview.net/aroundnet-category/ibm-files-patent-infringement-lawsuit-against-groupon.html
U.S. Wants To Help Supercomputer Makers
Five of the top 12 high performance computing systems in the world are owned by U.S. national labs. But they are beyond reach, financially and technically, for many within the computing industry, even larger ones.
That’s according to U.S. Department of Energy (DOE) officials, who run the national labs. A new program aims to connect manufacturers with supercomputers and the expertise to use them.
This program provides $3 million, initially, for 10 industry projects, the DOE has announced. Whether the program extends into future fiscal years may well depend on Congress.
The projects are all designed to improve efficiency, product development and energy use.
For instance, Procter & Gamble will get help to reduce the paper pulp in products by 20%, “which could result in significant cost and energy savings” in this energy-intensive industry, according to the project description.
Another firm, ZoomEssence, which produces “powder ingredients that capture all the key sensory components of a liquid,” will work to optimize the design of a new drying method using HPC simulations, according to the award description.
Other projects in the initial implementation of what is being called HPC4Mfg (HPC for Manufacturing) include an effort to help GlobalFoundries optimize transistor design.
In another, the Ohio Supercomputer Center and the Edison Welding Institute will develop a welding simulation tool.
The national labs not only have the hardware; “more importantly the labs have deep expertise in using HPC to help solve complex problems,” said Donna Crawford, the associate director of computation at Lawrence Livermore National Laboratory, in a conference call. They have the applications as well, she said.
HPC can be used to design and prototype products virtually that otherwise might require physical prototypes. These systems can run simulations and visualizations to discover, for instance, new energy-efficient manufacturing methods.
Source-http://www.thegurureview.net/computing-category/u-s-wants-to-help-supercomputer-makers.html
Is Microsoft A Risk?
Hewlett Packard Enterprise (HPE) has shone a light on what it believes to be the biggest risks facing enterprises, and included on that list is Microsoft.
We ain’t surprised, but it is quite a stark fact when you consider it. The naming and resulting shaming happens in the HPE Cyber Risk Report 2016, which HPE said “identifies the top security threats plaguing enterprises”.
Enterprises, it seems, have myriad problems, of which Microsoft is just one.
“In 2015, we saw attackers infiltrate networks at an alarming rate, leading to some of the largest data breaches to date, but now is not the time to take the foot off the gas and put the enterprise on lockdown,” said Sue Barsamian, senior vice president and general manager for security products at HPE.
“We must learn from these incidents, understand and monitor the risk environment, and build security into the fabric of the organisation to better mitigate known and unknown threats, which will enable companies to fearlessly innovate and accelerate business growth.”
Microsoft earned its place in the enterprise nightmare probably because of its ubiquity. Applications, malware and vulnerabilities are a real problem, and it is Windows that provides the platform for this havoc.
“Software vulnerability exploitation continues to be a primary vector for attack, with mobile exploits gaining traction. Similar to 2014, the top 10 vulnerabilities exploited in 2015 were more than one-year-old, with 68 percent being three years old or more,” explained the report.
“In 2015, Microsoft Windows represented the most targeted software platform, with 42 percent of the top 20 discovered exploits directed at Microsoft platforms and applications.”
It is not all bad news for Redmond, as the Google-operated Android is also put forward as a professional pain in the butt. So is iOS, before Apple users get any ideas.
“Malware has evolved from being simply disruptive to a revenue-generating activity for attackers. While the overall number of newly discovered malware samples declined 3.6 percent year over year, the attack targets shifted notably in line with evolving enterprise trends and focused heavily on monetisation,” added the firm.
“As the number of connected mobile devices expands, malware is diversifying to target the most popular mobile operating platforms. The number of Android threats, malware and potentially unwanted applications have grown to more than 10,000 new threats discovered daily, reaching a total year-over-year increase of 153 percent.
“Apple iOS represented the greatest growth rate with a malware sample increase of more than 230 percent.”
Courtesy-TheInq
Microsoft Goes Underwater
Technology giants are finding some of the strangest places for data centers these days.
Facebook, for example, built a data center in Lulea in Sweden because the icy cold temperatures there would help cut the energy required for cooling. A proposed Facebook data center in Clonee, Ireland, will rely heavily on locally available wind energy. Google’s data center in Hamina in Finland uses sea water from the Bay of Finland for cooling.
Now, Microsoft is looking at locating data centers under the sea.
The company is testing underwater data centers with an eye to reducing data latency for the many users who live close to the sea and also to enable rapid deployment of a data center.
Microsoft designed, built, and deployed its own subsea data center in the ocean in about a year. It started working on the project in late 2014, a year after Microsoft employee Sean James, who had served on a U.S. Navy submarine, submitted a paper on the concept.
A prototype vessel, named the Leona Philpot after an Xbox game character, operated on the seafloor about 1 kilometer from the Pacific coast of the U.S. from August to November 2015, according to a Microsoft page on the project.
The subsea data center experiment, called Project Natick after a town in Massachusetts, is in the research stage and Microsoft warns it is “still early days” to evaluate whether the concept could be adopted by the company and other cloud service providers.
“Project Natick reflects Microsoft’s ongoing quest for cloud datacenter solutions that offer rapid provisioning, lower costs, high responsiveness, and are more environmentally sustainable,” the company said.
Using undersea data centers helps because they can serve the 50 percent of people who live within 200 kilometers of the ocean. Microsoft said in an FAQ that deployment in deepwater offers “ready access to cooling, renewable power sources, and a controlled environment.” Moreover, a data center can be deployed from start to finish in 90 days.
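The latency argument is easy to sanity-check: light in optical fiber travels at roughly two-thirds the speed of light in vacuum, so every extra 1,000 km of distance adds about 10 ms of round-trip time. A rough back-of-envelope sketch (propagation delay only, ignoring routing and queuing):

```python
# Speed of light in fiber is roughly 2e8 m/s (about 2/3 of c in vacuum).
FIBER_SPEED_M_S = 2.0e8

def round_trip_ms(distance_km):
    """Round-trip propagation delay over fiber, in milliseconds."""
    return 2 * (distance_km * 1000) / FIBER_SPEED_M_S * 1000

# A coastal user 200 km from an offshore data center vs. one 2,000 km inland:
print(f"{round_trip_ms(200):.1f} ms")    # 2.0 ms
print(f"{round_trip_ms(2000):.1f} ms")   # 20.0 ms
```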
Courtesy- http://www.thegurureview.net/aroundnet-category/microsoft-goes-deep-with-underwater-data-center.html
FCC Approves Use Of BYOCB
February 11, 2016 by admin
In a sweeping change of course directed at a tightly controlled television industry, cable and satellite operators in the United States will now be obligated to let their customers freely choose which set-top boxes they can use, according to a proposal announced by the Federal Communications Commission on Wednesday.
The move is expected to have wide-ranging implications for large technology companies looking to get their brand names into every consumer’s living room. For example, under the new rules, Google, Amazon and Apple would now be allowed to create entertainment room devices that blend Internet and cable programming in a way the television industry has until now resisted. Next-generation media players, including the Chromecast, Fire TV and Apple TV, would now be granted permission to line the backs of their devices with coaxial inputs and internal “smart access card” equivalents integrated right into device firmware with a simple subscription activation process.
As the Wall Street Journal notes, Senators Edward Markey of Massachusetts and Richard Blumenthal of Connecticut investigated the cable set-top box market last summer and found that the cable industry generates roughly $19.1 billion in annual revenue from cable box rentals alone.
Meanwhile, the cost of cable set-top boxes has risen 185 percent since 1995, while the cost of PCs, televisions and smartphones has dropped by 90 percent. FCC Chairman Tom Wheeler admits that these economies of scale don’t need to remain so unbalanced any longer.
The FCC says its focus will be primarily on improving the day-to-day television experience. In the past, the burdensome requirements of long-term contracts tethered to clunky, unsightly cable and satellite boxes have been a major source of customer complaints.
Wheeler has also said that access to specific video content shouldn’t be frustrating to the average consumer in an age where we are constantly surrounded by a breadth of information to sift through. “Improved search functions [can] lead consumers to a variety of video content that is buried behind guides or available on video services you can’t access with your set-top box today,” Wheeler says.
The FCC is expected to vote on the proposal on Thursday, February 18th. FCC Chairman Tom Wheeler has also released a full statement on the commission’s new proposal.
Courtesy-Fud
iOS Developers Warned About Taking Shortcuts
Slapdash developers have been advised not to use the open source JSPatch method of updating their wares because it is as vulnerable as a soft boiled egg, for various reasons.
It’s FireEye that is giving JSPatch the stink eye and providing the warning that it has rendered over 1,000 applications open to copy and paste theft of photos and other information. And it doesn’t end there.
FireEye’s report said that Remote Hot Patching may sound like a good idea at the time, but it really isn’t. The technique is so widely used that it has opened a hole in Apple users’ security spanning 1,220 iOS applications. A better option, according to the security firm, is to stick with the Apple method, which should provide adequate and timely protection.
“Within the realm of Apple-provided technologies, the way to remediate this situation is to rebuild the application with updated code to fix the bug and submit the newly built app to the App Store for approval,” said FireEye.
“While the review process for updated apps often takes less time than the initial submission review, the process can still be time-consuming and unpredictable, and can cause loss of business if app fixes are not delivered in a timely and controlled manner.
“However, if the original app is embedded with the JSPatch engine, its behaviour can be changed according to the JavaScript code loaded at runtime. This JavaScript file is remotely controlled by the app developer. It is delivered to the app through network communication.”
Let’s not all make this JSPatch’s problem, because presumably it’s developers who are lacking.
FireEye spoke up for the open source security gear while looking down its nose at hackers. “JSPatch is a boon to iOS developers. In the right hands, it can be used to quickly and effectively deploy patches and code updates. But in a non-utopian world like ours, we need to assume that bad actors will leverage this technology for unintended purposes,” the firm said.
“Specifically, if an attacker is able to tamper with the content of a JavaScript file that is eventually loaded by the app, a range of attacks can be successfully performed against an App Store application.”
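The underlying risk is running whatever script happens to arrive over the network. One common mitigation, sketched below as our own illustration rather than anything JSPatch itself provides, is to pin a cryptographic hash of the expected patch and refuse to run anything that doesn’t match:

```python
import hashlib

# The digest of the patch the developer actually shipped, pinned in the app.
EXPECTED_SHA256 = hashlib.sha256(b"legitimate patch code").hexdigest()

def verify_patch(downloaded: bytes, expected_hex: str) -> bool:
    """Only accept a patch whose SHA-256 digest matches the pinned value."""
    return hashlib.sha256(downloaded).hexdigest() == expected_hex

# The genuine patch passes; a tampered one is rejected before execution.
assert verify_patch(b"legitimate patch code", EXPECTED_SHA256)
assert not verify_patch(b"attacker-tampered code", EXPECTED_SHA256)
```

A real deployment would more likely use a digital signature than a pinned hash, so that legitimate patches can change, but the principle is the same: authenticate the script before executing it.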
Courtesy-TheInq
Is Facebook Going Video?
February 9, 2016 by admin
Facebook is contemplating the development of a dedicated service or page where users will be able to watch videos without being bothered by other content.
The social network continues to see surging interest in video. During one day last quarter, its users watched a combined 100 million hours of video. Roughly 500 million users watch at least some video each day.
That’s a lot of video and a lot of viewers, and Facebook wants to capitalize on it.
“We are exploring a dedicated place on Facebook for when they just want to watch videos,” CEO Mark Zuckerberg said Wednesday during a conference call to discuss Facebook’s quarterly financial results.
But he was tight-lipped on how the video might actually be presented.
Asked if a stand-alone video app is in the cards, he mentioned the success of Messenger and a Facebook app for managing Pages. “I do think there are additional opportunities for this and we’ll continue looking at them,” he said.
Facebook wants to encourage more video viewing because it keeps users on the site longer, helping it to sell more ads.
“Marketers also really love video and it’s a compelling way to reach consumers,” COO Sheryl Sandberg said during the call.
Zuckerberg has been watching the growth of video for some time. At a town hall meeting in November 2014, he predicted, “In five years, most of [Facebook] will be video.”
And it’s likely that most of that video will be consumed over mobile networks.
Among Facebook’s heaviest users — the billion people who access it on a daily basis — 90 percent use a mobile device, either solely or in addition to their PC.
Its financial results for the fourth quarter were strong. Revenue was $5.8 billion, up 52 percent from the same period in 2014, while net profit more than doubled to $1.6 billion.
Source- http://www.thegurureview.net/aroundnet-category/facebook-exploring-a-dedicated-video-service.html