Will Razer’s External Graphics Box Fail?
We first saw the Razer Core, an external graphics box that connects to a notebook via a Thunderbolt 3 port, back at CES 2016 in January, and today Razer has finally unveiled a few more details, including the price, availability date and compatibility.
At GDC 2016 in San Francisco, Razer announced that the Core will be ready in April at a price of US $499. As expected, it has only been validated on the Razer Blade Stealth and the newly introduced Razer Blade 2016 Edition notebooks, but since it uses the Thunderbolt 3 interface, it should be compatible with any other notebook, as long as the manufacturer wants to support it.
With dimensions of 105 x 353 x 220mm, the Razer Core is reasonably portable. It comes with a 500W PSU and features four USB 3.0 ports, Gigabit Ethernet and the Thunderbolt 3 port used to connect it to a notebook.
As far as graphics card support is concerned, Razer says the Core will work with any AMD Radeon graphics card since the Radeon 290 series, including the latest R9 Fury, R9 Nano and Radeon 300 series, as well as pretty much all Nvidia Maxwell-based graphics cards since the GeForce GTX 750/750 Ti, although we are not sure why you would pair a US $500 box with a US $130 graphics card. The maximum TDP for the graphics card is set at 375W, which rules out all dual-GPU solutions, so it will go as far as the R9 Fury X or the GTX Titan X.
There aren’t many notebooks that feature a Thunderbolt 3 port, and we have heard before that Thunderbolt 3 might have certain latency issues, which is probably why other manufacturers, such as MSI and Alienware, went with their own proprietary connectors. Of course, Razer has probably done the math, but we will keep a close eye on it when it ships in April. Both AMD and Nvidia are tweaking their drivers and already support external graphics, so it probably will not matter which graphics card you pick.
According to Razer, the Razer Core will be available in April, priced at US $499. Razer has already started taking pre-orders and offers a US $100 discount if you buy the Core together with one of its notebooks, the Razer Blade 2016 or the Blade Stealth.
Courtesy-Fud
MediaTek Shows Off The Helio X25 Chip
MediaTek has told Fudzilla that the Helio X25 SoC is not only real, but that it is a “turbo” version of the Helio X20.
Meizu is expected to be one of the first companies to use the X25. Last year it was also the first to use the MTK 6795T, in its Meizu MX5 phone. In that case the “T” suffix stood for Turbo, and the chip was clocked 200 MHz higher than the standard Helio X10 “non-T” version.
In 2016, MediaTek decided to use the new Helio X25 name because of a commercial arrangement. MediaTek didn’t name any of the partners, but confirmed that the CPU and GPU will be faster; it did not mention specific clock speeds. Judging from the Helio X20’s layout, we assume that the first “eXtreme performance” cluster will get a frequency boost, as will the GPU.
The Helio X25 will not have any architectural changes; it is simply a faster version of the X20, just as the MTK 6795T was a faster version of the MTK 6795. According to the company, the Helio X25 will be available in May.
This tri-cluster Helio X25 SoC has real potential and should be one of the most advanced mobile solutions when it hits the market. The first leaked scores of the Helio X20 suggest great performance, and the X25 should score even better. There should be a dozen design wins with the Helio X20/X25, most of them yet to be announced. A few Helio X25 announcements are expected soon, but at least we now know there will be an even faster version of the tri-cluster processor.
Courtesy-Fud
Is The GPU Market Going Down?
The global GPU market has fallen by 20 per cent over the last year.
According to Digitimes, it fell to fewer than 30 million units in 2015, and the outfit suffering most was AMD. The largest graphics card player, Palit Microsystems, which has several brands including Palit and Galaxy, shipped 6.9-7.1 million graphics cards in 2015, down 10 per cent on year. Asustek Computer shipped 4.5-4.7 million units in 2015, while Colorful shipped 3.9-4.1 million units and is aiming to raise its shipments by 10 per cent on year in 2016.
Micro-Star International (MSI) enjoyed healthy graphics card shipments of 3.45-3.55 million in 2015, up 15 per cent on year, and EVGA, which has a tight partnership with Nvidia, also saw significant shipment growth, while Gigabyte suffered a slight drop on year. Sapphire and PowerColor suffered dramatic drops in shipments in 2015.
There are fears that several of the smaller graphics card makers could be forced out of the market once AMD gets its act together with the arrival of Zen and Nvidia launches its next-generation GPU architecture later in 2016.
Courtesy-Fud
Microsoft Goes Quantum Computing
Software giant Microsoft is focusing a lot of its R&D money on quantum computing.
Peter Lee, corporate vice president of Microsoft Research, said that quantum computing is “stupendously exciting right now.”
Apparently it is Microsoft Research’s largest area of investment and Lee is pretty certain it is on the verge of some major scientific achievements.
“There’s just hope and optimism those scientific achievements will lead to practical outcomes. It’s hard to know when and where,” Lee said.
This is the first we have heard about Redmond’s quantum ambitions for a while. In 2014 the company revealed its “Station Q” group located on the University of California, Santa Barbara, campus, which has focused on quantum computing since its establishment a decade ago.
We sort of assumed that Microsoft would not get much work done on quantum states because, faced with the choice, most cats would rather die in a box than listen to Steve Ballmer. But we guess that with a more cat-friendly CEO it is moving ahead.
Lee said that he has explained quantum computing research to Microsoft chief executive Satya Nadella by comparing it with speech processing. In that field, Microsoft researchers worked “so hard for a decade with no practical improvement,” he said. Then deep learning brought about considerable leaps forward in speech recognition and Microsoft was in on the ground floor.
“With quantum, we’ve made just gigantic advancements making semiconductor interfacing, allowing semiconductor materials to operate as though they were superconducting. What that means is the possibility of semiconductors that can operate at extremely high clock rates with very, very little or no heat dissipation. It’s just really spectacular.”
Courtesy-Fud
U.S. Wants To Help Supercomputer Makers
Five of the top 12 high performance computing systems in the world are owned by U.S. national labs. But they are beyond reach, financially and technically, for many companies in the computing industry, even larger ones.
That’s according to U.S. Department of Energy (DOE) officials, who run the national labs. A new program aims to connect manufacturers with supercomputers and the expertise to use them.
The program initially provides $3 million for 10 industry projects, the DOE has announced. Whether it extends into future fiscal years may well depend on Congress.
The projects are all designed to improve efficiency, product development and energy use.
For instance, Procter & Gamble will get help to reduce the paper pulp in products by 20%, “which could result in significant cost and energy savings” in this energy-intensive industry, according to the project description.
Another firm, ZoomEssence, which produces “powder ingredients that capture all the key sensory components of a liquid,” will work to optimize the design of a new drying method using HPC simulations, according to the award description.
Other projects in the initial implementation of what is being called HPC4Mfg (HPC for Manufacturing) include an effort to help GlobalFoundries optimize transistor design.
In another, the Ohio Supercomputer Center and the Edison Welding Institute will develop a welding simulation tool.
The national labs not only have the hardware; “more importantly the labs have deep expertise in using HPC to help solve complex problems,” said Donna Crawford, the associate director of computation at Lawrence Livermore National Laboratory, in a conference call. They have the applications as well, she said.
HPC can be used to design and prototype products virtually that otherwise might require physical prototypes. These systems can run simulations and visualizations to discover, for instance, new energy-efficient manufacturing methods.
Source-http://www.thegurureview.net/computing-category/u-s-wants-to-help-supercomputer-makers.html
Is Microsoft A Risk?
Hewlett Packard Enterprise (HPE) has shone a light on what it believes to be the biggest risks facing enterprises, and included on that list is Microsoft.
We ain’t surprised, but it is quite a shocking and naked fact when you consider it. The naming and resulting shaming happens in the HPE Cyber Risk Report 2016, which HPE said “identifies the top security threats plaguing enterprises”.
Enterprises, it seems, have myriad problems, of which Microsoft is just one.
“In 2015, we saw attackers infiltrate networks at an alarming rate, leading to some of the largest data breaches to date, but now is not the time to take the foot off the gas and put the enterprise on lockdown,” said Sue Barsamian, senior vice president and general manager for security products at HPE.
“We must learn from these incidents, understand and monitor the risk environment, and build security into the fabric of the organisation to better mitigate known and unknown threats, which will enable companies to fearlessly innovate and accelerate business growth.”
Microsoft earned its place in the enterprise nightmare probably because of its ubiquity. Applications, malware and vulnerabilities are a real problem, and it is Windows that provides the platform for this havoc.
“Software vulnerability exploitation continues to be a primary vector for attack, with mobile exploits gaining traction. Similar to 2014, the top 10 vulnerabilities exploited in 2015 were more than one-year-old, with 68 percent being three years old or more,” explained the report.
“In 2015, Microsoft Windows represented the most targeted software platform, with 42 percent of the top 20 discovered exploits directed at Microsoft platforms and applications.”
It is not all bad news for Redmond, as the Google-operated Android is also put forward as a professional pain in the butt. So is iOS, before Apple users get any ideas.
“Malware has evolved from being simply disruptive to a revenue-generating activity for attackers. While the overall number of newly discovered malware samples declined 3.6 percent year over year, the attack targets shifted notably in line with evolving enterprise trends and focused heavily on monetisation,” added the firm.
“As the number of connected mobile devices expands, malware is diversifying to target the most popular mobile operating platforms. The number of Android threats, malware and potentially unwanted applications have grown to more than 10,000 new threats discovered daily, reaching a total year-over-year increase of 153 percent.
“Apple iOS represented the greatest growth rate with a malware sample increase of more than 230 percent.”
Courtesy-TheInq
Microsoft Goes Underwater
Technology giants are finding some of the strangest places for data centers these days.
Facebook, for example, built a data center in Lulea, Sweden, because the icy cold temperatures there help cut the energy required for cooling. A proposed Facebook data center in Clonee, Ireland, will rely heavily on locally available wind energy. Google’s data center in Hamina, Finland, uses sea water from the Bay of Finland for cooling.
Now, Microsoft is looking at locating data centers under the sea.
The company is testing underwater data centers with an eye to reducing data latency for the many users who live close to the sea and to enabling rapid deployment of new data centers.
Microsoft designed, built, and deployed its own subsea data center in the ocean in about a year. It started working on the project in late 2014, a year after Microsoft employee Sean James, who had served on a U.S. Navy submarine, submitted a paper on the concept.
A prototype vessel, named the Leona Philpot after an Xbox game character, operated on the seafloor about 1 kilometer from the Pacific coast of the U.S. from August to November 2015, according to a Microsoft page on the project.
The subsea data center experiment, called Project Natick after a town in Massachusetts, is in the research stage and Microsoft warns it is “still early days” to evaluate whether the concept could be adopted by the company and other cloud service providers.
“Project Natick reflects Microsoft’s ongoing quest for cloud datacenter solutions that offer rapid provisioning, lower costs, high responsiveness, and are more environmentally sustainable,” the company said.
Undersea data centers are attractive because they can serve the 50 percent of people who live within 200 kilometers of the ocean. Microsoft said in an FAQ that deployment in deepwater offers “ready access to cooling, renewable power sources, and a controlled environment.” Moreover, a data center can be deployed from start to finish in 90 days.
Courtesy- http://www.thegurureview.net/aroundnet-category/microsoft-goes-deep-with-underwater-data-center.html
FCC Approves Use Of BYOCB
In a sweeping change of course directed at a tightly controlled television industry, cable and satellite operators in the United States would be obligated to let their customers freely choose which set-top boxes they use, under a proposal announced by the Federal Communications Commission on Wednesday.
The move is expected to have wide-ranging implications for large technology companies looking to get their brand names into every consumer’s living room. For example, under the new rules, Google, Amazon and Apple would now be allowed to create entertainment room devices that blend Internet and cable programming in a way the television industry has until now resisted. Next-generation media players, including the Chromecast, Fire TV and Apple TV, would now be granted permission to line the backs of their devices with coaxial inputs and internal “smart access card” equivalents integrated right into device firmware with a simple subscription activation process.
As the Wall Street Journal notes, Senators Edward Markey of Massachusetts and Richard Blumenthal of Connecticut investigated the cable set-top box market last summer and found that the cable industry generates roughly $19.1 billion in annual revenue from cable box rentals alone.
Meanwhile, the cost of cable set-top boxes has risen 185 percent since 1995, while the cost of PCs, televisions and smartphones has dropped by 90 percent. FCC Chairman Tom Wheeler admits that these economies of scale don’t need to remain so unbalanced any longer.
The FCC says its focus will be primarily on improving the day-to-day television experience. In the past, the burdensome requirements of long-term contracts tethered to clunky, unsightly cable and satellite boxes have been a major source of customer complaints.
Wheeler has also said that access to specific video content shouldn’t be frustrating to the average consumer in an age where we are constantly surrounded by a breadth of information to sift through. “Improved search functions [can] lead consumers to a variety of video content that is buried behind guides or available on video services you can’t access with your set-top box today,” Wheeler says.
The FCC is expected to vote on the proposal on Thursday, February 18. FCC Chairman Tom Wheeler has also published a full statement on the commission’s new proposal.
Courtesy-Fud
iOS Developers Warned About Taking Shortcuts
Slapdash developers have been advised not to use the open source JSPatch method of updating their wares because it is as vulnerable as a soft-boiled egg, for various reasons.
It’s FireEye that is giving JSPatch the stink eye and providing the warning that it has rendered over 1,000 applications open to copy and paste theft of photos and other information. And it doesn’t end there.
FireEye’s report said that Remote Hot Patching may sound like a good idea at the time, but it really isn’t. It is so widely used that it has opened up a hole in 1,220 iOS applications, putting Apple users’ security at risk. A better option, according to the security firm, is to stick with the Apple method, which should provide adequate and timely protection.
“Within the realm of Apple-provided technologies, the way to remediate this situation is to rebuild the application with updated code to fix the bug and submit the newly built app to the App Store for approval,” said FireEye.
“While the review process for updated apps often takes less time than the initial submission review, the process can still be time-consuming and unpredictable, and can cause loss of business if app fixes are not delivered in a timely and controlled manner.
“However, if the original app is embedded with the JSPatch engine, its behaviour can be changed according to the JavaScript code loaded at runtime. This JavaScript file is remotely controlled by the app developer. It is delivered to the app through network communication.”
Let’s not all make this JSPatch’s problem, because presumably it’s developers who are lacking.
FireEye spoke up for the open source security gear while looking down its nose at hackers. “JSPatch is a boon to iOS developers. In the right hands, it can be used to quickly and effectively deploy patches and code updates. But in a non-utopian world like ours, we need to assume that bad actors will leverage this technology for unintended purposes,” the firm said.
“Specifically, if an attacker is able to tamper with the content of a JavaScript file that is eventually loaded by the app, a range of attacks can be successfully performed against an App Store application.”
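To make the mechanism concrete, here is a minimal sketch of the kind of JavaScript patch a JSPatch-enabled app might fetch and evaluate at runtime, swapping out a native method without going through App Store review. The class and behaviour being patched are hypothetical examples; only require, defineClass and the self.super() call come from JSPatch’s documented API.

    // Hypothetical patch script served from the developer's (or an attacker's) server.
    // The JSPatch engine inside the app evaluates it at runtime and replaces the
    // native Objective-C implementation of the named method.
    require('UIAlertView');                    // expose an Objective-C class to JS

    defineClass('ProfileViewController', {     // hypothetical class in the app
      viewDidLoad: function() {
        self.super().viewDidLoad();            // call the original implementation
        // New behaviour injected without an App Store update:
        var alert = UIAlertView.alloc().init();
        alert.setTitle('Patched at runtime');
        alert.show();
      }
    });

The same channel that lets a developer ship a quick fix this way is what FireEye is worried about: anyone who can tamper with that script on the server or in transit effectively controls the app’s behaviour.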
Courtesy-TheInq
Is Facebook Going Video?
Facebook is contemplating the development of a dedicated service or page where users will be able to watch videos without being bothered by other content.
The social network continues to see surging interest in video. During one day last quarter, its users watched a combined 100 million hours of video. Roughly 500 million users watch at least some video each day.
That’s a lot of video and a lot of viewers, and Facebook wants to capitalize on it.
“We are exploring a dedicated place on Facebook for when they just want to watch videos,” CEO Mark Zuckerberg said Wednesday during a conference call to discuss Facebook’s quarterly financial results.
But he was tight-lipped on how the video might actually be presented.
Asked if a stand-alone video app is in the cards, he mentioned the success of Messenger and a Facebook app for managing Pages. “I do think there are additional opportunities for this and we’ll continue looking at them,” he said.
Facebook wants to encourage more video viewing because it keeps users on the site longer, helping it to sell more ads.
“Marketers also really love video and it’s a compelling way to reach consumers,” COO Sheryl Sandberg said during the call.
Zuckerberg has been watching the growth of video for some time. At a town hall meeting in November 2014, he predicted, “In five years, most of [Facebook] will be video.”
And it’s likely that most of that video will be consumed over mobile networks.
Among Facebook’s heaviest users — the billion people who access it on a daily basis — 90 percent use a mobile device, either solely or in addition to their PC.
Its financial results for the fourth quarter were strong. Revenue was $5.8 billion, up 52 percent from the same period in 2014, while net profit more than doubled to $1.6 billion.
Source-http://www.thegurureview.net/aroundnet-category/facebook-exploring-a-dedicated-video-service.html