Red Hat Buys Inktank
Red Hat has announced that it bought storage system provider Inktank.
Inktank is the company behind Ceph, the cloud-based object and block storage software package used in a number of OpenStack cloud configurations.
The deal, worth $175m, will see Ceph marketed alongside Red Hat’s own GlusterFS, and the company does not believe it will adversely affect its financial forecasts for the year.
In a statement, Brian Stevens, EVP and CTO of Red Hat said, “We’re thrilled to welcome Inktank to the Red Hat family. They have built an incredibly vibrant community that will continue to be nurtured as we work together to make open the de facto choice for software-defined storage. Inktank has done a brilliant job assembling a strong ecosystem around Ceph and we look forward to expanding on this success together.”
As part of the deal, Ceph’s monitoring and diagnostics tool, Calamari, will also become open source, allowing users to add their own modules and functionality.
Inktank founder Sage Weil used his blog to assure users that the two storage systems will be treated with equal respect. “Red Hat intends to administer the Ceph trademark in a manner that protects the ecosystem as a whole and creates a level playing field where everyone is held to the same standards of use.”
Red Hat made the announcement fresh from Red Hat Summit, where the company reaffirmed that its Linux distribution is the one of choice at CERN’s Large Hadron Collider in Switzerland.
The Inktank deal is set to close later this month.
ARM To Focus On 64-bit SoCs
ARM announced its first 64-bit cores a while ago and SoC makers have already rolled out several 64-bit designs. However, apart from Apple, nobody has consumer-oriented 64-bit ARM devices on the market just yet. They are slowly starting to show up, and ARM says the transition to 64-bit parts is accelerating. That said, the first wave of 64-bit ARM parts is not going after the high-end market.
Is 64-bit support on entry-level SoCs just a gimmick?
This trend raises a rather obvious question – are low-end ARMv8 parts just a marketing gimmick, or do they really offer a significant performance gain? There is no straight answer at this point. It will depend on Google and the chipmakers themselves, as well as phone makers.
Qualcomm announced its first 64-bit part late last year. The Snapdragon 410 won’t turn many heads – it is aimed at $150 phones and based on Cortex A53 cores. It also has LTE, which makes it rather interesting.
MediaTek is taking a similar approach. Its quad-core MT6732 and octa-core MT6752 parts are Cortex A53 designs, too. Both sport LTE connectivity.
Qualcomm and MediaTek appear to be going after the same market – $100 to $150 phones with LTE and quad-core 64-bit stickers on the box. Marketers should like the idea, as they’re getting a few good buzzwords for entry-level gear.
However, we still don’t know much about their real-world performance. Don’t expect anything spectacular. The Cortex A53 is basically the 64-bit successor to the frugal Cortex A7. The A53 has a bit more cache and 40-bit physical addresses, and it ends up a bit faster than the A7, but not by much. ARM says the A7 delivers 1.9 DMIPS/MHz per core, while the A53 churns out 2.3 DMIPS/MHz. That puts it in the ballpark of the good old Cortex A9. The first consumer-oriented quad-core Cortex A9 part was Nvidia’s Tegra 3, so in theory a Cortex A53 quad-core could be as fast as a Tegra 3 clock-for-clock, but at 28nm we should see somewhat higher clocks, along with better graphics.
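Purely as a back-of-the-envelope illustration, ARM’s per-core ratings can be multiplied out by clock speed and core count. The sketch below uses the DMIPS/MHz figures quoted above plus ARM’s commonly cited 2.5 DMIPS/MHz for the Cortex A9; the clock speeds are hypothetical, chosen only to make the comparison concrete.

```python
# Back-of-the-envelope DMIPS comparison. Per-core ratings are ARM's quoted
# figures; the clock speeds are illustrative assumptions, not shipping specs.
def total_dmips(dmips_per_mhz, mhz, cores):
    """Aggregate DMIPS for a multi-core part at a given clock."""
    return dmips_per_mhz * mhz * cores

parts = {
    "Cortex A7 quad @ 1.2GHz":                (1.9, 1200, 4),
    "Cortex A53 quad @ 1.4GHz":               (2.3, 1400, 4),
    "Cortex A9 quad @ 1.3GHz (Tegra 3 class)": (2.5, 1300, 4),
}

for name, (rating, mhz, cores) in parts.items():
    print(f"{name}: {total_dmips(rating, mhz, cores):,.0f} DMIPS")
```

On those assumptions a Cortex A53 quad lands at roughly 12,900 DMIPS against about 13,000 for a Tegra 3 class Cortex A9 quad, which is why the two end up in the same ballpark despite the A53’s much smaller core.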
That’s not bad for $100 to $150 devices. LTE support is just the icing on the cake. Keep in mind that the Cortex A7 is ARM’s most efficient 32-bit core, hence we expect nothing less from the Cortex A53.
The Cortex A57 conundrum
Speaking to CNET’s Brooke Crothers, ARM executive vice president of corporate strategy Tom Lantzsch said the company was surprised by strong demand for 64-bit designs.
“Certainly, we’ve had big uptick in demand for mobile 64-bit products. We’ve seen this with our [Cortex] A53, a high-performance 64-bit mobile processor,” Lantzsch told CNET.
He said ARM has been surprised by the pace of 64-bit adoption, with mobile parts coming from Qualcomm, MediaTek and Marvell. He said he hopes to see 64-bit phones by Christmas, although we suspect the first entry-level products will appear much sooner.
Lantzsch points out that even 32-bit code will run more efficiently on 64-bit ARMv8 parts. As software support improves, the performance gains will become more evident.
But where does this leave the Cortex A57? It is supposed to replace the Cortex A15, which had a few teething problems. Like the A15 it is a relatively big core. The A15 was simply too big and impractical on the 32nm node. On 28nm it’s better, but not perfect. It is still a huge core and its market success has been limited.
As a result, it’s highly unlikely that we will see any 28nm Cortex A57 parts. Qualcomm’s upcoming Snapdragon 810 is the first consumer oriented A57 SoC. It is a 20nm design and it is coming later this year, just in time for Christmas as ARM puts it. However, although the Snapdragon 810 will be ready by the end of the year, the first phones based on the new chip are expected to ship in early 2015.
While we will be able to buy 64-bit Android (and possibly Windows Phone) devices before Christmas, most if not all of them will be based on the A53. That’s not necessarily a bad thing. Consumers won’t have to spend $500 to get a 64-bit ARM device, so the user base could start growing long before high-end parts start shipping, thus forcing developers and Google to speed up 64-bit development.
If rumours are to be believed, Google is doing just that and it is not shying away from small 64-bit cores. The search giant is reportedly developing a $100 Nexus phone for emerging markets, said to be based on MediaTek’s MT6732 clocked at 1.5GHz. Sounds interesting, provided the rumour turns out to be true.
Is Qualcomm In Trouble?
Qualcomm’s activities in China may lead to regulatory penalties for the chip vendor, this time from the U.S. Securities and Exchange Commission over bribery allegations.
The company is currently facing an anti-monopoly probe from Chinese authorities for allegedly overcharging clients. Qualcomm has also said that the SEC may consider penalizing the company as part of an anti-corruption investigation.
The SEC’s Los Angeles Regional Office has made a preliminary decision to recommend that the SEC take action against Qualcomm for violating anti-bribery controls, the company said in its second quarter report. The accusations involve Qualcomm offering benefits to “individuals associated with Chinese state-owned companies or agencies,” the report added.
Both the SEC and the U.S. Department of Justice have been probing the company over alleged violations of the nation’s Foreign Corrupt Practices Act.
In cooperation with those official investigations, Qualcomm said it’s found instances of preferential hiring, and giving gifts and other benefits to “several individuals” with China’s state-owned companies. The gifts and benefits amounted to less than US$250,000 in value.
If the SEC takes action against Qualcomm, penalties could include giving up profits, facing injunctions, and other monetary penalties, the company said. Earlier this month, Qualcomm filed a submission with the U.S. regulator, countering any claims of wrongdoing.
Qualcomm is facing the investigations at a time when China is becoming an increasingly big part of its business. The nation is the world’s largest smartphone market, and more Chinese device manufacturers are expanding globally.
Last year, however, Chinese regulators began investigating Qualcomm following complaints from industry groups that the company was abusing its market position and charging higher fees in its patent licensing business. In November, Chinese authorities conducted two surprise raids on Qualcomm’s offices in China in search of documents.
Chinese regulators could decide to penalize Qualcomm by confiscating financial gains and even imposing a fine of 1 to 10 percent of its revenue for the prior year, the company said in its quarterly report.
Can AMD Lead?
John Byrne is one of the drivers behind AMD’s transformation, with the ultimate goal of turning the chipmaker into a new organization that is not so heavily dependent on the PC market. John confirmed that the company is on track to achieve a huge milestone in its transition plans: generating approximately 50 percent of its revenue from non-PC markets by the end of 2015.
The timing of the talk could not have been better, as the market reacted positively to AMD’s Q1 earnings: at press time the stock was at $4.14, up $0.45 or 12.06 percent, which is a huge jump for a tech stock. Keep in mind that many tech stocks have been bearish over the last four weeks, with several massive selloffs, especially in software and internet companies.
AMD fighting back in CPU space
We covered numerous topics, from desktop, notebook and tablet strategy all the way to servers, semi-custom APUs and, of course, the graphics market.
John said that leadership in the graphics sector is critical to AMD’s strategy, nowhere more so than in the PC space, where AMD wants to use its performance APUs to compete with Intel’s Core i3 and Core i5 processors in the lucrative mainstream market. This is what AMD wants to address with Kaveri and, to some extent, with Kabini APUs.
AMD has high hopes for its server parts, where the company just launched its first 64-bit ARM product for the dense server space, a segment in which it expects to be a leader. At the other end of the spectrum, the frugal AM1 platform launched a few weeks ago and is getting very positive reviews. The first Kaveri parts have been on sale for a while, although we would like to see more desktop SKUs, not to mention mobile Kaveri APUs, including ULV variants.
Semi-custom APUs are blurring the line between AMD’s traditional product classes, but sales appear to be good, with more than 12 million Xbox One and PlayStation 4 consoles in the wild.
Phenomenal discrete GPU sales
Byrne is quietly confident when it comes to the GPU market, having just seen very strong sales in the performance and enthusiast high-end segments. The surge was driven by competitive products, great games and bundles, along with the cryptocurrency craze, which was more or less a fluke for AMD.
The company remains committed to the GPU market and expects to bring the successful R9 / R7 architecture further down into mainstream price points in 2014, with similar traction. This means AMD will continue the fight against Nvidia in the desktop and notebook GPU markets, while at the same time taking on Intel on the desktop and notebook side with new APUs.
AMD thinks that the mix of great gaming performance, HSA, Mantle, OpenCL, compute performance and some cool technologies like facial recognition can boost its position in the GPU market. This is just one part of the magic potion that is really starting to work for AMD, but it’s good to know that when it comes to graphics and gaming, AMD will stay committed to these markets in 2014 and beyond.
Enthusiasts need not worry. Although the company is reinventing itself and pursuing non-PC revenue streams, AMD will still be there to cater to their needs.
Can AMD Grow?
AMD posted some rather encouraging Q1 numbers last night, but slow PC sales are still hurting the company, along with the rest of the sector.
When asked about the PC market slump, AMD CEO Rory Read confirmed that the PC market was down sequentially 7 percent. This was a bit better than the company predicted, as the original forecast was that the PC market would decline 7 to 10 percent.
Rory pointed out that AMD can grow in the PC market, as there is a lot of ground that can be taken from the competition. The commercial market did better than expected and Rory claims that AMD’s diversification strategy is taking off. AMD is trying to win market share in the desktop and commercial segments, hence it sees an opportunity to grow PC revenue in the coming quarters. Rory also expects that tablets will continue to cannibalize the PC market. This is not going to change soon.
Kaveri and Kabini will definitely help this effort, as both are solid parts priced quite aggressively. Kabini is also available in AMD’s new AM1 platform, and we believe it is an interesting concept with plenty of mass-market potential. Desktop and notebook ASPs are flat, which the financial community really appreciated; it would not have been unusual for average selling prices to fall, given that the global PC market was down.
Kaveri did well in the desktop high-end market in Q1 2014 and there will be some interesting announcements in the mobile market in Q2 2014 and beyond.
Can Plastic Replace Silicon?
Can plastic materials morph into computers? A research breakthrough recently published brings such a possibility closer to reality.
Researchers are looking at the possibility of making low-power, flexible and inexpensive computers out of plastic materials. Plastic is not normally a good conductive material. However, researchers said this week that they have solved a problem related to reading data.
The research, which involved converting electricity from magnetic film to optics so data could be read through plastic material, was conducted by researchers at the University of Iowa and New York University. A paper on the research was published in this week’s Nature Communications journal.
More research is needed before plastic computers become practical, acknowledged Michael Flatte, professor of physics and astronomy at the University of Iowa. Problems related to writing and processing data need to be solved before plastic computers can be commercially viable.
Plastic computers, however, could conceivably be used in smartphones, sensors, wearable products, small electronics or solar cells, Flatte said.
The computers would have basic processing, data gathering and transmission capabilities but won’t replace silicon used in the fastest computers today. However, the plastic material could be cheaper to produce as it wouldn’t require silicon fab plants, and possibly could supplement faster silicon components in mobile devices or sensors.
“The initial types of inexpensive computers envisioned are things like RFID, but with much more computing power and information storage, or distributed sensors,” Flatte said. One such implementation might be a large agricultural field with independent temperature sensors made from these devices, distributed at hundreds of places around the field, he said.
The research breakthrough this week is an important step in giving plastic computers the sensor-like ability to store data, locally process the information and report data back to a central computer.
Mobile phones, which demand more computing power than sensors, will require more advances because communication requires microwave emissions usually produced by higher-speed transistors than have been made with plastic.
It’s difficult for plastic to compete in the electronics area because silicon is such an effective technology, Flatte acknowledged. But there are applications where the flexibility of plastic could be advantageous, he said, raising the possibility of plastic computers being information processors in refrigerators or other common home electronics.
“This won’t be faster or smaller, but it will be cheaper and lower power, we hope,” Flatte said.
SkySQL Merges SQL And NoSQL In MariaDB
SkySQL has announced a line of MariaDB products that combine NoSQL and SQL technology, offering users the ability to handle large unstructured data sets alongside traditional database features to ensure data consistency.
Available immediately, MariaDB Enterprise 2 and MariaDB Enterprise Cluster 2 are based on the code used in the firm’s MariaDB 10 database server, which it also released today.
According to SkySQL, the availability of an enterprise-grade SQL database system with NoSQL interoperability will be a game changer for developers building revenue-generating applications and for database administrators in charge of large, complex environments.
The two new products have been developed with support from other partners in the open source community, including Red Hat, IBM and Google, according to the firm, and are aimed at giving IT managers more options for managing large volumes of data.
In fact, Red Hat will use MariaDB Enterprise 2 as the default database for its enterprise customers, while Google has also moved large parts of its infrastructure to MariaDB, according to Dion Cornett, VP of global sales for SkySQL.
Cornett said that customers have been using a wide variety of databases over the past few years in order to meet the diverse requirements of applications.
“The types of applications have evolved over time, and the challenge we now have today is that people have different IT stack structures, and trying to integrate all that has been very challenging and required lots of custom code to be created. What we’re doing with MariaDB is introduce an array of features to combine the best of both worlds,” he said.
The features are designed to allow developers and database administrators to take many different data structures, integrate them, and use them in a cohesive application, in the same way that standard database tools presently allow.
These include the Connect Storage Engine, which enables access to a wide variety of file formats such as XML and CSV files, and the ability to run familiar SQL commands against that data.
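As a rough illustration of what that looks like in practice, here is a minimal, hypothetical sketch using Python’s mysql-connector: a CSV file is exposed as a SQL table through the CONNECT engine and then queried with ordinary SQL. The connection details, table and file names are invented for the example.

```python
# Hypothetical sketch: querying a flat CSV file through MariaDB's CONNECT
# storage engine with plain SQL. Credentials and paths are assumptions.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="app",
                               password="secret", database="demo")
cur = conn.cursor()

# Map an existing CSV file to a SQL table via the CONNECT engine.
cur.execute("""
    CREATE TABLE sensor_log (
        sensor_id INT,
        reading   DOUBLE,
        logged_at VARCHAR(32)
    ) ENGINE=CONNECT TABLE_TYPE=CSV
      FILE_NAME='/var/data/sensor_log.csv' HEADER=1
""")

# Familiar SQL now runs directly against the flat file.
cur.execute("SELECT sensor_id, AVG(reading) FROM sensor_log GROUP BY sensor_id")
for sensor_id, avg_reading in cur:
    print(sensor_id, avg_reading)
```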
A key feature is dynamic columns, which enables MariaDB to “smartly interpret” incoming data and adapt it to the data structure that best fits, according to Cornett.
“At a technical level what you’re actually looking at are files within the cells of information that can vary in size, which is not a capability you’ve traditionally had in databases and that flexibility is a big leap forward,” he said.
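MariaDB’s dynamic columns functions give a flavour of the flexibility Cornett describes. Continuing with the cursor from the hypothetical sketch above, each row can carry its own variable set of attributes inside a single blob column; the table and attribute names are again invented.

```python
# Hypothetical sketch of MariaDB dynamic columns: rows in the same table
# store entirely different attribute sets inside one blob column.
cur.execute("CREATE TABLE products (id INT PRIMARY KEY, attrs BLOB)")

cur.execute("INSERT INTO products VALUES "
            "(1, COLUMN_CREATE('colour', 'red', 'size', 42))")
cur.execute("INSERT INTO products VALUES "
            "(2, COLUMN_CREATE('voltage', '230V'))")

# Pull a typed value back out of the blob with ordinary SQL.
cur.execute("SELECT id, COLUMN_GET(attrs, 'colour' AS CHAR) FROM products")
for row in cur:
    print(row)
```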
The new MariaDB products can also plug into the Apache Cassandra storage engine, which can take a columnar data store and read or write against it as if it were a traditional SQL table.
An example of how MariaDB Enterprise 2 might be used is if a service provider has a large-scale video server and wants to combine that with billing information, Cornett said.
“The customer’s video history and what they’re consuming could be very unstructured, but the billing structure will be very fixed, and it has been something of a challenge to bring the two of those together up to this point,” he explained.
Can DirectX 12 Give Mobile A Boost?
Microsoft announced DirectX 12 just a few days ago and for the first time Redmond’s API is relevant beyond the PC space. Some DirectX 12 tech will end up in phones and of course Windows tablets.
Qualcomm likes the idea, along with Nvidia. Qualcomm published a blog post on the potential impact of DirectX 12 on the mobile industry and the takeaway is very positive indeed.
DirectX 12 equals less overhead, more battery life
Qualcomm says it has worked closely with Microsoft to optimise “Windows mobile operating systems” and make the most of Adreno graphics. The chipmaker points out that current Snapdragon chipsets already support DirectX 9.3 and DirectX 11. However, the transition to DirectX 12 will make a huge difference.
“DirectX 12 will turbocharge gaming on Snapdragon enabled devices in many ways. Just a few years ago, our Snapdragon processors featured one CPU core, now most Snapdragon processors offer four. The new libraries and API’s in DirectX 12 make more efficient use of these multiple cores to deliver better performance,” Qualcomm said.
DirectX 12 will also allow the GPU to be used more efficiently, delivering superior performance per watt.
“That means games will look better and deliver longer gameplay on a single charge,” Qualcomm’s gaming and graphics director Jim Merrick added.
What about eye candy?
Any improvement in efficiency also tends to have a positive effect on overall quality. Developers can get more out of existing hardware and will have more resources at their disposal, simple as that.
Qualcomm also points out that DirectX 12 is the first version to launch on Microsoft’s mobile operating systems at the same time as its desktop and console counterparts.
The company believes this emphasizes the growing shift toward, and consumer demand for, mobile gaming. It should also make it easier to port desktop and console games to mobile platforms.
Of course, this does not mean that we’ll be able to play Titanfall on a Nokia Lumia, or that similarly demanding titles can be ported. However, it will speed up development and allow developers and publishers to recycle resources used in console and PC games. Since Windows Phone isn’t exactly the biggest mobile platform out there, this might be very helpful and it might attract more developers.
AMD, Intel & Nvidia Go OpenGL
AMD, Intel and Nvidia teamed up to tout the advantages of the OpenGL multi-platform application programming interface (API) at this year’s Game Developers Conference (GDC).
Sharing a stage at the event in San Francisco, the three major chip designers explained how, with a little tuning, OpenGL can offer developers between seven and 15 times better performance as opposed to the more widely recognised increases of 1.3 times.
AMD manager of software development Graham Sellers, Intel graphics software engineer Tim Foley and Nvidia OpenGL engineer Cass Everitt and senior software engineer John McDonald presented their OpenGL techniques on real-world devices to demonstrate how these techniques are suitable for use across multiple platforms.
During the presentation, Intel’s Foley talked up three techniques that can help OpenGL increase performance and reduce driver overhead: persistent-mapped buffers for faster streaming of dynamic geometry, MultiDrawIndirect (MDI) for faster submission of many draw calls, and packing 2D textures into arrays so texture changes no longer break batches.
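To give a flavour of the first technique, below is a minimal, hypothetical PyOpenGL sketch of a persistent-mapped buffer: the buffer is mapped once at start-up and written every frame, with a fence guarding against overwriting data the GPU is still reading. It assumes an OpenGL 4.4 context has already been created elsewhere (via GLFW, for instance), and a production version would typically sub-divide the mapping into two or three regions so the fence rarely stalls.

```python
# Minimal persistent-mapped buffer sketch; assumes an existing GL 4.4 context.
import ctypes
from OpenGL.GL import (
    glGenBuffers, glBindBuffer, glBufferStorage, glMapBufferRange,
    glFenceSync, glClientWaitSync, glDeleteSync,
    GL_ARRAY_BUFFER, GL_MAP_WRITE_BIT, GL_MAP_PERSISTENT_BIT,
    GL_MAP_COHERENT_BIT, GL_SYNC_GPU_COMMANDS_COMPLETE,
    GL_SYNC_FLUSH_COMMANDS_BIT,
)

BUF_SIZE = 1024 * 1024  # 1MB of streaming vertex space
FLAGS = GL_MAP_WRITE_BIT | GL_MAP_PERSISTENT_BIT | GL_MAP_COHERENT_BIT

buf = glGenBuffers(1)
glBindBuffer(GL_ARRAY_BUFFER, buf)
# Immutable storage that stays mapped for the buffer's whole lifetime.
glBufferStorage(GL_ARRAY_BUFFER, BUF_SIZE, None, FLAGS)
ptr = glMapBufferRange(GL_ARRAY_BUFFER, 0, BUF_SIZE, FLAGS)

fence = None

def stream_vertices(data: bytes):
    """Write this frame's vertex data without remapping the buffer."""
    global fence
    if fence is not None:
        # Wait until the GPU has finished reading last frame's data.
        glClientWaitSync(fence, GL_SYNC_FLUSH_COMMANDS_BIT, 10**9)
        glDeleteSync(fence)
    ctypes.memmove(ptr, data, len(data))
    # ... issue draw calls sourcing from this buffer here ...
    fence = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0)
```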
They also mentioned that with proper implementations of these high-level OpenGL techniques, driver overhead could be reduced to almost zero – something Nvidia’s software engineers have claimed is impossible with Direct3D and only possible with OpenGL.
Nvidia’s VP of game content and technology, Ashu Rege, blogged his account of the GDC joint session on the Nvidia blog.
“The techniques presented apply to all major vendors and are suitable for use across multiple platforms,” Rege wrote.
“OpenGL can cut through the driver overhead that has been a frustrating reality for game developers since the beginning of the PC game industry. On desktop systems, driver overhead can decrease frame rate. On mobile devices, however, driver overhead is even more insidious, robbing both battery life and frame rate.”
The talk was entitled Approaching Zero Driver Overhead.
At the Game Developers Conference, Microsoft also unveiled the latest version of its graphics API, DirectX 12, with Direct3D 12 for more efficient gaming.
Showing off the new DirectX 12 API during a demo of the Xbox One racing game Forza 5 running on a PC with an Nvidia GeForce Titan Black graphics card, Microsoft said DirectX 12 gives applications the ability to directly manage resources and perform synchronisation. As a result, developers of advanced applications can control the GPU to develop games that run more efficiently.
Do Chip Makers Have Cold Feet?
It is starting to look like chip makers are getting cold feet about moving to the next technology for chipmaking. Fabricating chips on larger silicon wafers – moving from today’s 300mm wafers to 450mm – is the latest cycle in a transition, but according to the Wall Street Journal chipmakers are mothballing their plans.
Companies have to make massive upfront outlays for plants and equipment, and they are balking: the latest change could boost the cost of a single high-volume factory to as much as $10 billion, from around $4 billion. Some companies have been reining in their investments, raising fears that the equipment needed to produce the new chips might be delayed for a year or more.
ASML, a maker of key machines used to define features on chips, recently said it had “paused” development of gear designed to work with the larger wafers. Intel said it has slowed some payments to the Netherlands-based company under a deal to help develop the technology.
Gary Dickerson, chief executive of Applied Materials, said that the move to larger wafers “has definitely been pushed out from a timing standpoint.”