China Keeps Supercomputing Title
A supercomputer developed by China’s National University of Defense Technology is still the fastest publicly known computer in the world, while the U.S. is close to a historic low in its share of ranked systems, according to the latest edition of the closely followed Top 500 supercomputer ranking, which has just been published.
The Tianhe-2 computer, based at the National Super Computer Center in Guangzhou, has been at the top of the list for more than two years, and its maximum achieved performance of 33,863 teraflops is almost double that of the U.S. Department of Energy’s Cray Titan supercomputer at the Oak Ridge National Laboratory in Tennessee.
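For the curious, here is a quick back-of-envelope check of that “almost double” claim; Titan’s 17,590-teraflop Linpack figure is taken from the published Top 500 list rather than from this announcement:

```python
# Back-of-envelope check of the "almost double" claim.
# Titan's Rmax of 17,590 teraflops is the published Top 500 figure (an assumption here).
tianhe2_rmax_tflops = 33_863   # Tianhe-2 Linpack (Rmax) performance
titan_rmax_tflops = 17_590     # Cray Titan Linpack (Rmax) performance

ratio = tianhe2_rmax_tflops / titan_rmax_tflops
print(f"Tianhe-2 outpaces Titan by a factor of {ratio:.2f}")  # ~1.93, i.e. almost double
```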
The IBM Sequoia computer at the Lawrence Livermore National Laboratory in California is the third fastest machine, and fourth on the list is the Fujitsu K computer at Japan’s Advanced Institute for Computational Science. The only new machine to enter the top 10 is the Shaheen II computer of King Abdullah University of Science and Technology in Saudi Arabia, which is ranked seventh.
The Top 500 list, published twice a year to coincide with supercomputer conferences, is closely watched as an indicator of the status of development and investment in high-performance computing around the world. It also provides insights into what technologies are popular among organizations building these machines, but participation is voluntary. It’s quite possible a number of secret supercomputers exist that are not counted in the list.
With 231 machines on the Top 500 list, the U.S. remains the top country in terms of the number of supercomputers, but that is close to its all-time low of 226, hit in mid-2002, right about the time China began appearing on the list. China rose to claim 76 machines this time last year, but the latest count puts it at 37 computers.
The Top 500 list is compiled by supercomputing experts at the University of Mannheim, Germany; the University of Tennessee, Knoxville; and the Department of Energy’s Lawrence Berkeley National Laboratory.
PC Sales Continue The Downward Trend
Gartner is reporting the biggest slump in PC sales in almost two years: 68.4 million units were shipped in the second quarter, a year-on-year reduction of 9.4 percent and the steepest drop in seven quarters.
What’s more, the prediction is that the next quarter will see a further reduction of 4.4 percent.
It seems that the dislike of Windows 8, coupled with the impending arrival of Windows 10, has battered the sales of new PCs.
The fact that most PC users will be entitled to a free upgrade, coupled with the fact that chip and RAM technology haven’t moved on at a spectacular pace this year, has created a perfect storm among consumers who are waiting it out for their machines to be born again on 29 July (or 30, or 31, or possibly 1 August).
If you’re reading this and thinking ‘It’s just a dying market’ you’re not wrong, but you only have to look at today’s IDC figures to see that this really is a problem of Microsoft’s own making.
IDC is even more pessimistic than Gartner, quoting 66.1 million units, down 11.8 percent year on year.
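As a rough sense-check, those year-on-year percentages imply roughly where shipments stood a year ago; a simple sketch, assuming both declines are measured against the same quarter of last year:

```python
# Rough reconstruction of last year's Q2 shipments implied by each firm's decline figure.
def implied_prior_year(units_millions: float, decline_pct: float) -> float:
    """Prior-year unit count implied by a year-on-year percentage decline."""
    return units_millions / (1 - decline_pct / 100)

print(f"Gartner: {implied_prior_year(68.4, 9.4):.1f} million units a year ago")   # ~75.5M
print(f"IDC:     {implied_prior_year(66.1, 11.8):.1f} million units a year ago")  # ~74.9M
```

Both work back to a second quarter of 2014 of roughly 75 million units, so the two trackers broadly agree on the size of the slide.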
But more importantly, when drilled down to the OEMs, you can see where the real problem lies. Apple is the only company in the top five not rooted in the Windows ecosystem.
It is also the only manufacturer to see a rise in its market share, and is now the fourth biggest vendor in the world, up 16.1 percent. Acer at number five has seen its share plummet by 25.9 percent.
Things were a bit rosier this time last year, because businesses were migrating away from Windows XP (not all of them, mind). This year there is no such tailwind, and a lot of hesitation to see exactly how Windows 10 does before big orders start being placed in enterprises.
“The price hike of PCs became more apparent in some regions due to a sharp appreciation of the US dollar against local currencies,” said Mikako Kitagawa, principal analyst at Gartner.
“The worldwide PC market experienced unusually positive desk-based growth last year due to the end of Windows XP support. After the XP impact was phased out, there have not been any major growth drivers to stimulate a PC refresh.”
IDC’s Loren Loverde, VP of worldwide PC trackers and forecasting, said: “We’re expecting the Windows 10 launch to go relatively well, though many users will opt for a free OS upgrade rather than buying a new PC.
“Competition from 2-in-1 devices and phones remains an issue, but the economic environment has had a larger impact lately, and that should stabilize or improve going forward.”
Meanwhile Apple, despite having a tiny market share for its OS X operating system at just 7.5 percent, according to this month’s Netmarketshare figures, has managed to sit outside the Windows fight altogether, neither winner nor loser but referee, which is a nice trick if you can pull it off.
Both analyst firms see the top three remaining as Lenovo, HP and Dell. Nothing to see there.
Will Cortana Impact Windows 10 Battery Life?
It is just over a month until Microsoft introduces Windows 10, and as you should know by now, Cortana is one of the key elements of the new OS.
Cortana listens constantly so that it can hear its name and act as a smart digital assistant. It is Microsoft’s answer to Siri and Google Now, and it is making its way to Windows 10.
Unfortunately, this will affect your notebook’s battery life. We have spoken with a few industry sources and we can confirm that Windows 10 with Cortana enabled will have an impact on battery life. We are testing this as we speak to check how big the impact is.
We don’t know how significant the battery life decrease will be, but the good thing is that you will be able to switch Cortana off if you don’t need it. We heard that many new Toshiba notebooks will come with a dedicated Cortana button, as this is the easiest way to save battery life: Cortana on Toshiba won’t listen until you press the button.
It would be smart if Microsoft came up with a keyboard shortcut to enable and disable Cortana. Win + Q opens Cortana’s news view, while Win + S takes you directly to Cortana search.
Windows 10 seems to be a logical upgrade for anyone who has Windows 8.1 on their notebook and misses the options, and some familiar UI elements, from Windows 7. We use Windows 8.1 on some devices, while most of our computers still run Windows 7 and nothing more. Microsoft’s DirectX 12 will eventually force us to Windows 10, but from what we can tell from the Preview release, the upgrade from Windows 7 to Windows 10 seems like a quite seamless and logical step.
Just be aware that your notebook’s battery life might suffer because of Cortana, and keep in mind that this “talk to your PC and expect a smart answer” option can be disabled.
IBM Partners With Box
IBM and Box have signed a global agreement to combine their strengths into a cloud powerhouse.
The star-crossed ones said in a joint statement: “The integration of IBM and Box technologies, combined with our global cloud capabilities and the ability to enrich content with analytics, will help unlock actionable insights for use across the enterprise.”
Box will bring its collaboration and productivity tools to the party, while IBM brings social, analytic, infrastructure and security services.
The move is described as a strategic alliance and will see the two companies jointly market products under a co-banner.
IBM will enable the use of Box APIs in enterprise apps and web services to make a whole new playground for developers.
The deal will see Box integrate IBM’s content management, including content capture, extraction, analytics, case management and governance. Also aboard will be Watson Analytics to study in depth the content being stored in Box.
Box will also be integrated into IBM Verse and IBM Connections to allow full integration for email and social.
IBM’s security and consulting services will be part of the deal, and the companies will work together to create mobile apps for industries under the IBM MobileFirst programme.
Finally, the APIs for Box will be enabled in Bluemix meaning that anyone working on rich apps in the cloud can make Box a part of their creation.
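For a flavour of what enabling the Box APIs means for developers, here is a minimal sketch that lists a folder’s contents through Box’s public Content API; the access token is a placeholder you would obtain through Box’s OAuth flow, and error handling is left out:

```python
# Minimal sketch: list the contents of a Box folder via the Box Content API (v2).
# ACCESS_TOKEN is a placeholder; a real token comes from Box's OAuth 2.0 flow.
import requests

ACCESS_TOKEN = "YOUR_BOX_ACCESS_TOKEN"
FOLDER_ID = "0"  # "0" is the account's root folder

resp = requests.get(
    f"https://api.box.com/2.0/folders/{FOLDER_ID}/items",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"fields": "type,name,size"},
)
resp.raise_for_status()

for item in resp.json()["entries"]:
    print(item["type"], item["name"])
```

The same calls would work from an app hosted on Bluemix or anywhere else that can reach the Box endpoints.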
Box seems to be the Nick Clegg to IBM’s ham-faced posh-boy robot in this relationship, but is in fact bringing more than you’d think to the party with innovations delivered by its acquisition of 3D modelling company Verold.
What’s more, the results of these collaborations should allow another major player to join Microsoft and Google in the wars over productivity platforms.
It was announced today that Red Hat and Samsung are forming their own coalition to bring enterprise mobile out of the hands of the likes of IBM and Apple, which already have a cool thing going on with MobileFirst.
Facebook To Require Stronger Digital Signature
Facebook will require application developers to adopt a more secure type of digital signature for their apps, which is used to verify a program’s legitimacy.
As of Oct. 1, apps will have to use SHA-2 certificate signatures rather than ones signed with SHA-1. Both are cryptographic algorithms that are used to create a hash of a digital certificate that can be mathematically verified.
Apps that use SHA-1 after October won’t work on Facebook anymore, wrote Adam Gross, a production engineer at the company, in a blog post.
“We recommend that developers check their applications, SDKs, or devices that connect to Facebook to ensure they support the SHA-2 standard,” Gross wrote.
SHA-1 has been considered weak for about a decade. Researchers have shown it is possible to create a forged digital certificate that carries the same SHA-1 hash as a legitimate one.
The type of attack, called a hash collision, could trick a computer into thinking it is interacting with a legitimate digital certificate when it actually is a spoofed one with the same SHA-1 hash. Using such a certificate could allow an attacker to spy on the connection between a user and an application or website.
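Developers wondering whether their certificates are affected can check which hash algorithm signed them in a few lines; a minimal sketch using the third-party cryptography package (cert.pem is a placeholder path):

```python
# Minimal sketch: report the hash algorithm used to sign a certificate.
# Requires the third-party "cryptography" package; cert.pem is a placeholder path.
from cryptography import x509

with open("cert.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

algo = cert.signature_hash_algorithm.name  # e.g. "sha1" or "sha256"
if algo == "sha1":
    print("SHA-1 signed certificate: reissue with a SHA-2 signature before October")
else:
    print(f"Certificate signature uses {algo}")
```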
Microsoft, Google, Mozilla and other organizations have also moved away from SHA-1 and said they will warn users of websites that are using a connection that should not be trusted.
The CA/Browser Forum, which develops best practices for web security, has recommended in its Baseline Requirements that digital certificate issuers stop using SHA-1 as of Jan. 1.
USAA Exploring Bitcoins
USAA, a San Antonio, Texas-based financial institution serving current and former members of the military, is researching the underlying technology behind the digital currency bitcoin to help make its operations more efficient, a company executive said.
Alex Marquez, managing director of corporate development at USAA, said in an interview that the company and its banking, insurance, and investment management subsidiaries hoped the “blockchain” technology could help decentralize its operations such as the back office.
He said USAA had a large team researching the potential of the blockchain, an open ledger of a digital currency’s transactions, viewed as bitcoin’s main technological innovation. It lets users make payments anonymously, instantly, and without government regulation.
The blockchain ledger is accessible to all users of bitcoin, a virtual currency created through a computer “mining” process that uses millions of calculations. Bitcoin has no ties to a central bank and is viewed as an alternative to paying for goods and services with credit cards.
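The “open ledger” comes down to blocks of transactions chained together by hashes, so that tampering with any earlier entry breaks every later link. A toy illustration of the chaining idea, nothing like Bitcoin’s real block format:

```python
# Toy hash chain: each block commits to the previous block's hash, so altering an old
# transaction invalidates every block that follows it. Not Bitcoin's actual format.
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministically hash a block's contents, including its previous-hash pointer."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = []
prev_hash = "0" * 64  # placeholder for the genesis block
for txs in (["alice->bob 1.0"], ["bob->carol 0.5"], ["carol->dave 0.2"]):
    block = {"transactions": txs, "prev_hash": prev_hash}
    chain.append(block)
    prev_hash = block_hash(block)

# Verify the chain: each block's recomputed hash must match the next block's pointer.
valid = all(block_hash(chain[i]) == chain[i + 1]["prev_hash"] for i in range(len(chain) - 1))
print("chain valid:", valid)
```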
“We have serious interest in the blockchain and we think the technology would have an impact on the organization,” said Marquez. “The fact that we have such a large group of people working on this shows how serious we are about the potential of this technology.”
USAA, which provides banking, insurance and other products to 10.7 million current or former members of the military, owns and manages assets of about $213 billion.
Marquez said USAA had no plans to dabble in bitcoin as a currency. Its foray into the blockchain reflects a trend among banking institutions trying to integrate bitcoin technology into their systems. BNY Mellon and UBS have announced initiatives to explore blockchain technology.
Most large banks are testing the blockchain internally, said David Johnston, managing director at Dapps Venture Fund in San Antonio, Texas. “All of the banks are going through that process of trying to understand how this technology is going to evolve.”
“I would say that by the end of the year, most will have solidified a blockchain technology strategy, how the bank is going to implement and how it will move the technology forward.”
USAA is still in early stages of its research and has yet to identify how it will implement the technology.
In January this year, USAA invested in Coinbase, the biggest bitcoin company, which runs a host of services including an exchange and a wallet service that lets users store bitcoins online.
Qualcomm Gives Snapdragon More Oomph
Qualcomm has released a new Trepn Profiler app for Android which profiles apps running on Snapdragon processors and lets developers tinker with the results.
The Trepn Profiler app identifies apps that overwork the CPU or eat too much data, and pinpoints which apps drain the battery fastest.
The data it gathers can tell you which program is slowing down your phone.
Most Android phone users will not give a damn, but developers will find it useful. Those who are interested in testing roms, custom kernels, and their own apps can use the data gathered by the Trepn Profiler.
Developers can measure optimisation and performance on Snapdragon-powered mobile devices. Data are collected in real time and include network usage, battery power, GPU frequency and load, and per-core CPU load. Key features also include six fast-loading profiling presets and an advanced mode to manually select data points and save them for analysis.
The Advanced Mode allows profiling of a single app or the whole device, offline data analysis, and adjustment of the data-collection interval. It also allows longer profiling sessions, display of two data points in one overlay, and viewing of saved profile data.
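To give a flavour of the sort of data such a profiler collects, here is a small sketch that samples per-core CPU load over one second by reading /proc/stat; this is a generic Linux approach for illustration, not Trepn’s own API:

```python
# Sketch: sample per-core CPU utilisation over a one-second interval from /proc/stat.
# Generic Linux illustration of per-core load measurement, not the Trepn Profiler API.
import time

def read_core_times():
    """Return {core: (idle_ticks, total_ticks)} for each cpuN line in /proc/stat."""
    cores = {}
    with open("/proc/stat") as f:
        for line in f:
            parts = line.split()
            if parts[0].startswith("cpu") and parts[0] != "cpu":
                ticks = [int(x) for x in parts[1:]]
                idle = ticks[3] + ticks[4]  # idle + iowait
                cores[parts[0]] = (idle, sum(ticks))
    return cores

before = read_core_times()
time.sleep(1.0)
after = read_core_times()

for core in sorted(before):
    d_idle = after[core][0] - before[core][0]
    d_total = after[core][1] - before[core][1]
    busy = 100.0 * (1 - d_idle / d_total) if d_total else 0.0
    print(f"{core}: {busy:.1f}% busy")
```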
All up, this should help developers come up with more Snapdragon-friendly apps.
USB 3.1 Coming Later This Year
The emerging USB 3.1 standard is on track to reach desktops as hardware companies release motherboards with ports that can transfer data twice as fast as the previous USB technology.
MSI recently announced a 970A SLI Krait motherboard that supports AMD processors and the USB 3.1 protocol. Motherboards with USB 3.1 ports have also been released by Gigabyte, ASRock and Asus, but those boards support Intel chips.
USB 3.1 can shuffle data between a host device and a peripheral at 10Gbps, twice as fast as USB 3.0. USB 3.1 is also generating excitement for the reversible Type-C cable, which is the same on both ends so users don’t have to worry about plug orientation.
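Those raw signalling rates don’t translate directly into file-copy speeds, because each generation carries different line-encoding overhead: USB 3.0 uses 8b/10b encoding while USB 3.1 Gen 2 moves to 128b/132b. A rough comparison (the encoding figures come from the USB specifications, not from this article):

```python
# Rough usable-throughput comparison; line-encoding overheads are from the USB specs.
links = {
    "USB 3.0 (Gen 1)": {"signal_gbps": 5,  "encoding": 8 / 10},    # 8b/10b encoding
    "USB 3.1 Gen 2":   {"signal_gbps": 10, "encoding": 128 / 132}, # 128b/132b encoding
}

for name, spec in links.items():
    usable_gbps = spec["signal_gbps"] * spec["encoding"]
    print(f"{name}: {usable_gbps:.2f} Gbps usable, roughly {usable_gbps * 1000 / 8:.0f} MB/s")
```

In practice the jump is closer to 2.4x in usable bandwidth, from about 500 MB/s to about 1,200 MB/s, before protocol overhead is counted.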
The motherboards with USB 3.1 technology are targeted at high-end desktops. Some enthusiasts like gamers seek the latest and greatest technologies and build desktops with motherboards sold by MSI, Asus and Gigabyte. Many of the new desktop motherboards announced have the Type-C port interface, which is also in recently announced laptops from Apple and Google.
New technologies like USB 3.1 usually first appear in high-end laptops and desktops, then make their way down to low-priced PCs, said Dean McCarron, principal analyst of Mercury Research.
PC makers are expected to start putting USB 3.1 ports in more laptops and desktops starting later this year.
Can Linux Succeed On The Desktop?
Every three years I install Linux and see if it is ready for prime time yet, and every three years I am disappointed. What is so disappointing is not that the operating system is bad (it never has been); it is that whoever designs it refuses to think of the user.
To be clear I will lay out the same rider I have for my other three reviews. I am a Windows user, but that is not out of choice. One of the reasons I keep checking out Linux is the hope that it will have fixed the basic problems in the intervening years. Fortunately for Microsoft it never has.
This time my main computer had a serious outage caused by a dodgy Corsair (which is now a c word) power supply, and I have been out of action for the last two weeks. In the meantime I had to run everything on a clapped-out Fujitsu notebook which took 20 minutes to download a webpage.
One Ubuntu Linux install later it was behaving like a normal computer. This is where Linux has always been far better than Windows: making rubbish computers behave. I could settle down to work, right? Well, not really.
This is where Linux has consistently disqualified itself from prime-time every time I have used it. Going back through my reviews, I have been saying the same sort of stuff for years.
Coming from Windows 7, where a user can install the OS and start working with no learning curve, Ubuntu can’t compete. There is a ton of stuff you have to download and install before you get anything that passes for an ordinary service, and that process is far too tricky for anyone who is used to Windows.
It is not helped by the Ubuntu Software Centre, which is supposed to make life easier for you. Say you need a Flash player. Adobe has a Flash player you can download for Ubuntu. Click on it and Ubuntu asks if you want to open the file with the Ubuntu Software Centre to install it. You would think you would want this, right? The thing is that pressing yes opens the Software Centre but does not install the Adobe Flash player. The Centre then says it can’t find the software on your machine.
Here is the problem which I wrote about nearly nine years ago – you can’t download Flash or anything proprietary because that would mean contaminating your machine with something that is not Open Sauce.
Sure, Ubuntu will download all those proprietary drivers, but you have to know to ask, an issue which has been around for so long now it is silly. The issue of proprietary drivers is only a problem for the hard-core open saucers, and there are not enough of them to keep an operating system in the dark ages for a decade. However, they have managed it.
I downloaded LibreOffice and all those other things needed to get a basic “windows experience” and discovered that all those typefaces you know and love are unavailable. They should have been in the proprietary pack but Ubuntu has a problem installing them. This means that I can’t share documents in any meaningful way with Windows users, because all my formatting is screwed.
LibreOffice is not bad, but it really is not Microsoft Word and anyone who tries to tell you otherwise is lying.
I downloaded and configured Thunderbird for mail, and for a few good days it actually worked. However, yesterday it disappeared from the sidebar and I can’t find it anywhere. I am restricted to webmail, and I am really hating Microsoft’s Outlook experience.
The only thing that is different between this review and the one I wrote three years ago is that there are now games which actually work, thanks to Steam. I have not tried this yet because I am too stressed with the work backlog caused by having to work on Linux without my regular software, but there is a feeling that Linux is at last moving to a point where it can be a little bit useful.
So what are the main problems that Linux refuses to address? Usability, interface and compatibility.
I know Ubuntu is famous for its shit interface, and Gnome is supposed to be better, but both look and feel dated. I also hate Windows 8’s interface, which makes you navigate a touchscreen tablet UI when you have neither a touchscreen nor a tablet. It should have been an opportunity for Open Saucers to trump Windows with a nice interface; it wasn’t.
You would think that all the brains in the Linux community could come up with a simple, easy-to-use interface which lets you get at all the files you need without much trouble. The problem here is that Linux fans like to tinker; they don’t want usability and they don’t have problems with command screens. Ordinary users, particularly the more recent generations, will not go near a command screen.
Compatibility issues for games have been pretty much resolved, but other key software is missing and Linux operators do not seem keen to get it on board.
I do a lot of layout and graphics work. When you complain about not being able to use Photoshop, Linux fanboys proudly point to GIMP and say it does the same things. You want to grab them by the throat and stuff their heads down the loo and flush. GIMP does less than a tenth of what Photoshop can do, and it does it very badly. There is nothing available on Linux that can do what Creative Suite or any real desktop publishing package can do.
Proprietary software designed for real people using a desktop tends to trump anything open saucy, even when the open-source alternative is a technical marvel.
So in all these years, Linux has not attempted to fix any of the problems which have effectively crippled it as a desktop product.
I am looking forward to next week, when the new PC arrives and I will not need another Ubuntu desktop experience. Who knows, maybe they will have sorted it out when I try again in three years’ time.
Qualcomm Goes Ultrasonic
Qualcomm has unveiled what it claims is the world’s first ‘ultrasonic’ fingerprint scanner, in a bid to improve mobile security and further boost Android’s chances in the enterprise space.
The Qualcomm Snapdragon Sense ID 3D Fingerprint technology debuted during the chipmaker’s Mobile World Congress (MWC) press conference on Monday.
The firm claimed that the new feature will outperform the fingerprint scanners found on smartphones such as the iPhone 6 and Galaxy S6.
Qualcomm also claimed that, as well as “better protecting user data”, the 3D ultrasonic imaging technology is much more accurate than capacitive solutions currently available, and is not hindered by greasy or sweaty fingers.
Sense ID offers a more “innovative and elegant” design for manufacturers, the firm said, owing to its ability to scan fingerprints through any material, be it glass, metal or sapphire.
This means, in theory, that future fingerprint sensors could be included directly into a smartphone’s display.
Derek Aberle, Qualcomm president, said: “This is another industry first for Qualcomm and has the potential to revolutionise mobile security.
“It’s also another step towards the end of the password, and could mean that you’ll never have to type in a password on your smartphone again.”
No specific details or partners have yet been announced, but Qualcomm said that the Sense ID technology will arrive in devices in the second half of 2015, when the firm’s next-generation Snapdragon 820 processor is also tipped to debut.
The firm didn’t reveal many details about this chip, except that it will feature Kryo 64-bit CPU tech and a new machine learning feature dubbed Zeroth.
Qualcomm also revealed more details about LTE-U during Monday’s press conference, confirming plans to extend LTE to unused spectrum using technology integrated in its latest small-cell solutions and RF transceivers for mobile devices.
“We face many challenges as demand for data constantly grows, and we think the best way to fix this is by taking advantage of unused spectrum,” said Aberle.
Finally, the chipmaker released details about a new partnership with Cyanogen, the open-source outfit responsible for the CyanogenMod operating system.
Qualcomm said that it will provide support for the best features and UI enhancements of CyanogenMod on Snapdragon processors, which will be available for the release of Qualcomm Reference Design in April.
The MWC announcements follow the launch of the ARM Cortex-based Snapdragon 620 and 618 chips last month, which promise to improve connectivity and user experience on high-end smartphones and tablets.
Aberle said that these chips will begin to show up in devices in mid to late 2015.