Can AMD Grow?

May 8, 2014
Filed under Computing

AMD posted some rather encouraging Q1 numbers last night, but slow PC sales are still hurting the company, along with the rest of the sector.

When asked about the PC market slump, AMD CEO Rory Read confirmed that the PC market was down 7 percent sequentially. That was a bit better than the company had predicted, as its original forecast called for a decline of 7 to 10 percent.

Rory pointed out that AMD can still grow in the PC market, as there is a lot of ground to be taken from the competition. The commercial market did better than expected, and Rory claims that AMD’s diversification strategy is taking off. AMD is trying to win market share in the desktop and commercial segments, and hence sees an opportunity to grow PC revenue in the coming quarters. Rory also expects tablets to continue to cannibalize the PC market, and that is not going to change soon.

Kaveri and Kabini will definitely help this effort, as both are solid parts priced quite aggressively. Kabini is also available on AMD’s new AM1 platform, which we believe is an interesting concept with plenty of mass-market potential. Desktop and notebook ASPs were flat, which the financial community really appreciated; it would not have been unusual for average selling prices to fall, given that the global PC market was down.

Kaveri did well in the desktop high-end market in Q1 2014 and there will be some interesting announcements in the mobile market in Q2 2014 and beyond.

Source

AMD, Intel & Nvidia Go OpenGL

April 7, 2014
Filed under Computing

AMD, Intel and Nvidia teamed up to tout the advantages of the OpenGL multi-platform application programming interface (API) at this year’s Game Developers Conference (GDC).

Sharing a stage at the event in San Francisco, the three major chip designers explained how, with a little tuning, OpenGL can offer developers between seven and 15 times better performance, compared with the more widely recognised increases of 1.3 times.

AMD manager of software development Graham Sellers, Intel graphics software engineer Tim Foley, Nvidia OpenGL engineer Cass Everitt and Nvidia senior software engineer John McDonald presented their OpenGL techniques on real-world devices to demonstrate that they are suitable for use across multiple platforms.

During the presentation, Intel’s Foley talked up three techniques that help OpenGL increase performance and reduce driver overhead: persistent-mapped buffers for faster streaming of dynamic geometry, MultiDrawIndirect (MDI) for faster submission of many draw calls, and packing 2D textures into arrays so that texture changes no longer break batches.
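To make the first two of those techniques more concrete, here is a minimal sketch of our own, not code from the talk, using standard core OpenGL 4.4 calls. It assumes a GL 4.4 context and a function loader such as GLAD are already set up, and the helper names are ours.

```cpp
#include <glad/glad.h>  // any OpenGL 4.4 function loader will do

// Technique 1: persistent-mapped buffer. Allocate immutable storage once,
// map it once, and keep writing into it every frame -- there is no
// per-frame map/unmap for the driver to synchronise on.
GLuint vbo = 0;
void*  vtxPtr = nullptr;

void createPersistentVbo(GLsizeiptr size) {
    const GLbitfield flags =
        GL_MAP_WRITE_BIT | GL_MAP_PERSISTENT_BIT | GL_MAP_COHERENT_BIT;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferStorage(GL_ARRAY_BUFFER, size, nullptr, flags);      // immutable storage
    vtxPtr = glMapBufferRange(GL_ARRAY_BUFFER, 0, size, flags);  // stays mapped
}

// Technique 2: MultiDrawIndirect. The per-draw parameters live in a buffer
// bound to GL_DRAW_INDIRECT_BUFFER, and one call submits the whole batch.
struct DrawArraysIndirectCommand {
    GLuint count, instanceCount, first, baseInstance;
};

void submitBatch(GLuint indirectBuffer, GLsizei drawCount) {
    glBindBuffer(GL_DRAW_INDIRECT_BUFFER, indirectBuffer);
    glMultiDrawArraysIndirect(GL_TRIANGLES, nullptr, drawCount,
                              sizeof(DrawArraysIndirectCommand));
}
```

In a real renderer the application still has to make sure the GPU has finished reading a region of the persistent buffer before overwriting it, typically with fence syncs; that bookkeeping is what these techniques trade the per-call driver overhead for.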

They also mentioned during their presentation that with proper implementations of these high-level OpenGL techniques, driver overhead could be reduced to almost zero. This is something that Nvidia’s software engineers have already claimed is impossible with Direct3D and only possible with OpenGL.

Nvidia’s VP of game content and technology, Ashu Rege, recounted the joint GDC session on the Nvidia blog.

“The techniques presented apply to all major vendors and are suitable for use across multiple platforms,” Rege wrote.

“OpenGL can cut through the driver overhead that has been a frustrating reality for game developers since the beginning of the PC game industry. On desktop systems, driver overhead can decrease frame rate. On mobile devices, however, driver overhead is even more insidious, robbing both battery life and frame rate.”

The slides from the talk were published under the title Approaching Zero Driver Overhead.

Also at the Game Developers Conference, Microsoft unveiled the latest version of its graphics API, DirectX 12, with Direct3D 12 promising more efficient gaming.

Showing off the new DirectX 12 API in a demo of the Xbox One racing game Forza 5 running on a PC with an Nvidia GeForce Titan Black graphics card, Microsoft said DirectX 12 gives applications the ability to manage resources directly and perform their own synchronisation. As a result, developers of advanced applications can control the GPU to build games that run more efficiently.

Source

Is AMD Worried?

March 17, 2014
Filed under Computing

AMD’s Mantle has been a hot topic for quite some time, and despite its delayed birth it has finally delivered performance gains in Battlefield 4. Microsoft is not sleeping, either: it has its own answer to Mantle, which we mentioned here.

Oddly enough, we had heard some industry people calling it DirectX 12 or DirectX Next, and it looks like Microsoft is finally getting ready to announce the next generation of DirectX. From what we hear, it will fix some of the driver overhead problems addressed by Mantle, which is good news for the whole industry and, of course, for gamers.

AMD got back to us, officially stating: “AMD would like you to know that it supports and celebrates a direction for game development that is aligned with AMD’s vision of lower-level, ‘closer to the metal’ graphics APIs for PC gaming. While industry experts expect this to take some time, developers can immediately leverage efficient API design using Mantle.”

AMD also told us that we can expect some information about this at the Game Developers Conference, which starts on March 17th, less than two weeks from now.

We have a feeling that Microsoft is finally ready to talk about DirectX Next, DirectX 11.X, DirectX 12 or whatever it ends up calling it, and we would not be surprised to see Nvidia’s 20nm Maxwell chips support this API, along with future GPUs from AMD, possibly again 20nm parts.

Source

AMD’s Richland Shows Up

September 26, 2013
Filed under Computing

Kaveri is coming in a few months, but before it ships AMD will apparently spice up the Richland line-up with a few low-power parts.

CPU World has come across an interesting listing which points to two new 45W chips, the A8-6500T and the A10-6700T. Both are quad-core parts with 4MB of cache. The A8-6500T is clocked at 2.1GHz and can hit 3.1GHz on Turbo, while the A10-6700T has a base clock of 2.5GHz and maxes out at 3.5GHz.

The A8 and A10 are priced at $108 and $155 respectively, which does not sound too bad, although they are still significantly pricier than regular FM2 parts.

Source

AMD’s Kaveri Coming In Q4

September 19, 2013
Filed under Computing

AMD really needs to make up its mind and figure out how it interprets its own roadmaps. A few weeks ago the company said desktop Kaveri parts should hit the channel in mid-February 2014. The original plan called for a launch in late 2013, but AMD insists the chip was not delayed.

Now, though, it has told Computerbase.de that the first desktop chips will indeed appear in late 2013 rather than 2014, while mobile chips will be showcased at CES 2014 and launch in late Q1 or early Q2 2014.

As we reported earlier, the first FM2+ boards are already showing up on the market, but at this point it’s hard to say when Kaveri desktop APUs will actually be available. The most logical explanation is that they will be announced sometime in Q4, with retail availability coming some two months later.

Kaveri is a much bigger deal than Richland, which was basically Trinity done right. Kaveri is based on new Steamroller cores, packs GCN graphics and is a 28nm part. It is expected to deliver a significant IPC boost over Piledriver-based chips, but we don’t have any exact numbers to report.

Source

ARM & Oracle Optimize Java

August 7, 2013
Filed under Computing

ARM’s upcoming ARMv8 architecture will form the basis for several processors that will end up in servers. Now the firm has announced that it will work with Oracle to optimise Java SE for the architecture to squeeze out as much performance as possible.

ARM’s chip licensees are looking to the 64-bit ARMv8 architecture to make a splash in the low-power server market and go up against Intel’s Atom processors. However, unlike Intel, which can make use of software already optimised for x86, ARM and its vendors need to work with software firms to ensure that the new architecture will be supported at launch.

Oracle’s Java is a vital piece of software used by enterprise firms to run back-end systems, so poor performance from the Java virtual machine could be a serious problem for ARM and its licensees. To prevent that, ARM said it will work with Oracle to improve overall performance, boot-up performance and power efficiency, and to optimize libraries.

Henrik Stahl, VP of Java Product Management at Oracle, said: “The long-standing relationship between ARM and Oracle has enabled our mutual technologies to be deployed across a broad spectrum of products and applications.

“By working closely with ARM to enhance the JVM, adding support for 64-bit ARM technology and optimizing other aspects of the Java SE product for the ARM architecture, enterprise and embedded customers can reap the benefits of high-performance, energy-efficient platforms based on ARM technology.”

A number of ARM vendors including x86 stalwart AMD are expected to bring out 64-bit ARMv8 processors in 2014, though it is thought that Applied Micro will be the first to market with an ARMv8 processor chip later this year.

Source

Twitter’s Authentication Has Vulnerabilities

June 6, 2013
Filed under Around The Net

Twitter’s SMS-based two-factor authentication feature could be abused to lock out users who have not enabled it for their accounts, if attackers gain access to their log-in credentials, according to researchers from Finnish antivirus vendor F-Secure.

Twitter introduced two-factor authentication last week as an optional security feature in order to make it harder for attackers to hijack users’ accounts even if they manage to steal their usernames and passwords. If enabled, the feature introduces a second authentication factor in the form of secret codes sent via SMS.

According to Sean Sullivan, a security advisor at F-Secure, attackers could actually abuse this feature in order to prolong their unauthorized access to those accounts that don’t have two-factor authentication enabled. The researcher first described the issue Friday in a blog post.

An attacker who steals someone’s log-in credentials, via phishing or some other method, could associate a prepaid phone number with that person’s account and then turn on two-factor authentication, Sullivan said Monday. If that happens, the real owner won’t be able to recover the account by simply performing a password reset, and will have to contact Twitter support, he said.

This is possible because Twitter doesn’t use any additional method to verify that whoever has access to an account via Twitter’s website is also authorized to enable two-factor authentication.

When the two-factor authentication option called “Account Security” is first enabled on the account settings page, the site asks users if they successfully received a test message sent to their phone. Users can simply click “yes,” even if they didn’t receive the message, Sullivan said.

Instead, Twitter should send a confirmation link to the email address associated with the account for the account owner to click in order to confirm that two-factor authentication should be enabled, Sullivan said.
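To illustrate the difference, here is a minimal, hypothetical sketch, ours and not Twitter’s code, of the kind of confirmation flow Sullivan describes: toggling the feature on only records a pending request and a one-time token that would be emailed to the address on file, and the setting takes effect only when that token comes back.

```cpp
#include <iostream>
#include <map>
#include <random>
#include <string>

std::map<std::string, std::string> pending;  // one-time token -> account id

// Called when someone toggles 2FA on. In a real system the token would be
// embedded in a link and emailed to the account's address, never returned
// to the web session that requested the change.
std::string requestEnable2fa(const std::string& account) {
    std::random_device rd;
    std::string token;
    for (int i = 0; i < 16; ++i) token += "0123456789abcdef"[rd() % 16];
    pending[token] = account;
    return token;  // stand-in for "send confirmation email"
}

// Called only when the emailed link is clicked; this is the first point at
// which two-factor authentication is actually switched on.
bool confirmEnable2fa(const std::string& token) {
    auto it = pending.find(token);
    if (it == pending.end()) return false;  // unknown or already-used token
    std::cout << "2FA enabled for " << it->second << "\n";
    pending.erase(it);                      // tokens are single-use
    return true;
}

int main() {
    std::string token = requestEnable2fa("@newsdesk");
    confirmEnable2fa(token);  // an attacker with only the password never gets here
}
```

The point of such a design is that a stolen password alone is no longer enough to flip the setting; the attacker would also need access to the victim’s mailbox.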

As it is, the researcher is concerned that this feature could be abused by determined attackers like the Syrian Electronic Army, a hacker group that recently hijacked the Twitter accounts of several news organizations, in order to prolong their unauthorized access to compromised accounts.

Some security researchers have already expressed the belief that Twitter’s two-factor authentication feature, in its current implementation, is impractical for news organizations and companies with geographically dispersed social media teams, where different employees have access to the same Twitter account and cannot share a single phone number for authentication.

Twitter did not immediately respond to a request for comment regarding the issue described by Sullivan.

Source

AMC Goes To The Clouds

April 15, 2013
Filed under Computing

Applied Micro Circuits has released its cloud chip, which crams networking and computing onto a single SoC.

The X-Gene server on a chip is being billed as the first 64-bit-capable ARM-based server chip in existence. According to the company, it is also the first chip to contain a software-defined networking (SDN) controller on the die, offering services such as load balancing and service-level agreement enforcement directly on the chip.

Paramesh Gopi, president and CEO of Applied Micro, said that these new chips have now made it past the prototype stage and are being used by Dell and Red Hat. Gopi expects physical servers containing the X-Gene to hit the market by the end of this year.

The chip is manufactured at 40 nanometers and has eight 2.4GHz ARM cores, four smaller ARM Cortex-A5 cores running the SDN controller software, four 10-gigabit Ethernet ports, and various ports that can support additional Ethernet, SSDs, SATA drives, or accelerator cards such as those from Fusion-io.

The company claims that the cost of ownership, including power requirements, is about half that of a comparable x86 product, but it wouldn’t discuss actual power consumption.

Source

Hitachi Releases Terabyte SAS Drive

February 6, 2013
Filed under Computing

Hitachi has released the first 1.2TB 10,000 RPM Serial Attached SCSI (SAS) hard drive for servers.

Hitachi’s hard drive operation, which is now part of Western Digital, continues to develop server and workstation hard drives while its parent firm concentrates on units pitched at desktop and laptop computers. The firm, which was the first to introduce a 1TB hard drive back in 2007, has now surpassed that barrier with its 10,000 RPM 2.5in Ultrastar C10K1200 SAS drive.

Hitachi slips a 64MB cache into each drive and quotes a mean time between failures (MTBF) of two million hours for the Ultrastar C10K1200, suggesting that the drive will be well suited to users doing big data analysis. The firm touts connector compatibility with its own Ultrastar solid-state drives (SSDs) and promotes tiered storage for those considering SSDs.

Dell has announced support for Hitachi’s Ultrastar C10K1200 drives in its PowerEdge and PowerVault servers, with Hitachi saying that other OEMs have also qualified the drive for use in their servers.

Enterprise storage vendors such as Hitachi are pushing tiered storage for firms that want the performance of SSDs but require the capacity of traditional hard drives. While Hitachi is right to point out that firms need to deploy both SSDs and hard drives, with SSD makers rapidly bringing prices down, that mix might become SSD-biased within a few years.

Source

Kingston Goes 1TB

January 17, 2013
Filed under Around The Net

Kingston Technology is claiming the world’s highest-capacity USB 3.0 flash drive with the one terabyte (1TB) DataTraveler HyperX Predator 3.0.

Announced by the company’s Kingston Digital Europe affiliate, the DataTraveler HyperX Predator 3.0 is shipping now in a 512GB model, with the 1TB capacity set to be available later in the first quarter.

The new drive is also slated as the fastest USB 3.0 flash drive in the Kingston storage line, with read speeds of up to 240MB/s and write speeds of 160MB/s, according to the firm.

“The large capacity and fast USB 3.0 transfer speeds allow users to save time as they can access, edit and transfer applications or files such as HD movies directly from the drive without any performance lag,” said Ann Keefe, Kingston regional director for the UK and Ireland.

Featuring a zinc alloy casing for shock resistance and a high-end look, the DataTraveler HyperX Predator 3.0 also comes with a custom Kingston key ring and a HyperX valet keychain.

The new drive is fully certified for SuperSpeed USB 3.0 operation, while keeping backwards compatibility with USB 2.0 so it can be used with older computer hardware.

Source
