Oracle Goes After SAP’s HANA

October 4, 2013 by  
Filed under Consumer Electronics

Oracle has upped its game in its fight against SAP HANA, having added in-memory processing to its Oracle 12c database management system, which it claims will speed up queries by 100 times.

Oracle CEO Larry Ellison revealed the update on Sunday evening during his opening keynote at the Oracle OpenWorld show in San Francisco.

The in-memory option for Oracle Database 12c is designed to ramp up the speeds of data queries – and will also give Oracle a new weapon in the fight against SAP’s rival HANA in-memory system.

“When you put data in memory, one of the reasons you do that is to make the system go faster,” Ellison said. “It will make queries go faster, 100 times faster. You can load the same data into the identical machines, and it’s 100 times faster, you get results at the speed of thought.”

Ellison was keen to allay concerns that these faster query times would have a negative impact on transactions.

“We didn’t want to make transactions go slower with adding and changing data in the database. We figured out a way to speed up query processing and at least double your transaction processing rates,” he said.

In traditional databases, data is stored in rows, for example a row for each sales order, Ellison explained. Row-format databases were designed to operate at high speed when processing a few rows that each contain many columns. More recently, a newer format was proposed that stores data in columns rather than rows to speed up query processing.

Oracle plans to store the data in both formats simultaneously, according to Ellison, so transactions run faster in the row format and analytics run faster in column format.
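To make the row-versus-column distinction concrete, here is a minimal, purely illustrative Python sketch (not Oracle's implementation): the same sales data is laid out both ways, and an analytic query that sums a single column only has to touch that column in the columnar layout.

    # Illustrative sketch of row vs. column layouts, not Oracle's implementation.
    from array import array

    NUM_ORDERS = 1_000_000

    # Row layout: one record per order; a transaction that touches a whole order
    # reads or writes one compact record.
    rows = [{"order_id": i, "region": "EMEA", "amount": float(i % 500)}
            for i in range(NUM_ORDERS)]
    row_total = sum(r["amount"] for r in rows)   # the scan drags every field along

    # Column layout: one contiguous array per column; an analytic scan reads only
    # the "amount" column, which is why columnar scans are so much faster.
    amount_column = array("d", (float(i % 500) for i in range(NUM_ORDERS)))
    column_total = sum(amount_column)

    assert row_total == column_total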

“We can process data at ungodly speeds,” Ellison claimed. As evidence of this, Oracle demoed the technology, showing seven billion rows could be queried per second via in-memory compared to five million rows per second in a traditional database.

The new approach also allows database administrators to speed up their workloads by removing the requirement for analytics indexes.

“If you create a table in Oracle today, you create the table but also decide which columns of the table you’ll create indexes for,” Ellison explained. “We’re replacing the analytics indexes with the in-memory option. Let’s get rid of analytic indexes and replace them with the column store.”

Ellison added that firms can choose to have just part of the database for in-memory querying. “Hot data can be in DRAM, you can have some in flash, some on disk,” he noted. “Data automatically migrates from disk into flash into DRAM based on your access patterns. You only have to pay by capacity at the cost of disk.”
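As a rough conceptual sketch of what access-pattern-driven tiering means (an illustration of the general idea only, not Oracle's actual policy; the tier names and thresholds below are made up), consider a store that promotes blocks to faster tiers as they are read more often:

    # Conceptual sketch of access-pattern-based tiering; the thresholds are invented.
    from collections import Counter

    TIERS = ["DRAM", "flash", "disk"]          # fastest to slowest
    PROMOTE_AT = {"disk": 3, "flash": 10}      # hypothetical access-count thresholds

    class TieredStore:
        def __init__(self):
            self.tier_of = {}        # block id -> current tier
            self.hits = Counter()    # block id -> access count

        def read(self, block_id):
            tier = self.tier_of.setdefault(block_id, "disk")  # cold data starts on disk
            self.hits[block_id] += 1
            # Promote hot blocks one tier at a time as their access count grows.
            threshold = PROMOTE_AT.get(tier)
            if threshold is not None and self.hits[block_id] >= threshold:
                self.tier_of[block_id] = TIERS[TIERS.index(tier) - 1]
            return tier

    store = TieredStore()
    for _ in range(12):
        store.read("sales_block_42")
    print(store.tier_of["sales_block_42"])   # "DRAM" once the block is hot enough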

Firms wanting to take advantage of this new in-memory option can do so straightaway, according to Ellison, with no need for changes to functions, no loading or reloading of data, and no data migration. Costs were not disclosed.

And for those firms keen to rush out and invest in new hardware to take advantage of this new in-memory option, Ellison took the wraps off the M6-32, dubbed the Big Memory Machine. According to Ellison, the M6-32 has twice the memory, can process data much faster and costs less than a third of IBM’s biggest comparable machine, making it ideal for in-memory databases.

Source

AMD’s Richland Shows Up

September 26, 2013 by  
Filed under Computing

Kaveri is coming in a few months, but before it ships AMD will apparently spice up the Richland line-up with a few low-power parts.

CPU World has come across an interesting listing, which points to two new 45W chips, the A8-6500T and the A10-6700T. Both are quads with 4MB of cache. The A8-6500T is clocked at 2.1GHz and can hit 3.1GHz on Turbo, while the A10-6700T's base clock is 2.5GHz and it maxes out at 3.5GHz.

The prices are $108 and $155 for the A8 and A10 respectively, which doesn't sound too bad, although they are still significantly pricier than regular FM2 parts.

Source

AMD’s Kaveri Coming In Q4

September 19, 2013 by  
Filed under Computing

AMD really needs to make up its mind and figure out how it interprets its own roadmaps. A few weeks ago the company said desktop Kaveri parts should hit the channel in mid-February 2014. The original plan called for a launch in late 2013, but AMD insists the chip was not delayed.

Now, though, the company has told Computerbase.de that the first desktop chips will indeed appear in late 2013 rather than 2014, while mobile chips will be showcased at CES 2014 and launch in late Q1 or early Q2 2014.

As we reported earlier, the first FM2+ boards are already showing up on the market, but at this point it’s hard to say when Kaveri desktop APUs will actually be available. The most logical explanation is that they will be announced sometime in Q4, with retail availability coming some two months later.

Kaveri is a much bigger deal than Richland, which was basically Trinity done right. Kaveri is based on new Steamroller cores, packs GCN graphics and is a 28nm part. It is expected to deliver a significant IPC boost over Piledriver-based chips, but we don't have any exact numbers to report.

Source

nVidia Launching New Cards

September 10, 2013 by  
Filed under Computing

We weren't expecting this, and it is just a rumour, but reports are emerging that Nvidia is readying two new cards for the winter season. AMD, of course, is launching new cards four weeks from now, so it is possible that Nvidia will try to counter them.

The big question is with what?

VideoCardz claims one of the cards is an Ultra, possibly the GTX Titan Ultra, while the second one is a dual-GPU job, the GeForce GTX 790. The Ultra is supposedly GK110-based, but with 2880 CUDA cores unlocked, a bit more than the 2688 on the Titan.

The GTX 790 is said to feature two GK110 GPUs, but Nvidia will probably have to clip their wings to get a reasonable TDP.

We’re not entirely sure this is legit. It is plausible, but that doesn’t make it true. It would be good for Nvidia’s image, especially if the revamped GK110 products manage to steal the performance crown from AMD’s new Radeons. However, with such specs, they would end up quite pricey and Nvidia wouldn’t sell that many of them – most enthusiasts would probably be better off waiting for Maxwell.

Source

Java 6 Security Hole Found

September 6, 2013 by  
Filed under Security

Security firms are urging users of Oracle’s Java 6 software to upgrade to Java 7 as soon as possible to avoid becoming the victims of active cyber attacks.

F-Secure senior analyst Timo Hirvonen warned about the exploit over the weekend on Twitter, advising that he had found an exploit in the wild actively targeting an unpatched vulnerability in Java 6, tracked as CVE-2013-2463.

PoC for CVE-2013-2463 was released last week, now it’s exploited in the wild. No patch for JRE6… Uninstall or upgrade to JRE7 update 25.

— Timo Hirvonen (@TimoHirvonen) August 26, 2013

CVE-2013-2463 was addressed by Oracle in the June 2013 Critical Patch Update for Java 7. Java 6 has the same vulnerability, as Oracle acknowledged in the update, but since Java 6 reached the end of public support in April 2013, no patch is available for it.
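For administrators who want a quick way to spot machines still running Java 6, a small script along the following lines can help. This is a purely illustrative sketch, not F-Secure's or Qualys's tooling: it shells out to whatever java binary is on the PATH and flags anything older than 1.7.

    # Illustrative check for an end-of-life Java 6 runtime; adjust for your environment.
    import re
    import subprocess

    def installed_java_version():
        # "java -version" prints its banner to stderr, e.g. java version "1.6.0_45"
        try:
            result = subprocess.run(["java", "-version"], capture_output=True, text=True)
        except FileNotFoundError:
            return None
        match = re.search(r'version "(\d+)\.(\d+)', result.stderr)
        return (int(match.group(1)), int(match.group(2))) if match else None

    version = installed_java_version()
    if version is None:
        print("No Java runtime found on the PATH.")
    elif version < (1, 7):
        print("Java %d.%d detected: unsupported and exposed to CVE-2013-2463; "
              "uninstall it or upgrade to JRE 7 update 25." % version)
    else:
        print("Java %d.%d detected." % version)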

Cloud security provider Qualys described the bug as an “implicit zero-day vulnerability”. The firm’s CTO Wolfgang Kandek said he had seen it included in the spreading Neutrino exploit kit threat, which “guarantees that it will find widespread adoption”.

“We know about its existence, but do not have a patch at hand,” Kandek said in a blog post. “This happens each time a software package loses support and we track these instances in Qualysguard with our ‘EOL/Obsolete’ detections, in this case.

“In addition, we still see very high rates of Java 6 installed, a bit over 50 percent, which means many organisations are vulnerable.”

Like F-Secure, Kandek recommended that any users still on Java 6 upgrade to Java 7 as soon as they can.

“Without doubt, organisations should update to Java 7 where possible, meaning that IT administrators need to verify with their vendors if an upgrade path exists,” he added.

Source

ARM & Oracle Optimize Java

August 7, 2013 by  
Filed under Computing

ARM’s upcoming ARMv8 architecture will form the basis for several processors that will end up in servers. Now the firm has announced that it will work with Oracle to optimise Java SE for the architecture to squeeze out as much performance as possible.

However, unlike Intel, which can make use of software already optimised for x86, ARM and its vendors need to work with software firms to ensure that the new architecture will be supported at launch.

Oracle's Java is a vital piece of software that is used by enterprise firms to run back-end systems, so poor performance from the Java virtual machine could be a serious problem for ARM and its licensees. To prevent that, ARM said it will work with Oracle to improve overall performance, boot-up times and power efficiency, and optimise libraries.

Henrik Stahl, VP of Java Product Management at Oracle, said: "The long-standing relationship between ARM and Oracle has enabled our mutual technologies to be deployed across a broad spectrum of products and applications.

“By working closely with ARM to enhance the JVM, adding support for 64-bit ARM technology and optimizing other aspects of the Java SE product for the ARM architecture, enterprise and embedded customers can reap the benefits of high-performance, energy-efficient platforms based on ARM technology.”

A number of ARM vendors including x86 stalwart AMD are expected to bring out 64-bit ARMv8 processors in 2014, though it is thought that Applied Micro will be the first to market with an ARMv8 processor chip later this year.

Source

SanDisk Debuts Wireless Flash Drive

August 5, 2013 by  
Filed under Around The Net

SanDisk on Monday announced a line of wireless flash drives that can hold up to 64GB of data.

The new drives include the Connect Wireless Flash Drive — a thumb drive — and the Connect Wireless Media Drive, a larger, but still pocket-sized storage device. The Connect Wireless Flash Drive comes in 16GB and 32GB capacities; the Connect Wireless Media Drive comes in 32GB and 64GB capacities.

The Connect Wireless Flash Drive measures 3.07 x 1.04 x 0.54 inches, while the Connect Wireless Media Drive measures 2.6 x 2.6 x 0.52 inches.

The Connect Wireless drive family lets users not only store files but also share and stream them across multiple mobile devices. The drives support up to eight simultaneous device connections and can serve separate 720p video streams at 2MB/sec to three devices concurrently (Flash Drive) or five (Media Drive).

According to a SanDisk spokesman, video streaming performance isn’t affected by multiple streams because device limits are set at a point that supports the streams without degradation. Devices can connect to the drives up to 150 feet away.

The Connect Wireless drives work with iOS and Android devices and Kindle Fire tablets, as well as PC and Mac computers. The drives are compatible with Windows 8, Windows 7, Windows Vista, Windows XP and Mac OS X 10.6 or higher.

Movies, music, photos and documents can be loaded onto the wireless drives by simply dragging and dropping the files, which can then be accessed via the SanDisk Connect apps. Those apps are available for download from the App Store, Google Play Store and the Amazon Appstore for Android.

The drives contain an internal router, so no external router or Internet connection is needed to stream media. In order to use the drives, mobile device users simply download SanDisk’s Connect App.

The drives run on lithium-ion batteries. A single charge provides up to four hours of wireless streaming, with streaming data protected by Wi-Fi Password Protection (WPA2).

“With the new SanDisk Connect product line, we’re raising the bar on what consumers can expect from personal storage,” said Dinesh Bahal, vice president for product marketing for SanDisk.

The SanDisk Connect Wireless Flash Drive is available in 16GB or 32GB capacities for $49.99 and $59.99, respectively. In the U.S., it is available for preorder on Amazon.com, Newegg.com and Micro Center, with availability at Best Buy starting in August. It will also be available for preorder on Amazon.com in Germany and the UK.

The SanDisk Connect Wireless Media Drive has a retail price of $79.99 for 32GB or $99.99 for 64GB of storage capacity. It is available for preorder in the U.S. on Amazon.com, with availability in Germany and the UK in the fourth quarter of 2013.

Source

Oracle Issues Massive Security Update

July 29, 2013 by  
Filed under Computing

Oracle has issued its critical patch update advisory for July, plugging a total of 89 security holes across its product portfolio.

The fixes focus mainly on remotely exploitable vulnerabilities in four widely used products, with 27 fixes issued for the Oracle Database, Fusion Middleware, the Oracle and Sun Systems Product Suite and the MySQL database.

Out of the 89 security fixes included with this update, the firm said six are for Oracle Database, with one of the vulnerabilities being remotely exploitable without authentication.

Oracle revealed that the highest CVSS Base Score for these database vulnerabilities is 9.0, a score related to vulnerability CVE-2013-3751, which affects the XML Parser on Oracle Database 11.2.0.2 and 11.2.0.3.

A further 21 patched vulnerabilities listed in Oracle’s Critical Patch Update are for Oracle Fusion Middleware; 16 of these vulnerabilities are remotely exploitable without authentication, with the highest CVSS Base Score being 7.5.

As for the Oracle and Sun Systems Products Suite, these products received a total of 16 security fixes, eight of which were also remotely exploitable without authentication, with a maximum CVSS Base Score of 7.8.

“As usual, Oracle recommends that customers apply this Critical Patch Update as soon as possible,” Oracle’s director of Oracle Software Security Assurance Eric Maurice wrote in a blog post.

Craig Young, a security researcher at Tripwire, commented on the Oracle patch, saying the "drumbeat of critical patches" is all the more alarming because the vulnerabilities are frequently reported by third parties who presumably do not have access to the full source code.

“It’s also noteworthy that […] every Oracle CPU release this year has plugged dozens of vulnerabilities,” he added. “By my count, Oracle has already acknowledged and fixed 343 security issues in 2013. In case there was any doubt, this should be a big red flag to end users that Oracle’s security practices are simply not working.”

Source

Oracle Changing Berkeley DB License

July 18, 2013 by  
Filed under Computing

Oracle has changed the license of its embedded database library, Berkeley DB. The software is widely used as a key-value store within other applications and historically shipped under an OSI-approved strong copyleft license (the Sleepycat License), which was similar to the GPL.

Under that license, distributing software that embedded Berkeley DB involved also providing “information on how to obtain complete source code for the DB software and any accompanying software that uses the DB software.”

Now future versions of Berkeley DB use the GNU Affero General Public License (AGPL). This says “your modified version must prominently offer all users interacting with it remotely through a computer network … an opportunity to receive the Corresponding Source of your version.”

This will cause some problems for Web developers using Berkeley DB for local storage. Until now, compliance has not really been an issue for them, because they never "redistributed" the source of their Web apps. Now they will have to make sure the whole Web app is compliant with the AGPL and make the full corresponding source of the application available.

They also need to ensure the full app has compatible licensing, which in practice means the whole source code has to be licensed under the GPLv3 or the AGPL.
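For a sense of what embedding Berkeley DB actually looks like, the hedged Python sketch below uses the third-party bsddb3 binding (assumed to be installed; the file name and keys are made up) as a local key-value store, which is exactly the kind of use that the AGPL's network clause now reaches.

    # Minimal key-value use of Berkeley DB through the third-party bsddb3 binding
    # (pip install bsddb3); the file name and keys are invented for illustration.
    import bsddb3

    db = bsddb3.hashopen("app_settings.db", "c")   # "c": create the file if missing
    db[b"theme"] = b"dark"                         # keys and values are raw bytes
    db[b"retention_days"] = b"30"

    print(db[b"theme"])                            # b'dark'
    db.close()

    # Under the old license only redistribution triggered the source-sharing duty;
    # under the AGPL, serving users over a network does too.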

Source

Intel Releases 16GB Xeon Phi

June 26, 2013 by  
Filed under Computing

Intel has announced five Xeon Phi accelerators, including a high-density add-in card, while upping memory capacity to 16GB.

Intel's Xeon Phi accelerator cards have already helped power the Tianhe-2 cluster to the summit of the Top 500 list; however, the firm isn't waiting around to bring out new products. At the International Supercomputing show, Intel extended its Xeon Phi range with five new products, all of which deliver more than one TFLOPS of double precision floating point performance, topped by the Xeon Phi 7120P and 7120X cards with 16GB of GDDR5 memory.

Intel’s Xeon Phi 7120P and 7120X cards have peak double precision floating point performance of over 1.2 TFLOPS, with 352GB/s bandwidth to the 16GB of GDDR5 memory. The firm also updated its more modest Xeon Phi 3100 series with the 3120P and 3120A cards, both with more than one TFLOPS of double precision floating point performance and 6GB of GDDR5 memory with bandwidth of 240GB/s.

Intel has also brought out the Xeon Phi 5120D, a high density card that uses mini PCI-Express slots. The firm said that the Xeon Phi 5120D card offers double precision floating point performance of more than one TFLOPS and 8GB of GDDR5 memory with bandwidth greater than 300GB/s.

That Intel is concentrating on double precision floating point performance with its Xeon Phi accelerators highlights the firm's focus on research rather than graphics rendering or workstation tasks. However, the firm's ability to pack 16GB into its Xeon Phi 7100 series cards is arguably the most important development, as larger locally addressable memory allows higher-resolution simulations.

Intel clearly believes there is significant money to be made in the high-performance computing market, and despite early reservations from industry observers, the firm is ramping up its Xeon Phi range at a rate that should start to give rival GPGPU accelerator designer Nvidia cause for concern.

Source
