Is SAP Searching In The Clouds?
SAP, the maker of esoteric business software that no one is really certain what it does, is debating whether to accelerate moving more of its business to the cloud.
The move would be a change in strategy that might initially have only a small impact on its sales. Co-chief executive Jim Hagemann-Snabe said the change would generate more sales by 2017, particularly in markets like the US, where there is a big push onto the cloud.
Speaking at a Morgan Stanley investor conference this morning, Hagemann-Snabe said the shift "would have impact on the 2015 level. I don't expect enormous impact, but it would have some impact because you are delaying some revenues. In the long term, however, it makes a lot of sense," which is not the sort of thing people expect from SAP.
SAP To Stop Offering SME Software
SAP, the maker of expensive, esoteric software that no one is really sure what it does, has decided to pull the plug on its offering for small businesses. Business weekly Wirtschaftswoche said SAP would stop development of the software, dubbed Business By Design, although existing customers will be able to continue using it.
SAP insists that development capacity for Business By Design is merely being reduced and that the product is not being shut down. Business By Design was launched in 2010 and was supposed to generate $1 billion in revenue. The product, which cost roughly 3 billion euros to develop, currently has only 785 customers and is expected to generate no more than 23 million euros in sales this year.
The Wirtschaftswoche report said that ever since the SAP product’s launch, customers had complained about technical issues and the slow speed of the software.
Oracle Goes After SAP’s HANA
Oracle has upped its game in its fight against SAP HANA, having added in-memory processing to its Oracle 12c database management system, which it claims will speed up queries by 100 times.
Oracle CEO Larry Ellison revealed the update on Sunday evening during his opening keynote at the Oracle Openworld show in San Francisco.
The in-memory option for Oracle Database 12c is designed to ramp up the speeds of data queries – and will also give Oracle a new weapon in the fight against SAP’s rival HANA in-memory system.
“When you put data in memory, one of the reasons you do that is to make the system go faster,” Ellison said. “It will make queries go faster, 100 times faster. You can load the same data into the identical machines, and it’s 100 times faster, you get results at the speed of thought.”
Ellison was keen to allay concerns that these faster query times would have a negative impact on transactions.
“We didn’t want to make transactions go slower with adding and changing data in the database. We figured out a way to speed up query processing and at least double your transaction processing rates,” he said.
In traditional databases, data is stored in rows, for example a row of sales orders, Ellison explained. These types of row format databases were designed to operate at high speeds when processing a few rows that each contain lots of columns. More recently, a new format was proposed to store data in columns rather than rows to speed up query processing.
Oracle plans to store the data in both formats simultaneously, according to Ellison, so transactions run faster in the row format and analytics run faster in column format.
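To make the row-versus-column distinction concrete, here is a minimal Python sketch of the same sales orders held in both layouts. It is purely illustrative and not Oracle's implementation; Oracle's scheme keeps the two copies consistent automatically, which this toy code does not attempt.

```python
# Illustrative only: the same sales orders in a row layout and a column layout.
row_store = [
    {"order_id": 1, "customer": "Acme", "amount": 120.0},
    {"order_id": 2, "customer": "Globex", "amount": 75.5},
    {"order_id": 3, "customer": "Acme", "amount": 310.0},
]

# Column layout: one contiguous list per column, built from the same data.
column_store = {
    "order_id": [r["order_id"] for r in row_store],
    "customer": [r["customer"] for r in row_store],
    "amount": [r["amount"] for r in row_store],
}

# Transactional work (inserting a whole order) is natural in the row layout.
row_store.append({"order_id": 4, "customer": "Initech", "amount": 42.0})

# Analytic work (summing one column) only scans the single "amount" list in
# the column layout, instead of touching every field of every row.
print(sum(column_store["amount"]))
```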
“We can process data at ungodly speeds,” Ellison claimed. As evidence of this, Oracle demoed the technology, showing seven billion rows could be queried per second via in-memory compared to five million rows per second in a traditional database.
The new approach also allows database administrators to speed up their workloads by removing the requirement for analytics indexes.
“If you create a table in Oracle today, you create the table but also decide which columns of the table you’ll create indexes for,” Ellison explained. “We’re replacing the analytics indexes with the in-memory option. Let’s get rid of analytic indexes and replace them with the column store.”
Ellison added that firms can choose to have just part of the database for in-memory querying. “Hot data can be in DRAM, you can have some in flash, some on disk,” he noted. “Data automatically migrates from disk into flash into DRAM based on your access patterns. You only have to pay by capacity at the cost of disk.”
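The tiering Ellison describes is, conceptually, a promotion policy driven by access patterns. The following Python sketch illustrates that general idea only; the tier names and thresholds are invented here and say nothing about Oracle's actual mechanism.

```python
# Toy illustration of access-driven tiering: frequently read keys are promoted
# towards DRAM, rarely read keys stay on disk. Thresholds are invented.
from collections import Counter

access_counts = Counter()
placement = {}  # key -> current tier

def read(key):
    access_counts[key] += 1
    hits = access_counts[key]
    if hits >= 10:
        placement[key] = "dram"
    elif hits >= 3:
        placement[key] = "flash"
    else:
        placement.setdefault(key, "disk")
    return placement[key]

for _ in range(12):
    read("hot_order")      # ends up in "dram"
read("cold_order")         # stays on "disk"
print(placement)
```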
Firms wanting to take advantage of this new in-memory option can do so straightaway, according to Ellison, with no need for changes to functions, no loading or reloading of data, and no data migration. Costs were not disclosed.
And for those firms keen to rush out and invest in new hardware to take advantage of this new in-memory option, Ellison took the wraps off the M6-32, dubbed the Big Memory Machine. According to Ellison, the M6-32 has twice the memory, can process data much faster and costs less than a third of IBM’s biggest comparable machine, making it ideal for in-memory databases.
Java 6 Security Hole Found
Security firms are urging users of Oracle’s Java 6 software to upgrade to Java 7 as soon as possible to avoid becoming the victims of active cyber attacks.
F-secure senior analyst Timo Hirvonen warned about the exploit this weekend over Twitter, advising that he had found an exploit in the wild actively targeting an unpatched vulnerability in Java 6, named CVE-2013-2463.
PoC for CVE-2013-2463 was released last week, now it’s exploited in the wild. No patch for JRE6… Uninstall or upgrade to JRE7 update 25.
— Timo Hirvonen (@TimoHirvonen) August 26, 2013
CVE-2013-2463 was addressed by Oracle in the June 2013 Critical Patch Update for Java 7. Java 6 has the same vulnerability, as Oracle acknowledged in the update, but since Java 6 became unsupported in April 2013, there is no patch for the Java 6 vulnerability.
Cloud security provider Qualys described the bug as an “implicit zero-day vulnerability”. The firm’s CTO Wolfgang Kandek said he had seen it included in the spreading Neutrino exploit kit threat, which “guarantees that it will find widespread adoption”.
“We know about its existence, but do not have a patch at hand,” Kandek said in a blog post. “This happens each time a software package loses support and we track these instances in Qualysguard with our ‘EOL/Obsolete’ detections, in this case.
“In addition, we still see very high rates of Java 6 installed, a bit over 50 percent, which means many organisations are vulnerable.”
Like F-secure, Kandek recommended that any users with Java 6 upgrade to Java 7 as soon as they can.
“Without doubt, organisations should update to Java 7 where possible, meaning that IT administrators need to verify with their vendors if an upgrade path exists,” he added.
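For administrators who want a quick way to spot machines still running the unsupported Java 6, something like the following Python sketch can help. It assumes a java binary is on the PATH and relies on java -version writing to stderr; the version string it parses is the pre-Java 9 "1.x" format current at the time.

```python
# Rough check for an end-of-life Java 6 runtime on the local machine.
# Assumes "java" is on the PATH; "java -version" writes to stderr.
import re
import subprocess

def java_major_version():
    output = subprocess.run(
        ["java", "-version"], capture_output=True, text=True
    ).stderr
    match = re.search(r'version "1\.(\d+)', output)  # e.g. 'java version "1.6.0_45"'
    return int(match.group(1)) if match else None

major = java_major_version()
if major is None:
    print("Could not determine the installed Java version.")
elif major <= 6:
    print("Java 6 or older detected: unsupported, upgrade to Java 7.")
else:
    print(f"Java 1.{major} detected.")
```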
Oracle Issues Massive Security Update
Oracle has issued its critical patch update advisory for July, plugging a total of 89 security holes across its product portfolio.
The fixes focus mainly on remotely exploitable vulnerabilities in four widely used products, with 27 fixes issued for the Oracle Database, Fusion Middleware, the Oracle and Sun Systems Product Suite and the MySQL database.
Out of the 89 security fixes included with this update, the firm said six are for Oracle Database, with one of the vulnerabilities being remotely exploitable without authentication.
Oracle revealed that the highest CVSS Base Score for these database vulnerabilities is 9.0, a score related to vulnerability CVE-2013-3751, which affects the XML Parser on Oracle Database 11.2.0.2 and 11.2.0.3.
A further 21 patched vulnerabilities listed in Oracle’s Critical Patch Update are for Oracle Fusion Middleware; 16 of these vulnerabilities are remotely exploitable without authentication, with the highest CVSS Base Score being 7.5.
As for the Oracle and Sun Systems Products Suite, these products received a total of 16 security fixes, eight of which were also remotely exploitable without authentication, with a maximum CVSS Base Score of 7.8.
“As usual, Oracle recommends that customers apply this Critical Patch Update as soon as possible,” Oracle’s director of Oracle Software Security Assurance Eric Maurice wrote in a blog post.
Craig Young, a security researcher at Tripwire, commented on the Oracle patch, saying the "drumbeat of critical patches" is more than alarming because the vulnerabilities are frequently reported by third parties who presumably do not have access to full source code.
“It’s also noteworthy that […] every Oracle CPU release this year has plugged dozens of vulnerabilities,” he added. “By my count, Oracle has already acknowledged and fixed 343 security issues in 2013. In case there was any doubt, this should be a big red flag to end users that Oracle’s security practices are simply not working.”
Oracle Changing Berkeley DB's License
Oracle has changed the license of its embedded database library, Berkeley DB. The software is widely used as a key-value store within other applications and historically used an OSI-approved strong copyleft license which was similar to the GPL.
Under that license, distributing software that embedded Berkeley DB involved also providing “information on how to obtain complete source code for the DB software and any accompanying software that uses the DB software.”
Now future versions of Berkeley DB use the GNU Affero General Public License (AGPL). This says “your modified version must prominently offer all users interacting with it remotely through a computer network … an opportunity to receive the Corresponding Source of your version.”
This will cause some problems for Web developers using Berkeley DB for local storage. Compliance has not really been an issue for them because they never "redistributed" the source of their Web apps. Now they will have to make sure their whole Web app is compliant with the AGPL and make the full corresponding source of their Web application available.
They also need to ensure the full app has compatible licensing. Practically, that means the whole source code has to be licensed under the GPLv3 or the AGPL.
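For context, embedding Berkeley DB as a key-value store is typically only a handful of lines. A minimal sketch using the bsddb3 Python bindings (assuming they are installed) looks roughly like this; under the AGPL it is this sort of embedding, exposed to users over a network, that triggers the source-offer obligation.

```python
# Minimal sketch of embedding Berkeley DB as a key-value store via the
# bsddb3 Python bindings (assumed installed).
from bsddb3 import db

store = db.DB()
store.open("example.db", None, db.DB_HASH, db.DB_CREATE)

store.put(b"user:42", b"monty")   # write a key-value pair
print(store.get(b"user:42"))      # read it back: b'monty'

store.close()
```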
Will Oracle Retire MySQL?
The founder of MySQL, Michael "Monty" Widenius, claims that Oracle is killing off his MySQL database and is recommending that people move to his new project, MariaDB. In an interview with Muktware, Widenius said MariaDB, which is also open source, is on track to replace MySQL at WikiMedia and other major organizations and companies.
He said MySQL was widely popular long before it was bought by Sun because it was free and had good support; there was a rule that anyone should be able to get MySQL up and running in 15 minutes. Widenius was concerned about MySQL's sale to Oracle and has watched the popularity of MySQL decline since. He said that Oracle was making a number of mistakes: new 'enterprise' extensions in MySQL are closed source, the bugs database is not public, and the MySQL public repositories are no longer actively updated.
Widenius said that security problems were not communicated or addressed quickly and that, instead of fixing bugs, Oracle was removing features. It is not all bad: some of Oracle's new code is surprisingly good, he said, but the quality varies and a notable part needs to be rewritten before it can be included in projects like MariaDB. Widenius added that it is impossible for the community to work with the MySQL developers at Oracle, as the company does not accept patches, does not have a public roadmap, and offers no way to discuss with MySQL developers how to implement things or how the current code works.
Basically, Oracle has made the project less open and the beast has tanked, while at the same time more open versions of the code, such as MariaDB, are rising in popularity.
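Part of MariaDB's pitch as a replacement is that it speaks the MySQL wire protocol, so existing client code usually only needs to be pointed at a different server. A minimal Python sketch using the PyMySQL driver illustrates the idea; the hostname and credentials below are placeholders.

```python
# Connecting to a MariaDB server with an ordinary MySQL client driver
# (PyMySQL). Because MariaDB speaks the MySQL protocol, the client code is
# unchanged apart from the server it points at. Credentials are placeholders.
import pymysql

conn = pymysql.connect(
    host="mariadb.example.com",
    user="app",
    password="secret",
    database="appdb",
)
with conn.cursor() as cur:
    cur.execute("SELECT VERSION()")
    print(cur.fetchone())   # e.g. ('10.0.x-MariaDB',)
conn.close()
```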
SOA Software's New API Gateway Goes To The Cloud
SOA Software has launched an application programming interface (API) gateway today that allows businesses to expose their APIs with a built-in, cloud-based developer community, helping them to grow their services and get up and running more quickly.
The firm's CTO Alistair Farquharson said the API Gateway is unique because it is a new concept in API and SOA management, aiming to "deliver new advantages in the application-level security space".
"The new API Gateway provides monitoring, security, and, more uniquely, a developer community as well, so kind of a turnkey approach to an API gateway where a customer can buy that product, get it up and running, expose their API and expose the developer community to the outside world," Farquharson said.
“[It will] support and manage the porting of mobile applications or web apps or B2B partnerships.”
Farquharson explained that there are three main components within the Gateway, which SOA Software has termed a “unified services gateway”, including a runtime component, a policy manager, and a developer community.
The runtime component handles the message traffic, whereas the policy manager component is capable of managing a range of different policies, such as threat protection, authentication, authorisation, anti-virus, monitoring, auditing and logging.
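Conceptually, that policy manager role amounts to running each incoming request through a chain of checks before it reaches the backing API. The Python sketch below is a hypothetical illustration of the pattern; the policy names and functions are invented here and are not SOA Software's API.

```python
# Hypothetical gateway-style policy chain: each request must pass every
# policy check before being forwarded to the backing API. All names here
# are invented for illustration, not SOA Software's actual product.
from typing import Callable, Dict, List

Request = Dict[str, str]

def authenticated(req: Request) -> bool:
    return req.get("api_key") == "expected-key"       # placeholder auth check

def threat_free(req: Request) -> bool:
    return "<script>" not in req.get("body", "")       # toy threat filter

def audited(req: Request) -> bool:
    print(f"audit: {req.get('path')}")                 # stand-in for logging
    return True

POLICIES: List[Callable[[Request], bool]] = [authenticated, threat_free, audited]

def handle(req: Request) -> str:
    if all(policy(req) for policy in POLICIES):
        return "forwarded to backend API"
    return "rejected by policy manager"

print(handle({"api_key": "expected-key", "path": "/orders", "body": "{}"}))
```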
"The whole objective here is to get a customer up and running with APIs as quickly as possible to meet some kind of a business need that they have, whether that's a mobile application initiative or a web application, integration or syndication," Farquharson added.
The third component is the API’s cloud-based “developer community”, which exposes an organisation to the outside world so developers can come take a look at its API, read its documentation, and see what APIs it has to figure out how to interact with them.
It's this component that sets SOA Software's Gateway apart from the similar appliances other firms have on the market, claims Farquharson.
“It essentially becomes the developer site for your organisation, with it all running on a single appliance which is rather unique,” he added.
"The interesting thing about the gateway is that it does APIs as well as services [that are] needed for mobile devices, so you have the old and the new encapsulated in a single appliance, which is very important to our customers."
The developer community is offered through the API as a service, “like the Salesforce of APIs”, Farquharson said.
"Developers can go there and build their community, and it provides them with a high level of service and availability and a global infrastructure, and lets them leverage the strength of their community to get themselves going."
Will Intel Drop Itanium?
Intel has scaled back plans for its next Itanium chip, prompting observers to question Intel's commitment to the chip. Intel said the next version of Itanium, codenamed Kittson, will be a 32nm part and will not migrate to a more advanced process. The new chips will use the same socket as the existing Itanium 9300 and 9500 chips.
Analyst Nathan Brookwood said the move is Intel’s idea of an exit strategy.
“It may very well be that Itanium’s time has come and gone,” he said.
Gartner analyst Martin Reynolds told Computerworld that Itanium might see a new process in the future, if it proves successful enough to make the investment worthwhile. However, he does not expect any more major updates to the architecture.
Itanium launched in 2001 and quickly became a running joke in the industry. It never achieved the volumes expected by Intel, and AMD seized the opportunity to take on Intel with its 64-bit Opterons. However, Itanium soldiered on for years, although many vendors stopped developing software for the chip.
AMD And Oracle Join Forces
AMD is taking part in the OpenJDK project “Sumatra” in collaboration with Oracle.
The project aims to bring heterogeneous computing capabilities to Java for servers and clouds. It will look at how the Java virtual machine, language and APIs can be spruced up to allow applications to take advantage of GPU acceleration, either in discrete graphics cards or in high-performance graphics processor cores such as those found in AMD APUs.
Manju Hegde, corporate vice president of heterogeneous applications and developer solutions at AMD, said that the OpenJDK Project represents the next step towards bringing heterogeneous computing to millions of Java developers. AMD has an established track record of collaboration with open-software development communities, from OpenCL to the Heterogeneous System Architecture (HSA) Foundation, and with this initiative it will help further the development of graphics acceleration within the Java community, he said.