Syber Group

Oracle Goes After SAP’s HANA

October 4, 2013 by  
Filed under Consumer Electronics

Comments Off on Oracle Goes After SAP’s HANA

Oracle has upped its game in its fight against SAP HANA, having added in-memory processing to its Oracle 12c database management system, which it claims will speed up queries by 100 times.

Oracle CEO Larry Ellison revealed the update on Sunday evening during his opening keynote at the Oracle OpenWorld show in San Francisco.

The in-memory option for Oracle Database 12c is designed to ramp up the speed of data queries, and it gives Oracle a new weapon in the fight against SAP’s rival HANA in-memory system.

“When you put data in memory, one of the reasons you do that is to make the system go faster,” Ellison said. “It will make queries go faster, 100 times faster. You can load the same data into the identical machines, and it’s 100 times faster, you get results at the speed of thought.”

Ellison was keen to allay concerns that these faster query times would have a negative impact on transactions.

“We didn’t want to make transactions go slower with adding and changing data in the database. We figured out a way to speed up query processing and at least double your transaction processing rates,” he said.

In traditional databases, data is stored in rows, for example a row of sales orders, Ellison explained. These types of row format databases were designed to operate at high speeds when processing a few rows that each contain lots of columns. More recently, a new format was proposed to store data in columns rather than rows to speed up query processing.

Oracle plans to store the data in both formats simultaneously, according to Ellison, so transactions run faster in the row format and analytics run faster in column format.
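As a rough illustration of the dual-format idea (not Oracle’s implementation, and with invented sales data), the same records can be held row-wise for transactional work and column-wise for analytic scans:

```python
# Sketch of row format vs. column format over the same (invented) sales data.
rows = [
    {"order_id": 1, "customer": "acme",   "amount": 120.0},
    {"order_id": 2, "customer": "globex", "amount": 75.5},
    {"order_id": 3, "customer": "acme",   "amount": 300.0},
]

# Column format: one contiguous list per column, derived from the rows.
columns = {key: [r[key] for r in rows] for key in rows[0]}

# A transaction touches one whole row...
new_order = {"order_id": 4, "customer": "initech", "amount": 42.0}
rows.append(new_order)
for key, value in new_order.items():   # keep the column copy in sync
    columns[key].append(value)

# ...while an analytic query scans a single column end to end.
print(sum(columns["amount"]))          # 537.5
```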

“We can process data at ungodly speeds,” Ellison claimed. As evidence, Oracle demoed the technology, showing that seven billion rows could be queried per second with the in-memory option, compared with five million rows per second in a traditional database.

The new approach also allows database administrators to speed up their workloads by removing the requirement for analytics indexes.

“If you create a table in Oracle today, you create the table but also decide which columns of the table you’ll create indexes for,” Ellison explained. “We’re replacing the analytics indexes with the in-memory option. Let’s get rid of analytic indexes and replace them with the column store.”

Ellison added that firms can choose to have just part of the database for in-memory querying. “Hot data can be in DRAM, you can have some in flash, some on disk,” he noted. “Data automatically migrates from disk into flash into DRAM based on your access patterns. You only have to pay by capacity at the cost of disk.”
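A toy sketch of that access-driven tiering, with invented thresholds and table names, just to show the shape of the idea of placement following access patterns:

```python
# Toy model of access-driven tiering: frequently read objects sit in DRAM,
# moderately used ones in flash, the rest on disk. Thresholds are invented;
# a real database would use far more sophisticated heuristics.
from collections import Counter

access_counts = Counter()

def record_access(obj: str) -> None:
    access_counts[obj] += 1

def tier_for(obj: str) -> str:
    hits = access_counts[obj]
    if hits >= 100:
        return "DRAM"
    if hits >= 10:
        return "flash"
    return "disk"

for _ in range(150):
    record_access("orders")      # hot
for _ in range(20):
    record_access("customers")   # warm
record_access("archive")         # cold

print({obj: tier_for(obj) for obj in access_counts})
# {'orders': 'DRAM', 'customers': 'flash', 'archive': 'disk'}
```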

Firms wanting to take advantage of this new in-memory option can do so straightaway, according to Ellison, with no need for changes to functions, no loading or reloading of data, and no data migration. Costs were not disclosed.

And for those firms keen to rush out and invest in new hardware to take advantage of the new in-memory option, Ellison took the wraps off the M6-32, dubbed the Big Memory Machine. According to Ellison, the M6-32 has twice the memory of IBM’s biggest comparable machine, processes data much faster and costs less than a third as much, making it ideal for in-memory databases.

Source

Does The Cloud Need To Standardize?

September 20, 2013 by  
Filed under Computing

Comments Off on Does The Cloud Need To Standardize?

Frank Baitman, the CIO of the U.S. Department of Health and Human Services (HHS), was at the Amazon Web Services conference praising the company’s services. Baitman’s lecture was on the verge of becoming a long infomercial when he stepped back and changed direction.

Baitman has reason to speak well of Amazon. As the big government system integrators slept, Amazon rushed in with its cloud model and began selling its services to federal agencies. HHS and Amazon have worked together closely.

The agency helped Amazon earn the all-important federal security certification best known by its acronym, FedRAMP, as it moved its health data to Amazon’s cloud. Amazon was the first large cloud vendor to get this certification.

“[Amazon] gives us the scalability that we need for health data,” said Baitman.

But then he said that while it would “make things simpler and nicer” to keep working with Amazon, since HHS had already done the groundwork to get Amazon its federal authorizations, “we also believe that there are different reasons to go with different vendors.”

Baitman said that HHS will be working with other vendors as it has with Amazon.

“We recognize different solutions are needed for different problems,” said Baitman. “Ultimately we would love to have a competitive environment that brings best value to the taxpayer and keeps vendors innovating.”

To accomplish this, HHS plans to implement a cloud broker model, an intermediary process that can help government entities identify the best cloud approach for a particular workload. That means being able to compare different price points, terms of service and service-level agreements.

To make comparisons possible, Baitman said the vendors will have to “standardize in those areas that we evaluate cloud on.”
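In code terms, a broker can only rank offers once those fields are reported in a common form. The sketch below (vendor names, prices and weights all hypothetical) shows the kind of comparison Baitman is describing:

```python
# Hypothetical broker-style comparison over standardized fields.
offers = [
    {"vendor": "Vendor A", "price_per_hour": 0.12, "sla_uptime": 99.95, "exit_notice_days": 30},
    {"vendor": "Vendor B", "price_per_hour": 0.10, "sla_uptime": 99.90, "exit_notice_days": 60},
]

def score(offer: dict) -> float:
    # Higher uptime is better; lower price and shorter lock-in are better.
    return (offer["sla_uptime"]
            - 100 * offer["price_per_hour"]
            - 0.05 * offer["exit_notice_days"])

best = max(offers, key=score)
print(best["vendor"], round(score(best), 2))   # Vendor B 86.9
```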

About 2,500 people registered for the Amazon conference, and judging from the size of the crowd at the Washington Convention Center, it appeared that most of them turned up. That is a big leap in attendance: the 2012 government conference drew about 900 people, the 2011 event 300, and the 2010 event just 50, Teresa Carlson, vice president of worldwide public sector at Amazon, said in an interview.

Source

Developers Hack Dropbox

September 11, 2013 by  
Filed under Security

Comments Off on Developers Hack Dropbox

Two developers have penetrated Dropbox’s security, even intercepting SSL data from its servers and bypassing the cloud storage provider’s two-factor authentication, according to a paper they published at USENIX 2013.

“These techniques are generic enough and we believe would aid in future software development, testing and security research,” the paper says in its abstract.

Dropbox, which claims more than 100 million users upload more than a billion files daily, said the research didn’t actually represent a vulnerability in its servers.

“We appreciate the contributions of these researchers and everyone who helps keep Dropbox safe,” a spokesperson said in an email to Computerworld. “In the case outlined here, the user’s computer would first need to have been compromised in such a way that it would leave the entire computer, not just the user’s Dropbox, open to attacks across the board.”

The two developers, Dhiru Kholia, with the Openwall open source project, and Przemyslaw Wegrzyn, with CodePainters, said they reverse-engineered Dropbox, an application written in Python.

“Our work reveals the internal API used by Dropbox client and makes it straightforward to write a portable open-source Dropbox client,” the paper states. “Additionally, we show how to bypass Dropbox’s two-factor authentication and gain access to users’ data.”

The paper presents “new and generic techniques to reverse engineer frozen Python applications, which are not limited to just the Dropbox world,” the developers wrote.
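The paper’s unpacking and decryption steps are specific to the Dropbox client, but the underlying reason frozen Python applications are approachable is that Python compiles to bytecode the standard library can inspect. A minimal hint of that idea:

```python
# Once bytecode is recovered, the standard library's dis module can
# disassemble it; constants and control flow are plainly visible.
# (This only hints at the idea; the researchers' toolchain for unpacking
# and decrypting the frozen client is far more involved.)
import dis

def check_token(token: str) -> bool:
    return token == "secret-value"

dis.dis(check_token)   # prints the bytecode, literal "secret-value" included
```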

The researchers described in detail how they were able to unpack, decrypt and decompile Dropbox from scratch, and how, once its source code has been decompiled, “it is possible to study how Dropbox works in detail.”

“We describe a method to bypass Dropbox’s two-factor authentication and hijack Dropbox accounts. Additionally, generic techniques to intercept SSL data using code injection techniques and monkey patching are presented,” the developers wrote in the paper.

The process they used included various code injection techniques and monkey patching to intercept SSL data in a Dropbox client. They said they also used the same techniques successfully to snoop on SSL data in other commercial products.
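A minimal sketch of the monkey-patching idea, not the researchers’ actual code: wrap a client-side send routine so the plaintext can be logged before it reaches the TLS layer. The target function here (http.client.HTTPSConnection.send) is chosen purely for illustration.

```python
# Monkey-patch a send routine to observe request data before encryption.
import http.client

_original_send = http.client.HTTPSConnection.send

def logging_send(self, data):
    print("intercepted %d bytes: %r" % (len(data), data[:80]))
    return _original_send(self, data)

http.client.HTTPSConnection.send = logging_send

# Any code using HTTPSConnection now passes through the wrapper.
conn = http.client.HTTPSConnection("example.com")
conn.request("GET", "/")
print(conn.getresponse().status)
```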

The developers are hoping their white hat hacking prompts Dropbox to open source its platform so that it is no longer a “black box.”

Source

U.S. Cloud Vendors Hurt By NSA

September 4, 2013 by  
Filed under Computing

Comments Off on U.S. Cloud Vendors Hurt By NSA

Edward Snowden’s public unveiling of the National Security Agency’s Prism surveillance program could cause U.S. providers of cloud-based services to lose 10% to 20% of the foreign market — a slice of business valued at up to $35 billion.

A new report from the Information Technology & Innovation Foundation (ITIF) concludes that European cloud computing companies, in particular, might successfully exploit users’ fears about the secret data collection program to challenge U.S. leadership in the hosted services business.

Daniel Castro, author of the report, acknowledges that the conclusions are based, so far, on thin data, but nonetheless argues that the risks to U.S. cloud vendors are real.

Indeed, a month prior, the Cloud Security Alliance reported that in a survey of 207 officials of non-U.S. companies, 10% of the respondents said that they had canceled contracts with U.S. service providers after Snowden’s leak of NSA Prism documents earlier this year.

“If U.S. companies lose market share in the short term, it will have long-term implications on their competitive advantage in this new industry,” said Castro in the ITIF report. “Rival countries have noted this opportunity and will try to exploit it.”

To counter such efforts, the U.S. must challenge overstated claims about the program by foreign companies and governments, said Jason Weinstein, a partner in the Washington office of law firm Steptoe & Johnson and a former federal prosecutor and deputy assistant attorney general specializing in computer crime.

“There are a lot of reasons to be concerned about just how significant those consequences will be,” Weinstein said. “The effort by European governments and European cloud providers to cloud the truth about data protection in the U.S. was going on well before anyone knew who Edward Snowden was. It just picked up new momentum once the Prism disclosures came out.”

Weinstein contends that European countries have fewer data protection rules than the U.S.

For example, he said that in the U.K. and France, a wiretap to get content can be issued by a government official without court authority, but that can’t happen in the U.S.

“U.S. providers have done nothing other than comply with their legal obligations,” he said. But because of Snowden’s leaks, “they are facing potentially significant economic consequences.”

Gartner analyst Ed Anderson said his firm has yet to see any revenue impact on cloud providers since the Prism disclosures, but added, “I don’t think Prism does U.S. providers any favors, that’s for sure.”

Nonetheless, Anderson added, “I think the reality is [the controversy] is likely to die down over time, and we expect adoption to probably continue on the path that it has been on.”

One reason why U.S. providers may not suffer is because “the alternatives aren’t great if you are a European company looking for a cloud service,” he said.

Source

FTC Warns Google And FB

August 30, 2013 by  
Filed under Around The Net

Comments Off on FTC Warns Google And FB

The chairwoman of the Federal Trade Commission (FTC) has promised that her agency will come down hard on companies that do not meet requirements for handling personal data.

FTC Chairwoman Edith Ramirez gave a keynote speech at the Technology Policy Institute at the Aspen Forum. She said that the FTC has a responsibility to protect consumers and prevent them from falling victim to unfair commercial practices.

“In the FTC’s actions against Google, Facebook, Myspace and others, we alleged that each of these companies deceived consumers by breaching commitments to keep their data confidential. That isn’t okay, and it is the FTC’s responsibility to make sure that companies live up to their commitments,” she said.

“All told, the FTC has brought over 40 data security cases under our unfairness and deception authority, many against very large data companies, including LexisNexis, ChoicePoint and Twitter, for failing to provide reasonable security safeguards.”

Ramirez spoke about the importance of consumer privacy, saying that there is too much “shrouding” of what happens in that area. She said that under her leadership the FTC will not be afraid of suing companies when it sees fit.

“A recurring theme I have emphasized – and one that runs through the agency’s privacy work – is the need to move commercial data practices into the sunlight. For too long, the way personal information is collected and used has been at best an enigma enshrouded in considerable smog. We need to clear the air,” she said.

Ramirez compared the work of the FTC to the work carried out by lifeguards, saying that it too has to be vigilant.

“Lifeguards have to be mindful not just of the people swimming, surfing, and playing in the sand. They also have to be alert to approaching storms, tidal patterns, and shifts in the ocean’s current. With consumer privacy, the FTC is doing just that – we are alert to the risks but confident that those risks can be managed,” she added.

“The FTC recognizes that the effective use of big data has the potential to unleash a new wave of productivity and growth. Like the lifeguard at the beach, though, the FTC will remain vigilant to ensure that while innovation pushes forward, consumer privacy is not engulfed by that wave.”

It’s all just lip service, of course. Companies might be nominally bound by US privacy laws in online commerce, and that might be overseen by the FTC, but the US National Security Agency (NSA) collects all internet traffic anyway, and makes data available to other US government agencies and even some private companies.

Source

Google Snubs Privacy

August 29, 2013 by  
Filed under Around The Net

Comments Off on Google Snubs Privacy

Search giant Google has told the British government that it is immune to prosecution on privacy issues and can do what it likes. The US company is accused of illegally snooping on its British customers by bypassing privacy settings on Apple devices, such as iPads, to track their browsing history.

A group of British people took Google to court, but the search engine is trying to get the case thrown out. Its argument is that it is not subject to British privacy law because it is based in California. This is the second time that Google has tried to avoid British law by pretending to operate in another country: it has also come under fire for failing to pay tax in the UK.

Nick Pickles, director of Big Brother Watch, said: ‘It is deeply worrying for a company with millions of British users to be brazenly saying they do not regard themselves bound by UK law.’

Solicitor Dan Tench, of law firm Olswang, said this was another instance of Google being here when it suits it and not being here when it doesn’t. Ironically, when the US ordered Google to stop what it was doing, it forced the search engine to pay $22.5 million to regulators.

There are some indications that Google may not get its way. In July the Information Commissioner’s Office told Google that its privacy rules breached UK law, so it will be very hard for the company to stand up in court and argue otherwise.

Source

Google Encrypts Data

August 27, 2013 by  
Filed under Around The Net

Comments Off on Google Encrypts Data

Google has officially announced that it will encrypt, by default, data warehoused in its Cloud Storage service.

The server-side encryption is now active for all new data written to Cloud Storage, and older data will be encrypted in the coming months, wrote Dave Barth, a Google product manager, in a blog post.

“If you require encryption for your data, this functionality frees you from the hassle and risk of managing your own encryption and decryption keys,” Barth wrote. “We manage the cryptographic keys on your behalf using the same hardened key management systems that Google uses for our own encrypted data, including strict key access controls and auditing.”

The data and metadata around an object stored in Cloud Storage are encrypted with a unique key using the 128-bit Advanced Encryption Standard (AES) algorithm, and the “per-object key itself is encrypted with a unique key associated with the object owner,” Barth wrote.

“These keys are additionally encrypted by one of a regularly rotated set of master keys,” he wrote. “Of course, if you prefer to manage your own keys then you can still encrypt data yourself prior to writing it to Cloud Storage.”
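A minimal sketch of that envelope-encryption layering (using the third-party cryptography package; key names and the exact construction here are illustrative, not Google’s implementation):

```python
# Envelope encryption: a unique per-object key protects the data, an
# owner key wraps the object key, and a rotating master key wraps the
# owner key. 128-bit AES (in GCM mode) is used here to match the article.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(12)                       # 96-bit nonce for AES-GCM
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

master_key = AESGCM.generate_key(bit_length=128)   # rotated regularly
owner_key  = AESGCM.generate_key(bit_length=128)   # per object owner
object_key = AESGCM.generate_key(bit_length=128)   # unique per object

stored = {
    "object":             encrypt(object_key, b"file contents"),
    "wrapped_object_key": encrypt(owner_key, object_key),
    "wrapped_owner_key":  encrypt(master_key, owner_key),
}
print({name: len(blob) for name, blob in stored.items()})
```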

Data collection programs revealed by former U.S. National Security Agency contractor Edward Snowden have raised questions about U.S. government data requests made to Internet companies such as Google for national security investigations.

A Google spokeswoman said via email the company does not provide encryption keys to any government and provides user data only in accordance with the law.

“Our legal team reviews each and every request, and we frequently push back when the requests appear to be fishing expeditions or don’t follow the correct process,” she wrote. “When we are required to comply with these requests, we deliver it to the authorities. No government has the ability to pull data directly from our servers or network.”

Source

Oracle Issues Massive Security Update

July 29, 2013 by  
Filed under Computing

Comments Off on Oracle Issues Massive Security Update

Oracle has issued its critical patch update advisory for July, plugging a total of 89 security holes across its product portfolio.

The fixes focus mainly on remotely exploitable vulnerabilities in four widely used products, with 27 fixes issued for the Oracle Database, Fusion Middleware, the Oracle and Sun Systems Product Suite and the MySQL database.

Out of the 89 security fixes included with this update, the firm said six are for Oracle Database, with one of the vulnerabilities being remotely exploitable without authentication.

Oracle revealed that the highest CVSS Base Score for these database vulnerabilities is 9.0, a score related to vulnerability CVE-2013-3751, which affects the XML Parser on Oracle Database 11.2.0.2 and 11.2.0.3.

A further 21 patched vulnerabilities listed in Oracle’s Critical Patch Update are for Oracle Fusion Middleware; 16 of these vulnerabilities are remotely exploitable without authentication, with the highest CVSS Base Score being 7.5.

As for the Oracle and Sun Systems Products Suite, these products received a total of 16 security fixes, eight of which were also remotely exploitable without authentication, with a maximum CVSS Base Score of 7.8.

“As usual, Oracle recommends that customers apply this Critical Patch Update as soon as possible,” Oracle’s director of Oracle Software Security Assurance Eric Maurice wrote in a blog post.

Craig Young, a security researcher at Tripwire, commented on the Oracle patch, saying the “drumbeat of critical patches” is more than alarming because the vulnerabilities are frequently reported by third parties who presumably do not have access to the full source code.

“It’s also noteworthy that […] every Oracle CPU release this year has plugged dozens of vulnerabilities,” he added. “By my count, Oracle has already acknowledged and fixed 343 security issues in 2013. In case there was any doubt, this should be a big red flag to end users that Oracle’s security practices are simply not working.”

Source

Oracle Changing Berkeley

July 18, 2013 by  
Filed under Computing

Comments Off on Oracle Changing Berkeley

Oracle has changed the license of its embedded database library, Berkeley DB. The software is widely used as a key-value store within other applications and historically used an OSI-approved strong copyleft license which was similar to the GPL.

Under that license, distributing software that embedded Berkeley DB involved also providing “information on how to obtain complete source code for the DB software and any accompanying software that uses the DB software.”

Now future versions of Berkeley DB use the GNU Affero General Public License (AGPL). This says “your modified version must prominently offer all users interacting with it remotely through a computer network … an opportunity to receive the Corresponding Source of your version.”

This will cause problems for Web developers who use Berkeley DB for local storage. Until now, compliance has not really been an issue, because they never “redistributed” the source of their Web apps. Now they will have to make sure the whole Web app complies with the AGPL and make the full corresponding source of the application available.

They also need to ensure that the full app has compatible licensing; in practice, that means the whole source code has to be licensed under the GPLv3 or the AGPL.

Source

Citrix Updates Xen Server

July 3, 2013 by  
Filed under Computing

Comments Off on Citrix Updates Xen Server

Citrix has released its open source XenServer 6.2 to go up against VMware’s closed source, free vSphere hypervisor.

Citrix has for years maintained a free, open source version of its Xen hypervisor, but it has been losing ground to KVM and, in particular, VMware’s free vSphere hypervisor. Now the firm has released XenServer 6.2 along with a community website that it hopes will help increase support for its open source hypervisor.

According to Citrix, XenServer 6.2 supports CloudStack, OpenStack and Citrix’s own CloudPlatform. The firm touted support for the latest guest operating systems, including Microsoft’s Windows 8 and Windows Server 2012.

Sameer Dholakia, group VP and GM of Citrix’s Cloud Platforms Group, said, “The cloud era has brought a lot of exciting opportunities for data center infrastructure, but the reality is that one size doesn’t fit all when it comes to virtualization.

“By empowering our users and partners with a committed open source strategy and community for XenServer – which already powers some of the largest clouds in the world – we are moving the needle in innovation to help customers of all sizes, and at all stages of their cloud strategies, to maximize the benefits they gain from virtualization and the cloud.”

Citrix said XenServer 6.2 supports its XenDesktop software, including IntelliCache and Dynamic Memory Control. The firm said it has added Desktop Director alerts so that administrators can be notified of low resources, to help prevent virtual machines from becoming unusable.

Citrix will be hoping that as firms get used to the free version of XenServer, they will shell out for the paid versions, which cost up to $3,250. However, Citrix’s continued support of its free, open source XenServer means that VMware will have to keep offering a free version of vSphere if it doesn’t want to leave a gap in the market.

Source
