FTC Warns Google And FB
August 30, 2013 by admin
Filed under Around The Net
The Federal Trade Commission (FTC) has promised to come down hard on companies that do not meet requirements for handling personal data.
FTC Chairwoman Edith Ramirez gave a keynote speech at the Technology Policy Institute's Aspen Forum. She said that the FTC has a responsibility to protect consumers and prevent them from falling victim to unfair commercial practices.
“In the FTC’s actions against Google, Facebook, Myspace and others, we alleged that each of these companies deceived consumers by breaching commitments to keep their data confidential. That isn’t okay, and it is the FTC’s responsibility to make sure that companies live up to their commitments,” she said.
“All told, the FTC has brought over 40 data security cases under our unfairness and deception authority, many against very large data companies, including LexisNexis, ChoicePoint and Twitter, for failing to provide reasonable security safeguards.”
Ramirez spoke about the importance of consumer privacy, saying that there is too much “shrouding” of what happens in that area. She said that under her leadership the FTC will not be afraid of suing companies when it sees fit.
“A recurring theme I have emphasized – and one that runs through the agency’s privacy work – is the need to move commercial data practices into the sunlight. For too long, the way personal information is collected and used has been at best an enigma enshrouded in considerable smog. We need to clear the air,” she said.
Ramirez compared the work of the FTC to the work carried out by lifeguards, saying that it too has to be vigilant.
“Lifeguards have to be mindful not just of the people swimming, surfing, and playing in the sand. They also have to be alert to approaching storms, tidal patterns, and shifts in the ocean’s current. With consumer privacy, the FTC is doing just that – we are alert to the risks but confident that those risks can be managed,” she added.
“The FTC recognizes that the effective use of big data has the potential to unleash a new wave of productivity and growth. Like the lifeguard at the beach, though, the FTC will remain vigilant to ensure that while innovation pushes forward, consumer privacy is not engulfed by that wave.”
It’s all just lip service, of course. Companies might be nominally bound by US privacy laws in online commerce, and that might be overseen by the FTC, but the US National Security Agency (NSA) collects all internet traffic anyway, and makes data available to other US government agencies and even some private companies.
Oracle Issues Massive Security Update
Oracle has issued its critical patch update advisory for July, plugging a total of 89 security holes across its product portfolio.
The fixes focus mainly on remotely exploitable vulnerabilities in four widely used products, with 27 of the fixes issued for the Oracle Database and Fusion Middleware alone; the Oracle and Sun Systems Product Suite and the MySQL database account for much of the rest.
Out of the 89 security fixes included with this update, the firm said six are for Oracle Database, with one of the vulnerabilities being remotely exploitable without authentication.
Oracle revealed that the highest CVSS Base Score for these database vulnerabilities is 9.0, a score related to vulnerability CVE-2013-3751, which affects the XML Parser on Oracle Database 11.2.0.2 and 11.2.0.3.
A further 21 patched vulnerabilities listed in Oracle’s Critical Patch Update are for Oracle Fusion Middleware; 16 of these vulnerabilities are remotely exploitable without authentication, with the highest CVSS Base Score being 7.5.
As for the Oracle and Sun Systems Products Suite, these products received a total of 16 security fixes, eight of which were also remotely exploitable without authentication, with a maximum CVSS Base Score of 7.8.
“As usual, Oracle recommends that customers apply this Critical Patch Update as soon as possible,” Eric Maurice, Oracle's director of software security assurance, wrote in a blog post.
Craig Young, a security researcher at Tripwire, commented on the Oracle patch, saying the “drumbeat of critical patches” is especially alarming because the vulnerabilities are frequently reported by third parties who presumably do not have access to full source code.
“It’s also noteworthy that […] every Oracle CPU release this year has plugged dozens of vulnerabilities,” he added. “By my count, Oracle has already acknowledged and fixed 343 security issues in 2013. In case there was any doubt, this should be a big red flag to end users that Oracle’s security practices are simply not working.”
Oracle Changes Berkeley DB License
Oracle has changed the license of its embedded database library, Berkeley DB. The software is widely used as a key-value store within other applications and historically used an OSI-approved strong copyleft license which was similar to the GPL.
Under that license, distributing software that embedded Berkeley DB involved also providing “information on how to obtain complete source code for the DB software and any accompanying software that uses the DB software.”
Now future versions of Berkeley DB use the GNU Affero General Public License (AGPL). This says “your modified version must prominently offer all users interacting with it remotely through a computer network … an opportunity to receive the Corresponding Source of your version.”
This will cause problems for Web developers using Berkeley DB for local storage. Compliance was not previously an issue because they never “redistributed” the source of their Web apps. Now they will have to make sure their whole Web app complies with the AGPL and make the full corresponding source of the application available.
They also need to ensure the full app has compatible licensing. Practically, that means the whole source code has to be licensed under the GPLv3 or the AGPL.
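To picture what "embedding a key-value store" means here, the sketch below uses Python's standard-library dbm module, whose get/put-by-key interface mirrors the classic Berkeley DB usage pattern (the filename and keys are illustrative, and dbm may select a non-Berkeley backend depending on the platform):

```python
import dbm

# A key-value store maps byte-string keys to byte-string values, with no
# schema or query language; "c" opens the file, creating it if missing.
with dbm.open("example_kv.db", "c") as db:
    db[b"user:42"] = b"alice"        # put
    value = db[b"user:42"]           # get by key

print(value.decode())
```

Shipping an application with a library embedded like this is exactly the scenario the license change affects.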
Will Oracle Retire MySQL?
MySQL founder Michael “Monty” Widenius claims that Oracle is killing off his MySQL database, and he is recommending that people move to his new project, MariaDB. In an interview with Muktware, Widenius said that MariaDB, which is also open source, is on track to replace MySQL at Wikimedia and other major organizations and companies.
He said MySQL was widely popular long before it was bought by Sun because it was free and had good support; there was a rule that anyone should be able to get MySQL up and running in 15 minutes. Widenius was concerned about MySQL's sale to Oracle and has watched MySQL's popularity decline since. He said that Oracle was making a number of mistakes: new “enterprise” extensions in MySQL are closed source, the bugs database is not public, and the MySQL public repositories are no longer actively updated.
Widenius said that security problems were not communicated or addressed quickly, and that instead of fixing bugs, Oracle is removing features. It is not all bad: some of Oracle's new code is surprisingly good, but the quality varies and a notable part needs to be rewritten before it can be included in projects like MariaDB. Widenius said it is impossible for the community to work with the MySQL developers at Oracle, as Oracle doesn't accept patches, has no public roadmap, and offers no way to discuss with MySQL developers how to implement things or how the current code works.
Basically, Oracle has made the project less open and its popularity has tanked, while at the same time more open versions of the code, such as MariaDB, are rising in popularity.
Cloud Storage Specs Approved
The International Organization for Standardization (ISO) has ratified the Cloud Data Management Interface (CDMI), a set of protocols defining how businesses can safely transport data between private and public clouds.
The Storage Networking Industry Association’s (SNIA) Cloud Storage Initiative Group submitted the standard for approval by the ISO last spring. CDMI is the first industry-developed open standard specifically for data storage as a service.
“There is strong demand for cloud computing standards and to see one of our most active consortia partners contribute this specification in such a timely fashion is very gratifying,” Karen Higginbottom, chairwoman of the ISO committee, said in a statement. “The standard will improve cloud interoperability.”
The CDMI specification defines an interface for accessing data in the cloud that preserves metadata about the information an enterprise stores there. With metadata attached to the information, companies can retrieve data no matter where it's stored.
“With the metadata piece, it’s also complementary with existing interfaces. The standard can be used with Amazon, for file or block data and it can use any number of storage protocols, such as NFS, CIFS or iSCSI,” said SNIA Chairman Wayne Adams.
Based on a RESTful HTTP protocol, CDMI provides both a data path and control path for cloud storage and standardizes a common interoperable format for securely moving data and its associated data requirements from cloud to cloud. The standard applies to public, private and hybrid deployment models for storage clouds.
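As a rough sketch of how a CDMI client sees this, the snippet below parses a hand-written stand-in for a CDMI object response; the header names follow the published CDMI spec, but the URL-free example object, its name, and its metadata values are invented for illustration:

```python
import json

# Headers a CDMI client would send when fetching an object, per the spec:
# the version header negotiates the protocol, Accept picks the object form.
request_headers = {
    "Accept": "application/cdmi-object",
    "X-CDMI-Specification-Version": "1.0.2",
}

# Invented stand-in for a CDMI response body: the data and its metadata
# travel together in one JSON document, which is what lets an object move
# between clouds without losing its context.
sample_response = """{
    "objectType": "application/cdmi-object",
    "objectName": "report.txt",
    "metadata": {"cdmi_size": "2048", "owner": "alice"},
    "value": "quarterly figures..."
}"""

obj = json.loads(sample_response)
print(obj["objectName"], obj["metadata"]["cdmi_size"])
```

Because the metadata rides along with the value, a receiving cloud can honor size, ownership and retention requirements without any out-of-band bookkeeping.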
Intel Partners With VMware
Intel has teamed up with Microsoft’s rival VMware to deliver a platform for “trusted cloud.”
The technology will combine Intel's Trusted Execution Technology (TXT) with VMware's vSphere 5.1, a platform for building cloud infrastructures. Intel said its hardware-enhanced security capabilities, integrated directly into the processor, combined with vSphere 5.1 would provide a hardened, high-integrity platform for running business-critical applications in private and public cloud environments.
Intel thinks that the biggest barrier to cloud adoption is companies' worry about security. Jason Waxman, general manager of Intel's Cloud Infrastructure Group, said in a statement that Intel TXT provides hardware enforcement to help overcome some of the most challenging aspects of cloud security, including detection and prevention of BIOS attacks and evolving forms of stealthy malware, such as rootkits.
Google Rewrites Web Pages For Speed
August 2, 2011 by admin
Filed under Around The Net
Google has created a hosted service that analyzes Web pages, rewrites their code to make them perform better and serves them up from Google servers.
To use the Page Speed Service, Web publishers must sign up and point their site’s DNS entry to Google. The service grabs the site’s content, optimizes it for speed and delivers the pages to end users.
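In practice, “pointing the site's DNS entry to Google” means replacing the site's host record with a CNAME aimed at Google's proxy, roughly like the hypothetical zone-file entry below (the target hostname is a placeholder, not Google's actual endpoint):

```
; Before: www resolves directly to the publisher's own server.
; After:  www is a CNAME to the optimizing proxy, which fetches,
;         rewrites and serves the pages on the publisher's behalf.
www.example.com.   3600   IN   CNAME   pagespeed-proxy.example.googleusercontent.com.
```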
Visitors will continue to access a site in the same way as before but could see speed enhancements of 25% to 60%, according to Google.
The service is currently being offered free to a limited number of hand-selected webmasters. Google will announce pricing and other details later. Webmasters can sign up to receive information.
Conficker Worm Still Wreaking Havoc
Security firms fighting the dreaded Conficker worm claim that they have it on the ropes. A team of computer-security researchers said they managed to neutralize the worm's impact by blocking its ability to communicate with its developer, who remains anonymous.
Unfortunately, after years of fighting Conficker, security experts estimate the worm still infects between five million and fifteen million computers. The Conficker worm showed up in 2008. Its intent is to disable a computer's security measures, including Windows software updates and antivirus protection, leaving machines vulnerable to more malicious software.