Do Smartphones Cause Cancer?
May 18, 2016 by admin
Filed under Around The Net
It is looking incredibly unlikely that mobile phone use is giving anyone cancer. A long-term study of brain cancer incidence in the Australian population between 1982 and 2013 shows no marked increase.
The study, summarized on The Conversation site, compared the prevalence of mobile phones among the population with brain cancer rates, using data from national cancer registrations.
The results showed a very slight increase in brain cancer rates among males and a stable level among females. There were significant increases among the over-70s, but that rise began before 1982.
If anything, the figures should have been higher, as computed tomography (CT), magnetic resonance imaging (MRI) and related techniques, introduced in Australia in the late 1970s, can spot brain tumors that would otherwise have remained undiagnosed.
The data lines up with studies conducted in other countries, but Australia has an advantage: all diagnosed cases of cancer must be registered by law, which makes for consistent data.
The argument that mobile phones cause cancer has been running ever since the phones first arrived. In fact, the radiation levels of phones have dropped significantly over the years, on a better-safe-than-sorry basis. Even so, it looks like phones have had little impact on cancer statistics, at least in Australia.
http://www.thegurureview.net/mobile-category/do-smartphones-cause-cancer.html
Hospitals Should Brace For Surge In Ransomware Attacks
U.S. hospitals should brace for a surge in “ransomware” attacks by cyber criminals who take computer networks hostage, then demand payment in return for unlocking them, a non-profit healthcare group warned on Friday.
The Health Information Trust Alliance conducted a study of some 30 mid-sized U.S. hospitals late last year and found that 52 percent of them were infected with malicious software, HITRUST Chief Executive Daniel Nutkis told Reuters.
The most common type of malware was ransomware, Nutkis said, which was present in 35 percent of the hospitals included in the study of network traffic conducted by security software maker Trend Micro Inc.
Ransomware is malicious software that locks up data in computers and leaves messages demanding payment to recover the data. Last month, Hollywood Presbyterian Hospital in Los Angeles paid a ransom of $17,000 to regain access to its systems.
This week, an attack on MedStar Health forced the largest healthcare provider in Washington, D.C., to shut down much of its computer network. The Baltimore Sun reported a ransom of $18,500 was sought. MedStar declined to comment.
HITRUST said it expects such attacks to become more frequent because ransomware has turned into a profitable business for cyber criminals.
The results of the study, which HITRUST has yet to share with the public, demonstrate that hackers have moved away from focusing on stealing patient data, Nutkis said.
“If stuff isn’t working, they move on. If stuff is working, they keep doing it,” said Nutkis. “Organizations that are paying have considered their options, and unfortunately they don’t have a lot of options.”
Extortion has become more popular with cyber criminals because it is seen as a way to generate fast money, said Larry Whiteside, a healthcare expert with cyber security firm Optiv.
Stealing healthcare data is far more labor intensive, requiring attackers to keep their presence in a victim’s network undetected for months as they siphon off data, and then to find buyers for it, he added.
“With ransomware I’m going to get paid immediately,” Whiteside said.
Courtesy: http://www.thegurureview.net/aroundnet-category/hospitals-should-brace-for-surge-in-ransomware-attacks.html
Deutsche Bank Taking Dives Into ‘Big Data’
December 14, 2015 by admin
Filed under Around The Net
Deutsche Bank is undertaking a major computer systems overhaul that will help it to make greater use of so-called “big data” to provide a detailed picture of how, when and where customers interact with it, the bank’s chief data officer said in an interview.
JP Rangaswami, who joined Deutsche Bank in January as its first-ever chief data officer, said better and cheaper metadata was allowing the bank to analyze previously inaccessible information.
“We are able to see patterns that we could not see beforehand, allowing us to gain insights we couldn’t gain before,” Rangaswami told Reuters in an interview.
Upgrading the technical infrastructure Deutsche Bank needs to get the most out of this data is a priority for Chief Executive John Cryan. He is trying to improve the performance of Germany’s biggest bank, which is struggling to adapt to the tougher climate for banks since the financial crisis.
Cryan, who unveiled a big overhaul at Deutsche on Oct. 29, said at the time that imposing standards on Deutsche’s IT infrastructure was key to improving controls and reducing overheads.
The CEO said in the October presentation that IT design had occurred in silos, with little or no common standards applied. “Our systems are disjointed, cumbersome and far too often just plain incompatible.”
An annual global survey of more than 200 senior bankers, published last week by banking software firm Temenos, found that “IT Modernization” was now the top priority, displacing earlier investment objectives such as regulation and customer-friendly mobile apps. IT modernization ranked only fourth among major priorities in last year’s survey.
The shift toward technology as a priority shows the extent of the challenge facing banks to modernize infrastructure to analyze internal customer data and try to fend off competition from new financial technology companies.
Rangaswami, who was chief scientist at Silicon Valley marketing software giant Salesforce from 2010 until 2014, said the data would allow Deutsche to tailor services to customers’ needs and to identify bottlenecks and regional implications faster and solve problems more quickly.
Source: http://www.thegurureview.net/aroundnet-category/deutsche-bank-taking-a-deeper-dive-into-big-data.html
Is HP’s Forthcoming Split A Good Idea?
HP has released its financial results for the third quarter, and they make for somewhat grim reading.
The company has seen drops in key parts of the business and an overall drop in GAAP net revenue of eight percent year on year to $25.3bn, compared with $27.6bn in 2014.
The company failed to meet its projected net earnings per share, which it had put at $0.50-$0.52, with an actual figure of $0.47.
The figures reflect a time of deep uncertainty at the company as it moves ever closer to its demerger into HP and Hewlett Packard Enterprise. The latter began filing registration documents in July to assert its existence as a separate entity, while the boards of both companies were announced two weeks ago.
Dell CEO Michael Dell slammed the move in an exclusive interview with The INQUIRER, saying he would never do the same to his company.
The big boss at HP remained upbeat despite the shortfall against expectations. “HP delivered results in the third quarter that reflect very strong performance in our Enterprise Group and substantial progress in turning around Enterprise Services,” said Meg Whitman, chairman, president and chief executive of HP.
“I am very pleased that we have continued to deliver the results we said we would, while remaining on track to execute one of the largest and most complex separations ever undertaken.”
To which we have to ask: “Which figures were you looking at, lady?”
Breaking down the figures by business unit, Personal Systems revenue was down 13 percent year on year, while notebook sales fell three percent and desktops 20 percent.
Printing was down nine percent, but with a 17.8 percent operating margin. HP has been looking at initiatives such as ink subscriptions to create loyalty among print users.
The Enterprise Group, soon to be spun off, was up two percent year on year, although a 21 percent drop in Business Critical Systems revenue was offset by networking revenue, which climbed 22 percent.
Enterprise Services revenue dropped 11 percent with a six percent margin, while software dropped six percent with a 20.6 percent margin. Software-as-a-service revenue dropped by four percent.
HP Financial Services was down six percent, with a two percent decrease in net portfolio assets and a two percent decrease in financing volume.
Source: http://www.thegurureview.net/computing-category/is-hps-forthcoming-split-a-good-idea.html
Is Oracle Sliding?
Oracle said weak sales of its traditional database software licenses were made worse by a strong US dollar, which lowered the value of foreign revenue.
Shares of Oracle, often seen as a barometer for the technology sector, fell 6 percent to $42.15 in extended trading after the company’s earnings report on Wednesday.
Shares of Microsoft and Salesforce.com, two of Oracle’s closest rivals, were close to unchanged.
Daniel Ives, an analyst at FBR Capital Markets, said that the announcement speaks to the headwinds Oracle is seeing in the field as its legacy database business experiences slowing growth.
It also shows that, while the cloud business has seen pockets of strength, it is not doing as well as many had thought.
Oracle, like other established tech companies, is looking to move its business to the cloud-computing model, essentially providing services remotely via data centres rather than selling installed software.
The 38-year-old company has had some success with the cloud model, but is not moving fast enough to make up for declines in its traditional software sales.
Oracle, along with German rival SAP, has been losing market share in customer relationship management software in recent years to Salesforce.com, which offers only cloud-based services.
Because of lower software sales and the strong dollar, Oracle’s net income fell to $2.76 billion, or 62 cents per share, in the fourth quarter ended May 31, from $3.65 billion, or 80 cents per share, a year earlier.
Revenue fell 5.4 percent to $10.71 billion, though it rose 3 percent on a constant currency basis. Analysts had expected revenue of $10.92 billion, on average.
Sales from Oracle’s cloud-computing software and platform service, an area keenly watched by investors, rose 29 percent to $416 million.
Oracle Launches OpenStack Platform With Intel
Oracle and Intel have teamed up for the first demonstration of carrier-grade network function virtualization (NFV), which will allow communication service providers to use a virtualized, software-defined model without degradation of service or reliability.
The Oracle-led project uses the Intel Open Network Platform (ONP) to create a robust service over NFV, using intelligent direction of software to create viable software-defined networking that replaces the clunky equipment still prevalent in even the most modern networks.
Barry Hill, Oracle’s global head of NFV, told The INQUIRER: “It gets us over one of those really big hurdles that the industry is desperately trying to overcome: ‘Why the heck have we been using this very tightly coupled hardware and software in the past if you can run the same thing on standard, generic, everyday hardware?’. The answer is, we’re not sure you can.
“What you’ve got to do is be smart about applying the right type and the right sort of capacity, which is different for each function in the chain that makes up a service.
“That’s about being intelligent with what you do, instead of making some broad statement about generic vanilla infrastructures plugged together. That’s just not going to work.”
Oracle’s answer is to use its Communications Network Service Orchestration Solution to control the OpenStack system and shrink and grow networks according to customer needs.
Use cases could be scaling out a carrier network for a rock festival, or transferring network priority to a disaster recovery site.
“Once you understand the extent of what we’ve actually done here, you start to realize just how big an announcement this is,” said Hill.
“On the fly, you’re suddenly able to make these custom network requirements instantly, just using off-the-shelf technology.”
The demonstration configuration optimizes the performance of an Intel Xeon E5-2600 v3 processor designed specifically for networking, and shows for the first time a software-defined solution which is comparable to the hardware-defined systems currently in use.
In other words, it can orchestrate services from the management and orchestration level right down to a single core of a single processor, and then hyperscale it using resource pools to mimic the specialized characteristics of a network appliance, such as a large memory page.
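To make that orchestration idea a little more concrete, here is a minimal, purely illustrative Python sketch of the kind of logic described above: virtual network function instances are pinned to individual cores drawn from a shared resource pool and scaled out or in as demand changes. Every class name, capacity and number below is a made-up assumption for illustration; this is not Oracle's Network Service Orchestration API or Intel's ONP.

```python
# Illustrative sketch only: a toy model of NFV-style orchestration in which
# virtual network functions (VNFs) are pinned to cores from a shared pool
# and scaled out or in as demand changes. All names, capacities and
# thresholds are hypothetical.

from dataclasses import dataclass, field


@dataclass
class CorePool:
    """A pool of general-purpose CPU cores available to the orchestrator."""
    free_cores: list = field(default_factory=lambda: list(range(16)))

    def claim(self) -> int:
        if not self.free_cores:
            raise RuntimeError("resource pool exhausted")
        return self.free_cores.pop(0)

    def release(self, core: int) -> None:
        self.free_cores.append(core)


@dataclass
class VNFInstance:
    """One instance of a virtual network function pinned to a single core."""
    function: str
    core: int
    capacity_mbps: int = 1_000  # assumed per-core throughput


class Orchestrator:
    """Grows or shrinks a service chain to match offered load."""

    def __init__(self, pool: CorePool):
        self.pool = pool
        self.instances = {}  # function name -> list of running VNFInstances

    def scale_to_demand(self, function: str, demand_mbps: int) -> None:
        running = self.instances.setdefault(function, [])
        # Ceiling division: how many per-core instances the demand requires.
        needed = max(1, -(-demand_mbps // VNFInstance.capacity_mbps))
        while len(running) < needed:        # scale out onto free cores
            running.append(VNFInstance(function, self.pool.claim()))
        while len(running) > needed:        # scale in, returning cores to the pool
            self.pool.release(running.pop().core)
        print(f"{function}: {len(running)} instance(s) on cores "
              f"{[i.core for i in running]} for {demand_mbps} Mbps")


if __name__ == "__main__":
    orchestrator = Orchestrator(CorePool())
    orchestrator.scale_to_demand("firewall", 2_500)  # e.g. a festival traffic spike
    orchestrator.scale_to_demand("firewall", 800)    # load falls back, cores freed
```

The real product presumably layers policy, telemetry and OpenStack integration on top of something like this, but the scale-out/scale-in loop captures the basic idea of matching generic hardware capacity to each function in the service chain.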
“It’s kind of like the effect that mobile had on fixed line networks back in the mid-nineties where the whole industry was disrupted by who was providing the technology, and what they were providing,” said Hill.
“Suddenly you went from 15-year business plans to five-year business plans. The impact of virtualization will have the same level of seismic change on the industry.”
Today’s announcement is fundamentally a proof-of-concept, but the technology that powers this kind of next-generation network is already evolving its way into networks.
Hill explained that carrier demand had led to the innovation. “The telecoms industry had a massive infrastructure that works at a very slow pace, at least in the past,” he said.
“However, this whole virtualization push has really been about the carriers, not the vendors, getting together and saying: ‘We need a different model’. So it’s actually quite advanced already.”
NFV appears to be the next gold rush area for enterprises, and other consortia are expected to make announcements about their own solutions within days.
The Oracle/Intel system is based around OpenStack, and the company is confident that it will be highly compatible with other systems.
The ‘Oracle Communications Network Service Orchestration Solution with Enhanced Platform Awareness using the Intel Open Network Platform’ – or OCNSOSWEPAUTIONP as we like to think of it – is currently on display at Oracle’s Industry Connect event in Washington DC.
The INQUIRER wonders whether there is any way the marketing department can come up with something a bit more catchy than OCNSOSWEPAUTIONP before it goes on open sale.
Juniper Networks Goes OpenStack
Juniper and Mirantis are getting close, with news that they are to form a cloud OpenStack alliance.
The two companies have signed an engineering partnership that they believe will lead to a reliable, scalable software-defined networking solution.
Mirantis OpenStack will now inter-operate with Juniper Contrail Networking, as well as OpenContrail, an open source software-defined networking system.
The two companies have published a reference architecture for deploying and managing Juniper Contrail Networking with Mirantis OpenStack to simplify deployment and reduce the need for third-party involvement.
Based on OpenStack Juno, Mirantis OpenStack 6.0 will be enhanced by a Fuel plugin in the second quarter that will make it even easier to deploy large-scale clouds in house.
However, Mirantis has emphasized that the arrival of Juniper to the fold is not a snub to the recently constructed integration with VMware.
Nick Chase of Mirantis explained, “…with this Juniper integration, Mirantis will support BOTH VMware vCenter Server and VMware NSX AND Juniper Networks Contrail Networking. That means that even if they’ve got VMware in their environment, they can choose to use NSX or Contrail for their networking components.
“Of course, all of that begs the question, when should you use Juniper, and when should you use VMware? Like all great engineering questions, the answer is ‘it depends’. How you choose is going to be heavily influenced by your individual situation, and what you’re trying to achieve.”
Juniper outlined its goals for the tie-up as:
– Reduce cost by enabling service providers and IT administrators to easily embrace SDN and OpenStack technologies in their environments
– Remove the complexity of integrating networking technologies in OpenStack virtual data centres and clouds
– Increase the effectiveness of their operations with fully integrated management for the OpenStack and SDN environments through Fuel and Juniper Networks® Contrail SDN Controller
The company is keen to emphasize that this is not meant to be a middle finger at VMware, but rather a demonstration of the freedom of choice offered by open source software. However, it serves as another demonstration of how even the FOSS market is growing increasingly proprietary and competitive.
Are We Moving Too Fast Into Cloud Computing?
January 7, 2015 by admin
Filed under Around The Net
Businesses need to take a hybrid approach when it comes to the cloud, Dell has said.
The firm’s cloud strategy leader, Gordon Davey, told V3.co.uk in an interview that cloud computing is “overhyped” and moving an entire IT infrastructure into the cloud would be an unrealistic goal.
Davey also believes that cloud vendors have enticed companies to make major shifts to the cloud without considering a model that works for their business.
“I think it’s definitely a case of cloud as a buzzword is overhyped. The idea of cloud for the sake of cloud doesn’t really stand out,” he said.
“The problem comes from customers that have seen the buzzword and want to get the benefits and are just jumping on the bandwagon because it is an industry hype thing, rather than actually evaluating the benefits that a true cloud can bring, and applying that to their business requirements.”
Davey outlined the need to take a more considered approach, adopting an IT strategy that mixes on-premise infrastructure with cloud components to harness the technology without escalating IT costs and complexity.
“The future is going to be hybrid. It’s horses for courses – putting the right workload on the right platform,” he said.
“It’s that balanced approach that I think we’re going to see much more often, rather than trying to put everything into the cloud and potentially failing.”
Davey’s position is unsurprising given Dell’s approach of acting as a ‘middleman’ between cloud service providers and end users, providing hardware, software, services and consultancy to enable businesses to use cloud computing in a way that works for them.
“We see our role as enabling the cloud industry, being that underlying technology,” he said, going on to detail Dell’s five pillar approach to acting as a cloud middleman rather than developing its own end-to-end cloud offering.
The strategy involves consulting on a customer’s cloud needs, helping provide cloud infrastructure, brokering deals between vendors and users, providing security, and managing how multiple cloud services are deployed in a single business.
Davey claimed that Dell’s strategy will help companies take a more tailored approach to cloud adoption, adding: “A properly deployed cloud for the correct workloads is hugely beneficial.”
Dell is not alone in promoting a hybrid approach to cloud adoption. Microsoft is adding hybrid cloud capability to the next version of Windows Server.
New Data Suggest IT Hiring Increasing
November 21, 2014 by admin
Filed under Around The Net
Whenever IT hiring increases, as it did last month, the default explanation from analysts is this: The economy is improving.
That might be true, and it may well explain the U.S. Department of Labor’s report today that showed the U.S., overall, added 214,000 jobs last month.
Of that total employment gain, IT hiring grew by 7,800 jobs in October, compared with a gain of 6,900 jobs in September, according to TechServe Alliance, an IT industry group.
Another IT labor analyst group, Janco Associates, calculated last month’s IT gains at 9,500 jobs.
Government data can be reported in different ways, depending on which job categories are included in the IT job estimates, which is why analysts report job numbers differently.
Hiring trends are also affected by Labor Department adjustments, and the government’s adjusted data adds nearly 25,000 telecom jobs over the past two months, according to Janco. Because of this adjustment, Janco termed the recent growth in IT over the past several months “explosive,” while TechServe put last month’s results as “modestly stronger.”
There is no one reason for October’s gain. An improving economy may be at the heart of any answer. Independent of the government numbers, Computer Economics, in a recent report on contingent versus full-time hiring, said it is seeing a drop in the use of contract workers at large companies and more reliance on full-time workers, which is a sign of an improving economy.
Amazon Tops Apple
November 13, 2014 by admin
Filed under Around The Net
A mere five months after Apple snatched J.D. Power’s tablet satisfaction award away from Samsung, it has lost it to up-and-coming Amazon.
Apple’s iPad finished in second place in the latest satisfaction survey conducted by J.D. Power and Associates, with a score of 824 out of a possible 1,000. For the first time, Amazon took first place, scoring 827.
Samsung came in third at 821, while Asus and Acer rounded out the top five, though those stragglers’ scores were under the category average.
J.D. Power’s satisfaction score included five separate measurements for performance, ease of operation, features, styling and design, and cost, with each accounting for different percentages of the final number. Performance, for example, counted as 28% of the total; cost for 11%.
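As a rough illustration of how such a weighted composite works, here is a short Python sketch. Only the 28 percent performance and 11 percent cost weights come from the article; the remaining weights and all of the component scores below are hypothetical placeholders chosen so the weights sum to 100 percent.

```python
# Illustrative sketch of a weighted composite on a 1,000-point scale.
# Only the 28% performance and 11% cost weights come from the article; the
# other weights and every component score below are hypothetical placeholders.

WEIGHTS = {
    "performance": 0.28,         # stated in the article
    "ease_of_operation": 0.22,   # assumed
    "features": 0.22,            # assumed
    "styling_and_design": 0.17,  # assumed
    "cost": 0.11,                # stated in the article
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9


def composite_score(component_scores: dict) -> float:
    """Weighted average of per-component scores, each on a 1,000-point scale."""
    return sum(WEIGHTS[name] * score for name, score in component_scores.items())


# Hypothetical component scores for a fictional tablet:
example = {
    "performance": 850,
    "ease_of_operation": 840,
    "features": 810,
    "styling_and_design": 820,
    "cost": 790,
}
print(round(composite_score(example)))  # 827 with these made-up numbers
```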
Apple received high scores in performance and styling and design, while Amazon performed best in ease of operation and cost, said Kirk Parsons, senior director of telecommunications services at J.D. Power.
“Within the tablet segment, there’s a balance of cost and value, and for this period, Amazon was at the equilibrium,” said Parsons. “For the money, [Amazon tablets] do what buyers need them to do. And the Mayday feature really helped them in ease of operation.”
Mayday is a feature on Amazon’s higher-end tablets that lets customers video chat with support representatives using the device.
Parsons called out Amazon’s Fire HDX, which launched in October 2013 in a 7-in. size and a month later in an 8.9-in. format, for driving the brand’s scores. Amazon now sells the 7-in. Fire HDX for $179; the 8.9-in. model starts at $379. “The new Fire HDX did really, really well” in the survey, Parsons noted.
J.D. Power polled nearly 2,700 U.S. tablet owners who had had their current devices for less than a year. The survey period ran from March to August.
The last time J.D. Power published tablet customer satisfaction scores, Amazon placed fourth. Its jump to first was a small surprise, said Parsons. “I figured [Amazon’s] scores would improve, but I didn’t think they’d take the top spot,” he admitted.
Price is increasingly important to satisfaction, said Parsons, as costs fall and capabilities climb across the board, making it more difficult for premium-priced tablets like Apple’s iPad to retain their positions in the poll. On average, tablet customers now spend $345 on their tablets, $48 less than in April 2013, a decline of 12%.
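As a quick sanity check on those quoted figures (an illustration only, not part of the survey methodology):

```python
# Quick arithmetic check of the quoted figures (illustration only).
current_avg = 345                     # average tablet spend now, in dollars
drop = 48                             # "$48 less than in April 2013"
previous_avg = current_avg + drop     # implies $393 in April 2013
decline = drop / previous_avg         # 48 / 393 is roughly 0.122
print(f"Implied April 2013 average: ${previous_avg}, decline: {decline:.0%}")  # 12%
```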