
Techies Demand More Money

February 11, 2014
Filed under Around The Net

Employers may need to loosen their purse strings to retain their IT staffers in 2014, according to a salary survey from IT career website Dice.com.

Among the tech workers who anticipate changing employers in 2014, 68 percent listed more compensation as their reason for leaving. Other factors include improved working conditions (48 percent), more responsibility (35 percent) and the possibility of losing their job (20 percent). The poll, conducted online between Oct. 14 and Nov. 29 last year, surveyed 17,236 tech professionals.

Fifty-four percent of the workers polled weren’t content with their compensation. This figure is down from 2012’s survey, when 57 percent of respondents were displeased with their pay.

Even so, that lingering dissatisfaction could mean companies will face IT staff retention challenges this year, since 65 percent of respondents said they’re confident they can find a new, better position in 2014.

This dissatisfaction over pay comes even though the survey, released Wednesday, showed that the average tech salary rose 2.6 percent in 2013 to US$87,811 and that more companies gave merit raises. The main reason for last year’s bump in pay, according to 45 percent of respondents, was a merit raise. In comparison, the average tech salary was $85,619 in 2012 and 40 percent of those polled said they received a merit raise.

Meanwhile, 26 percent of respondents attributed their 2013 salary increase to taking a higher-paying job at another company.

Employers realize tech talent is coveted and are attempting to keep workers satisfied by offering them a variety of incentives, the survey found. In 2013, 66 percent of employers provided incentives to retain workers. The two most popular incentives were increased compensation and more interesting work. Incentives that allow employees to better balance their work and personal lives were also offered, such as telecommuting and a flexible work schedule.

Skills that commanded six-figure salaries in 2013 came from some of the hottest areas of IT. Data science led the way, with big data backgrounds yielding some of the highest salaries. People skilled in R, the popular statistical computing language, can expect to make $115,531 on average, while those with NoSQL database development skills command an average salary of $114,796. IT pros who use MapReduce to process large data sets make $114,396 on average.

Source

VMware Buys Airwatch

February 4, 2014
Filed under Computing

VMware will buy mobile management and security startup outfit Airwatch for $1.54 billion.

The firm announced today that the deal has been approved by both companies’ boards and is forecast to close by the end of this quarter.

The deal will see VMware, which also announced estimated revenue of $1.48bn for the fourth quarter of 2013, pay $1.175bn in cash and $365m in installment payments.

Airwatch has nine offices worldwide with a workforce of 1,600 people and lists over 10,000 global customers.

The acquisition, which will help redefine VMware’s product portfolio and bring it more up to date with the industry’s threat landscape, will see the integration of Airwatch staff into the company’s End-User Computing Group, with the team working from its Atlanta base. VMware said the team will continue to answer directly to Airwatch founder and CEO John Marshall, who will report to ex-Intel executive and VMware CEO Pat Gelsinger.

VMware EVP and GM of the End-User Computing group Sanjay Poonen said that the company plans to expand Airwatch’s Atlanta offices to become the centre of its mobile operations.

“Our vision is to provide a secure virtual workspace that allows end users to work at the speed of life,” he said. “The combination of Airwatch and VMware will enable us to deliver unprecedented value to our customers and partners across their desktop and mobile environments.”

Almost a year ago, VMware announced a two percent increase in quarterly profits despite an impressive 22 percent increase in sales, and announced 900 job cuts.

The virtualization specialist is one of many firms to acquire security companies over the past year. Advanced threat specialist Fireeye announced plans earlier in January to buy end-point protection firm Mandiant for $1bn.

Source

Is The Tech Industry Going Independent?

January 2, 2014
Filed under Computing

The tech industry is undergoing a shift toward a more independent, contingent IT workforce. And while that trend might not be cause for alarm for retiring baby boomer IT professionals, it could mean younger and mid-career workers need to prepare to make a living solo.

About 18% of all IT workers today are self-employed, according to an analysis by Emergent Research, a firm focused on small business trends. This independent IT workforce is growing at about 7% per year, faster than the 5.5% growth rate for independent workers overall.

The definition of independent workers covers people who work at least 15 hours a week.

Steve King, a partner at Emergent, said the growth in independent workers is being driven by companies that want to stay ahead of change, and can bring in workers with the right skills. “In today’s world, change is happening so quickly that everyone is trying to figure out how to be more flexible and agile, cut fixed costs and move to variable costs,” said King. “Unfortunately, people are viewed as a fixed cost.”

King worked with MBO Partners to produce a recent study that estimated the entire independent worker headcount in the U.S., for all occupations, at 17.7 million. They also estimate that around one million of them are IT professionals.

A separate analysis by research firm Computer Economics finds a similar trend. Over the last two years, there has been a spike in the use of contract labor among large IT organizations — firms with IT operational budgets of more than $20 million, according to John Longwell, vice president of research at Computer Economics.

This year, contract workers make up 15% of a typical large organization’s IT staff at the median. This is up from a median of just 6% in 2011, said Longwell. The last time there was a similar increase in contract workers was in 1998, during the dot.com boom and the run-up to Y2K remediation efforts. Computer Economics recently published a research brief on the topic.

“The difference now is that use of contract or temporary workers is not being driven by a boom, but rather by a reluctance to hire permanent workers as the economy improves,” Longwell said.

Computer Economics expects large IT organizations to step up hiring in 2014, which may cause the percentage of contract workers to decline back to a more normal 10% level. But, Longwell cautioned, it’s not clear whether that new hiring will involve full-time employees or even more contract labor.

Source

App Stores For Supercomputers En Route

December 13, 2013
Filed under Computing

A major problem facing supercomputing is that the firms that could benefit most from the technology aren’t using it. It is a dilemma.

Supercomputer-based visualization and simulation tools could allow a company to create, test and prototype products in virtual environments. Couple this virtualization capability with a 3-D printer, and a company would revolutionize its manufacturing.

But licensing fees for the software needed to simulate wind tunnels, ovens, welds and other processes are expensive, and the tools require large multicore systems and skilled engineers to use them.

One possible solution: taking an HPC process and converting it into an app.

This is how it might work: A manufacturer designing a part to reduce drag on an 18-wheel truck could upload a CAD file, plug in some parameters, hit start and let it use 128 cores of the Ohio Supercomputer Center’s (OSC) 8,500 core system. The cost would likely be anywhere from $200 to $500 for a 6,000 CPU hour run, or about 48 hours, to simulate the process and package the results up in a report.

Testing that 18-wheeler in a physical wind tunnel could cost as much as $100,000.
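The arithmetic behind that example is worth spelling out. The sketch below uses only the figures quoted above (128 cores, a 6,000 CPU-hour run, a $200 to $500 price range); the Python helper and the per-CPU-hour rates it derives are purely illustrative and not part of any AweSim interface.

```python
# Back-of-the-envelope estimate for the truck-drag example above.
# The inputs (128 cores, 6,000 CPU-hours, $200-$500) come from the article;
# the function and rate names are illustrative, not an AweSim API.

def estimate_run(cpu_hours, cores, low_rate, high_rate):
    """Return wall-clock hours and a low/high cost range for a batch HPC run."""
    wall_clock_hours = cpu_hours / cores  # assumes all cores stay busy for the whole run
    return wall_clock_hours, (cpu_hours * low_rate, cpu_hours * high_rate)

if __name__ == "__main__":
    hours, (low, high) = estimate_run(
        cpu_hours=6_000,         # total core-hours quoted for the simulation
        cores=128,               # cores allocated on OSC's 8,500-core system
        low_rate=200 / 6_000,    # ~$0.033 per CPU-hour at the $200 end
        high_rate=500 / 6_000,   # ~$0.083 per CPU-hour at the $500 end
    )
    print(f"~{hours:.0f} hours wall clock, roughly ${low:.0f}-${high:.0f}")
    # -> ~47 hours wall clock, roughly $200-$500,
    #    versus as much as $100,000 in a physical wind tunnel
```

At roughly three to eight cents per CPU-hour, the run works out to about 47 hours of wall-clock time on 128 cores, which is why the example rounds it to about 48 hours.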

Alan Chalker, the director of the OSC’s AweSim program, uses that example to explain what his organization is trying to do. The new group has some $6.5 million from government and private groups, including consumer products giant Procter & Gamble, to find ways to bring HPC to manufacturers via an app store.

The app store is slated to open at the end of the first quarter of next year, with one app and several tools that have been ported for the Web. The plan is to eventually spin off AweSim into a private firm and populate the app store with thousands of apps.

Tom Lange, director of modeling and simulation in P&G’s corporate R&D group, said he hopes that AweSim’s tools will be used for the company’s supply chain.

The software industry model is based on selling licenses, which for an HPC application can cost $50,000 a year, said Lange. That price is well out of the reach of small manufacturers interested in fixing just one problem. “What they really want is an app,” he said.

Lange said P&G has worked with supply chain partners on HPC issues, but it can be difficult because of the complexities of the relationship.

“The small supplier doesn’t want to be beholden to P&G,” said Lange. “They have an independent business and they want to be independent and they should be.”

That’s one of the reasons he likes AweSim.

AweSim will use some open source HPC tools in its apps, and it is also working on agreements with major HPC software vendors to make parts of their tools available through an app.

Chalker said software vendors are interested in working with AweSim because it’s a way to get to a market that’s inaccessible today. The vendors could get some licensing fees for an app and a potential customer for larger, more expensive apps in the future.

AweSim is an outgrowth of the Blue Collar Computing initiative that started at OSC in the mid-2000s with goals similar to AweSim’s. But that program required that users purchase a lot of costly consulting work. The app store’s approach is to minimize cost, and the need for consulting help, as much as possible.

Chalker has a half dozen apps already built, including one used in the truck example. The OSC is building a software development kit to make it possible for others to build them as well. One goal is to eventually enable other supercomputing centers to provide compute capacity for the apps.

AweSim will charge users a fixed rate for CPUs, covering just the costs, and will provide consulting expertise where it is needed. Consulting fees may raise the bill for users, but Chalker said it usually wouldn’t be more than a few thousand dollars, a lot less than hiring a full-time computer scientist.

The AweSim team expects that many app users, a mechanical engineer for instance, will know enough to work with an app without the help of a computational fluid dynamics expert.

Lange says that manufacturers understand that producing domestically rather than overseas requires making products better, being innovative and not wasting resources. “You have to be committed to innovate what you make, and you have to commit to innovating how you make it,” said Lange, who sees HPC as a path to get there.

Source

Is SAP Searching In The Clouds?

December 6, 2013
Filed under Computing

SAP, the esoteric business software maker that no one is really certain what it does, is debating whether to accelerate moving more of its business to the cloud.

The move would be a change in strategy that might initially have only a small impact on its sales. Co-chief executive Jim Hagemann-Snabe said the change would generate more sales by 2017, particularly in markets like the US, where there is a big push onto the cloud.

Speaking at a Morgan Stanley investor conference this morning, Hagemann-Snabe said the shift would have an impact at the 2015 level: “I don’t expect enormous impact, but it would have some impact because you are delaying some revenues. In the long term, however, it makes a lot of sense” – which is not the sort of thing people expect from SAP.

Source

SAP To Stop Offering SME

November 1, 2013
Filed under Computing

SAP, the maker of expensive esoteric software which no-one is really sure what it does, has decided to pull the plug on its offering for small businesses. Business weekly Wirtschaftswoche said SAP would stop development of the software, dubbed Business ByDesign, although existing customers will be able to continue to use it.

SAP insisted that development capacity for Business ByDesign was being reduced, but that the product was not being shut down. Business ByDesign was launched in 2010 and was supposed to generate $1 billion of revenue. The product, which cost roughly 3 billion euros to develop, currently has only 785 customers and is expected to generate no more than 23 million euros in sales this year.

The Wirtschaftswoche report said that ever since the SAP product’s launch, customers had complained about technical issues and the slow speed of the software.

Source

Oracle Goes After SAP’s HANA

October 4, 2013
Filed under Consumer Electronics

Oracle has upped its game in its fight against SAP HANA, having added in-memory processing to its Oracle 12c database management system, which it claims will speed up queries by 100 times.

Oracle CEO Larry Ellison revealed the update on Sunday evening during his opening keynote at the Oracle Openworld show in San Francisco.

The in-memory option for Oracle Database 12c is designed to ramp up the speeds of data queries – and will also give Oracle a new weapon in the fight against SAP’s rival HANA in-memory system.

“When you put data in memory, one of the reasons you do that is to make the system go faster,” Ellison said. “It will make queries go faster, 100 times faster. You can load the same data into the identical machines, and it’s 100 times faster, you get results at the speed of thought.”

Ellison was keen to allay concerns that these faster query times would have a negative impact on transactions.

“We didn’t want to make transactions go slower with adding and changing data in the database. We figured out a way to speed up query processing and at least double your transaction processing rates,” he said.

In traditional databases, data is stored in rows, for example a row of sales orders, Ellison explained. These types of row format databases were designed to operate at high speeds when processing a few rows that each contain lots of columns. More recently, a new format was proposed to store data in columns rather than rows to speed up query processing.

Oracle plans to store the data in both formats simultaneously, according to Ellison, so transactions run faster in the row format and analytics run faster in column format.
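The trade-off Ellison describes can be illustrated with a toy example. The sketch below is a generic Python illustration of row versus column layouts, not Oracle’s implementation: adding an order touches one self-contained row, while totalling revenue scans a single contiguous column.

```python
# Toy illustration of the row-store vs column-store trade-off described above.
# Generic sketch only; it is not Oracle's implementation.

orders = [  # row format: each record keeps every column of one sales order together
    {"order_id": 1, "customer": "Acme", "amount": 120.0},
    {"order_id": 2, "customer": "Globex", "amount": 75.5},
]

columns = {  # column format: one contiguous list per column across all orders
    "order_id": [1, 2],
    "customer": ["Acme", "Globex"],
    "amount": [120.0, 75.5],
}

# Transactional work favours rows: a new order is one self-contained append.
orders.append({"order_id": 3, "customer": "Initech", "amount": 42.0})

# Analytic work favours columns: a revenue total scans only the "amount" column,
# never touching customer names or order ids.
total_sales = sum(columns["amount"])
print(total_sales)  # 195.5, until the new row is mirrored into the column copy
```

Keeping both representations in sync is the essence of the dual-format approach: transactions hit the row copy, analytics hit the column copy.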

“We can process data at ungodly speeds,” Ellison claimed. As evidence of this, Oracle demoed the technology, showing seven billion rows could be queried per second via in-memory compared to five million rows per second in a traditional database.

The new approach also allows database administrators to speed up their workloads by removing the requirement for analytics indexes.

“If you create a table in Oracle today, you create the table but also decide which columns of the table you’ll create indexes for,” Ellison explained. “We’re replacing the analytics indexes with the in-memory option. Let’s get rid of analytic indexes and replace them with the column store.”

Ellison added that firms can choose to have just part of the database for in-memory querying. “Hot data can be in DRAM, you can have some in flash, some on disk,” he noted. “Data automatically migrates from disk into flash into DRAM based on your access patterns. You only have to pay by capacity at the cost of disk.”
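That behaviour is, in effect, a promotion-on-access storage hierarchy. The sketch below is a minimal, hypothetical Python illustration assuming a simple "promote whatever was just read" policy and made-up tier sizes; it is not Oracle’s actual placement algorithm.

```python
from collections import OrderedDict

# Minimal sketch of promotion-on-access tiering across DRAM, flash and disk.
# The tier sizes and the promote-on-every-read policy are assumptions for
# illustration; this is not Oracle's placement algorithm.

class TieredStore:
    def __init__(self, dram_slots=2, flash_slots=4):
        self.dram = OrderedDict()   # hottest data, smallest capacity
        self.flash = OrderedDict()  # warm data
        self.disk = {}              # everything has a copy on the backing tier
        self.dram_slots = dram_slots
        self.flash_slots = flash_slots

    def write(self, key, value):
        self.disk[key] = value

    def read(self, key):
        value = self.dram.get(key) or self.flash.get(key) or self.disk[key]
        self._promote(key, value)   # access patterns drive migration upward
        return value

    def _promote(self, key, value):
        self.flash.pop(key, None)
        self.dram[key] = value
        self.dram.move_to_end(key)
        if len(self.dram) > self.dram_slots:        # demote the coldest DRAM entry
            cold_key, cold_value = self.dram.popitem(last=False)
            self.flash[cold_key] = cold_value
            if len(self.flash) > self.flash_slots:  # spill the coldest flash entry
                self.flash.popitem(last=False)

store = TieredStore()
for i in range(10):
    store.write(f"block{i}", f"data{i}")
for key in ["block1", "block2", "block1", "block7"]:  # repeated reads keep block1 hot
    store.read(key)
print(list(store.dram))  # ['block1', 'block7'] end up in the in-memory tier
```

Frequently read blocks stay in the fast tier while colder ones drift down to flash and disk, which is the pattern Ellison describes as data migrating automatically with access patterns.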

Firms wanting to take advantage of this new in-memory option can do so straightaway, according to Ellison, with no need for changes to functions, no loading or reloading of data, and no data migration. Costs were not disclosed.

And for those firms keen to rush out and invest in new hardware to take advantage of this new in-memory option, Ellison took the wraps off the M6-32, dubbed the Big Memory Machine. According to Ellison, the M6-32 has twice the memory, can process data much faster and costs less than a third as much as IBM’s biggest comparable machine, making it ideal for in-memory databases.

Source

FCC To Auction Spectrum

September 23, 2013
Filed under Around The Net

The U.S. Federal Communications Commission will sell 10 megahertz of spectrum in the 1900MHz band for commercial mobile services in an auction set to start on Jan. 14, the agency announced.

The agency last Friday set a minimum price for licenses in the so-called H block of $1.56 billion, with some of the money funding the First Responder Network Authority (FirstNet), a government board building a nationwide broadband network for public safety agencies.

The auction will help mobile providers address a predicted spectrum shortage, said Mignon Clyburn, the FCC’s acting chairwoman. The auction “will help close the spectrum gap as well as contributing to the goal of making mobile broadband available to our nation’s first responders,” she said in a statement.

Congress, in the Middle Class Tax Relief and Job Creation Act of 2012, required the FCC to license 65 megahertz of spectrum, including the 10 megahertz in the H block, by February 2015.

The FCC has considered auctioning the 1915-1920MHz and 1995-2000MHz spectrum in the past, but concerns about interference with a nearby PCS block kept the commission from moving forward. An FCC order adopted in June created technical rules to keep the H block from interfering with PCS signals.

Commissioner Ajit Pai praised Clyburn for scheduling the auction. The spectrum “will help deliver bandwidth-intensive mobile services and applications” over mobile networks, he said in a statement.

Source

U.S. Cloud Vendors Hurt By NSA

September 4, 2013
Filed under Computing

Edward Snowden’s public unveiling of the National Security Agency’s Prism surveillance program could cause U.S. providers of cloud-based services to lose 10% to 20% of the foreign market — a slice of business valued at up to $35 billion.

A new report from the Information Technology & Innovation Foundation (ITIF) concludes that European cloud computing companies, in particular, might successfully exploit users’ fears about the secret data collection program to challenge U.S. leadership in the hosted services business.

Daniel Castro, author of the report, acknowledges that the conclusions are based, so far, on thin data, but nonetheless argues that the risks to U.S. cloud vendors are real.

Indeed, a month prior, the Cloud Security Alliance reported that in a survey of 207 officials of non-U.S. companies, 10% of the respondents said that they had canceled contracts with U.S. service providers after Snowden’s leak of NSA Prism documents earlier this year.

“If U.S. companies lose market share in the short term, it will have long-term implications on their competitive advantage in this new industry,” said Castro in the ITIF report. “Rival countries have noted this opportunity and will try to exploit it.”

To counter such efforts, the U.S. must challenge overstated claims about the program by foreign companies and governments, said Jason Weinstein, a partner in the Washington office of law firm Steptoe & Johnson and a former federal prosecutor and deputy assistant attorney general specializing in computer crime.

“There are a lot of reasons to be concerned about just how significant those consequences will be,” Weinstein said. “The effort by European governments and European cloud providers to cloud the truth about data protection in the U.S. was going on well before anyone knew who Edward Snowden was. It just picked up new momentum once the Prism disclosures came out.”

Weinstein contends that European countries have fewer data protection rules than the U.S.

For example, he said that in the U.K. and France, a wiretap to get content can be issued by a government official without court authority, but that can’t happen in the U.S.

“U.S. providers have done nothing other than comply with their legal obligations,” he said. But because of Snowden’s leaks, “they are facing potentially significant economic consequences.”

Gartner analyst Ed Anderson said his firm has yet to see any revenue impact on cloud providers since the Prism disclosures, but added, “I don’t think Prism does U.S. providers any favors, that’s for sure.”

Nonetheless, Anderson added, “I think the reality is [the controversy] is likely to die down over time, and we expect adoption to probably continue on the path that it has been on.”

One reason why U.S. providers may not suffer is because “the alternatives aren’t great if you are a European company looking for a cloud service,” he said.

Source

FTC Warns Google And FB

August 30, 2013
Filed under Around The Net

The head of the Federal Trade Commission (FTC) has promised that her organisation will come down hard on companies that do not meet requirements for handling personal data.

FTC Chairwoman Edith Ramirez gave a keynote speech at the Technology Policy Institute at the Aspen Forum. She said that the FTC has a responsibility to protect consumers and prevent them from falling victim to unfair commercial practices.

“In the FTC’s actions against Google, Facebook, Myspace and others, we alleged that each of these companies deceived consumers by breaching commitments to keep their data confidential. That isn’t okay, and it is the FTC’s responsibility to make sure that companies live up to their commitments,” she said.

“All told, the FTC has brought over 40 data security cases under our unfairness and deception authority, many against very large data companies, including Lexisnexis, Choicepoint and Twitter, for failing to provide reasonable security safeguards.”

Ramirez spoke about the importance of consumer privacy, saying that there is too much “shrouding” of what happens in that area. She said that under her leadership the FTC will not be afraid of suing companies when it sees fit.

“A recurring theme I have emphasized – and one that runs through the agency’s privacy work – is the need to move commercial data practices into the sunlight. For too long, the way personal information is collected and used has been at best an enigma enshrouded in considerable smog. We need to clear the air,” she said.

Ramirez compared the work of the FTC to the work carried out by lifeguards, saying that it too has to be vigilant.

“Lifeguards have to be mindful not just of the people swimming, surfing, and playing in the sand. They also have to be alert to approaching storms, tidal patterns, and shifts in the ocean’s current. With consumer privacy, the FTC is doing just that – we are alert to the risks but confident that those risks can be managed,” she added.

“The FTC recognizes that the effective use of big data has the potential to unleash a new wave of productivity and growth. Like the lifeguard at the beach, though, the FTC will remain vigilant to ensure that while innovation pushes forward, consumer privacy is not engulfed by that wave.”

It’s all just lip service, of course. Companies might be nominally bound by US privacy laws in online commerce, and that might be overseen by the FTC, but the US National Security Agency (NSA) collects all internet traffic anyway, and makes data available to other US government agencies and even some private companies.

Source
