
Is Microsoft A Risk?

February 29, 2016 by  
Filed under Security

Comments Off on Is Microsoft A Risk?

Hewlett Packard Enterprise (HPE) has shone a light on what it believes to be the biggest risks facing enterprises, and Microsoft is on the list.

We ain’t surprised, but it is a striking fact when you consider it. The naming and resulting shaming happens in the HPE Cyber Risk Report 2016, which HPE said “identifies the top security threats plaguing enterprises”.

Enterprises, it seems, have myriad problems, of which Microsoft is just one.

“In 2015, we saw attackers infiltrate networks at an alarming rate, leading to some of the largest data breaches to date, but now is not the time to take the foot off the gas and put the enterprise on lockdown,” said Sue Barsamian, senior vice president and general manager for security products at HPE.

“We must learn from these incidents, understand and monitor the risk environment, and build security into the fabric of the organisation to better mitigate known and unknown threats, which will enable companies to fearlessly innovate and accelerate business growth.”

Microsoft earned its place in the enterprise nightmare probably because of its ubiquity. Applications, malware and vulnerabilities are a real problem, and it is Windows that provides the platform for this havoc.

“Software vulnerability exploitation continues to be a primary vector for attack, with mobile exploits gaining traction. Similar to 2014, the top 10 vulnerabilities exploited in 2015 were more than one-year-old, with 68 percent being three years old or more,” explained the report.

“In 2015, Microsoft Windows represented the most targeted software platform, with 42 percent of the top 20 discovered exploits directed at Microsoft platforms and applications.”

It is not all bad news for Redmond, as the Google-operated Android is also put forward as a professional pain in the butt. So is iOS, before Apple users get any ideas.

“Malware has evolved from being simply disruptive to a revenue-generating activity for attackers. While the overall number of newly discovered malware samples declined 3.6 percent year over year, the attack targets shifted notably in line with evolving enterprise trends and focused heavily on monetisation,” added the firm.

“As the number of connected mobile devices expands, malware is diversifying to target the most popular mobile operating platforms. The number of Android threats, malware and potentially unwanted applications have grown to more than 10,000 new threats discovered daily, reaching a total year-over-year increase of 153 percent.

“Apple iOS represented the greatest growth rate with a malware sample increase of more than 230 percent.”

Courtesy-TheInq

Microsoft Goes Underwater

February 12, 2016 by  
Filed under Computing

Comments Off on Microsoft Goes Underwater

Technology giants are finding some of the strangest places for data centers these days.

Facebook, for example, built a data center in Lulea in Sweden because the icy cold temperatures there would help cut the energy required for cooling. A proposed Facebook data center in Clonee, Ireland, will rely heavily on locally available wind energy. Google’s data center in Hamina in Finland uses sea water from the Bay of Finland for cooling.

Now, Microsoft is looking at locating data centers under the sea.

The company is testing underwater data centers with an eye to reducing data latency for the many users who live close to the sea and also to enable rapid deployment of a data center.

Microsoft designed, built and deployed its own subsea data center in the ocean in about a year. It started working on the project in late 2014, a year after Microsoft employee Sean James, who served on a U.S. Navy submarine, submitted a paper on the concept.

A prototype vessel, named the Leona Philpot after an Xbox game character, operated on the seafloor about 1 kilometer from the Pacific coast of the U.S. from August to November 2015, according to a Microsoft page on the project.

The subsea data center experiment, called Project Natick after a town in Massachusetts, is in the research stage and Microsoft warns it is “still early days” to evaluate whether the concept could be adopted by the company and other cloud service providers.

“Project Natick reflects Microsoft’s ongoing quest for cloud datacenter solutions that offer rapid provisioning, lower costs, high responsiveness, and are more environmentally sustainable,” the company said.

Using undersea data centers helps because they can serve the 50 percent of people who live within 200 kilometers of the ocean. Microsoft said in an FAQ that deployment in deepwater offers “ready access to cooling, renewable power sources, and a controlled environment.” Moreover, a data center can be deployed from start to finish in 90 days.
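
To see why proximity matters, consider a back-of-the-envelope latency sketch (our own arithmetic, not Microsoft's figures), assuming signals travel through fiber at roughly 200,000 km/s, about two-thirds the speed of light in a vacuum; real round trips also add routing and processing delays.

    # Rough propagation-only round-trip times; assumes ~200,000 km/s in fiber.
    FIBER_KM_PER_MS = 200.0  # kilometers traveled per millisecond

    def round_trip_ms(distance_km: float) -> float:
        """Straight-line, propagation-only round-trip time in milliseconds."""
        return 2 * distance_km / FIBER_KM_PER_MS

    for km in (200, 1000, 3000):
        print(f"{km:>5} km: ~{round_trip_ms(km):.0f} ms round trip")

Keeping the data center within a couple of hundred kilometers of users keeps the unavoidable physics of the round trip down to a millisecond or two.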

Courtesy- http://www.thegurureview.net/aroundnet-category/microsoft-goes-deep-with-underwater-data-center.html

ARM Goes 4K With Mali

February 5, 2016 by  
Filed under Computing

Comments Off on ARM Goes 4K With Mali

ARM has announced a new mobile graphics chip, the Mali-DP650, which it said was designed to handle 4K content both on a device’s screen and on an external display.

While the new Mali GPU can push enough pixels for the local display, ARM is more likely interested in using the technology for streaming.

Many smartphones can record 4K video and this means that smartphones could be a home to high resolution content which can be streamed to a large, high resolution screen.

It looks like the Mali-DP650 can juggle the device’s native resolution, the external display’s resolution and variable refresh rates. At least that is what ARM says it can do.

The GPU is naturally able to handle different resolutions, but it is optimized for “2.5K”, meaning WQXGA (2560×1600) on tablets and WQHD (2560×1440) on smartphones, as well as Full HD (1920×1080) for slightly lower-end devices.
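
For a sense of scale, here is the pixel arithmetic behind those labels (our own sums, not ARM's): a 4K external display carries roughly twice the pixels of the “2.5K” panels the chip is optimized for.

    # Pixel counts for the resolutions mentioned above.
    resolutions = {
        "WQXGA (2.5K tablet)": (2560, 1600),
        "WQHD (2.5K phone)":   (2560, 1440),
        "Full HD":             (1920, 1080),
        "4K UHD (external)":   (3840, 2160),
    }

    for name, (w, h) in resolutions.items():
        print(f"{name:20s} {w}x{h} = {w * h / 1e6:.1f} megapixels")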

Mark Dickinson, general manager of ARM’s media processing group, said: “The Mali-DP650 display processor will enable mobile screens with multiple composition layers, for graphics and video, at Full HD (1920×1080 pixels) resolutions and beyond while maintaining excellent picture quality and extending battery life.”

“Smartphones and tablets are increasingly becoming content passports, allowing people to securely download content once and carry it to view on whichever screen is most suitable. The ability to stream the best quality content from a mobile device to any screen is an important capability ARM Mali display technology delivers.”

ARM did not say when the Mali-DP650 will be in the shops or which chips will be the first to incorporate its split-display mode feature.

Courtesy-Fud

Samsung And TSMC Battle It Out

February 4, 2016 by  
Filed under Computing

Comments Off on Samsung And TSMC Battle It Out

Samsung and TSMC are starting to slug it out by introducing Gen.3 14-nano and 16-nano FinFET system semiconductor processes, but the cost could mean that smartphone makers shy away from the technology in the short term.

It is starting to look like the sales teams for the pair are each trying to show that they can use the technology to cut electricity consumption and production costs the most.

In its annual results for 2015, TSMC announced that it plans to enter mass production of chips made on its 16-nano FinFET Compact (FFC) process sometime during the first quarter of this year. TSMC finished developing the 16-nano FFC process at the end of last year. During the announcement TSMC talked up the fact that the 16-nano FFC process focuses on reducing production costs further than before and on low power consumption.

TSMC is apparently ready for mass production on the 16-nano FFC process sometime during the first half of this year and has secured Huawei’s affiliate HiSilicon as its first customer.

HiSilicon’s Kirin 950, which is used in Huawei’s premium smartphone the Mate 8, is produced on TSMC’s 16-nano FF process. Apple’s A9 chip, used in the iPhone 6S series, is mass-produced on the 16-nano FinFET Plus (FF+) process that was announced in early 2015. By adding the FFC process, TSMC now has three 16-nano processes in action.

Samsung is not far behind: it has mass-produced Gen.2 14-nano FinFET chips using a process called LPP (Low Power Plus), which has 15 percent lower electricity consumption than the Gen.1 14-nano process called LPE (Low Power Early).

Samsung Electronics’ 14-nano LPP process has already been seen in the Exynos 8 Octa series used for the Galaxy S7 and in Qualcomm’s Snapdragon 820. But Samsung Electronics is also preparing a Gen.3 14-nano FinFET process.

Bae Young-chang, vice-president of the strategy marketing team in Samsung’s LSI business department, said the new process will be similar to the Gen.2 14-nano process.

Both Samsung and TSMC might have a few problems. It is not clear what the yields of these processes are and this might increase the production costs.

Even if Samsung Electronics and TSMC finish developing their 10-nano processes at the end of this year and enter mass production next year, they will also have to upgrade their current 14 and 16-nano processes to make them more economical.

Even if the 10-nano process is commercialized, there will still be many fabless businesses using 14 and 16-nano processes because they are cheaper. While we might see a few flagship phones using the pricier chips, we may not see 10nm in the majority of phones for years.

 

Courtesy-Fud

Is Intel Going 10nm Next Year?

February 3, 2016 by  
Filed under Computing

Comments Off on Is Intel Going 10nm Next Year?

Intel is reportedly going to release its first 10nm processor family in 2017, expected to be the first of three generations of processors that will be fabbed on the 10nm process.

Guru 3D found a slide which suggests that Chipzilla will not be sticking to its traditional “tick-tock” model. To be fair, Intel has been using the 14nm node for two generations so far, Broadwell and Skylake, and the Kaby Lake processor architecture that is due later this year will also use 14nm.

The slide tells us pretty much what we expected. The first processor family to be manufactured on a 10nm node will be Cannonlake, expected to launch in 2017. The following year, Intel will reportedly launch Icelake processors, again using the same 10nm node. Icelake will be succeeded by Tigerlake in 2019, the third generation of Intel processors using a 10nm silicon fab process. The codename for Tigerlake’s successor is unknown, but when it comes out in 2020 it is expected to use 5nm.

 

Architecture         CPU series        Tick or Tock   Fab node   Year released
Presler/Cedar Mill   Pentium 4 / D     Tick           65 nm      2006
Conroe/Merom         Core 2 Duo/Quad   Tock           65 nm      2006
Penryn               Core 2 Duo/Quad   Tick           45 nm      2007
Nehalem              Core i            Tock           45 nm      2008
Westmere             Core i            Tick           32 nm      2010
Sandy Bridge         Core i 2xxx       Tock           32 nm      2011
Ivy Bridge           Core i 3xxx       Tick           22 nm      2012
Haswell              Core i 4xxx       Tock           22 nm      2013
Broadwell            Core i 5xxx       Tick           14 nm      2014 (2015 for desktops)
Skylake              Core i 6xxx       Tock           14 nm      2015
Kaby Lake            Core i 7xxx       Tock           14 nm      2016
Cannonlake           Core i 8xxx?      Tick           10 nm      2017
Ice Lake             Core i 8xxx?      Tock           10 nm      2018
Tigerlake            Core i 9xxx?      Tock           10 nm      2019
N/A                  N/A               Tick           5 nm       2020

Courtesy-Fud

Samsung Goes 4GB HBM

February 2, 2016 by  
Filed under Computing

Comments Off on Samsung Goes 4GB HBM

Samsung has begun mass producing what it calls the industry’s first 4GB DRAM package based on the second-generation High Bandwidth Memory (HBM2) interface.

Samsung’s new HBM solution will be used in high-performance computing (HPC), advanced graphics, network systems and enterprise servers, and is said to offer DRAM performance that is “seven times faster than the current DRAM performance limit”.

This will apparently allow faster responsiveness for high-end computing tasks including parallel computing, graphics rendering and machine learning.

“By mass producing next-generation HBM2 DRAM, we can contribute much more to the rapid adoption of next-generation HPC systems by global IT companies,” said Samsung Electronics’ SVP of memory marketing, Sewon Chun.

“Also, in using our 3D memory technology here, we can more proactively cope with the multifaceted needs of global IT, while at the same time strengthening the foundation for future growth of the DRAM market.”

The 4GB HBM2 DRAM, which uses Samsung’s 20nm process technology and advanced HBM chip design, is specifically aimed at next-generation HPC systems and graphics cards.

“The 4GB HBM2 package is created by stacking a buffer die at the bottom and four 8Gb core dies on top. These are then vertically interconnected by TSV holes and microbumps,” explained Samsung.

“A single 8Gb HBM2 die contains over 5,000 TSV holes, which is more than 36 times that of an 8Gb TSV DDR4 die, offering a dramatic improvement in data transmission performance compared to typical wire-bonding based packages.”

Samsung’s new DRAM package features 256GBps of bandwidth, which is double that of an HBM1 DRAM package. This is equivalent to a more than seven-fold increase over the 36GBps bandwidth of a 4Gb GDDR5 DRAM chip, which has the fastest data speed per pin (9Gbps) among currently manufactured DRAM chips.
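
Those bandwidth figures fall out of interface width multiplied by per-pin data rate. A quick sketch of the arithmetic, assuming HBM2's 1,024-bit-wide stack interface running at 2Gbps per pin against GDDR5's 32-bit interface at the quoted 9Gbps per pin:

    # Peak bandwidth (GB/s) = interface width in bits x per-pin rate in Gb/s / 8.
    def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
        return bus_width_bits * pin_rate_gbps / 8

    hbm2 = peak_bandwidth_gbs(1024, 2.0)   # assumed 1,024-bit stack at 2 Gb/s per pin
    gddr5 = peak_bandwidth_gbs(32, 9.0)    # 32-bit chip at the quoted 9 Gb/s per pin

    print(f"HBM2 stack: {hbm2:.0f} GB/s")      # ~256 GB/s, matching Samsung's figure
    print(f"GDDR5 chip: {gddr5:.0f} GB/s")     # ~36 GB/s, matching the article
    print(f"Ratio:      {hbm2 / gddr5:.1f}x")  # ~7x, the 'seven-fold' increase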

The firm’s 4GB HBM2 also enables enhanced power efficiency by doubling the bandwidth per watt over a 4Gb GDDR5-based solution, and embeds error-correcting code functionality to offer high reliability.

Samsung plans to produce an 8GB HBM2 DRAM package this year, and by integrating this into graphics cards the firm believes designers will be able to save more than 95 percent of space compared with using GDDR5 DRAM. This, Samsung said, will “offer more optimal solutions for compact devices that require high-level graphics computing capabilities”.

Samsung will increase production volume of its HBM2 DRAM over the course of the year to meet anticipated growth in market demand for network systems and servers. The firm will also expand its line-up of HBM2 DRAM solutions in a bid to “stay ahead in the high-performance computing market”.

Courtesy-TheInq

Is AT&T Facing Pressure?

February 1, 2016 by  
Filed under Smartphones

Comments Off on Is AT&T Facing Pressure?

AT&T has announced aggressive discounts on new smartphones and devices, including a 2-for-1 smartphone offer for business customers.

A big focus of the AT&T discounts is special deals on Samsung’s Galaxy smartphones and Gear S2 smartwatches. Analysts interpreted that focus on Samsung devices as a way to clear out inventory prior to expected upgrade announcements coming in late February at Mobile World Congress in Barcelona.

AT&T is also facing pressure to add more subscribers, as analysts, including Evercore ISI this week, have predicted AT&T’s fourth-quarter postpaid subscriber loss will be more than 300,000. That comes amid reports that T-Mobile added 4.5 million net subscribers for the fourth quarter and Verizon Wireless added 525,000.

All the major carriers, including AT&T, hit the December holidays with special device deals, but AT&T apparently didn’t feel enough impact on its inventory from those offers, analysts said.

AT&T and Samsung are motivated to get rid of all the old inventory before new models arrive, said Patrick Moorhead, an analyst at Moor Insights & Strategy. “Retailers won’t run such an aggressive promotion unless they have a lot of stock.”

An AT&T spokeswoman provided a different explanation: “Due to popular demand, AT&T is bringing back some of its holiday promos.”

Those promos — available to both consumers and business customers at AT&T retail stores — include a free Samsung Gear S2 smartwatch for a limited time to any customer buying a Samsung Galaxy smartphone, or a free Samsung Galaxy Tab 4 for buying a Galaxy smartphone on an AT&T Next wireless plan. AT&T is also offering an iPad mini 2 for $99 when a customer buys a new iPhone on the Next plan.

For business customers, the 2-for-1 smartphone deal is new. It allows business customers to buy a new smartphone and then get another smartphone, valued at up to $650, for free.

Source-http://www.thegurureview.net/mobile-category/att-facing-pressure-offers-aggressive-smartphone-discounts.html

Is The Dollar Hurting PC Sales?

January 25, 2016 by  
Filed under Computing

Comments Off on Is The Dollar Hurting PC Sales?

Worldwide PC shipments dropped 8.3 percent in the fourth quarter, the worst quarter for sales since 2008, beancounters at Gartner Group said.

PC manufacturers shipped 75.7 million machines in the fourth quarter compared with about 82.6 million a year earlier. Sales sank 3.1 per cent in the US to 16.9 million in the quarter.
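
As a quick sanity check, the headline percentage follows directly from those shipment totals (a sketch using the rounded numbers above):

    # Year-over-year decline from the rounded Q4 shipment totals quoted above.
    q4_2015 = 75.7  # million units
    q4_2014 = 82.6  # million units

    decline = (q4_2014 - q4_2015) / q4_2014 * 100
    print(f"Worldwide Q4 decline: {decline:.1f}%")  # ~8.4%, in line with Gartner's 8.3%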

Gartner forecasts a further fall in shipments in 2016, with the potential of a soft recovery later in the year.

Mikako Kitagawa, an analyst at Gartner, said that the fourth quarter of 2015 marked the fifth consecutive quarter of worldwide PC shipment declines. Holiday sales did not boost overall PC shipments, hinting at changes in consumers’ PC purchase behavior.

Lenovo retained its leadership of the PC market with 20 percent of the global market in the fourth quarter, although its shipments dropped 4.2 percent. HP, the No. 2 global PC maker, increased its market share slightly to almost 19 percent. The company maintained its top position in the U.S., with 27 percent of the market, despite a decline of 8.4 percent in fourth-quarter shipments. Dell increased its global market share to 13.5 percent from 13.1 percent and ranked third.

IDC released similar figures, saying that the strong US dollar hampered overseas sales. It thinks that the decline in PC sales may slow in 2016, projecting a fall of 3.1 percent compared with the 10 percent drop in 2015. Greater commercial adoption of Microsoft’s Windows 10 operating system may help stabilize sales.

Courtesy-Fud

AMD Goes Polaris

January 19, 2016 by  
Filed under Computing

Comments Off on AMD Goes Polaris

AMD has shown off its upcoming next-generation Polaris GPU architecture at CES 2016 in Las Vegas.

Based on the firm’s fourth generation Graphics Core Next (GCN) architecture and built using a 14nm FinFET fabrication process, the upcoming architecture is a big jump from the current 28nm process.

AMD said that it expects shipments of Polaris GPUs to begin in mid-2016, offering improvements such as HDR monitor support and better performance-per-watt.

The much smaller 14nm FinFET process means that Polaris will deliver “a remarkable generational jump in power efficiency”, according to AMD, offering fluid frame rates in graphics, gaming, virtual reality and multimedia applications running on small form-factor thin and light computer designs.

“Our new Polaris architecture showcases significant advances in performance, power efficiency and features,” said AMD president and CEO Lisa Su. “2016 will be a very exciting year for Radeon fans driven by our Polaris architecture, Radeon Software Crimson Edition and a host of other innovations in the pipeline from our Radeon Technologies Group.”

The Polaris architecture features AMD’s fourth-generation GCN architecture, a next-generation display engine with support for HDMI 2.0a and DisplayPort 1.3, and next-generation multimedia features including 4K h.265 encoding and decoding.

GCN enables gamers to experience high-performance video games with Mantle, a tool for alleviating CPU bottlenecks such as API overhead and inefficient multi-threading. Mantle, which is basically AMD’s answer to Microsoft’s DirectX, enables improvements in graphics processing performance. In the past, AMD has claimed that Kaveri, teamed with Mantle and built-in Radeon dual graphics, can provide performance boosts ranging from 49 to 108 percent.

The new GPUs are being sampled to OEMs at the moment and we can expect them to appear in products by mid-2016, AMD said. Once they are in the market, we can expect to see much thinner form factors in devices thanks to the much smaller 14nm chip process.

Courtesy-TheInq

IPv6 Turns 20, Did You Notice?

January 14, 2016 by  
Filed under Computing

Comments Off on IPv6 Turns 20, Did You Notice?

IPv6 is 20 years old and the milestone has been celebrated with 10 percent adoption across the world for the first time.

That IPv6 remains so far behind its saturated incumbent, IPv4, is alarming given that three continents ran out of IPv4 addresses in 2015. Unfortunately, because IPv4 isn’t ‘end of life’, most internet providers have been working on an ‘if it’s not broken, don’t fix it’ basis.

But 2016 looks to be the year when IPv6 makes its great leap to the mainstream, in Britain at least. BT, the UK’s biggest broadband provider, has already committed to switch on IPv6 support by the end of the year, and most premises will be IPv6-capable by April. Most companies use the same lines, but it will be up to each individual supplier to switch over. Plusnet, a part of BT, is a likely second.

IPv6 has a number of advantages over IPv4, most notably that it is virtually infinite, meaning that the capacity problems that the expanded network is facing shouldn’t come back to haunt us again. It will also pave the way for ever faster, more secure networks.
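
‘Virtually infinite’ is easy to quantify: IPv4 addresses are 32 bits long and IPv6 addresses are 128 bits, so the two address spaces differ by a factor of 2^96. A quick sketch of the sums:

    ipv4_addresses = 2 ** 32    # about 4.3 billion
    ipv6_addresses = 2 ** 128   # about 3.4 x 10^38

    print(f"IPv4: {ipv4_addresses:,} addresses")
    print(f"IPv6: {ipv6_addresses:.2e} addresses")
    print(f"IPv6 space is 2^96, or about {2 ** 96:.1e}, times larger")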

Some private corporate networks have already made the switch. Before Christmas we reported that the UK Ministry of Defence was already using the protocol, leaving thousands of unused IPv4 addresses lying idle in its wake.

IPv6 is also incredibly adaptable for the Internet of Things. Version 4.2 of the Bluetooth protocol includes IPv6 connectivity as standard, making it a lot easier for tiny nodes to make up a larger internet-connected grid.

Google’s latest figures suggest that more than 10 percent of users are running IPv6 connections at the weekend, while the number drops to eight percent on weekdays. This suggests that the majority of movement towards IPv6 is happening in the residential broadband market.

That said, it is imperative that businesses begin to make the leap. As Infoblox IPv6 evangelist Tom Coffeen told us last year, it could start to affect the speed at which you are able to trade.

“If someone surfs onto your site and it’s only available in IPv4, but they are using IPv6, there has to be some translation, which puts your site at a disadvantage. If I’ve not made my site available in IPv6, I’m no longer in control over where that translation occurs.”
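
Checking whether a given site even resolves over IPv6 takes only a few lines. Here is a minimal sketch using Python's standard library; the hostnames are purely illustrative, and a DNS answer is not a full reachability test:

    import socket

    def has_ipv6_address(hostname: str) -> bool:
        """Return True if the hostname resolves to at least one IPv6 (AAAA) address."""
        try:
            return bool(socket.getaddrinfo(hostname, None, socket.AF_INET6))
        except socket.gaierror:
            return False

    for site in ("www.google.com", "example.com"):
        answer = "has" if has_ipv6_address(site) else "has no"
        print(f"{site} {answer} IPv6 address")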

In other words, if you don’t catch up, you will soon get left behind. It was ever thus.

Courtesy-TheInq

 
