Operating Systems and Servers News

Started by VelMurugan, Mar 17, 2009, 11:27 AM

VelMurugan

Cisco shakes up data centre market with Unified Computing launch

Cisco has launched its Unified Computing System, comprising virtualisation technology, services and blade servers, in a bid to shake up the data centre market and put pressure on its long-standing partners HP and IBM.

Cisco's new architecture comprises compute, network, storage access and virtualisation within a single rackable system designed to cut IT infrastructure costs and complexity, stretch existing IT investments and allow enterprise customers to build a data centre that could easily be extended for future growth, according to the company.

The company has also taken its first steps into the server market, where it will compete with long-standing partners such as HP and IBM. Cisco's new UCS B-Series blades, previously known under the code-name California, are based on the forthcoming Intel Nehalem processors. Cisco said the blades would incorporate extended memory technology for applications with large data sets.

However, Cisco also is teaming up with software partners such as Microsoft, VMware and BMC to provide technology for its new system. Cisco will pre-package, resell and support Windows Server 2003, Windows Server 2008 with Hyper-V technology and Microsoft SQL Server 2008 as part of the Unified Computing System. For its part, VMware is providing virtualisation technology to the new system and BMC is contributing resource-management software.

Systems integrator Accenture has already signed on to be a services partner, introducing four services options for its customers to deploy the Unified Computing System. Cisco also is inviting its broad network of channel partners to work with the company to provide the new infrastructure to enterprise customers.

Prior to Monday's announcement, Cisco had laid out its intention to eliminate the manual integration of computing and storage platforms with networks and virtualisation systems. In a recent blog posting, CTO Padmasree Warrior acknowledged this will lead Cisco to compete with some of its partners.

Most enterprises have been able to realise the benefits of the first phase of virtualisation, consolidating their data centres for economies of scale and simpler management. But it has been hard to achieve full virtualisation, in which virtual machines can move continuously among servers, analysts say. That would allow processing power to be added as demand for an application grows, or tasks to be moved off a physical server at night for hardware maintenance.

Vendors of servers, storage and software can all play roles in managing resources in virtualised data centres. Cisco executives have said the network is the best place to tackle many of these tasks because it is the only element of the infrastructure that touches everything.

IBM and HP have not overlooked the importance of networks in controlling data centres. IBM has aligned with Cisco rival Juniper Networks in a broad strategy called Stratus, and HP is expected to increasingly tie its growing ProCurve networking business in with its computing offerings.

Cisco has pushed to bring more functions into the network infrastructure for several years. These have included security, application optimisation, and adaptation of multimedia to fit different clients and purposes. As revenue growth from its core routing and switching businesses slows, the company is aggressively branching out into new areas, including consumer electronics and the Telepresence high-definition videoconferencing line.

Source : TechWorld

VelMurugan

Data centres fail energy efficiency tests

It's not all doom and gloom in the IT industry. A survey of US execs reveals they're spending more on their data centres, although companies are still not curbing power use.

On average, data centre budgets are increasing nearly 7 percent this year, according to an online survey of 300 senior-level IT decision makers at some of the largest North American companies.

More than four out of five companies surveyed are planning data centre expansions in the next 12 to 24 months. Space requirements for these data centres are growing 16 percent year over year, to an average of 21,000 square feet in 2009.

"One finding that may surprise people is that companies are increasing their data centre budgets in 2009. This is a reflection of how companies view their data centers as critical assets for increasing productivity while reducing costs," Chris Crosby, a senior vice president at Digital Realty Trust,. Digital Realty Trust, which offers a variety of data centreproducts and services, conducted the survey in January.

The survey comes on the heels of a study by the AFCOM Data Center Institute, which painted a bleaker picture. AFCOM said budget cuts are forcing IT departments to shed older, more experienced workers, and that many data centres are delaying or cancelling planned physical expansions and relocations.

Specific findings from Digital Realty Trust include the following:

    * 84 percent of surveyed companies are planning data centre expansions in the next 12 to 24 months;
    * 64 percent of companies that plan to expand in 2009 will do so in two or more locations;
    * Surveyed companies plan to increase data centre spending by 6.6 percent in 2009;
    * Data centres account for 35 percent of the average IT budget in surveyed companies.

Digital Realty Trust also surveyed companies about PUE, or power usage effectiveness, a measure devised by the Green Grid that compares the total power used in a facility with the power devoted specifically to IT equipment. A PUE of 2.0 indicates that for every watt of power consumed by servers and other IT equipment, an additional watt is needed to cool those IT resources and distribute power to them. A PUE of 3.0 indicates that two additional watts are needed.
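
To make the arithmetic concrete, here is a minimal sketch of the PUE calculation described above; the wattage figures are invented for illustration and are not from the survey.

```python
def pue(total_facility_watts, it_equipment_watts):
    """Power usage effectiveness: total facility power divided by IT power."""
    return total_facility_watts / it_equipment_watts

# Hypothetical facility drawing 900 kW in total, of which 300 kW
# reaches servers, storage and network gear.
print(pue(900_000, 300_000))  # 3.0 -> two extra watts of cooling and
                              # power distribution per watt of IT load
```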

A large majority of companies surveyed were measuring power usage. Four out of five reported PUE ratings of at least 2.0, and 26 percent reported PUE ratings of at least 3.0.

"Current facilities are not highly energy efficient," the Digital Realty Trust concluded.

VelMurugan

Rackspace set to take on Amazon

Hosting company Rackspace is set to take on Amazon with the launch of two new cloud-based services. It has launched Cloud Servers, a pay-as-you-go, on-demand server option, and has taken Cloud Files, its online storage service that competes with Amazon's S3, out of beta.

Emil Sayegh, general manager for Rackspace cloud services, said that the availability of Cloud Servers, Cloud Files and the existing Cloud Sites meant the company was the only one offering a complete suite of cloud services. He added that, with its dedicated server offerings, it was also the only company offering a full choice between dedicated and shared servers.

He said that the company would use that flexibility to position itself against Amazon, adding that its service ethos would also make it a strong player in the space. "We're a service company, contactable by phone, email or chat, 24x7 - that's what we're good at." He suggested that the competition might find that technical provisioning is only one aspect of supplying on-demand products, and fall down on the customer service side. "Service is complicated," he said - although Amazon, with its high emphasis on customer satisfaction, is probably aware of that.

Sayegh said that the company was hoping to build further on its service offering. "At the moment we manage dedicated and cloud through separate consoles; we're aiming to bring them together. We also have separate billing systems, and are looking to merge them into one in the near future," he said.

In addition to the new launches, the company announced that its sister company Jungle Disk, which it bought last year, will now be offering its file and backup services on Rackspace as well as Amazon S3. "Customers logging on to the service will now be offered a choice," he said. Although Jungle Disk users are offered a choice, Rackspace's Erik Carlin blogged that the Rackspace offering provides significant advantages, notably on performance and price.

In a further move, Sayegh announced that the company was dropping the Mosso brand name it had used in the US. "It was a name that was unknown in the rest of the world," he said. "What was the point of keeping it?"

Pricing for the Rackspace cloud services is currently in dollars, although the company is moving to local currency pricing before the end of the year. "We hope to have UK pricing up well before then," he said. He added that Rackspace was also looking to open new data centres in the UK in the near future, but wouldn't be drawn on a date.

VelMurugan

Amazon cloud users get reservation option

Amazon Web Services (AWS) has started offering customers the opportunity to reserve capacity in advance on its Elastic Compute Cloud (EC2) infrastructure.

The new pricing, called Reserved Instances, is an additional pricing model alongside the company's existing pay-as-you-go option, said AWS spokeswoman Kay Kinton. As with the current model, customers will still only pay for the compute capacity they consume, she said.

Under the new initiative, which is only available in the US currently, customers pay once to reserve capacity on EC2 in either one-year or three-year terms. If they find they will need more usage than has already been reserved and paid for, they can also add more capacity via the on-demand model and pay for that as needed.

Reserving computing capacity is a good option for customers who have predictable computing needs and plan for that usage in their IT budgets ahead of time, Kinton said. AWS added the pricing model at the request of some larger customers who told them that reserving capacity "more closely aligns with how they budget for IT," she said.

For companies that can use reserved-capacity pricing, it actually reduces their IT costs, she said. For instance, AWS charges $0.10 per hour for what it calls a "small" On-Demand Instance on EC2. A comparable deployment using one-year Reserved Instance pricing costs a one-time $325, while a three-year term costs $500. That reserved pricing works out to $0.03 per hour, according to an AWS pricing chart.
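
As a rough sketch of the break-even arithmetic, assuming (as the figures above suggest) that the $0.03-per-hour rate applies as a usage charge on top of the one-time fee, a reservation pays for itself once the hourly savings cover that fee:

```python
ON_DEMAND_RATE = 0.10  # $/hour for a "small" On-Demand Instance (from the article)
RESERVED_RATE = 0.03   # $/hour usage rate quoted from the AWS pricing chart
ONE_YEAR_FEE = 325.0   # one-time fee for a one-year Reserved Instance

# Hours of use at which the cheaper hourly rate has paid back the fee.
break_even = ONE_YEAR_FEE / (ON_DEMAND_RATE - RESERVED_RATE)
print(f"break even after {break_even:.0f} hours (~{break_even / 24:.0f} days)")
# break even after 4643 hours (~193 days) of continuous use
```

On those assumptions, an instance that runs more or less continuously comes out well ahead within the one-year term.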

AWS expects to offer the new pricing to European customers soon, Kinton said. AWS began offering its EC2 service in Europe in December.

VelMurugan

Hitachi enters server, network monitoring business

Hitachi Data Systems has released three new software products to provide enhanced monitoring of business applications, virtualised servers and the storage they use.

Hitachi said the products address a new market for the storage systems and software provider: a holistic view of everything that an enterprise uses to connect to its storage systems. The products will also work with competing vendors' systems.

The Hitachi IT Operations Analyzer is an application aimed at midsize companies that provides a single-pane view of data centre servers and IP networks, as well as Fibre Channel storage-area networks, LANs and the storage hardware attached through those networks. Hitachi said the application provides performance and availability monitoring across heterogeneous servers, networks and storage systems.

"It's certainly a new kind of product for Hitachi in that they obviously come from a storage background. I think this is a first-generation effort to really reach beyond storage and offer customers and channel partners visibility into both network servers and storage," said Mary Johnston Turner, an analyst at IDC.

Turner said Hitachi will likely use the IT Operations Analyzer as a platform for other monitoring tools in the future. The software features automated root-cause analysis and network visualisation, uses an agentless architecture, and has a web-based interface.

"It uses I/O path awareness to look at bottlenecks proactively so there's no need for active monitoring or the setting of thresholds," said Sean Moser, vice president of software at Hitachi Data Systems.

Moser said Hitachi's IT Operations Analyzer is designed to help users monitor and improve customer service levels and can easily be managed across a data centre without requiring specialised training.

Hitachi's second release, its Virtual Server Reporter (VSR), uses Aptare's StorageConsole to provide an end-to-end view of virtualised VMware servers and their respective storage usage. VSR is intended for enterprise installations and extends the capabilities of heterogeneous storage management reporting to individual virtual machines.

VSR is an agentless application that automatically discovers all ESX server deployments and then monitors hardware capacity and CPU utilisation. The application can automatically load-balance workload across virtual machine server deployments, "and it helps customers find spare capacity across their server footprint," Moser said.

VSR works independently of storage installations, which is a first for Hitachi Data Systems, a storage-centric vendor.

Moser said that by integrating the reporting of virtual servers, users can increase the visibility of their virtual environments and more effectively manage the storage and backups associated with virtual machines, allowing them to make the best use of storage assets and decrease costs.

Hitachi's third product is the Storage Command Portal, which consolidates storage reporting within the Hitachi Storage Command Suite of management software to report on Hitachi storage systems and the business applications that rely on them.

The software provides a business application view of the Hitachi storage environment, with capacity and performance reporting to help manage service-level agreements.

VelMurugan

AMD drops dual-core server CPUs from lineup

Advanced Micro Devices has revealed it is cutting prices of some Opteron chips by up to 43 percent, and has also withdrawn dual-core server processors from its product lineup.

Dual-core Opteron chips were part of AMD's earlier server processors, but were not included in an updated price list posted on Tuesday. That leaves only quad-core Opteron processors as part of AMD's server chip lineup.

"For the bulk of our customers, our AMD Opteron processor offerings are all now quad-core," said Teresa Osborne, an AMD spokeswoman, in an email.

AMD, however, is still offering dual-core Athlon and Phenom chips for consumer desktops and laptops. It also offers triple-core Phenom chips for desktops.

AMD also slashed prices on some Opteron server chips. It cut the prices of Opteron 2374 HE chips by 43 percent from $450 (£306) to $255 (£173) and prices of Opteron 8376 HE processors by 35 percent from $1,514 to $989. AMD also cut prices for other Opteron HE chips.

The price cuts went into effect last week with the introduction of four quad-core Opteron HE processors, Osborne said.

The HE chips are part of AMD's family of Opteron server processors code-named Shanghai. The offerings fall between low-power Opteron EE chips and high-end Opteron SE chips, which are speedier and draw more power than HE chips. Shanghai chips draw between 40 watts and 105 watts of power and operate at speeds between 2.1GHz and 3.1GHz.

The cut of dual-core server processors is an incremental step as chip companies like AMD and Intel add cores to improve processor performance while trying to reduce power consumption. Increasing the number of cores helps servers execute more tasks, which could lower the number of servers needed in data centres and cut hardware acquisition and energy costs for customers.

AMD is set to ship the six-core Opteron processor code-named Istanbul in May, and is planning to release a 12-core chip code-named Magny-Cours in 2010. Both chips will be made using the 45-nanometre process.

In an effort to jump ahead of Intel in the core battle, AMD also said last week that it would release a chip code-named Interlagos in 2011 that will include between 12 and 16 processor cores. Intel has so far officially announced only an eight-core Xeon chip code-named Nehalem-EX, which is due for release in 2010.

Interlagos chips will go into servers with two to four processor sockets, resulting in a maximum of 64 cores per system. The chip will be used primarily in data centre servers and will be made using 32-nm process technology. AMD also announced a processor code-named Valencia with up to eight cores, also made using the 32-nm process and set for release in 2011.

VelMurugan

Intel touts ROI returns from new machines

Microsoft is not the only vendor smarting from companies delaying PC purchases. Intel has presented new research claiming that large firms buying new PCs equipped with its vPro security and management technology can recoup their investment in less than a year.

A company with 30,000 PCs that upgrades to new Core 2 Duo or Quad computers would make its money back in 17 months - and in just 10 months if those PCs are also equipped with vPro-enabled motherboards, according to Intel.
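
Payback claims of this kind reduce to simple break-even arithmetic. The sketch below shows the shape of the calculation only; the per-PC cost and savings figures are hypothetical placeholders, since Intel's summary does not break out its inputs.

```python
def payback_months(upfront_cost_per_pc, monthly_savings_per_pc):
    """Months until cumulative savings cover the upfront investment."""
    return upfront_cost_per_pc / monthly_savings_per_pc

# Hypothetical inputs: $900 per PC upfront, $90 per month saved on
# management, support and energy -> a 10-month payback, the kind of
# figure Intel cites for vPro-equipped fleets.
print(payback_months(900, 90))  # 10.0
```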

One analyst, however, said such ROI figures apply only to a limited set of firms and do not encompass other costs of PC upgrades, such as buying new software licenses or upgrading existing ones - something many companies have also been delaying.

"This might make sense for IT outsourcing firms or very large companies," said Jim McGregor, an analyst with In-Stat. "But for many of us, this is the worst economic downturn of our lives. Unless it fits in your company budget, it doesn't make sense."

According to an Intel-commissioned survey by Wipro Consulting of 106 North American and European companies, only 32 percent have slowed their PC refresh rates in the last six months. The majority (60 percent) haven't changed their PC upgrade policy, while 8 percent have actually accelerated it. The companies surveyed had a minimum of 5,000 PCs if they were based in North America, or 2,500 if they were located in Europe.

"Corporate IT is overdue for an update from Windows XP," said Rob Crook, general manager and vice-president for Intel's business client group. "Yes, it's tough economic times, but those who can, are [upgrading]."

Though still profitable, Intel has blamed its falling revenues in recent months partly on delayed PC upgrades.

Slower PC upgrades mean that many enterprises haven't tried vPro, which Intel launched in April 2006. The first desktop PCs featuring the technology began shipping several months later.

vPro enables a number of services, such as Active Management Technology (AMT), which allows IT managers to remotely configure and set policies for PCs, and Intel Trusted Execution Technology (TXT), which enforces security policies.

According to independent analyst Jack Gold, only 10 to 15 percent of enterprise PCs deployed today have vPro. Gold released research last week showing that upgrading laptops makes financial sense, primarily due to the cost of maintaining breakage-prone, out-of-warranty hardware.

Desktop PCs don't have that problem, Gold said. Upgrading to new vPro-enabled PCs on the desktop side can still be financially smart, though he said the payback is less straightforward than Intel may make it sound.

TXT is supposed to be more secure because it is in the hardware, not software, layer. That didn't stop security experts from breaking TXT earlier this year, and explaining how it was done.

McGregor still praises vPro's features. But he says vPro technology needs to be paired with strong back-end management tools and run in large environments for it to translate into increased security and lower costs.

VelMurugan

Microsoft talks cloud for infrastructure software

Microsoft has begun outlining its plans for both public and private clouds in relation to its stack of data centre infrastructure software.

At its annual Microsoft Management Summit this week, Redmond laid out its vision of a corporate IT world that straddles both "public clouds and private clouds," six months after it launched its Azure cloud OS.

The announcement comes a week after VMware made a similar announcement around private clouds.

But rather than technology and architecture changes, Microsoft is using semantics to align its existing stack of infrastructure and management software with the trendiest topic in network computing.

Microsoft's "private cloud platform" is built on a familiar IT combination of Microsoft software. And the cloud moniker is more or less the third phase of Microsoft's management platform that began in 2003 as the Dynamic Systems Initiative (DSI), morphed into Dynamic IT and is now its public cloud/private cloud strategy.

"Think about the private cloud as your internal data centre optimised for performance and cost and with the compatibility with your existing applications all built on tools you know today - Windows Server and System Center," said Bob Kelly, Microsoft's corporate vice president for infrastructure server marketing, during his opening keynote on Tuesday. "We believe that business enterprise customers will demand the same levels of reliability and predictability [found in the public cloud] for their internal data centres."

The one piece that many corporate IT shops may not be up to speed with at this point is virtualisation, which Microsoft has built into Windows Server with Hyper-V and System Center with its Virtual Machine Manager.

Both of those products will get more sophisticated as new versions are rolled out in the next six months.

Microsoft also is aligning its Visual Studio development tools and .Net platform with both of its cloud environments to make it easy for users to build applications that straddle the two.

While little of this is new, Microsoft is clearly beginning to turn its marketing message toward the cloud phenomenon and create a bridge to its Azure platform for customers already committed to Windows infrastructure internally.

"We think customers will not standardise on one cloud," Kelly said. "We think it will be critical to deploy applications on premises and federate across two clouds."

To that end, Microsoft is building cloud federation features into a forthcoming version of Virtual Machine Manager that will let users manage from a single console virtual machines that are spread across public and private clouds.

Microsoft also is introducing enhancements to its web platform with Internet Information Services (IIS) 7.5 and support for ASP.Net on Server Core, a scaled-down installation of Windows Server 2008. The platform will rely heavily on PowerShell, Microsoft's scripting technology.

And Microsoft is developing new technologies such as server applications virtualisation so users can create application and operating system images that can be combined on the fly for quick and efficient deployment on any cloud-based environment.

Microsoft said public clouds will eventually take on a number of key characteristics.

"Availability will be king," Kelly said. He also said public clouds will be distributed and heterogeneous in nature and be able to expand and contract.

"These four elements are absolutely critical to deliver that service, that foundation to the private cloud," Kelly said.

VelMurugan

Sun updates Solaris with Nehalem

Sun Microsystems has released a new version of its venerable Solaris server operating system - expected to be the last release before the company's takeover by Oracle.

Enhancements include features that boost performance and lower energy consumption for servers powered by Intel's new Nehalem-class Xeon processors, and improvements to virtualisation and storage.

The next major release of Solaris, version 11, is scheduled for the middle of next year, said Larry Wake, group manager for Solaris marketing at Sun. He declined to speculate on the fate of Solaris after the Oracle acquisition.

After agreeing to buy Sun earlier this month, Oracle CEO Larry Ellison called Solaris "by far the best Unix technology available in the market" and the "heart of the business" for Sun.

Sun released Solaris in 1992 as a successor to its earlier SunOS. The then-proprietary operating system was historically tightly integrated with Sun's Sparc CPUs, though that has changed.

"We want to make sure Solaris runs well on the boxes we make, but also make sure it works on everybody else's systems," Wake said.

That commitment resulted in features tailored for Intel's new Nehalem CPUs such as a "power-aware dispatcher" that enables Solaris to aggregate workloads onto the fewest number of CPU cores needed and then turn off the rest, he said.
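
The consolidation idea is straightforward to illustrate: pack the offered load onto the fewest cores that can absorb it and power down the rest. The toy sketch below is only a conceptual illustration of that policy, not Sun's dispatcher code.

```python
import math

def cores_needed(offered_load, core_capacity=1.0):
    """Fewest cores that can absorb the offered load; the rest can idle."""
    return max(1, math.ceil(offered_load / core_capacity))

# Toy example: 2.6 cores' worth of runnable work on an 8-core system.
active = cores_needed(2.6)
print(f"schedule onto {active} cores, power down {8 - active}")
# schedule onto 3 cores, power down 5
```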

Solaris now can also offload network traffic processing from the CPU onto compliant network cards.

Sun released Solaris 10 early in 2005, at the same time announcing a plan to gradually make the OS fully open source. Its last major update to Solaris came in mid-2006, when it introduced its 128-bit ZFS file system and its Containers lightweight virtualisation technology, and also bundled the PostgreSQL open-source database.

Both ZFS and Containers were also updated with this release. ZFS now has a high-speed cloning feature that lets a new server point to existing data rather than copying and writing it a second, unnecessary time. That can speed up the creation of virtual machines under Containers, Wake said.

ZFS also can now recognise solid-state disks (SSD) and take advantage of their faster speed.

Sun officially released OpenSolaris last year. Some analysts speculate that Oracle might want to merge OpenSolaris with Linux. Wake declined to comment.

VelMurugan

Windows 7 could be out as early as October

Forget talk about next year: there are indications that Windows 7 may be available as soon as October.

Speaking at an Acer press launch, a company representative was quoted as saying that a new Windows 7 PC called the Z5600 would be one of the company's key products in the run-up to the Christmas holiday period, and would be on store shelves on 23 October.

Microsoft has so far refused to commit to an exact release date for Windows 7, but speculation has suggested that the company would release it later this year, rather than early 2010 as some executives have been predicting. The operating system was made available as a beta in January and a release candidate (RC) is now with developers.

The Windows 7 RC will be available to the public next week, and those who download it will be able to use it for free for a year.

The distribution of the RC is one of the last steps before the Windows 7 code is locked down and sent off to manufacturers ahead of its commercial release.

VelMurugan

Warning sounded over Win 7's XP emulation

A new Microsoft add-on for Windows 7 that will let some users run Windows XP applications in a virtual machine could create support nightmares for IT managers, analysts have said.

The company had announced last month that the add-on, called Windows XP Mode (XPM), will be available to users of the Windows 7 Professional, Ultimate and Enterprise versions once the new operating system ships later this year. Professional and Ultimate are the two highest-priced versions of Windows 7, while Enterprise is sold only through volume licensing agreements.

Analysts agreed that Microsoft needs to offer the add-on to help persuade users to upgrade to Windows 7, but they also noted that it could cause multiple problems for corporate users.

"This will help the uptake for Windows 7, because it removes one more gotcha, and that's never a bad thing to do," said Michael Cherry, an analyst at Directions on Microsoft.

Cherry added that Microsoft's decision to use virtualisation to provide backward compatibility is a nice "safety net" for users who lack access to Microsoft's Enterprise Desktop Virtualisation technology.

Michael Silver, an analyst at Gartner, echoed Cherry's take on what motivated Microsoft to offer XPM, but he added that it broadly expands support requirements. "You'll have to support two versions of Windows," he said. "Each needs to be secured, anti-virused, firewalled and patched. If a company has 10,000 PCs, that's 20,000 instances of Windows."

Silver also noted that the add-on might lead companies to neglect the important task of making sure their applications are compatible with Windows 7. "This is a great Band-Aid, but companies need to heal their applications," he said. "They'll be doing themselves a disservice if, because of XPM, they're not making sure that all their apps support Windows 7."

Silver added that while Microsoft is effectively extending the life of Windows XP by offering it as a Windows 7 add-on, it hasn't changed its plan to shift the older operating system out of mainstream support and provide only what it calls "extended" support until mid-April 2014.

"[XPM] will give some a false sense of security," Silver warned. "What happens in 2014, when XP isn't supported anymore? I think companies will be much better off if they make all their applications run on Windows 7."

VelMurugan

Windows 7 beta made available to public

The near-final version of Microsoft's next operating system, Windows 7, has become available to the general public.

Microsoft will collect feedback on the Windows 7 release candidate over the next few months, fixing small issues. The company allowed developers and other testers to begin downloading the release candidate last week.

Windows 7 comes nearly three years after Windows Vista, which took five years for Microsoft to engineer but was regarded by some as underwhelming. Microsoft hasn't said when the final Windows 7 version will be released, although it's rumoured to be out before year's end.

Microsoft warned that it is not offering technical support for the Windows 7 release candidate, so those who install it are on their own. Users should be familiar with installing an operating system from scratch, formatting a hard drive and backing up data, among other skills, Microsoft advised.

In the Windows 7 release notes, Microsoft warns of several problems that haven't been resolved, including issues with its latest Web browser, Internet Explorer 8 (IE8).

Debugging JavaScript with the developer tools in IE8 could throw up a warning that a website is not responding, but that warning can be ignored. Also, some web pages may have misaligned text or missing images. Microsoft recommends clicking on the "compatibility view" button on the address bar as a fix.

Microsoft released the Windows 7 beta in Arabic and Hindi, but those languages have been replaced with French and Spanish in the release candidate. English is available for both versions.

"We needed to ensure certain features were tested for worldwide functionality, and Hindi and Arabic help us test a number of language-related features," Microsoft said.

The Windows 7 release candidate will only work for so long. It is due to expire on June 1, 2010. Three months prior, the release candidate will automatically shut down a person's computer after two hours.

The Windows 7 beta expires on Aug. 1, and computers with that version will begin shutting themselves down after two hours beginning July 1.

Microsoft said that Windows Vista users will not need to reinstall their applications after upgrading to the Windows 7 release candidate. The company does, however, recommend backing up data as a precaution. Vista users will have to do a clean install, however, to go from the Windows 7 release candidate to the final version.

Windows XP users should back up their data and do a clean install of the Windows 7 release candidate.

To run the 32-bit version of the release candidate, a computer should have a 1GHz or faster processor, 1GB of RAM, 16GB of hard disk space and a DirectX 9 graphics processor with a WDDM (Windows Display Driver Model) 1.0 or higher driver.

For the 64-bit version, Microsoft recommends a 1GHz or faster processor, 2GB of RAM, 20GB of hard disk space and a DirectX 9 graphics processor with a WDDM 1.0 or higher driver.

VelMurugan

Intel delays Itanium chip again

Intel has once again pushed back the release of its next-generation Itanium server chip to the first quarter of 2010.

The Itanium chip, code-named Tukwila, was originally due for release by the middle of this year. Tukwila is being delayed for certain "application scalability" enhancements that Intel wants to make to the chip architecture, the company said. The chip maker declined to elaborate on the type of enhancements it plans to make.

During system testing, the company saw an opportunity to enhance the architecture for "highly threaded workloads where contention for system resources plays a dominant role in application scalability," said Patrick Ward, an Intel spokesman.

Itanium chips are 64-bit quad-core processors designed to run fault-tolerant servers that require high uptime. The chips use a different instruction set than x86 server chips, and are intended to compete with other server processors based on RISC architecture, like Sun's Sparc and IBM's Power chips. However, the chips have not seen much success, with only a few vendors like HP selling Itanium-based servers.

The Itanium is mainly designed for mainframe-based applications that require plenty of memory bandwidth, like scientific computing and financial transactions, said Jim McGregor, chief technology strategist at In-Stat. Hewlett-Packard has made huge investments in Itanium and may have asked Intel to make particular design changes to meet the needs of its enterprise customers, he said.

"The Itanium processor is pretty much a custom solution for HP. HP has a huge investment in this, and they buy most of the processors," McGregor said.

But the delay could affect HP's ability to win new customers as competitive chips like IBM's Power continue "firing on all cylinders," wrote Gordon Haff, an analyst at Illuminata, in a blog entry.

"Delays to Itanium matter less to Intel and the server makers who use it (meaning HP first and foremost) than in the case of x86 Xeon, where a few months' delay can have a major revenue impact," Haff wrote. Customers for servers like HP's Superdome and NonStop value enterprise-class capabilities more than performance, he wrote.

Itanium has been plagued with development problems that have delayed its release multiple times. Intel earlier this year delayed Tukwila's release to the middle of this year to add a faster interconnect and support for new technologies like DDR3 memory. The last Itanium chip, code-named Montecito, was released in 2006.

The delay, however, hasn't changed manufacturing plans for Tukwila: the chips will still be manufactured using the older 65-nanometre process, Intel's Ward said. Intel currently manufactures chips using a 45-nm process and will move to a 32-nm process later this year; it upgrades its manufacturing process every two years to make chips faster, smaller and more power-efficient.

Itanium chips are not volume chips like PC processors; they are largely customised to meet the needs of server makers, McGregor said. For these chips, performance and reliability matter more than size and power, and at server makers' request the extra transistors go towards scaling application performance, he said.

During an investor conference last week, Intel's CEO Paul Otellini tried to build enthusiasm around Itanium by pointing to the uncertainty surrounding the future of the Sparc chip after Oracle's acquisition of Sun. Otellini said customers could increasingly adopt Itanium servers as they abandon Sparc. Oracle CEO Larry Ellison has, however, denied any plans to abandon the Sparc chip.

VelMurugan

Microsoft pulls out of anti-trust hearing

Microsoft has pulled out of a hearing over EU anti-trust allegations that it "shields" Internet Explorer (IE) from competition, saying that senior regulators would have been unable to attend.

The EU's Competition Commission had scheduled the hearing for 3-5 June, when Microsoft would be allowed to argue against charges made in January that the company has an unfair distribution advantage because it includes IE with its operating system.

Microsoft blamed an inflexible EU for the cancellation, saying that a scheduling conflict meant European decision makers would be absent, making any presentation a waste of time.

"The dates the commission selected for our hearing, June 3-5, coincide with the most important worldwide intergovernmental competition law meeting, the International Competition Network (ICN) meeting, which will take place this year in Zurich, Switzerland," Dave Heiner, deputy counsel for Microsoft, said in a post to a company blog. "As a result, it appears that many of the most influential commission and national competition officials with the greatest interest in our case will be in Zurich and so unable to attend our hearing in Brussels."

When Microsoft submitted a several-hundred-page written response to the EU allegations, it was also given the June dates for a possible hearing. The company immediately asked the commission to reschedule, said Heiner. The commission refused.

"The commission has informed us that 3-5 June are the only dates that a suitable room is available in Brussels for a hearing," he said.

Without senior EU and European officials able to attend part or all of the hearing, Heiner said Microsoft decided to cancel. "While we would like an opportunity to present our arguments in an oral hearing, we do not think it makes sense to proceed if so many of the most important EC officials and national competition authorities cannot attend," he added.

According to the commission, Microsoft's refusal to meet the scheduled dates means that the company has technically withdrawn its request.

The case against Microsoft stems from a December 2007 complaint by Norwegian browser maker Opera Software, which said that the US company's "tying" of IE with Windows gives it an unfair advantage. In January, the EU sent Microsoft its official charge list, called a "Statement of Objections."

The commission has hinted that it may fine Microsoft and force it to change Windows so that users are offered alternate browsers, such as Opera, Mozilla's Firefox or Google's Chrome. The last two firms have joined the case as "interested third parties," which allowed them to see the EU's allegations and take part in the June hearing.

Although Microsoft added a "kill switch" to Windows 7 earlier this year that lets users block IE8 from running - the browser remains on the system - it has repeatedly refused to link that move to the EU action.

For its part, Opera has said it wants the commission to make Microsoft provide multiple browsers, including its own, perhaps via the Windows Update service that Microsoft uses to patch and upgrade its own software.

Admin(Portal)

It was an amazing thread, Velu.  :confused

Thanks for this excellent and informative share.
  8)
A.SK in short