Early users of Intel's new Xeon E5 CPU, designed for today's busy servers, say it is fast; the only problem is that it runs hot, a major concern in data centers today.
Jonathan Kozc of Swinburne University's Center for Astrophysics and Supercomputing has used an early sample of the new processor to perform complex mathematical calculations for his research into pulsars.
Speaking at Intel's launch for the E5 in Australia, Kozc said that the new CPU produced results “a factor of four” faster than those achieved with Intel's previous-generation Nehalem processors.
Stephen Gillard, CEO of outsourced digital production facility Studio Engine, also reported that the E5 delivers real speed. Studio Engine operates a computing platform comprising about 1,000 HP servers with Xeon 5640-series processors, stacked into 128-server racks.
The new E5 CPU achieved a Passmark score of over 30,000, against around 18,000 for the older model. But Gillard also had issues with the amount of heat the chips give off. His test server was a 2U machine, and he said that once the E5 reaches blade servers, he could not imagine a rack specified the way Studio Engine needs being comfortably cooled and deemed 'safe' to place in a modern data centre.
He nonetheless declared the E5 a good product and said it will be welcomed by the film-production community he serves, as its improved floating-point performance will allow more detailed animations to be created at much lower cost.
In other IT industry news
Scientists at Deutsche Telekom's labs have set a new record in the amount of data that can be sent down a single optical fiber, shattering the previous long-distance data-transfer record by more than a factor of two.
Specifically, a 512 gigabits-per-second transmission rate was achieved over a single optical fiber from Berlin to Hanover and back, a distance of 734 kilometers.
After subtracting the error-correction overhead, the usable bandwidth was 400Gb/s: enough, T-Labs points out, to transmit a stream of data equivalent to about 78 music CDs in just one second.
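That figure is easy to sanity-check. Here is a minimal sketch in Python, assuming a music CD holds roughly 640MB (the article does not state the CD size it used):

```python
# Back-of-envelope check of T-Labs' "78 CDs per second" claim.
usable_rate_gb = 400            # usable bandwidth after error correction, in Gb/s
cd_size_gb = 640 * 8 / 1000     # assumed ~640 MB per CD, expressed in gigabits (5.12 Gb)
print(f"{usable_rate_gb / cd_size_gb:.0f} CDs per second")  # -> 78
```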
In December of last year, a team of U.S. and Canadian researchers managed to sustain a computer-to-computer data transfer totaling 186Gb/s between the University of Victoria Computer Center and the SuperComputing 2011 convention in Seattle: a combined 88Gb/s in one direction and 98Gb/s in the other.
T-Labs sent all of its bits down a single 100 GHz fiber channel at just over five bits per cycle. "This tremendous transmission performance was reached using innovative transmission technology with two carrier frequencies, two polarization planes, 16-QAM quadrature amplitude modulation and digital offline signal processing for the equalization of fiber influences with soft-FEC forward error correction decoding in the receiver," said the researchers.
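"Five bits per cycle" is simply the line rate divided by the channel bandwidth, a spectral efficiency of 5.12 bits per second per hertz. A quick sketch; note that the per-carrier symbol rate in the second comment is our inference, not a figure stated by T-Labs:

```python
# Spectral efficiency: line rate divided by optical channel bandwidth.
line_rate = 512e9       # 512 Gb/s raw transmission rate
channel_bw = 100e9      # 100 GHz channel
print(f"{line_rate / channel_bw:.2f} bits/s per Hz")  # -> 5.12, "just over five bits per cycle"

# With 16-QAM carrying 4 bits per symbol on two polarizations and two
# carriers (16 bits per symbol period in total), this would imply a
# symbol rate of about 32 Gbaud per carrier (an inference, not a figure
# from the article).
print(f"{line_rate / (4 * 2 * 2) / 1e9:.0f} Gbaud per carrier")  # -> 32
```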
This new technology would enable a standard 48-channel, 100 GHz optical-fiber transmission/reception setup to achieve a total throughput of 24.6 terabits per second. A quick bit of multiplication shows that such a system could squeeze through 3,696 CDs' worth of data in one second.
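That multiplication, sketched out with the same assumed ~640MB CD as above (the article's 3,696 figure presumably rests on a slightly different CD size, but it is the same ballpark):

```python
# Aggregate throughput of a 48-channel system at 512 Gb/s per channel.
channels = 48
print(f"{channels * 512e9 / 1e12:.1f} Tb/s raw")        # -> 24.6 Tb/s, as quoted

# Usable aggregate, at 400 Gb/s per channel after error correction,
# again assuming ~640 MB (5.12 Gb) per music CD:
print(f"{channels * 400 / 5.12:.0f} CDs per second")    # -> 3750, close to the quoted 3,696
```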
However, and this is significant, there was no need to replace the fiber itself. As T-Labs notes, the channel was already in place; the modifications were made only to the transmitters and receivers. As such, the improvement in data-transmission rates could be achieved without the expense of laying new fiber.
"We are very proud of having attained this tremendous transmission performance over the internet under real-world conditions," said T-Labs manager Heinrich Arnold.
If these rates prove stable, repeatable, and relatively easy and inexpensive to deploy, Arnold and his team have much to be proud of, and the technique could greatly speed up data transmission over long distances.
In other IT industry news
Microsoft said that it has released the beta version of its forthcoming Windows 8 Server operating system, which will be available for download tomorrow morning, a day after it made the consumer preview of the Windows 8 client-side code available.
The software giant claims that its Server 8 OS platform has been designed with virtualization and cloud support in mind. The code includes version three of Windows' virtualization platform, Hyper-V, which the company promises will be able to set up many more virtual systems on the same hardware while ensuring that data can still be siloed when needed.
"With Hyper-V Network Virtualization, you can simply create virtual networks so that different business units, or even multiple customers, can seamlessly share network infrastructure," said Microsoft's corporate vice president for servers Bill Laing.
"You will be able to move VMs (virtual machines) and servers around without losing their network assignments," he added.
Microsoft is also looking to shift IT admins away from graphical user interface controls and get them to use PowerShell to manage the software instead. That is a bit of a retro step, in that Microsoft was once very keen on server GUIs.
The beta includes over 2,300 PowerShell commands, and Microsoft recommends dumping the GUI altogether, although you can still install it as an option if you prefer.
The download comes as a 64-bit ISO image in English, Chinese, French, German, and Japanese, and as a virtual hard disk (VHD) edition in English only.
Early reports indicate that the beta code is very stable, but as always with betas, users install and operate it at their own risk, and an initial install screen warns of the risks that come with any software released in beta form.
Source: Swinburne University's Center for Astrophysics and Supercomputing