Sun Microsystems To Provide Support For OpenOffice

Sun Microsystems on Monday plans to announce that it will provide support for the OpenOffice.org productivity software suite, citing a wave of momentum behind the open-source project.

The support, which starts at $US20 per user per year, will be offered to companies that distribute OpenOffice.org, not directly to end-users, according to senior director of marketing for StarOffice/OpenOffice.org and Network.com, Mark Herring. "For a lot of distributors, they wanted to distribute OpenOffice.org and had no option for back-line support," he said.

OpenOffice.org and StarOffice, Sun's accompanying commercial product, are compatible with Microsoft Office and essentially identical to each other in capabilities, which include word processing, spreadsheets and presentation software. But until now, Sun supported only StarOffice.

Another difference will remain -- Sun does not plan to provide indemnification against lawsuits for OpenOffice.org, as it does for StarOffice, Herring said.

Sun's move comes as OpenOffice.org is being downloaded 1 million times per week, with total downloads to date standing at about 110 million, Herring said.

Out of that number, Sun estimates that "tens of millions" of people are actively using the software, according to Herring. The most recent version is 2.3. Version 2.4 is expected in March and will contain significant new features, according to the OpenOffice.org website.

"Microsoft Office is still the dominant tool out there -- only a fool would deny that," he said. "But [OpenOffice.org] has had a huge amount of momentum."

Sun believes the average OpenOffice.org user skews younger, and that download activity in Europe and the US has been greater than in Asian countries, he added.

Developers can create extensions to the core OpenOffice.org suite. Sun has made a new one for shaving down the size of presentation files, Herring said. The wizard-like tool goes through a file and asks users whether they want to keep or compress the various elements, he said.

Sun plans to provide support for any extensions it creates, according to Herring. As for ones made by third parties, "we would have to work with them on that code on a case-by-case basis," he said.

Sun is also releasing StarOffice 8 Server. Herring described it as a conversion engine that changes 40 document types into PDF files. The server, which costs $11,000, is aimed at enterprises with large stores of legacy documents that aren't archived with an open standard, according to Herring.


Wireless LAN And Xbox 360 Signals Clashing

Microsoft's popular Xbox 360 game console can create a strong and strange signal on wireless LANs, according to IT staff at Morrisville State College.

It's not clear whether the signal disrupts the college's WLAN access points or students' wireless notebooks. There is some anecdotal evidence, however, that it at least affects other radios in the same 2.4GHz band.

Morrisville IT staff typically use Bluetooth headsets, which run in the 2.4GHz band, with their mobile phones when they troubleshoot problems on the spacious campus. "We had problems syncing our headsets to our phone where this signal was strong," says Matt Barber, the college's network administrator. A phone user had to physically touch the headset to the cell phone to make the initial connection, he says.

There may be effects on the WLAN that the equipment itself, from Meru Networks, is circumventing, according to Barber. Part of Meru's WLAN architecture employs software that gives the access points more control over wireless-client transmission behavior than does the software of some of Meru's rivals. An access point near a radiating Xbox may be compensating for interference by in effect guiding a wireless laptop to send and receive when open spectrum is available, essentially dodging around the Xbox signal.

Working with Meru, the small IT staff plans soon to test the effect of multiple Xbox consoles in a dorm with a large number of active notebook clients.

Network World has asked Microsoft to comment on the Xbox signal phenomenon, but the company was not able to reply before this story was posted. We'll update this report as soon as Microsoft provides information.

The latest version of the Xbox, the Xbox 360 Elite, went on sale earlier this year with a 120G-byte hard disk and a high-definition video interface.

Morrisville is a small college in rural New York state, taking its name from a nearby town. In summer 2007, the college deployed a campuswide 802.11a/b/g WLAN based on equipment from Meru. The plan was to replace those access points with Meru's new, two-radio devices that added support for Draft 2 of 802.11n, the IEEE standard that boosts throughput from 22Mbps to 25Mbps to at least 150Mbps to 180Mbps. That replacement was just completed, creating the first large-scale deployment of the new access points.

During the fall, Morrisville IT staff, working with Meru engineers and IBM, the network integrator, detected an unusual signal in the 2.4GHz band. "We wanted to look at the [radio frequency] environment in our dorms," Barber says. "We always thought we'd run into some strange stuff [there] in the 2.4 range."

The signal was discovered using Spectrum Expert, from Cognio (recently bought by Cisco). Spectrum Expert is RF-analysis software packaged with a WLAN adapter card that slots into any laptop PC. Among other capabilities, it identifies sources of radio energy in the 2.4GHz and 5GHz WLAN bands and pinpoints the cause, such as a brand of access point or a microwave oven.

"The signal really stood out," Barber says. "In some places it was so strong we thought it might be affecting the air [that is, the radio environment] around it."

The Cognio software, however, was baffled by this new signal: "Unknown emitter" was the classification. The signal shows up in the Cognio display as a kind of green-blizzard effect, covering a large swath of the 2.4 band, Barber says. That means the signal "is jumping all over the spectrum band," he says. In contrast, a nearby Meru access point shows up in the same scan as a strong, stable yellow-red glow, almost like a sun. The green blizzard is shot through with red dashes, which show, Barber says, that the signal at moments nearly rivals the access point in strength.

The mystery signal baffled the IT staff and Meru until Barber had a brainstorm: He brought in his own Xbox 360, plugged it in and turned on the Cognio spectrum analyzer. Presto: The same signal appeared.

Barber says the signal seems to be created by the console's embedded 2.4GHz radio, which is used to communicate with the handheld wireless controller -- the gizmo with the buttons that manipulate a game running on the console. The Xbox also takes an optional Wi-Fi adapter, in the form of a USB dongle, to connect to a WLAN access point.

Barber says his "best guess" at this point is that the embedded radio, not the USB adapter, causes the signal. The signal is created even if the Xbox console is shut off: Just plugging its AC adapter into an electrical outlet seems to trigger the radio to look for -- and keep looking for -- a companion wireless controller. "It's even worse when you have multiple Xboxes in an area," Barber says.

At one point, IT staff wrapped the console in a static discharge bag, the material used, for example, to wrap and protect consumer electronics gear from static damage during shipment. The same properties make it act like a radio "blanket" to muffle a transmission. Sure enough, the Cognio software showed a significant drop in the Xbox signal's strength.

The next step is more systematic testing. "We want to get several consoles together with a bunch of WLAN clients, to create a busy [RF] environment, and do some measurements," Barber says. "Are we seeing frames being dropped in the air, or people getting disconnected?"

Answering that question may be a bit more urgent, with Christmas looming and the likelihood of still more brand-new Xboxes and other wireless entertainment products turning up in January when students return.


German-Speaking Hackers Defrauding Banks In US, UK, Spain And Italy

Gregg Keizer
A German-speaking hacker crew is looting commercial bank accounts in four countries using a custom-built Trojan put in place by expertly crafted and extremely focused phishing attacks, a security researcher said Thursday.

The malware's most distinguishing feature, said Don Jackson, a senior security researcher at SecureWorks, is its ability to mimic the steps the human account owner would take to move money.

A variant of the Prg Banking malware, the new Trojan has stolen hundreds of thousands of dollars from accounts at some of the biggest banks in the US, the UK, Spain and Italy, said Jackson. "This is not widespread, but it is very dangerous. They've already stolen more than US$200,000 from the accounts we've monitored, but this has really flown under the radar."

Jackson also said he has found at least four servers that contain Prg configuration files and bogus versions of legitimate banking sites, as well as caches of data harvested by the Trojan.

The cleverness and technical know-how of the attackers was almost breathtaking. "If you were on the bank side of this connection [with the Trojan], it would appear to be a person on the other end running the account," Jackson said. "It would seem as if someone was clicking the keys on the virtual keyboard and sending wire transfers."

According to Jackson, the hackers -- who speak German, though they may not reside in Germany proper -- mined the vast amount of data collected previously by a less powerful generic version of Prg for evidence of commercial banking accounts, including specific URLs of offshore banks or indications of wire transfers.

The crew targeted commercial accounts, said Jackson, both because those accounts typically contain bigger balances and because they usually have the built-in ability to conduct wire transfers. Once they break into a business account, the hackers can quickly plunder it by using wire transfers to move its monies to hacker-controlled accounts.

With victim accounts picked, the hackers then create what Jackson called "very convincing" phishing e-mails and send them to the account owners, who have been identified using data stolen earlier. "They'll usually have the bank account number, and the first and last name of its owner," said Jackson, as well as security details, such as whether the account is protected by a one-time password. "The e-mail will claim that the user needs to download a new one-time password or soft token, but when the user clicks on the link and reaches the phish site, the Prg Trojan is downloaded instead."

From there, the highly automated account thief takes over. The malware alerts the hacker when the account owner is actually online with his bank, "piggybacking" on the session to silently steal the username and password without actually duping the user into entering it. Then using its ability to simulate keystrokes, the Trojan walks through all the steps a human being would take to, for instance, wire funds to another account. An account can be emptied in seconds.

"That's a very clever part of the Trojan," said Jackson. "How it downloads JavaScript from the command-and-control server so it looks like the [account owner] is accessing the account, not a bot." While less-sophisticated malware heads straight to a money transfer page without first appearing to "visit" the pages a real person would view before reaching the transfer page, Prg visits the bank's pages in order, as a person would. Because most antifraud software looks for automated, nonhuman behavior, Prg won't trigger a fraud alert.

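A hypothetical sketch (in Python, with invented page names) of the kind of naive navigation check such antifraud systems apply -- and which Prg's page-by-page walk is designed to slip past -- might look like the following. It is an illustration of the idea only, not a description of any bank's actual controls.

    # Hypothetical illustration: flag a session that jumps to the wire-transfer
    # page without passing through the pages a person would normally visit first.
    # The page names are invented for this example.
    EXPECTED_PATH = ["login", "account_summary", "payments", "wire_transfer"]

    def looks_automated(pages_visited):
        """Return True if the session skipped the usual path to the transfer page."""
        if "wire_transfer" not in pages_visited:
            return False
        idx = pages_visited.index("wire_transfer")
        return pages_visited[:idx + 1] != EXPECTED_PATH[:idx + 1]

    # A crude bot that jumps straight to the transfer page is flagged...
    print(looks_automated(["login", "wire_transfer"]))                                  # True
    # ...while a Prg-style session that walks the pages in order is not.
    print(looks_automated(["login", "account_summary", "payments", "wire_transfer"]))   # False
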
Each bank site has had customized code written for it, Jackson added, to make updating the Trojan-controlled PCs easier. If the hackers need to change the destination account -- because it's been spotted and frozen by local law enforcement, say -- a new one can be fed to the Trojans from the server.

"Fewer than 20 banks have been hit by this so far," said Jackson, "but they include some of the biggest banks in the US, UK, Spain and Italy."

He came close to praising the criminals. "To me, the automation of this is very, very crafty."

The surest defense against the Prg Trojan, Jackson concluded, is to be suspicious of any e-mail received from a bank. "Even if you recognize the sender, you should confirm that the sender sent that message before clicking on any links."


Dell Notebooks Come With Integrated Mobile Broadband

Rodney Gedda

Dell has announced the addition of BigPond, using Telstra's Next G mobile network, as an option for integrated mobile broadband in a number of its notebooks.

From today, 13 notebook models will sport built-in access to BigPond wireless broadband, with pricing starting at $34.95 per month on a 12-month plan.

Dell previously only offered built-in support for Vodafone mobile broadband.

Dell client computing strategist Jeff Morris said the move is in response to direct customer feedback and the company has worked with BigPond to make the wireless broadband experience "simple, easy to buy and to use".

BigPond group managing director Justin Milne said customers simply need to "fire up your new notebook, run the connection manager, pick a plan, and you're online".

Notebooks that support mobile broadband are available throughout Dell's Latitude, Precision, Inspiron, Vostro and XPS ranges.


RecoverGuard Provides Error-Proof Disaster Recovery Plan

Mario Apicella


In IT, change is the only constant, as hardware and software are updated almost continuously. Companies that take business continuity seriously protect themselves by creating a recovery site to run vital business processes during an emergency.

Needless to say, keeping the recovery site current is essential to business continuity, but given the constant flux of hardware and software updates, the outcome of that effort is often uncertain.

And this uncertainty is compounded by the fact that changes to the IT infrastructure are often automated, whereas replicating those updates to the DR (disaster recovery) site remains a manual, error-prone activity.

An overlooked change could cripple your business in the event of a disaster. Think, for example, how damaging it would be if an important database was moved to a different volume to improve performance but that change was never replicated at the recovery site.

Is there a better way than zealous attention to detail to keep a DR plan effective? According to startup Continuity Software, its recently announced RecoverGuard 2.0 is the answer.

Think of RecoverGuard as a watchdog that can automatically compare the details of two IT infrastructures, then find and report their differences. Not only does RecoverGuard continuously monitor the two sites, but it also automatically creates a problem ticket when discrepancies arise.
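
As an illustration of the general idea -- not Continuity Software's actual data model or code -- a configuration watchdog of this kind boils down to diffing two site snapshots and raising a ticket for every mismatch, roughly as in this minimal Python sketch:

    # Minimal sketch of a primary-vs-DR comparison. The Volume fields and the
    # snapshot format are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Volume:
        size_gb: int
        raid_level: str

    def compare_sites(primary, recovery):
        """Return one 'ticket' string for each discrepancy between the two sites."""
        tickets = []
        for name, vol in primary.items():
            dr = recovery.get(name)
            if dr is None:
                tickets.append(f"{name}: present at primary site, missing at DR site")
            elif (dr.size_gb, dr.raid_level) != (vol.size_gb, vol.raid_level):
                tickets.append(f"{name}: configuration drift (primary {vol}, DR {dr})")
        return tickets

    primary = {"crm_db": Volume(500, "RAID10"), "mail": Volume(200, "RAID5")}
    recovery = {"crm_db": Volume(250, "RAID10")}   # resized volume; "mail" never replicated
    for ticket in compare_sites(primary, recovery):
        print("TICKET:", ticket)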

It's interesting to note that RecoverGuard has a bottom-up, data-comes-first discovery process that initially identifies the storage objects of a site, then seeks out the hosts that own them.

During discovery, RecoverGuard builds an accurate topology map of the datacenter that admins can use to better understand and solve problem tickets.

According to Continuity Software, RecoverGuard 2.0 brings some interesting improvements over previous versions, including a more efficient and faster discovery process, and a Dashboard that empowers nontechies to manage this critical business activity.

How intrusive is RecoverGuard? Not very, according to the vendor. In fact, it sits on a dedicated Windows machine and doesn't require you to install agents on your servers. Understandably, you'll have to provide the software with ample authentication credentials, just as you give your security guards keys to open every door in the building.

I liked just about everything I heard and saw during my briefing and demonstration with Continuity Software, including its assessment challenge -- a sort of gauntlet thrown at your current DR procedure.

It goes like this: Continuity Software volunteers to perform a risk assessment that won't cost you anything if no damaging difference is found between your primary and recovery sites.

What happens if a significant inconsistency is found? Well, then, you pay US$15,000 for the assessment, plus a yearly license fee of US$2,000 per server. Are you confident enough to take that challenge?


Solid-State Drives Coming To Market Soon

John Brandon

For laptop owners, flash-memory drives boost battery life and performance while making notebooks lighter and more bearable for frequent business travelers. In the data center, benefits include higher reliability than their magnetic counterparts, lower cooling requirements and better performance for applications that require random access such as e-mail servers.

So far, the biggest barriers to adopting solid-state drives (SSD) in the data center have been price and capacity. Hard disk drives (HDD) are much less expensive and hold much more information. For example, a server-based HDD costs just US$1 to US$2 per gigabyte, while SSD costs from US$15 to US$90 per gigabyte, according to IDC.

Capacities are just as disparate. Samsung's SSD holds only 64GB, although the company plans to release a new 128GB version next year. Meanwhile, Hitachi America makes a 1TB HDD that's energy efficient and priced at US$399 for mass deployment in servers.

Enterprise Strategy Group analyst Mark D. Peters explains that solid-state technology has been on the radar for years, but has not been a "slam-dunk" in terms of price and performance for corporate managers. That's about to change, he says, because the IOPS (input/output operations per second) benefits of SSDs are too impressive to ignore. Among the advantages: an SSD has no moving parts, lasts longer, runs faster and is more energy efficient than an HDD.

And prices are falling fast. Right now, the industry trend is a 40% to 50% drop in SSD pricing per year, according to Samsung.

The arrival of hybrid drives such as Samsung's ReadyDrives -- which use both SSD and HDD technology -- and SSD-only servers "suggests the time for SSD as a genuine -- and growing -- viable option is getting closer," says Peters. He was referring to the recent IBM announcement about BladeCenter servers that use an SSD.

"Price erosion, coupled with increased capacity points, will make SSDs an increasingly attractive alternative to HDDs" in data centers, agrees Jeff Janukowicz, an analyst at IDC.

Two examples of how SSDs solve persistent throughput problems in high-performance computing show how the technology may make new inroads in corporations in 2008, some industry watchers believe.

Solid-state at the Stanford Linear Accelerator Center

At this research center, SSD is being used for some of the most data-intensive work going on today. The Stanford Linear Accelerator Center (SLAC) uses particle accelerators to study questions, including where antimatter went in the early universe and what role neurexin and neuroligin proteins play in autism.

The amount of data is immense -- in the petabytes -- and the lab uses a cluster of 5,000 processor cores. Despite that, the discrete chunks of data that are requested and analyzed by several hundred researchers are highly granular -- usually just 100 to 3,000 bytes of information. At the same time, scientists tend to perform thousands of data requests, accessing a few million chunks of data per second.

Richard Mount, SLAC's director of computing, explains that the response time for these researchers' data requests is limited not by the number of processors or by the amount of network bandwidth, but rather by disk access time. "Flash memory is over a thousand times faster than disk drive technology," says Mount. "Hard disks are limited to around 2,000 sparse or random accesses per second. When accessing thousand-byte chunks, this means that a disk can use only 1/50th of a gigabit-per-second network link and less than 1/100,000th of a typical computer center network switch capacity."
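
A back-of-the-envelope check of those figures, using the round numbers in the quote (the exact fraction depends on the chunk size assumed), shows why random disk access caps utilization of a gigabit link at a percent or two:

    # Rough arithmetic with the figures Mount cites: about 2,000 random accesses
    # per second at roughly 1,000 bytes per access, against a 1Gbit/sec link.
    accesses_per_sec = 2_000
    chunk_bytes = 1_000
    link_bits_per_sec = 1_000_000_000

    disk_bits_per_sec = accesses_per_sec * chunk_bytes * 8
    print(f"Random-access throughput: ~{disk_bits_per_sec / 1e6:.0f} Mbit/sec")         # ~16 Mbit/sec
    print(f"Fraction of a 1Gbit link: ~1/{link_bits_per_sec / disk_bits_per_sec:.0f}")  # ~1/62, the same order as the 1/50th Mount quotes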

This limitation has translated into the need to make what the lab calls "skim data sets." In other words, preassembled collections of related data that at least one researcher has already requested. "There is no waiting for skim data sets that already exist, but if somebody wants one that does not already exist, then they normally have to wait for a skim production cycle that takes place once every four to six months," Mount says.

To help researchers receive data in a more ad hoc manner, flash storage may be just the thing. "We have no religious attachment to flash, but we can construct flash-based storage at a reasonable cost and around 25ms latency, and we are doing so."

SLAC has developed its own SSD-based system that is in the final debugging stages, Mount explains. "The first version of this will provide about 2TB of storage, but we can easily grow this to 5 or 10TB just by buying flash chips," though he reckons the scalability will require "more serious expenditure." At the 2TB level, it will serve as a test and development system only.

Eventually, the goal is to use SSD technology as a cache for all particle accelerator research, which will allow scientists to access data at any time from any data store. "SSDs help the entire system run more efficiently by ensuring the I/O capability is in balance with the rest of the application system," adds IDC's Janukowicz. "The characteristics of flash-based SSDs make them a well-suited alternative for high-IOPS applications that are read intensive. SSDs have no rotational latency and have high random-read performance. Thus, with SSDs the time to access the data is consistent and very small regardless of where on the device the data is held."

Considering SSD at the Pacific Northwest National Laboratory

At the Pacific Northwest National Laboratory (PNNL) in Washington, solid-state technology could help alleviate a supercomputer bottleneck. At the lab, researchers run tests that sustain a write speed of 80Gbit/sec. and a read speed of 136Gbit/sec. Yet one or two slow hard disk drives running at one quarter the speed of the other disks cause performance to degrade quickly.

"Solid-state devices such as flash drives can use a RAID striping technique to achieve high streaming bandwidth -- just like [hard] disk drives -- while also maintaining very low latency for random access," says Robert Farber, a senior researcher at PNNL. "This is a very exciting combination."

The lab has not moved to solid-state technology yet. But Farber says the real debate is whether low-latency access for "seek-limited applications" -- in other words, many requests for small amounts of data -- can alleviate the pressure of computing bandwidth. It is not solely a price-per-gigabyte debate. "It remains to be seen how much of a price premium consumers will tolerate before robustness, power, storage capacity and physical space differences cause a mass departure from magnetic media," Farber says.

At PNNL, the bandwidth goal for its last supercomputer was 25Mbit/sec. per gigaflop of peak floating-point performance, mostly to handle the data-intensive nature of the NWChem scientific software calculations it runs. The lab's new environmental molecular sciences facility contains a new supercomputer with a theoretical peak floating-point performance of 163 teraflops. And, as at the Stanford lab, disk speed is a critical part of the equation, so solid-state is the front-runner for solving the bottleneck.

One breakthrough Farber expects in the not-too-distant future: Operating systems will change their memory hierarchy to directly access SSD, turning the technology into a hard drive replacement for mass storage.

Complementary, not replacement tech for most users

One question that remains: When will SSD really impact the corporate world? Some say SSD in the data center is just on the horizon, since laptops such as the Dell XPS M1330 use a Samsung 64GB SSD. Alienware also offers a 64GB option in some of its desktop computers. And SSD is applicable across the commercial landscape; while researchers need the speed to study proteins, retailers may need or want faster POS transactions.

One company to watch in this space: Violin Memory. The company's Terabyte-Scale Memory Appliance provides over 1Gbit/sec. access for both sequential and random operations. SLAC's Mount says he tested a DRAM-based prototype appliance from Violin, and that its upcoming flash-based system "seems a good match for our applications."

A Violin spokesman explains that the two key bottlenecks in corporate computing are network speeds and IOPS for storage systems. Today, disks run at about 100Mbit/sec. for sequential operations, but only 1Mbit/sec. for random 4k blocks, he says.

"In some cases, there are minimal capacity requirements which are well suited for SSDs," Janukowicz adds. "Also, in high-performance applications, the IOPS metrics can favor SSDs over HDDs." However, even with all those benefits, he says that "IDC does not see SSDs completely replacing HDDs in servers. SSDs do offer performance advantages and are a 'green' solution. However, there are many applications that require the capacity provided by HDDs."

Enterprise Strategy Group's Peters says that throughput requirements will lead to a gradual shift away from hard disk drives to solid-state technology, but it will take time in the corporate world. "Moving wholeheartedly from one technology to another is a rare thing within data centers," he says.

John Brandon worked in IT management for 10 years before starting a full-time writing career. He can be reached at jbrandonbb@gmail.com.


MIT Finally Completes Its OpenCourseWare Project

John Cox

MIT this week announced an important digital achievement: the completion of its pioneering OpenCourseWare project. And everyone involved seems quite happy with being unsure about why exactly it's important.

The achievement is digitizing all the classroom materials for all of MIT's 1,800 academic courses, putting them online, and inviting anyone and everyone to do whatever they want with that information. It's called the OCW project, and it's spawning a global movement to make what had been jealously guarded education resources accessible to educators and learners everywhere.

You can find the outline of a course in the fundamentals of data networking, with a syllabus and lecture notes. There's a PowerPoint presentation from 2006 on "Trends in RFID Sensing".

Proposed in 2000 by a faculty committee, announced in 2001, and launched in 2002, OCW has received US$29 million in funding, US$5 million from MIT and the rest from foundations and contributors. One key backer, the William and Flora Hewlett Foundation, has decided to invest another US$100 million over five years in various open education projects, largely because of its experience with OCW, according to Marshall Smith, director of the foundation's education program.

MIT has taken a step in doing something more with OCW. As part of Wednesday's celebration on the MIT campus in Massachusetts, MIT President Susan Hockfield announced a new portal for OCW, one designed specifically for high school teachers and students. Dubbed "Highlights for High School," the portal's home page selectively targets MIT's introductory science, engineering, technology and math courses, with lecture notes, reading lists, exams and other classroom information. The OCW resources, including video-taped labs, simulations, assignments and other hands-on material, have been categorized to match up with the requirements of high school Advanced Placement studies.

It's that "letting them do whatever they want" part that creates the uncertainty about why OCW is important. The data on usage are impressive. In the five years since the launch of OCW, with a 50-course pilot site, an estimated 35 million individuals have logged in. About 15% are educators, 30% are students, and the rest are what MIT calls "self learners" with no identifiable education affiliation, says Steve Carson, OCW's external relations director.

The recently formed OpenCourseWare Consortium has 160 member institutions, creating and sharing their own sites, on the MIT model. Something like 5,000 non-MIT courses are now available globally, some but not all using material from the OCW Web site.

Yet, one of the most striking statistics is from a completely unexpected source: iTunes, Apple's Web site for music and videos. MIT President Hockfield said she was told in September by her daughter to check out the iTunes list of most-popular videos. To her astonishment, Hockfield found two OCW videos in the top-10 listing. "No. 3 was 'classical mechanics,'" she said. "No. 7 was 'differential equations.' Go figure."

"This expresses, to me, the hunger in this world for learning, and for good learning materials," she told her audience.

A distinguished group of speakers and panelists at the MIT event all agreed that OCW represents...well, something.

"We're unlocking a treasure trove of materials," said Steve Lerman, MIT's dean for graduate students, and chairman of the OCW Faculty Advisory Committee.

OCW's resources will factor large in plans by the government of India to create a massive expansion of educational resources, according to Sam Pitroda, chairman of the government's Knowledge Commission, which is charged with making specific recommendations on how to spend the new US$65 billion the government will invest in education over the next five years. The nation has over a half-billion people younger than 25, Pitroda says. Just one of a series of almost unimaginable goals is to increase the number of universities from 350 today to 1,500 in five years, he said.

Pitroda said the scale of such goals requires questioning basic assumptions about what education is and how it is accomplished. "We don't have enough resources to train teachers and build an entire [traditional] infrastructure to support them," he said. Hence, the commission's interest in open projects like OCW, which hold the promise of a massive transfer not only of knowledge but of teaching approaches and learning structures that can be adapted to local requirements and cultures.

"Given this expansion, OCW plays a key role in these emerging experiments" in education, Pitroda said.

Former Xerox Chief Scientist John Seely Brown, sounding what for him is a recurring theme, said Web technologies in education are creating a new generation of tinkerers, who tinker with content online rather than nuts and bolts. This is the domain of mashups, of combining existing content from various sources and media to create new, often more complex creations, often in the context of a community of peers who share a common passion.

"Maybe the next stage for OCW is shifting from [a focus on] content to actions on or with the content," he said. "We have the ability to bring back tinkering, which is the basis of our intuition. We get our intuitions from playing around with stuff."

Their musings prompted further musings from the audience.

Someone wondered if the new technologies both inspiring and enabling OCW and other projects have rewired the brains of the next generation, so that entirely novel ways of teaching and learning are now needed. Another asked whether, if these technologies were democratizing learning, that didn't call into question the classic idea of the university as a "certifier," through its degree programs, that a student has acquired a certain degree of knowledge. Still another wondered how OCW could be augmented by faculty from around the world while balancing a need for maintaining some criteria of excellence.

These and many other questions will have to be addressed as part of a developing global conversation about the "meta university," suggested Charles Vest, MIT's former president and an early and enthusiastic backer of OCW. This concept is an attempt to blend what Vest described as the "deeply human activities" of teaching and learning, with advances in information technology that are making possible new tools for those activities: vast digital archives, open digital publications such as the Public Library of Science, projects like the Sakai open source learning management system, and projects like MIT's iLabs, which lets students around the world use the Internet to access automated lab equipment, run automated experiments, and analyze and share data.

"The emotion I feel right now is humility," said Hal Abelson, professor of computer science and engineering at MIT, and founding director of Creative Commons, a non-profit that offers free tools for content creators to mark their online creative work with the freedoms and permissions they want the work to carry. "What OCW has led us to see is that we're in something like 'Education 1.0.' What comes next? We're imagining the future."


Cisco Confirms Its VoIP Phones Can Be Used To Spy On Remote Calls

Linda Leung

Cisco confirmed it is possible to eavesdrop on remote conversations using Cisco VoIP phones. In its security response, Cisco says: "an attacker with valid Extension Mobility authentication credentials could cause a Cisco Unified IP Phone configured to use the Extension Mobility feature to transmit or receive a Real-Time Transport Protocol (RTP) audio stream."

Cisco adds that Extension Mobility authentication credentials are not tied to individual IP phones and that "any Extension Mobility account configured on an IP phone's Cisco Unified Communications Manager/CallManager (CUCM) server can be used to perform an eavesdropping attack."

The technique was described by Telindus researcher Joffrey Czarny at HACK.LU 2007 in Luxembourg in October.

Cisco has published some workarounds to this problem in its security response.

Also in October, two security experts at hacker conference ToorCon9 in San Diego hacked into their hotel's corporate network using a Cisco VoIP phone.

The hackers, John Kindervag and Jason Ostrom, said they were able to access the hotel's financial and corporate network and to record other phone calls, according to a blog on Wired.com.

The hackers ran penetration tests using a tool called VoIP Hopper, which mimics the Cisco data packets sent at three-minute intervals and then creates a new Ethernet interface, getting the PC - which the hackers switched in place of the hotel phone - onto the network running the VoIP traffic, according to the blog post.

The Avaya configuration is superior to Cisco's, according to the hackers, because an attacker has to go beyond sniffing and actually send requests, although it can be breached the same way, by replacing the phone with a PC.


Botnet Ringleader Busted In New Zealand

The swoop is the FBI's second phase of "Operation Bot Roast" -- the same operation which resulted in four felony charges against 26-year-old Los Angeles security consultant, John Schiefer.

The New Zealand suspect, who goes by the name of "AKILL", came under fire after an information sharing exercise between the New Zealand Police, the US Secret Service and the FBI. He has been interviewed by New Zealand police and investigators have seized computers from his home.

FBI director Robert S Mueller III, said botnets are the "weapon of choice" for cybercriminals.

"They seek to conceal their criminal activities by using third party computers as vehicles for their crimes. In Bot Roast II, we see the diverse and complex nature of crimes that are being committed through the use of botnets," he said.

Since "Operation Bot Roast" was announced last June, eight individuals have been indicted, pled guilty, or been sentenced for crimes related to botnet activity, according to the FBI. Law enforcement agencies have also served 13 search warrants in the US and overseas.

FBI Assistant Director James E Finch, Cyber Division, warns users to protect their systems.
"Practicing strong computer security habits such as updating antivirus software, installing a firewall, using strong passwords, and employing good e-mail and Web security practices are as basic as putting locks on your doors and windows.

"Without employing these safeguards, botnets, along with criminal and possibly terrorist activities, will continue to flourish," he said.

Except for Alexander Dmitriyevich Paskalov, 38, all individuals identified by the FBI in "Operation Bot Roast II" are male, US citizens, under 30 years old.


Micron Technology To Release 64GB Solid-State Drives In 2008

Michael Kanellos

Micron Technology, the Boise, Idaho-based maker of DRAM and flash memory, this week unveiled plans to come out with solid-state drives. The drives function like regular hard drives. But instead of storing data on spinning disks, solid-state drives store it on NAND memory chips -- the kind found in cameras and MP3 players.

Micron will start mass-producing solid-state drives in the first quarter of 2008. The first drives will hold either 32GB or 64GB of memory. While that's less than half the capacity of the average notebook drive today, it's actually more storage than most business users need, said Dean Klein, vice president of memory system development at Micron. Plus, solid-state notebooks can come out of deep sleep or launch applications far more rapidly.

"Sixty GB to 80GB is the sweet spot for the notebook market," he said.

Micron didn't talk pricing, but the drives will likely cost a few hundred dollars, a stumbling block. For example, swapping out a 160GB standard hard drive for a 64GB solid-state drive (from Samsung) on a Dell XPS 1330 notebook costs an additional US$950. Considering that the notebook with the 160GB drive already costs US$1,599, the solid-state drives aren't exactly economical.

Nonetheless, the magic of Moore's Law and the ability of memory makers to take it on the chin are going to make these drives more affordable. The first thing that will happen is that toward the end of 2008, solid-state drive makers will start to incorporate multilevel cell flash chips in the drives, Klein said. Manufacturers currently use single-level cell flash memory.

Multilevel cell chips hold two (and soon four) bits of data per cell. The chips aren't as reliable as single-level cell memory, but the error rates are small enough to make these types of drives more than adequate for the notebook market, he added.

In addition, multilevel cell chips will enable drive makers to increase the capacity of their drives, driving down the price. At equal capacities, multilevel cell chips could cut the price of making a drive by roughly 40 percent, estimated Frankie Roohparvar, vice president of NAND development at Micron.

Meanwhile, the world is swimming in NAND flash, leading to drastic price declines. NAND prices are set to drop 57 percent this year and 52 percent next year, said Joseph Unsworth, an analyst at Gartner.

Put those two factors together, and it could be possible to come out with a 64GB solid-state drive for close to US$300 toward the end of 2008, Unsworth speculated. That's still high. He estimates that only eight million solid-state drives will get shipped in all of '08.

But after that, the industry should begin to be able to show the benefits of these kinds of drives, the Micron executives predict. Even if price declines begin to slow, 64GB drives will likely move toward the US$200 range by late 2009 and then drop to sub-US$100 about 18 months after that. Hard-drive makers will continue to increase the density of their products at the same time, of course, but competition between the two technologies will become tighter.
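
As a rough check on that trajectory -- an illustration built from the figures above, not an additional forecast -- the path from about US$300 at the end of 2008 to the US$200 range a year later and to sub-US$100 roughly 18 months after that is what steady annual declines in the mid-30s to 40 percent range would produce, somewhat slower than the 52 to 57 percent NAND price drops Gartner cites, which squares with the caveat that price declines may slow:

    # Compound a constant annual price decline; the rates and the $300 starting
    # point are illustrative, taken from the estimates quoted in this article.
    def price_after(start, annual_decline, years):
        return start * (1 - annual_decline) ** years

    for decline in (0.35, 0.40, 0.50):
        late_2009 = price_after(300, decline, 1)          # one year after end-2008
        mid_2011 = price_after(late_2009, decline, 1.5)   # about 18 months after that
        print(f"{decline:.0%}/yr: ~${late_2009:.0f} in late 2009, ~${mid_2011:.0f} some 18 months later")
    # 35%/yr gives ~$195 then ~$102; 40%/yr gives ~$180 then ~$84; 50%/yr gives ~$150 then ~$53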

It happened in MP3 players, after all. Most upscale players came with 1.8-inch drives. The industry, however, at one point abruptly switched to flash.

Unsworth said the flash makers are going to have to tout the supposed benefits of having a flash drive with less capacity than a spinning disk (better battery life, can withstand a drop from a table better, you may not need all that storage, etc.).

He added that notebook makers will have to cooperate by making smaller laptops that showcase the features of flash. Flash takes up less space and, because it doesn't radiate as much heat, you can eliminate a fan. Currently, the notebooks that contain flash are basically the same size as the hard-drive models.

"With MP3 players, it was easy. You just turn it sideways and quote the battery life," Unsworth said.


QuickTime Vulnerability Affects Mac OS X Too

Gregg Keizer

The QuickTime vulnerability disclosed last week in the Windows version of the media player also affects Mac OS X, Symantec said yesterday.

According to additional research by Symantec's security response team, the Real-Time Streaming Protocol (RTSP) bug in QuickTime is also present in the Mac versions of Apple's media player. "We tested it, and the [proof-of-concept] exploit does cause a denial of service," said Marc Fossi, manager of the Symantec team, explaining that the Windows-specific attack code fails to give a hacker access to a Macintosh but instead causes QuickTime to crash.

However, Fossi cautioned Mac users against believing that they are in the clear. "QuickTime vulnerabilities have tended to affect both Windows and Mac OS X, and it's always possible that a denial of service could lead to remote code execution," he warned.

Fossi also said that on Windows, it now appears that Microsoft's Internet Explorer Versions 6 and 7, as well as the beta of Apple's Safari browser, will offer some additional protection against attacks that are based on duping users into visiting malicious or compromised sites hosting rigged streaming content.

"The buffer overflow protection built into IE and in Safari prevents the exploit shell code from executing in the [QuickTime] plug-in," said Fossi. To successfully attack a user via IE or Safari, the current exploit example would have to be refined, Symantec added in a posting to its security blog Monday.

Firefox, however, provides no such protection, Fossi noted, because it passes off the streaming content request directly to the stand-alone QuickTime Player, ceding control to the Apple program.

Any attack on a Windows XP or Vista PC that circumvents the browser -- by delivering a rogue attachment that when opened automatically launches the QuickTime Player -- executes and compromises the computer, Symantec's research showed.

Fossi also repeated a warning given over the weekend that in-the-wild use of the exploit will probably come sooner rather than later. "When there is an exploit that's fairly complete, like this one, attacks could be soon in coming," he said. "It doesn't take someone very long to develop a malicious exploit when they start with something like this."

He recommended that network administrators block RTSP traffic -- the protocol's default outbound port is TCP 554 -- and that users steer clear of links to untrusted sites and be wary of file attachments.
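
For administrators who do filter the port, a quick way to confirm from a desktop that outbound RTSP is actually blocked is a simple connection test along these lines -- a minimal Python sketch, with a placeholder hostname standing in for any server known to accept RTSP connections:

    import socket

    # Check whether outbound TCP 554 (RTSP's default port) is reachable.
    # "rtsp.example.com" is a placeholder; substitute a host that accepts RTSP.
    def rtsp_egress_open(host="rtsp.example.com", port=554, timeout=5.0):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True      # connection succeeded, so RTSP egress is open
        except OSError:
            return False         # refused, filtered or timed out

    print("outbound RTSP reachable" if rtsp_egress_open() else "outbound RTSP blocked or unreachable")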

Even though Apple has issued six QuickTime security updates so far this year and patched a total of 31 vulnerabilities, Fossi declined to call Apple's player less secure than those of rivals, such as Microsoft's Windows Media Player.

"Windows Media Player has had its share of vulnerabilities," he said. "A lot of people have media players installed on their computers, and they make pretty nice targets. Multimedia in general is a nice target for attackers. Most people know not to open a Word doc or an executable file, but with multimedia, everyone has a natural curiosity to see what their buddy just sent them.

"Then, bang."

Apple did not respond to questions about the QuickTime vulnerability and its plans for patching the program.


Electricity From Renewable Energy Is Google's New Goal

Google said Tuesday it will invest in developing ways to create electricity from renewable energy sources that will be cheaper than the electricity produced from coal.

Google Co-founder Larry Page said that in 2008 the company plans to spend tens of millions of dollars on research and development and other related investments in renewable energy. Currently, coal supplies 40 percent of the world's electricity, according to Google.

"We have gained expertise in designing and building large-scale, energy-intensive facilities by building efficient data centers," said Page in the statement. "We want to apply the same creativity and innovation to the challenge of generating renewable electricity at globally significant scale, and produce it cheaper than from coal."

Page said Google's goal is to produce one gigawatt of renewable energy capacity that is cheaper than coal.

"We are optimistic this can be done in years, not decades," he said, adding that one gigawatt can power a city the size of San Francisco.

Page said that if Google meets that goal and large-scale renewable deployments are cheaper than coal, then the world could meet a large part of its electricity needs from renewable energy sources and significantly reduce carbon emissions. "We expect this would be a good business for us as well," he said.

Google.org, Google's philanthropic arm, will invest in the initiative, known as Renewable Energy Cheaper Than Coal.

Google.org is now working with two companies on renewable energy technologies: eSolar, a California-based company specializing in solar thermal power that replaces the fuel in a traditional power plant with heat produced from solar energy; and Makani Power, a California-based company developing technologies to harness the wind for energy.

Google said Tuesday's announcement is just the latest step in its commitment to a clean and green energy future.


Earning The CISSP: What It Takes And What It Is Worth

Greg Schaffer

Recently, I became an International Information Systems Security Certification Consortium (ISC2) Certified Information Systems Security Professional (CISSP). The pursuit was difficult, but that was to be expected, as the certification is one of the most sought-after information security credentials. Like many certifications, it can add significant bargaining weight when changing positions or jockeying for a raise.

Certifications don't necessarily make or break one's career, but can contribute to one's overall package. Whether you're satisfied in a position or looking to upgrade, it's in your best interest to stay as knowledgeable and marketable as possible. Understanding that certifications may not be a panacea but certainly have value is the first step in determining which certifications (if any) are worth pursuing based on your career goals.

The value of certifications

There has been much debate over the validity and usefulness of certifications, but one thing is clear: knowledge without the ability to apply it is functionally useless. That's one reason why some certifications require significant real-world experience as part of the certification process. IT recruiters are keenly aware of this.

"You may be a whiz at taking certification exams," says John Estes, vice president at IT staffing agency Robert Half Technology, "but if you don't have the benefit of troubleshooting [experience] in a business environment, you won't last long." Justin Keller, an infrastructure recruiter at TEKsystems Inc., agrees. "Certifications are something that will set apart qualified candidates from the rest of the field but they cannot be expected to replace real life experience," Keller says.

However, there has to be some value to a certification besides a fancy certificate for display on the wall. Overall, it's not unreasonable to expect a relevant certification to command roughly a 10% average increase in salary over those performing the same duties without the credentials, according to Brian Hunter, an executive and technical recruiter at Talent Scouts Inc. He suggests that people interested in pursuing a particular certification do a cost-benefit analysis to determine the certification's return on investment.

Without a doubt, pursuing certifications requires tenacity and a willingness to put in long hours of preparation, not to mention the monetary costs, particularly if a "boot camp"-type preparation course is used. As Keller points out, "the financial and time commitments that are required to get many of these certifications are significant."

Basically, certifications by definition should certify that a professional possesses the qualities necessary to accomplish the duties of a particular position. In information security, that means having a very broad experience, knowledge and skills base.

My pursuit to become a CISSP

Information security is one of the fastest growing areas in IT today. Keller notes that "specialization in this area is going to be a solid differentiator in a market that is already very competitive." Certainly in the information security field, having the paperwork to back up the knowledge can be quite valuable. As my information security duties have increased dramatically over the past several years to the point where the majority of my professional activities are related to information security, I felt it was time to achieve that differentiator.

While there are other information security credentials available, such as the Certified Information Systems Auditor (CISA) from the Information Systems Audit and Control Association (ISACA), I chose to go after the CISSP certification because of its reputation and vendor neutrality and because my knowledge and experience matched the CISSP requirements well. In addition, the managerial components of the CISSP credential fit with my aspirations to become a chief information security officer (CISO).

To become a CISSP, a minimum of five years' work experience in two of 10 knowledge areas, referred to as domains, is necessary. We're not talking just technical areas here, as the domains include not only nuts-and-bolts topics such as networking and cryptography but also managerial and planning areas such as business continuity and disaster recovery. This is because information security is at its core a business process or, more exactly, a method to ensure that business continuity is unimpeded.

"It seems like almost every week, you hear about a Fortune 500 company or government department having a breach of sensitive information," explains Louie DiNicola, who expects to complete his undergraduate degree in computer information systems next spring and already has a position lined up working in IT assurance. "I want to be able to help companies avoid that and maximize their potential by helping them identify problems in IT policy and implementation."

DiNicola has an edge outside of the certifications with his college degree. Companies will often ignore a potential candidate, regardless of experience and qualifications, if he hasn't earned a degree. "It does not matter if the person can walk on water," according to Hunter. "If they do not have a degree, they won't be considered" for some positions.

DiNicola knows that a degree and certifications, coupled with experience, make for a powerful mix. "I realize that as an entry-level graduate, the certification might be better suited as a long-term goal," he explains. DiNicola has already begun plans to pursue the CISA credential and the CISSP certification after that.

The CISSP credential goes far beyond measuring one's book knowledge. First, the candidate must be endorsed by an ISC2-certified professional confirming that the candidate meets the experience requirements. Also, the candidate must pledge to adhere to a code of ethics. Finally, to maintain certification, the CISSP must constantly engage in security activities, such as ongoing education and participating in security speaking opportunities.

But it all starts with the exam, and there are many ways to prepare for it. For me, self study was the best way to go. You have to be disciplined and self-motivated to forgo structured courses, but self study can provide more flexibility while saving costs. Note that bypassing the class route doesn't mean that you have to go it alone. I found valuable resources online, from CISSP forums such as the one at CCCure.org to free online workshops such as those offered by the University of Fairfax.

Passing the CISSP and other certification exam tips

The following tips helped me pass the CISSP exam, the most difficult certification exam I have ever taken. As learning methods vary, so should your approach to preparing for any certification exam.

My first action was to register for the exam to allow for two months of preparation. While this may seem obvious, registering for the exam a certain period in advance helps to focus on the goal. Without a deadline, it can be difficult to achieve that goal, since the propensity to procrastinate is great.

My next step was to purchase a review book with practice questions and exams. I opted to purchase ISC2's CISSP review book, which came with a CD of practice exams. Of course, there are other study guides with practice exams available. The point is to have a good resource to prepare with. Multiple books can help especially in locking down difficult concepts by approaching them from different angles.

You should take a practice exam before beginning to study because it can point out subject-matter strengths and weaknesses. Predictably, I was strongest in the two domains for which I met the required experience and quite weak in some others. This helped me prioritize my studying.

Plan to study until one week before the exam and spend the last week reviewing material at a leisurely pace. A light review the night before the exam is fine, but do not cram. If the test is given in a location that requires significant travel, plan on arriving the night before, particularly for an early morning exam. I relaxed the night prior to the exam, because I knew I would need all my faculties the next day.

The CISSP test consists of 250 multiple-choice questions that must be completed in six hours. That equates to less than one and a half minutes per question. There are various strategies for attacking such exams; mine was to be well rested and answer every question in the exam in four hours, then review the rest of the time. If time becomes a factor toward the end of the exam, answers will be rushed, so pacing is important.
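
The pacing arithmetic behind that strategy is simple enough to check; the four-hour first pass is my own target, not an ISC2 recommendation:

    # Exam pacing: 250 questions in a six-hour window, with a four-hour first pass.
    questions = 250
    window_minutes = 6 * 60
    first_pass_minutes = 4 * 60

    print(f"Whole window: {window_minutes / questions:.2f} minutes per question")                   # ~1.44
    print(f"Four-hour first pass: {first_pass_minutes * 60 / questions:.0f} seconds per question")  # ~58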

After I left the exam, confident that I had a 50-50 chance of passing, I began crafting my retest strategy. Since I had just spent so much time over two months preparing for this exam, I planned to register to retake the exam the moment I found out I failed, because I didn't want to lose the freshness of the knowledge. Fortunately, I didn't have to activate that plan, but I was ready to.

Summary

While these tips are based on my pursuit of becoming a CISSP, they have applicability to other certifications as well.

* Match certifications with your goals and skills.
* Study, study, study. Whether that means books, classes or both, studying can't be overemphasized.
* Cramming rarely works. Rather, relax the night before, and get a good night's sleep.
* Certifications should be part of an overall success strategy, not the singular focus.
* The further up the ladder, the more important degrees become. If CISO is a goal, look into pursuing an advanced degree.
* Realize it's just an exam. Everyone has bad days, and failing is not the end of the world.
* If you do fail, plan to retest sooner than later. Don't give up the pursuit.

Finally, don't look at obtaining certifications as the primary goal but as part of an overall strategy for achieving your career aspirations. "No single accreditation will guarantee career success," summarizes Robert Half Technology's Estes. "But a mix of relevant, broad-based certifications can help support an IT professional who has experience in the field as well as a strong set of appropriate skills."

Greg Schaffer is a freelance writer based in Tennessee. He has over 15 years of experience in networking, primarily in higher education. He can be reached at newtnoise@comcast.net.


Microsoft Set To Check Windows Genuine Advantage's Anomalies

Three months after a major failure of Microsoft's anticounterfeit system fingered legitimate Windows XP and Vista users as pirates, a senior project manager has spelled out the steps his team has taken to prevent a repeat.

Alex Kochis, the senior project manager for Windows Genuine Advantage (WGA), used a company blog to outline new processes that have been put in place, including drills that test the WGA group's response to an outage like the one in late August.

"We've revamped the monitoring that is used to track what's happening within our server infrastructure so that we can identify potential problems faster, ideally before any customer gets impacted," Kochis said. "[And] since August, we have conducted more than a dozen 'fire-drills' designed to improve our ability to respond to issues affecting customers or that could impact the quality of the service."

Those drills, Kochis said, have ranged from pre-announced simulations to surprise alerts that test a specific scenario. "The team is now better prepared overall to take the right action and take it quickly," he promised.

In late August, servers operating the WGA validation system went dark for about 19 hours. Customers who tried to validate their copy of Windows -- a Microsoft requirement for both XP and Vista -- during the blackout were pegged as pirates; Vista owners found parts of the operating system had been disabled, including its Aero graphical interface.

Several days after the weekend meltdown, Microsoft blamed preproduction code for the snafu and said that a rollback to earlier versions of the server software didn't fix the problem immediately, as expected.

Microsoft downplayed the incident, claiming that fewer than 12,000 PCs had been affected. The company's support forums, however, hinted that the problem was much more widespread: one message thread collected over 450 messages within two days and was viewed by 45,000 people.

One analyst gave Kochis' status report a mixed grade.

"I was looking for two things from Microsoft, and the first was that they would acknowledge that there was a failure," said Michael Cherry, an analyst at Kirkland, Wash.-based Directions on Microsoft. "If they couldn't do that, it would show a real lack of insight into the severity of the problem. But they called it an 'outage' [here], which I don't think they had actually admitted before."

On that first point, Cherry got what he was looking for, and then some. While Kochis called the incident a "temporary service outage" in his newest post, three months ago he denied that the word applied. "It's important to clarify that this event was not an outage," he said on August 29, five days after the servers went down.

"Second," said Cherry, "I wondered if Microsoft would acknowledge that failures are going to happen, that something's going to go wrong no matter how many drills they have. And when that happens, what would they do? But I don't see anything like that here."

Kochis said the WGA team has also changed the way it updates the validation service's servers, beefed up free WGA phone support to round-the-clock coverage and sped up delivery of "get-legal" kits to users who discover they're running counterfeit software. He made no mention, however, of any modifications to the antipiracy program itself, how it's implemented or how users are handled when it determines they're using fake copies of Windows.

"They should make it so that any impact [of an outage] is on Microsoft and not on the customer," Cherry said.

Back in August, Kochis claimed that Microsoft's policy was to do just that -- err on the side of the customer -- but he contended that the outage had been an anomaly. "Our system is designed to default to genuine if the service is disrupted or unavailable," Kochis said then. "If our servers are down, your system will pass validation every time. [But] this event was not the same as an outage, because in this case the trusted source of validations itself responded incorrectly."

That's not good enough, according to Cherry. "If users can't validate, for whatever reason, Microsoft should leave them in their current state, but not invalidate them, or validate them, at least until the next check," he said.
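
The design Kochis describes, and the stricter behavior Cherry is asking for, boil down to a small decision rule. The sketch below is purely illustrative; the status names and function are hypothetical, not Microsoft's actual WGA code. It only shows the difference between failing open during an outage and deferring judgment when an answer itself is wrong.

    # Hypothetical sketch of the validation rule discussed above; names and
    # structure are illustrative, not Microsoft's actual WGA implementation.
    GENUINE, NON_GENUINE = "genuine", "non-genuine"

    def next_status(previous_status, server_response):
        """Decide a machine's validation status after a check-in.

        server_response is None when the validation service is unreachable,
        otherwise GENUINE or NON_GENUINE.
        """
        if server_response is None:
            # Fail open, as Kochis describes: an outage never penalizes the user.
            return GENUINE
        # Cherry's suggestion would go further: if the answer looks suspect,
        # keep previous_status and defer any change until the next check.
        return server_response

    print(next_status(GENUINE, None))          # outage: user stays genuine
    print(next_status(GENUINE, NON_GENUINE))   # the August case: servers answered, but wrongly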

"You have to take the utmost care before you deny something to someone that they have purchased in good faith," he concluded.

'Gdrive,' Google's Online Storage And Backup Service Out Soon

Google may release an online storage and backup service in the coming months, adding to its suite of hosted services, which already includes a variety of communication and collaboration applications, The Wall Street Journal reported Tuesday.

Rumors that Google is developing such a service, informally known as the "Gdrive", have been circulating for more than two years. It's not clear why it has taken Google so long to deliver Gdrive, considering the concept is far from new and online storage and backup services are already available from a variety of vendors.

In fact, according to the Journal, the company still isn't completely sure it will bring this service to market, and plans for it could be shelved.

Like existing services, Google's would let people store files on its servers so they can back up copies of files from their hard drives and access and share them with others from any computer via the Internet, the Journal reported.

The service would provide an undetermined amount of storage for free, with additional space available for a fee, the Journal reported, quoting anonymous sources familiar with the plans.

Google didn't immediately respond to a request for comment from IDG News Service. A Google spokeswoman contacted by the Journal declined to comment about any specific online storage plans.

Google hopes to differentiate the service from existing competing ones by making it easier to use, the Journal reported.

Google already offers hosted storage for a variety of its Web-based applications, but this service would act as a sort of umbrella storage for multiple Google applications, allowing people to do keyword searches on their files, according to the Journal.

Cisco Gives Oracle 11g A Boost

Stephen Lawson
Cisco did its part for Oracle users as the OpenWorld conference opened Monday, announcing a protocol it developed with the software company for running Oracle databases over larger server clusters.

The two vendors developed the RDS (Reliable Datagram Sockets) protocol and will make it part of an industry-developed open-source software distribution called the OpenFabrics Enterprise Distribution, said Pramod Srivatsa, a product line manager for Cisco server fabric switches. It is intended for Cisco switches using Infiniband high-speed data-center technology.

Growing data centers and demands for processing have driven the development of new forms of connectivity, such as Infiniband and 10-Gigabit Ethernet, between servers in data centers. But pure networking speed -- up to 20G bps (bits per second) in the case of Infiniband -- isn't all that's needed to make data centers run faster.

Enterprises that want to set up a very large deployment of the Oracle 11g database software once had to do it on a single large server, Srivatsa said. Oracle already offers RAC (Real Application Clusters) 11g software for distributing that deployment over multiple, smaller Intel-based servers running Linux. But that only works up to a cluster of about four servers, and RDS makes it more scalable, he said. RDS has been tested successfully with as many as 16 servers and is designed to work for clusters of as many as 64 using Infiniband, according to Srivatsa.

Infiniband is well-suited to Oracle database software because the database has to quickly exchange many messages of varying sizes, Srivatsa said. Mellanox, which supplies some of Cisco's chips for Infiniband switches, helped develop RDS. In the future, customers will probably be able to use RDS with 10-Gigabit Ethernet as well, Srivatsa said.
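
RDS itself adds reliable, in-order delivery on top of a datagram model and is exposed through its own socket interface on Linux. The Python sketch below uses plain UDP only to illustrate that model, one socket sending discrete messages to many peers with no per-connection setup, which is part of what helps a database cluster scale past a handful of nodes. The addresses are hypothetical and this is not the RDS API.

    # Conceptual illustration only: plain UDP datagrams standing in for the
    # messaging pattern RDS supports. Real RDS adds reliability and ordering
    # and is exposed through a separate socket family on Linux.
    import socket

    PEERS = [("10.0.0.2", 5001), ("10.0.0.3", 5001), ("10.0.0.4", 5001)]  # hypothetical nodes

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 5001))

    def send_to_cluster(message: bytes):
        # One socket reaches every peer with a single datagram each; there is
        # no connection handshake to repeat as the cluster grows.
        for peer in PEERS:
            sock.sendto(message, peer)

    send_to_cluster(b"cache-block-request:42")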

RDS was designed for clusters of servers in one data center, which could include blade as well as rack servers, he said. Customers of both Oracle and Cisco can request the software from the companies now and start testing it. Cisco will start providing RDS for commercial use in its Infiniband servers after it is certified by Oracle, probably next month, Srivatsa said.

On Tuesday, at the SC07 supercomputing conference in Reno, Nevada, Cisco introduced the SFS (Server Fabric Switch) 3504, designed to let enterprises connect blade servers running Oracle database software with traditional Fibre Channel storage-area networks (SANs). The switch connects to a blade or rack server using Infiniband and serves as a gateway to both an Ethernet LAN and a Fibre Channel SAN, Srivatsa said. In the case of blade servers, it helps IT departments do more with systems that typically have just one type of external connectivity, he said.

The SFS 3504 can be ordered starting later this month and is set to ship in December. The average starting list price, depending on configurations, will be US$150 per port.

Microsoft Windows XP & 2003 Server Gets New Updates

Robert McMillan
Microsoft has released its November security updates, fixing a critical Windows bug that has been exploited by online criminals.

Microsoft released just two security updates this month, but security experts say that IT staff will want to install both of them as quickly as possible. The MS07-061 update is particularly critical because the flaw it repairs has been seen in Web-based attack code, said Amol Sarwate, manager of Qualys's vulnerability research lab. "This was a zero day [flaw] that was being used in the wild by hackers," he said.

The flaw has to do with the way Windows passes data between applications, using a technology called the URI (Uniform Resource Identifier) protocol handler. This is the part of Windows that allows users to launch applications -- an e-mail or instant messaging client, for example -- by clicking on a Web link. Because Windows does not perform all of the necessary security checks, hackers found ways to sneak unauthorized commands into these Web links, and the flaw could be exploited to install unauthorized software on a victim's PC.

This type of flaw lies in both Windows and the programs launched by a Web link, and Microsoft had initially said it was up to third-party software developers to fix the issue. It later reversed that position and decided to fix the flaw in Windows as well. These URI protocol handler problems have turned up in Adobe's software, Firefox and Outlook Express.
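
The general pattern is easy to see in miniature. The sketch below is not Windows' dispatch code; the scheme table and checks are hypothetical. It only shows how a URI scheme maps to a local program and why an unvalidated argument in a crafted link is dangerous.

    # Illustrative sketch of URI protocol-handler dispatch; the handler table
    # and validation are hypothetical, not Windows' actual implementation.
    from urllib.parse import urlsplit

    HANDLERS = {
        "mailto": "mail_client.exe",   # hypothetical registered programs
        "news": "news_reader.exe",
    }

    def dispatch(uri: str):
        scheme = urlsplit(uri).scheme
        program = HANDLERS.get(scheme)
        if program is None:
            raise ValueError(f"no handler registered for {scheme!r}")
        # The 2007 attacks relied on the rest of the URI being handed to the
        # target program with too little checking, so crafted links could
        # smuggle in extra command-line arguments. A careful dispatcher
        # validates the argument first (this check is deliberately simplistic).
        if any(ch in uri for ch in ('"', '%')):
            raise ValueError("suspicious characters in URI")
        return [program, uri]   # the command line that would be launched

    print(dispatch("mailto:someone@example.com"))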

Microsoft was forced to revise its position on the URI bugs after researchers discovered that they were far more problematic than first thought, said Nathan McFeters, a security researcher with Ernst & Young, who has been studying this problem. "I think that early on it wasn't clear that this was an issue," he said via e-mail. "There's really a handful of issues with this URI use and abuse stuff."

Microsoft's patch for this problem is rated critical for Windows XP and Windows Server 2003 users, but the bug does not affect Windows 2000 or Vista, Microsoft said.

The second vulnerability, rated "important" by Microsoft, has to do with Windows DNS (Domain Name System) servers, which are used to exchange information about the location of computers on the Internet. Attackers could exploit this flaw to redirect victims to malicious Web sites without their knowledge, something known as a "man in the middle" attack. "All system administrators should look very closely at this vulnerability," Sarwate said. "I would have personally rated it as critical," he said.
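
The attack works because client software simply acts on whatever answer the resolver hands back. The fragment below is ordinary standard-library name resolution, shown only to make that trust assumption concrete; it is not exploit code.

    # Ordinary name resolution: the client trusts whatever address the DNS
    # server returns, which is why a server tricked into returning an
    # attacker-chosen answer silently redirects its users.
    import socket

    hostname = "www.example.com"
    address = socket.gethostbyname(hostname)
    print(f"{hostname} resolves to {address}")

    # Whatever connects next goes to that address without further question.
    with socket.create_connection((address, 80), timeout=5) as conn:
        conn.sendall(b"HEAD / HTTP/1.0\r\nHost: www.example.com\r\n\r\n")
        print(conn.recv(80))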

Security experts were surprised that Microsoft did not include a patch for a known vulnerability in some Macrovision antipiracy software that has been shipping with Windows for the last few years. Microsoft has said that it plans to patch the problem and that it is aware of "limited attacks" that exploit this vulnerability to get elevated privileges on a victim's machine.

The bug lies in the secdrv.sys driver built by Macrovision that ships with Windows XP, Server 2003 and Vista, but Vista is not vulnerable to the problem, according to Microsoft.

Macrovision has also published a patch for this problem.

It's a "bit worrisome" that Microsoft hasn't pushed out a patch for the bug, given that Macrovision has already made its fix available, said Andrew Storms, director of security operations with nCircle Network Security. "However, [it's] understandable that Microsoft would want to run the patch through its QA [quality assurance] and software release cycles," he added. "Given the choice between the URI bug and the Macrovision fix, enterprise security operations teams would much rather have the URI fix."

Users of Microsoft's WSUS (Windows Server Update Services) update system had been wondering if they were going to get Tuesday's patches, after a Microsoft programming error knocked WSUS administration consoles offline on Sunday and Monday. Microsoft had misnamed an entry in WSUS's database, causing the consoles to crash.

The problem was fixed on Monday, said Bobbie Harder, a Microsoft senior program manager, in a blog posting. But WSUS servers that synchronized with Microsoft between 5 p.m. Sunday and 11 a.m. Monday Pacific Time will need to resynchronize to avoid the problem.

Though she had heard of one user who had to manually update his WSUS server, Tuesday's updates went off without a hitch, said Susan Bradley, a WSUS user who is chief technology officer with Tamiyasu, Smith, Horn and Braun, Accountancy.

Microsoft Releases Three Updates For Vista

John Fontana
Microsoft has released three non-security updates to Windows Vista that relate to battery life, sleep/hibernation issues and glitches in the Media Center version.

The updates are part of an ongoing release of fixes for Vista that will eventually be incorporated into Service Pack 1, which is set to ship in the first quarter next year.

A public beta of the service pack is due by the end of the year.

Microsoft has been using occasional updates to tune Vista since it shipped a year ago. The company has been promoting the ongoing updates to Vista as an alternative to one large service pack.

In October, the company released speed and reliability updates for Vista. In August it issued a pair of updates to address reliability and performance in the operating system.

The first update for this latest release addresses system compatibility, reliability and stability, including the extension of battery life for mobile devices and the stability of Windows PowerShell and wireless network services.

The update also improves the stability of portable and desktop computers that use an uninterruptible power supply (UPS), and the reliability of Vista when users open the menu of a startup application. It also shortens the startup time of Windows Vista by using a better timing structure, as well as the recovery time after Vista experiences a period of inactivity and the recovery time when users try to exit the Photos screen saver.

The second update addresses a number of USB core components, including problems when the system tries to enter or resume from sleep or hibernation mode. The upgrade is a collection of 21 previously released fixes for USB issues. It also addresses problems with USB devices that may no longer work correctly after Vista resumes from sleep or hibernation, problems that may occur with USB-connected microphones, problems enabling and re-enabling USB composite devices, and problems with the hardware removal and "eject" command when used with an Apple iPod.

The third update focuses on Windows Media Center and issues with its extensibility platform. It also fixes issues with the interaction between Media Center and the Xbox 360 when the gaming console is used as a Media Center Extender.

Microsoft plans to make the updates available via Windows Update beginning November 13, which is the same date for the release of the company's monthly security patch updates.

The three updates will also be included with Vista Service Pack 1, which also will include an update to the Windows kernel to align it with the kernel in Windows Server 2008.

The service pack is expected to ship at the same time as the server, which is slated for the first quarter of 2008. Microsoft officials hope to ship the server on or before the February 27, 2008, launch event in Los Angeles.

Microsoft Unveils Its Stand-Alone Virtualization Server

John Fontana
Microsoft Monday tweaked its virtualization strategy by unveiling a stand-alone virtualization server that won't require users to run the Windows Server 2008 operating system.

The announcement came at the company's annual TechEd IT Forum conference in Barcelona, Spain, where Microsoft also outlined pricing, packaging and licensing for Windows Server 2008 and the availability of management tools that address needs of virtualized environments.

Microsoft's virtualization announcement, however, is just a placeholder since the technology likely won't be available until August 2008. Microsoft's Hyper-V technology, formerly code-named Viridian and Windows Server Virtualization, will ship no more than 180 days following the release of Windows Server 2008, which is now slated between January 1 and March 31, 2008.

Microsoft's stand-alone hypervisor technology is called Hyper-V Server. It is hypervisor virtualization technology that is installed on the "bare metal" of a hardware platform without the need for a Windows operating system.

In fact, the Hyper-V Server could be the only piece of Microsoft technology running on the hardware, given that Hyper-V supports virtual machines running operating systems other than Windows, including Linux.

Microsoft rival VMware currently ships an enterprise-focused virtualization product, ESX, that also installs on bare metal.

Microsoft has been marketing virtualization as a feature of the operating system, but critics say the company is bending to the reality that OEMs will likely include a hypervisor virtualization layer as part of their hardware.

Dell, Fujitsu Siemens Computers, Fujitsu, Hitachi, HP, IBM, Lenovo, NEC and Unisys have all signed up to include Microsoft's Hyper-V server on their platforms.

Microsoft, however, also plans to sell Hyper-V directly to corporate users who could wipe a server clean and install Hyper-V Server, which is priced at US$28 and allows an unlimited number of virtual machines on a single box.

"Microsoft had clearly been very much in the hypervisor-virtualization-is-a-feature-of-the-operating-system camp," says Gordon Haff, an analyst with Illuminata. "I don't think Microsoft would phrase it this way, but clearly this is a step back from you can only get virtualization in the OS."

For its part, Microsoft says Hyper-V Server recognizes the fact that all hardware in essence will be a virtualization appliance.

"What we are trying to do enable customers to live in world where they treat all compute resources -- such as CPU cycles, storage, networking -- as a single blob while providing a consistent way of maximizing effectiveness and utilization while reducing costs for IT and making things more automated for IT," says Andy Lees, corporate vice president in Microsoft's server and tools marketing and solutions group. "And virtualization is the key piece of technology to enable that."

Haff says Microsoft's strategy shift isn't a negative, just a realization of where the technology seems to be headed.

"I think the general direction is going to be that the base hypervisor virtualization is going to be feature of the server rather than the [operating system]," he says " People like Dell and HP are going to embedded a hypervisor in the server, and in my view, it is not a big jump from there to say that in the not too distant future virtualization is just something that comes with the server like BIOS."

In addition to VMware, others offer hypervisor technology that can install on bare metal, including XenSource, which was recently bought by Citrix. Novell and Red Hat are also offering hypervisor technology with their operating systems.

Microsoft has existing partnership deals with both Novell and XenSource around virtualization integration.

But rival VMware says Microsoft is sending a mixed message.

"Their product architecture is that virtualization is part of the [operating system] so they seem to be rethinking what hypervisor should be," says Raghu Raghuram, vice president of products and solutions for VMWare. "They are going to be coming out in almost one year with a basic-function hypervisor where today we have a robust hypervisor and 20,000 customers." And Raghuram adds WMWare comes with benefits such as availability and various management tools.

Haff says that speaks to what is truly interesting about virtualization: the tools needed to run and maintain a virtualized environment, especially around management, and the fact that virtualized environments force IT to think about other parts of the network, including storage, VLANs, load balancing, SSL acceleration and firewalls.

"It's not about server virtualization," Forrester analyst Frank Gillett told Network World in August, "It's about when I have virtual servers I can completely change how I think about IT infrastructure. When I move virtual servers around I have to have storage that is not only networked but flexible so when I move the virtual server the storage connections go with it."

To that end, Microsoft Monday announced the availability of three of its System Center tools, including Virtual Machine Manager to manage virtualized servers.

The other tools are System Center Configuration Manager 2007, for client and server deployment and update, and System Center Data Protection Manager 2007, for backup and data recovery.

EU To Critically Review Google's DoubleClick Deal


The European Commission will open a four-month, in-depth review of Google's plans to buy rival DoubleClick for US$3.1 ($4.18) billion, a source familiar with the situation said yesterday.

Google, which stores data on the internet-surfing habits of consumers, wants to buy DoubleClick to increase its clout in tailoring advertisements to consumer activities.

Both companies are involved in the sale of on-line ads, although their business models differ.

Google has already proposed alterations, and the deadline was extended to Nov. 13 so the changes could be vetted by customers and competitors.

Google competition counsel Julia Holtz has said that in response to third-party concerns the company had committed to the Commission that it would keep certain DoubleClick business practices unchanged.

An in-depth probe will last an additional 90 working days and does not necessarily mean that more changes to the transaction will be required.

Critics have also raised questions about what effect the deal might have on privacy, but the Commission has said privacy by itself is not part of a competition review.

Google has by far the strongest position in Web searching in Europe. The acquisition has drawn vehement opposition from competitors such as Microsoft Corp and Yahoo Inc.

The European Commission is working closely with the US Federal Trade Commission, which has been reviewing the case since May.

In the United States, there has been one congressional hearing on the deal, and Republicans are pressing for more.

Google's purchase is part of a rapid consolidation in the internet ad industry that includes Microsoft's US$6 ($8.10) billion acquisition of aQuantive Inc, home to the largest interactive ad agency.

Yahoo bought BlueLithium for US$300 million and Time Warner Inc's AOL unit bought Tacoda.

Both of the acquired companies use cookie technology to record web surfing habits of consumers so advertisers can target ads based on the information.
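
Cookie-based tracking of this kind follows a simple pattern: the ad server assigns a browser an identifier the first time one of its ads loads, then reads it back on every other site carrying its ads. The sketch below is a generic illustration of that mechanism; the names are made up and it is not any of these companies' actual code.

    # Generic illustration of cookie-based ad tracking; identifiers and
    # function names are hypothetical, not any vendor's real code.
    from http.cookies import SimpleCookie
    import uuid

    def handle_ad_request(cookie_header: str, referring_site: str):
        jar = SimpleCookie(cookie_header)
        if "visitor_id" not in jar:
            jar["visitor_id"] = uuid.uuid4().hex                 # first sighting of this browser
            jar["visitor_id"]["max-age"] = str(60 * 60 * 24 * 365)
        visitor = jar["visitor_id"].value
        # Logging (visitor, referring_site) pairs over time builds the surfing
        # profile that ad targeting is based on.
        print(f"{visitor} seen on {referring_site}")
        return jar.output(header="Set-Cookie:")

    handle_ad_request("", "news-site.example")                    # new browser gets an ID
    handle_ad_request("visitor_id=abc123", "shopping.example")    # recognized on another site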

CBS, BSkyB, BBC... To Provide Free Video Content To Bebo



A host of media groups including US television network CBS, BSkyB and the BBC will provide free video content to the 40 million users on Bebo and keep any advertising revenues under a deal with the social networking site.

Bebo said on Tuesday its new "Open Media" system would allow users to store their favourite video and music content within their profile pages, and share it with others on the network.

The company expects the thousands of hours of entertainment content to make the site more attractive to users and therefore increase the value of its own advertising.

The media companies involved will use their own video players to distribute content and will retain all of the advertising-related revenues. They will not be charged to distribute their content.

"Every media company is looking for better ways to deliver their content online," Bebo President Joanna Shields said.

"By opening our platform to media owners, who gain free access to our community while retaining control over their brand, their content and their revenues, we are creating valuable new inventory for advertisers and a new business model for the entire media industry."

Social networking sites such as Bebo, Facebook and MySpace are hugely popular with younger Web users - an important yet hard-to-reach category for advertisers - but media groups have struggled to convert their soaring popularity into revenue.

Bebo says its average user is between 13 and 24 years old and spends 40 minutes on the site each day they log on.

Facebook recently announced a new advertising system that will let companies introduce ads into the user pages of its 50 million members, and launch dedicated pages on the site for their brands.

Hulu, a free, advertising-supported online video service formed by NBC Universal and News Corp, launched at the end of last month.

Two New Updates From Microsoft Expected Tuesday

Gregg Keizer

Microsoft has scheduled just two security updates for Tuesday to fix flaws in Windows 2000, XP and Server 2003. One of the two is a leftover from October that was bumped at the last minute.

Only one of the bulletins will be rated "critical," Microsoft's highest ranking, while the other will be labeled "important," the next-lower rating. As usual, Microsoft disclosed a limited amount of information about the upcoming updates in a prepatch notification posted to the company's Web site today.

The critical update affects Windows XP and Windows Server 2003, said Microsoft, which classified the vulnerability as a remote code execution bug. What the bulletin will fix, however, is a matter of speculation.

"It could be the Macrovision vulnerability," said Andrew Storms, director of security operations at nCircle, referring to the digital rights management software bundled with Windows that has already been targeted by in-the-wild attacks. "Macrovision has already got a fix, so Microsoft wouldn't have had to do any coding."

Storms noted, however, that Microsoft would have to stretch its usual definition of "remote code execution" to make the Macrovision vulnerability fit the update, since both companies have been calling it a privilege elevation flaw, and thus less serious. "Microsoft sometimes seems to go back and forth about privilege elevation," Storms said. "They might just say, 'sure it's an elevation, but it could also lead to remote code execution.' Or we may just see a reversal here of the bug's severity."

On the other hand, the critical bulletin may be aiming at something completely different. "It could be the URI protocol handler bug," Storms said.

Less than two weeks ago, Microsoft accepted responsibility for fixing a widespread flaw in how Windows deals with the Uniform Resource Identifier (URI) protocol handlers, which let browsers run other programs via commands in a URL. At the time, Bill Sisk, a member of Microsoft's security response team, said that the group was "working around the clock" on a patch. The company would not commit to a release date, however, or say whether it would make the next update rollout, now just a day away.

The debate over who was responsible for patching the problem with the URI protocol handler raged over the summer, when Microsoft denied that its software was at fault, and third-party application vendors, including Mozilla and Adobe Systems, pointed fingers at the company even as they patched their own products.

Tuesday's second update, which targets Windows 2000 and Windows Server 2003 but not XP, appears to be the one that was yanked before October's bulletins hit the Internet. The only hint Microsoft gave of its composition was the "Spoofing" label, which in the past has usually been used to describe vulnerabilities in Internet Explorer that phishers and identity thieves exploit to deceive users.

"I have no idea what this one is about," Storms said.

He was, however, sure of one thing: the light patching load users and administrators faced this month. "It's a 'where's the beef?' kind of month," he said. "Maybe we can all catch up a bit."

A Look At Microsoft's Anti-Piracy MAR Program

Eric Lai

Microsoft introduced late last week a new pilot program to encourage refurbishers to install legitimate copies of Windows XP on used PCs.

The new Microsoft Authorized Refurbisher (MAR) program offers a discount off the retail price of Windows XP, along with deployment tools to help refurbishers reinstall Windows and all of the relevant drivers on renewed PCs in as little as 15 minutes, said Hani Shakeel, senior product manager of the genuine Windows product marketing team.

When MAR is fully expanded, it will also help stem what Microsoft acknowledges as widespread flouting of Microsoft's XP licensing rules by price-pressured refurbishers.

"There's a range of behavior. Definitely, what you're describing is happening," Shakeel said.

Observers say MAR also attempts to ameliorate another risk: that refurbishers, frustrated by the high cost and difficulty of following Microsoft's arcane Windows licenses to the letter, will simply install a free Linux operating system on renewed PCs instead.

Some resellers "are saying, 'We're just going to ship this stuff out with Ubuntu Linux,'" said Adam Braunstein, an analyst with the Robert Frances Group.

Braunstein estimates that for now, no more than one in ten refurbished PCs goes back out for sale sporting Linux rather than Windows. But Microsoft is worried.

"There are pieces of the armor that are pretty rapidly deteriorating," he said.

And the first two contestants are...

Two large refurbishers have been initially selected to participate in MAR: Redemtech and TechTurn.

Microsoft has long encouraged the donation and reuse of older PCs, albeit in a limited way. Its Community MAR program lets PC recyclers obtain cheap copies of Windows to install on used computers.

But the catch was that those licenses were only available for PCs destined for use by charities, schools and other non-profit groups. As a result, only 200,000 refurbished PCs worldwide last year benefited from the Community MAR program, according to Microsoft.

Meanwhile, up to 28 million refurbished PCs will be sold this year, making up 10% of the global PC market, according to Microsoft's Shakeel. Nearly all are destined for consumers and smaller companies.

Many of those PCs shipped will be violating some aspect of Microsoft's complicated End User License Agreement (EULA) for Windows. For instance, most refurbishers will assume that they can reinstall Windows onto a recycled PC using the license number on the original Certificate of Authenticity (COA) that shipped with it. In fact, Microsoft requires that refurbishers also have the original Windows installation CD.

That's a "near-impossible" requirement, says Braunstein. "You're probably lucky if half of the PCs [at a large company] still have the COA after three years," he said. As for the installation CD, "one of the first things a company does when they get a new PC is throw away the installation disc."

Confirmed Jake Player, president of TechTurn: "The majority of machines we get don't come with the original Windows CD."

A mixed message

Moreover, refurbishers struggle with even lower margins than conventional PC makers. At TechTurn's site, many several-year-old Pentium 4 desktops with Windows XP Professional command less than US$200.

"It's very difficult to make any money in this market," Braunstein said. "Microsoft has only been standing in the way of folks."

Unofficially, Microsoft has not strictly enforced its EULA with smaller refurbishers, choosing to tolerate this de facto piracy because the alternative -- refurbishers installing Linux instead -- is far worse in Redmond's eyes.

Microsoft's attitude "is 'Please don't pirate software. But if you do it, make [sure] it's ours,'" Braunstein said.

But large refurbishers such as TechTurn, which expects to sell 800,000 refurbished PCs this year, have been forced to comply with Microsoft's rules and ship most of their PCs without any OS on them, says Player, which is why he has been "begging" Microsoft for relief.

Besides giving TechTurn an undisclosed discount on certified copies of Windows XP, Microsoft will supply the company with deployment software that will help detect and load needed drivers onto a large number of PCs in a matter of minutes.

"It helps customers know what they are buying," he said. "We are anticipating a sales uptick of 20%."

Relief and respite on the way

MAR comes just in time, too, from Player's point-of-view. He predicts a huge wave of older PCs will start hitting the market within a year or two, as companies upgrade to Vista and dump their existing, underpowered PCs.

Braunstein applauds MAR: "Microsoft is getting a little smarter," he said. But he thinks that until Microsoft expands the program, especially with smaller refurbishers, the company's likely to "continue to do things as is."

Shakeel said MAR will be "expanded as quickly as we can," first to North American refurbishers selling more than 5,000 systems a month, and then to other parts of the world.

A different program is in the works to encourage smaller refurbishers to get legal, he said. In the meantime, Shakeel confirmed that those firms won't have to worry about sudden anti-piracy enforcement coming from Redmond.

"There's nothing in the immediate horizon that should be a worry to the market," he said.

Intel's Long-Awaited Power-Efficient 'Penryn Processors' Launched

Agam Shah

Intel has launched its long-awaited new line of power-efficient microprocessors, code-named Penryn, designed to deliver better graphics and application performance as well as virtualisation capabilities.

The chip-maker teamed up with 40 original equipment manufacturers to deliver Penryn-based Xeon and Core 2 processors. Vendors including HP and Lenovo have already announced business desktops with Penryn-based quad-core Xeon 5400 processors, with more server announcements scheduled to come soon.

The processors, manufactured using a 45-nanometre process, feature smaller transistors and cut down on electricity leakage, which makes them faster and more power efficient than earlier 65-nm processors, director for Intel's digital enterprise group operations, Stephen Smith, said. The most power-hungry Penryn-based systems will consume no more than 120 watts (W).

Penryn-based notebooks that were due in the first quarter of 2008 would use 25W, Smith said. Today's 65-nm notebooks consumed 35W. While cutting down on power usage, Penryn processors jumped to higher clock rates and featured cache and design improvements that improved performance compared with earlier 65-nm processors, he said.

The processors deliver a 40-60 per cent improvement in video and imaging performance, Smith said. New instructions on the processor sped up photo manipulation and encoding of high-definition video.

Intel's Penryn processor for gaming systems, the 45nm Intel Core 2 Extreme QX9650 quad-core processor, took advantage of the instructions and included a larger cache to deliver better graphics and video performance, Smith said.

Hardware enhancements allowed virtual machines to load up to 75 per cent faster, Smith said.

The Penryn launch signalled a new era in the way Intel manufactures chips, he said. The processors were the first to use high-k metal-gate transistors, which make the processors faster and less leaky compared with earlier processors that have silicon gates. The processor was lead free, and by the second half of 2008, Intel would produce chips that were halogen free, making them more environmentally friendly, Smith said.

Intel will ship 12 new quad-core Intel Xeon 5400 server chips in November with clock speeds ranging from 2GHz to 3.20GHz, with a 12MB cache. In December, it will ship three dual-core Xeon 5200 server chips with clock speeds of up to 3.40GHz and a 6MB cache. Intel would deliver the 45nm Penryn processors in multiple phases, Smith said.

In the first quarter of 2008, Intel will release the 45nm Core 2 Quad processors and Core 2 Duo processors for desktops. In the same quarter, it will launch the Core 2 Extreme and Core 2 Duo processors for notebooks. Intel plans to release 45-nm processors for ultramobile PCs in 2008, though Smith couldn't provide an exact release date.

Penryn was a significant follow-up to the 65-nm Core 2 processor launched last year, principal analyst at Mercury Research, Dean McCarron, said.

A lot of business workstation users and gamers were interested in the improved media and system performance Penryn processors deliver, he said.

While the Penryn provides a small performance boost, it's not a major change in architecture, an analyst at Insight 64, Nathan Brookwood, said. Rather than upgrading to Penryn systems, customers might wait for Nehalem, the next big overhaul in Intel's chip architecture that was scheduled for release in 2008, Brookwood said.

At Intel Developer Forum in San Francisco in September, Intel CEO, Paul Otellini, demonstrated Nehalem, and said it would deliver better performance-per-watt and better system performance through its QuickPath Interconnect system architecture. Nehalem chips will also include an integrated memory controller and improved communication links between system components.

However, people who need to buy hardware now would invest in Penryn systems, Brookwood said.

"It's not a massive upgrade cycle on notebooks and desktops," he said.

Pricing of the 45-nm Intel Xeon processors ranges from $US177 to $US1279 in quantities of 1000, depending on the model, speeds, features and number ordered.

The 45-nm Intel Core 2 Extreme QX9650 quad-core processor is $999 in quantities of 1000.
