Sun Microsystems To Provide Support For OpenOffice

Sun Microsystems on Monday plans to announce that it will provide support for the OpenOffice.org productivity software suite, citing a wave of momentum behind the open-source project.

The support, which starts at $US20 per user per year, will be offered to companies that distribute OpenOffice.org, not directly to end-users, according to senior director of marketing for StarOffice/OpenOffice.org and Network.com, Mark Herring. "For a lot of distributors, they wanted to distribute OpenOffice.org and had no option for back-line support," he said.

OpenOffice.org and StarOffice, Sun's accompanying commercial product, are compatible with Microsoft Office and identical in terms of capabilities, which include word processing, spreadsheets and presentation software. But until now, Sun only supported StarOffice.

Another difference will remain -- Sun does not plan to provide indemnification against lawsuits for OpenOffice.org, as it does for StarOffice, Herring said.

Sun's move comes as OpenOffice.org is being downloaded 1 million times per week, with total downloads to date standing at about 110 million, Herring said.

Out of that number, Sun estimates that "tens of millions" of people are actively using the software, according to Herring. The most recent version is 2.3. Version 2.4 is expected in March and will contain significant new features, according to the [openoffice.org] website.

"Microsoft Office is still the dominant tool out there -- only a fool would deny that," he said. "But [OpenOffice.org] has had a huge amount of momentum."

Sun believes the average OpenOffice.org user skews younger, and that download activity in Europe and the US has been greater than in Asian countries, he added.

Developers can create extensions to the core OpenOffice.org suite. Sun has made a new one for shaving down the size of presentation files, Herring said. The wizard-like tool goes through a file and asks users whether they want to keep or compress the various elements, he said.
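An element-by-element audit like the wizard's is possible because OpenDocument presentations are ordinary ZIP archives. The Python sketch below is an illustration of the idea only, not Sun's actual extension; it assumes embedded media live under the archive's Pictures/ directory, as ODF presentations store them, and uses an invented size threshold.

```python
import zipfile

def audit_presentation(odp_path, threshold=100_000):
    """List embedded media in an ODF presentation (an .odp file is a
    ZIP archive) and flag the large items a user might choose to
    compress, mimicking the wizard's keep-or-compress prompt.
    The 100KB threshold is invented for illustration."""
    report = []
    with zipfile.ZipFile(odp_path) as odp:
        for info in odp.infolist():
            if info.filename.startswith("Pictures/"):  # embedded media
                verdict = "compress?" if info.file_size > threshold else "keep"
                report.append((info.filename, info.file_size, verdict))
    return report
```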

Sun plans to provide support for any extensions it creates, according to Herring. As for ones made by third parties, "we would have to work with them on that code on a case-by-case basis," he said.

Sun is also releasing StarOffice 8 Server. Herring described it as a conversion engine that changes 40 document types into PDF files. The server, which costs $11,000, is aimed at enterprises with large stores of legacy documents that aren't archived with an open standard, according to Herring.
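StarOffice 8 Server's conversion engine is proprietary, but the general shape of batch document-to-PDF conversion can be sketched with a headless office process. The flags below follow the later LibreOffice command line, shown only as an assumption about how such a pipeline might look, not StarOffice 8 Server's actual interface; the injectable `run` parameter is a testing convenience.

```python
import subprocess
from pathlib import Path

def convert_to_pdf(doc, outdir, run=subprocess.run):
    """Convert one document to PDF via a headless office process.
    The CLI flags here are the later LibreOffice ones, used for
    illustration; StarOffice 8 Server exposes its own interface."""
    cmd = ["soffice", "--headless", "--convert-to", "pdf",
           "--outdir", str(outdir), str(doc)]
    run(cmd, check=True)  # spawns the converter; raises on failure
    return Path(outdir) / (Path(doc).stem + ".pdf")
```

Looping such a call over a directory tree of legacy files is the batch-archiving scenario the product targets.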


Wireless LAN Signals & Xbox 360 Signals Clashing

Microsoft's popular Xbox 360 game console can create a strong and strange signal on wireless LANs, according to IT staff at Morrisville State College.

It's not clear whether the signal disrupts the college's WLAN access points or students' wireless notebooks. There is some anecdotal evidence, however, that it at least affects other radios in the same 2.4GHz band.

Morrisville IT staff typically use Bluetooth headsets, which run in the 2.4GHz band, with their mobile phones when they troubleshoot problems on the spacious campus. "We had problems syncing our headsets to our phone where this signal was strong," says Matt Barber, the college's network administrator. A phone user had to physically touch the headset to the cell phone to make the initial connection, he says.

There may be effects on the WLAN that the equipment itself, from Meru Networks, is circumventing, according to Barber. Part of Meru's WLAN architecture employs software that gives the access points more control over wireless-client transmission behavior than does the software of some of Meru's rivals. An access point near a radiating Xbox may be compensating for interference by in effect guiding a wireless laptop to send and receive when open spectrum is available, essentially dodging around the Xbox signal.

Working with Meru, the small IT staff plans to test soon the effect of multiple Xbox consoles in a dorm with a large number of active notebook clients.

Network World has asked Microsoft to comment on the Xbox signal phenomenon, but the company was not able to reply before this story was posted. We'll update this report as soon as Microsoft provides information.

The latest version of the Xbox, the Xbox 360 Elite, went on sale earlier this year with a 120G-byte hard disk and a high-definition video interface.

Morrisville is a small college in rural New York state, taking its name from a nearby town. In summer 2007, the college deployed a campuswide 802.11a/b/g WLAN based on equipment from Meru. The plan was to replace those access points with Meru's new, two-radio devices that added support for Draft 2 of 802.11n, the IEEE standard that boosts throughput from 22M to 25Mbps to at least 150M to 180Mbps. That replacement was just completed, creating the first large-scale 802.11n deployment.

During the fall, Morrisville IT staff, working with Meru engineers and IBM, the network integrator, detected an unusual signal in the 2.4GHz band. "We wanted to look at the [radio frequency] environment in our dorms," Barber says. "We always thought we'd run into some strange stuff [there] in the 2.4 range."

The signal was discovered using Cognio Spectrum Expert, from Cognio (recently bought by Cisco). Spectrum Expert is RF-analysis software packaged with a WLAN adapter card that slots into any laptop PC. Among other capabilities, Spectrum Expert identifies sources of radio energy in the 2.4GHz and 5GHz WLAN bands and pinpoints the cause, such as a particular brand of access point or a microwave oven.

"The signal really stood out," Barber says. "In some places it was so strong we thought it might be affecting the air [that is, the radio environment] around it."

The Cognio software, however, was baffled by this new signal: "Unknown emitter" was the classification. The signal shows up in the Cognio display as a kind of green-blizzard effect, covering a large swath of the 2.4 band, Barber says. That means the signal "is jumping all over the spectrum band," he says. In contrast, a nearby Meru access point shows up in the same scan as a strong, stable yellow-red glow, almost like a sun. The green blizzard is shot through with red dashes, which show, Barber says, that the signal at moments nearly rivals the access point in strength.
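The "green blizzard" versus "stable glow" distinction amounts to asking whether the strongest energy stays on one channel from sweep to sweep. A toy classifier makes the point; the channel lists and the 2.0-channel spread threshold are invented for illustration, not drawn from the Cognio tool.

```python
from statistics import pstdev

def classify_emitter(peak_channels):
    """Given the 2.4GHz channel carrying the strongest energy in each
    successive spectrum sweep, guess the emitter type: a frequency
    hopper's peak jumps across the band, while an access point stays
    parked on one channel. The 2.0-channel spread threshold is an
    invented illustration."""
    return "hopper" if pstdev(peak_channels) > 2.0 else "fixed-channel"

classify_emitter([1, 9, 4, 11, 6, 2, 13])  # jumps around the band
classify_emitter([6, 6, 6, 6, 6, 6, 6])    # parked on channel 6
```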

The mystery signal baffled the IT staff and Meru until Barber had a brainstorm: He brought in his own Xbox 360, plugged it in, and turned on the Cognio spectrum analyzer. Presto: The same signal appeared.

Barber says the signal seems to be created by the console's embedded 2.4GHz radio, which is used to communicate with the handheld wireless controller -- the gizmo with the buttons that manipulate a game running on the console. The Xbox also takes an optional Wi-Fi adapter, in the form of a USB dongle, to connect to a WLAN access point.

Barber says his "best guess" at this point is that the embedded radio, not the USB adapter, causes the signal. The signal is created even if the Xbox console is shut off: Just plugging its AC adapter into an electrical outlet seems to trigger the radio to look for -- and keep looking for -- a companion wireless controller. "It's even worse when you have multiple Xboxes in an area," Barber says.

At one point, IT staff wrapped the console in a static discharge bag, the material used, for example, to wrap and protect consumer electronics gear from static damage during shipment. The same properties make it act like a radio "blanket" that muffles a transmission. Sure enough, the Cognio software showed a significant drop in the Xbox signal's strength.
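A drop in signal strength like the one the software showed is conventionally expressed in decibels. The milliwatt figures in the example are invented for illustration, not measurements from the story.

```python
from math import log10

def attenuation_db(power_before_mw, power_after_mw):
    """Attenuation in decibels: 10 * log10(before / after).
    Every 10x drop in received power is another 10 dB."""
    return 10 * log10(power_before_mw / power_after_mw)

attenuation_db(1.0, 0.01)  # a 100x power drop is 20 dB
```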

The next step is more systematic testing. "We want to get several consoles together with a bunch of WLAN clients, to create a busy [RF] environment, and do some measurements," Barber says. "Are we seeing frames being dropped in the air, or people getting disconnected?"

Answering that question may be a bit more urgent, with Christmas looming, and the likelihood of still more brand-new Xboxes and other wireless entertainment products turning up in January when students return.


German-Speaking Hackers' Trojan Defrauding Banks In US, UK, Spain And Italy

Gregg Keizer
A German-speaking hacker crew is looting commercial bank accounts in four countries using a custom-built Trojan put in place by expertly crafted and extremely focused phishing attacks, a security researcher said Thursday.

The malware's most distinguishing feature, said Don Jackson, a senior security researcher at SecureWorks, is its ability to mimic the steps the human account owner would take to move money.

A variant of the Prg Banking malware, the new Trojan has stolen hundreds of thousands of dollars from accounts at some of the biggest banks in the US, the UK, Spain and Italy, said Jackson. "This is not widespread, but it is very dangerous. They've already stolen more than US$200,000 from the accounts we've monitored, but this has really flown under the radar."

Jackson also said he has found at least four servers that contain Prg configuration files and bogus versions of legitimate banking sites, as well as caches of data harvested by the Trojan.

The cleverness and technical know-how of the attackers was almost breathtaking. "If you were on the bank side of this connection [with the Trojan], it would appear to be a person on the other end running the account," Jackson said. "It would seem as if someone was clicking the keys on the virtual keyboard and sending wire transfers."

According to Jackson, the hackers -- who speak German, though they may not reside in Germany proper -- mined the vast amount of data collected previously by a less powerful generic version of Prg for evidence of commercial banking accounts, including specific URLs of offshore banks or indications of wire transfers.

The crew targeted commercial accounts, said Jackson, both because those accounts typically contain bigger balances and because they usually have the built-in ability to conduct wire transfers. Once they break into a business account, the hackers can quickly plunder it by using wire transfers to move its monies to hacker-controlled accounts.

With victim accounts picked, the hackers then create what Jackson called "very convincing" phishing e-mails and send them to the account owners, who have been identified using data stolen earlier. "They'll usually have the bank account number, and the first and last name of its owner," said Jackson, as well as security details, such as whether the account is protected by a one-time password. "The e-mail will claim that the user needs to download a new one-time password or soft token, but when the user clicks on the link and reaches the phish site, the Prg Trojan is downloaded instead."

From there, the highly automated account thief takes over. The malware alerts the hacker when the account owner is actually online with his bank, "piggybacking" on the session to silently steal the username and password without actually duping the user into entering it. Then using its ability to simulate keystrokes, the Trojan walks through all the steps a human being would take to, for instance, wire funds to another account. An account can be emptied in seconds.

"That's a very clever part of the Trojan," said Jackson. "How it downloads JavaScript from the command-and-control server so it looks like the [account owner] is accessing the account, not a bot." While less-sophisticated malware heads straight to a money transfer page without first appearing to "visit" the pages a real person would view before reaching the transfer page, Prg visits the bank's pages in order, as a person would. Because most antifraud software looks for automated, nonhuman behavior, Prg won't trigger a fraud alert.
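That antifraud check -- did the session reach the transfer page by the route a human would take? -- can be sketched in a few lines. The page names below are invented, and real antifraud systems weigh many more signals than navigation order.

```python
# Invented page names for a typical human route to a wire transfer.
EXPECTED_PATH = ["login", "accounts", "account_detail",
                 "transfer_setup", "transfer_confirm"]

def looks_automated(session):
    """Minimal sketch of the heuristic the article describes: flag any
    session that reaches the transfer-confirmation page without first
    walking the pages a human would view. Prg evades exactly this
    check by visiting the pages in order, as a person would."""
    if "transfer_confirm" not in session:
        return False  # no transfer attempted
    prefix = session[:session.index("transfer_confirm")]
    return prefix != EXPECTED_PATH[:-1]  # wrong route to the transfer page
```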

Each bank site has had customized code written for it, Jackson added, to make updating the Trojan-controlled PCs easier. If the hackers need to change the destination account -- because it's been spotted and frozen by local law enforcement, say -- a new one can be fed to the Trojans from the server.

"Fewer than 20 banks have been hit by this so far," said Jackson, "but they include some of the biggest banks in the US, UK, Spain and Italy."

He came close to praising the criminals. "To me, the automation of this is very, very crafty."

The surest defense against the Prg Trojan, Jackson concluded, is to be suspicious of any e-mail received from a bank. "Even if you recognize the sender, you should confirm that the sender sent that message before clicking on any links."


Dell Notebooks Come With Integrated Mobile Broadband

Rodney Gedda

Dell has announced the addition of BigPond, using Telstra's Next G mobile network, as an option for integrated mobile broadband in a number of its notebooks.

From today, 13 notebook models will offer built-in access to BigPond wireless broadband, with plans starting at $34.95 per month on a 12-month plan.

Dell previously only offered built-in support for Vodafone mobile broadband.

Dell client computing strategist Jeff Morris said the move is in response to direct customer feedback and the company has worked with BigPond to make the wireless broadband experience "simple, easy to buy and to use".

BigPond group managing director Justin Milne said customers simply need to "fire up your new notebook, run the connection manager, pick a plan, and you're online".

Notebooks that support mobile broadband appear throughout Dell's Latitude, Precision, Inspiron, Vostro, and XPS ranges.


RecoverGuard Provides Error-Proof Disaster Recovery Plan

Mario Apicella


In IT, change is the only constant, as hardware and software are updated almost continuously. Companies that take business continuity seriously protect themselves by creating a recovery site to run vital business processes during an emergency.

Needless to say, keeping the recovery site current is essential to business continuity, but given the constant flux of hardware and software updates, the outcome of that effort is often uncertain.

And this uncertainty is compounded by the fact that changes to the IT infrastructure are often automated, whereas replicating those updates to the DR (disaster recovery) site remains a manual, error-prone activity.

An overlooked change could cripple your business in the event of a disaster. Think, for example, how damaging it would be if an important database was moved to a different volume to improve performance but that change was never replicated at the recovery site.

Is there a better way other than zealous attention to details to keep a DR plan effective? According to startup Continuity Software, its recently announced RecoverGuard 2.0 is the answer.

Think of RecoverGuard as a watchdog that can automatically compare the details of two IT infrastructures, then find and report their differences. Not only does RecoverGuard continuously monitor the two sites, but it also automatically creates a problem ticket when discrepancies arise.
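At its core, that comparison is a structured diff of two site inventories. A heavily simplified sketch follows, with invented object and field names (RecoverGuard's real discovery model is far richer); it echoes the moved-database scenario, where a volume change at the primary site was never replicated.

```python
def find_discrepancies(primary, recovery):
    """Diff two site inventories (plain dicts of object -> settings)
    and emit one 'ticket' string per mismatch, in the spirit of
    RecoverGuard's automatic problem tickets."""
    tickets = []
    for obj, cfg in primary.items():
        if obj not in recovery:
            tickets.append(f"{obj}: missing at recovery site")
        elif recovery[obj] != cfg:
            tickets.append(f"{obj}: settings differ ({cfg} vs {recovery[obj]})")
    return tickets

# Invented example: crm_db was moved to vol2 at the primary site,
# but the recovery site still points at vol1.
primary  = {"crm_db": {"volume": "vol2"}, "mail": {"volume": "vol1"}}
recovery = {"crm_db": {"volume": "vol1"}, "mail": {"volume": "vol1"}}
find_discrepancies(primary, recovery)  # flags the moved crm_db volume
```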

It's interesting to note that RecoverGuard has a bottom-up, data-comes-first discovery process that initially identifies the storage objects of a site, then seeks out the hosts that own them.

During discovery, RecoverGuard builds an accurate topology map of the datacenter that admins can use to better understand and solve problem tickets.

According to Continuity Software, RecoverGuard 2.0 brings some interesting improvements over previous versions, including a more efficient and faster discovery process, and a Dashboard that empowers nontechies to manage this critical business activity.

How intrusive is RecoverGuard? Not very, according to the vendor. In fact, it sits on a dedicated Windows machine and doesn't require you to install agents on your servers. Understandably, you'll have to provide the software with ample authentication credentials, just as you give your security guards keys to open every door in the building.

I liked just about everything I heard and saw during my briefing and demonstration with Continuity Software, including its assessment challenge -- a sort of gauntlet thrown at your current DR procedure.

It goes like this: Continuity Software volunteers to perform a risk assessment that won't cost you anything if no damaging difference is found between your primary and recovery sites.

What happens if a significant inconsistency is found? Well, then, you pay US$15,000 for the assessment, plus a yearly license fee of US$2,000 per server. Are you confident enough to take that challenge?


Solid-State Drives In The Market Soon

John Brandon

For laptop owners, flash-memory drives boost battery life and performance while making notebooks lighter and more bearable for frequent business travelers. In the data center, benefits include higher reliability than their magnetic counterparts, lower cooling requirements and better performance for applications that require random access such as e-mail servers.

So far, the biggest barriers to adopting solid-state drives (SSD) in the data center have been price and capacity. Hard disk drives (HDD) are much less expensive and hold much more information. For example, a server-based HDD costs just US$1 to US$2 per gigabyte, while SSD costs from US$15 to US$90 per gigabyte, according to IDC.

Capacities are just as disparate. Samsung's SSD holds only 64GB, although the company plans to release a 128GB version next year. Meanwhile, Hitachi America makes a 1TB HDD that's energy efficient and priced at US$399 for mass deployment in servers.

Enterprise Strategy Group analyst Mark D. Peters explains that solid-state technology has been on the radar for years, but has not been a "slam-dunk" in terms of price and performance for corporate managers. That's about to change, he says, because the IOPS (input/output operations per second) benefits of SSDs are too impressive to ignore. Among SSD's advantages: it has no moving parts, lasts longer, runs faster and is more energy efficient than an HDD.

And prices are falling fast. Right now, the industry trend is a 40% to 50% drop in SSD pricing per year, according to Samsung.

The arrival of hybrid drives such as Samsung's ReadyDrives -- which use both SSD and HDD technology -- and SSD-only servers "suggests the time for SSD as a genuine -- and growing -- viable option is getting closer," says Peters. He was referring to the recent IBM announcement about BladeCenter servers that use an SSD.

"Price erosion, coupled with increased capacity points, will make SSDs an increasingly attractive alternative to HDDs" in data centers, agrees Jeff Janukowicz, an analyst at IDC.

Two examples of how SSDs solve persistent throughput problems in high-performance computing show how the technology may make new inroads in corporations in 2008, some industry watchers believe.

Solid-state at the Stanford Linear Accelerator Center

At this research center, SSD is being used for some of the most data-intensive work going on today. The Stanford Linear Accelerator Center (SLAC) uses particle accelerators to study questions, including where antimatter went in the early universe and what role neurexin and neuroligin proteins play in autism.

The amount of data is immense -- in the petabytes -- and the lab uses a cluster of 5,000 processor cores. Despite that, the discrete chunks of data that are requested and analyzed by several hundred researchers are highly granular -- usually just 100 to 3,000 bytes of information. At the same time, scientists tend to perform thousands of data requests, accessing a few million chunks of data per second.

Richard Mount, SLAC's director of computing, explains that the response time for these researchers' data requests is limited not by the number of processors or by the amount of network bandwidth, but rather by disk access time. "Flash memory is over a thousand times faster than disk drive technology," says Mount. "Hard disks are limited to around 2,000 sparse or random accesses per second. When accessing thousand-byte chunks, this means that a disk can use only 1/50th of a gigabit-per-second network link and less than 1/100,000th of a typical computer center network switch capacity."
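Mount's fractions check out as straightforward arithmetic, using the round figures he gives:

```python
# A disk doing ~2,000 random reads/sec on ~1,000-byte chunks:
accesses_per_sec = 2_000
chunk_bytes = 1_000
disk_mbit = accesses_per_sec * chunk_bytes * 8 / 1e6  # 16.0 Mbit/sec

# ...which is under 2% of a 1Gbit/sec link, i.e. roughly the
# "1/50th" figure Mount cites.
fraction_of_gig_link = disk_mbit / 1_000
```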

This limitation has translated into the need to make what the lab calls "skim data sets" -- preassembled collections of related data that at least one researcher has already requested. "There is no waiting for skim data sets that already exist, but if somebody wants one that does not already exist, then they normally have to wait for a skim production cycle that takes place once every four to six months," Mount says.

To help researchers receive data in a more ad hoc manner, flash storage may be just the thing. "We have no religious attachment to flash, but we can construct flash-based storage at a reasonable cost and around 25ms latency, and we are doing so."

SLAC has developed its own SSD-based system that is in the final debugging stages, Mount explains. "The first version of this will provide about 2TB of storage, but we can easily grow this to 5 or 10TB just by buying flash chips," though he reckons the scalability will require "more serious expenditure." At the 2TB level, it will serve as a test and development system only.

Eventually, the goal is to use SSD technology as a cache for all particle accelerator research, which will allow scientists to access data at any time from any data store. "SSDs help the entire system run more efficiently by ensuring the I/O capability is in balance with the rest of the application system," adds IDC's Janukowicz. "The characteristics of flash-based SSDs make them a well-suited alternative for high-IOPS applications that are read intensive. SSDs have no rotational latency and have high random-read performance. Thus, with SSDs the time to access the data is consistent and very small regardless of where on the device the data is held."

Considering SSD at the Pacific Northwest National Laboratory

At the Pacific Northwest National Laboratory (PNNL) in Washington, solid-state technology could help alleviate a supercomputer bottleneck. At the lab, researchers run tests that sustain a write speed of 80Gbit/sec. and a read speed of 136Gbit/sec. Yet, one or two slow hard disk drives running at one-quarter the speed of the other disks can cause performance to degrade quickly.

"Solid-state devices such as flash drives can use a RAID striping technique to achieve high streaming bandwidth -- just like [hard] disk drives -- while also maintaining very low latency for random access," says Robert Farber, a senior researcher at PNNL. "This is a very exciting combination."

The lab has not moved to solid-state technology yet. But Farber says the real debate is whether low-latency access for "seek-limited applications" -- in other words, many requests for small amounts of data -- can alleviate the pressure of computing bandwidth. It is not solely a price-per-gigabyte debate. "It remains to be seen how much of a price premium consumers will tolerate before robustness, power, storage capacity and physical space differences cause a mass departure from magnetic media," Farber says.

At the PNNL, the bandwidth goal for its last supercomputer was 25Mbit/sec. per gigaflop of peak floating-point performance, mostly to handle the data-intensive NWChem scientific software calculations the lab runs. The lab's new environmental molecular sciences facility contains a new supercomputer with a theoretical peak floating-point performance of 163 teraflops. And, as at the Stanford lab, disk speed is a critical part of the equation, so solid-state is the forerunner in solving the bottleneck.

One breakthrough Farber expects in the not-too-distant future: Operating systems will change their memory hierarchy to directly access SSD, turning the technology into a hard drive replacement for mass storage.

Complementary, not replacement tech for most users

One question that remains: When will SSD really impact the corporate world? Some say SSD in the data center is just on the horizon, since laptops such as the Dell XPS M1330 use a Samsung 64GB SSD. Alienware also offers a 64GB option in some of its desktop computers. And SSD is applicable across the commercial landscape; while researchers need the speed to study proteins, retailers may need or want faster POS transactions.

One company to watch in this space: Violin Memory. The company's Terabyte-Scale Memory Appliance provides over 1Gbit/sec. throughput for both sequential and random access. SLAC's Mount says he tested a DRAM-based prototype appliance from Violin, and that its upcoming flash-based system "seems a good match for our applications."

A Violin spokesman explains that the two key bottlenecks in corporate computing are network speeds and IOPS for storage systems. Today, disks run at about 100Mbit/sec. for sequential operations, but only 1Mbit/sec. for random 4k blocks, he says.
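The spokesman's figures, spelled out as arithmetic, show how badly small random reads hurt a disk: at 1Mbit/sec. of random 4KB blocks, the drive completes only a few dozen operations per second.

```python
# One 4KB block, in bits:
block_bits = 4 * 1024 * 8

# 1Mbit/sec of random 4KB reads works out to ~30 operations/sec:
random_iops = 1e6 / block_bits

# ...while 100Mbit/sec sequential streaming is a 100x higher data rate:
sequential_advantage = 100e6 / 1e6
```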

"In some cases, there are minimal capacity requirements which are well suited for SSDs," Janukowicz adds. "Also, in high-performance applications, the IOPS metrics can favor SSDs over HDDs." However, even with all those benefits, he says that "IDC does not see SSDs completely replacing HDDs in servers. SSDs do offer performance advantages and are a 'green' solution. However, there are many applications that require the capacity provided by HDDs."

Enterprise Strategy Group's Peters says that throughput requirements will lead to a gradual shift away from hard disk drives to solid-state technology, but it will take time in the corporate world. "Moving wholeheartedly from one technology to another is a rare thing within data centers," he says.

John Brandon worked in IT management for 10 years before starting a full-time writing career. He can be reached at jbrandonbb@gmail.com.


MIT Finally Completes Its OpenCourseWare Project

John Cox

MIT this week announced an important digital achievement: the completion of its pioneering OpenCourseWare project. And everyone involved seems quite happy with being unsure about why exactly it's important.

The achievement is digitizing all the classroom materials for all of MIT's 1,800 academic courses, putting them online, and inviting anyone and everyone to do whatever they want with that information. It's called the OCW project, and it's spawning a global movement to make what had been jealously guarded education resources accessible to educators and learners everywhere.

You can find the outline of a course in fundamentals in data networking, with a syllabus and lecture notes. There's a PowerPoint presentation from 2006 on "Trends in RFID Sensing".

Proposed in 2000 by a faculty committee, announced in 2001, and launched in 2002, OCW has received US$29 million in funding, US$5 million from MIT, the rest from foundations and contributors. One key backer, the William and Flora Hewlett Foundation, has decided on investing another US$100 million over five years in various open education projects largely because of its experience with OCW, according to Marshall Smith, director of the foundation's education program.

MIT has taken a step in doing something more with OCW. As part of Wednesday's celebration on the MIT campus in Massachusetts, MIT President Susan Hockfield announced a new portal for OCW, one designed specifically for high school teachers and students. Dubbed "Highlights for High School," the portal's home page selectively targets MIT's introductory science, engineering, technology and math courses, with lecture notes, reading lists, exams and other classroom information. The OCW resources, including video-taped labs, simulations, assignments and other hands-on material, have been categorized to match up with the requirements of high school Advanced Placement studies.

It's that "letting them do whatever they want" part that creates the uncertainty about why OCW is important. The data on usage are impressive. In the five years since the launch of OCW, with a 50-course pilot site, an estimated 35 million individuals have logged in. About 15% are educators, 30% are students, and the rest are what MIT calls "self learners" with no identifiable education affiliation, says Steve Carson, OCW's external relations director.

The recently formed OpenCourseWare Consortium has 160 member institutions, creating and sharing their own sites, on the MIT model. Something like 5,000 non-MIT courses are now available globally, some but not all using material from the OCW Web site.

Yet, one of the most striking statistics is from a completely unexpected source: iTunes, Apple's Web site for music and videos. MIT President Hockfield said she was told in September by her daughter to check out the iTunes list of most-popular videos. To her astonishment, Hockfield found two OCW videos in the top-10 listing. "No. 3 was 'classical mechanics,'" she said. "No. 7 was 'differential equations.' Go figure."

"This expresses, to me, the hunger in this world for learning, and for good learning materials," she told her audience.

A distinguished group of speakers and panelists at the MIT event all agreed that OCW represents...well, something.

"We're unlocking a treasure trove of materials," said Steve Lerman, MIT's dean for graduate students, and chairman of the OCW Faculty Advisory Committee.

OCW's resources will factor large in plans by the government of India to create a massive expansion of educational resources, according to Sam Pitroda, chairman of the government's Knowledge Commission, which is charged with making specific recommendations on how to spend the new US$65 billion the government will invest in education over the next five years. The nation has over a half-billion people younger than 25, Pitroda says. Just one of a series of almost unimaginable goals is to increase the number of universities from 350 today to 1,500 in five years, he said.

Pitroda said the scale of such goals requires questioning basic assumptions about what education is and how it is accomplished. "We don't have enough resources to train teachers and build an entire [traditional] infrastructure to support them," he said. Hence, the commission's interest in open projects like OCW, which hold the promise of a massive transfer not only of knowledge but of teaching approaches and learning structures that can be adapted to local requirements and cultures.

"Given this expansion, OCW plays a key role in these emerging experiments" in education, Pitroda said.

Former Xerox Chief Scientist John Seely Brown, sounding what for him is a recurring theme, said Web technologies in education are creating a new generation of tinkerers, who tinker with content online rather than nuts and bolts. This is the domain of mashups, of combining existing content from various sources and media to create new, often more complex creations, often in the context of a community of peers who share a common passion.

"Maybe the next stage for OCW is shifting from [a focus on] content to actions on or with the content," he said. "We have the ability to bring back tinkering, which is the basis of our intuition. We get our intuitions from playing around with stuff."

The panelists' musings prompted further musings from the audience.

Someone wondered if the new technologies both inspiring and enabling OCW and other projects have rewired the brains of the next generation, so that entirely novel ways of teaching and learning are now needed. Another asked whether, if these technologies are democratizing learning, that doesn't call into question the classic idea of the university as a "certifier," through its degree programs, that a student has acquired a certain degree of knowledge. Still another wondered how OCW could be augmented by faculty from around the world while maintaining some criteria of excellence.

These and many other questions will have to be addressed as part of a developing global conversation about the "meta university," suggested Charles Vest, MIT's former president and an early and enthusiastic backer of OCW. This concept is an attempt to blend what Vest described as the "deeply human activities" of teaching and learning, with advances in information technology that are making possible new tools for those activities: vast digital archives, open digital publications such as the Public Library of Science, projects like the Sakai open source learning management system, and projects like MIT's iLabs, which lets students around the world use the Internet to access automated lab equipment, run automated experiments, and analyze and share data.

"The emotion I feel right now is humility," said Hal Abelson, professor of computer science and engineering at MIT, and founding director of Creative Commons, a non-profit that offers free tools for content creators to mark their online creative work with the freedoms and permissions they want the work to carry. "What OCW has led us to see is that we're in something like 'Education 1.0.' What comes next? We're imagining the future."


Cisco Confirms Its VoIP Phones Can Spy On Remote Calls

Linda Leung

Cisco confirmed it is possible to eavesdrop on remote conversations using Cisco VoIP phones. In its security response, Cisco says: "an attacker with valid Extension Mobility authentication credentials could cause a Cisco Unified IP Phone configured to use the Extension Mobility feature to transmit or receive a Real-Time Transport Protocol (RTP) audio stream."

Cisco adds that Extension Mobility authentication credentials are not tied to individual IP phones and that "any Extension Mobility account configured on an IP phone's Cisco Unified Communications Manager/CallManager (CUCM) server can be used to perform an eavesdropping attack."

The technique was described by Telindus researcher Joffrey Czarny at HACK.LU 2007 in Luxembourg in October.

Cisco has published some workarounds to this problem in its security response.

Also in October, two security experts at hacker conference ToorCon9 in San Diego hacked into their hotel's corporate network using a Cisco VoIP phone.

The hackers, John Kindervag and Jason Ostrom said they were able to access the hotel's financial and corporate network and recorded other phone calls, according to a blog on Wired.com.

The hackers used a penetration-testing tool called VoIP Hopper, which mimics the Cisco Discovery Protocol packets the phone sends at three-minute intervals and then creates a new Ethernet interface, placing the PC -- which the hackers swapped in for the hotel phone -- on the network segment carrying the VoIP traffic, according to the blog post.

The Avaya configuration is superior to Cisco's, according to the hackers, because an attacker must send requests rather than simply sniff traffic, although it can be breached the same way, by replacing the phone with a PC.
