You Work Where?

FRAMINGHAM (07/24/2000) - You all love networking - technically speaking - but the draw for some of you is greater than that. Here are the stories of network professionals who find inspiration in their unusual IT jobs.

To infinity and beyond!

With wobbly legs and a queasy stomach, you've just exited CyberSpace Mountain, the most exhilarating, terrifying roller coaster you've ever ridden. Never before have you careened around such death-defying twists, shot straight down so fast or rocketed around so many loops. And yet, there's not a physical track in sight.

CyberSpace Mountain is the crème de la crème of immersive, virtual attractions at DisneyQuest Chicago, a 90,000-square-foot, five-story indoor theme park at a busy downtown corner. Software designers at Walt Disney Imagineering (WDI) created this extreme build-your-own coaster and other attractions, such as rides in which visitors dodge hungry dinosaurs as they raft down a prehistoric river or battle for buccaneer gold in the new Pirates of the Caribbean ride.

While waiting for your nerves to settle, you marvel at this virtual tour de force. "How does Disney create such magic?" you wonder. "And how do I get a job here?"

Expertise in running a remote-site retail operation was the starting point for Priscilla Carney, IS director at Disney Regional Entertainment (DRE), the Burbank, Calif., Disney unit running DisneyQuest. She became part of the magic about three years ago, when DRE hired her to implement the back-end systems and network infrastructure for its first indoor amusement park.

Carney's challenge has been just as difficult as the one Woody, of "Toy Story" fame, faced in convincing Buzz Lightyear that he was not a real spaceman but a toy. She joined just six months before the prototype DisneyQuest opened in Orlando.

"I had three months to spec out and choose the systems, and three months to configure, install and test them," Carney says. "It's been an adventure."

Bumps along the way included changes in operational plans, discoveries during build-out and extra hours worked - all to do the right thing as opposed to the immediate and easy thing. "I don't think most IS people are ever exposed to the challenges that come with starting a business built around a new business concept," says Carney, who prides herself on her flexibility.

Most can't claim to support such snazzy applications, either. CyberSpace Mountain, for example, combines leading-edge, real-time computer image generation, 3-D virtual reality and sophisticated motion simulation. "The total integration and use of all the different technologies is just amazing," Carney says.

To ride CyberSpace Mountain, a guest designs a coaster at a workstation using a WDI-developed program. Once the design is complete, the file is transferred across a virtual LAN at 100M bit/sec to the workstation running the ride itself. The guest hops in a motion simulator, and the custom program controls the ride. The simulator is the type typically used by the aerospace industry, capable of full 360-degree rotation.
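In network terms, that handoff is just a small file moved between two hosts on the same VLAN. Here is a minimal sketch of such a transfer in Python; the hostname, port, message framing and file format are all invented for illustration, not details of WDI's actual software.

```python
import json
import socket

# Hypothetical sketch of the create-station -> ride-station handoff
# described above; host, port and payload format are all assumptions.
RIDE_STATION = ("ride-station.local", 9000)

def send_coaster_design(track_segments):
    """Ship a finished coaster design to the ride workstation."""
    design = json.dumps({"segments": track_segments}).encode()
    with socket.create_connection(RIDE_STATION) as conn:
        # Length-prefix the payload so the receiving program knows the
        # full design has arrived before it arms the motion simulator.
        conn.sendall(len(design).to_bytes(4, "big") + design)

# e.g. send_coaster_design(["climb", "loop", "corkscrew", "drop"])
```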

All attractions are networked as autonomous VLANs, which span multiple switches for load balancing and ease of maintenance. Likewise, a single switch often supports more than one VLAN. If one switch fails, the attraction keeps running on the others.

The network infrastructure is essentially the same at the three DisneyQuests, in Chicago, Orlando and Philadelphia. At each, Carney has installed approximately 15 VLANs and 10 Ethernet switches. She uses 100M bit/sec links for the bulky file transfers between create and play stations found in the Create Zone, one of four sections in the park. The other zones - Explore, Score and Replay - need only 10M bit/sec connections because those attractions don't require behind-the-scenes file transfers.
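One way to picture the redundancy this buys: model each attraction's VLAN as the set of switches it spans, then check that no single switch failure isolates an attraction. The following toy model, with invented switch and VLAN names, captures the idea.

```python
# Toy model of the VLAN layout described above: each attraction's VLAN
# spans several switches, so one switch failure never takes down a ride.
vlan_to_switches = {
    "cyberspace-mountain": {"sw1", "sw2"},
    "pirates":             {"sw2", "sw3"},
    "create-zone":         {"sw1", "sw3"},
}

def survives_single_failure(vlans):
    """True if every VLAN keeps a live switch after any one switch fails."""
    all_switches = set().union(*vlans.values())
    return all(
        members - {failed}          # some switch still carries the VLAN
        for failed in all_switches
        for members in vlans.values()
    )

print(survives_single_failure(vlan_to_switches))  # True for this layout
```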

The switches are monitored remotely by the Disney network group in Orlando. If a switch issues an alert, a network technician dials into it. If the problem can't be fixed remotely, an on-site IS analyst is notified or a vendor technician dispatched, Carney says.
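A hedged sketch of what such remote monitoring might look like in Python - a polling loop that flags unreachable switches for escalation. The addresses, ports and alerting path here are assumptions, not Disney's actual tooling.

```python
import socket
import time

# Invented management addresses for the sake of the example.
SWITCHES = {"dq-chi-sw1": "10.0.1.1", "dq-chi-sw2": "10.0.1.2"}

def reachable(host, port=23, timeout=3):
    """Crude health check: can we open the switch's management port?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

while True:
    for name, addr in SWITCHES.items():
        if not reachable(addr):
            # Real process: a technician dials in; failing that, an
            # on-site analyst or vendor tech is dispatched.
            print(f"ALERT: {name} ({addr}) unreachable -- escalate")
    time.sleep(60)
```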

On the WAN side, Carney relies on Disney's frame relay network. She uses it primarily to transfer back-end data such as daily sales, e-mail and software upgrades for the back-office systems or attractions.

As cool a place as DisneyQuest is, it's obviously not all fun and games for the IT people. Even a tiny outage could make the magic disappear and disappoint guests. "We get guests that only come once a year, maybe even just once a lifetime. At each attraction, we only get them for 3 to 5 minutes, yet we've got to entertain them and make them feel like they've gotten their money's worth," says Joe Garlington, executive director for interactive show development at WDI.

From a network perspective, that means Carney relies on a "solid switch vendor" (that she won't name) with a proven track record and willingness to deal with her stringent specifications. "You won't find a more thorough scope of work document for installations," she says.

Scalability is a huge criterion, Carney says: "We need to be able to respond to future needs - DisneyQuest lives and breathes. Already, two attractions have been added and more are in development. The infrastructure needs to be in place to serve the unknown."

- Beth Schultz

Packets by the sea

The sea has always fascinated people. To us creatures of the land, the sea is a fierce enemy, an inviting playground, a haven for the mythological, a source of food, a thing of beauty, our worst fear. Above all, it is the keeper of the earth's deepest secrets.

From the shores of the Pacific Ocean in La Jolla, Calif., some 1,200 people labor to unlock its mysteries. At the Scripps Institution of Oceanography (SIO), a graduate school at University of California San Diego (UCSD) and a world-renowned research center, students study biology, chemistry, climatology, geophysics, geosciences and physics.

All of which generates boatloads of data.

The care and safety of that data is the responsibility of Mick Laver, manager of SIO's network and administrative systems. In a small, ocean-side metal building, where spare computer parts fight with boogie boards for floor space, Laver holds an enviable job. He presides over a state-of-the-art network; gets virtually all the funds he needs for build-out because top management recognizes the network's importance; pioneers really cool projects; and surfs during his lunch hour.

"This sounds corny, but this is my dream job - second to being a rock star. As a boy, I knew I wanted to work at Scripps," says Laver, a tall man with a quick smile, dry wit and the relaxed, easygoing manner common to California beach folk. Rather than the customary sport coat and tie on a hanger in the office, behind Laver's chair hangs a wet suit.

Laver's been at SIO for more than two decades. For the first half of his SIO career, he was a deep-sea marine biologist, taking month-long turns at sea. "I've been down in Alvin, the sub that found the Titanic, about seven times. The deepest was 4,000 meters in the Panama basin," he says.

With the advent of microprocessors, Laver became fascinated by computing, returning to school for a master's in educational technology and then, in the late 1980s, taking a two-year stint as a microcomputer consultant for UCSD's computer center. As the network became the computer, Laver's job focused increasingly on SIO's network design.

His experience as a researcher still guides how he improves SIO's data infrastructure. For instance, a few months ago, he wired SIO's docking pier at Point Loma harbor with fiber and connected it to the SIO campus 16 miles away via a T-1 line. Now shipboard crews get the bandwidth they need, at least while their vessels are docked. To provide network connectivity at the pier, Laver ran fiber from a Cisco router in a nearby building through the machine shop (a warehouse-like building near the pier) out to concrete structures on the pier called bunkers. These bunkers house connections to land-line phones, electricity and, now, the network. An SIO ship that pulls up to shore can patch into the fiber and access applications, e-mail and the Web.

Ships collaborating with SIO can plug into a repeater with Cat-5 wiring, 10Base-T or fiber. This gives them access to the Web via the SIO network.

The SIO campus is equally challenging. Proximity to the sea means the innards of computing equipment corrode quickly, so the network team stocks up on cheap parts and replaces them often.

Likewise, the institution is forever tearing down or renovating old buildings and erecting new ones. This year, SIO doubled its annual network budget to $400,000 to accommodate upgrades. Since 1992, the entire physical infrastructure has been overhauled, with archaic overhead wire ripped out and fiber run underground to all 80 buildings on the sprawling campus. With that, Laver moved the network to an ATM core.

Today, the heart of the network is a Cisco Systems Inc. Catalyst 5500 with a route-switch module. From the 5500, an OC-3 ATM link connects SIO to UCSD's main campus, about a mile away. Another ATM link, this one OC-12, connects the 5500 directly to the San Diego supercomputer center, also located at the UCSD campus, and from the supercomputer center to the Internet.

On the SIO campus, the network supports 10 research groups, plus the Stephen Birch Aquarium Museum and the U.S. Marine Life Fisheries Service. It also supports the Institution of Geophysics and Planetary Physics (IGPP), a University of California-wide research group based at SIO. IGPP owns a supercomputer at the San Diego supercomputer center, and Laver's network transports the heavy computational traffic it generates.

Because research groups are divided among buildings, many buildings have their own switches, all of which link to the main 5500. For these buildings, SIO uses 10 Catalyst 5500-family switches and about 12 Catalyst 2900- or 2800-family edge switches. Bandwidth will never be a constraint, Laver vows.

Every building 5500 connects to the main 5500 via an OC-3 ATM uplink, with one exception: IGPP's 5500s are connected via Gigabit Ethernet. The edge switches in the smaller buildings are connected via OC-3 ATM or Fast Ethernet. The 5500s in the buildings also sport 10/100 Ethernet desktop connections. In all, this network supports more than 1,500 desktops, divided in thirds among various Unix, Windows and MacOS flavors.
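For a rough sense of what those link tiers mean in practice, here is some back-of-the-envelope arithmetic in Python using nominal line rates (actual payload rates run somewhat lower); the pairings of link to role simply restate the topology described above.

```python
# Nominal line rates for the link tiers in SIO's topology, in Mbps.
LINKS_MBPS = {
    "Gigabit Ethernet (IGPP uplink)":    1000,
    "OC-12 (to supercomputer center)":    622,
    "OC-3 (building uplinks)":            155,
    "Fast Ethernet (desktops)":           100,
    "10Base-T":                            10,
    "T-1 (Point Loma pier)":            1.544,
}

for link, mbps in sorted(LINKS_MBPS.items(), key=lambda kv: -kv[1]):
    seconds = 1_000 * 8 / mbps  # time to move a 1G-byte dataset, for scale
    print(f"{link:34s} {mbps:>8} Mbps  ~{seconds:6.0f}s per GB")
```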

"The Internet changes everything. I now estimate what I'll need and then double it. More copper, more fiber, higher-density switches," Laver says.

The technical aptitude of his users, coupled with the script-kiddie culture of university life, makes security his biggest worry. Many researchers and technical users are apt to slap up a Linux, Solaris or Windows NT box without realizing that any security gaffes they've made could put the entire network at risk.

"These guys don't have the time, interest or expertise to be system administrators," Laver says. "Security is the area that suffers most."

But if the people of Scripps cause him his biggest technical headache, they are also the reason Laver stays put. SIO is recognized as its own village within the UC system, one whose citizens throw Friday night beer bashes and are among the brightest scientists in the world. "We are a very close-knit group," Laver says.

- Julie Bort

A life-and-death job

Tragedy strikes during a weekend outing, and suddenly one family in Washington is dealing with death. Turning this misfortune into goodness, the parents decide to donate their child's organs. With this act, they offer life to a few desperately ill souls.

Because it's possible to transplant as many as eight organs from a single donor, organs from the child's body are distributed to transplant patients across the country. The kidneys, which can survive the longest outside the body, are the last organs to be matched. The left kidney goes to a woman in Texas and the right kidney to someone in New Jersey.

Being a vital link in this circle of life keeps Berkeley Keck enthralled with the job he's held for nine years. Keck is IT director for the United Network of Organ Sharing (UNOS), a private nonprofit scientific and educational organization in Richmond, Va.

"The nature of this work is just so interesting," says the soft-spoken but tough Keck.

For nearly 15 years, UNOS has carried the weighty responsibility of quick and equitable organ distribution. It has overseen the U.S. Department of Health and Human Services' Organ Procurement and Transplantation Network (OPTN), the national transplant system through which organ matches are made, since 1986. The following year, it added responsibility for the U.S. Scientific Registry of Transplant Recipients, a database of all organ transplants.

Keck boasts that gathering and analyzing organ-related data has recently become a lot easier. Last October, UNOS launched an extranet called UNet. It is the crowning glory of a network migration Keck has overseen since he came to UNOS nearly a decade ago. Slowly but steadily, project after project, the ever-persistent IT director moved UNOS out of the minicomputer era and into the Internet era - and closer toward his vision of an open, reliable and infinitely more user-friendly system.

UNet replaces an outmoded, non-Y2K-compliant minicomputer/dumb terminal setup that people accessed via sometimes-flaky dial-up X.25 links. For the Registry, a Web application replaces Lotus Notes replication; before that, printed and faxed forms were the default, making it difficult for members to provide more than the minimum information required.

Now, any of the 5,000 UNOS members can access organ transplant information over the Internet with a PC and browser. For example, transplant surgeons can access UNet from home, even late at night. The site is password-protected at various levels so users can only get the information they need to do their jobs.
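A minimal sketch of that kind of tiered access in Python: each member's role maps to a set of permitted actions, and every request is checked against it. The roles and permissions shown are invented for illustration; the article doesn't describe UNOS's actual access scheme in that detail.

```python
# Hypothetical roles and permissions, purely for illustration.
PERMISSIONS = {
    "transplant-surgeon": {"view_waitlist", "run_match", "view_patient"},
    "data-entry":         {"view_patient", "edit_patient"},
    "policy-analyst":     {"view_aggregate_reports"},
}

def authorize(role, action):
    """Allow the action only if the member's role grants it."""
    return action in PERMISSIONS.get(role, set())

assert authorize("transplant-surgeon", "run_match")
assert not authorize("data-entry", "run_match")
```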

For OPTN, UNet makes it much easier for users to enter data, so they provide more information than ever before. In turn, policy-makers can use that information to better determine organ placement processes.

UNet also lets users initiate match runs with a click. Processing time varies by organ type, but a recipient list for intestines can be generated in as little as 13 seconds. A kidney match, the longest that can be run, takes 5 minutes at most, Keck says.
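To make the idea of a "match run" concrete, here is a deliberately toy version in Python: filter candidates for compatibility, then rank the survivors. Every criterion and weight below is invented; real UNOS allocation policy is far more involved and organ-specific.

```python
def match_run(donor_blood_type, candidates):
    """Toy match run: keep compatible candidates, then rank them."""
    compatible = [c for c in candidates if c["blood"] == donor_blood_type]
    # Hypothetical rule: sicker patients first, then longest-waiting.
    return sorted(compatible,
                  key=lambda c: (-c["urgency"], -c["days_waiting"]))

candidates = [
    {"name": "A", "blood": "O", "urgency": 2, "days_waiting": 400},
    {"name": "B", "blood": "O", "urgency": 3, "days_waiting": 150},
    {"name": "C", "blood": "A", "urgency": 3, "days_waiting": 900},
]
print([c["name"] for c in match_run("O", candidates)])  # ['B', 'A']
```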

The Microsoft Corp. SQL Server 7.0 database used for the match information runs on two clustered Digital Equipment Corp. 7310 servers with quad 600-MHz Alpha processors. Easier data entry has already quadrupled the database from its size at the October launch, says Joe Wysowski, assistant director of systems technology at UNOS and one of Keck's right-hand men. Data storage has quickly grown to 80G bytes, he says.

UNet Web applications reside on a cluster of three load-balanced Compaq Computer Corp. 5500 servers with dual 400-MHz Pentium II Xeon processors running Microsoft's Internet Information Server 4.0 and Windows 2000.
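In miniature, a load-balanced cluster like that simply rotates incoming requests across the healthy servers. A trivial round-robin sketch follows, with invented hostnames; the real site would balance at the network level, not in application code.

```python
import itertools

# Invented hostnames standing in for the three application servers.
WEB_SERVERS = ["unet-web1", "unet-web2", "unet-web3"]
next_server = itertools.cycle(WEB_SERVERS).__next__

for request_id in range(7):
    # Each request goes to the next server in rotation.
    print(f"request {request_id} -> {next_server()}")
```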

In all, UNOS uses 41 servers for external and internal operations, ranging from Alphas to desktop machines. They're networked on a 100M bit/sec Ethernet LAN.

Everything on the network, including the T-1 Internet connections, is fully redundant and locked down. Keck has even commissioned a new local exchange carrier, Cavalier Telephone, to bring a SONET ring into UNOS headquarters to eliminate the net's one remaining single point of failure.

Member reaction to UNet has made the arduous two-year project worth every effort, Keck says with pride. "Users keep calling and saying how much faster and more efficient this is than the old system. That's what this job is all about for us," he says, beaming.

The payoff is tangible. All one has to do is stroll the UNOS hallways to witness it firsthand. Framed pictures of organ recipients and thank-you letters line the walls of this otherwise nondescript office suite.

Keck says one of his favorites is a long, narrow print of babies who have received new hearts at one transplant center.

Of course, Keck didn't bring UNOS onto the Web on his own. Counting programmers, systems engineers, help desk technicians, training specialists and organ center staffers, he oversees roughly 95 people. Keck says UNet has kept his folks challenged and happy: "UNOS' mission makes people feel like they're doing some good."

Keck says he feels that way himself. Doing good is what keeps him smiling at policy-makers and squeezing vendors to give him more for less.

As gratifying as the job may be, it does carry a grim reality. The number of potential recipients is staggering - as of late April, more than 69,000 patients were waiting for new organs nationally.

With life hanging in the balance, Keck and his team vow with tenacity to provide a glitch-free operation. "We absolutely can't have downtime. If we can't run matches, people don't get organs," Keck says.

- Beth Schultz
