TORONTO (02/16/2000) - In a memorable scene from the movie Butch Cassidy and the Sundance Kid, those fabled outlaws, played by Paul Newman and Robert Redford, are chased by a posse to the edge of a high cliff above a river.
Realizing that their only means of escape is to jump, Sundance mutters despairingly, "I can't swim." Replies Butch, "Why, you crazy -- the fall will probably kill ya!"
When Jim Andrew became CIO of the new "megacity" of Toronto, he might well have felt like he had just put Sundance's boots on. And who would have blamed him for viewing the crushing task of getting IT through the amalgamation process with a fatalistic eye? Some might even have been tempted to console him with a Butch-like, "Why, you crazy -- year 2000 will probably kill ya!"
It is difficult to contemplate a more daunting scenario for a new CIO. Imagine having to ride herd on all of the following: creating a new unified organizational structure for the IT operations of six large cities (Toronto, York, North York, East York, Etobicoke and Scarborough) and one regional government (Metro); rationalizing and integrating all of the existing technology and applications of those governments; selecting and deploying major new applications and best-of-breed solutions; supporting ongoing business operations; downsizing staff by about 20 per cent; and, lest we forget, simultaneously preparing the entire megacity -- not only its traditional computer systems, but also its fleets, buildings, traffic lights, et cetera -- for year 2000 (this latter task in the space of a year).
Fortunately for Andrew and the rest of the new city's IT team, some of the groundwork for amalgamation had already been done. The IT departments of the six cities and Metro had worked quite closely because of a need to share various types of information, such as mapping and financial information. In many areas there had already been considerable standardization. For example, Metro had put in a fairly large network, and when the other cities upgraded their networks they essentially followed the same standards that Metro had selected.
But this information-sharing and standardization amounted to little more than a good start. There was still a mountain of work to be done, and it began well before the official amalgamation on Jan. 1, 1998.
Rolling up Their Sleeves
"Work started officially in May 1997. The groups were all charged with the task of showing how they would amalgamate and consolidate," said Chief Financial Officer and Treasurer, Wanda Liczyk. "The groups had to answer questions like: What would the new architecture or structure of their operations look like in the context of the new city? What kind of savings and efficiencies did they think could be realized in the short term? What would each of these operating areas look like for the longer term?"
During this period the IT directors of the former cities and Metro got together and responded with a fairly comprehensive plan for IT. Said Andrew, "That plan went forward to the then provincially appointed transition team, which adopted the report pretty much in its entirety. That essentially became our roadmap moving forward to the megacity."
Consulting firm Johnson Smith had been hired to help put the new city and its organizational structure together. The IT organization started working with the consulting firm just as amalgamation took place in January 1998, and over the next several months the new IT model was fleshed out and finalized. Only part of that period, however, was spent on the restructuring process. In reality, it took about 20 days to do the work, interspersed with regular business activities and much-needed periods where those involved in the planning process could evaluate and mull over the major changes that were being contemplated.
Another important activity during this period was the mapping of all of the business activities that IT was doing in the former cities and Metro. "That was one of the biggest challenges, I think," said Andrew. "Getting all that historical data and mapping it, and then putting an organization in place that would be service-oriented -- figuring out what we were going to bring to the new city from a technology perspective."
A New Model for IT
By June, the team had come up with its new IT model, but not without first getting it critiqued by a couple of independent consulting organizations: LGS and Western Management Consultants.
The model is patterned along the lines of a manufacturing organization. There's a head-office function that does corporate standards, corporate initiatives, corporate IT strategy and major corporate studies. Infrastructure technologies such as networks, desktop tools and telephones are all handled centrally. For service delivery, IT departments are located in each of the business units and have been grouped into different clusters -- the so-called manufacturing plants. Those IT departments use common networks and common standards, but they've got business-specific applications that they have to support. They report in centrally but do the work decentrally.
Three client service directors, each with two of the city's six departments to look after, reside in the business units and act as the eyes and ears of IT, both corporately and departmentally. This is a big improvement over the past way of doing things, wherein head-office functions were isolated from the operating departments.
Liczyk likens the client service directors to salesmen in the private sector.
"They are the entrée into the other services of the organization," she said. "I am to go to my client service director and explain my IT needs. Then she will liaise with the other members of her division in terms of getting me the right solution to the problem I have."
Total Cost of Ownership Model
The next big strategic direction for IT is to move to a total cost of ownership (TCO) model for the 2001 budget process. "Right now we're mapping all our activities associated with TCO. This will enable us to put in a very cost-efficient model and give the departments the best bang for their buck," said Andrew.
The department is looking at a base cost per PC, which would include infrastructure charges. The base cost would include such things as help desk, communication line, network software, desktop software, and PC lease cost.
"After that we will go to icon pricing," said Andrew. "If you want Oracle it will cost you this much; if you want SAP it will cost you that much. This puts the onus on the department to justify the business need for having these other products."
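The arithmetic behind such a chargeback model is simple to sketch. All figures and per-product charges below are hypothetical, invented purely for illustration; the city's actual rates were not part of this story:

```python
# Hypothetical sketch of per-PC chargeback under a TCO model:
# a flat base cost covers shared infrastructure (help desk, network,
# desktop software, lease), and "icon pricing" adds a per-seat charge
# for each optional product a department asks to have on the desktop.

BASE_COST = 2400  # hypothetical annual base cost per PC

ICON_PRICES = {   # hypothetical per-seat annual charges
    "Oracle": 800,
    "SAP": 1200,
}

def annual_charge(num_pcs, icons=()):
    """Annual charge for a group of PCs: base cost per PC
    plus icon pricing for each optional product on those PCs."""
    per_pc = BASE_COST + sum(ICON_PRICES[i] for i in icons)
    return num_pcs * per_pc

# A department with 50 PCs, 10 of which also need SAP:
total = annual_charge(40) + annual_charge(10, ["SAP"])
print(total)  # 40*2400 + 10*(2400+1200) = 132000
```

The point of the model is visible in the last line: the department pays a predictable base rate, and any product beyond the standard desktop shows up as an explicit, justifiable line item.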
IT is also looking at moving to a return-on-investment model on major applications. "Rather than having to go and fight for funds for a project every year, we want to move to a five- or ten-year business case," said Andrew, "so after a project has been installed and is up and running, there will be a sustainable revenue source to maintain and upgrade that system. I think this approach is quite revolutionary for a government organization."
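The multi-year business-case arithmetic Andrew describes can be sketched with invented figures: an up-front project cost, annual savings, and a sustainment budget carved out of those savings to fund ongoing maintenance and upgrades:

```python
# Hypothetical sketch of a five- to ten-year business case:
# savings net of a sustainment reserve pay back the capital cost,
# so the system funds its own maintenance rather than competing
# for new money every budget year.

def net_position(capital_cost, annual_savings, sustainment, years):
    """Cumulative net benefit over the life of the business case."""
    return years * (annual_savings - sustainment) - capital_cost

# e.g. a $2M project saving $700K/yr, with $200K/yr reserved
# for upgrades, breaks even in year 4 of a five-year case:
print(net_position(2_000_000, 700_000, 200_000, 4))  # 0
print(net_position(2_000_000, 700_000, 200_000, 5))  # 500000
```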
Out With the Old, In With the New
A number of important steps in setting the proper IT course for the new city have been taken in the last year and a half. These include updating and standardizing the IT infrastructure, retiring several old applications and scaling up others citywide, and implementing some major new applications.
Partly as a result of Y2K efforts, Andrew says the city has "pretty well gotten rid of all the abacuses we found." It is standardizing on one desktop operating system, one desktop suite and one e-mail package, simplifying communications and document exchange. A leasing program will give the city the option of refreshing its PCs every three, four or five years. Client-server technology has been modernized and 577 servers have been eliminated, leaving about 500 mostly IBM RISC and Sun servers. And the plug has been pulled on four mainframes, leaving only one. In short, everything is easier to manage now, and has been brought to a single operating release.
On the applications side, the city's biggest challenge was the revenue systems.
Tax systems for the six cities (Metro did not collect taxes) were all running on different platforms; the cities all had common data coming in from the province, but what they did with that data -- how they applied the mill rates -- was unique to each city. To complicate matters, market value assessment was moving forward, so that had to be taken into account as well. In the end, the city built a very robust and flexible system that handles all of its property tax information, producing its first tax bill within eight months of the project's initiation.
As the six cities and Metro had developed their own applications to handle various types of services, it made sense to look for best-of-breed applications wherever they could be scaled up to the needs of the new city. For parks and permitting, a system that was running in Scarborough was made city-wide, and all of the arenas and parks data was put into it. Scarborough also had a good system for building permitting; based on that, a whole new integrated building management system has been built, and later this year inspectors will carry laptops with them to do building inspections.
Implementing SAP and CA TNG
Support from members of Council and the enthusiasm of the staff have been two big factors in getting the job done. The city invested heavily in technology to achieve various efficiencies and help it meet its downsizing targets, which meant that many of the staff, including some of the senior managers, worked seven days a week over long periods to get the new financial systems up.
In the past year the city has implemented an SAP general ledger system. The SAP HR/payroll system is also being implemented over two years. "Then we'll start looking at add-ons," said Andrew. "We have to walk before we can run."
Another important technology choice was Computer Associates' TNG, which was put in over the last year. Running on all of the city's servers, it allows in many cases for unmanned operations.
Year 2000 Project Director, Lana Viinamae, is a fan of CA TNG's remote software distribution capability. "We have equipment in over 460 locations across the city and every time we completed testing an application or solution we then had to distribute it out to users," she said. "We didn't have enough arms and legs to run out and distribute the software to all the client sites, so we used automation to do that."
The software also allows the city to do asset management, remote workstation management, and virus protection. Added Viinamae, "With every desktop that went out we put this in place so that we could roll out the desktops without having to worry about when we rolled out the applications, because we had the tools to go back and touch that desktop from a centralized point."
The city now collects and keeps all of this inventory information up-to-date electronically. It knows what's on a particular desktop, who's using it, and where that desktop is. All of that information can now be queried.
The city has a very comprehensive Web site (www.city.toronto.on.ca) that it is continuing to expand. It's looking very seriously at Web-enabled applications and at e-commerce as a way to streamline and improve business flow.
"We're taking forward a capital project to look at improving ways of delivering service to the public using the Internet," said Andrew. "Our business applications are all Web-enabled now so we can foresee the day when people can use the Internet to book Parks and Recreational space, pay their tax bill, or receive city-related business information. We would like to enable a self-service mode so that people can do it themselves. We will just provide the interfaces and the secure gateway." Much of this capability is expected within the next year to 18 months.
As for developing electronic relationships with suppliers, a couple of minor pilots are now under way. In time, such capability will become part of the city's main e-commerce solution but an enterprise purchasing module will have to be implemented first, and that is still down the road.
Andrew feels e-commerce will be the key to significant future IT savings.
"We've met with a provincial government that has implemented e-commerce in a substantial way," he said. "With reduced banking charges and decreased paper flow, they say their cost of goods and services is reduced about 20 per cent."
The city is now putting in the necessary infrastructure to move in this direction. "Once we move into data warehousing technologies, e-commerce, and document imaging and management, that's where the big payback is going to come, and very rapidly too," said Andrew.
An important data-retention project is expected to establish data marts for the city -- the first step on the road to data warehousing. Today everything exists in silos, and the city wants to be able to get at that information so that it can improve decision-making. Data warehousing is therefore likely to be one of the major IT strategies for the next five years.
The new city generates so much paper that it is important for it to be stored and retrieved easily. There are now pockets of document management in the organization, but the goal is to have one system that can retrieve anything from a single piece of paper to a large engineering drawing.
"We want our inspectors on the road to be able to call up online and see a map," said Andrew. "We can't do that today -- we have to go and get a big paper map. So we're looking at an enabling technology to allow people to be more efficient."
Having survived amalgamation and year 2000, Andrew and the IT department can breathe a little easier. Like Butch and Sundance, they've met their perils head on, wrestled them to the ground, and lived to fight another day.
(David Carey is a veteran journalist specializing in information technology and IT management. Based in Toronto, he is managing editor of CIO Canada.)

Sidebar: How Toronto Licked Its Year 2000 Problem

By David Carey, CIO Canada

For the 40 people manning the city of Toronto's year 2000 command centre this past New Year's Eve, and the additional 300 people (over and above the regular 7/24 staff) on site, the night proved uneventful -- boring even. And that's just the way they liked it. Instead of champagne corks popping, you might have heard a tab being snapped off a warm can of pop, or the crunch of someone munching a cracker from the cheese tray.
In short, it was a very quiet night, especially from a Y2K bug perspective. The only glitches were a few building systems that failed.
"Things that we predicted might fail failed, but they were very minor -- building security systems mainly," said CIO Jim Andrew. "Everything was promptly resolved. We identified the problems remotely and immediately dispatched engineers to rectify them."
It was an anticlimactic, if satisfying, end to a difficult and demanding year.
For an IT team already taxed by the huge demands of the city's amalgamation process, the magnitude of the Y2K remediation task came out of left field. It wasn't until Nov. 11, 1998 that Council approved going forward with the Y2K project. Needless to say, that was very late in the day.
"We got right off the ground at that point," said Andrew. "We had anticipated getting commitment from Council, and so we had teams already in place and had started to put tenders on the street to get the main players lined up. We had to attack all fronts at the same time, so we approached it very much like a military operation."
A four-star general in that operation was Year 2000 Project Director, Lana Viinamae. In a long list of responsibilities, she and her team had to look at the systems used by the six former cities and Metro and determine if any were year 2000-ready, and if so, whether or not they could scale to the size of the new organization. In addition, various non-traditional systems -- many with embedded chips -- had to be checked out, including more than 17,000 traffic lights, 1,600 buildings, 11,000 vehicles, and various facilities with process control systems, such as water-pumping stations and fuel storage tanks.
"The biggest challenge was the fact that different cultures and different technology environments were being crushed together," said Viinamae. "We had up to 450 people involved and we had to get them focused on the same objective and doing the work."
CFO Wanda Liczyk can confirm the difficulty of motivating people for the project. "I'm one of those who said I'm busy with amalgamation and trying to rationalize my systems -- go away Y2K. I don't need this." In the end, she couldn't avoid it. With the departure of one of her peers, she became part of the Y2K steering committee last July. That meant participating in comprehensive two- to three-hour meetings every week, reviewing the progress of different initiatives, and determining what action needed to be taken and when to bring issues to the attention of senior management.
In the end, success came as a result of pulling together people from all different corners of the city, along with considerable help from business partners and private-sector consultants.
Toting up the Benefits
With all the talk of Y2K overkill that has followed on the heels of the uneventful arrival of Jan. 1, 2000, it is useful to reflect on some of the benefits that can accrue from a successful Y2K project. In the city of Toronto's case, there were many.
Said Viinamae, "We created a very strong project-control framework for the Y2K project that we plan to use for all subsequent capital projects -- so that we're very clear on where the accountabilities and responsibilities lie, and have the right resources at the table to get the job done."
From a policy standpoint, the Y2K project helped give the city the ability to establish and implement standards in a very timely fashion. Largely as a result of Y2K, the city has moved to a single operating release of a variety of its key technologies. Y2K also acted as a catalyst to secure the corporate commitment and necessary funding to achieve the IT restructuring essential to amalgamation.
Finally, and not least importantly, Y2K gave the city an accurate inventory of its systems, enabling it to discover a huge number of systems that were taking up storage space and resources, even though they weren't actively being used.
As a result, a significant number of systems have been decommissioned.