Smart Architectures, Architects (Continued)
Anatomy of an Enterprise Architect
Matt Carter: Your title is a mouthful; can you briefly describe your role at Intel?
Prasad L. Rampalli: I serve as the chief architect for applications and the infrastructure that Intel uses around its IT activity. My job is to ensure that we have a unified architecture for both the applications and infrastructure. I'm trying to drive a transformation to enable significant reuse and productivity in the development lifecycle, while at the same time optimizing our total cost of ownership (TCO) of the infrastructure. And how we do that in a unified manner, front to back, is at the core of what this title means.
Carter: How did you get to this position at Intel?
Rampalli: I've been at Intel for 19 years, mostly in IT over the past 10 years. Before that, I was a manufacturing engineer working on test systems and utilization and so on. At that time, I was looking for a management system that would give me information on specific test parameters tied to our products. I came into IT as an end user, and as one of the harshest critics of systems that don't meet end-user needs.
Back in the early 1980s, the capital cost of most of our process equipment was rising rapidly. At the same time, the realization dawned on most of us that the utilization was terrible. That got the company into a specific focus: "Hey, let's figure out how our equipment is being used and how we can drive and improve the process."
I was an engineer at the time, and the focus was on looking at data and patterns and driving statistical predictive maintenance techniques based on reliability distributions. I asked the IT guys what systems we had to manage that, and they told me we were in the process of defining requirements. I looked at the requirements document and realized it didn't have much bearing on the problems I was trying to solve. It became clear that IT had to engage with the business in a different way to comprehend the requirements of technical solutions. And that got me into Information Technology.
Carter: The process for moving all of Intel to a single platform sounds daunting. How did the transformation go, and what benefits did it bring to your company's architecture?
Rampalli: Back in the early 1990s, we didn't have a standard client OS, and I was asked to come in and implement Windows 95. I got in there and looked at the problem statement, and I said, "OK, we can implement Windows 95. But isn't the value proposition one of TCO and agility, and how you move from one upgrade cycle to the next? Looking only at the client is not going to solve the problem."
So we defined that whole program to transform the company's environment to a single platform. At that time we had four operating systems, eight different client configurations, six different types of hardware, and multiple configurations on platforms. From there, we went to a single network operating system, a set of finite images on the client (essentially one for the desktop), and a standard build on the back end.
We moved all that to [Microsoft Windows] NT in 1995. I still remember when I got in front of Craig [Barrett, then COO and current CEO of Intel] and Andy [Grove, then Intel CEO and current Chairman of the Board] and the rest of the folks and told them that going with NT will deliver the lowest cost-value proposition. A ton of people from the Unix camp thought it was crazy to promote or even support this. Their concerns involved scalability and the reliability of an industrial-strength transition; everybody relied on Unix for that. Microsoft technology was viewed as something you could implement for personal productivity, but not something to run Intel's complete environment. After significant debate, we decided on NT, and that implementation essentially set the foundation for us to move to an evolutionary process.
Carter: How has this evolutionary process moved forward?
Rampalli: There were about 40,000 users who had to be migrated onto this environment. There were about 600 applications (I'm talking about significant applications) that had to be tested. We had application loads, or "bundles" as we called them, based on user profiles, and we had to create an application repository that would become a reusable environment. It was a great experience because it gave me a sense of how influential standardization could be in the environment, and how the nuances of standardization extend beyond the core decisions around NT.
The mid-1990s saw the dawning of the Internet, and the feeling at Intel was: "Hey, let's have Intel be a shining light in the Internet space, a showcase on Intel architecture that demonstrates significant breakthroughs in business logic, using the Internet system technologies." And I think to myself many times that if we hadn't put the standards infrastructure into the environment, we wouldn't have moved as fast as we did on the implementation of the Internet infrastructure running on top of this.
When we started the Internet implementation in 1995-96, it was just a base foundation hosting static Web sites. But it quickly moved from that phase to the phase of business processing, which really started in 1997. I would say from 1998 through 2002 we focused on integration, during which we ushered in a ton of technologies as standard, reusable layers for the environment. All those layers have been possible because the foundation platform was already there.