VSLive! Speaker Interview—Mike Devlin

Rational Software belongs to a handful of companies whose products span the full spectrum of lifecycle tools: everything from requirements analysis to production testing. So who better to ask about the impact of .NET on enterprise application development than Rational founder and CEO Mike Devlin? Mike leads the VCDC keynote Agile Software Development: Where Design and Code Meet. Check out some of Mike's views below in this special VSLive! Speaker interview.


Let's discuss the impact of Microsoft .NET on large-scale software development. What will be the most fundamental change for developers?
Being able to focus on developing the application proper instead of having to spend so much time on infrastructure issues. Talk with a lot of CIOs and development team managers today and you'll be amazed to find that 70 percent of development effort goes into infrastructure, while only 30 percent goes into building the business logic, application architecture, and application functionality. But the combination of the .NET platform and Visual Studio .NET reverses that ratio by handling so many infrastructure issues automatically.

This continues the tradition of Visual Basic, which made it easy for a developer to access and build upon the relatively complex APIs of Windows. Now .NET takes that same model for developer productivity and applies it to a much broader distributed domain that includes servers, Web Services, and in fact the whole Web. The underlying complexity of this environment is even greater than that of the old Windows APIs. Fortunately, .NET hides some of that complexity from the user. It's not perfect, but it's a big step in that direction. You get a common platform, a common set of components, a common set of services for accessing those components, and support for distribution and access across the Web.

You say "It's not perfect, but it's a big step." What are the gaps in .NET that development managers should look out for?
The biggest warning concerns training, particularly hands-on training. Don't assume that just because you used VS6 you know it all. Visual Studio .NET and Microsoft .NET are powerful, and to benefit from that power, developers must understand both the development environment and the runtime platform. Fortunately, it's easy to get started with VS .NET: build a few small sample applications, use the tutorials, and learn to leverage the power of the solution.

What changes in the development environment itself will most affect enterprise developers?
Being able to integrate the VS .NET IDE with ISV lifecycle development tools makes it easier to tackle large-scale projects. Visual Studio .NET has been substantially reengineered, with a whole new framework for integrating third-party lifecycle tools, such as tools for project management, modeling, version control, and testing. Today those all function as separate tools. With VS .NET, developers won't be thinking, "I'm invoking this separate tool." They'll just be using another function of their IDE.

How did Microsoft swing this?
They provided the ISV community with a common framework to use: a complete set of technical APIs, plus the Visual Studio Integration Program, which supplies technical assistance, early access, and marketing support. Microsoft has always been good about enabling ISVs when it develops a new platform, and it has learned a lot of lessons along the way, which it has applied specifically to the development area. Microsoft is definitely easier to work with than most other vendors. Also, Microsoft has been able to couple the development environment tightly to the underlying framework. And because of the common platform, it costs an ISV less to bring up new functionality in this environment. As a consequence, developers get not only the current capabilities of the platform but also more from the community of providers building on top of the framework.

Good as all this sounds, there's going to be a learning curve. What do you recommend IT managers do to bring their teams into the .NET era?
The technology's great, no question about it. So it's really more a human issue. Training has a short half-life if it isn't applied, and success breeds success, so you need to capture the lessons learned in a contained way. Take some of your best people and put them on a real-life project that's important to your business. Pick a project you can apply .NET to quickly; it could be an application or a set of components. And while you're doing all this, think in the background about having an overall architecture for the domain you're in. Architectural issues are somewhat independent of .NET, but without an architecture it's hard to take full advantage of .NET. I think some organizations will need to get a better handle on that.

Sounds like a plan for the developers. How about one for existing code?
A lot of things can carry over, including lifecycle processes such as configuration management and change management. As for current applications, if you've been developing business models, you have a starting point. For whatever you're not going to migrate, the model provides the foundation for encapsulating it. There's no silver bullet here, but I'd pick the dozen or so services that are critical and package them up as Web Services.
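
Devlin's advice here is architectural rather than code-level, but a toy sketch may help picture it. Below is a minimal illustration, in Python rather than a .NET language purely for brevity, of packaging one critical legacy routine behind an XML-over-HTTP interface; the routine, path, and message shape are all invented for the example.

```python
# A minimal sketch of wrapping one critical legacy routine as an
# XML-over-HTTP Web Service. All names here are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer
from xml.etree import ElementTree as ET

def legacy_credit_check(customer_id: str) -> bool:
    """Stand-in for an existing routine you are not ready to migrate."""
    return customer_id.startswith("C")

class CreditCheckService(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/services/credit-check":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        request = ET.fromstring(self.rfile.read(length))
        # Delegate to the legacy logic; callers see only the XML contract.
        approved = legacy_credit_check(request.findtext("customerId", ""))
        reply = ("<creditCheckResult><approved>%s</approved></creditCheckResult>"
                 % str(approved).lower())
        self.send_response(200)
        self.send_header("Content-Type", "text/xml")
        self.end_headers()
        self.wfile.write(reply.encode("utf-8"))

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), CreditCheckService).serve_forever()
```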

And when everything is a Web application that has to be up 24x7x365, how do you do testing?
It's a real issue, and one our telecom customers have dealt with for many years, by the way. .NET provides a standardized approach for testing distributed Web applications, but new challenges remain. Say you depend on a given Web Service, and its vendor publishes a new version that changes how one of its calculations works. The next time your application calls that Web Service, the changed behavior breaks your application.

So now these testing issues cross companies and organizations; that's the big change. Even services we expect to be highly reliable will still have a configuration management issue. As a consequence, a Web Service vendor might have to provide several iterations of a mission-critical service at once until all customers have tested the latest version thoroughly. At its core, this means the basic change control and configuration management issues are still there.
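
As a toy illustration of that side-by-side versioning, the sketch below keeps two iterations of a hypothetical quote service routable at once, so consumers stay on the contract they have tested. All paths and payload shapes are invented.

```python
# Sketch: two iterations of one service stay live so existing consumers
# keep the contract they tested. Paths and payloads are hypothetical.

def handle_quote_v1(symbol: str) -> str:
    # Original contract: a single price element.
    return "<quote symbol='%s'><price>101.25</price></quote>" % symbol

def handle_quote_v2(symbol: str) -> str:
    # Revised contract: a restructured response that would break v1 consumers.
    return "<quote symbol='%s'><bid>101.20</bid><ask>101.30</ask></quote>" % symbol

ROUTES = {
    "/services/v1/quote": handle_quote_v1,  # frozen until every consumer migrates
    "/services/v2/quote": handle_quote_v2,  # opt-in once a consumer has re-tested
}

def dispatch(path: str, symbol: str) -> str:
    if path not in ROUTES:
        raise KeyError("unknown service version: " + path)
    return ROUTES[path](symbol)

print(dispatch("/services/v1/quote", "MSFT"))  # old clients remain unaffected
```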

.NET is trying to provide standard ways for vendors to talk to one another about change control, and we're trying to make sure our tools scale up. This relates to modeling as well: defining what a Web Service is supposed to do. If you have a good spec for what the Web Service does, I can test against that spec. We're never going to achieve nirvana here, but with modeling there's more decoupling, so our teams don't have to coordinate as much. The same holds true if I'm a paying customer running a mission-critical component against your service.
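
To picture "testing against the spec" in miniature, here is a hypothetical sketch in which the spec is reduced to required elements and their types; a real published contract would be far richer.

```python
# Sketch: validating a Web Service response against a published spec. The
# "spec" is a toy mapping of required elements to types; names hypothetical.
from xml.etree import ElementTree as ET

QUOTE_SPEC = {"price": float, "currency": str}

def violations(xml_text: str, spec: dict) -> list:
    """Return spec violations; an empty list means the response conforms."""
    root = ET.fromstring(xml_text)
    problems = []
    for element, expected_type in spec.items():
        text = root.findtext(element)
        if text is None:
            problems.append("missing element <%s>" % element)
        else:
            try:
                expected_type(text)
            except ValueError:
                problems.append("<%s> is not a valid %s"
                                % (element, expected_type.__name__))
    return problems

# A conforming response passes; a silently restructured one is caught
# before it breaks a consumer in production.
print(violations("<quote><price>101.25</price><currency>USD</currency></quote>", QUOTE_SPEC))
print(violations("<quote><bid>101.20</bid></quote>", QUOTE_SPEC))
```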

You're implicitly raising the issue of working in heterogeneous environments, as most of our readers do to varying degrees. What is .NET bringing to that party? Sometimes Microsoft has appeared to be pretending no one else exists.
Microsoft isn't doing .NET in isolation from the rest of the industry. It's trying to provide open standards. For example, Microsoft and IBM are cooperating on a lot of standards having to do with the Web, such as UDDI. That's the only way we'll be able to get financial services providers to support Web Services. Even if you're only using Web Services internally, testing and configuration issues arise, but when you cross organizational boundaries those issues become much more interesting. And it's not just new projects: whether internal or spanning multiple organizations, some of these will be old legacy systems you're making available as Web Services.

It's nice that IBM is playing ball with Microsoft regarding .NET. How about bitter rivals like Oracle and Sun?
This is where the Web Services model works. Web Services operate at a higher level than EJB and the like, and you can base the interchange on XML, so it becomes less critical how the underlying application is implemented. You can leverage the Web Service mechanisms now, then migrate to common technologies later if needed. No magic required.
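
A tiny sketch of the decoupling Devlin describes: the consumer below parses an XML message and never needs to know what platform produced it. The message shape is invented for the example.

```python
# Sketch: XML as the neutral interchange format. The consumer parses the
# message without knowing whether the producer runs on .NET, J2EE, or
# anything else. Element names are hypothetical.
from xml.etree import ElementTree as ET

incoming = """
<purchaseOrder>
  <orderId>8713</orderId>
  <item sku="A-100" quantity="2"/>
</purchaseOrder>
"""

order = ET.fromstring(incoming)
item = order.find("item")
print(order.findtext("orderId"), item.get("sku"), item.get("quantity"))
```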

Ten years ago this conversation probably would have centered on object-oriented programming, yet so far it hasn't come up.
OOP is a small part of the bigger picture at the enterprise level. What we're doing now is analogous to the way Visual Basic hid underlying complexity. You simply can't spend all your time at the language level and still get your projects out the door. You have to think at the component level and the project level. Underneath, you'll still find all the principles of OOP, such as data abstraction, encapsulation, and information hiding. But as an industry we're focusing at a higher level of abstraction.

Does that also hold true for the messaging-based programming model?
The messaging model places a little more emphasis on the workflow aspect. I see it helping to break down barriers between business modeling—thinking of workflows through your business—and application development per se. The workflows act on objects, so if anything, .NET-based architectures tie OOP and messaging together and provide more commonality.
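
One way to picture workflows acting on objects: in the hypothetical sketch below, each message names a workflow step that is dispatched to a method on a domain object, so the message view and the object view are two sides of the same design.

```python
# Sketch: workflow steps arrive as messages and are dispatched to methods
# on a domain object. All names are hypothetical.

class Order:
    def __init__(self, order_id: int):
        self.order_id = order_id
        self.state = "new"

    def approve(self, payload: dict):
        self.state = "approved"

    def ship(self, payload: dict):
        self.state = "shipped via " + payload["carrier"]

def process(order: Order, messages: list):
    """Each message is (workflow_step, payload); dispatch it to the object."""
    for step, payload in messages:
        getattr(order, step)(payload)

order = Order(8713)
process(order, [("approve", {}), ("ship", {"carrier": "AirFreightCo"})])
print(order.order_id, order.state)  # -> 8713 shipped via AirFreightCo
```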

What does .NET-based architecture do for developers that's new?
In a word, patterns: Microsoft's paradigm for using a set of components. Patterns aren't new, but they gain much more momentum with .NET. The focus for an application architecture should be to provide a set of models for common architectures, so you're looking less at a core object model than at pattern mechanisms. And both developers and ISVs can build on Microsoft's patterns, of course. We could provide patterns for how to build certain classes of applications on top of .NET, for example, patterns for building B2B or B2C apps. I can imagine some of those becoming more verticalized for particular industries, such as logistics and financial services. But initially we're focusing on horizontals, along with providing powerful pattern-generating mechanisms. On the enterprise side, a bank could build a number of different apps and in doing so generate patterns for how to use .NET that capture the essence of the bank's needs. Once those patterns have been generated and distributed, application writers will find it harder to make mistakes.

ISVs can't do patterns by themselves, so all this really depends on the Microsoft .NET architecture. In this environment a customer can build both patterns and components that are reusable. It's easier and more appropriate to do a pattern-based architecture, reusing not just components but whole architectures.
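
A pattern in this sense can be more than documentation. In the toy sketch below, the reusable part is the skeleton of a whole class of applications, and each app only fills in the steps; the pattern and the logistics example are invented for illustration.

```python
# Sketch: a pattern captured as a reusable application skeleton. What gets
# reused is the architecture (the order and shape of the steps), not just
# a component. All names are hypothetical.

class B2BOrderPattern:
    """Skeleton for B2B order-processing apps: validate, price, fulfill."""

    def run(self, order: dict):
        self.validate(order)
        self.fulfill(order, self.price(order))

    def validate(self, order): raise NotImplementedError
    def price(self, order): raise NotImplementedError
    def fulfill(self, order, total): raise NotImplementedError

class LogisticsOrders(B2BOrderPattern):
    """One vertical's instance of the horizontal pattern."""
    def validate(self, order):
        assert order["weight_kg"] > 0, "shipment must have weight"
    def price(self, order):
        return order["weight_kg"] * 4.50
    def fulfill(self, order, total):
        print("booked shipment, charged %.2f" % total)

LogisticsOrders().run({"weight_kg": 120})  # -> booked shipment, charged 540.00
```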

I've been seeing texts about patterns for years. How does .NET make pattern-based architectures more doable?
The key is that if all of this is separate from the runtime environment and the implementation tools, it will be hard to do. Architectures can't be abstract patterns that only analysts work with. They must be real, live artifacts that developers can connect directly to implementation artifacts in tools such as Visual Studio .NET. It's also easier to generate testing from such models, because the models and the code can be much more tightly coupled, as ours are. That makes the models more valuable: you can generate test cases, do model-driven development, and even debug at the model level later on. It makes the models more live, more tightly coupled with the implementation artifacts, and lets the developer view the model as just a different way to interact with the application.
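
A toy rendering of the test-generation idea: if the model and the code share one representation, a test can be derived from the model instead of written by hand. The model format and all names below are invented for illustration.

```python
# Sketch: deriving a test case from a model. The "model" is a toy
# description of one operation; the format is hypothetical.

MODEL = {
    "operation": "reserve_seats",
    "inputs": {"count": 2},
    "postcondition": lambda result: result["reserved"] == 2,
}

def reserve_seats(count: int) -> dict:
    """Implementation under test."""
    return {"reserved": count}

def generate_test(model: dict, implementation):
    """Turn a model entry into an executable test case."""
    def test():
        result = implementation(**model["inputs"])
        assert model["postcondition"](result), \
            model["operation"] + " violated its model"
    return test

generate_test(MODEL, reserve_seats)()  # raises AssertionError on a violation
print("model-derived test passed")
```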

Given that the .NET platform is open to all ISVs, doesn't this mean that developers can mix and match .NET-aware lifecycle tools and get this kind of integration?
There are different levels of integration. The VS .NET environment provides much more integration, and in that sense it is certainly easier to use different vendors' tools. But you still face the issue of deeper semantic integration, such as data integration. With different vendors' products, will you get, for example, a common representation of use cases between your requirements management tool and your modeling tool? You benefit from deeper semantic integration, where each tool knows how the others work. As another example, our component test tools rely heavily on information stored in the models for test-case generation and test-stub generation. That's harder to integrate between vendors. So you can get good UI integration and control integration across vendors, but not data integration and process integration. Deep integration also helps us provide more lightweight, agile versions of lifecycle tools. That's why we didn't just port our stuff to .NET; we rearchitected it. The fundamentals don't change with .NET or with lifecycle tools, but the practicality of using them on a wider variety of projects will.

The old saw goes, "Wait for rev 3 of any MS product; by then it will be in great shape." That has to be doubly true with a technology as vast as .NET. What would you tell development managers who think they should sit on the sidelines and let others work out the inevitable bugs in a first release this ambitious?
As with any technology, there will be glitches along the way. However, we've been impressed with the completeness and robustness of both VS .NET and Microsoft .NET. Remember, in the past we integrated with many Microsoft technologies but embedded only a few in our product. Now our basic product architecture depends on Microsoft technology; we cannot ship if VS .NET does not work. Our experience (having built millions of lines of code on this stuff) is that this is a stable platform. We are betting our business on it, and I am happy to say that we are completely confident. Admittedly, we were pretty scared a year ago, but it is now clear that we made exactly the right bet. Companies that "wait for version 3.0" will simply miss the boat. Those that move quickly should see (and must demand) immediate business returns.


Mike Devlin
Michael T. Devlin co-founded Rational in 1981 and is currently chief executive officer and director. Before September 1996, Mr. Devlin served as Chairman of the Board of Rational. Mr. Devlin is a Distinguished Graduate of the United States Air Force Academy, Class of 1977, and earned a Master of Science degree from Stanford University.