VSLive! Speaker Interview—Jan Popkin

It's expensive when software projects fail, yet many do. Modeling promises to reduce the number of such failures by requiring a complete review of a proposed application before coding begins; the more complex your project, the greater the return on investment modeling promises to deliver. And yet, many developers seem reluctant to use such tools.

In a preview of his VSLive! session, Jan Popkin discusses these issues and how .NET changes the development landscape. Jan is Popkin Software's founder and CEO; the company has been producing modeling applications for more than a decade.


We have the about-to-be-released .NET architecture on the one hand, and the mature, existing space of application design and modeling on the other. How do you see these two things fitting together? How do they interact?
Modeling has continued to be a requirement for large, complex, or collaborative systems that need to talk with other systems. It allows you to draw a blueprint that gives all the participants a view of how the system should be built and how it will interact with other systems.

The .NET architecture is also based upon collaboration. The architecture is complex in that its different pieces and components need to talk with one another, both inside the architecture and with outside systems. This new architecture will also require a blueprint, or a "model," in order to satisfy the business goals of the system correctly and minimize the amount of rework and maintenance. Without this model, the system can be difficult to build, costly to maintain, and might be impossible to evolve as business goals change.

Large and/or collaborative systems have required modeling, and the new .NET architectures will also require some form of modeling because of their complexity.

Where is modeling most widely used at this time?
Modeling is required in systems that are complex, that require collaboration, and that have a reasonable shelf life. These are clearly mission-critical systems, large government systems (aviation, defense, transportation), retail, financial services, and other large IT systems. When you start evolving to .NET, you move into the category of people who need modeling. Larger projects have always needed modeling, and middle-sized projects used modeling where appropriate; .NET and its object-oriented structures are probably going to push that modeling requirement down to even smaller projects.

Why is that?
People need blueprints in order to lay out their systems, to get their workflows right, and to have the system run with the right business objectives. You just can't do it as you go and say "whoops" if it doesn't work. The systems that developers are building now are too complex to go back and fix after they were supposed to be done. In today's world, where development cycles are becoming increasingly compressed, companies simply can't afford not to model if they want to stay ahead of, or abreast of, the competition. In addition, with large, complex systems, companies can't afford to develop something and just hope it meets its business requirements. They need modeling to ensure that, every step of the way, the system is aligned to its business need. It sounds pretty simple, but in reality most companies struggle to align business and IT. And they are finding that if the two aren't in sync, it's costly to their bottom line.

I can understand the importance of getting things right the first time. But, if modeling is so valuable, why do so many developers—possibly a majority—see it as bureaucratic overkill? How do you convince them?
Some of it is historic. Some of it is doing the right amount of modeling at the right time. Historically, modeling carried the promise of generating complete applications, but that promise didn't pan out. Modeling tools emerged as being key to laying out a blueprint, not the code. Modeling then moved toward adopting industry-standard languages such as the Unified Modeling Language (UML) and, for business modeling, the current Business Process Modeling Notation (BPMN). Standards like these help companies incorporate modeling into their development efforts more effectively.

Modeling is like building a house. If the end product were only a blueprint, people would say, "What is the value?" But people recognize the importance of a blueprint: it goes a long way toward adding tremendous value to the construction process and eliminating unexpected surprises. They see it as one part of the process that drives toward the outcome—a house built to their specifications. The same is true for enterprise modeling. The key is to apply the right amount of modeling to the right project in a way that adds value to the overall business process, as opposed to becoming an end in itself.

In reality, CIOs are recognizing that modeling is a key competitive advantage because it promotes team collaboration, promotes data reuse, and reduces rework. Those benefits drive better-quality systems in the long run. And industry reports support that. Giga Information Group recently said that companies with strong architectures, driven by modeling, are able to adapt to new IT initiatives, such as Web services, in one-third to two-thirds of the time and cost of companies that don't use architecture.

Let's take a step back and look at the big picture. Where is modeling headed?
The industry has changed quite a lot through the years. Reuse, and leveraging your code over the long haul, have played a big role in this change. So we've seen object-oriented programming, and then component-based development and other approaches, rise to fulfill the promise of reuse. Modeling has played a significant role in that endeavor.

IT has applied modeling in very different ways over the last 40 years, and we've been very successful with aspects of it. For example, creating a blueprint before you build something can really improve a project's chance of success; the larger the project, the more true this is. The payback was obvious to people who had larger systems that involved team collaboration and a high level of maintenance. But we didn't have a great deal of success when people said, "Build it now, get it out in three months, and then build it again."

In such cases, there's no maintenance required because you're going to throw it out anyway, and time-to-market is so important. The dot-com model was like this. They needed to build something, and it was OK if they had to build it three times, as long as they had the service running. I believe that approach has gone away. People are looking more closely at how they're spending their money, particularly in this economy. IT budgets have shrunk—you can cite many of the different industry groups—and people want to spend their money wisely and build it right the first time.

So, two things are driving modeling right now. One is the complexity of current architectures, including .NET and Web services, which is going to drive modeling. The other is economics. People in IT are watching how they spend their money, and they want to spend it wisely. They don't want to throw out systems because the design didn't meet the business need.

It's not hard to see where some large or complex systems might benefit from modeling, but don't the Web services that VS.NET is designed to create, with their inherent ability to be "snapped together," lessen the need for modeling?
I don't think VS.NET has reached the state of being "snapped together" yet. Before IT developers can snap Web services architectures together, they need to understand their business goals and be able to communicate them to the whole team. If we do reach snap-together assembly of IT systems, we are going back to the promise of code generation. Then we should be able to use modeling tools to drive the snap-together process automatically. But I don't think we are there yet.

A lot of programmers say they can't see the value of a modeling tool unless the tool can be used to generate code. Others can't stand auto-generated code. How useful is code generation in real projects?
I have seen two, if not three, cycles of code generation at Popkin and in my previous IT experience. It is extremely difficult to do full code generation. What the industry has been able to support to date is skeleton generation. I'm not sure where the difference lies between skeleton generation and things like the Microsoft Class Wizard, which automates more of the complex things that are mundane.

Having a class wizard is a good thing when you would otherwise have to know exactly the right grammar and where to put everything. This is especially true if the code you need to generate is complex, can't be remembered easily, and you don't want to train a lot of people to write it. Without judging it as good or bad, part of the success of code generators in general comes when you take a class of IT solutions and narrow it vertically to the point where you can generalize what the code generation should do.
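To make the distinction between full code generation and skeleton generation concrete, here is a minimal, hypothetical sketch of the kind of class skeleton a modeling tool might emit from a UML class diagram. The class name, attributes, and operations are invented for illustration; they are not output from System Architect or from any particular wizard.

// Hypothetical C# skeleton generated from a UML class diagram:
// the tool emits attribute declarations and operation signatures,
// and the developer fills in the method bodies by hand.
using System;

public class PurchaseOrder
{
    // Attributes from the class diagram
    private string orderId;
    private DateTime orderDate;

    public string OrderId
    {
        get { return orderId; }
        set { orderId = value; }
    }

    public DateTime OrderDate
    {
        get { return orderDate; }
        set { orderDate = value; }
    }

    // Operations from the class diagram -- signatures only;
    // the business logic is not generated.
    public void AddLineItem(string sku, int quantity, decimal unitPrice)
    {
        // TODO: implement by hand
    }

    public decimal CalculateTotal()
    {
        // TODO: implement by hand
        return 0m;
    }
}

The point is that the blueprint determines the structure: the generated skeleton saves typing and enforces naming and grammar, but the behavior still has to be written by the developer.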

So, you're saying the success of code generation is related to the discreteness of what you're trying to model: the smaller the scope and the more specific the functionality you're trying to create, the more likely you are to succeed with code generation.
Exactly. There are quite a number of companies that based their entire history on code generation; many of them have come and gone. It's been a very spotty situation. I'm dating myself with where I grew up in computer science, but there's another form of code generation. I know this is a bizarre analogy, but suppose you take assembly code and map it to a 3GL, to C code, to C++ code. You're taking something that was difficult in assembly language and grouping it together in higher- and higher-level languages. As an industry, we've been able to generalize languages enough to get a computer to do more and more things.

Another good example at Microsoft is the relationship between C/C++, or now C#, and Visual Basic. Visual Basic is a nice UI builder (it does other things too, but its strengths are clearly in the UI area), and we use it to build our UI on top of a core of C++. We have a series of components built in other languages where appropriate. We felt that our UI would be strongest if we built it with Visual Basic. So there's our code generator. We don't really call it a code generator; we call it a higher GL. But it clearly took a problem, such as building a common Windows IDE interface, and made it easier to solve.

On the subject of VB, System Architect is VBA-enabled. What is the future for VBA-enabling, going forward in the .NET universe?
Having been in the Microsoft family, and having gone through COM and DCOM, then our own Basic, and then VBA, we've clearly been looking at System Architect and how it fits into deployment in a .NET architecture, at how people will start deploying with .NET. So, the future is this: we'll pay attention, and as our customers drive us, we'll take on the extra .NET pieces. It's the users of our tool—the IT departments—that will direct us to develop what they need.

That's a natural evolution of the Windows-client architecture.

You've mentioned a couple times that the industry is moving toward .NET-style architectures. Does the latest version of System Architect have built-in support for .NET?
Yes. We built version 8.5 on top of our previous versions, but added support for the kinds of architecture that are not only possible but recommended now. The latest version is ready to support .NET and Web services-style architectures. We believe that in order to have a successful implementation of a .NET-style application, you'll need a blueprint, and modeling tools help you create that blueprint.

What support for .NET does it include, specifically? Is it mainly XML?
In particular, we've embraced XML in the latest version of System Architect. For example, we now support modeling your business goals and drilling down through the business model straight into the IT architectures with XML. This means that when you get down to the detailed implementation, you can use XML.

When you're building a model, you can talk about the high-level business goals, then take those down through the IT architectures, and start working with things like XML. Whether you are using XML as a SOAP interface, a vertical-language collaboration vocabulary, or just a private message, you need to be able to model all of them.
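As a small, hypothetical illustration of what "getting down to the detail" in XML can look like, here is a sketch of a business message whose wire format is XML, written in C# with the .NET XML serialization classes. The OrderMessage type and its element names are invented for the example; they are not part of System Architect or of any particular vocabulary.

// Hypothetical sketch: a business message modeled down to its XML form.
// The same XML document could travel inside a SOAP body, conform to a
// vertical vocabulary, or serve as a private message between systems.
using System;
using System.IO;
using System.Xml.Serialization;

[XmlRoot("Order")]
public class OrderMessage
{
    [XmlElement("OrderId")]
    public string OrderId;

    [XmlElement("Customer")]
    public string Customer;

    [XmlElement("Total")]
    public decimal Total;
}

public class Example
{
    public static void Main()
    {
        OrderMessage order = new OrderMessage();
        order.OrderId = "PO-1001";
        order.Customer = "Acme Foods";
        order.Total = 249.50m;

        // Serialize the object to XML and print the resulting document.
        XmlSerializer serializer = new XmlSerializer(typeof(OrderMessage));
        StringWriter writer = new StringWriter();
        serializer.Serialize(writer, order);
        Console.WriteLine(writer.ToString());
    }
}

The value of modeling here is that the element names and structure of that document come from the blueprint, so business-level definitions and the XML that systems actually exchange stay in sync.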

You mentioned that XML is being integrated with data modeling tools. How will that integration continue to develop down the road?
XML isn't integrated only with our data modeling tools; it's also integrated with our business-modeling tools and our IT modeling tools such as UML. We look at XML as a core technology, and we intend to make sure it's available at different levels throughout our tools. In fact, if you look at our repository, it actually writes itself out in XML. We'll shortly be releasing our XML Metadata Interchange (XMI) interface, which lets you express UML in an industry-standard XML vocabulary called XMI.

So, we think XML is pervasive throughout. We're continuing to look for uses ourselves. As XML finds different uses, we will embrace them. Obviously, there are different vertical vocabularies out there where we can help our customers standardize. For example, if you're in the travel industry or the retail food industry, we can help you by preloading your XML vocabularies in the tool.

Overall, the best modeling tools will enable companies to select the best languages and methodologies for their IT projects, like XML for Web services, and then tie back the system design to its business requirements. That will make both sides of an organization—business and IT—happy.

How do you see the incorporation of XML influencing design methodologies?
There's been a lot of discussion about getting different methods to support different IT implementations. For example, there's been discussion of how you should model data in UML. And in fact, we see most customers not using UML for modeling data; we see them using data modeling. In the world of business-process modeling, there's now an organization called the Business Process Management Initiative (you can find more information at www.bpmi.org) that we are involved in. The group is in the middle of defining a standard business-modeling language, and there is an ongoing discussion as to which pieces are appropriate. The BPMI has included a dialect of XML that allows you to exchange BPMI models. The move toward multiple methods is driving the move toward modeling because it encourages an integrated approach that ties together the standards and languages and then relates them to how you do business.

Last question: Your company has moved quickly to take advantage of .NET and similar architectures. How quickly do you see your customers doing the same?
How fast? It's too soon to say. I think that, like everybody else, and without making this particular to .NET, we're seeing these new architectures take hold. I think we've gone through the standardization process. We've agreed on XML, and we've agreed on different collaboration methods. Now Microsoft, IBM, and Sun—among many others, of course—are each positioning themselves with different views on how to execute this kind of vision. I think people will do their first prototype project, take a close look at the cost of it, and then roll it out again. But in general, I think everyone is moving toward this architecture. All we're discussing is the rate of change, not whether it will happen. .NET is emerging as the deployment architecture of note, and it's going to be hard to adopt others.


Jan Popkin
Jan is a software engineer with extensive experience developing large, complex systems. He is founder and CEO of Popkin Software & Systems. The company produces enterprise-modeling tools, including System Architect V8. Before establishing Popkin Software & Systems in 1986, Jan was a software engineer for Logica, a large computer consulting firm based in the United Kingdom. He worked in the same capacity for TRW and Bradford National Corp. Jan was also one of the principal software architects for the American Express Image Processing System, the Bay Area Rapid Transit (BART) Control System, and the Bank of America Image Teller System.