Hosted by Microsoft at their Redmond campus, the Strategic Architecture Forum [1] is intended to offer an arena for discussion among the 250 architects from large enterprises and the 100 Microsoft architects in attendance. There were attendees from around the world; I met people from Sweden, Denmark, Turkey, France, Germany, Malaysia, Japan and England.
The theme of the Forum was “software+services”. The vision behind this slogan is that applications will be able to draw upon components located locally as well as in the “cloud.” More on this later.
The Personal Enterprise: Ray Lane
Ray laid out a series of observations, but in the end he contrasted the agility of an individual with that of an enterprise through the example of how we make technology choices in our personal lives compared with those we make as members of a large enterprise. He observed that personal decisions are generally much faster and focused more on what a solution will do. Enterprise decisions, on the other hand, must consider a number of factors the individual does not, such as technology proliferation (not to mention standards compliance, technology due diligence, due process, and the likelihood the decision involves hundreds of thousands of dollars, not one hundred). As a result, he pointed out, they take longer, and they are formulated in a process that often degrades into a discussion of what the software won’t do.
Ray posed the question of whether we could marry the two to create what he calls the personal enterprise. In considering the characteristics of personal technology decisions, he has developed seven laws for the personal enterprise:
- individuals get value
- organic adoption
- contextual personalized information
- no data entry or training required
- delivers instant value
- user community
- small technology footprint
One attendee I spoke with after the presentation characterized the notion as “ivory tower.” Another offered “shallow.” I will save my opinions until later.
Ray offered a number of other observations:
- We are seeing the end of the current software sales model, where applications are licensed to companies and maintenance fees are paid. The replacement candidates include open source, software as a service, and outsourcing
- 85% of profit is made by 3 companies. Such an economic structure can’t be sustained
- 80% of the software companies are US based
- Commoditization will reduce prices to 10% of their current levels
- The software industry is the next one to be globalized, following textiles and automobile manufacturing
- A move is afoot from a component industry to a full solution/package industry
- 70% of the current software vendors will disappear over the next 5 years. So if a company has 40 software suppliers that means only 12 of them will remain
While interesting, I wasn’t quite sure how these related to the title of the session. They seemed more related to a subtext introduced during the session: IT doesn’t matter; it’s just a tool. In that vein he noted that the real advantage of purchasing software comes partly from the functionality but more from the speed of deployment. My interpretation of this is that purchased products rarely meet 100% of the business requirements. But rather than spend months trying to top off the missing 10-20%, you’re better off deploying it as is. It reminds me of a colleague who once questioned a 2 month labour strike to receive a $2 an hour pay increase.
Software + Services: Charles Fitzgerald
The central notion of this vision is what Charles referred to as the federation of software and data: “you don’t necessarily own it all.” The power of this quote hits home when considering an example. In the brokerage business of the Bank I work for, while transaction processing (buying and selling) is managed by internal Bank systems, the actual execution of the buy or sell is completed by the stock exchange (i.e., federation of function). And while users of our systems are provided access to company information, news reports, and SEC filings through our brokerage web site, these data are actually sourced from external providers (i.e., federation of data). These external sources of function and data are referred to as living in the cloud: you don’t know, and don’t care, where they are. True, to a point. But there is some data, customer data for example, whose location you do care about, for many reasons, legal ones included.
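To make the federation idea concrete, here is a minimal sketch in Python, with entirely hypothetical names (no code was shown at the Forum), of an application composing a function it owns with data it federates from an external provider:

```python
# A minimal sketch of "federation of function and data": the application
# composes an internal capability (order booking) with externally hosted
# data (quotes, news). All names here are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    quantity: int

class InternalOrderSystem:
    """Function we own: transaction processing runs on internal Bank systems."""
    def book(self, order: Order) -> str:
        return f"booked {order.quantity} x {order.symbol}"

class ExternalMarketData:
    """Data we federate: sourced from a provider 'in the cloud'.
    In a real system this would be a web-service call; here it is stubbed."""
    def latest_news(self, symbol: str) -> list[str]:
        return [f"{symbol}: sample headline from external provider"]

def brokerage_page(symbol: str, quantity: int) -> dict:
    # Compose owned function with federated data into one user experience.
    orders = InternalOrderSystem()
    market = ExternalMarketData()
    return {
        "confirmation": orders.book(Order(symbol, quantity)),
        "news": market.latest_news(symbol),
    }

print(brokerage_page("MSFT", 100))
```

The design point is that brokerage_page neither knows nor cares where latest_news comes from; only the stubbed provider class would change if the source moved.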
The concept of software+services involves a little more than just federation; to make it work Microsoft has identified 5 pillars:
- Experience: having a compelling and usable user interface
- Delivery: how and where services are deployed (e.g., internally or externally)
- Federation, as described above
- Composition: the act of combining the various and distributed components
- Monetization: making money
In my experience, the reality at present is that there are a lot of inhibitors to externally hosted services, such as legal constraints on where data may be deployed, security, and performance/service levels. So while there are many examples of such a model today, widespread adoption will be contained until the inhibitors can be overcome.
Microsoft Technology Roadmap
While Microsoft’s direction is strongly influenced by customer demand, broadly speaking, their fundamental objective is to improve productivity, specifically of the office worker. Towards that end, they are focusing on four areas:
- connectivity: global value chains are changing the business models and processes
- collaboration: human collaboration is essential
- exploit information: the information explosion is overwhelming
- better user experience: usability (making it all consumable), or as they say, the best experience wins
Service Oriented Programming Models: Chris Keyser
In his presentation Chris surveyed a number of popular programming models/frameworks, including SCA (Service Component Architecture), J2EE, .NET, REST, and Ruby on Rails. He pointed out that a point-by-point comparison is difficult and probably not appropriate; rather, each one should be assessed against the type of problem it is trying to solve.
Chris used three means to compare each model: (1) a measure of complexity to use vs. depth of problem solved, (2) a comparison against capabilities one might expect to be needed, and (3) a measure of the coverage across an architecture stack. For example, Ruby on Rails was rated as easy to use, but the type of problem it suits is limited to simple database-access applications. While extending its use beyond that type of problem is possible, the suggestion was that it quickly becomes very complicated to use. .NET was rated as moderately easy to use and able to handle a moderate range of applications. J2EE was rated complex, but able to handle a wider range of applications.
With respect to SCA, currently a hot topic, Chris indicated that Microsoft is in a “wait and see” mode. They seem to be at a point where, while they understand and support the objectives, a few questions remain, specifically whether this is an area that needs to be standardized. He noted that many of the capabilities promoted by SCA are already available in Microsoft tooling.
On the current state of Web services standards, he felt that while it had taken much longer than originally anticipated, we were just on the cusp of getting there. But he was worried about losing the momentum.
In a discussion on complexity, Chris suggested that a complex infrastructure isn’t necessarily a bad thing; rather, what matters is the reason you have complexity: (1) you work in a complex environment, or (2) you have removed complexity from the application.
To follow the second point: if an objective is to push as much complexity as possible out of applications and into commodity-software-based middleware, then a potential next step is to outsource the middleware’s development. Keep your developers focused on learning and understanding the business and translating that into applications; leverage outsourcer-provided propeller heads to deal with the underlying complexity. This would seem to be a means both to outsource complexity and to manage the type of complexity you want to deal with.
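As a rough illustration of that division of labour (my own sketch, not anything presented at the Forum; all names are hypothetical), the application layer stays a thin expression of business rules while generic plumbing such as retries and logging lives in a middleware layer someone else could build and maintain:

```python
import functools
import logging
import time

# "Middleware" layer: generic, commodity complexity (retries, logging).
# This is the part you could buy or outsource; the names are illustrative.
def resilient(retries: int = 3, delay_seconds: float = 0.5):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, retries + 1):
                try:
                    return func(*args, **kwargs)
                except ConnectionError:
                    logging.warning("attempt %d of %d failed", attempt, retries)
                    if attempt == retries:
                        raise
                    time.sleep(delay_seconds)
        return wrapper
    return decorator

# "Application" layer: what in-house developers focus on -- pure business
# logic, free of the plumbing handled above.
@resilient(retries=3)
def settle_trade(trade_id: str) -> str:
    return f"trade {trade_id} settled"

print(settle_trade("T-1001"))
```

The business function never mentions retries; if the middleware were later bought or outsourced, settle_trade would not change.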
Round tables
The formal sessions ended with two round table sessions. These were intended to be small gatherings of people to allow for highly interactive discussion. The first session I attended was on repositories and service registries. The second was on Perspective-based Architecture.
My take-away from the first round table was: (1) Microsoft is working on a model to manage all metadata so as to provide a complete context for operational information, etc., and this should be announced next fall; (2) this is a long-term journey.
I may have to rethink some of my plans for next year.
The second round table, Perspective-based Architecture (PBA), was a big surprise and turned out to be the best session of the day. They provided each attendee with a white paper and a reprint of an article from the most recent Microsoft Journal. I have yet to read them.
The purpose of PBA is to enable IT architects to make better decisions faster. The presenters positioned PBA as a plug-in to other methodologies, such as the one we use (TOGAF), but not as a replacement. They argued that many of the current methods promote repeatable activity (when to make a decision, who makes a decision, how to document a decision, etc.) but not better decision making. They observed that if you don’t ask the right questions you won’t get the answers you need. The essence of PBA is a method for asking those right questions. An interesting perspective; one that might help with succession planning.
Overall, I was pleasantly surprised that the sessions focused on higher-level problems rather than detailed product issues; MS-SQL was only mentioned once or twice.
After the conference, I took a stroll down Bellevue Avenue. With all the lights it looked like a Christmas Wonderland.