Service-oriented architecture (SOA), Web-oriented architecture (WOA), Web services, Web application programming interfaces (APIs)–why are we so concerned with these things? Why do we keep defining new ways to describe interoperability between disparate and like computing systems? Why do we need to create standards bodies such as the W3C or ACORD? The simple answer is because we need to ensure systems can interact in a reasonable and predictable manner. But the underlying reason they don't play together nicely is that information systems have become too complex. That complexity seems to be increasing exponentially and soon may reach a point where only very large organizations with very large IT budgets and substantial staffs can even hope to cope with it.

Complexity has increased at the application level, administration level, infrastructure level, and programming level. As business applications become more feature rich, they also become more difficult to use. Windows Vista and Office 2007 have frustrated and disappointed thousands of users. Many organizations have yet to upgrade to these–now no longer new–software products.

Applications that should be based on standards (such as Web browsers) refuse to adopt standards in the same way. The World Wide Web Consortium (W3C) publishes standards every vendor should use for Web interactions. Why is it, then, that when I design a Web application I know in advance it will exhibit different behavior in Internet Explorer (IE) 6, IE 7, and Firefox (which doesn't seem to change a lot between versions)?

I recently discovered Safari installed on my personal Vista machine–apparently installed because I wasn't paying attention when I upgraded my iTunes software. I really don't want to start supporting a fourth or fifth Web browser. Applications designed to interact with each other often interact in unexpected and anomalous ways. And unexpected behavior is the quickest and easiest route I know to losing end-user acceptance of any new product.

It's All About Productivity

Software applications whose intended purpose is to increase business productivity should do that. Green screens were (and probably still are) adequate for data entry and data retrieval. That does not mean we should return to a world of dumb terminals or terminal emulators, but it does mean we should not make the process more complex or difficult.

I am pounding this article out on the latest and greatest office productivity suite available, but I am not using any features I didn't have available in WordPerfect 5.1 running on a 486 DOS box.

Then when I am finished, I need to save it back to a previous version, so my editor can read it. Don't get me wrong. I love all the features available to me with this office suite of software, but I doubt I really need (or will ever use) all those bells and whistles.

Software Administration

Administration of software systems also has increased in difficulty. With more services available from a messaging system or collaboration system or policy system, the administrator of that system has greater responsibilities. Even when the intricacies of a particular system are learned and understood, the daunting task of integrating with everything else can be overwhelming. I have seen organizations with 500 users and an IT staff of a half-dozen individuals, and these staff members need to maintain all the desktops; manage all the infrastructure; support, manage, and maintain multiple line-of-business (LOB) systems; and manage multiple database systems, all while they are tasked with bringing new software and systems online. It simply is becoming too much. Nothing gets done correctly because the people managing multiple systems probably aren't even certified in them. There never is enough time to provide adequate oversight of existing systems. Small wonder Software as a Service and hosted systems are getting so much attention these days.

Coding to Complex Systems

When I was serious about writing code (back in the C/C++ days), it was possible to do almost anything with a system if you understood the basics of processor technology and the operating system you were working on. If a published API for an application or operating system didn't provide the functionality you needed, it was easy enough to drop down to the system level and do what you needed to do.

Web services have changed all that. APIs don't always work the way you expect them to. Programmers spend half their time doing Google searches and reading blogs to determine how to "really" interact with an API. Even then, a lot of it is just hit and miss. Complex systems throw so many variables into the mix that software manufacturers' code samples are virtually meaningless in the real world.

I love to hook up a vendor demo application to its demo database and watch the magic. Then you try to implement that same magic in the real world, and you discover how many shortcuts the vendor used in its demo. I recently was configuring an application to perform some interaction with a customer's LDAP system. Even though I was following the vendor instructions word for word, the interaction was not happening. After a few hours of soul (and Web) searching, I discovered an undocumented trick (or missing step) that caused the application to start working correctly. Now, I wonder whether my last trick was really necessary, or was it only necessary because of some other problem with the systems I was working on? I wonder whether this was a step that just was left out of the vendor's instructions because the "missing step" never is missing in the vendor's test environments. More than likely, we never will know the answer. Far too often we get a system working, and while we think we have documented everything, we simply accept the fact it is working and leave it at that.
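
One way to at least isolate that kind of problem is a standalone check that binds to the customer's directory with no vendor application in the loop: if the bind works, the directory side is healthy; if it fails, at least the failure is cleanly separated from the product you are configuring. Below is a minimal sketch of such a check in Java using JNDI; the host name, service account, and password are placeholders, not anything from an actual engagement.

    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.directory.DirContext;
    import javax.naming.directory.InitialDirContext;

    public class LdapProbe {
        public static void main(String[] args) throws Exception {
            // Connection settings are hypothetical placeholders.
            Hashtable<String, String> env = new Hashtable<>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
            env.put(Context.PROVIDER_URL, "ldap://directory.example.com:389");
            env.put(Context.SECURITY_AUTHENTICATION, "simple");
            env.put(Context.SECURITY_PRINCIPAL, "cn=svc-app,ou=service,dc=example,dc=com");
            env.put(Context.SECURITY_CREDENTIALS, "changeit");

            // A successful simple bind proves connectivity and credentials;
            // anything the vendor application does beyond that is its own problem.
            DirContext ctx = new InitialDirContext(env);
            System.out.println("Bound successfully to " + env.get(Context.PROVIDER_URL));
            ctx.close();
        }
    }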

Standards

I was involved in a very successful project that included multiple software vendors providing parts of a complex solution. The system was designed using what appeared to be best practices for a service-oriented architecture that used Web services for interoperability. Problems arose when every vendor's interpretation of how it would expose and consume Web services was different.

The project involved integrating fat-client agent systems with Web-enabled agency and rating systems as well as with iSeries back-end policy administration systems. There were at least two components of the system that essentially were black-box systems.

One system was designed to input ABC and output XYZ, but we had no control over how the output was delivered. The vendor apparently was not concerned at all with SOA. If we needed an HTTP POST for a particular payload, we had to create that post outside of the system.
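
To give a sense of what "outside of the system" looked like, here is a minimal sketch of that kind of glue code in Java: read the payload the black box drops off and POST it to the endpoint that actually needs it. The file name, endpoint URL, and content type are invented for illustration; they are not the project's actual values.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.file.Files;
    import java.nio.file.Path;

    public class PayloadForwarder {
        public static void main(String[] args) throws Exception {
            // Pick up whatever the black-box system wrote out (hypothetical path).
            String payload = Files.readString(Path.of("outbound/policy-payload.xml"));

            // Hand-build the HTTP POST the upstream system never provided.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://rating.example.com/services/quote"))
                    .header("Content-Type", "text/xml")
                    .POST(HttpRequest.BodyPublishers.ofString(payload))
                    .build();

            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println("Endpoint answered with HTTP " + response.statusCode());
        }
    }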

Another subsystem also was black-box-like. It did what it was supposed to do, but configuring it to do so was a complex and daunting task that could be accomplished only by an individual who had deep experience with that subsystem. Black boxes do not make good software design. During the development process, it never was readily apparent exactly where a breakdown was occurring even though we had extensive logging built into the environment.

The complexity of the system and the lack of a common method of interoperability made troubleshooting difficult. This was further compounded by our inability to see into the black-box systems. Additionally, any change in the environment, from firewall settings to DNS configurations to LDAP changes, often had unpredicted results.

Ideally, we would have mapped out a full integration scheme based only upon net service endpoints. Each vendor then would have been responsible for working only to those endpoints. Unfortunately, we were working in a real world (as opposed to a perfect SOA world) environment.
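
Roughly, that would have meant publishing one contract per endpoint up front and holding every vendor to it and nothing else. The sketch below expresses the idea as a Java interface; the service name, operation, and fields are invented for illustration, and in practice the contract would more likely have been captured as a WSDL document.

    import java.math.BigDecimal;

    // A hypothetical contract-first endpoint definition. Each vendor codes to the
    // operation; how it is implemented behind the endpoint is that vendor's concern.
    public interface RatingEndpoint {

        record QuoteRequest(String policyNumber, String lineOfBusiness) { }

        record QuoteResponse(String policyNumber, BigDecimal premium) { }

        QuoteResponse rateQuote(QuoteRequest request);
    }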

SOA is a powerful methodology, but like all systems, it is only as strong as its weakest link. If not everyone is going to play in an SOA environment, everyone else is going to suffer and needs to pick up the slack. In this same project, we were dealing with XML payloads that were based on ACORD XML but were not exactly ACORD. That meant someone had to spend a significant effort to map the XML to "real" data. That effort did not put the project at risk, but it raises the question of why we even have standards if we aren't going to use them the way they are intended.
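
Mapping "almost ACORD" to real data is not difficult work, but someone has to sit down and do it field by field. The sketch below shows the flavor of that chore in Java using XPath; the vendor file name and element names are invented for illustration and are not actual ACORD tags.

    import java.io.File;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.xpath.XPath;
    import javax.xml.xpath.XPathFactory;
    import org.w3c.dom.Document;

    public class AlmostAcordMapper {
        public static void main(String[] args) throws Exception {
            // Parse the vendor's not-quite-ACORD payload (hypothetical file).
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new File("vendor-quote.xml"));
            XPath xpath = XPathFactory.newInstance().newXPath();

            // Each field has to be located under the vendor's own element names
            // and relabeled into the structure the rest of the project expects.
            String policyNumber = xpath.evaluate("/Quote/PolNbr/text()", doc);
            String premium = xpath.evaluate("/Quote/PremAmt/text()", doc);

            System.out.println("policyNumber=" + policyNumber);
            System.out.println("premium=" + premium);
        }
    }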

The Rest of the Story

The real problem with complex systems is we have only an illusion of control. Earlier I discussed a little software glitch I "solved" by changing a setting in the system (it actually was a system account permission I changed). That particular fix was on a 64-bit Intel system on a physical server. I do not ever remember needing to set that same permission level before. Yet I am not 100 percent certain I was working in an identical environment. Maybe the last time I implemented that feature was in a 32-bit virtual machine running on a 64-bit physical server (or a multitude of other possibilities). In fact, I have no idea why it now works.

Which brings me to the root of the whole thing. Complex systems can exhibit behavior that is not obvious from analysis or understanding of the behavior of the component subsystems. There is a field of science that studies complex systems. Generally that science is concerned with systems such as climate or economies or environments. Complex systems often behave in a nonlinear fashion. There is not necessarily a direct progression from A to B to C. Nonlinear behavior definitely is not a desirable property of an information technology system. In fact, consistently demonstrable nonlinear behavior should be grounds for rejection of a business software system. On the other hand, it might provide the basis for creating real artificial intelligence. Nonlinear thinking is one hallmark of intelligence.

Do You Know Where Your Bits Have Been?

I have seen too many instances where software behaves in unexpected ways, throws unexpected errors, and produces unpredicted results. You can say this is just the result of poorly designed systems and badly written code. I think it is more likely the result of systems with millions of lines of interacting code from the processor to the operating system to the application to the I/O devices to the transport layer and so on. Not all of that code was written as a cohesive whole, nor was it all meant specifically to interact in a consistent way. How many "fixes" are there in an operating system that appear to make it work correctly but may be nothing more than a lucky kludge? Today, we are dealing with extremely complex computing systems with the promise they will get even more complex. As that complexity increases, the reasonable expectation of predictable results decreases. It may be about 0s and 1s, but unless you can follow that train all the way back to its origin, you never can have 100 percent confidence in the result. Should this be a cause for alarm? Probably not right now. Systems likely are predictable 99.999 percent of the time. It is that other 1/1000 of a percent that bothers me. TD

Please address comments, complaints, and suggestions to the author at [email protected].
