That was 10 years ago. Windows has come a long way since those days, and we now see less of the infamous "blue screen of death". Software in general has matured considerably over the last decade. However, we are nowhere close to the level of maturity where high quality in commercial software is taken for granted. Any enterprise-scale software is fraught with bugs requiring a continuous stream of hot-fixes, patches, upgrades and so on. Ask any IT professional in any company about the quality of the software he supports and invariably you will hear complaints. The complaints are likely to be most vociferous when it comes to business applications software.
So why does software quality in general, and business applications software in particular, suck so badly? Why is it that software is not as reliable as (say) bridges? Some would argue that the software industry is still very young: after all, we have been building software for less than 50 years while we have been building bridges for thousands of years. Others would argue that software is far more complex than bridges or cars, and that as complexity increases, so does the probability of defects. Some point to the lack of standardization in how common software elements are built, which leads to defects when those elements are integrated into a single application. Then there is the camp that feels a significant number of software engineers lack proper training and skills, and that the software industry is probably the only one that makes no distinction between technicians and engineers (think of electrician versus electrical engineer, plumber versus civil engineer). In software, anyone who works with code is called a software engineer, and problems arise when technicians try to do the job of engineers.
While all of the above reasons may be valid, I believe the fundamental reason for poor software quality is the way we build it. This is even more true for applications software than for systems software. In spite of all the advances in languages, IDEs, design patterns, open-source libraries, runtimes and standards, building large applications software remains horrendously tedious, labor-intensive and error-prone. I think we need a radical shift in the fundamental mechanics of writing code in a programming language. We need a new vocabulary that lets application developers state their business logic rather than code it. Modeling tools have attempted to address a basic part of this but remain very code- and programming-oriented (e.g. thinking in terms of classes in UML). Model Driven Architecture (MDA) never lived up to the hype; it merely provides a language for describing the high-level problem without offering a real-life solution.
I have no idea what this new vocabulary and mechanics for building software would look like in its final form (although I have a name for it - Yeti), but it seems to me intuitively that it will have the following characteristics:
- Rules would figure in it prominently, and application building would primarily involve describing business rules (a toy illustration of this idea follows this list)
- Application logic would be expressed in a natural language syntax
- Application builders would not be programming (as we know it) in languages such as Java, C++ or JavaScript
- Applications would be able to process incomplete information, employing fuzzy logic and learning
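To make the first two points a little more concrete, here is a deliberately naive sketch (in ordinary Python, which is exactly the kind of thing Yeti would not be) of what it means for business rules, rather than imperative code, to be the primary artifact: the "program" is a list of rule descriptions, each paired with a condition and an action, and a generic engine applies them. Everything in it (the rule texts, the order fields, the engine itself) is hypothetical and purely illustrative, not a claim about how Yeti would actually work.

```python
# Hypothetical illustration only: business logic stated as a list of rules
# (description, condition, action) and run by a generic engine, instead of
# being hand-coded as imperative control flow. All names are made up.

RULES = [
    ("order total over 1000 and customer is gold",
     lambda order: order["total"] > 1000 and order["tier"] == "gold",
     lambda order: order.update(discount=0.10)),
    ("order ships internationally",
     lambda order: order["country"] != "US",
     lambda order: order.update(needs_customs_form=True)),
]

def apply_rules(order):
    """Apply every rule whose condition holds; the rule list *is* the program."""
    for description, condition, action in RULES:
        if condition(order):
            print("applied rule:", description)
            action(order)
    return order

if __name__ == "__main__":
    print(apply_rules({"total": 1500, "tier": "gold", "country": "DE"}))
```

Rule engines and business-rule DSLs along these lines already exist, of course; the point of the sketch is only the shift in emphasis - the human-readable rule statements carry the business intent, and the surrounding code shrinks to plumbing.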
Maybe I'll never see Yeti in my lifetime. Maybe Yeti simply cannot exist given the basic von Neumann model for computers.
Or maybe not.