The Software Selection Process - Part I


We first look at why the selection process is a high-risk exercise and how the evolution of a typical software package is the cause of many of the difficulties. The first part of the article concludes with three new developments that will lead to cheaper and more efficient systems.

In the second part of the article we consider the problems surrounding software selection and suggest a few simple but little-used techniques to avoid buying a package that does not meet your expectations and costs a fortune to implement.

For the last 15 years we have worked with software vendors of large complex systems and have experienced the problem from their side of the divide. We know why software companies have limitations, why implementations are so long and costly and why many vendors cannot fix the problems with their software. Most importantly, we know the true cost to the customer when the vendor cannot deliver what was expected: the financial cost, the difficulties for the implementation team and the irrecoverable loss of goodwill with the vendor, possibly leading to a long-term distrustful relationship.

The selection of a software package for your business can be a difficult and time-consuming task. There are three reasons for this:

  • software applications are made up of complex sets of hidden logic; very few people within the vendor's organisation understand all of the parts, let alone the whole picture;
  • software is like an iceberg: pre-sale you only see a fraction of it, and without hands-on usage it is difficult to fully understand how the application will match your requirements;
  • lastly, customers may not have access to the technical skills required to thoroughly examine the software architecture and build quality, so they focus on the visible parts of the system, which unfortunately are normally brochures and demonstrations designed for sales and marketing purposes (and sometimes a far cry from the system supplied for implementation).

For many customers faced with evaluating how a package can perform a large number of business processes (and with most vendors able to think of any number of reasons why you should not have a 'hands-on' trial of the application), the degree of fit is only discovered after you have signed the contract and are implementing the system. In addition, it is quite common to find that tasks you have been told are straightforward, e.g. reports, require considerably more effort than you budgeted for. It is little wonder that implementations are invariably lengthier and more costly than planned whilst delivering fewer benefits than anticipated. However, the cost to the business of the selection and implementation is dwarfed by the cost of work-arounds (normally using MS Excel) that grow like wildfire to make up for any deficiencies in the package or implementation. The whole costly process is then repeated every 5-8 years in an attempt to find a better solution.

So let’s look at why most software vendors get into the difficult position of having to sell applications which they cannot allow you to test drive and which they know may be expensive and difficult to implement.

The evolution of software packages


If the marketing material reflected the capability of the software package, most packages would appear to provide similar and comprehensive functionality. During the implementation many customers discover, at their leisure, that this is not the case.

To understand the capabilities of a software application let’s look at how it was built and whether it is a sound construction providing the capability and flexibility customers require.

Software is a young industry and, unlike architecture or other established practices, it does not have best-practice accreditations to standardise and ensure quality. However, just because software is not a physical object does not mean it should escape the same inspection process.

To illustrate some of the problems let’s look at a typical vendor life cycle:

  • the software development company is formed 6-10+ years ago;
  • the development company has good knowledge of a vertical industry and builds an application to address their customer’s requirements;
  • the product increases its functionality in response to new customer requirements, sometimes adding complexity as it becomes a solution for every new function or variation required by an increasing number of customers;
  • the product evolves, adding generic functionality such as configurable fields, workflow etc.;
  • the product adapts to new technologies by adding new layers of code onto the original core application.

This seems reasonable enough, so what could be the problem? Yet given most people’s experience of implementing packages, this approach does not produce a package that is straightforward to implement and maintain.

The founders of our typical company knew the industry and saw an opportunity to improve efficiency; they did what they knew best, solving specific business needs. Once they had completed their early sales, new clients asked for more generic functionality such as an audit trail or document management. So over the next few years elements of the common functionality required by all applications were added in response to customer demand.

There are a number of problems with this approach:

  1. the vendor may not know the best practices for workflow design, reporting tools or document management; their core knowledge is in the vertical industry they address.
  2. during the early development of their package customers tended to be smaller companies with less demanding requirements than the larger companies that come along later. So the original code base is overlaid to meet more demanding requirements and can end up requiring constant rework for each new customer. An example of this was an American portfolio accounting package that was adding FX capability after it had built and implemented the investment transaction functionality.
  3. vendors cannot easily change the architecture of their product: new features are added, some like sticky plasters, some like bolt-ons to the core product. The software becomes less flexible, more difficult to understand and maintain, and often impossible to move forward to take advantage of new technology capabilities such as Cloud deployment. One way to identify whether this has happened with your vendor is to look at the number of versions of the software the vendor is supporting, and at the length and difficulty of managing the ‘bug’ queue or getting modifications and enhancements added to the product. If you are measuring progress in years and months rather than weeks, if not days, your business processes are suffering from your vendor’s service.
  4. the typical revenue profile for a software vendor is one-third licence income, one-third maintenance income and one-third services income. A major element of the services income is generated from assisting customers to perform tasks that they should be able to complete for themselves, for example creating new fields, dashboards or reports. A cynic may think that this revenue blunts the vendor’s appetite to provide core components that the customer can use without their assistance.
  5. finally, for vendors with rambling, Gothic-like software structures that are too big to be efficiently maintained, let alone redeveloped, the pace of technological change has left them behind. They are left defending their businesses, managing unhappy customers and selling to new customers on the strength of their market presence and marketing capability.

These systems were built before technologies such as .Net or Cloud technology existed and are often poorly designed packages built on legacy code that cannot be efficiently maintained or developed to take advantage of new capabilities. So if you have had enough of CD installs, poor maintenance caused by a myriad of versions, multiple patches/fixes and substantial upgrade costs, what should you be looking for?

The future for software packages

There are three new developments that provide a way forward towards efficient, cost-effective systems and an end to the long struggle between business and IT. One development is Cloud computing, which is enabling the second new capability, continuous maintenance. The third development is to use a new breed of software that provides all of the common core functionality as part of the original design, so that business functionality leverages these capabilities. Taking each in turn:

Cloud Computing: Cloud technology transforms IT infrastructure from a costly, high-risk business management problem into a low-cost utility. The cost savings from moving from an in-house, single-tenanted installation to a multi-tenanted managed service should be significant, and it may not be too long before finance directors are demanding these economies are made. Many people will also see a reduction in operational risk as a highly skilled data centre takes over all housekeeping tasks such as operating system upgrades. There are further operational advantages such as automatic scalability, guaranteed up-time, disaster recovery etc.

Several industries see the security of their data as an issue with Cloud computing, but these concerns have largely been addressed, and Cloud computing now represents an opportunity to make a major saving in IT cost. For some, Cloud computing can also offer a competitive advantage through global deployment and cost savings.

Continuous maintenance: Cloud deployment provides an invaluable benefit that enables a step change in the quality of software. With in-house client-server installations, software vendors depend on customers reporting bugs to them, preferably with a reproduction path. The vendor reproduces the bug, provides a fix and adds it to a periodic version release; whether the customer can take that release or the vendor has to patch multiple versions adds to this time-consuming and fragile process. Nor does it address the large number of bugs that customers never report because the application ‘seems to work’.

In contrast, a Cloud-based application immediately captures all bugs, and the vendor has access to the diagnostics they have devised. Bugs can be fixed as they occur, which leads to one of the most interesting benefits becoming an integral part of Cloud computing: the high frequency of program upgrades, or continuous maintenance. With Cloud-hosted systems, version upgrades can be issued at any time, and there is a strong argument that small, frequent version upgrades keep software in better health than major periodic version releases. There is good evidence for this: continuous maintenance is now provided by Microsoft for its flagship Office 365 applications.
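To make the contrast concrete, here is a minimal sketch of the idea, not any vendor's actual code: every name below is hypothetical, and a real Cloud application would send the record to a telemetry service rather than an in-memory list. The point is that the error and its diagnostics are captured at the moment of failure, without waiting for a customer to notice and report it.

```python
import sys
import traceback
from datetime import datetime, timezone

# Stand-in for the vendor's central diagnostics service (hypothetical).
DIAGNOSTICS_STORE = []

def report_error(exc_type, exc_value, exc_tb):
    """Record an unhandled error with full diagnostics at the point of failure."""
    DIAGNOSTICS_STORE.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "error": exc_type.__name__,
        "message": str(exc_value),
        "stack": "".join(traceback.format_exception(exc_type, exc_value, exc_tb)),
    })

def handle_request(payload):
    """A request handler wrapped so every failure is reported, never lost."""
    try:
        return payload["amount"] / payload["rate"]
    except Exception:
        report_error(*sys.exc_info())  # capture before the error propagates
        raise
```

Because every tenant runs the same hosted version, a fix for a bug captured this way can reach all customers in the next continuous release, rather than being patched separately across many installed versions.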

New software architecture: this new software architecture is simple and transformational. We have just been building software the wrong way round! Rather than build business functionality and then add the core functions (which inevitably leads to a compromised architecture), the core functions should be built first and the business functionality added so that it can leverage the core components. The core components we are talking about include:

  • Auditing
  • Configurability
  • Document Management
  • Permissions
  • Querying/Reporting
  • Relationship Management
  • Workflow

Some software developers have gone one stage further and provide a comprehensive configuration layer over the core components, allowing business requirements to be built quickly and effectively.
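One way to picture the core-components-first approach is as business objects composed from shared core services. The sketch below is purely illustrative (all class and field names are hypothetical, and only two of the listed components, auditing and permissions, are shown): the business object gains these capabilities automatically because it is built on top of them, rather than having them bolted on later.

```python
from datetime import datetime, timezone

class AuditTrail:
    """Core component: records every change, shared by all business objects."""
    def __init__(self):
        self.entries = []
    def record(self, obj, field, old, new):
        self.entries.append({
            "when": datetime.now(timezone.utc).isoformat(),
            "object": type(obj).__name__,
            "field": field, "old": old, "new": new,
        })

class Permissions:
    """Core component: one permission model for the whole application."""
    def __init__(self, writable_roles):
        self.writable_roles = set(writable_roles)
    def can_write(self, role):
        return role in self.writable_roles

class BusinessObject:
    """Business functionality built on top of the core components."""
    def __init__(self, audit, permissions, **fields):
        self._audit, self._perms = audit, permissions
        self._fields = dict(fields)
    def set(self, role, field, value):
        if not self._perms.can_write(role):
            raise PermissionError(f"role '{role}' may not modify {field}")
        old = self._fields.get(field)
        self._fields[field] = value
        self._audit.record(self, field, old, value)  # audited for free

# Every business object gains auditing and permissions without extra code.
audit = AuditTrail()
perms = Permissions(writable_roles={"manager"})
invoice = BusinessObject(audit, perms, amount=100)
invoice.set("manager", "amount", 120)
```

The configuration layer mentioned above would then be a way of declaring new business objects and fields against these components, rather than writing fresh code for each one.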

The next part of this article looks at the selection and implementation process. We look at how to evaluate the long-term operational costs and how to try to ensure that the core components will complement the business functionality. There is no point in having a system if you cannot report on the data or see who made changes to it. You should be on notice not to believe any vendor who says “of course we can do that”. Let’s proceed using the tried and tested adage of ‘believing nothing of what you hear and half of what you see’!