Software solution implementation: Achieve high performance and data integrity

October 18, 2018 Article 3 min read
Authors:
Dennis Bagley

How do you determine if a software application is ready for prime time? These best practices can ensure a successful launch.

Picture this: You’ve just rolled out a new enterprise system, but development and deployment were plagued with issues. The project ran behind schedule, the software solution was underdeveloped, testing was insufficient, and critical stakeholders voiced reservations about overall functionality and performance. Despite these concerns, you released the solution to production to avoid further delays.

This is an extreme example, but unfortunately, it’s not an uncommon one. Still, knowing that nothing is ever “perfect,” how do you determine if a software application is ready for “prime time?”

Project documentation

Start by reviewing project and testing documentation. This should provide an initial indication of the solution’s robustness, performance, and data integrity. The issue log is another good source of performance data, as it describes defects and tracks issue resolutions that involve both the integrator and user. This information should be reflected within project status and management reports.

Unfortunately, in many instances, companies may not adequately develop baseline project documentation and reports, perform robust testing, or log issues and their associated resolutions. Under these circumstances, it may be difficult to assess the system’s performance.

Solution performance

A well-designed implementation will include standard reports within the solution to check performance and data integrity. However, some applications lack robust reporting capabilities. In these cases, organizations may only discover issues through the solution’s poor performance, feedback from stakeholders or customers, or errors in the solution’s output. Before undertaking corrective actions, organizations should establish a baseline of the solution’s current performance so they can measure the improvement attributable to specific efforts.
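
As a simple illustration of that principle, here’s a minimal Python sketch of comparing a later measurement against a documented baseline; the error-rate values are hypothetical.

```python
# A minimal sketch of baselining, with hypothetical error-rate values.
baseline_error_rate = 0.065  # measured before corrective actions
current_error_rate = 0.020   # measured after a round of fixes

# Improvement is only meaningful relative to the documented baseline.
absolute_change = baseline_error_rate - current_error_rate
relative_change = absolute_change / baseline_error_rate
print(f"Error rate improved from {baseline_error_rate:.1%} "
      f"to {current_error_rate:.1%} ({relative_change:.0%} relative reduction).")
```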

How do you determine your solution’s current level of performance or data integrity?

Solution performance and data integrity testing

Companies often hire consultants to help evaluate system performance and establish a performance baseline. Take Plante Moran, for example. In addition to reviewing project documentation and reports, we leverage a variety of techniques and tools to assess system performance and data integrity. For instance, we use business intelligence (BI) tools to review and segment large volumes of data into smaller, uniquely defined categories, sample the accuracy of each segment, and apply statistical analysis to derive the application’s overall data integrity.
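
As a rough illustration of that workflow, here’s a minimal Python sketch using pandas as a stand-in for a BI tool; the column names, segment labels, and error rates are hypothetical, not figures from an actual engagement.

```python
import pandas as pd

# Hypothetical extract of transaction data (in practice, hundreds of
# thousands of rows exported from the application under review).
transactions = pd.DataFrame({
    "txn_id": range(1, 11),
    "txn_type": ["simple_renewal"] * 6 + ["prorated_upgrade"] * 3 + ["multi_discount"],
})

# 1. Segment the population into uniquely defined categories.
segment_sizes = transactions.groupby("txn_type").size()

# 2. Error rates observed when manually verifying a sample from each
#    segment (illustrative values only).
sample_error_rates = pd.Series({
    "simple_renewal": 0.00,
    "prorated_upgrade": 0.10,
    "multi_discount": 0.30,
})

# 3. Weight each segment's sampled error rate by its population share
#    to derive the application's overall data-integrity estimate.
weights = segment_sizes / segment_sizes.sum()
overall_error = (weights * sample_error_rates).sum()
print(f"Estimated overall error rate: {overall_error:.1%}")
```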

We recently conducted such a review for an association that had implemented a membership management solution. The solution had numerous customizations and implementation issues. Upon release, the association realized the application wasn’t generating accurate financial transactions. They engaged us to determine the potential error within their system. We conducted the following analysis:

1. High-level filters. We started by reviewing all significant financial transactions produced by the solution, which totaled hundreds of thousands. We extracted the transaction sales data and uploaded it into a BI tool so we could segment the data easily, then applied high-level filters to identify and segment sales transactions with different error rates, ranging from low to high. In segmenting the data, we assumed that the error rate would increase as the sales transactions became more complex. We ultimately segmented the sales transactions into approximately 10 categories with varying error rates.
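
To make the filtering concrete, here’s a sketch of scoring transaction complexity with simple boolean filters; the attribute names (discount_code, manual_adjustment, line_items) are hypothetical stand-ins for the actual fields.

```python
import pandas as pd

# Hypothetical transactions with attributes that drive complexity.
txns = pd.DataFrame({
    "txn_id": [1, 2, 3, 4],
    "discount_code": [None, "EARLY10", None, "EARLY10"],
    "manual_adjustment": [False, False, True, True],
    "line_items": [1, 1, 3, 5],
})

# Each high-level filter flags one source of complexity; summing the
# flags orders transactions from simple to complex, approximating the
# assumed error-rate gradient.
complexity = (
    txns["discount_code"].notna().astype(int)
    + txns["manual_adjustment"].astype(int)
    + (txns["line_items"] > 1).astype(int)
)
txns["segment"] = complexity.map(
    {0: "simple", 1: "moderate", 2: "complex", 3: "highly complex"}
)
print(txns[["txn_id", "segment"]])
```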

2. Data sampling. Once the data was segmented, we sampled five to 10 transactions from each category. We increased the sampling for categories that exhibited a higher error rate to gain a better understanding of the errors and improve our sample error estimates (a sketch of this stratified draw follows the findings below). From our sampling, we determined that errors stemmed from:

  • Data migration miscues
  • Programming or logic defects
  • Inappropriate manual adjustments
  • Inappropriate use of discount codes

Many of the data migration and programming/logic issues were identified and resolved during the first several months after cutover. Defects associated with manual adjustments and discount codes have continued and increased with transaction volume.
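
Here’s the stratified-draw sketch referenced above, in Python; the segment labels, expected error rates, and the five-versus-10 cutoff are illustrative assumptions.

```python
import pandas as pd

# Hypothetical segmented population (labels and rates are illustrative).
txns = pd.DataFrame({
    "txn_id": range(1, 41),
    "segment": ["simple"] * 30 + ["complex"] * 10,
})
expected_rates = {"simple": 0.02, "complex": 0.25}

def sample_size(expected_error: float) -> int:
    # Draw five transactions from low-error segments and 10 from segments
    # where errors are expected more often, sharpening those estimates.
    return 10 if expected_error >= 0.10 else 5

samples = pd.concat(
    group.sample(n=min(sample_size(expected_rates[name]), len(group)),
                 random_state=42)
    for name, group in txns.groupby("segment")
)
print(samples["segment"].value_counts())
```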

3. Population error estimates. We used statistical methods and the sampled error rates to estimate the probable error of each category. From these estimates, we determined that the current solution had a probable error rate of 2 percent beyond the errors the client had identified prior to the assessment. This error rate was considered acceptable, though the system still required further improvement.
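
For illustration, a population estimate of this kind can be sketched with standard stratified-sampling formulas; the weights, sample sizes, and observed rates below are hypothetical, not the engagement’s actual figures.

```python
import math

# Per segment: (population share, sample size, observed sample error rate).
strata = [
    (0.80, 5, 0.00),   # simple transactions
    (0.15, 10, 0.10),  # moderately complex
    (0.05, 10, 0.20),  # highly complex
]

# Stratified point estimate: weight each segment's sample error rate
# by its share of the transaction population.
p_hat = sum(w * p for w, _, p in strata)

# Standard error of the stratified estimate (finite-population
# correction omitted for simplicity).
se = math.sqrt(sum(w**2 * p * (1 - p) / n for w, n, p in strata))

print(f"Estimated error rate: {p_hat:.1%} +/- {1.96 * se:.1%} (95% CI)")
```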

Our assessment recommended that the client improve project governance and in-process checks, define and document how to execute manual adjustments and discounts, and continue to measure data integrity to gauge improvements. It confirmed that the client’s efforts were improving data integrity over time and provided senior management with assurance concerning the solution’s performance.

In conclusion

Project documentation, solution performance, and data integrity testing are critical to a successful software implementation. If you’re implementing or have implemented a critical solution with data integrity issues, give us a call.
