How code.red helped a global bank deliver an order-of-magnitude performance testing improvement for a regulatory compliance delivery
code.red was engaged to help one of the world’s largest banks meet a major milestone necessary to ensure compliance with the Volcker Rule, a key part of the Dodd-Frank Act.
The client had concerns that the system which calculated the required metrics was unstable, and that current testing was inadequate and not automated. As a result:
- Test processes were taking too long – almost 24 hours to run just four processes. This led to significant idle time for developers who needed the results to proceed.
- Poor testing meant the client was receiving a large number of errors and the high level of rework was creating extended delays.
- Issues with the environment meant the client was not able to test the code as often as they would have liked.
- True DevOps was impossible; a failure to automate testing is a key (and common) barrier to continuous delivery.
We reviewed the client’s development practices and found a number of areas of poor practice, with the absence of performance testing a particular concern. The client had no automated, repeatable mechanism for creating a test environment.
We were asked to build an automated performance test environment and to investigate whether performance could be improved by using another technology.
code.red engineers were embedded into the delivery team. The first challenge was to introduce automated testing and then investigate how performance could be improved by using different technologies:
Introducing Repeatable Automated Performance Testing
The team created an automated non-functional test environment and:
- Coached the client’s developers in Continuous Integration and introduced performance testing into the build pipeline.
- Compared performance across different test runs, allowing the impact of changes to be assessed correctly.
- Sped up the testing process and enabled more tests to be run through automation.
The resulting solution meant that every time a developer checked in a piece of code, the necessary resources were requisitioned and a performance test was run in a clean environment against the current baseline.
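The baseline comparison at the heart of such a pipeline can be sketched as follows. This is an illustrative example only, not the client's actual harness: the `PerfGate` class, the baseline file name, and the 10% regression tolerance are all assumptions.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Illustrative performance gate: time the process under test, compare the
// result against a stored baseline, and fail the CI build on a regression.
public class PerfGate {
    static final double TOLERANCE = 1.10; // allow up to a 10% slowdown (assumed threshold)

    public static void main(String[] args) throws IOException {
        Path baselineFile = Path.of("baseline-millis.txt");

        long start = System.nanoTime();
        runProcessUnderTest();
        long elapsedMillis = (System.nanoTime() - start) / 1_000_000;

        if (Files.exists(baselineFile)) {
            long baseline = Long.parseLong(Files.readString(baselineFile).trim());
            if (elapsedMillis > baseline * TOLERANCE) {
                System.err.printf("Regression: %d ms vs baseline %d ms%n", elapsedMillis, baseline);
                System.exit(1); // non-zero exit fails the build
            }
        }
        // Record the latest run as the baseline for the next comparison.
        Files.writeString(baselineFile, Long.toString(elapsedMillis));
        System.out.printf("OK: %d ms%n", elapsedMillis);
    }

    static void runProcessUnderTest() {
        // Placeholder for the metrics-calculation process being tested.
    }
}
```

In a real pipeline the build server would invoke a gate like this on every check-in, with the baseline stored alongside the branch so run-to-run comparisons stay meaningful.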
Pinpointing The Best Tool for the Job
After successfully creating the test environment, our team worked on producing further performance improvements by taking one of the four processes and re-writing it in another technology.
The original system was built in Java running on a WebSphere Application Server (WAS), and this was tested against the same process rewritten in both standalone Java and PL/SQL.
These tests were run with the expectation of further efficiency improvements. Most development teams are constrained by their organisation and only allowed to tackle problems with a restricted set of tools and technologies – but these tools are rarely the best choices. With experience using a wide variety of technologies, we were confident we could deliver further improvements.
The two parts of the project both significantly decreased test time. The creation of the automated test environment brought testing down from 10 hours to 30 minutes, a 20× speed-up.
In the second phase of the project, further gains were realised. By rewriting in PL/SQL, we achieved an improved time of just 70 seconds. Against the original 10 hours (roughly 36,000 seconds), that is an improvement of over 500×, more than two orders of magnitude.
| Technology | Original Time | Improved Time |
|---|---|---|
| Java running in WAS (original) | ~10 hours | ~30 mins |
| Standalone Java | ~30 mins | ~9 mins |
| PL/SQL | ~30 mins | ~70 seconds |
This level of improvement is typical of projects where the existing test environment is inefficient and ineffective.
The overall benefits achieved included:
- An improvement of more than two orders of magnitude in test time, from 10 hours to 70 seconds.
- Enabled the client to meet their regulatory milestone and ensure compliance
- Removed a key barrier to DevOps, which is impossible without automated testing.
- Showcased best practice for building efficient test environments, equipping the client to build more efficient test processes in the future.