The test phase of software development is becoming increasingly difficult. Sandboxes, containers and data virtualization may be the answer to software testing troubles.
Software is not only physically and geographically distributed, but it must also handle cases that cross organizational boundaries, such as meshing public and private cloud software. This is particularly true where data access is involved: Data from social media and the public cloud must now feed the same analytics as traditional data warehousing and business intelligence apps.
The best test environment exactly mirrors the runtime environment for the software. How does one do this with a remote service, or a piece of software designed to abstract away the details of physical hardware?
It used to be difficult to isolate the software under test from the runtime environment when testing took place on the same machine. Testing on the same hardware that run-the-business apps use vastly increased the possibility of downtime in the production infrastructure.
Now, however, virtualization technology allows both hardware and software isolation. While the tester plays in a sandbox that looks like the runtime environment -- and in fact is a piece of the runtime environment -- the run-the-business apps safely proceed as if the sandbox wasn't there.
With distributed databases, the architecture under test is more complicated: The app, the database(s) and a portion of the database's data store all run on each piece of hardware. However, the approach can be the same: Create a sandbox on each machine containing a test copy of the app, database and data store, and then run the tests as if the sandbox controlled the whole machine.
The sandbox technique for software testing is about as close to mimicking your runtime environment as you can get. The sandbox uses only a portion of each machine's resources, but the actual runtime of the app often uses only a similar portion.
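The resource bookkeeping behind that claim can be sketched in a few lines. This is a minimal illustration, not a real sandboxing tool; the `Machine` class and its capacity numbers are hypothetical.

```python
# Minimal sketch: carve a sandbox out of one machine's resources so the
# run-the-business apps keep the remainder. All names are hypothetical.

class Machine:
    def __init__(self, cpus: int, mem_gb: int):
        self.cpus = cpus
        self.mem_gb = mem_gb
        self.sandboxes = []

    def create_sandbox(self, cpus: int, mem_gb: int) -> dict:
        # Refuse to starve production: a sandbox may only take spare capacity.
        used_cpus = sum(s["cpus"] for s in self.sandboxes)
        used_mem = sum(s["mem_gb"] for s in self.sandboxes)
        if used_cpus + cpus > self.cpus or used_mem + mem_gb > self.mem_gb:
            raise RuntimeError("sandbox would exceed machine capacity")
        sandbox = {"cpus": cpus, "mem_gb": mem_gb}
        self.sandboxes.append(sandbox)
        return sandbox

machine = Machine(cpus=16, mem_gb=64)
# The test copy of app + database + data store gets a quarter of the box.
sandbox = machine.create_sandbox(cpus=4, mem_gb=16)
```

The same arithmetic is what a real virtualization layer enforces: the sandbox's slice is capped, so production workloads on the other slices proceed as if the sandbox weren't there.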
Containers and app performance
The VM has many virtues, but it runs a guest OS of its own -- one that may differ from the host OS and is not optimized for the underlying hardware. Those factors add a significant amount of performance overhead -- overhead that makes it more difficult to exactly mimic runtime performance during tests.
A container acts like a VM, but it shares the host's OS kernel -- it can only run the same OS -- and it accesses that kernel at a lower level. Thus, at the price of ruling out running Linux on a z/OS kernel, the container allows app performance that closely approximates that of a Linux app running on a Linux OS without virtualization. This software testing technique gives a better simulation of runtime performance.
Most distributed databases run on the same OS as the app. Redesigning the app to use containers and then testing those containers inside sandboxes brings runtime performance benefits as well as a more accurate reflection of the runtime environment.
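As a concrete sketch, assuming Docker as the container runtime, a test harness can launch the containerized app with resource caps that match the sandbox's slice of the machine. The image name and test command below are hypothetical placeholders for your own app.

```python
# Minimal sketch, assuming Docker. Builds (but does not run) a `docker run`
# command that caps the container at a slice of the machine's resources,
# mimicking the sandbox's share of production. Names are hypothetical.

def container_test_command(image: str, test_cmd: str,
                           cpus: float = 2.0, mem: str = "4g") -> list:
    return [
        "docker", "run", "--rm",
        f"--cpus={cpus}",    # cap CPU so run-the-business apps keep theirs
        f"--memory={mem}",   # cap RAM the same way
        image,
        "sh", "-c", test_cmd,
    ]

cmd = container_test_command("myapp-test:latest", "pytest tests/")
# On each machine under test, pass `cmd` to subprocess.run(cmd, check=True).
```

Because the container shares the host kernel, the app inside it runs at close to native speed, so timings gathered this way are a far better proxy for production than timings from a full VM.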
Data virtualization is a special case. Here, the logic is that testing should cover cases where some databases or database/data copies are down -- for example, mirrored storage when one of the two mirrors is unavailable.
Data virtualization software makes this simple: By redefining the database to the data virtualization software, the software tester can switch off some database copies to mimic a hardware failure and ask the app to perform analytics on the remaining copies. In data virtualization, the app doesn't know or care where the data physically resides.
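The idea can be sketched in a few lines of Python. This is an illustration of the routing logic, not a real data virtualization product, and every name in it is hypothetical: the app queries a logical name, and the virtualization layer answers from whichever physical copy is still up.

```python
# Minimal sketch of data virtualization failover for testing.
# All class, method and table names are hypothetical.

class DataVirtualizationLayer:
    def __init__(self):
        self.copies = {}  # logical name -> {copy_id: {"up": bool, "rows": list}}

    def register(self, logical: str, copy_id: str, rows: list):
        self.copies.setdefault(logical, {})[copy_id] = {"up": True, "rows": rows}

    def fail(self, logical: str, copy_id: str):
        # The tester flips a copy off to mimic a hardware failure.
        self.copies[logical][copy_id]["up"] = False

    def query(self, logical: str) -> list:
        # The app neither knows nor cares which physical copy answers.
        for copy in self.copies[logical].values():
            if copy["up"]:
                return copy["rows"]
        raise RuntimeError("no surviving copy of " + logical)

dv = DataVirtualizationLayer()
dv.register("orders", "mirror-a", ["row1", "row2"])
dv.register("orders", "mirror-b", ["row1", "row2"])
dv.fail("orders", "mirror-a")   # simulate one mirror going down
rows = dv.query("orders")       # analytics still run against mirror-b
```

The test then asserts that the app's analytics results are unchanged with one mirror down, and that it degrades gracefully (rather than crashing) when every copy is gone.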
Putting it all into distributed database testing
One way to put this software test architecture together is to write a service that issues test instructions remotely and abstractly, with the user filling in the details of software location and architecture. An agent at each physical location then simulates streams of input from app users, employing containers within sandboxes for the actual software under test. Where appropriate, data virtualization software simulates failures of specific physical IT infrastructure.
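The coordinator/agent split described above can be sketched as follows. In a real deployment the dispatch would be a remote call to agents on separate machines; here the agents are in-process, and all names are hypothetical.

```python
# Minimal sketch of a test coordinator issuing abstract instructions to
# per-location agents. All class and field names are hypothetical.

class Agent:
    """Runs at one physical location; turns abstract instructions into
    simulated user input against the local sandboxed containers."""
    def __init__(self, location: str):
        self.location = location
        self.log = []

    def execute(self, instruction: dict) -> dict:
        # e.g. {"action": "simulate_users", "count": 100}
        self.log.append(instruction)
        return {"location": self.location, "status": "ok", **instruction}

class Coordinator:
    """Issues test instructions abstractly; the user supplies the mapping
    from logical locations to agents (the 'details of software location')."""
    def __init__(self, agents: dict):
        self.agents = agents  # {location: Agent}

    def run_test(self, instruction: dict) -> list:
        return [agent.execute(instruction) for agent in self.agents.values()]

coord = Coordinator({"dc-east": Agent("dc-east"), "dc-west": Agent("dc-west")})
results = coord.run_test({"action": "simulate_users", "count": 100})
```

Each agent would also be the natural place to hook in the data virtualization layer, so a single abstract instruction can both drive simulated user load and knock out a database copy at that location.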
About the author:
Wayne Kernochan is president of Infostructure Associates, an affiliate of Valley View Ventures, which identifies ways for businesses to leverage information for innovation and competitive advantage. He has been an IT industry analyst for 25 years and has focused on key information-related technologies and ways to measure their effectiveness. Email him at [email protected].