What is Semantic Testing?
Semantic Testing is the belief that software testing is not idiosyncratic. Testing a web application at Company X is the same as testing a web application at Company Y. The software may differ in form, function, and deployment environment, but how you fundamentally test it is the same. This belief holds true whether you are testing a library, a SaaS, an RPC client/server, a cloud service, a mobile application, or AI software.
Semantic Testing applies systems thinking to develop a software testing specification that prescribes habits and practices to effectively test your software. At the heart of Semantic Testing is the concept that all test cases can be categorized into five test categories, each with a specific mission, objectives, strategies, and tactics. These test categories are:
- Unit Test
- Integration Test
- System Test
- Safety Test
- Experience Test
Why do we need Semantic Testing?
If you have been a software developer long enough, or have worked at multiple organizations, you have probably heard tests referred to as unit test, sanity test, acceptance test, performance test, functional test, integration test, system test, regression test, stress test, security test, component test, black-box test, gray-box test, white-box test, validation test, end-to-end test, verification test, smoke test, scenario test, contract test, intake test, alpha test, beta test, destructive test, accessibility test, concurrent test, usability test, etc.
What the heck is a sanity test? Aren't all tests sanity tests? What is the difference between a functional test and an integration test? Or a system test and an end-to-end test? There does not seem to be consensus on what to call tests between teams, let alone organizations. Even worse, what constitutes a functional test at one organization is referred to as a sanity test at another. We do not have a shared lexicon when we communicate, so how can we define what constitutes an effective test or communicate the value of testing our software?
Language affects the way we think and the decisions we make. Do we really need 27 different ways to refer to our tests? Can we make software testing easy to understand and define a consistent approach to applying software testing best practices and principles? With Semantic Testing we believe we can!
Terminology
Before we proceed, let me introduce some important terms that we will use:
Software Bug
A software bug is an error, flaw, failure, or fault in a system that causes it to produce an incorrect or unexpected result, or to behave in unintended ways.
System Under Test
System Under Test (SUT) refers to a system that is being tested for correct operation.
Test Case
A test case is a set of test inputs, execution conditions, and expected results used to determine whether a system under test satisfies requirements and works correctly.
Collaborator
A collaborator is any system required by the system under test to fulfill its duties. There are three types of collaborators, illustrated in the sketch after this list:
- Fake Collaborator - A fake collaborator is a collaborator whose behavior is altered to simulate the behavior of a real collaborator for testing purposes.
- Virtual Collaborator - A virtual collaborator is a collaborator that behaves like a real collaborator unless its behavior is altered for testing purposes.
- Real Collaborator - A real collaborator is the actual production collaborator, used as-is with no behavior altered for testing purposes.
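To make the distinction concrete, here is a minimal sketch in Python; the PaymentGateway class and its sandbox variant are hypothetical, invented for illustration:

```python
from unittest.mock import Mock

class PaymentGateway:
    """Real collaborator: talks to the live payment provider."""
    def charge(self, amount):
        raise NotImplementedError("calls the live provider in production")

# Fake collaborator: behavior is replaced entirely for testing purposes
fake_gateway = Mock(spec=PaymentGateway)
fake_gateway.charge.return_value = "DECLINED"

# Virtual collaborator: behaves like the real one unless altered,
# e.g. an in-process sandbox with deterministic behavior
class SandboxPaymentGateway(PaymentGateway):
    def charge(self, amount):
        return "APPROVED"

# Real collaborator: the production implementation, used as-is
real_gateway = PaymentGateway()
```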
Resource
A resource is any asset drawn on by the system under test to fulfill its duties. There are three types of resources (a sketch follows the list):
- Local Resource - A local resource is an asset that is:
- managed locally
- simulates a deployment environment resource
- Virtual Resource - A virtual resource is an asset that is:
- managed locally or remotely
- similar to a deployment environment resource
- Remote Resource - A remote resource is an asset that is:
- managed remotely
- identical to a deployment environment resource
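As an illustration, assume the system under test stores data in PostgreSQL in production; the connection URLs below are hypothetical. The three resource types might then look like this:

```python
import sqlite3

# Local resource: managed locally, simulates the deployment database
local_db = sqlite3.connect(":memory:")

# Virtual resource: managed locally or remotely, similar to the
# deployment database, e.g. PostgreSQL running in a local container
VIRTUAL_DB_URL = "postgresql://localhost:5432/app_test"  # hypothetical

# Remote resource: managed remotely, identical to the deployment
# database, e.g. a staging instance provisioned like production
REMOTE_DB_URL = "postgresql://staging.example.com:5432/app"  # hypothetical
```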
Analyzer
An analyzer is any tool drawn on to examine the system to identify potential software bugs.
Mission
A mission defines a broad primary outcome you want to achieve.
Objective
An objective is a specific and measurable step you take to prove you are effectively achieving a mission.
Strategy
A strategy is an approach you take to achieve an objective.
Tactic
A tactic is a tool you use to achieve a strategy.
Unit Test
Mission
The smallest unit of code behaves exactly as you expect in isolation.
Objectives
- Implement one or more test cases that cause the unit of code to fail
- Implement one or more test cases that cause the unit of code to pass
- Deliver production code that has at least 80% code coverage
Strategies
- Assert external resources are not used
- Assert test cases can be executed independent of other test cases
- Assert the result produced by the unit of code
- Assert the state of stateful objects
- Assert interaction between the system under test and its collaborators
Tactics
- Utilize clean coding practices
- Utilize a testing framework
- Utilize fake or virtual collaborators to simulate certain behaviors
- Utilize an assertion library
- Utilize a code coverage analyzer
- Utilize a mutation analyzer
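Putting the mission, objectives, strategies, and tactics together, a unit test might look like the following minimal sketch. It uses pytest and unittest.mock; the PriceCalculator unit and its tax service collaborator are hypothetical, invented for illustration:

```python
from unittest.mock import Mock
import pytest

class PriceCalculator:
    """Smallest unit of code under test."""
    def __init__(self, tax_service):
        self.tax_service = tax_service  # collaborator

    def total(self, subtotal):
        if subtotal < 0:
            raise ValueError("subtotal must be non-negative")
        return subtotal + self.tax_service.tax_for(subtotal)

def test_total_fails_on_negative_subtotal():
    # A test case that causes the unit of code to fail
    calc = PriceCalculator(Mock())
    with pytest.raises(ValueError):
        calc.total(-1)

def test_total_adds_tax_from_collaborator():
    # A test case that causes the unit of code to pass; the fake
    # collaborator keeps the test isolated from external resources
    tax_service = Mock()
    tax_service.tax_for.return_value = 2.0
    calc = PriceCalculator(tax_service)
    assert calc.total(10.0) == 12.0
    # Assert interaction between the unit and its collaborator
    tax_service.tax_for.assert_called_once_with(10.0)
```

A code coverage analyzer (e.g. coverage.py in the Python ecosystem) and a mutation analyzer (e.g. mutmut) can then measure how well such test cases exercise the production code.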
Integration Test
Mission
A service that talks to an external resource behaves exactly as you expect in isolation.
Objectives
- Load and initialize the module
- Load resources used by the module
- Retrieve the service from the module
- Implement one or more test cases that cause the service to fail
- Implement one or more test cases that cause the service to pass
- Deliver production code that has at least 80% code coverage
Strategies
- Assert the module loads correctly
- Assert the service can be retrieved from the module
- Assert test cases can be executed independent of other test cases
- Assert the result produced by the service
- Assert integration between the service and collaborators it depends on
- Assert integration between the service and resources it depends on
Tactics
- Utilize a testing framework
- Utilize fake or virtual collaborators to simulate certain behaviors
- Utilize an assertion library
- Utilize local or virtual resources
- Utilize a code coverage analyzer
- Utilize a mutation analyzer
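For example, assuming a hypothetical UserRepository service backed by a SQL database, an integration test can exercise the service against a local resource (an in-memory SQLite database) that simulates the deployment database:

```python
import sqlite3
import pytest

class UserRepository:
    """Service that talks to an external resource (a SQL database)."""
    def __init__(self, conn):
        self.conn = conn

    def add(self, name):
        self.conn.execute("INSERT INTO users (name) VALUES (?)", (name,))

    def find(self, name):
        row = self.conn.execute(
            "SELECT name FROM users WHERE name = ?", (name,)).fetchone()
        return row[0] if row else None

@pytest.fixture
def repo():
    # Local resource: each test case gets its own database, so test
    # cases can be executed independently of one another
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")
    yield UserRepository(conn)
    conn.close()

def test_find_misses_unknown_user(repo):
    # A test case that causes the service to fail (no match found)
    assert repo.find("alice") is None

def test_add_then_find_user(repo):
    # A test case that causes the service to pass, asserting
    # integration between the service and the resource it depends on
    repo.add("alice")
    assert repo.find("alice") == "alice"
```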
System Test
Mission
A client facing service behaves exactly as you expect in isolation.
Objectives
- Load and initialize the system
- Load resources used by the system
- Retrieve the service from the system
- Implement one or more test cases that cause the service to fail
- Implement one or more test cases that cause the service to pass
- Deliver production code that has at least 80% code coverage
Strategies
- Assert the system loads correctly
- Assert the service can be retrieved from the system
- Assert test cases can be executed independent of other test cases
- Assert the input sent to the service by the client
- Assert the output sent to the client by the service
- Assert integration between the service, its collaborators and resources it uses
Tactics
- Utilize a testing framework
- Utilize virtual collaborators to simulate certain behaviors
- Utilize an assertion library
- Utilize virtual or remote resources
- Utilize a code coverage analyzer
- Utilize a mutation analyzer
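As a sketch, assume a client-facing HTTP service built with Flask; the /greet endpoint is hypothetical. A system test loads and initializes the whole system, then asserts the input and output exchanged with the client:

```python
from flask import Flask, jsonify, request

def create_app():
    """Load and initialize the system."""
    app = Flask(__name__)

    @app.route("/greet")
    def greet():
        name = request.args.get("name")
        if not name:
            # Assert the input sent to the service by the client:
            # bad input is rejected with an explicit error
            return jsonify(error="name is required"), 400
        return jsonify(greeting=f"Hello, {name}!")

    return app

def test_greet_rejects_missing_name():
    # A test case that causes the service to fail
    client = create_app().test_client()
    assert client.get("/greet").status_code == 400

def test_greet_returns_greeting():
    # A test case that causes the service to pass; assert the output
    # sent to the client by the service
    client = create_app().test_client()
    response = client.get("/greet?name=Ada")
    assert response.status_code == 200
    assert response.get_json() == {"greeting": "Hello, Ada!"}
```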
Safety Test
Mission
The system does not cause injury to the client.
Objectives
- The system does not cause psychological injury
- The system does not cause physiological injury
- The system does not cause financial injury
Strategies
- Assert access to system data is secure
- Assert access to the system is secure
- Assert access to hardware the system is deployed on is secure
- Assert access to system network environment is secure
- Assert access to the physical location of the system is secure
- Assert the system is safe to use by the client
Tactics
- Utilize safety and security experts
- Utilize internal red teams
- Utilize access control
- Utilize information security standards
- Utilize secure coding standards and practices
- Utilize vulnerability analyzers
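Some of these strategies can be automated as test cases. For instance, here is a minimal sketch, again using a hypothetical Flask endpoint, that asserts access to system data is secure by rejecting unauthenticated clients (the hardcoded token is a stand-in for a real access control mechanism):

```python
from flask import Flask, jsonify, request

def create_app():
    app = Flask(__name__)

    @app.route("/accounts")
    def accounts():
        # Access control: only authenticated clients may read data
        if request.headers.get("Authorization") != "Bearer valid-token":
            return jsonify(error="unauthorized"), 401
        return jsonify(accounts=["acct-1", "acct-2"])

    return app

def test_accounts_requires_authentication():
    # Assert access to system data is secure: an unauthenticated
    # request must be rejected, not served
    client = create_app().test_client()
    assert client.get("/accounts").status_code == 401

def test_accounts_allows_authorized_client():
    client = create_app().test_client()
    response = client.get(
        "/accounts", headers={"Authorization": "Bearer valid-token"})
    assert response.status_code == 200
```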
Experience Test
Mission
The client has a positive perception of the system and positive feelings towards it.
Objectives
- The client perceives the functions of the system as useful
- The client feels the system is easy and efficient to use
- The client sees the system as visually attractive
- The client identifies with the system
- The client is inspired by the system
- The client sees value in the system
Strategies
- Assert communication between the client and your teams
- Assert interaction between the client and your teams
- Assert that the client is solicited for feedback
- Assert that client feedback is actionable
- Assert that the client is informed of new features and functionality
- Assert that the client is informed of your mission, vision, and values
Tactics
- Utilize UI/UX experts
- Utilize human-centered design
- Utilize survey tools
- Utilize focus groups
- Utilize usability tests
- Utilize A/B tests
- Utilize a newsletter, blogs, and social media
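Of these tactics, A/B tests lend themselves to a quantitative sketch. Assuming two variants of a feature were shown to separate groups of clients (the conversion counts below are hypothetical), a two-proportion z-test computed with only the standard library indicates whether the observed difference in conversion rate is likely real:

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Z-statistic for the difference between two conversion rates."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical data: variant A converted 120/1000, variant B 150/1000
z = two_proportion_z(120, 1000, 150, 1000)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests significance at the 5% level
```

A statistically significant lift is evidence, not proof, of a better client experience; the qualitative tactics above remain essential.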