
Tech Due Diligence CQ-2: Test Coverage and Testing Strategy

What This Control Requires

The assessor evaluates the breadth, depth, and effectiveness of the automated testing strategy, including unit tests, integration tests, end-to-end tests, and the overall test coverage metrics across the codebase.

In Plain Language

Testing strategy reveals how much confidence a team has in their own code - and how safely they can change it. Assessors look beyond raw coverage numbers to understand whether the right things are tested, whether tests are meaningful rather than just padding metrics, and whether the testing pyramid is well-balanced. A healthy codebase has many fast unit tests covering individual functions and components, a moderate number of integration tests verifying interactions between modules, and a smaller set of end-to-end tests validating critical user workflows. An inverted pyramid with many slow E2E tests and few unit tests signals fragility and painfully slow feedback loops.

Testing culture matters as much as coverage numbers. Are tests written alongside features, or retroactively added as an afterthought? Is there a testing standard the team follows? Are tests maintained when code changes, or do they go stale? Teams that take testing seriously can refactor with confidence and ship more reliable software - both critical factors for long-term technical health.

How to Implement

Write a testing strategy document that defines the team's approach at each level. Specify what should be unit tested (all business logic, utility functions, data transformations), what should be integration tested (API endpoints, database interactions, external service integrations), and what should be end-to-end tested (critical user journeys, payment flows, authentication).

Set measurable coverage targets appropriate to the codebase. For backend business logic, aim for 80%+ line coverage. For critical financial or safety-related code, push for 90%+. For UI components, focus on testing behaviour rather than chasing high coverage numbers. Track coverage over time and require new code to meet the threshold.

Make test execution a mandatory gate in CI/CD. Pull requests should not be mergeable if tests fail. Consider adding coverage regression checks that block merging if coverage drops below the baseline.

Set up proper testing environments at each level: unit tests run in isolation with mocked dependencies, integration tests use test databases and service stubs, and end-to-end tests run against a staging environment that mirrors production.

Focus on test quality during code reviews. Good tests verify meaningful behaviour rather than implementation details, use clear assertions that communicate intent, follow naming conventions that describe the scenario, avoid excessive mocking that makes tests brittle, and are deterministic with no flaky behaviour.

Track test reliability as an operational concern. Flaky tests erode confidence in the suite and lead developers to ignore failures. Maintain a flaky test dashboard and quarantine or fix unreliable tests promptly.

Consider specialised testing approaches where they add value: property-based testing for complex algorithms, snapshot testing for UI components, contract testing for microservice APIs, and chaos engineering for resilience testing.
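To make the test-quality guidance concrete, here is a minimal sketch of a behaviour-focused unit test. The function and test names (`apply_discount`, `test_apply_discount_*`) are hypothetical, not from any real codebase: the point is that each test name describes a scenario, each assertion communicates intent, and there is no mocking or non-determinism.

```python
# Hypothetical business-logic function; names are illustrative only.
def apply_discount(price: float, percent: float) -> float:
    """Return price reduced by percent, rejecting out-of-range percentages."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


# Behaviour-focused tests: scenario-describing names, clear assertions,
# deterministic inputs, no mocking of implementation details.
def test_apply_discount_reduces_price_by_given_percent():
    assert apply_discount(100.0, 25) == 75.0

def test_apply_discount_leaves_price_unchanged_at_zero_percent():
    assert apply_discount(10.0, 0) == 10.0

def test_apply_discount_rejects_percent_above_100():
    try:
        apply_discount(100.0, 150)
    except ValueError:
        pass  # expected: invalid input is rejected, not silently clamped
    else:
        raise AssertionError("expected ValueError for percent > 100")


test_apply_discount_reduces_price_by_given_percent()
test_apply_discount_leaves_price_unchanged_at_zero_percent()
test_apply_discount_rejects_percent_above_100()
```

A trivial test, by contrast, might only assert that `apply_discount` returns a float - it would inflate coverage numbers without validating any meaningful behaviour, which is exactly the pattern assessors flag.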

Evidence Your Auditor Will Request

  • Test coverage reports showing current coverage by module or service
  • Testing strategy document defining the team's approach
  • Test execution reports from CI/CD pipeline showing pass rates and duration
  • Coverage trend over the past 6-12 months
  • Evidence of test review as part of the code review process

Common Mistakes

  • High coverage numbers achieved through trivial tests that do not validate meaningful behaviour
  • Integration and E2E tests are slow and unreliable, causing developers to skip them
  • Test coverage concentrated in older modules; new features shipped without adequate tests
  • No testing strategy; ad-hoc approach with inconsistent coverage across the codebase
  • Flaky tests are tolerated rather than fixed, undermining confidence in the test suite

Related Controls Across Frameworks

Framework    Control ID    Relationship
ISO 27001    A.8.25        Related

Frequently Asked Questions

What test coverage percentage is considered good for due diligence?
There is no universal answer, but as a rough guide: below 40% is a red flag, 40-60% suggests room for improvement, 60-80% is good for most applications, and above 80% is excellent. More important than the absolute number is the trend (is it improving or declining?) and whether critical paths are well-covered. A team at 65% and trending upward looks better than one at 75% and sliding.
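The rough bands above can be sketched as a small helper. This is illustrative only - `coverage_band` is a hypothetical name, and the thresholds simply restate the guidance in this answer, not a formal scoring rule.

```python
def coverage_band(percent: float) -> str:
    """Map a line-coverage percentage to the rough assessment bands above."""
    if percent < 40:
        return "red flag"
    if percent < 60:
        return "room for improvement"
    if percent <= 80:
        return "good"
    return "excellent"

print(coverage_band(65))  # → good
```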
Is TDD (Test-Driven Development) expected?
TDD is not a requirement. What assessors look for is a strong testing culture regardless of methodology. Whether the team writes tests before or alongside code matters less than whether tests are comprehensive, meaningful, and consistently maintained. The end result is what counts.
