Testability Checklist
This checklist is designed for planning test access in hardware products, placing test points, and documenting test procedures. The goal is to facilitate post-production testing, expedite fault diagnosis, and standardize quality control processes.
Test Points on PCB
1. Are test points defined for critical circuits and hard-to-access signals?
System testability should be considered in the early design phase.
If critical measurement points are not identified, post-production test processes become slower or error-prone.
Typical signals requiring test points:
- Voltage regulator outputs (Vout)
- Oscillator / clock signals
- Analog sensor inputs
- Communication lines (UART, I²C, SPI, CAN, RS485)
- Reset, Enable, Fault lines
Test pads should be directly accessible to measurement equipment or test jigs, and dedicated reference names should be assigned to them where needed (e.g., TP_VBAT, TP_CLK).
This item addresses test access requirements under IPC-2221B – Generic Standard on Printed Board Design.
2. Are pads or vias available for in-circuit test (ICT) or functional test?
In-Circuit Test (ICT) or Functional Test (FCT) validates both the electrical and functional integrity of the assembled board.
- ICT test pads should be placed on a 100–125 mil (2.54–3.175 mm) grid pitch, with pad diameters large enough for reliable spring-probe contact (50 mil / 1.27 mm is a common target).
- Test pads should be left unmasked with surface finish (ENIG or HASL) suitable for probe contact.
- In FCT, it should be possible to stimulate and observe all system functions externally (e.g., UART loopback, SPI verification).
- If using a bed-of-nails fixture, pads should be positioned according to grid layout.
This step facilitates test jig design and enables automation of post-assembly mass production quality tests.
3. Are test pads arranged in a regular grid structure?
Randomly placed test pads cause alignment and contact issues in test fixtures.
Therefore, a regular grid structure ensures both mechanical accuracy and ease of maintenance.
- Standard grid spacing: 100 mil (2.54 mm).
- On large boards or complex systems, 125 mil (3.175 mm) grid can be used.
- All test pads should be on the same layer; placement at different heights complicates probe contact.
- Grid coordinates should be documented on the "test layer" in drawings.
This planning also lays the groundwork for implementing IEEE 1149.x Boundary Scan architectures at the PCB level.
4. Are test pads adequately spaced from other components?
There should be no physical obstacles such as high-profile components, screws, or sockets that would block probe contact.
- Minimum component–pad spacing: 50 mil (1.27 mm).
- Test pads should not be placed near high-profile components (e.g., capacitors, relays, connectors).
- A clearance ring (mask clearance) should be left around pads.
- If necessary, remote test vias can be used for test access (test line is routed to an accessible point).
This item complies with IPC-2221B – Section 10.3.4: Test Access Requirements standard.
5. Is a separate test point defined for each power line, GND, and critical signal?
A test access point should exist for each supply line (e.g., 3V3, 5V, VBAT, VBUS) and ground (GND) line.
This allows all power and signal integrity tests to be performed independently.
- At least one test pad should be defined for each power line.
- GND pads should be large enough for oscilloscope clips or ground probes to connect.
- Test pads for critical signals (e.g., Reset, SPI_CLK, UART_TX) should be named.
- During testing, no external loads should be connected to these pads; pads should only be used for measurement or verification.
This control directly affects the reliability of post-production ICT (In-Circuit Test) and Boundary Scan (JTAG) tests.
Test Hardware and Connector Layout
6. Are special connectors or headers defined for testing?
Test access is one of the most critical functions in a system's lifecycle.
Standardized interface connectors determine the testability of a design as much as the test points themselves.
Commonly used test interfaces:
- JTAG (IEEE 1149.1 / 1149.7): For boundary-scan, production and debugging tests
- UART Debug Port: Software debugging and error log access
- ISP (In-System Programming) Header: MCU, FPGA, or EEPROM programming
- Custom Test Connector: For special production test signals or power lines
Each test interface should be clearly named in schematics and positioned in drawings. Programming interfaces should be usable for test automation on production lines.
This item is compatible with IEEE 1149.x, IEC 61131-9 (IO-Link), and ISO 17025 test access principles.
7. Are the orientation, type, and pin layout of test connectors specified in drawings?
Incorrect connections can lead to both faulty measurements and hardware failures during testing.
Therefore, the physical orientation and pin order of test connectors should be clearly defined.
Information required in documentation:
- Connector type (e.g., 2x5 IDC, 1x6 PinHeader, MicroMatch)
- Pin numbers and signal names
- Connector orientation (should be marked in schematics with arrow or "key notch" symbol)
- Use of keyed or polarized type is recommended
- Mechanical locking (latch/keying) should be preferred in production connectors to prevent incorrect insertion
This item applies the "Connector Orientation & Keying" principle of IPC-2612A – Design Guidelines for Connectors and Interfaces standard.
8. Are test connectors easily accessible post-production?
Proper positioning of test access points provides significant time savings in product maintenance and verification processes.
- Test connectors should be positioned to be accessible without disassembling the case.
- If the product is in an enclosure, test ports should be on the outer surface or under easily opened covers.
- In development prototypes, all test ports may be accessible; however, in mass production versions, only service-required ports should remain.
- If necessary, labels or color codes should be added to test ports (e.g., blue: UART, yellow: JTAG, red: power).
This approach aligns with the "Ease of Access & Serviceability" principle in DFX (Design for eXcellence) methodology.
9. Is mechanical isolation provided against short circuit risk during testing?
Test connectors typically have direct contact with high-frequency, high-current, or sensitive signals.
Incorrect connections or probe errors can create short circuit risks.
- An insulation label or mechanical guard should be added under the test connector.
- In areas near metal bodies, plastic "umbrella caps" or guard rings should be used.
- Blind plugs (protective caps) can be used in test sockets to prevent open pins from causing short circuits.
- During service, test connectors should only be inserted when de-energized (power-off label recommended).
This item is based on access protection principles of IEC 61010-1 – Safety Requirements for Electrical Equipment for Measurement, Control, and Laboratory Use standard.
10. Is the reference GND point for test equipment clearly marked?
Improper grounding during test processes can cause measurement deviation or system instability.
Therefore, reference GND points should be defined at both physical and documentation levels.
- Each board should have at least one GND test pad or terminal.
- GND pads should be wide enough (≥2.5 mm) for secure contact with oscilloscope clips or ICT probes.
- If multiple ground planes (analog/digital) exist, each should be accessible as a separate GND reference.
- These pads should be clearly named in schematics as "TP_GNDx" or "TP_AGND".
- GND reference should be marked as "Reference Node" in production test reports.
This control is compatible with IEC 61967 – Measurement of Electromagnetic Emissions and IPC-2221B – Ground Test Access criteria.
Test Plan and Procedures
11. Is there a written test procedure for each test stage?
Each test step should be documented with defined input conditions, measurement points, reference values, and acceptance criteria.
This ensures tests are conducted in a repeatable and standardized manner rather than randomly.
Typical test stages:
- Board Production Test (ICT – In-Circuit Test): Solder quality, open/short circuit, resistance and capacitance verification
- Functional Test (FCT – Functional Test): System behavior, sensor reading, communication and power tests
- Acceptance Test (FAT – Factory Acceptance Test): Verification of completed product according to customer specifications
Each test procedure should include: test name, purpose, equipment list, application steps, reference range (min–max), acceptance criteria (PASS/FAIL). Procedure documents should be stored with revision control (e.g., TP-001 RevA).
This item meets the documented procedures requirement in ISO/IEC 17025 – General Requirements for Testing and Calibration Laboratories standard.
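The structure described above (name, reference range, acceptance criteria) can be kept as structured data so that PASS/FAIL evaluation is repeatable. A minimal sketch in Python; the field names (`ref_min`, `ref_max`) and the example values are illustrative, not taken from any particular procedure:

```python
from dataclasses import dataclass

@dataclass
class TestStep:
    """One step of a written test procedure (illustrative field names)."""
    name: str          # e.g. "3V3 rail voltage"
    equipment: str     # e.g. "DMM, 4.5-digit"
    ref_min: float     # lower acceptance limit
    ref_max: float     # upper acceptance limit
    unit: str = "V"

    def evaluate(self, measured: float) -> str:
        """Return PASS/FAIL against the documented reference range (min-max)."""
        return "PASS" if self.ref_min <= measured <= self.ref_max else "FAIL"

# Illustrative step from a hypothetical "TP-001 RevA" procedure document.
step = TestStep("3V3 rail voltage", "DMM, 4.5-digit", 3.23, 3.37)
```

Keeping steps in this form lets the same record drive both the printed procedure and the automated test script.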
12. Are test procedures matched with hardware revisions?
Test instructions should only be valid for the correct board or product revision.
Revision mismatches lead to incorrect test steps and false failure reports.
- Each test file should specify the hardware revision it applies to (e.g., "Applicable to PCB Rev. B & higher").
- Test software should automatically select the appropriate procedure based on hardware ID (HW_ID or EEPROM Rev Tag) on the product.
- Test steps should be reviewed and versioned when revisions change.
- Old test plans should be marked as "Obsolete" or "Archived".
This practice is part of the Configuration Management (CM) process and is based on ISO 10007 – Quality Management Systems: Configuration Management standard.
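The revision-to-procedure selection can be a simple lookup that refuses to test unknown hardware rather than guessing. A sketch, with hypothetical revision tags and procedure IDs:

```python
# Map hardware revisions (read from HW_ID or an EEPROM rev tag) to the
# test procedure released for them. Revisions and IDs are illustrative.
PROCEDURES = {
    "A": "TP-001 RevA",   # archived, applies to old hardware only
    "B": "TP-001 RevB",
    "C": "TP-001 RevB",   # Rev C boards still use the RevB test plan
}

def select_procedure(hw_rev: str) -> str:
    """Pick the procedure matching the board's revision; fail loudly for
    revisions with no released test plan instead of running the wrong one."""
    try:
        return PROCEDURES[hw_rev]
    except KeyError:
        raise ValueError(f"No test procedure released for HW rev {hw_rev!r}")
```

Failing on an unknown revision surfaces configuration-management gaps at the station instead of producing false failure reports.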
13. Does the test plan include test equipment type and required test durations?
A test plan should define not only "what will be tested" but also how, with what equipment, and in how much time.
For each test stage:
- Test equipment model and manufacturer used (e.g., Keysight 3070 ICT, NI PXI-Express, Rohde & Schwarz Scope)
- Measurement method and connection type
- Average test duration (e.g., FCT ≈ 3 min, ICT ≈ 45 sec)
- Test repetition or retry conditions
Test plans should be based on production line capacity analysis (line balancing) and quality metric targets (DPMO, yield).
This item fulfills the "controlled testing" requirement of ISO 9001:2015 Clause 8.6 – Release of Products and Services.
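The line-balancing check behind this item is simple arithmetic: a test station keeps up only if its effective cycle time, including expected retries, stays at or below the line's takt time. A sketch with illustrative figures (shift length, demand, and retry rate are assumptions, not from the text):

```python
def takt_time_s(shift_seconds: float, demand_units: int) -> float:
    """Takt time: available production time divided by required output."""
    return shift_seconds / demand_units

def station_fits(test_duration_s: float, takt_s: float,
                 retry_rate: float = 0.0) -> bool:
    """A station keeps pace if its average test duration, inflated by the
    expected retry fraction, does not exceed takt."""
    effective = test_duration_s * (1.0 + retry_rate)
    return effective <= takt_s

# Illustrative: 8 h shift, 400 units/day, a 60 s FCT with 5% retries.
takt = takt_time_s(8 * 3600, 400)   # 72 s available per unit
fct_ok = station_fits(60.0, takt, retry_rate=0.05)
```

If the check fails, the plan must either shorten the test, parallelize stations, or revise the capacity target.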
14. Is the data format defined for automated test equipment (ATE) or semi-automated test fixtures?
Recording test data in a structured format keeps automated test systems consistent.
Therefore, test point–signal–result relationships should be maintained in a specific data format.
Data format (examples):
- CSV: For simple production lines
- JSON/XML: For parametric data exchange in ATE systems
- SQL or ERP integration: To transfer serial numbers and quality reports to central databases
For each test point, the pin name, measurement unit, target value, tolerance, and result should be recorded. Signal mapping tables (pin maps) used in fixtures should be kept under revision control.
This item supports a data-definition approach compatible with the IPC-2581 – Open XML Format for Manufacturing Data Exchange standard.
15. Are test results being recorded?
Production test outputs should be stored not only for immediate evaluation but also for long-term traceability and quality analysis.
- PASS/FAIL information should be matched with product serial number in each test cycle.
- Test results should be recorded with timestamp and operator information.
- Failed products should have an error code or description field.
- Data should be automatically transferred to ERP, PLM, or MES systems.
- Test reports should be stored in electronic archives for at least 5 years.
This item directly corresponds to IEC 60300-3-11 – Dependability Management and ISO 9001 Clause 8.7 – Control of Nonconforming Outputs standards.
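A traceable result record ties together the fields listed above: serial number, timestamp, operator, verdict, and an error code on failure. A sketch using an in-memory stream; the column order and example identifiers are illustrative:

```python
import csv
import io
from datetime import datetime, timezone

def append_result(stream, serial: str, result: str,
                  operator: str, error_code: str = "") -> None:
    """Append one test record: SN, UTC timestamp, operator ID, PASS/FAIL,
    and an error code for failed units (empty for PASS)."""
    writer = csv.writer(stream)
    writer.writerow([
        serial,
        datetime.now(timezone.utc).isoformat(timespec="seconds"),
        operator,
        result,
        error_code,
    ])

# Illustrative failing unit; in production the stream would be a file or
# a staging table feeding ERP/PLM/MES.
log = io.StringIO()
append_result(log, "SN-000123", "FAIL", "op42", "E-PWR-01")
```

Writing every cycle with the same schema is what makes the 5-year archive queryable later.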
Test Integration with Production
16. Are test points planned according to production panel design?
In panelized PCB structures, test pad placements should be compatible with panel spacing and fixture geometry.
- Spacing between boards on the panel (routing space) should allow test probe access.
- Pin plates (bed-of-nails) of test fixtures should be designed according to panel dimensions.
- Panel alignment holes should assist test fixture alignment with alignment pins.
- Test points should not be placed too close to panel boundaries (≥3 mm recommended).
This planning ensures DFM (Design for Manufacturing) process execution compatible with test access.
17. Is the solder mask status (open/covered) of test pads defined?
Solder mask directly affects contact quality in test pad access.
Proper masking strategy improves both test accuracy and board surface durability.
- If spring-loaded test probes (pogo pins) are to be used, test pads should be left exposed (unmasked).
- For protected, long-term use test pads, masked (covered) areas may be preferred.
- Test mask status should be specified in PCB production drawings as separate test layer (tTest/bTest).
- An ENIG surface finish is recommended on exposed test pads to resist wear from repeated probing.
This item is compatible with IPC-7351 – Land Pattern Standard and IPC-2221B Section 10.3 – Test Pad Design Rules.
18. Is the test process defined in the production flow (AOI → ICT → FCT sequence)?
The sequence of test steps determines the consistency of the quality control chain.
Incorrect test sequence complicates failure analysis and leads to efficiency losses.
Recommended test flow:
- AOI (Automated Optical Inspection): Solder and placement verification
- ICT (In-Circuit Test): Electrical connection and component tests
- FCT (Functional Test): System behavior and software verification
- Burn-In / Stress Test (optional): Thermal and long-term durability control
Test sequence should be clearly specified in production plans and quality procedure documents.
This item corresponds to ISO 9001:2015 Clause 8.6 – Release of Products and Services standard.
19. Is engineering support for test hardware design documented?
Test hardware is also an engineering product and requires documentation, design responsibility, and maintenance planning.
- Test fixture CAD files (STEP, DXF, Gerber) should be documented.
- Pin mapping table should be matched with test point labels.
- Components used (pogo pins, connectors, adapters) should be listed and maintenance cycle defined.
- Test hardware should be tracked as "Engineering Asset" in ERP/PLM systems.
This practice supports test equipment compliance requirement within ISO/IEC 17025 – Clause 6.4: Equipment.
20. Is a failure analysis procedure established for failed products?
Systematic handling of failed products (FAIL) is the most critical stage of the quality cycle.
- An error code or category (Critical / Major / Minor) should be defined for each FAIL result.
- Rework and reject steps should be specified in written procedures.
- Root cause analysis (RCA) methods such as 5 Whys, Fishbone (Ishikawa), or 8D should be used.
- All analyses should be recorded in the CAPA (Corrective and Preventive Action) system.
This item complies with ISO 9001 Clause 10.2 – Nonconformity and Corrective Action standard.
21. Is test coverage metric (% pin / % signal) being reported?
Test coverage is the most basic metric showing how much of the product is verified.
- Pin coverage (%): Ratio of tested pins to total pins
- Signal coverage (%): Ratio of functionally tested signals to total signals
- Typical target: ICT >90% pin coverage, FCT >80% signal coverage
- Remaining signals should be listed as "not testable" in the report
These metrics are directly used in Six Sigma DPMO (Defects per Million Opportunities) analyses.
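Both metrics reduce to the same ratio; a minimal sketch that also produces the "not testable" list for the report (the 100-pin board and 92-pin probe count are invented for illustration):

```python
def coverage_pct(tested: set, total: set) -> float:
    """Coverage as the percentage of items in `total` that were exercised."""
    return 100.0 * len(tested & total) / len(total)

# Illustrative pin list: 100 pins on the board, 92 reachable by the fixture.
all_pins = {f"P{i}" for i in range(1, 101)}
tested_pins = {f"P{i}" for i in range(1, 93)}

pin_cov = coverage_pct(tested_pins, all_pins)     # 92.0 -> meets the >90% ICT target
untestable = sorted(all_pins - tested_pins)       # goes in the "not testable" section
```

The same function applied to signal sets yields the FCT signal-coverage figure against the >80% target.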
22. Is a "golden unit" / calibration standard available for sub-assemblies?
A "golden unit" is a known-good reference product used to verify test system accuracy.
- A golden unit should be defined for each sub-assembly (e.g., sensor module, control board).
- Golden unit parameters (voltage, current, output response time) should be calibrated.
- This unit should be used periodically for verification at test stations.
This control meets ISO 17025 – Clause 7.7: Ensuring the Validity of Results requirement.
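The periodic golden-unit check is a tolerance comparison: if the station's reading of the calibrated reference drifts beyond an allowed band, the station (not the product) is flagged. A sketch; the current value and 2% band are illustrative assumptions:

```python
def within_tolerance(measured: float, golden: float, tol_pct: float) -> bool:
    """Compare a station reading against the calibrated golden-unit value.
    A deviation beyond tol_pct of the reference flags the test station."""
    return abs(measured - golden) <= golden * tol_pct / 100.0

# Illustrative: golden unit draws 120.0 mA; allow 2% station drift.
station_ok = within_tolerance(121.5, golden=120.0, tol_pct=2.0)
```

Logging each periodic check alongside the station ID gives the evidence trail ISO 17025 Clause 7.7 asks for.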
23. Is test equipment verification (calibration log) schedule defined?
Periodic calibration of test equipment (multimeter, power supply, oscilloscope, ATE station) directly affects test result reliability.
- Calibration date, certificate number, and validity period of each device should be recorded.
- The calibration interval is typically one year; six months is recommended for high-precision equipment.
- Calibration logs should be stored electronically (ERP / LIMS).
This item complies with ISO/IEC 17025 Clause 6.4.7 – Equipment Calibration standard.
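A calibration log entry needs only the last calibration date and the interval for that equipment class to decide whether a device may be used. A sketch; the dates and the 365/182-day intervals mirror the typical periods mentioned above but are otherwise illustrative:

```python
from datetime import date, timedelta

def calibration_due(last_cal: date, today: date,
                    interval_days: int = 365) -> bool:
    """True if the device's calibration has expired. Use interval_days=365
    for standard equipment, ~182 for high-precision gear."""
    return today > last_cal + timedelta(days=interval_days)

# Illustrative log entry for a bench DMM calibrated 2024-01-10.
due = calibration_due(date(2024, 1, 10), date(2025, 3, 1))
```

An ERP/LIMS job running this check nightly can block test stations whose equipment is overdue before a shift starts.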
24. Are automated test result reports matched with serial numbers?
Associating test reports with product identity is the foundation of traceable production.
- Each test result should be linked with product serial number (SN).
- Report format should include test date, test station, operator ID, and result information.
- This matching should be stored as "Product History Record" in ERP or MES systems.