

Testing Strategies for Network Automation: From Unit Tests to Mock Devices


This post is part of our ongoing series on network automation best practices, grounded in the PRIME Framework and PRIME Philosophy.

Transparency Note

Examples, scenarios, and any outcome figures in this article are provided for education and are based on enterprise delivery experience or anonymised composite scenarios unless explicitly identified as direct Nautomation Prime client outcomes.

Why This Blog Exists

Testing is the difference between "it works on my machine" and "it works in production." This post covers why testing matters, the types of tests you need, and how the PRIME Framework makes testing sustainable and measurable.


🚦 PRIME Philosophy: Measurability and Safety

  • Measurability: Prove your automation works—before, during, and after deployment
  • Safety: Catch errors before they hit production
  • Transparency: Document what is tested and why
  • Ownership: Your team can extend and maintain tests
  • Empowerment: Testing is part of the workflow, not an afterthought


Types of Tests

1. Unit Tests

  • Test individual functions or modules
  • Use pytest or unittest
  • Fast, repeatable, and easy to automate
  • Should cover all logic branches and edge cases
  • Use mocks to isolate dependencies (e.g., device responses, file I/O)
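
The last point above can be sketched with the standard library's unittest.mock: a Mock object stands in for the device connection, so the parsing logic runs with no network access. The function get_vlan_ids and the 'show vlan brief' output format here are illustrative assumptions, not a real library API.

```python
from unittest.mock import Mock

# Hypothetical helper under test: extracts VLAN IDs from 'show vlan brief'
# output. The name and parsing logic are illustrative only.
def get_vlan_ids(device):
    output = device.send_command("show vlan brief")
    return [line.split()[0] for line in output.splitlines()
            if line and line[0].isdigit()]

def test_get_vlan_ids_with_mock():
    # The Mock stands in for a real device connection, so this test
    # runs offline, fast, and deterministically.
    device = Mock()
    device.send_command.return_value = "10   users   active\n20   voice   active"
    assert get_vlan_ids(device) == ["10", "20"]
    device.send_command.assert_called_once_with("show vlan brief")
```

Because the mock records calls, the test also verifies that the function sent exactly the command it was supposed to.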

2. Integration Tests

  • Test end-to-end workflows
  • Use real or simulated devices
  • Validate that all parts work together
  • Include error paths, rollback, and recovery scenarios
  • Use testbeds (pyATS) or dynamic inventory (Nornir/NetBox)

3. Mock Device Testing

  • Use tools like pyATS, nornir_utils, or custom mocks
  • Simulate device responses for safe, repeatable tests
  • Enables testing of failure modes, edge cases, and rare device behaviors
  • Use for CI/CD pipelines where real devices are unavailable
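
When no framework mock fits, a tiny custom mock is often enough for CI. A minimal sketch, with all names (MockDevice, send_command) hypothetical rather than taken from any real library: canned responses cover the happy path, and a configurable failure set simulates timeouts and other failure modes.

```python
import pytest

# Hypothetical custom mock: a minimal fake device for CI pipelines
# where no real hardware is reachable.
class MockDevice:
    """Returns canned output per command, or raises to simulate failures."""
    def __init__(self, responses, fail_on=None):
        self.responses = responses
        self.fail_on = fail_on or set()

    def send_command(self, command):
        if command in self.fail_on:
            # Simulated failure mode: the command times out
            raise TimeoutError(f"{command!r} timed out")
        return self.responses[command]

def test_returns_canned_output():
    device = MockDevice(responses={"show version": "IOS XE 17.9"})
    assert device.send_command("show version") == "IOS XE 17.9"

def test_reports_timeout():
    device = MockDevice(responses={}, fail_on={"show vlan"})
    with pytest.raises(TimeoutError):
        device.send_command("show vlan")
```

This is exactly the kind of rare-failure coverage that is hard to produce on demand with real devices.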

Example: Adding Tests to a Script

Unit Test Example:

import pytest

def test_parse_vlan_output():
    # parse_vlan_output and expected_dict come from the module under test
    output = 'VLAN Name Status'
    result = parse_vlan_output(output)
    assert result == expected_dict

Advanced: Parametrized Unit Tests

@pytest.mark.parametrize("output,expected", [
    ("VLAN Name Status", expected_dict),
    ("", {}),
])
def test_parse_vlan_output_cases(output, expected):
    assert parse_vlan_output(output) == expected

Mock Device Example:

from pyats import aetest
from pyats.topology import loader

class MyTest(aetest.Testcase):

    @aetest.test
    def test_vlan(self):
        device = loader.load('testbed.yaml').devices['switch-01']
        device.connect()
        result = device.parse('show vlan')
        assert '10' in result['vlans']

Integration Test Example:

def test_end_to_end_workflow():
    # Set up the testbed, push config, validate state, then roll back
    ...
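
The stub above can be fleshed out as a minimal sketch, assuming a hypothetical in-memory FakeSwitch stand-in so the full setup, change, validate, and rollback cycle runs anywhere, including CI. The pytest fixture's teardown doubles as the rollback step.

```python
import pytest

# FakeSwitch is a hypothetical in-memory stand-in for a real or
# simulated device, so the workflow can run without hardware.
class FakeSwitch:
    def __init__(self):
        self.vlans = {"1": "default"}

    def add_vlan(self, vid, name):
        self.vlans[vid] = name

    def remove_vlan(self, vid):
        self.vlans.pop(vid, None)

@pytest.fixture
def switch():
    device = FakeSwitch()     # setup: "connect" to the test device
    yield device
    device.remove_vlan("10")  # teardown doubles as rollback

def test_end_to_end_vlan_workflow(switch):
    switch.add_vlan("10", "users")        # push config
    assert switch.vlans["10"] == "users"  # validate state
```

With a real testbed, the fixture would connect via pyATS or Nornir and restore a saved config snapshot in teardown; the shape of the test stays the same.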

PRIME in Action: Test Coverage and Reporting

  • Use CI/CD to run tests on every change (GitHub Actions, GitLab CI, Jenkins)
  • Track test coverage and failures (pytest-cov, coverage.py)
  • Document test cases and expected outcomes (README, wikis, runbooks)
  • Use badges and dashboards to visualize test health
  • Require tests for all new features and bugfixes

Summary: Blog Takeaways

  • Testing is essential for safe, reliable automation
  • Use unit, integration, and mock device tests
  • PRIME principles make testing measurable and sustainable
  • Automate tests in CI/CD and require passing tests for deployment
  • Cover error paths, rollbacks, and edge cases—not just the happy path
  • Use mocks and simulators to test at scale and under failure conditions

📣 Want More?