What are Pytest Markers?
Pytest markers are a powerful feature that allows you to add metadata or labels to your test functions, making it easier to organize and customize your test suite.
Markers help you categorize and select specific tests to run, especially when dealing with large test suites.
They can be used to indicate test priority, skip certain tests under specific conditions, or group tests by categories like performance, integration, or acceptance.
Common Built-in Pytest Markers
Pytest Skip Test / Skip If
@pytest.mark.skip(reason="...")
@pytest.mark.skipif(condition, reason="...")
This marker skips the annotated test with an optional reason, useful for temporarily excluding tests that are known to be failing or not yet implemented. You can read more about the best way to skip tests here.
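Here is a minimal sketch of how these two markers can be used together; the test bodies are just placeholder assertions:

import sys
import pytest

@pytest.mark.skip(reason="This test is temporarily disabled")
def test_example_skip():
    # Always skipped, regardless of environment
    assert 1 + 1 == 2

@pytest.mark.skipif(sys.version_info < (3, 8), reason="Requires Python 3.8 or higher")
def test_example_skipif():
    # Skipped only when running on Python versions earlier than 3.8
    assert "pytest".upper() == "PYTEST"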
In the above code:
test_example_skip: This test uses the @pytest.mark.skip marker and will always be skipped regardless of conditions. The given reason, “This test is temporarily disabled,” provides context on why it’s being omitted.
test_example_skipif: Here, we’re using the @pytest.mark.skipif marker to conditionally skip the test. Specifically, the test will be skipped if it’s executed on a Python version earlier than 3.8. This is a useful way to ensure tests are only run in environments where they’re applicable.
Pytest Expected Failure (Xfail)
@pytest.mark.xfail(reason="...")
Marks a test as expected to fail. It’s helpful when you anticipate a test failure, and you can provide an optional reason to explain why it’s expected to fail. This article covers Pytest xfail, xpass and skip in detail.
This is very useful when you have a known bug in your code and you want to track it until it’s fixed.
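A minimal sketch of xfail in action; the divide function and its bug are assumptions made for illustration:

import pytest

def divide(a, b):
    # Known bug for illustration: division by zero is not handled
    return a / b

@pytest.mark.xfail(reason="Division by zero is not handled yet")
def test_divide_by_zero():
    # Raises ZeroDivisionError, so pytest reports the test as XFAIL instead of a failure
    divide(1, 0)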
Pytest Parameterization
@pytest.mark.parametrize("arg1, arg2, ...", [(val1, val2, ...), ...])
This marker allows you to parameterize a test function, running it with different sets of input values specified in the test data. It’s excellent for DRY (Don’t Repeat Yourself) testing, where you avoid redundant test code.
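A short sketch of a parameterized test; the input values are illustrative:

import pytest

@pytest.mark.parametrize("a, b, expected", [(1, 2, 3), (2, 3, 5), (10, -4, 6)])
def test_addition(a, b, expected):
    # Runs once per tuple in the list, so three test cases from a single function
    assert a + b == expected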
Pytest Fixtures
@pytest.mark.usefixtures("fixture1", "fixture2", ...)
This marker applies one or more fixtures to a test function.
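For example, a test can request a setup-only fixture by name without accepting it as an argument; the fixture below is hypothetical:

import os
import pytest

@pytest.fixture
def testing_mode(monkeypatch):
    # Hypothetical fixture that configures the environment for the test
    monkeypatch.setenv("APP_MODE", "testing")

@pytest.mark.usefixtures("testing_mode")
def test_runs_in_testing_mode():
    # The fixture ran first, even though the test never references it directly
    assert os.environ["APP_MODE"] == "testing"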
Mark Tests as Fast or Slow
We can easily define our own markers, for example to mark tests as fast or slow, or as external, internal, and so on.
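A sketch of what this example might look like; the Calculator implementation and the exact values are assumptions based on the description that follows:

import time
import pytest

class Calculator:
    def add(self, a, b):
        return a + b

    def subtract(self, a, b):
        return a - b

@pytest.mark.fast
def test_fast_add():
    assert Calculator().add(2, 3) == 5

@pytest.mark.slow
def test_slow_subtraction():
    time.sleep(5)  # 5-second delay to simulate a slow test
    assert Calculator().subtract(5, 3) == 2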
In this example, we have two test functions: test_fast_add and test_slow_subtraction.
test_fast_add: This test is marked as fast because it’s expected to run quickly. It tests the add method of the Calculator class, which adds two numbers. The assertion checks that the result is correct.
test_slow_subtraction: This test is marked as slow because it’s expected to run slowly. It tests the subtract method of the Calculator class, which subtracts two numbers. The assertion checks that the result is correct. This test includes a 5-second delay to simulate a slow test.
Pytest Timeout
@pytest.mark.timeout(seconds)
Specifies a maximum execution time for a test. If the test runs longer than the specified timeout, it’s automatically marked as a failure. This is useful for preventing tests from running indefinitely.
Please note you’ll need to install the pytest-timeout plugin to use this marker.
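A minimal sketch, assuming pytest-timeout is installed; the 1-second sleep is illustrative:

import time
import pytest

@pytest.mark.timeout(2)  # the test fails if it runs for more than 2 seconds
def test_finishes_in_time():
    time.sleep(1)
    assert True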
Pytest Run Order
@pytest.mark.run(order=N)
Allows you to control the order in which tests are executed; the order argument specifies the relative execution order of a test. Please note you’ll need an ordering plugin such as pytest-ordering to use this marker.
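A minimal sketch, assuming the pytest-ordering plugin (which provides this marker) is installed:

import pytest

@pytest.mark.run(order=2)
def test_runs_second():
    assert True

@pytest.mark.run(order=1)
def test_runs_first():
    assert True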
Best Practices When Using Pytest Markers
Define Markers in the pytest.ini File
The pytest.ini file plays a central role when working with Pytest: registering your custom markers there documents them in one place and keeps Pytest from warning about unknown marks.
An example pytest.ini file:
[pytest]
markers =
    development: marks tests as development (deselect with '-m "not development"')
    production: marks tests as production (deselect with '-m "not production"')
    fast: marks tests as fast (run with '-m fast')
    slow: marks tests as slow (run with '-m slow')
    custom: custom marker example (run with '-m custom')
    asyncio: marks tests requiring asyncio (run with pytest-asyncio plugin)
    xfail: marks tests that are expected to fail (handled by pytest itself)
    xpass: marks tests that unexpectedly pass after being marked xfail (handled by pytest itself)
    parameters: marks parameterized tests (handled by pytest itself)
    benchmark: marks tests used for benchmarking (handled by pytest-benchmark plugin)
    celery: marks tests related to Celery tasks (custom marker, specifics depend on test implementation)
    login: dummy login marker for grouping tests
    signup: dummy signup marker for grouping tests
    marker1: combined markers
    marker2: combined markers
    timeout: test with timeout
In the world of testing, pytest markers are like guideposts.