.. _tutorial-salt-testing:

==================================
Salt's Test Suite: An Introduction
==================================

.. note::

    This tutorial makes a couple of assumptions. The first assumption is that
    you have a basic knowledge of Salt. To get up to speed, check out the
    :ref:`Salt Walkthrough <tutorial-salt-walk-through>`.

    The second assumption is that your Salt development environment is already
    configured and that you have a basic understanding of contributing to the
    Salt codebase. If you're unfamiliar with either of these topics, please refer
    to the :ref:`Installing Salt for Development<installing-for-development>`
    and the :ref:`Contributing<contributing>` pages, respectively.

Salt comes with a powerful integration and unit test suite. The test suite
allows for the fully automated run of integration and/or unit tests from a
single interface.
Salt's test suite is located under the ``tests`` directory in the root of Salt's
code base and is divided into two main types of tests:
:ref:`unit tests and integration tests <integration-vs-unit>`. The ``unit`` and
``integration`` sub-test-suites are located in the ``tests`` directory, which is
where the majority of Salt's test cases are housed.

.. _getting_set_up_for_tests:

Getting Set Up For Tests
========================

First of all, you will need to ensure that ``nox`` is installed:

.. code-block:: bash

    pip install nox


Test Directory Structure
========================

As noted in the introduction to this tutorial, Salt's test suite is located in the
``tests`` directory in the root of Salt's code base. From there, the tests are divided
into two groups ``integration`` and ``unit``. Within each of these directories, the
directory structure roughly mirrors the directory structure of Salt's own codebase.
For example, the files inside ``tests/integration/modules`` contain tests for the
files located within ``salt/modules``.

.. note::

    ``tests/integration`` and ``tests/unit`` are the only directories discussed in
    this tutorial. With the exception of the ``tests/runtests.py`` file, which is
    used below in the `Running the Test Suite`_ section, the other directories and
    files located in ``tests`` are outside the scope of this tutorial.

.. _integration-vs-unit:

Integration vs. Unit
--------------------

Given that Salt's test suite contains two powerful, though very different, testing
approaches, when should you write integration tests and when should you write unit
tests?
Integration tests use Salt masters, minions, and a syndic to test salt functionality
directly and focus on testing the interaction of these components. Salt's integration
test runner includes functionality to run Salt execution modules, runners, states,
shell commands, salt-ssh commands, salt-api commands, and more. This provides a
tremendous ability to use Salt to test itself and makes writing such tests a breeze.
Integration tests are the preferred method of testing Salt functionality when
possible.

Unit tests do not spin up any Salt daemons, but instead find their value in testing
singular implementations of individual functions. Instead of testing against specific
interactions, unit tests should be used to test a function's logic, including its
exit point(s), such as any ``return`` or ``raise`` statements.
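As a minimal, self-contained sketch of testing both kinds of exit points (the ``normalize`` function and test names below are invented for illustration, not taken from the Salt codebase):

```python
import unittest


def normalize(name):
    # Two exit points: a raise for bad input, a return for good input.
    if not name:
        raise ValueError("name is required")
    return name.strip().lower()


class NormalizeTest(unittest.TestCase):
    def test_return(self):
        # Cover the return statement.
        self.assertEqual(normalize("  Fred "), "fred")

    def test_raise(self):
        # Cover the raise statement.
        with self.assertRaises(ValueError):
            normalize("")


suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormalizeTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```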
Unit tests are also useful in cases where writing an integration test might not be
possible. While the integration test suite is extremely powerful, unfortunately at
this time, it does not cover all functional areas of Salt's ecosystem. For example,
at the time of this writing, there is not a way to write integration tests for Proxy
Minions. Since the test runner will need to be adjusted to account for Proxy Minion
processes, unit tests can still provide some testing support in the interim by
testing the logic contained inside Proxy Minion functions.

Running the Test Suite
======================

Once all of the :ref:`requirements <getting_set_up_for_tests>` are installed, the
``nox`` command is used to run Salt's test suite:

.. code-block:: bash

    nox -e 'test-3(coverage=False)'

The command above, if executed without any options, will run the entire suite of
integration and unit tests. Some tests require certain flags to run, such as
destructive tests. If these flags are not included, then the test suite will only
perform the tests that don't require special attention.

At the end of the test run, you will see a summary output of the tests that passed,
failed, or were skipped.

You can pass any pytest options after the ``nox`` command like so:

.. code-block:: bash

    nox -e 'test-3(coverage=False)' -- tests/unit/modules/test_ps.py

The command above will run the tests in ``test_ps.py`` with the zeromq transport,
Python 3, and pytest. Pass any pytest options after ``--``.

Running Integration Tests
-------------------------

Salt's set of integration tests use Salt to test itself. The integration portion
of the test suite includes some built-in Salt daemons that will spin up in
preparation for the test run. This list of Salt daemon processes includes:

* 2 Salt Masters
* 2 Salt Minions
* 1 Salt Syndic

These various daemons are used to execute Salt commands and functionality within
the test suite, allowing you to write tests to assert against expected or
unexpected behaviors.

A simple example of a test utilizing a typical master/minion execution module command
is the test for the ``test_ping`` function in the
``tests/integration/modules/test_test.py`` file:

.. code-block:: python

    def test_ping(self):
        """
        test.ping
        """
        self.assertTrue(self.run_function("test.ping"))

The test above is a very simple example where the ``test.ping`` function is
executed by Salt's test suite runner and is asserting that the minion returned
with a ``True`` response.

.. _test-selection-options:

Test Selection Options
~~~~~~~~~~~~~~~~~~~~~~

If you want to run only a subset of tests, this is easily done with pytest. You only
need to point the test runner to the directory. For example, if you want to run all
integration module tests:

.. code-block:: bash

    nox -e 'test-3(coverage=False)' -- tests/integration/modules/


Running Unit Tests
------------------

If you want to run only the unit tests, you can just pass the unit test directory
as an option to the test runner.

The unit tests do not spin up any Salt testing daemons as the integration tests
do and execute very quickly compared to the integration tests.

.. code-block:: bash

    nox -e 'test-3(coverage=False)' -- tests/unit/


.. _running-specific-tests:

Running Specific Tests
----------------------

There are times when a specific test file, test class, or even a single,
individual test needs to be executed, such as when writing new tests. In these
situations, you should use the `pytest syntax`_ to select the specific tests.

For running a single test file, such as the pillar module test file in the
integration test directory, you must provide the file path:

.. code-block:: bash

    nox -e 'test-3(coverage=False)' -- tests/pytests/integration/modules/test_pillar.py

Some test files contain only one test class while other test files contain multiple
test classes. To run a specific test class within the file, append the name of
the test class to the end of the file path:

.. code-block:: bash

    nox -e 'test-3(coverage=False)' -- tests/pytests/integration/modules/test_pillar.py::PillarModuleTest

To run a single test within a file, append both the name of the test class the
individual test belongs to, as well as the name of the test itself:

.. code-block:: bash

    nox -e 'test-3(coverage=False)' -- tests/pytests/integration/modules/test_pillar.py::PillarModuleTest::test_data

The following command is an example of how to execute a single test found in
the ``tests/pytests/unit/modules/test_cp.py`` file:

.. code-block:: bash

    nox -e 'test-3(coverage=False)' -- tests/pytests/unit/modules/test_cp.py::CpTestCase::test_get_file_not_found


Writing Tests for Salt
======================

Once you're comfortable running tests, you can now start writing them! Be sure
to review the `Integration vs. Unit`_ section of this tutorial to determine what
type of test makes the most sense for the code you're testing.
.. note::

    There are many decorators, naming conventions, and code specifications
    required for Salt test files. We will not be covering all of these specifics
    in this tutorial. Please refer to the testing documentation links listed below
    in the `Additional Testing Documentation`_ section to learn more about these
    requirements.

In the following sections, the test examples assume the "new" test is added to
a test file that is already present and regularly running in the test suite and
is written with the correct requirements.


Writing Integration Tests
-------------------------

Since integration tests validate against a running environment, as explained in the
`Running Integration Tests`_ section of this tutorial, integration tests are very
easy to write and are generally the preferred method of writing Salt tests.
The following integration test is an example taken from the ``test.py`` file in the
``tests/integration/modules`` directory. This test uses the ``run_function`` method
to test the functionality of a traditional execution module command.
The ``run_function`` method uses the integration test daemons to execute a
``module.function`` command as you would with Salt. The minion runs the function and
returns. The test also uses `Python's Assert Functions`_ to test that the
minion's return is expected.

.. code-block:: python

    def test_ping(self):
        """
        test.ping
        """
        self.assertTrue(self.run_function("test.ping"))


Args can be passed in to the ``run_function`` method as well:

.. code-block:: python

    def test_echo(self):
        """
        test.echo
        """
        self.assertEqual(self.run_function("test.echo", ["text"]), "text")


The next example is taken from the
``tests/integration/modules/test_aliases.py`` file and
demonstrates how to pass kwargs to the ``run_function`` call. Also note that this
test uses another salt function to ensure the correct data is present (via the
``aliases.set_target`` call) before attempting to assert what the ``aliases.get_target``
call should return.

.. code-block:: python

    def test_set_target(self):
        """
        aliases.set_target and aliases.get_target
        """
        set_ret = self.run_function("aliases.set_target", alias="fred", target="bob")
        self.assertTrue(set_ret)
        tgt_ret = self.run_function("aliases.get_target", alias="fred")
        self.assertEqual(tgt_ret, "bob")

Using multiple Salt commands in this manner provides two useful benefits. The first is
that it provides some additional coverage for the ``aliases.set_target`` function.
The second benefit is that the call to ``aliases.get_target`` is not dependent on the
presence of any aliases set outside of this test. Tests should not be dependent on
the previous execution, success, or failure of other tests. They should be isolated
from other tests as much as possible.

While it might be tempting to build out a test file where tests depend on one another
before running, this should be avoided. SaltStack recommends that each test should
test a single functionality and not rely on other tests. Therefore, when possible,
individual tests should also be broken up into singular pieces. These are not
hard-and-fast rules, but serve more as recommendations to keep the test suite simple.
This helps with debugging code and related tests when failures occur and problems
are exposed. There may be instances where large tests use many asserts to set up a
use case that protects against potential regressions.
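A stdlib-only sketch of that isolation principle: the test below creates every piece of data it asserts on, so it cannot be affected by the outcome of any other test. The function name and file contents are hypothetical, not from Salt's suite:

```python
import os
import tempfile


def test_reads_own_data():
    # Set up the exact state the assertion depends on, inside the test itself.
    with tempfile.TemporaryDirectory() as tmp:
        path = os.path.join(tmp, "aliases")
        with open(path, "w") as fh:
            fh.write("fred: bob\n")
        # Assert against data this test created, not leftovers from another test.
        with open(path) as fh:
            assert "fred: bob" in fh.read()
    return True


passed = test_reads_own_data()
```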

.. note::

    The examples above all use the ``run_function`` option to test execution module
    functions in a traditional master/minion environment. To see examples of how to
    test other common Salt components such as runners, salt-api, and more, please
    refer to the :ref:`Integration Test Class Examples<integration-class-examples>`
    documentation.

Destructive vs Non-destructive Tests
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Since Salt is used to change the settings and behavior of systems, often the
best approach to run tests is to make actual changes to an underlying system.
This is where the concept of destructive integration tests comes into play.
Tests can be written to alter the system they are running on. This capability
is what fills in the gap needed to properly test aspects of system management
like package installation.

To write a destructive test, decorate the test function with the
``destructive_test`` marker:

.. code-block:: python

    @pytest.mark.destructive_test
    def test_pkg_install(salt_cli):
        ret = salt_cli.run("pkg.install", "finch")
        assert ret

Writing Unit Tests
------------------

As explained in the `Integration vs. Unit`_ section above, unit tests should be
written to test the *logic* of a function. This includes focusing on testing
``return`` and ``raises`` statements. Substantial effort should be made to mock
external resources that are used in the code being tested.
External resources that should be mocked include, but are not limited to, APIs,
function calls, external data either globally available or passed in through
function arguments, file data, etc. This practice helps to isolate unit tests to
test Salt logic. One handy way to think about writing unit tests is to "block
all of the exits". More information about how to properly mock external resources
can be found in Salt's :ref:`Unit Test<unit-tests>` documentation.

Salt's unit tests utilize Python's ``unittest.mock`` library, including `MagicMock`_. The
``@patch`` decorator is also heavily used when "blocking all the exits".
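The idea can be sketched outside of Salt with just the standard library. The ``fileserver`` class and ``get_file`` function below are stand-ins invented for this sketch, not Salt APIs:

```python
from unittest.mock import MagicMock, patch


class fileserver:
    # Stand-in for an external resource -- an "exit" we want to block in a
    # unit test, because calling it for real would leave pure logic testing.
    @staticmethod
    def hash_file(path):
        raise RuntimeError("would hit the network")


def get_file(path):
    # Logic under test: return an empty string when the hash lookup fails.
    if not fileserver.hash_file(path):
        return ""
    return path


# Block the exit: hash_file is replaced by a mock, so no real lookup happens
# and the test exercises only get_file's own logic.
with patch.object(fileserver, "hash_file", MagicMock(return_value=False)):
    result = get_file("salt://saltines")

assert result == ""
```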
A simple example of a unit test currently in use in Salt is the
``test_get_file_not_found`` test in the ``tests/pytests/unit/modules/test_cp.py`` file.
This test uses the ``@patch`` decorator and ``MagicMock`` to mock the return
of the call to Salt's ``cp.hash_file`` execution module function. This ensures
that we're testing the ``cp.get_file`` function directly, instead of inadvertently
testing the call to ``cp.hash_file``, which is used in ``cp.get_file``.

.. code-block:: python

    def test_get_file_not_found(self):
        """
        Test if get_file can't find the file.
        """
        with patch("salt.modules.cp.hash_file", MagicMock(return_value=False)):
            path = "salt://saltines"
            dest = "/srv/salt/cheese"
            ret = ""
            assert cp.get_file(path, dest) == ret

Note that Salt's ``cp`` module is imported at the top of the file, along with all
of the other necessary testing imports. The ``get_file`` function is then called
directly in the testing function, instead of using the ``run_function`` method as
the integration test examples above do.

The call to ``cp.get_file`` returns an empty string when a ``hash_file`` isn't found.
Therefore, the example above is a good illustration of a unit test "blocking
the exits" via the ``@patch`` decorator, as well as testing logic via asserting
against the ``return`` statement in the ``if`` clause. In this example we used the
Python ``assert`` statement to verify the return from ``cp.get_file``. Pytest allows
you to use these `asserts`_ when writing your tests and, in fact, plain `asserts`_
are the preferred way to assert anything in your tests. As Salt dives deeper into
Pytest, the use of ``unittest.TestCase`` will be replaced by plain test functions,
or test functions grouped in a class which **does not** subclass
``unittest.TestCase`` and which, consequently, cannot use the unittest assert
methods.

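A side-by-side sketch of the two styles, with trivial assertions chosen purely for illustration:

```python
import unittest


class EchoTest(unittest.TestCase):
    # Legacy style: a TestCase subclass using unittest assert methods.
    def test_echo(self):
        self.assertEqual("text".upper(), "TEXT")


def test_echo_plain():
    # Preferred pytest style: a plain function with a plain assert. pytest's
    # assertion rewriting supplies detailed failure messages on its own.
    assert "text".upper() == "TEXT"


# Both styles pass; exercise them directly here for illustration.
test_echo_plain()
suite = unittest.defaultTestLoader.loadTestsFromTestCase(EchoTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```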
There are more examples of writing unit tests of varying complexities available
in the following docs:
* :ref:`Simple Unit Test Example<simple-unit-example>`
* :ref:`Complete Unit Test Example<complete-unit-example>`
* :ref:`Complex Unit Test Example<complex-unit-example>`

.. note::

    Considerable care should be taken to ensure that you're testing something
    useful in your test functions. It is very easy to fall into a situation
    where you have mocked so much of the original function that the test
    results in only asserting against the data you have provided. This results
    in a poor and fragile unit test.


Add a python module dependency to the test run
----------------------------------------------

The test dependencies for python modules are managed under the ``requirements/static/ci``
directory. You will need to add your module to the appropriate file under ``requirements/static/ci``.

When ``pre-commit`` is run it will create all of the needed requirement files
under ``requirements/static/ci/py3{6,7,8,9}``. Nox will then use these files to
install the requirements for the tests.

Add a system dependency to the test run
---------------------------------------

If you need to add a system dependency for the test run, this will need to be added in
the `salt-ci-images`_ repo. This repo uses salt states to install system dependencies.
You need to update the ``state-tree/golden-images-provision.sls`` file with
your dependency to ensure it is installed. Once your PR is merged the core team
will need to promote the new images with your new dependency installed.

Checking for Log Messages
=========================

To test whether a given log message has been emitted, the following pattern
can be used:

.. code-block:: python

    def test_issue_58763_a(tmp_path, modules, state_tree, caplog):
        venv_dir = tmp_path / "issue-2028-pip-installed"
        sls_contents = """
        test.random_hash:
          module.run:
            - size: 10
            - hash_type: md5
        """
        with pytest.helpers.temp_file("issue-58763.sls", sls_contents, state_tree):
            with caplog.at_level(logging.DEBUG):
                ret = modules.state.sls(
                    mods="issue-58763",
                )
                assert len(ret.raw) == 1
                for k in ret.raw:
                    assert ret.raw[k]["result"] is True
                assert (
                    "Detected legacy module.run syntax: test.random_hash"
                    in caplog.messages
                )
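Outside of pytest's ``caplog`` fixture, the same check can be sketched with the standard library alone. The ``ListHandler`` below is an illustrative helper, not part of Salt's suite, and the log line is emitted directly rather than by code under test:

```python
import logging


class ListHandler(logging.Handler):
    """Collect formatted log messages in a list, similar to caplog.messages."""

    def __init__(self):
        super().__init__(level=logging.DEBUG)
        self.messages = []

    def emit(self, record):
        self.messages.append(record.getMessage())


log = logging.getLogger("example")
log.setLevel(logging.DEBUG)
handler = ListHandler()
log.addHandler(handler)

# Code under test would emit this; here we emit it directly for the sketch.
log.debug("Detected legacy module.run syntax: test.random_hash")

assert "Detected legacy module.run syntax: test.random_hash" in handler.messages
```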

Test Groups
===========

Salt's tests are divided into four groups:

* fast - Tests that are ~10s or faster. Fast tests make up ~75% of tests and can run in 10 to 20 minutes.
* slow - Tests that are ~10s or slower.
* core - Tests of any speed that test the root parts of salt.
* flaky-jail - Tests that need to be temporarily skipped.

Pytest Decorators
-----------------

* @pytest.mark.slow_test
* @pytest.mark.core_test
* @pytest.mark.flaky_jail

.. code-block:: python

    @pytest.mark.core_test
    def test_ping(self):
        """
        test.ping
        """
        self.assertTrue(self.run_function("test.ping"))

You can also mark all the tests in a file:

.. code-block:: python

    pytestmark = [
        pytest.mark.core_test,
    ]


    def test_ping(self):
        """
        test.ping
        """
        self.assertTrue(self.run_function("test.ping"))


    def test_ping2(self):
        """
        test.ping
        """
        for _ in range(10):
            self.assertTrue(self.run_function("test.ping"))

You can enable or disable test groups locally by passing their respective flag:

* --no-fast-tests
* --slow-tests
* --core-tests
* --flaky-jail

In your PR, you can enable or disable test groups by setting a label, although
the fast, slow, and core tests specified in the change file will always run.

* test:no-fast
* test:slow
* test:core
* test:flaky-jail

Additional Testing Documentation
================================

In addition to this tutorial, there are some other helpful resources and documentation
that go into more depth on Salt's test runner, writing tests for Salt code, and general
Python testing documentation. Please see the following references for more information:

* :ref:`Salt's Test Suite Documentation<salt-test-suite>`
* :ref:`Integration Tests<integration-tests>`
* :ref:`Unit Tests<unit-tests>`
* `MagicMock`_
* `Python Unittest`_
* `Python's Assert Functions`_
.. _asserts: https://docs.pytest.org/en/latest/assert.html
.. _pytest syntax: https://docs.pytest.org/en/latest/usage.html#specifying-tests-selecting-tests
.. _MagicMock: https://docs.python.org/3/library/unittest.mock.html
.. _Python Unittest: https://docs.python.org/3/library/unittest.html
.. _Python's Assert Functions: https://docs.python.org/3/library/unittest.html#assert-methods
.. _salt-ci-images: https://github.com/saltstack/salt-ci-images