Merge remote-tracking branch 'saltstack/3007.x' into merge/master/3007.x

Daniel A. Wozniak 2025-02-17 00:09:58 -07:00
commit a017805680
587 changed files with 17463 additions and 19759 deletions

View file

@ -1,11 +1,8 @@
blank_issues_enabled: true
contact_links:
- name: Salt Community Slack
url: https://saltstackcommunity.slack.com/
- name: Salt Community Discord
url: https://discord.com/invite/J7b7EscrAs
about: Please ask and answer questions here.
- name: Salt-Users Forum
url: https://groups.google.com/forum/#!forum/salt-users
about: Please ask and answer questions here.
- name: Salt on LiberaChat
url: https://web.libera.chat/#salt
about: Please ask and answer questions here.

View file

@ -8,7 +8,7 @@ assignees: ''
---
### Description of the tech debt to be addressed, include links and screenshots
<!-- Note: Please direct questions to the salt-users google group, IRC or Community Slack. -->
<!-- Note: Please direct questions to the salt-users google group, GitHub Discussions or Community Discord. -->
### Versions Report
(Provided by running `salt --versions-report`. Please also mention any differences in master/minion versions.)

View file

@ -11,7 +11,9 @@ Remove this section if not relevant
### Merge requirements satisfied?
**[NOTICE] Bug fixes or features added to Salt require tests.**
<!-- Please review the [test documentation](https://docs.saltproject.io/en/master/topics/tutorials/writing_tests.html) for details on how to implement tests into Salt's test suite. -->
<!-- Please review the test documentation for details on how to implement tests
into Salt's test suite:
https://docs.saltproject.io/en/master/topics/tutorials/writing_tests.html -->
- [ ] Docs
- [ ] Changelog - https://docs.saltproject.io/en/master/topics/development/changelog.html
- [ ] Tests written/updated
@ -19,7 +21,13 @@ Remove this section if not relevant
### Commits signed with GPG?
Yes/No
Please review [Salt's Contributing Guide](https://docs.saltproject.io/en/master/topics/development/contributing.html) for best practices, including the
[PR Guidelines](https://docs.saltproject.io/en/master/topics/development/pull_requests.html).
<!-- Please review Salt's Contributing Guide for best practices and guidance in
choosing the right branch:
https://docs.saltproject.io/en/master/topics/development/contributing.html -->
See GitHub's [page on GPG signing](https://help.github.com/articles/signing-commits-using-gpg/) for more information about signing commits with GPG.
<!-- Additional guidance for pull requests can be found here:
https://docs.saltproject.io/en/master/topics/development/pull_requests.html -->
<!-- See GitHub's page on GPG signing for more information about signing commits
with GPG:
https://help.github.com/articles/signing-commits-using-gpg/ -->

View file

@ -1,14 +1,5 @@
self-hosted-runner:
# Labels of self-hosted runner in array of string
labels:
- bastion
- x86_64
- arm64
- aarch64
- amd64
- repo-nightly
- repo-staging
- repo-release
- medium
- large
- macos-13-xlarge
- linux-x86_64
- linux-arm64

View file

@ -26,10 +26,6 @@ inputs:
description: 'Check if a cache entry exists for the given input(s) (key, restore-keys) without downloading the cache'
default: 'false'
required: false
save-always:
description: 'Run the post step to save the cache even if another step before fails'
default: 'false'
required: false
outputs:
cache-hit:
@ -49,7 +45,6 @@ runs:
echo "GHA_CACHE_ENABLE_CROSS_OS_ARCHIVE=${{ inputs.enableCrossOsArchive }}" | tee -a "${GITHUB_ENV}"
echo "GHA_CACHE_FAIL_ON_CACHE_MISS=${{ inputs.fail-on-cache-miss }}" | tee -a "${GITHUB_ENV}"
echo "GHA_CACHE_LOOKUP_ONLY=${{ inputs.lookup-only }}" | tee -a "${GITHUB_ENV}"
echo "GHA_CACHE_SAVE_ALWAYS=${{ inputs.save-always }}" | tee -a "${GITHUB_ENV}"
echo "GHA_CACHE_RESTORE_KEYS=${{ inputs.restore-keys }}" | tee -a "${GITHUB_ENV}"
echo "GHA_CACHE_UPLOAD_CHUNK_SIZE=${{ inputs.upload-chunk-size }}" | tee -a "${GITHUB_ENV}"
@ -63,7 +58,6 @@ runs:
enableCrossOsArchive: ${{ env.GHA_CACHE_ENABLE_CROSS_OS_ARCHIVE }}
fail-on-cache-miss: ${{ env.GHA_CACHE_FAIL_ON_CACHE_MISS }}
lookup-only: ${{ env.GHA_CACHE_LOOKUP_ONLY }}
save-always: ${{ env.GHA_CACHE_SAVE_ALWAYS }}
restore-keys: ${{ env.GHA_CACHE_RESTORE_KEYS }}
upload-chunk-size: ${{ env.GHA_CACHE_UPLOAD_CHUNK_SIZE }}
@ -97,7 +91,6 @@ runs:
enableCrossOsArchive: ${{ env.GHA_CACHE_ENABLE_CROSS_OS_ARCHIVE }}
fail-on-cache-miss: ${{ env.GHA_CACHE_FAIL_ON_CACHE_MISS }}
lookup-only: ${{ env.GHA_CACHE_LOOKUP_ONLY }}
save-always: ${{ env.GHA_CACHE_SAVE_ALWAYS }}
restore-keys: ${{ env.GHA_CACHE_RESTORE_KEYS }}
upload-chunk-size: ${{ env.GHA_CACHE_UPLOAD_CHUNK_SIZE }}

View file

@ -54,10 +54,13 @@ runs:
working-directory: ${{ inputs.cwd }}
run: |
PYTHON_EXE=${{ steps.tools-virtualenv.outputs.python-executable }}
${PYTHON_EXE} -m ensurepip --upgrade
(${PYTHON_EXE} -m pip install --help | grep break-system-packages > /dev/null 2>&1) && exitcode=0 || exitcode=1
if [ $exitcode -eq 0 ]; then
${PYTHON_EXE} -m pip install --break-system-packages --upgrade setuptools
${PYTHON_EXE} -m pip install --break-system-packages -r requirements/static/ci/py${{ steps.get-python-version.outputs.version }}/tools.txt
else
${PYTHON_EXE} -m pip install --upgrade setuptools
${PYTHON_EXE} -m pip install -r requirements/static/ci/py${{ steps.get-python-version.outputs.version }}/tools.txt
fi

.github/actions/ssh-tunnel/README.md vendored Normal file (94 lines changed)
View file

@ -0,0 +1,94 @@
# SSH Tunnel
The ssh-tunnel action creates a reverse tunnel over WebRTC to port 22 on the runner.
## Usage
In order to use this action you must have an SDP offer from your local host and an SSH key pair.
Start by creating an SDP offer on your local machine. Provide these values to the ssh-tunnel
action and wait for the action to output the SDP reply. Provide the reply to the local
rtcforward.py process by pasting it to stdin. If all goes well, the local port on your machine
will be forwarded to the SSH port on the runner.
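A minimal sketch of preparing the key pair is shown below; the key type and file path are only an example, not something the action requires.
```bash
# Generate a throwaway ed25519 key pair for the tunnel session.
# The file path ~/.ssh/gha_tunnel is arbitrary; use whatever you prefer.
ssh-keygen -t ed25519 -f ~/.ssh/gha_tunnel -N ""
# The public key is what gets authorized for SSH access on the runner.
cat ~/.ssh/gha_tunnel.pub
```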
### Getting an SDP offer
To get an SDP offer, start rtcforward.py on your local machine with the offer command.
You can also specify which port on the local machine will be used for the tunnel.
``` bash
$ python3 .github/actions/ssh-tunnel/rtcforward.py offer --port 5222
```
rtcforward.py will create an offer and display it in your terminal. (This example offer has been truncated.)
After showing the offer, the `rtcforward.py` process will wait for a reply.
```
-- offer --
eyJzZHAiOiAidj0wXHJcbm89LSAzOTQ3Mzg4NjUzIDM5NDczODg2NTMgSU4gSVA0IDAuMC4wLjBcclxu
cz0tXHJcbnQ9MCAwXHJcbmE9Z3JvdXA6QlVORExFIDBcclxuYT1tc2lkLXNlbWFudGljOldNUyAqXHJc
bm09YXBwbGljYXRpb24gMzUyNjkgRFRMUy9TQ1RQIDUwMDBcclxuYz1JTiBJUDQgMTkyLjE2OC4wLjIw
IHVkcCAxNjk0NDk4ODE1IDE4NC4xNzkuMjEwLjE1MiAzNTI2OSB0eXAgc3JmbHggcmFkZHIgMTkyLjE2
OC4wLjIwMSBycG9ydCAzNTI2OVxyXG5hPWNhbmRpZGF0ZTozZWFjMzJiZTZkY2RkMTAwZDcwMTFiNWY0
NTo4Qzo2MDoxMTpFQTo3NzpDMTo5RTo1QTo3QzpDQzowRDowODpFQzo2NDowQToxM1xyXG5hPWZpbmdl
cnByaW50OnNoYS01MTIgNjY6MzI6RUQ6MDA6N0I6QjY6NTQ6NzA6MzE6OTA6M0I6Mjg6Q0I6QTk6REU6
MzQ6QjI6NDY6NzE6NUI6MjM6ODA6Nzg6Njg6RDA6QTA6QTg6MjU6QkY6MDQ6ODY6NUY6OTA6QUY6MUQ6
QjA6QzY6ODA6QUY6OTc6QTI6MkM6NDI6QUU6MkI6Q0Q6Mjk6RUQ6MkI6ODc6NTU6ODg6NDY6QTM6ODk6
OEY6ODk6OTE6QTE6QTI6NDM6NTc6M0E6MjZcclxuYT1zZXR1cDphY3RwYXNzXHJcbiIsICJ0eXBlIjog
Im9mZmVyIn0=
-- end offer --
-- Please enter a message from remote party --
```
### Getting an SDP answer
Provide the offer to the ssh-tunnel action. When the action runs, an answer to the offer will be generated.
In the action output you will see that the offer was received, along with the generated reply.
```
-- Please enter a message from remote party --
-- Message received --
-- reply --
eyJzZHAiOiAidj0wXHJcbm89LSAzOTQ3Mzg3NDcxIDM5NDczODc0NzEgSU4gSVA0IDAuMC4wLjBcclxu
cz0tXHJcbnQ9MCAwXHJcbmE9Z3JvdXA6QlVORExFIDBcclxuYT1tc2lkLXNlbWFudGljOldNUyAqXHJc
bm09YXBwbGljYXRpb24gNTcwMzkgRFRMUy9TQ1RQIDUwMDBcclxuYz1JTiBJUDQgMTkyLjE2OC42NC4x
MFxyXG5hPW1pZDowXHJcbmE9c2N0cG1hcDo1MDAwIHdlYnJ0Yy1kYXRhY2hhbm5lbCA2NTUzNVxyXG5h
MTc6MEI6RTA6OTA6QUM6RjU6RTk6RUI6Q0E6RUE6NTY6REI6NTA6QTk6REY6NTU6MzY6MkM6REI6OUE6
MDc6Mzc6QTM6NDc6NjlcclxuYT1maW5nZXJwcmludDpzaGEtNTEyIDMyOjRDOjk0OkRDOjNFOkU5OkU3
OjNCOjc5OjI4OjZDOjc5OkFEOkVDOjIzOkJDOjRBOjRBOjE5OjlCOjg5OkE3OkE2OjZBOjAwOjJFOkM5
OkE0OjlEOjAwOjM0OjFFOjRDOkVGOjcwOkY5OkNBOjg0OjlEOjcxOjI5OkVCOkIxOkREOkFEOjg5OjUx
OkZFOjhCOjI3OjFDOjFBOkJEOjUxOjQ2OjE4OjBBOjhFOjVBOjI1OjQzOjQzOjZGOkRBXHJcbmE9c2V0
dXA6YWN0aXZlXHJcbiIsICJ0eXBlIjogImFuc3dlciJ9
-- end reply --
```
### Finalizing the tunnel
Paste the SDP reply from the running action into the `rtcforward.py` process that created the offer.
After the reply is received you will see `-- Message received --` and the tunnel will be created.
```
-- offer --
eyJzZHAiOiAidj0wXHJcbm89LSAzOTQ3Mzg4NjUzIDM5NDczODg2NTMgSU4gSVA0IDAuMC4wLjBcclxu
cz0tXHJcbnQ9MCAwXHJcbmE9Z3JvdXA6QlVORExFIDBcclxuYT1tc2lkLXNlbWFudGljOldNUyAqXHJc
bm09YXBwbGljYXRpb24gMzUyNjkgRFRMUy9TQ1RQIDUwMDBcclxuYz1JTiBJUDQgMTkyLjE2OC4wLjIw
IHVkcCAxNjk0NDk4ODE1IDE4NC4xNzkuMjEwLjE1MiAzNTI2OSB0eXAgc3JmbHggcmFkZHIgMTkyLjE2
OC4wLjIwMSBycG9ydCAzNTI2OVxyXG5hPWNhbmRpZGF0ZTozZWFjMzJiZTZkY2RkMTAwZDcwMTFiNWY0
NTo4Qzo2MDoxMTpFQTo3NzpDMTo5RTo1QTo3QzpDQzowRDowODpFQzo2NDowQToxM1xyXG5hPWZpbmdl
cnByaW50OnNoYS01MTIgNjY6MzI6RUQ6MDA6N0I6QjY6NTQ6NzA6MzE6OTA6M0I6Mjg6Q0I6QTk6REU6
MzQ6QjI6NDY6NzE6NUI6MjM6ODA6Nzg6Njg6RDA6QTA6QTg6MjU6QkY6MDQ6ODY6NUY6OTA6QUY6MUQ6
QjA6QzY6ODA6QUY6OTc6QTI6MkM6NDI6QUU6MkI6Q0Q6Mjk6RUQ6MkI6ODc6NTU6ODg6NDY6QTM6ODk6
OEY6ODk6OTE6QTE6QTI6NDM6NTc6M0E6MjZcclxuYT1zZXR1cDphY3RwYXNzXHJcbiIsICJ0eXBlIjog
Im9mZmVyIn0=
-- end offer --
-- Please enter a message from remote party --
eyJzZHAiOiAidj0wXHJcbm89LSAzOTQ3Mzg3NDcxIDM5NDczODc0NzEgSU4gSVA0IDAuMC4wLjBcclxu
cz0tXHJcbnQ9MCAwXHJcbmE9Z3JvdXA6QlVORExFIDBcclxuYT1tc2lkLXNlbWFudGljOldNUyAqXHJc
bm09YXBwbGljYXRpb24gNTcwMzkgRFRMUy9TQ1RQIDUwMDBcclxuYz1JTiBJUDQgMTkyLjE2OC42NC4x
MFxyXG5hPW1pZDowXHJcbmE9c2N0cG1hcDo1MDAwIHdlYnJ0Yy1kYXRhY2hhbm5lbCA2NTUzNVxyXG5h
MTc6MEI6RTA6OTA6QUM6RjU6RTk6RUI6Q0E6RUE6NTY6REI6NTA6QTk6REY6NTU6MzY6MkM6REI6OUE6
MDc6Mzc6QTM6NDc6NjlcclxuYT1maW5nZXJwcmludDpzaGEtNTEyIDMyOjRDOjk0OkRDOjNFOkU5OkU3
OjNCOjc5OjI4OjZDOjc5OkFEOkVDOjIzOkJDOjRBOjRBOjE5OjlCOjg5OkE3OkE2OjZBOjAwOjJFOkM5
OkE0OjlEOjAwOjM0OjFFOjRDOkVGOjcwOkY5OkNBOjg0OjlEOjcxOjI5OkVCOkIxOkREOkFEOjg5OjUx
OkZFOjhCOjI3OjFDOjFBOkJEOjUxOjQ2OjE4OjBBOjhFOjVBOjI1OjQzOjQzOjZGOkRBXHJcbmE9c2V0
dXA6YWN0aXZlXHJcbiIsICJ0eXBlIjogImFuc3dlciJ9
-- Message received --
```
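Assuming the action forwarded local port 5222 as in the offer example above, you can then reach the runner's SSH daemon through the tunnel. The user name and key path below are assumptions; GitHub-hosted Ubuntu runners typically use the `runner` user.
```bash
# Connect through the forwarded local port; adjust the user, port, and key path to your setup.
ssh -i ~/.ssh/gha_tunnel -p 5222 runner@127.0.0.1
```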

.github/config.yml vendored (16 lines changed)
View file

@ -11,18 +11,16 @@ newIssueWelcomeComment: >
Also, check out some of our community
resources including:
- [Community Wiki](https://github.com/saltstack/community/wiki)
- [Salt's Contributor Guide](https://docs.saltproject.io/en/master/topics/development/contributing.html)
- [Join our Community Slack](https://via.vmw.com/salt-slack)
- [IRC on LiberaChat](https://web.libera.chat/#salt)
- [Join our Community Discord](https://discord.com/invite/J7b7EscrAs)
- [Salt Project YouTube channel](https://www.youtube.com/channel/UCpveTIucFx9ljGelW63-BWg)
- [Salt Project Twitch channel](https://www.twitch.tv/saltprojectoss)
- [Community Wiki](https://github.com/saltstack/community/wiki)
There are lots of ways to get involved in our community. Every month, there are around a dozen
opportunities to meet with other contributors and the Salt Core team and collaborate in real
time. The best way to keep track is by subscribing to the Salt Community Events Calendar.
If you have additional questions, email us at saltproject@vmware.com. We're glad
If you have additional questions, email us at saltproject.pdl@broadcom.com. We're glad
you've joined our community and look forward to doing awesome things with
you!
@ -37,18 +35,16 @@ newPRWelcomeComment: >
Also, check out some of our community
resources including:
- [Community Wiki](https://github.com/saltstack/community/wiki)
- [Salt's Contributor Guide](https://docs.saltproject.io/en/master/topics/development/contributing.html)
- [Join our Community Slack](https://via.vmw.com/salt-slack)
- [IRC on LiberaChat](https://web.libera.chat/#salt)
- [Join our Community Discord](https://discord.com/invite/J7b7EscrAs)
- [Salt Project YouTube channel](https://www.youtube.com/channel/UCpveTIucFx9ljGelW63-BWg)
- [Salt Project Twitch channel](https://www.twitch.tv/saltprojectoss)
- [Community Wiki](https://github.com/saltstack/community/wiki)
There are lots of ways to get involved in our community. Every month, there are around a dozen
opportunities to meet with other contributors and the Salt Core team and collaborate in real
time. The best way to keep track is by subscribing to the Salt Community Events Calendar.
If you have additional questions, email us at saltproject@vmware.com. We're glad
If you have additional questions, email us at saltproject.pdl@broadcom.com. We're glad
you've joined our community and look forward to doing awesome things with
you!

View file

@ -20,12 +20,14 @@ jobs:
github.event.pull_request.merged == true
&& (
contains(github.event.pull_request.labels.*.name, 'backport:master') ||
contains(github.event.pull_request.labels.*.name, 'backport:3007.x') ||
contains(github.event.pull_request.labels.*.name, 'backport:3006.x') ||
contains(github.event.pull_request.labels.*.name, 'backport:3005.x')
)
&& (
(github.event.action == 'labeled' && (
contains(github.event.pull_request.labels.*.name, 'backport:master') ||
contains(github.event.pull_request.labels.*.name, 'backport:3007.x') ||
contains(github.event.pull_request.labels.*.name, 'backport:3006.x') ||
contains(github.event.pull_request.labels.*.name, 'backport:3005.x')
))

View file

@ -34,6 +34,14 @@ on:
type: string
description: The onedir package name to use
default: salt
matrix:
required: true
type: string
description: Json job matrix config
linux_arm_runner:
required: true
type: string
description: Json job matrix config
env:
@ -48,52 +56,22 @@ env:
jobs:
generate-matrix:
name: Generate Matrix
runs-on: ubuntu-latest
outputs:
matrix-include: ${{ steps.generate-matrix.outputs.matrix }}
env:
PIP_INDEX_URL: https://pypi.org/simple
steps:
- name: "Throttle Builds"
shell: bash
run: |
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cache-prefix: ${{ inputs.cache-prefix }}
env:
PIP_INDEX_URL: https://pypi.org/simple
- name: Generate Test Matrix
id: generate-matrix
run: |
tools ci deps-matrix
linux-dependencies:
name: Linux
needs:
- generate-matrix
if: ${{ toJSON(fromJSON(inputs.matrix)['linux']) != '[]' }}
runs-on:
- self-hosted
- linux
- bastion
- ${{ matrix.arch == 'x86_64' && 'ubuntu-24.04' || inputs.linux_arm_runner }}
env:
USE_S3_CACHE: 'true'
USE_S3_CACHE: 'false'
timeout-minutes: 90
strategy:
fail-fast: false
matrix:
include: ${{ fromJSON(needs.generate-matrix.outputs.matrix-include)['linux'] }}
include: ${{ fromJSON(inputs.matrix)['linux'] }}
steps:
- uses: actions/setup-python@v5
with:
python-version: '3.10'
- name: "Throttle Builds"
shell: bash
@ -103,6 +81,10 @@ jobs:
- name: Checkout Source Code
uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: '3.10'
- name: Cache nox.linux.${{ matrix.arch }}.tar.* for session ${{ inputs.nox-session }}
id: nox-dependencies-cache
uses: ./.github/actions/cache
@ -136,53 +118,34 @@ jobs:
with:
cache-prefix: ${{ inputs.cache-prefix }}-build-deps-ci
- name: Get Salt Project GitHub Actions Bot Environment
- name: Install System Dependencies
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
run: |
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
echo true
- name: Start VM
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
id: spin-up-vm
run: |
tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ matrix.distro-slug }}
- name: List Free Space
- name: Install Nox
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
run: |
tools --timestamps vm ssh ${{ matrix.distro-slug }} -- df -h || true
- name: Upload Checkout To VM
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
run: |
tools --timestamps vm rsync ${{ matrix.distro-slug }}
python3 -m pip install 'nox==${{ inputs.nox-version }}'
- name: Install Dependencies
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
env:
PRINT_TEST_SELECTION: "0"
PRINT_SYSTEM_INFO: "0"
RELENV_BUILDENV: "1"
run: |
tools --timestamps vm install-dependencies --nox-session=${{ inputs.nox-session }} ${{ matrix.distro-slug }}
nox --install-only -e ${{ inputs.nox-session }}
- name: Cleanup .nox Directory
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
run: |
tools --timestamps vm pre-archive-cleanup ${{ matrix.distro-slug }}
nox --force-color -e "pre-archive-cleanup(pkg=False)"
- name: Compress .nox Directory
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
run: |
tools --timestamps vm compress-dependencies ${{ matrix.distro-slug }}
- name: Download Compressed .nox Directory
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
run: |
tools --timestamps vm download-dependencies ${{ matrix.distro-slug }}
- name: Destroy VM
if: always() && steps.nox-dependencies-cache.outputs.cache-hit != 'true'
run: |
tools --timestamps vm destroy --no-wait ${{ matrix.distro-slug }}
nox --force-color -e compress-dependencies -- linux ${{ matrix.arch }}
- name: Upload Nox Requirements Tarball
uses: actions/upload-artifact@v4
@ -192,14 +155,13 @@ jobs:
macos-dependencies:
name: MacOS
needs:
- generate-matrix
runs-on: ${{ matrix.distro-slug == 'macos-13-arm64' && 'macos-13-xlarge' || matrix.distro-slug }}
runs-on: ${{ matrix.arch == 'x86_64' && 'macos-13' || 'macos-14' }}
if: ${{ toJSON(fromJSON(inputs.matrix)['macos']) != '[]' }}
timeout-minutes: 90
strategy:
fail-fast: false
matrix:
include: ${{ fromJSON(needs.generate-matrix.outputs.matrix-include)['macos'] }}
include: ${{ fromJSON(inputs.matrix)['macos'] }}
env:
PIP_INDEX_URL: https://pypi.org/simple
steps:
@ -278,21 +240,19 @@ jobs:
name: nox-macos-${{ matrix.arch }}-${{ inputs.nox-session }}
path: nox.macos.${{ matrix.arch }}.tar.*
windows-dependencies:
needs:
- generate-matrix
name: Windows
runs-on:
- self-hosted
- linux
- bastion
runs-on: windows-latest
if: ${{ toJSON(fromJSON(inputs.matrix)['windows']) != '[]' }}
env:
USE_S3_CACHE: 'true'
USE_S3_CACHE: 'false'
GITHUB_WORKSPACE: 'C:\Windows\Temp\testing'
timeout-minutes: 90
strategy:
fail-fast: false
matrix:
include: ${{ fromJSON(needs.generate-matrix.outputs.matrix-include)['windows'] }}
include: ${{ fromJSON(inputs.matrix)['windows'] }}
steps:
- name: "Throttle Builds"
@ -300,6 +260,10 @@ jobs:
run: |
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
- name: "Show environment"
run: |
env
- name: Checkout Source Code
uses: actions/checkout@v4
@ -325,10 +289,11 @@ jobs:
cd artifacts
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-windows-${{ matrix.arch }}.tar.xz
- name: PyPi Proxy
- name: Set up Python ${{ inputs.python-version }}
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
run: |
sed -i '7s;^;--index-url=${{ vars.PIP_INDEX_URL }} --trusted-host ${{ vars.PIP_TRUSTED_HOST }} --extra-index-url=${{ vars.PIP_EXTRA_INDEX_URL }}\n;' requirements/static/ci/*/*.txt
uses: actions/setup-python@v5
with:
python-version: "${{ inputs.python-version }}"
- name: Setup Python Tools Scripts
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
@ -336,53 +301,33 @@ jobs:
with:
cache-prefix: ${{ inputs.cache-prefix }}-build-deps-ci
- name: Get Salt Project GitHub Actions Bot Environment
- name: Install System Dependencies
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
run: |
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
echo true
- name: Start VM
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
id: spin-up-vm
run: |
tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ matrix.distro-slug }}
- name: List Free Space
- name: Install Nox
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
run: |
tools --timestamps vm ssh ${{ matrix.distro-slug }} -- df -h || true
- name: Upload Checkout To VM
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
run: |
tools --timestamps vm rsync ${{ matrix.distro-slug }}
python3 -m pip install 'nox==${{ inputs.nox-version }}'
- name: Install Dependencies
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
env:
PRINT_TEST_SELECTION: "0"
PRINT_SYSTEM_INFO: "0"
run: |
tools --timestamps vm install-dependencies --nox-session=${{ inputs.nox-session }} ${{ matrix.distro-slug }}
nox --install-only -e ${{ inputs.nox-session }}
- name: Cleanup .nox Directory
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
run: |
tools --timestamps vm pre-archive-cleanup ${{ matrix.distro-slug }}
nox --force-color -e "pre-archive-cleanup(pkg=False)"
- name: Compress .nox Directory
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
run: |
tools --timestamps vm compress-dependencies ${{ matrix.distro-slug }}
- name: Download Compressed .nox Directory
if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
run: |
tools --timestamps vm download-dependencies ${{ matrix.distro-slug }}
- name: Destroy VM
if: always() && steps.nox-dependencies-cache.outputs.cache-hit != 'true'
run: |
tools --timestamps vm destroy --no-wait ${{ matrix.distro-slug }}
nox --force-color -e compress-dependencies -- windows ${{ matrix.arch }}
- name: Upload Nox Requirements Tarball
uses: actions/upload-artifact@v4

View file

@ -8,12 +8,6 @@ on:
type: string
required: true
description: The Salt version to set prior to building packages.
github-hosted-runners:
type: boolean
required: true
self-hosted-runners:
type: boolean
required: true
cache-seed:
required: true
type: string
@ -26,6 +20,14 @@ on:
required: true
type: string
description: The version of python to use with relenv
matrix:
required: true
type: string
description: Json job matrix config
linux_arm_runner:
required: true
type: string
description: Json job matrix config
env:
RELENV_DATA: "${{ github.workspace }}/.relenv"
@ -41,20 +43,15 @@ jobs:
build-deps-linux:
name: Linux
if: ${{ inputs.self-hosted-runners }}
if: ${{ toJSON(fromJSON(inputs.matrix)['linux']) != '[]' }}
runs-on:
- ${{ matrix.arch == 'x86_64' && 'ubuntu-24.04' || inputs.linux_arm_runner }}
strategy:
fail-fast: false
matrix:
arch:
- x86_64
- arm64
runs-on:
- self-hosted
- linux
- medium
- ${{ matrix.arch }}
include: ${{ fromJSON(inputs.matrix)['linux'] }}
env:
USE_S3_CACHE: 'true'
USE_S3_CACHE: 'false'
steps:
- name: "Fail"
run: exit 1
@ -66,6 +63,10 @@ jobs:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: '3.10'
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
@ -91,19 +92,23 @@ jobs:
build-deps-macos:
name: macOS
if: ${{ inputs.github-hosted-runners }}
if: ${{ toJSON(fromJSON(inputs.matrix)['macos']) != '[]' }}
strategy:
fail-fast: false
max-parallel: 2
matrix:
arch: ${{ github.event.repository.fork && fromJSON('["x86_64"]') || fromJSON('["x86_64", "arm64"]') }}
include: ${{ fromJSON(inputs.matrix)['macos'] }}
runs-on:
- ${{ matrix.arch == 'arm64' && 'macos-13-xlarge' || 'macos-12' }}
- ${{ matrix.arch == 'arm64' && 'macos-14' || 'macos-13' }}
env:
USE_S3_CACHE: 'false'
PIP_INDEX_URL: https://pypi.org/simple
steps:
- name: "Check cores"
shell: bash
run: sysctl -n hw.ncpu
- name: "Throttle Builds"
shell: bash
run: |
@ -141,14 +146,12 @@ jobs:
build-deps-windows:
name: Windows
if: ${{ inputs.github-hosted-runners }}
if: ${{ toJSON(fromJSON(inputs.matrix)['windows']) != '[]' }}
strategy:
fail-fast: false
max-parallel: 2
matrix:
arch:
- x86
- amd64
include: ${{ fromJSON(inputs.matrix)['windows'] }}
runs-on: windows-latest
env:
USE_S3_CACHE: 'false'

View file

@ -24,19 +24,21 @@ jobs:
build:
name: Build
runs-on:
- ubuntu-latest
- ubuntu-22.04
strategy:
fail-fast: false
matrix:
docs-output:
- linkcheck
- spellcheck
# XXX re-enable lintcheck and spellcheck then fix the errors
# - linkcheck
# - spellcheck
- html
- epub
# - pdf
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: '3.10'
- name: Download Release Patch
if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}

View file

@ -36,6 +36,14 @@ on:
required: true
type: string
description: Seed used to invalidate caches
matrix:
required: true
type: string
description: Json job matrix config
linux_arm_runner:
required: true
type: string
description: Json job matrix config
env:
COLUMNS: 190
@ -46,19 +54,199 @@ env:
jobs:
build-deb-packages:
name: DEB
if: ${{ toJSON(fromJSON(inputs.matrix)['linux']) != '[]' }}
runs-on:
- ${{ matrix.arch == 'x86_64' && 'ubuntu-24.04' || inputs.linux_arm_runner }}
strategy:
fail-fast: false
matrix:
include: ${{ fromJSON(inputs.matrix)['linux'] }}
container:
image: ghcr.io/saltstack/salt-ci-containers/packaging:debian-12
steps:
# Checkout here so we can easily use custom actions
- uses: actions/checkout@v4
# We need a more recent rustc
- name: Install a more recent `rustc`
if: ${{ inputs.source == 'src' }}
uses: actions-rust-lang/setup-rust-toolchain@v1
- name: Set rust environment variables
if: ${{ inputs.source == 'src' }}
run: |
CARGO_HOME=${CARGO_HOME:-${HOME}/.cargo}
export CARGO_HOME
echo "CARGO_HOME=${CARGO_HOME}" | tee -a "${GITHUB_ENV}"
echo "${CARGO_HOME}/bin" | tee -a "${GITHUB_PATH}"
# Checkout here for the build process
- name: Checkout in build directory
uses: actions/checkout@v4
with:
path:
pkgs/checkout/
- name: Download Onedir Tarball as an Artifact
uses: actions/download-artifact@v4
with:
name: salt-${{ inputs.salt-version }}-onedir-linux-${{ matrix.arch }}.tar.xz
path: pkgs/checkout/artifacts/
- name: Download Release Patch
if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
uses: actions/download-artifact@v4
with:
name: salt-${{ inputs.salt-version }}.patch
path: pkgs/checkout/
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cwd: pkgs/checkout/
cache-prefix: ${{ inputs.cache-prefix }}
- name: Setup Salt Version
id: setup-salt-version
uses: ./.github/actions/setup-salt-version
with:
salt-version: "${{ inputs.salt-version }}"
cwd: pkgs/checkout/
- name: Configure Git
if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
working-directory: pkgs/checkout/
run: |
tools pkg configure-git
- name: Apply release patch
if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
working-directory: pkgs/checkout/
run: |
tools pkg apply-release-patch salt-${{ inputs.salt-version }}.patch --delete
- name: Build Deb
working-directory: pkgs/checkout/
run: |
tools pkg build deb --relenv-version=${{ inputs.relenv-version }} --python-version=${{ inputs.python-version }} ${{
inputs.source == 'onedir' &&
format('--onedir=salt-{0}-onedir-linux-{1}.tar.xz', inputs.salt-version, matrix.arch)
||
format('--arch={0}', matrix.arch)
}}
- name: Cleanup
run: |
rm -rf pkgs/checkout/
- name: Set Artifact Name
id: set-artifact-name
run: |
if [ "${{ inputs.source }}" != "src" ]; then
echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-deb" >> "$GITHUB_OUTPUT"
else
echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-deb-from-src" >> "$GITHUB_OUTPUT"
fi
- name: Upload DEBs
uses: actions/upload-artifact@v4
with:
name: ${{ steps.set-artifact-name.outputs.artifact-name }}
path: ${{ github.workspace }}/pkgs/*
retention-days: 7
if-no-files-found: error
build-rpm-packages:
name: RPM
if: ${{ toJSON(fromJSON(inputs.matrix)['linux']) != '[]' }}
runs-on:
- ${{ matrix.arch == 'x86_64' && 'ubuntu-24.04' || inputs.linux_arm_runner }}
strategy:
fail-fast: false
matrix:
include: ${{ fromJSON(inputs.matrix)['linux'] }}
container:
image: ghcr.io/saltstack/salt-ci-containers/packaging:rockylinux-9
steps:
- uses: actions/checkout@v4
- name: Download Onedir Tarball as an Artifact
uses: actions/download-artifact@v4
with:
name: salt-${{ inputs.salt-version }}-onedir-linux-${{ matrix.arch }}.tar.xz
path: artifacts/
- name: Download Release Patch
if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
uses: actions/download-artifact@v4
with:
name: salt-${{ inputs.salt-version }}.patch
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cache-prefix: ${{ inputs.cache-prefix }}
- name: Setup Salt Version
id: setup-salt-version
uses: ./.github/actions/setup-salt-version
with:
salt-version: "${{ inputs.salt-version }}"
- name: Configure Git
if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
run: |
tools pkg configure-git
- name: Apply release patch
if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
run: |
tools pkg apply-release-patch salt-${{ inputs.salt-version }}.patch --delete
- name: Build RPM
run: |
tools pkg build rpm --relenv-version=${{ inputs.relenv-version }} --python-version=${{ inputs.python-version }} ${{
inputs.source == 'onedir' &&
format('--onedir=salt-{0}-onedir-linux-{1}.tar.xz', inputs.salt-version, matrix.arch)
||
format('--arch={0}', matrix.arch)
}}
- name: Set Artifact Name
id: set-artifact-name
run: |
if [ "${{ inputs.source }}" != "src" ]; then
echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-rpm" >> "$GITHUB_OUTPUT"
else
echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-rpm-from-src" >> "$GITHUB_OUTPUT"
fi
- name: Upload RPMs
uses: actions/upload-artifact@v4
with:
name: ${{ steps.set-artifact-name.outputs.artifact-name }}
path: ~/rpmbuild/RPMS/${{ matrix.arch == 'arm64' && 'aarch64' || matrix.arch }}/*.rpm
retention-days: 7
if-no-files-found: error
build-macos-pkgs:
name: macOS
if: ${{ toJSON(fromJSON(inputs.matrix)['macos']) != '[]' }}
environment: ${{ inputs.environment }}
strategy:
fail-fast: false
matrix:
arch: ${{ github.event.repository.fork && fromJSON('["x86_64"]') || fromJSON('["x86_64", "arm64"]') }}
source:
- ${{ inputs.source }}
include: ${{ fromJSON(inputs.matrix)['macos'] }}
env:
PIP_INDEX_URL: https://pypi.org/simple
runs-on:
- ${{ matrix.arch == 'arm64' && 'macos-13-xlarge' || 'macos-12' }}
- ${{ matrix.arch == 'arm64' && 'macos-14' || 'macos-13' }}
steps:
- name: Check Package Signing Enabled
@ -162,212 +350,15 @@ jobs:
retention-days: 7
if-no-files-found: error
build-deb-packages:
name: DEB
runs-on:
- self-hosted
- linux
- medium
- ${{ matrix.arch }}
strategy:
fail-fast: false
matrix:
arch:
- x86_64
- arm64
source:
- ${{ inputs.source }}
container:
image: ghcr.io/saltstack/salt-ci-containers/packaging:debian-12
steps:
# Checkout here so we can easily use custom actions
- uses: actions/checkout@v4
# We need a more recent rustc
- name: Install a more recent `rustc`
if: ${{ inputs.source == 'src' }}
uses: actions-rust-lang/setup-rust-toolchain@v1
- name: Set rust environment variables
if: ${{ inputs.source == 'src' }}
run: |
CARGO_HOME=${CARGO_HOME:-${HOME}/.cargo}
export CARGO_HOME
echo "CARGO_HOME=${CARGO_HOME}" | tee -a "${GITHUB_ENV}"
echo "${CARGO_HOME}/bin" | tee -a "${GITHUB_PATH}"
# Checkout here for the build process
- name: Checkout in build directory
uses: actions/checkout@v4
with:
path:
pkgs/checkout/
- name: Download Onedir Tarball as an Artifact
uses: actions/download-artifact@v4
with:
name: salt-${{ inputs.salt-version }}-onedir-linux-${{ matrix.arch }}.tar.xz
path: pkgs/checkout/artifacts/
- name: Download Release Patch
if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
uses: actions/download-artifact@v4
with:
name: salt-${{ inputs.salt-version }}.patch
path: pkgs/checkout/
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cwd: pkgs/checkout/
cache-prefix: ${{ inputs.cache-prefix }}
- name: Setup Salt Version
id: setup-salt-version
uses: ./.github/actions/setup-salt-version
with:
salt-version: "${{ inputs.salt-version }}"
cwd: pkgs/checkout/
- name: Configure Git
if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
working-directory: pkgs/checkout/
run: |
tools pkg configure-git
- name: Apply release patch
if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
working-directory: pkgs/checkout/
run: |
tools pkg apply-release-patch salt-${{ inputs.salt-version }}.patch --delete
- name: Build Deb
working-directory: pkgs/checkout/
run: |
tools pkg build deb --relenv-version=${{ inputs.relenv-version }} --python-version=${{ inputs.python-version }} ${{
inputs.source == 'onedir' &&
format('--onedir=salt-{0}-onedir-linux-{1}.tar.xz', inputs.salt-version, matrix.arch)
||
format('--arch={0}', matrix.arch)
}}
- name: Cleanup
run: |
rm -rf pkgs/checkout/
- name: Set Artifact Name
id: set-artifact-name
run: |
if [ "${{ inputs.source }}" != "src" ]; then
echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-deb" >> "$GITHUB_OUTPUT"
else
echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-deb-from-src" >> "$GITHUB_OUTPUT"
fi
- name: Upload DEBs
uses: actions/upload-artifact@v4
with:
name: ${{ steps.set-artifact-name.outputs.artifact-name }}
path: ${{ github.workspace }}/pkgs/*
retention-days: 7
if-no-files-found: error
build-rpm-packages:
name: RPM
runs-on:
- self-hosted
- linux
- medium
- ${{ matrix.arch }}
strategy:
fail-fast: false
matrix:
arch:
- x86_64
- arm64
source:
- ${{ inputs.source }}
container:
image: ghcr.io/saltstack/salt-ci-containers/packaging:rockylinux-9
steps:
- uses: actions/checkout@v4
- name: Download Onedir Tarball as an Artifact
uses: actions/download-artifact@v4
with:
name: salt-${{ inputs.salt-version }}-onedir-linux-${{ matrix.arch }}.tar.xz
path: artifacts/
- name: Download Release Patch
if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
uses: actions/download-artifact@v4
with:
name: salt-${{ inputs.salt-version }}.patch
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cache-prefix: ${{ inputs.cache-prefix }}
- name: Setup Salt Version
id: setup-salt-version
uses: ./.github/actions/setup-salt-version
with:
salt-version: "${{ inputs.salt-version }}"
- name: Configure Git
if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
run: |
tools pkg configure-git
- name: Apply release patch
if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
run: |
tools pkg apply-release-patch salt-${{ inputs.salt-version }}.patch --delete
- name: Build RPM
run: |
tools pkg build rpm --relenv-version=${{ inputs.relenv-version }} --python-version=${{ inputs.python-version }} ${{
inputs.source == 'onedir' &&
format('--onedir=salt-{0}-onedir-linux-{1}.tar.xz', inputs.salt-version, matrix.arch)
||
format('--arch={0}', matrix.arch)
}}
- name: Set Artifact Name
id: set-artifact-name
run: |
if [ "${{ inputs.source }}" != "src" ]; then
echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-rpm" >> "$GITHUB_OUTPUT"
else
echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-rpm-from-src" >> "$GITHUB_OUTPUT"
fi
- name: Upload RPMs
uses: actions/upload-artifact@v4
with:
name: ${{ steps.set-artifact-name.outputs.artifact-name }}
path: ~/rpmbuild/RPMS/${{ matrix.arch == 'arm64' && 'aarch64' || matrix.arch }}/*.rpm
retention-days: 7
if-no-files-found: error
build-windows-pkgs:
name: Windows
if: ${{ toJSON(fromJSON(inputs.matrix)['windows']) != '[]' }}
environment: ${{ inputs.environment }}
strategy:
fail-fast: false
max-parallel: 2
matrix:
arch:
- x86
- amd64
source:
- ${{ inputs.source }}
include: ${{ fromJSON(inputs.matrix)['windows'] }}
runs-on:
- windows-latest
env:

View file

@ -8,12 +8,6 @@ on:
type: string
required: true
description: The Salt version to set prior to building packages.
github-hosted-runners:
type: boolean
required: true
self-hosted-runners:
type: boolean
required: true
cache-seed:
required: true
type: string
@ -26,6 +20,14 @@ on:
required: true
type: string
description: The version of python to use with relenv
matrix:
type: string
required: true
description: Json config for build matrix
linux_arm_runner:
required: true
type: string
description: Json job matrix config
env:
RELENV_DATA: "${{ github.workspace }}/.relenv"
@ -39,21 +41,18 @@ env:
jobs:
build-salt-linux:
name: Linux
if: ${{ inputs.self-hosted-runners }}
if: ${{ toJSON(fromJSON(inputs.matrix)['linux']) != '[]' }}
env:
USE_S3_CACHE: 'true'
USE_S3_CACHE: 'false'
runs-on:
- ${{ matrix.arch == 'x86_64' && 'ubuntu-24.04' || inputs.linux_arm_runner }}
strategy:
fail-fast: false
matrix:
arch:
- x86_64
- arm64
runs-on:
- self-hosted
- linux
- ${{ matrix.arch }}
include: ${{ fromJSON(inputs.matrix)['linux'] }}
steps:
- name: "Throttle Builds"
@ -63,6 +62,10 @@ jobs:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: '3.10'
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
@ -95,18 +98,22 @@ jobs:
build-salt-macos:
name: macOS
if: ${{ inputs.github-hosted-runners }}
if: ${{ toJSON(fromJSON(inputs.matrix)['macos']) != '[]' }}
strategy:
fail-fast: false
max-parallel: 2
matrix:
arch: ${{ github.event.repository.fork && fromJSON('["x86_64"]') || fromJSON('["x86_64", "arm64"]') }}
include: ${{ fromJSON(inputs.matrix)['macos'] }}
runs-on:
- ${{ matrix.arch == 'arm64' && 'macos-13-xlarge' || 'macos-12' }}
- ${{ matrix.arch == 'arm64' && 'macos-14' || 'macos-13' }}
env:
PIP_INDEX_URL: https://pypi.org/simple
steps:
- name: "Check cores"
shell: bash
run: sysctl -n hw.ncpu
- name: "Throttle Builds"
shell: bash
run: |
@ -150,14 +157,12 @@ jobs:
build-salt-windows:
name: Windows
if: ${{ inputs.github-hosted-runners }}
if: ${{ toJSON(fromJSON(inputs.matrix)['windows']) != '[]' }}
strategy:
fail-fast: false
max-parallel: 2
matrix:
arch:
- x86
- amd64
include: ${{ fromJSON(inputs.matrix)['windows'] }}
runs-on: windows-latest
env:
PIP_INDEX_URL: https://pypi.org/simple

.github/workflows/ci.yml vendored (1676 lines changed)

File diff suppressed because it is too large.

.github/workflows/draft-release.yml vendored Normal file (132 lines changed)
View file

@ -0,0 +1,132 @@
---
name: Draft Github Release
on:
workflow_call:
inputs:
salt-version:
type: string
required: true
description: The Salt version to set prior to building packages.
matrix:
required: true
type: string
description: Json job matrix config
build-matrix:
required: true
type: string
description: Json job matrix config
env:
COLUMNS: 190
PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}
PIP_DISABLE_PIP_VERSION_CHECK: "1"
jobs:
list-artifacts:
name: List Artifacts
runs-on: ubuntu-22.04
steps:
# Checkout here so we can easily use custom actions
- uses: actions/download-artifact@v4
with:
path: artifacts/
- name: List Directory Structure
run: ls -R artifacts/
create-github-release:
name: Draft Release v${{ inputs.salt-version }}
runs-on: ubuntu-22.04
outputs:
upload_url: ${{ steps.create_release.outputs.upload_url }}
steps:
- name: Create Release
id: create_release
uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
release_name: "Release v${{ inputs.salt-version }}"
tag_name: v${{ inputs.salt-version }}
draft: true
prerelease: false
- name: Release Output
run: echo "upload_url=${{ steps.create_release.outputs.upload_url }}" >> "$GITHUB_OUTPUT"
upload-source-tarball:
needs:
- create-github-release
uses: ./.github/workflows/release-artifact.yml
with:
name: salt-${{ inputs.salt-version }}.tar.gz
upload_url: ${{ needs.create-github-release.outputs.upload_url }}
upload-onedir:
needs:
- create-github-release
strategy:
matrix:
include: ${{ fromJSON(inputs.matrix) }}
uses: ./.github/workflows/release-artifact.yml
with:
name: salt-${{ inputs.salt-version }}-onedir-${{ matrix.platform }}-${{ matrix.arch }}.${{ matrix.platform == 'windows' && 'zip' || 'tar.xz' }}
upload_url: ${{ needs.create-github-release.outputs.upload_url }}
upload-deb-packages:
needs:
- create-github-release
strategy:
matrix:
include: ${{ fromJSON(inputs.build-matrix)['linux'] }}
uses: ./.github/workflows/release-artifact.yml
with:
name: salt-${{ inputs.salt-version }}-${{ matrix.arch }}-deb
upload_url: ${{ needs.create-github-release.outputs.upload_url }}
pattern: "*.deb"
upload-rpm-packages:
needs:
- create-github-release
strategy:
matrix:
include: ${{ fromJSON(inputs.build-matrix)['linux'] }}
uses: ./.github/workflows/release-artifact.yml
with:
name: salt-${{ inputs.salt-version }}-${{ matrix.arch }}-rpm
upload_url: ${{ needs.create-github-release.outputs.upload_url }}
upload-mac-packages:
needs:
- create-github-release
strategy:
matrix:
include: ${{ fromJSON(inputs.build-matrix)['macos'] }}
uses: ./.github/workflows/release-artifact.yml
with:
name: salt-${{ inputs.salt-version }}-${{ matrix.arch }}-macos
upload_url: ${{ needs.create-github-release.outputs.upload_url }}
upload-windows-msi-packages:
needs:
- create-github-release
strategy:
matrix:
include: ${{ fromJSON(inputs.build-matrix)['windows'] }}
uses: ./.github/workflows/release-artifact.yml
with:
name: salt-${{ inputs.salt-version }}-${{ matrix.arch }}-MSI
upload_url: ${{ needs.create-github-release.outputs.upload_url }}
upload-windows-nsis-packages:
needs:
- create-github-release
strategy:
matrix:
include: ${{ fromJSON(inputs.build-matrix)['windows'] }}
uses: ./.github/workflows/release-artifact.yml
with:
name: salt-${{ inputs.salt-version }}-${{ matrix.arch }}-NSIS
upload_url: ${{ needs.create-github-release.outputs.upload_url }}

View file

@ -18,17 +18,13 @@ env:
jobs:
Salt:
name: Lint Salt's Source Code
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
if: ${{ contains(fromJSON('["push", "schedule", "workflow_dispatch"]'), github.event_name) || fromJSON(inputs.changed-files)['salt'] || fromJSON(inputs.changed-files)['lint'] }}
container:
image: ghcr.io/saltstack/salt-ci-containers/python:3.10
steps:
- name: Install System Deps
run: |
apt-get update
apt-get install -y enchant-2 git gcc make zlib1g-dev libc-dev libffi-dev g++ libxml2 libxml2-dev libxslt-dev libcurl4-openssl-dev libssl-dev libgnutls28-dev
- name: Add Git Safe Directory
run: |
@ -62,18 +58,13 @@ jobs:
Tests:
name: Lint Salt's Test Suite
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
if: ${{ contains(fromJSON('["push", "schedule", "workflow_dispatch"]'), github.event_name) || fromJSON(inputs.changed-files)['tests'] || fromJSON(inputs.changed-files)['lint'] }}
container:
image: ghcr.io/saltstack/salt-ci-containers/python:3.10
steps:
- name: Install System Deps
run: |
echo "deb http://deb.debian.org/debian bookworm-backports main" >> /etc/apt/sources.list
apt-get update
apt-get install -y enchant-2 git gcc make zlib1g-dev libc-dev libffi-dev g++ libxml2 libxml2-dev libxslt-dev libcurl4-openssl-dev libssl-dev libgnutls28-dev
- name: Add Git Safe Directory
run: |

File diff suppressed because it is too large.

.github/workflows/nsis-tests.yml vendored Normal file (67 lines changed)
View file

@ -0,0 +1,67 @@
---
name: Test NSIS Installer
on:
workflow_call:
inputs:
changed-files:
required: true
type: string
description: JSON string containing information about changed files
jobs:
Test-NSIS-Logic:
name: Logic Tests
runs-on:
- windows-latest
if: ${{ contains(fromJSON('["push", "schedule", "workflow_dispatch"]'), github.event_name) || fromJSON(inputs.changed-files)['nsis_tests'] }}
steps:
- name: Checkout Salt
uses: actions/checkout@v4
- name: Set Up Python 3.10
uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Install NSIS
run: .\pkg\windows\install_nsis.cmd -CICD
shell: cmd
- name: Build Test Installer
run: .\pkg\windows\nsis\tests\setup.cmd -CICD
shell: cmd
- name: Run Config Tests
run: .\pkg\windows\nsis\tests\test.cmd -CICD .\config_tests
shell: cmd
Test-NSIS-Stress:
name: Stress Tests
runs-on:
- windows-latest
if: ${{ contains(fromJSON('["push", "schedule", "workflow_dispatch"]'), github.event_name) || fromJSON(inputs.changed-files)['nsis_tests'] }}
steps:
- name: Checkout Salt
uses: actions/checkout@v4
- name: Set Up Python 3.10
uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Install NSIS
run: .\pkg\windows\install_nsis.cmd -CICD
shell: cmd
- name: Build Test Installer
run: .\pkg\windows\nsis\tests\setup.cmd -CICD
shell: cmd
- name: Run Stress Test
run: .\pkg\windows\nsis\tests\test.cmd -CICD .\stress_tests
shell: cmd

View file

@ -21,21 +21,16 @@ jobs:
Pre-Commit:
name: Run Pre-Commit Against Salt
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
container:
image: ghcr.io/saltstack/salt-ci-containers/python:3.10
image: ghcr.io/saltstack/salt-ci-containers/testing:ubuntu-22.04
env:
PRE_COMMIT_COLOR: always
steps:
- name: Install System Deps
run: |
apt-get update
apt-get install -y wget curl enchant-2 git gcc make zlib1g-dev libc-dev libffi-dev g++ libxml2 libxml2-dev libxslt-dev libcurl4-openssl-dev libssl-dev libgnutls28-dev rustc
- name: Add Git Safe Directory
run: |
git config --global --add safe.directory "$(pwd)"

.github/workflows/release-artifact.yml vendored Normal file (69 lines changed)
View file

@ -0,0 +1,69 @@
---
name: Upload Release Artifact
on:
workflow_call:
inputs:
name:
type: string
required: true
description: The Salt version to set prior to building packages.
upload_url:
type: string
required: true
description: Release's upload url.
pattern:
type: string
required: false
description: Pattern of files to upload
jobs:
list-files:
name: List ${{ inputs.name }}
runs-on: ubuntu-22.04
outputs:
files: ${{ steps.list-files.outputs.files }}
steps:
- uses: actions/download-artifact@v4
with:
name: ${{ inputs.name }}
path: artifacts
- run: find artifacts -maxdepth 1 -type f -printf '%f\n'
- id: list-files
run: |
if [ "${{ inputs.pattern }}" != "" ]; then
echo files="$(find artifacts -maxdepth 1 -type f -name '${{ inputs.pattern }}' -printf '%f\n' | jq -Rnc '[inputs | { file: "\(.)" }]')" >> "$GITHUB_OUTPUT"
else
echo files="$(find artifacts -maxdepth 1 -type f -printf '%f\n' | jq -Rnc '[inputs | { file: "\(.)" }]')" >> "$GITHUB_OUTPUT"
fi
upload-files:
name: Upload ${{ matrix.file }} from ${{ inputs.name }}
runs-on: ubuntu-22.04
needs:
- list-files
strategy:
matrix:
include: ${{ fromJSON(needs.list-files.outputs.files) }}
steps:
- uses: actions/download-artifact@v4
with:
name: ${{ inputs.name }}
path: artifacts
- name: Detect type of ${{ matrix.file }}
id: file-type
run: echo "file_type=$( file --mime-type artifacts/${{ matrix.file }} )" >> "$GITHUB_OUTPUT"
- name: Upload ${{ matrix.file }}
id: upload-release-asset-source
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ inputs.upload_url }} # This pulls from the CREATE RELEASE step above, referencing its ID to get its outputs object, which includes an `upload_url`. See this blog post for more info: https://jasonet.co/posts/new-features-of-github-actions/#passing-data-to-future-steps
asset_path: artifacts/${{ matrix.file }}
asset_name: ${{ matrix.file }}
asset_content_type: ${{ steps.file-type.outputs.file_type }}

View file

@ -32,7 +32,7 @@ jobs:
permissions:
contents: write # for dev-drprasad/delete-tag-and-release to delete tags or releases
name: Generate Tag and Github Release
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
steps:
- uses: dev-drprasad/delete-tag-and-release@v0.2.0
if: github.event.inputs.reTag == 'true'

View file

@ -19,7 +19,7 @@ permissions:
jobs:
update-winrepo:
name: Update Winrepo
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
steps:
- name: Checkout Salt

View file

@ -31,7 +31,6 @@ jobs:
runs-on:
- self-hosted
- linux
- repo-release
steps:
- name: Checkout Salt

View file

@ -21,7 +21,7 @@ on:
env:
COLUMNS: 190
CACHE_SEED: SEED-2 # Bump the number to invalidate all caches
CACHE_SEED: SEED-1 # Bump the number to invalidate all caches
RELENV_DATA: "${{ github.workspace }}/.relenv"
PIP_DISABLE_PIP_VERSION_CHECK: "1"
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
@ -37,7 +37,7 @@ jobs:
check-requirements:
name: Check Requirements
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
environment: release-check
steps:
- name: Check For Admin Permission
@ -49,11 +49,9 @@ jobs:
prepare-workflow:
name: Prepare Workflow Run
runs-on:
- self-hosted
- linux
- repo-release
- linux-x86_64
env:
USE_S3_CACHE: 'true'
USE_S3_CACHE: 'false'
environment: release
needs:
- check-requirements
@ -63,6 +61,7 @@ jobs:
latest-release: ${{ steps.get-salt-releases.outputs.latest-release }}
releases: ${{ steps.get-salt-releases.outputs.releases }}
nox-archive-hash: ${{ steps.nox-archive-hash.outputs.nox-archive-hash }}
config: ${{ steps.workflow-config.outputs.config }}
steps:
- uses: actions/checkout@v4
with:
@ -121,12 +120,15 @@ jobs:
run: |
echo "nox-archive-hash=${{ hashFiles('requirements/**/*.txt', 'cicd/golden-images.json', 'noxfile.py', 'pkg/common/env-cleanup-rules.yml', '.github/workflows/build-deps-ci-action.yml') }}" | tee -a "$GITHUB_OUTPUT"
- name: Define workflow config
id: workflow-config
run: |
tools ci workflow-config${{ inputs.skip-salt-pkg-download-test-suite && ' --skip-pkg-download-tests' || '' }} ${{ steps.setup-salt-version.outputs.salt-version }} ${{ github.event_name }} changed-files.json
download-onedir-artifact:
name: Download Staging Onedir Artifact
runs-on:
- self-hosted
- linux
- repo-release
- linux-x86_64
env:
USE_S3_CACHE: 'true'
environment: release
@ -184,15 +186,15 @@ jobs:
nox-version: 2022.8.7
python-version: "3.10"
salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|3.10.14
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|3.10.15
nox-archive-hash: "${{ needs.prepare-workflow.outputs.nox-archive-hash }}"
matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['build-matrix']) }}
linux_arm_runner: ${{ fromJSON(needs.prepare-workflow.outputs.config)['linux_arm_runner'] }}
backup:
name: Backup
runs-on:
- self-hosted
- linux
- repo-release
- linux-x86_64
needs:
- prepare-workflow
env:
@ -223,9 +225,7 @@ jobs:
publish-repositories:
name: Publish Repositories
runs-on:
- self-hosted
- linux
- repo-release
- linux-x86_64
env:
USE_S3_CACHE: 'true'
needs:
@ -255,40 +255,17 @@ jobs:
run: |
tools pkg repo publish release ${{ needs.prepare-workflow.outputs.salt-version }}
pkg-download-tests:
name: Package Downloads
if: ${{ inputs.skip-salt-pkg-download-test-suite == false }}
needs:
- prepare-workflow
- publish-repositories
- build-ci-deps
- download-onedir-artifact
uses: ./.github/workflows/test-package-downloads-action.yml
with:
nox-session: ci-test-onedir
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|3.10.14
salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
environment: release
nox-version: 2022.8.7
python-version: "3.10"
skip-code-coverage: true
latest-release: "${{ needs.prepare-workflow.outputs.latest-release }}"
secrets: inherit
release:
name: Release v${{ needs.prepare-workflow.outputs.salt-version }}
if: ${{ always() && ! failure() && ! cancelled() }}
runs-on:
- self-hosted
- linux
- repo-release
- linux-x86_64
env:
USE_S3_CACHE: 'true'
needs:
- prepare-workflow
- backup
- publish-repositories
- pkg-download-tests
environment: release
steps:
- name: Clone The Salt Repository
@ -395,9 +372,7 @@ jobs:
- release
environment: release
runs-on:
- self-hosted
- linux
- repo-release
- linux-x86_64
env:
USE_S3_CACHE: 'true'
steps:
@ -443,3 +418,38 @@ jobs:
TWINE_PASSWORD: "${{ steps.get-secrets.outputs.twine-password }}"
run: |
tools pkg pypi-upload artifacts/release/salt-${{ needs.prepare-workflow.outputs.salt-version }}.tar.gz
set-pipeline-exit-status:
# This step is just so we can make github require this step, to pass checks
# on a pull request instead of requiring all
name: Set the ${{ github.workflow }} Pipeline Exit Status
if: always()
runs-on: ubuntu-22.04
needs:
- check-requirements
- prepare-workflow
- publish-repositories
- release
- publish-pypi
- build-ci-deps
steps:
- name: Get workflow information
id: get-workflow-info
uses: im-open/workflow-conclusion@v2
- run: |
# shellcheck disable=SC2129
if [ "${{ steps.get-workflow-info.outputs.conclusion }}" != "success" ]; then
echo 'To restore the release bucket run:' >> "${GITHUB_STEP_SUMMARY}"
echo '```' >> "${GITHUB_STEP_SUMMARY}"
echo 'tools pkg repo restore-previous-releases' >> "${GITHUB_STEP_SUMMARY}"
echo '```' >> "${GITHUB_STEP_SUMMARY}"
fi
- name: Set Pipeline Exit Status
shell: bash
run: |
if [ "${{ steps.get-workflow-info.outputs.workflow_conclusion }}" != "success" ]; then
exit 1
else
exit 0
fi

File diff suppressed because it is too large.

File diff suppressed because it is too large.

View file

@ -1,9 +1,10 @@
build-ci-deps:
<%- do test_salt_needs.append("build-ci-deps") %>
<%- do test_salt_linux_needs.append("build-ci-deps") %>
name: CI Deps
<%- if workflow_slug != 'release' %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['build-deps-ci'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['build-deps-ci'] }}
<%- endif %>
needs:
- prepare-workflow
@ -20,3 +21,5 @@
salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|<{ python_version }>
nox-archive-hash: "${{ needs.prepare-workflow.outputs.nox-archive-hash }}"
matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['build-matrix']) }}
linux_arm_runner: ${{ fromJSON(needs.prepare-workflow.outputs.config)['linux_arm_runner'] }}

View file

@ -23,12 +23,6 @@
with:
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}
- name: Get Salt Project GitHub Actions Bot Environment
run: |
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
- name: Download DEB Packages
uses: actions/download-artifact@v4
with:

View file

@ -13,12 +13,6 @@
with:
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}
- name: Get Salt Project GitHub Actions Bot Environment
run: |
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
- name: Download macOS x86_64 Packages
uses: actions/download-artifact@v4
with:

View file

@ -13,12 +13,6 @@
with:
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}
- name: Get Salt Project GitHub Actions Bot Environment
run: |
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
- name: Download Linux x86_64 Onedir Archive
uses: actions/download-artifact@v4
with:

View file

@ -1,4 +1,9 @@
<%- for backend in ("onedir", "src") %>
<%- if gh_environment != "ci" -%>
<%- set pkg_types = ("onedir", "src") %>
<%- else -%>
<%- set pkg_types = ("onedir",) %>
<%- endif -%>
<%- for backend in pkg_types %>
<%- set job_name = "build-pkgs-{}".format(backend) %>
<%- if backend == "src" %>
<%- do conclusion_needs.append(job_name) %>
@ -6,7 +11,7 @@
<{ job_name }>:
name: Build Packages
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['build-pkgs'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['build-pkgs'] }}
needs:
- prepare-workflow
- build-salt-onedir
@ -17,11 +22,14 @@
relenv-version: "<{ relenv_version }>"
python-version: "<{ python_version }>"
source: "<{ backend }>"
matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['build-matrix']) }}
linux_arm_runner: ${{ fromJSON(needs.prepare-workflow.outputs.config)['linux_arm_runner'] }}
<%- if gh_environment != "ci" %>
environment: <{ gh_environment }>
sign-macos-packages: true
sign-macos-packages: false
sign-windows-packages: <% if gh_environment == 'nightly' -%> false <%- else -%> ${{ inputs.sign-windows-packages }} <%- endif %>
secrets: inherit
<%- endif %>
<%- endfor %>

View file

@ -1,35 +0,0 @@
<%- for type, display_name in (
("src", "Source"),
("deb", "DEB"),
("rpm", "RPM"),
("windows", "Windows"),
("macos", "macOS"),
("onedir", "Onedir"),
) %>
<%- set job_name = "build-{}-repo".format(type) %>
<%- do build_repo_needs.append(job_name) %>
<{ job_name }>:
name: Build Repository
environment: <{ gh_environment }>
runs-on:
- self-hosted
- linux
- repo-<{ gh_environment }>
env:
USE_S3_CACHE: 'true'
needs:
- prepare-workflow
<%- if type not in ("src", "onedir") %>
- build-pkgs-onedir
<%- elif type == 'onedir' %>
- build-salt-onedir
<%- elif type == 'src' %>
- build-source-tarball
- build-pkgs-src
<%- endif %>
<%- include "build-{}-repo.yml.jinja".format(type) %>
<%- endfor %>

View file

@@ -23,12 +23,6 @@
with:
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}
- name: Get Salt Project GitHub Actions Bot Environment
run: |
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
- name: Download RPM Packages
uses: actions/download-artifact@v4
with:

View file

@@ -13,12 +13,6 @@
with:
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}
- name: Get Salt Project GitHub Actions Bot Environment
run: |
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
- name: Download Source Tarball
uses: actions/download-artifact@v4
with:

View file

@@ -12,7 +12,6 @@
<{ job_name }>:
<%- do conclusion_needs.append(job_name) %>
name: Pre-Commit
if: ${{ fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
uses: ./.github/workflows/pre-commit-action.yml
needs:
- prepare-workflow
@@ -30,7 +29,7 @@
lint:
<%- do conclusion_needs.append('lint') %>
name: Lint
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['<{ job_name }>'] && fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['<{ job_name }>'] }}
uses: ./.github/workflows/lint-action.yml
needs:
- prepare-workflow
@@ -39,37 +38,37 @@
<%- endif %>
<%- set job_name = "nsis-tests" %>
<%- if includes.get(job_name, True) %>
<{ job_name }>:
<%- do conclusion_needs.append(job_name) %>
name: NSIS Tests
uses: ./.github/workflows/nsis-tests.yml
needs:
- prepare-workflow
with:
changed-files: ${{ needs.prepare-workflow.outputs.changed-files }}
<%- endif %>
<%- set job_name = "prepare-release" %>
<%- if includes.get(job_name, True) %>
<{ job_name }>:
name: "Prepare Release: ${{ needs.prepare-workflow.outputs.salt-version }}"
<%- if prepare_actual_release %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['<{ job_name }>'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
runs-on:
- self-hosted
- linux
- medium
- x86_64
<%- else %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['<{ job_name }>'] && fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
runs-on: ubuntu-latest
<%- endif %>
- ubuntu-22.04
if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['<{ job_name }>'] }}
needs:
- prepare-workflow
steps:
- uses: actions/checkout@v4
<%- if not prepare_actual_release %>
- name: Set up Python 3.10
uses: actions/setup-python@v5
with:
python-version: "3.10"
<%- endif %>
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
@@ -191,7 +190,7 @@
<{ job_name }>:
<%- do conclusion_needs.append(job_name) %>
name: Documentation
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['<{ job_name }>'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['<{ job_name }>'] }}
needs:
- prepare-workflow
- build-source-tarball
@@ -208,11 +207,11 @@
<{ job_name }>:
name: Build Source Tarball
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['<{ job_name }>'] && fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['<{ job_name }>'] }}
needs:
- prepare-workflow
- prepare-release
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
steps:
- uses: actions/checkout@v4
@@ -245,29 +244,28 @@
<{ job_name }>:
<%- do conclusion_needs.append(job_name) %>
name: Build Dependencies Onedir
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['<{ job_name }>'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
name: Build Onedir Dependencies
if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['<{ job_name }>'] }}
needs:
- prepare-workflow
uses: ./.github/workflows/build-deps-onedir.yml
with:
cache-seed: ${{ needs.prepare-workflow.outputs.cache-seed }}
salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
self-hosted-runners: ${{ fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
github-hosted-runners: ${{ fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
relenv-version: "<{ relenv_version }>"
python-version: "<{ python_version }>"
matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['build-matrix']) }}
linux_arm_runner: ${{ fromJSON(needs.prepare-workflow.outputs.config)['linux_arm_runner'] }}
<%- endif %>
<%- set job_name = "build-salt-onedir" %>
<%- if includes.get(job_name, True) %>
<{ job_name }>:
<%- do conclusion_needs.append(job_name) %>
name: Build Salt Onedir
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['<{ job_name }>'] }}
if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['<{ job_name }>'] }}
needs:
- prepare-workflow
- build-deps-onedir
@@ -276,14 +274,13 @@
with:
cache-seed: ${{ needs.prepare-workflow.outputs.cache-seed }}
salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
self-hosted-runners: ${{ fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
github-hosted-runners: ${{ fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
relenv-version: "<{ relenv_version }>"
python-version: "<{ python_version }>"
matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['build-matrix']) }}
linux_arm_runner: ${{ fromJSON(needs.prepare-workflow.outputs.config)['linux_arm_runner'] }}
<%- endif %>
<%- set job_name = "build-pkgs" %>
<%- if includes.get(job_name, True) %>
<%- include "build-packages.yml.jinja" %>
@@ -309,8 +306,8 @@
combine-all-code-coverage:
<%- do conclusion_needs.append("combine-all-code-coverage") %>
name: Combine Code Coverage
if: ${{ fromJSON(needs.prepare-workflow.outputs.testrun)['skip_code_coverage'] == false }}
runs-on: ubuntu-latest
if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['skip_code_coverage'] == false }}
runs-on: ubuntu-22.04
env:
PIP_INDEX_URL: https://pypi.org/simple
needs:
@@ -362,8 +359,9 @@
id: get-coverage-reports
uses: actions/download-artifact@v4
with:
name: all-testrun-coverage-artifacts
path: artifacts/coverage/
pattern: all-testrun-coverage-artifacts*
merge-multiple: true
- name: Display structure of downloaded files
run: tree -a artifacts/
@@ -416,6 +414,7 @@
path: artifacts/coverage/html/salt
retention-days: 7
if-no-files-found: error
include-hidden-files: true
- name: Report Combined Code Coverage
run: |
@@ -432,6 +431,7 @@
path: artifacts/coverage/coverage.json
retention-days: 7
if-no-files-found: error
include-hidden-files: true
- name: Create Combined Code Coverage HTML Report
run: |
@@ -444,6 +444,7 @@
path: artifacts/coverage/html/full
retention-days: 7
if-no-files-found: error
include-hidden-files: true
<%- endif %>
<%- endblock jobs %>

View file

@@ -5,7 +5,7 @@
<%- set prepare_workflow_skip_pkg_test_suite = prepare_workflow_skip_pkg_test_suite|default("") %>
<%- set prepare_workflow_skip_pkg_download_test_suite = prepare_workflow_skip_pkg_download_test_suite|default("") %>
<%- set prepare_workflow_salt_version_input = prepare_workflow_salt_version_input|default("") %>
<%- set skip_test_coverage_check = skip_test_coverage_check|default("${{ fromJSON(needs.prepare-workflow.outputs.testrun)['skip_code_coverage'] }}") %>
<%- set skip_test_coverage_check = skip_test_coverage_check|default("${{ fromJSON(needs.prepare-workflow.outputs.config)['skip_code_coverage'] }}") %>
<%- set gpg_key_id = "64CBBC8173D76B3F" %>
<%- set prepare_actual_release = prepare_actual_release | default(False) %>
<%- set gh_actions_workflows_python_version = "3.10" %>
@@ -34,7 +34,7 @@ on:
env:
COLUMNS: 190
CACHE_SEED: SEED-2 # Bump the number to invalidate all caches
CACHE_SEED: SEED-1 # Bump the number to invalidate all caches
RELENV_DATA: "${{ github.workspace }}/.relenv"
PIP_DISABLE_PIP_VERSION_CHECK: "1"
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
@@ -78,7 +78,8 @@ jobs:
prepare-workflow:
name: Prepare Workflow Run
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
environment: ci
<%- if prepare_workflow_if_check %>
if: <{ prepare_workflow_if_check }>
<%- endif %>
@@ -89,12 +90,7 @@ jobs:
<%- endfor %>
<%- endif %>
outputs:
jobs: ${{ steps.define-jobs.outputs.jobs }}
runners: ${{ steps.runner-types.outputs.runners }}
changed-files: ${{ steps.process-changed-files.outputs.changed-files }}
os-labels: ${{ steps.get-pull-labels.outputs.os-labels }}
pull-labels: ${{ steps.get-pull-labels.outputs.test-labels }}
testrun: ${{ steps.define-testrun.outputs.testrun }}
salt-version: ${{ steps.setup-salt-version.outputs.salt-version }}
cache-seed: ${{ steps.set-cache-seed.outputs.cache-seed }}
latest-release: ${{ steps.get-salt-releases.outputs.latest-release }}
@@ -102,6 +98,11 @@ jobs:
release-changelog-target: ${{ steps.get-release-changelog-target.outputs.release-changelog-target }}
testing-releases: ${{ steps.get-testing-releases.outputs.testing-releases }}
nox-archive-hash: ${{ steps.nox-archive-hash.outputs.nox-archive-hash }}
config: ${{ steps.workflow-config.outputs.config }}
env:
LINUX_ARM_RUNNER: ${{ vars.LINUX_ARM_RUNNER }}
FULL_TESTRUN_SLUGS: ${{ vars.FULL_TESTRUN_SLUGS }}
PR_TESTRUN_SLUGS: ${{ vars.PR_TESTRUN_SLUGS }}
steps:
- uses: actions/checkout@v4
with:
@@ -177,6 +178,9 @@ jobs:
- pkg/**
- *pkg_requirements
- *salt_added_modified
nsis_tests:
- added|modified: &nsis_tests
- pkg/windows/nsis/**
testrun:
- added|modified:
- *pkg_requirements
@@ -211,14 +215,6 @@ jobs:
salt-version: "<{ prepare_workflow_salt_version_input }>"
validate-version: true
- name: Get Pull Request Test Labels
id: get-pull-labels
if: ${{ github.event_name == 'pull_request'}}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
tools ci get-pr-test-labels --repository ${{ github.repository }}
- name: Get Hash For Nox Tarball Cache
id: nox-archive-hash
run: |
@@ -257,18 +253,6 @@ jobs:
run: |
echo '${{ steps.process-changed-files.outputs.changed-files }}' | jq -C '.'
- name: Define Runner Types
id: runner-types
run: |
tools ci runner-types ${{ github.event_name }}
- name: Define Jobs To Run
id: define-jobs
run: |
tools ci define-jobs<{ prepare_workflow_skip_test_suite }><{
prepare_workflow_skip_pkg_test_suite }><{ prepare_workflow_skip_pkg_download_test_suite
}> ${{ github.event_name }} changed-files.json
- name: Get Salt Releases
id: get-salt-releases
env:
@@ -283,18 +267,20 @@ jobs:
run: |
tools ci get-testing-releases ${{ join(fromJSON(steps.get-salt-releases.outputs.releases), ' ') }} --salt-version ${{ steps.setup-salt-version.outputs.salt-version }}
- name: Define Testrun
id: define-testrun
- name: Define workflow config
id: workflow-config
run: |
tools ci define-testrun ${{ github.event_name }} changed-files.json
tools ci workflow-config<{ prepare_workflow_skip_test_suite }><{
prepare_workflow_skip_pkg_test_suite }><{ prepare_workflow_skip_pkg_download_test_suite
}> ${{ steps.setup-salt-version.outputs.salt-version }} ${{ github.event_name }} changed-files.json
- name: Check Contents of generated testrun-changed-files.txt
if: ${{ fromJSON(steps.define-testrun.outputs.testrun)['type'] != 'full' }}
if: ${{ fromJSON(steps.workflow-config.outputs.config)['testrun']['type'] != 'full' }}
run: |
cat testrun-changed-files.txt || true
- name: Upload testrun-changed-files.txt
if: ${{ fromJSON(steps.define-testrun.outputs.testrun)['type'] != 'full' }}
if: ${{ fromJSON(steps.workflow-config.outputs.config)['testrun']['type'] != 'full' }}
uses: actions/upload-artifact@v4
with:
name: testrun-changed-files.txt
@@ -308,18 +294,18 @@ jobs:
{# We can't yet use tokenless uploads with the codecov CLI
- name: Install Codecov CLI
if: ${{ fromJSON(steps.define-testrun.outputs.testrun)['skip_code_coverage'] == false }}
if: ${{ fromJSON(steps.define-testrun.outputs.config)['skip_code_coverage'] == false }}
run: |
python3 -m pip install codecov-cli
- name: Save Commit Metadata In Codecov
if: ${{ fromJSON(steps.define-testrun.outputs.testrun)['skip_code_coverage'] == false }}
if: ${{ fromJSON(steps.define-testrun.outputs.config)['skip_code_coverage'] == false }}
run: |
codecovcli --auto-load-params-from GithubActions --verbose --token ${{ secrets.CODECOV_TOKEN }} \
create-commit --git-service github --sha ${{ github.sha }}
- name: Create Codecov Coverage Report
if: ${{ fromJSON(steps.define-testrun.outputs.testrun)['skip_code_coverage'] == false }}
if: ${{ fromJSON(steps.define-testrun.outputs.config)['skip_code_coverage'] == false }}
run: |
codecovcli --auto-load-params-from GithubActions --verbose --token ${{ secrets.CODECOV_TOKEN }} \
create-report --git-service github --sha ${{ github.sha }}
@@ -330,3 +316,48 @@ jobs:
<%- endif %>
<%- endblock jobs %>
set-pipeline-exit-status:
# This step exists so that GitHub branch protection can require this single
# check to pass on a pull request, instead of requiring every individual job
name: Set the ${{ github.workflow }} Pipeline Exit Status
if: always()
runs-on: ubuntu-22.04
<%- if workflow_slug == "nightly" %>
environment: <{ workflow_slug }>
<%- endif %>
needs:
<%- for need in prepare_workflow_needs.iter(consume=True) %>
- <{ need }>
<%- endfor %>
<%- for need in conclusion_needs.iter(consume=True) %>
- <{ need }>
<%- endfor %>
<%- for need in test_salt_needs.iter(consume=False) %>
- <{ need }>
<%- endfor %>
<%- for need in test_salt_pkg_needs.iter(consume=False) %>
- <{ need }>
<%- endfor %>
<%- for need in test_repo_needs.iter(consume=True) %>
- <{ need }>
<%- endfor %>
<%- if workflow_slug != "release" %>
- test-packages
- test
<%- endif %>
steps:
- name: Get workflow information
id: get-workflow-info
uses: im-open/workflow-conclusion@v2
<%- block set_pipeline_exit_status_extra_steps %>
<%- endblock set_pipeline_exit_status_extra_steps %>
- name: Set Pipeline Exit Status
shell: bash
run: |
if [ "${{ steps.get-workflow-info.outputs.workflow_conclusion }}" != "success" ]; then
exit 1
else
exit 0
fi
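
The hunks above collapse prepare-workflow's separate `jobs`, `runners`, `os-labels`, and `testrun` outputs into one JSON `config` output produced by `tools ci workflow-config`. A minimal sketch of the consumption pattern downstream jobs now follow, assuming a hypothetical `example-consumer` job and `example-job` key (neither exists in this commit):

  example-consumer:
    name: Example Consumer
    # Gate the job on the consolidated JSON config emitted by prepare-workflow.
    if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['example-job'] }}
    needs:
      - prepare-workflow
    runs-on: ubuntu-22.04
    steps:
      - name: Show the resolved test matrix
        run: |
          # The expression is substituted before bash runs, so jq receives plain JSON.
          echo '${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['test-matrix']) }}' | jq -C '.'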

View file

@@ -1,5 +1,5 @@
<%- set gh_environment = gh_environment|default("nightly") %>
<%- set skip_test_coverage_check = skip_test_coverage_check|default("false") %>
<%- set skip_test_coverage_check = skip_test_coverage_check|default("true") %>
<%- set prepare_workflow_skip_test_suite = "${{ inputs.skip-salt-test-suite && ' --skip-tests' || '' }}" %>
<%- set prepare_workflow_skip_pkg_test_suite = "${{ inputs.skip-salt-pkg-test-suite && ' --skip-pkg-tests' || '' }}" %>
<%- set prepare_workflow_if_check = prepare_workflow_if_check|default("${{ fromJSON(needs.workflow-requirements.outputs.requirements-met) }}") %>
@@ -51,217 +51,10 @@ concurrency:
<%- include "workflow-requirements-check.yml.jinja" %>
<%- include "trigger-branch-workflows.yml.jinja" %>
{#- When we start using a slack app, we can update messages, not while using incoming webhooks
<%- if workflow_slug == "nightly" %>
<%- do conclusion_needs.append('notify-slack') %>
notify-slack:
name: Notify Slack
runs-on: ubuntu-latest
environment: <{ gh_environment }>
needs:
<%- for need in prepare_workflow_needs.iter(consume=False) %>
- <{ need }>
<%- endfor %>
outputs:
update-ts: ${{ steps.slack.outputs.update-ts }}
steps:
- name: Notify Slack
id: slack
uses: slackapi/slack-github-action@v1.24.0
with:
payload: |
{
"attachments": [
{
"color": "ffca28",
"fields": [
{
"title": "Workflow",
"short": true,
"value": "${{ github.workflow }}",
"type": "mrkdwn"
},
{
"title": "Workflow Run",
"short": true,
"value": "<${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}|${{ github.run_id }}>",
"type": "mrkdwn"
},
{
"title": "Branch",
"short": true,
"value": "${{ github.ref_name }}",
"type": "mrkdwn"
},
{
"title": "Commit",
"short": true,
"value": "<${{ github.server_url }}/${{ github.repository }}/commit/${{ github.sha }}|${{ github.sha }}>",
"type": "mrkdwn"
},
{
"title": "Attempt",
"short": true,
"value": "${{ github.run_attempt }}",
"type": "mrkdwn"
},
{
"title": "Status",
"short": true,
"value": "running",
"type": "mrkdwn"
}
],
"author_name": "${{ github.event.sender.login }}",
"author_link": "${{ github.event.sender.html_url }}",
"author_icon": "${{ github.event.sender.avatar_url }}"
}
]
}
env:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
SLACK_WEBHOOK_TYPE: INCOMING_WEBHOOK
<%- endif %>
#}
<%- endblock pre_jobs %>
<%- block jobs %>
<{- super() }>
<%- if includes.get("build-repos", True) %>
<%- include "build-repos.yml.jinja" %>
<%- endif %>
publish-repositories:
<%- do conclusion_needs.append('publish-repositories') %>
name: Publish Repositories
if: ${{ always() && ! failure() && ! cancelled() }}
runs-on:
- self-hosted
- linux
- repo-<{ gh_environment }>
environment: <{ gh_environment }>
needs:
- prepare-workflow
- build-docs
<%- for need in build_repo_needs.iter(consume=True) %>
- <{ need }>
<%- endfor %>
<%- if workflow_slug == "nightly" %>
<%- for need in test_salt_needs.iter(consume=True) %>
- <{ need }>
<%- endfor %>
<%- endif %>
steps:
- uses: actions/checkout@v4
- name: Get Salt Project GitHub Actions Bot Environment
run: |
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}
- name: Download Repository Artifact
uses: actions/download-artifact@v4
with:
pattern: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-repo-*
merge-multiple: true
path: repo/
- name: Decompress Repository Artifacts
run: |
find repo/ -type f -name '*.tar.gz' -print -exec tar xvf {} \;
find repo/ -type f -name '*.tar.gz' -print -exec rm -f {} \;
- name: Show Repository
run: |
tree -a artifacts/pkgs/repo/
- name: Upload Repository Contents (<{ gh_environment }>)
env:
SALT_REPO_DOMAIN_RELEASE: ${{ vars.SALT_REPO_DOMAIN_RELEASE || 'repo.saltproject.io' }}
SALT_REPO_DOMAIN_STAGING: ${{ vars.SALT_REPO_DOMAIN_STAGING || 'staging.repo.saltproject.io' }}
run: |
tools pkg repo publish <{ gh_environment }> --salt-version=${{ needs.prepare-workflow.outputs.salt-version }} artifacts/pkgs/repo/
<%- endblock jobs %>
<%- block set_pipeline_exit_status_extra_steps %>
<%- if workflow_slug == "nightly" %>
- name: Notify Slack
id: slack
if: always()
uses: slackapi/slack-github-action@v1.24.0
with:
{#- When we start using a slack app, we can update messages, not while using incoming webhooks
update-ts: ${{ needs.notify-slack.outputs.update-ts }}
#}
payload: |
{
"attachments": [
{
"fallback": "${{ github.workflow }} Workflow build result for the `${{ github.ref_name }}` branch(attempt: ${{ github.run_attempt }}): `${{ steps.get-workflow-info.outputs.conclusion }}`\n${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}",
"color": "${{ steps.get-workflow-info.outputs.conclusion != 'success' && 'ff3d00' || '00e676' }}",
"fields": [
{
"title": "Workflow",
"short": true,
"value": "${{ github.workflow }}",
"type": "mrkdwn"
},
{
"title": "Workflow Run",
"short": true,
"value": "<${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}|${{ github.run_id }}>",
"type": "mrkdwn"
},
{
"title": "Branch",
"short": true,
"value": "${{ github.ref_name }}",
"type": "mrkdwn"
},
{
"title": "Commit",
"short": true,
"value": "<${{ github.server_url }}/${{ github.repository }}/commit/${{ github.sha }}|${{ github.sha }}>",
"type": "mrkdwn"
},
{
"title": "Attempt",
"short": true,
"value": "${{ github.run_attempt }}",
"type": "mrkdwn"
},
{
"title": "Status",
"short": true,
"value": "${{ steps.get-workflow-info.outputs.conclusion }}",
"type": "mrkdwn"
}
],
"author_name": "${{ github.event.sender.login }}",
"author_link": "${{ github.event.sender.html_url }}",
"author_icon": "${{ github.event.sender.avatar_url }}"
}
]
}
env:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
SLACK_WEBHOOK_TYPE: INCOMING_WEBHOOK
<%- endif %>
<%- endblock set_pipeline_exit_status_extra_steps %>

View file

@@ -52,7 +52,7 @@ permissions:
<{ job_name }>:
<%- do prepare_workflow_needs.append(job_name) %>
name: Check Requirements
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
environment: <{ gh_environment }>-check
steps:
- name: Check For Admin Permission
@@ -71,11 +71,9 @@ permissions:
prepare-workflow:
name: Prepare Workflow Run
runs-on:
- self-hosted
- linux
- repo-<{ gh_environment }>
- linux-x86_64
env:
USE_S3_CACHE: 'true'
USE_S3_CACHE: 'false'
environment: <{ gh_environment }>
<%- if prepare_workflow_needs %>
needs:
@@ -89,6 +87,7 @@ permissions:
latest-release: ${{ steps.get-salt-releases.outputs.latest-release }}
releases: ${{ steps.get-salt-releases.outputs.releases }}
nox-archive-hash: ${{ steps.nox-archive-hash.outputs.nox-archive-hash }}
config: ${{ steps.workflow-config.outputs.config }}
steps:
- uses: actions/checkout@v4
with:
@@ -147,6 +146,14 @@ permissions:
run: |
echo "nox-archive-hash=<{ nox_archive_hashfiles }>" | tee -a "$GITHUB_OUTPUT"
- name: Define workflow config
id: workflow-config
run: |
tools ci workflow-config<{ prepare_workflow_skip_test_suite }><{
prepare_workflow_skip_pkg_test_suite }><{ prepare_workflow_skip_pkg_download_test_suite
}> ${{ steps.setup-salt-version.outputs.salt-version }} ${{ github.event_name }} changed-files.json
<%- endblock prepare_workflow_job %>
<%- endif %>
@@ -156,9 +163,7 @@ permissions:
download-onedir-artifact:
name: Download Staging Onedir Artifact
runs-on:
- self-hosted
- linux
- repo-<{ gh_environment }>
- linux-x86_64
env:
USE_S3_CACHE: 'true'
environment: <{ gh_environment }>
@@ -211,9 +216,7 @@ permissions:
backup:
name: Backup
runs-on:
- self-hosted
- linux
- repo-<{ gh_environment }>
- linux-x86_64
needs:
- prepare-workflow
env:
@@ -245,9 +248,7 @@ permissions:
<%- do conclusion_needs.append('publish-repositories') %>
name: Publish Repositories
runs-on:
- self-hosted
- linux
- repo-<{ gh_environment }>
- linux-x86_64
env:
USE_S3_CACHE: 'true'
needs:
@@ -277,18 +278,12 @@ permissions:
run: |
tools pkg repo publish <{ gh_environment }> ${{ needs.prepare-workflow.outputs.salt-version }}
<%- if includes.get("test-pkg-downloads", True) %>
<%- include "test-salt-pkg-repo-downloads.yml.jinja" %>
<%- endif %>
release:
<%- do conclusion_needs.append('release') %>
name: Release v${{ needs.prepare-workflow.outputs.salt-version }}
if: ${{ always() && ! failure() && ! cancelled() }}
runs-on:
- self-hosted
- linux
- repo-<{ gh_environment }>
- linux-x86_64
env:
USE_S3_CACHE: 'true'
needs:
@@ -402,9 +397,7 @@ permissions:
name: Restore Release Bucket From Backup
if: ${{ always() && needs.backup.outputs.backup-complete == 'true' && (failure() || cancelled()) }}
runs-on:
- self-hosted
- linux
- repo-<{ gh_environment }>
- linux-x86_64
env:
USE_S3_CACHE: 'true'
needs:
@@ -445,9 +438,7 @@ permissions:
- restore #}
environment: <{ gh_environment }>
runs-on:
- self-hosted
- linux
- repo-<{ gh_environment }>
- linux-x86_64
env:
USE_S3_CACHE: 'true'
steps:

View file

@@ -1,5 +1,5 @@
<%- set prepare_workflow_if_check = "${{ fromJSON(needs.workflow-requirements.outputs.requirements-met) }}" %>
<%- set skip_test_coverage_check = "false" %>
<%- set skip_test_coverage_check = "true" %>
<%- extends 'ci.yml.jinja' %>

View file

@@ -51,9 +51,9 @@ on:
<%- block concurrency %>
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.repository }}
cancel-in-progress: false
#concurrency:
# group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.repository }}
# cancel-in-progress: false
<%- endblock concurrency %>
@@ -65,7 +65,7 @@ concurrency:
<{ job_name }>:
<%- do prepare_workflow_needs.append(job_name) %>
name: Check Requirements
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
environment: <{ gh_environment }>-check
steps:
- name: Check For Admin Permission
@@ -86,21 +86,12 @@ concurrency:
needs:
- prepare-workflow
- build-docs
- build-src-repo
environment: <{ gh_environment }>
runs-on:
- self-hosted
- linux
- repo-<{ gh_environment }>
- ubuntu-22.04
steps:
- uses: actions/checkout@v4
- name: Get Salt Project GitHub Actions Bot Environment
run: |
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
@@ -112,56 +103,16 @@ concurrency:
name: salt-${{ needs.prepare-workflow.outputs.salt-version }}.patch
path: artifacts/release
- name: Download Source Repository
uses: actions/download-artifact@v4
with:
name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-src-repo
path: artifacts/release
- name: Download Release Documentation (HTML)
uses: actions/download-artifact@v4
with:
name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-docs-html.tar.xz
path: artifacts/release
- name: Download Release Documentation (ePub)
uses: actions/download-artifact@v4
with:
name: Salt-${{ needs.prepare-workflow.outputs.salt-version }}.epub
path: artifacts/release
- name: Show Release Artifacts
run: |
tree -a artifacts/release
{#-
- name: Download Release Documentation (PDF)
uses: actions/download-artifact@v4
with:
name: Salt-${{ needs.prepare-workflow.outputs.salt-version }}.pdf
path: artifacts/release
#}
- name: Upload Release Artifacts
run: |
tools release upload-artifacts ${{ needs.prepare-workflow.outputs.salt-version }} artifacts/release
- name: Upload PyPi Artifacts
uses: actions/upload-artifact@v4
with:
name: pypi-artifacts
path: |
artifacts/release/salt-${{ needs.prepare-workflow.outputs.salt-version }}.tar.gz
artifacts/release/salt-${{ needs.prepare-workflow.outputs.salt-version }}.tar.gz.asc
retention-days: 7
if-no-files-found: error
<%- if includes.get("test-pkg-downloads", True) %>
<%- include "test-salt-pkg-repo-downloads.yml.jinja" %>
<%- endif %>
publish-pypi:
<%- do conclusion_needs.append('publish-pypi') %>
name: Publish to PyPi(test)
@@ -180,9 +131,7 @@ concurrency:
<%- endfor %>
environment: <{ gh_environment }>
runs-on:
- self-hosted
- linux
- repo-<{ gh_environment }>
- ubuntu-22.04
steps:
- uses: actions/checkout@v4
@@ -227,4 +176,29 @@ concurrency:
run: |
tools pkg pypi-upload --test artifacts/release/salt-${{ needs.prepare-workflow.outputs.salt-version }}.tar.gz
draft-release:
name: Draft Github Release
if: |
always() && (needs.test.result == 'success' || needs.test.result == 'skipped') &&
(needs.test-packages.result == 'success' || needs.test-packages.result == 'skipped') &&
needs.prepare-workflow.result == 'success' && needs.build-salt-onedir.result == 'success' &&
needs.build-pkgs-onedir.result == 'success' && needs.pre-commit.result == 'success'
needs:
- prepare-workflow
- pre-commit
- build-salt-onedir
- build-pkgs-onedir
- test-packages
- test
permissions:
contents: write
pull-requests: read
id-token: write
uses: ./.github/workflows/draft-release.yml
with:
salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['artifact-matrix']) }}
build-matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['build-matrix']) }}
<%- endblock jobs %>

View file

@@ -6,13 +6,12 @@
<%- do conclusion_needs.append(job_name) %>
name: Package Downloads
<%- if gh_environment == "staging" %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test-pkg-download'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['test-pkg-download'] }}
<%- else %>
if: ${{ inputs.skip-salt-pkg-download-test-suite == false }}
<%- endif %>
needs:
- prepare-workflow
- publish-repositories
- build-ci-deps
<%- if gh_environment == "release" %>
- download-onedir-artifact

View file

@@ -1,99 +1,19 @@
<%- for os in test_salt_pkg_listing["linux"] %>
<%- set job_name = "{}-pkg-tests{}".format(os.slug.replace(".", ""), os.fips and '-fips' or '') %>
<%- set job_name = "test-packages" %>
<{ job_name }>:
<%- do test_salt_pkg_needs.append(job_name) %>
name: <{ os.display_name }> Package Test<%- if os.fips %> (fips)<%- endif %>
<%- if workflow_slug != "ci" or os.slug in mandatory_os_slugs %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test-pkg'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
<%- else %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test-pkg'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] && contains(fromJSON(needs.prepare-workflow.outputs.os-labels), '<{ os.slug }>') }}
<%- endif %>
name: Test Package
if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['test-pkg'] }}
needs:
- prepare-workflow
- build-pkgs-onedir
- build-ci-deps
uses: ./.github/workflows/test-packages-action-linux.yml
uses: ./.github/workflows/test-packages-action.yml
with:
distro-slug: <{ os.slug }>
nox-session: ci-test-onedir
platform: linux
arch: <{ os.arch }>
salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
pkg-type: <{ os.pkg_type }>
nox-version: <{ nox_version }>
python-version: "<{ gh_actions_workflows_python_version }>"
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|<{ python_version }>
skip-code-coverage: <{ skip_test_coverage_check }>
testing-releases: ${{ needs.prepare-workflow.outputs.testing-releases }}
<%- if os.fips %>
fips: true
<%- endif %>
<%- endfor %>
<%- for os in test_salt_pkg_listing["macos"] %>
<%- set job_name = "{}-pkg-tests".format(os.slug.replace(".", "")) %>
<{ job_name }>:
<%- do test_salt_pkg_needs.append(job_name) %>
name: <{ os.display_name }> Package Test
<%- if workflow_slug != "ci" or os.slug in mandatory_os_slugs %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test-pkg'] && fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
<%- else %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test-pkg'] && fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] && contains(fromJSON(needs.prepare-workflow.outputs.os-labels), '<{ os.slug }>') }}
<%- endif %>
needs:
- prepare-workflow
- build-pkgs-onedir
- build-ci-deps
uses: ./.github/workflows/test-packages-action-macos.yml
with:
distro-slug: <{ os.slug }>
runner: <{ os.runner }>
nox-session: ci-test-onedir
platform: macos
arch: <{ os.arch }>
salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
pkg-type: macos
nox-version: <{ nox_version }>
python-version: "<{ gh_actions_workflows_python_version }>"
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|<{ python_version }>
skip-code-coverage: <{ skip_test_coverage_check }>
testing-releases: ${{ needs.prepare-workflow.outputs.testing-releases }}
<%- endfor %>
<%- for os in test_salt_pkg_listing["windows"] %>
<%- set job_name = "{}-{}-pkg-tests".format(os.slug.replace(".", ""), os.pkg_type.lower()) %>
<{ job_name }>:
<%- do test_salt_pkg_needs.append(job_name) %>
name: <{ os.display_name }> <{ os.pkg_type }> Package Test
<%- if workflow_slug != "ci" or os.slug in mandatory_os_slugs %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test-pkg'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
<%- else %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test-pkg'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] && contains(fromJSON(needs.prepare-workflow.outputs.os-labels), '<{ os.slug }>') }}
<%- endif %>
needs:
- prepare-workflow
- build-pkgs-onedir
- build-ci-deps
uses: ./.github/workflows/test-packages-action-windows.yml
with:
distro-slug: <{ os.slug }>
nox-session: ci-test-onedir
platform: windows
arch: <{ os.arch }>
salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
pkg-type: <{ os.pkg_type }>
nox-version: <{ nox_version }>
python-version: "<{ gh_actions_workflows_python_version }>"
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|<{ python_version }>
skip-code-coverage: <{ skip_test_coverage_check }>
testing-releases: ${{ needs.prepare-workflow.outputs.testing-releases }}
<%- endfor %>
matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['pkg-test-matrix']) }}
linux_arm_runner: ${{ fromJSON(needs.prepare-workflow.outputs.config)['linux_arm_runner'] }}

View file

@@ -1,103 +1,24 @@
{#-
Full test runs: each chunk should never take more than 2 hours. We allow 3, and on Windows we add 30 more minutes.
Partial test runs (no chunk parallelization): 6 hours.
#}
<%- set full_testrun_timeout_value = 180 %>
<%- set partial_testrun_timeout_value = 360 %>
<%- set windows_full_testrun_timeout_value = full_testrun_timeout_value + 30 %>
<%- for os in test_salt_listing["windows"] %>
<{ os.slug.replace(".", "") }>:
<%- do test_salt_needs.append(os.slug.replace(".", "")) %>
name: <{ os.display_name }> Test
<%- if workflow_slug != "ci" or os.slug in mandatory_os_slugs %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
<%- else %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] && contains(fromJSON(needs.prepare-workflow.outputs.os-labels), '<{ os.slug }>') }}
<%- endif %>
<%- if workflow_slug in ("nightly", "scheduled") %>
<%- set timeout_value = 360 %>
<%- else %>
<%- set timeout_value = 180 %>
<%- endif %>
test:
name: Test Salt
if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['test'] }}
needs:
- prepare-workflow
- build-ci-deps
uses: ./.github/workflows/test-action-windows.yml
uses: ./.github/workflows/test-action.yml
with:
distro-slug: <{ os.slug }>
nox-session: ci-test-onedir
platform: windows
arch: amd64
nox-version: <{ nox_version }>
gh-actions-python-version: "<{ gh_actions_workflows_python_version }>"
testrun: ${{ needs.prepare-workflow.outputs.testrun }}
python-version: "<{ gh_actions_workflows_python_version }>"
testrun: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['testrun']) }}
salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|<{ python_version }>
skip-code-coverage: <{ skip_test_coverage_check }>
workflow-slug: <{ workflow_slug }>
timeout-minutes: ${{ fromJSON(needs.prepare-workflow.outputs.testrun)['type'] == 'full' && <{ windows_full_testrun_timeout_value }> || <{ partial_testrun_timeout_value }> }}
<%- endfor %>
<%- for os in test_salt_listing["macos"] %>
<{ os.slug.replace(".", "") }>:
<%- do test_salt_needs.append(os.slug.replace(".", "")) %>
name: <{ os.display_name }> Test
<%- if workflow_slug != "ci" or os.slug in mandatory_os_slugs %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test'] && fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
<%- else %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test'] && fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] && contains(fromJSON(needs.prepare-workflow.outputs.os-labels), '<{ os.slug }>') }}
<%- endif %>
needs:
- prepare-workflow
- build-ci-deps
uses: ./.github/workflows/test-action-macos.yml
with:
distro-slug: <{ os.slug }>
runner: <{ os.runner }>
nox-session: ci-test-onedir
platform: macos
arch: <{ os.arch }>
nox-version: <{ nox_version }>
gh-actions-python-version: "<{ gh_actions_workflows_python_version }>"
testrun: ${{ needs.prepare-workflow.outputs.testrun }}
salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|<{ python_version }>
skip-code-coverage: <{ skip_test_coverage_check }>
workflow-slug: <{ workflow_slug }>
timeout-minutes: ${{ fromJSON(needs.prepare-workflow.outputs.testrun)['type'] == 'full' && <{ full_testrun_timeout_value }> || <{ partial_testrun_timeout_value }> }}
<%- endfor %>
<%- for os in test_salt_listing["linux"] %>
<%- set job_name = "{}{}".format(os.slug.replace(".", ""), os.fips and '-fips' or '') %>
<{ job_name }>:
<%- do test_salt_needs.append(job_name) %>
name: <{ os.display_name }> Test<%- if os.fips %> (fips)<%- endif %>
<%- if workflow_slug != "ci" or os.slug in mandatory_os_slugs %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
<%- else %>
if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] && contains(fromJSON(needs.prepare-workflow.outputs.os-labels), '<{ os.slug }>') }}
<%- endif %>
needs:
- prepare-workflow
- build-ci-deps
uses: ./.github/workflows/test-action-linux.yml
with:
distro-slug: <{ os.slug }>
nox-session: ci-test-onedir
platform: linux
arch: <{ os.arch }>
nox-version: <{ nox_version }>
gh-actions-python-version: "<{ gh_actions_workflows_python_version }>"
testrun: ${{ needs.prepare-workflow.outputs.testrun }}
salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|<{ python_version }>
skip-code-coverage: <{ skip_test_coverage_check }>
workflow-slug: <{ workflow_slug }>
timeout-minutes: ${{ fromJSON(needs.prepare-workflow.outputs.testrun)['type'] == 'full' && <{ full_testrun_timeout_value }> || <{ partial_testrun_timeout_value }> }}
<%- if os.fips %>
fips: true
<%- endif %>
<%- endfor %>
default-timeout: <{ timeout_value }>
matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['test-matrix']) }}
linux_arm_runner: ${{ fromJSON(needs.prepare-workflow.outputs.config)['linux_arm_runner'] }}
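
The per-platform Windows, macOS, and Linux test jobs above are folded into a single `test` job that passes a JSON `matrix`, the `linux_arm_runner`, and a `default-timeout` to the reusable `./.github/workflows/test-action.yml`. A minimal sketch of how the receiving workflow might declare and use those inputs, assuming it mirrors the input style of the removed per-platform workflows (illustrative only; the real test-action.yml is not shown in this diff):

  # .github/workflows/test-action.yml (hypothetical excerpt)
  on:
    workflow_call:
      inputs:
        default-timeout:
          required: false
          type: number
          description: Timeout, in minutes, for each test job
          default: 360
        matrix:
          required: true
          type: string
          description: JSON string describing the test matrix

  jobs:
    test:
      name: Test
      # The caller-supplied timeout replaces the per-OS timeout arithmetic above.
      timeout-minutes: ${{ inputs.default-timeout }}
      # Runner selection from the matrix entry is an assumption; field names depend
      # on what `tools ci workflow-config` generates.
      runs-on: ${{ matrix.runner }}
      strategy:
        fail-fast: false
        matrix:
          include: ${{ fromJSON(inputs.matrix) }}
      steps:
        - uses: actions/checkout@v4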

View file

@@ -6,7 +6,7 @@
<%- do conclusion_needs.append(job_name) %>
name: Trigger Branch Workflows
if: ${{ github.event_name == 'schedule' && fromJSON(needs.workflow-requirements.outputs.requirements-met) }}
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
needs:
- workflow-requirements

View file

@@ -4,7 +4,7 @@
<{ job_name }>:
<%- do prepare_workflow_needs.append(job_name) %>
name: Check Workflow Requirements
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
outputs:
requirements-met: ${{ steps.check-requirements.outputs.requirements-met }}
steps:

View file

@@ -1,402 +0,0 @@
---
name: Test Artifact
on:
workflow_call:
inputs:
distro-slug:
required: true
type: string
description: The OS slug to run tests against
nox-session:
required: true
type: string
description: The nox session to run
testrun:
required: true
type: string
description: JSON string containing information about what and how to run the test suite
salt-version:
type: string
required: true
description: The Salt version to set prior to running tests.
cache-prefix:
required: true
type: string
description: Seed used to invalidate caches
platform:
required: true
type: string
description: The platform being tested
arch:
required: true
type: string
description: The platform arch being tested
nox-version:
required: true
type: string
description: The nox version to install
timeout-minutes:
required: true
type: number
description: Timeout, in minutes, for the test job
gh-actions-python-version:
required: false
type: string
description: The python version to run tests with
default: "3.10"
fips:
required: false
type: boolean
default: false
description: Test run with FIPS enabled
package-name:
required: false
type: string
description: The onedir package name to use
default: salt
skip-code-coverage:
required: false
type: boolean
description: Skip code coverage
default: false
workflow-slug:
required: false
type: string
description: Which workflow is running.
default: ci
env:
COLUMNS: 190
AWS_MAX_ATTEMPTS: "10"
AWS_RETRY_MODE: "adaptive"
PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}
PIP_DISABLE_PIP_VERSION_CHECK: "1"
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
jobs:
generate-matrix:
name: Test Matrix
runs-on: ubuntu-latest
outputs:
matrix-include: ${{ steps.generate-matrix.outputs.matrix }}
build-reports: ${{ steps.generate-matrix.outputs.build-reports }}
steps:
- name: "Throttle Builds"
shell: bash
run: |
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cache-prefix: ${{ inputs.cache-prefix }}
env:
PIP_INDEX_URL: https://pypi.org/simple
- name: Generate Test Matrix
id: generate-matrix
run: |
tools ci matrix --workflow=${{ inputs.workflow-slug }} ${{ fromJSON(inputs.testrun)['type'] == 'full' && '--full ' || '' }}${{ inputs.distro-slug }}
test:
name: Test
runs-on:
- self-hosted
- linux
- bastion
timeout-minutes: ${{ inputs.timeout-minutes }}
needs:
- generate-matrix
strategy:
fail-fast: false
matrix:
include: ${{ fromJSON(needs.generate-matrix.outputs.matrix-include) }}
env:
SALT_TRANSPORT: ${{ matrix.transport }}
TEST_GROUP: ${{ matrix.test-group || 1 }}
steps:
- name: "Throttle Builds"
shell: bash
run: |
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
- name: "Set `TIMESTAMP` environment variable"
shell: bash
run: |
echo "TIMESTAMP=$(date +%s)" | tee -a "$GITHUB_ENV"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Setup Salt Version
run: |
echo "${{ inputs.salt-version }}" > salt/_version.txt
- name: Download Onedir Tarball as an Artifact
uses: actions/download-artifact@v4
with:
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
path: artifacts/
- name: Decompress Onedir Tarball
shell: bash
run: |
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
cd artifacts
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
- name: Download nox.linux.${{ inputs.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
uses: actions/download-artifact@v4
with:
name: nox-linux-${{ inputs.arch }}-${{ inputs.nox-session }}
- name: PyPi Proxy
run: |
sed -i '7s;^;--index-url=${{ vars.PIP_INDEX_URL }} --trusted-host ${{ vars.PIP_TRUSTED_HOST }} --extra-index-url=${{ vars.PIP_EXTRA_INDEX_URL }}\n;' requirements/static/ci/*/*.txt
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cache-prefix: ${{ inputs.cache-prefix }}
- name: Download testrun-changed-files.txt
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' }}
uses: actions/download-artifact@v4
with:
name: testrun-changed-files.txt
- name: Get Salt Project GitHub Actions Bot Environment
run: |
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
- name: Start VM
id: spin-up-vm
env:
TESTS_CHUNK: ${{ matrix.tests-chunk }}
run: |
tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ inputs.distro-slug }}
- name: List Free Space
run: |
tools --timestamps vm ssh ${{ inputs.distro-slug }} -- df -h || true
- name: Upload Checkout To VM
run: |
tools --timestamps vm rsync ${{ inputs.distro-slug }}
- name: Decompress .nox Directory
run: |
tools --timestamps vm decompress-dependencies ${{ inputs.distro-slug }}
- name: Show System Info
run: |
tools --timestamps --timeout-secs=1800 vm test --skip-requirements-install --print-system-information-only \
--nox-session=${{ inputs.nox-session }} ${{ inputs.distro-slug }} \
${{ matrix.tests-chunk }}
- name: Run Changed Tests
id: run-fast-changed-tests
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' }}
run: |
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ inputs.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
${{ matrix.tests-chunk }} -- --core-tests --slow-tests --suppress-no-test-exit-code \
--from-filenames=testrun-changed-files.txt
- name: Run Fast Tests
id: run-fast-tests
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['fast'] }}
run: |
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ (inputs.skip-code-coverage && matrix.tests-chunk != 'unit') && '--skip-code-coverage' || '' }} \
${{ inputs.fips && '--fips ' || '' }}${{ inputs.distro-slug }} ${{ matrix.tests-chunk }}
- name: Run Slow Tests
id: run-slow-tests
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['slow'] }}
run: |
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ inputs.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
${{ matrix.tests-chunk }} -- --no-fast-tests --slow-tests
- name: Run Core Tests
id: run-core-tests
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['core'] }}
run: |
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ inputs.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
${{ matrix.tests-chunk }} -- --no-fast-tests --core-tests
- name: Run Flaky Tests
id: run-flaky-tests
if: ${{ fromJSON(inputs.testrun)['selected_tests']['flaky'] }}
run: |
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ inputs.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
${{ matrix.tests-chunk }} -- --no-fast-tests --flaky-jail
- name: Run Full Tests
id: run-full-tests
if: ${{ fromJSON(inputs.testrun)['type'] == 'full' }}
run: |
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ (inputs.skip-code-coverage && matrix.tests-chunk != 'unit') && '--skip-code-coverage' || '' }} \
-E TEST_GROUP ${{ inputs.fips && '--fips ' || '' }}${{ inputs.distro-slug }} ${{ matrix.tests-chunk }} -- --slow-tests --core-tests \
--test-group-count=${{ matrix.test-group-count || 1 }} --test-group=${{ matrix.test-group || 1 }}
- name: Combine Coverage Reports
if: always() && inputs.skip-code-coverage == false && steps.spin-up-vm.outcome == 'success'
run: |
tools --timestamps vm combine-coverage ${{ inputs.distro-slug }}
- name: Download Test Run Artifacts
id: download-artifacts-from-vm
if: always() && steps.spin-up-vm.outcome == 'success'
run: |
tools --timestamps vm download-artifacts ${{ inputs.distro-slug }}
# Delete the salt onedir; we won't need it anymore, and removing it prevents
# it from showing in the tree command below
rm -rf artifacts/salt*
tree -a artifacts
if [ "${{ inputs.skip-code-coverage }}" != "true" ]; then
mv artifacts/coverage/.coverage artifacts/coverage/.coverage.${{ inputs.distro-slug }}${{ inputs.fips && '.fips' || '' }}.${{ inputs.nox-session }}.${{ matrix.transport }}.${{ matrix.tests-chunk }}.grp${{ matrix.test-group || '1' }}
fi
- name: Destroy VM
if: always()
run: |
tools --timestamps vm destroy --no-wait ${{ inputs.distro-slug }} || true
- name: Upload Code Coverage Test Run Artifacts
if: always() && inputs.skip-code-coverage == false && steps.download-artifacts-from-vm.outcome == 'success' && job.status != 'cancelled'
uses: actions/upload-artifact@v4
with:
name: testrun-coverage-artifacts-${{ inputs.distro-slug }}${{ inputs.fips && '-fips' || '' }}-${{ inputs.nox-session }}-${{ matrix.transport }}-${{ matrix.tests-chunk }}-grp${{ matrix.test-group || '1' }}-${{ env.TIMESTAMP }}
path: |
artifacts/coverage/
- name: Upload JUnit XML Test Run Artifacts
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
uses: actions/upload-artifact@v4
with:
name: testrun-junit-artifacts-${{ inputs.distro-slug }}${{ inputs.fips && '-fips' || '' }}-${{ inputs.nox-session }}-${{ matrix.transport }}-${{ matrix.tests-chunk }}-grp${{ matrix.test-group || '1' }}-${{ env.TIMESTAMP }}
path: |
artifacts/xml-unittests-output/
- name: Upload Test Run Log Artifacts
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
uses: actions/upload-artifact@v4
with:
name: testrun-log-artifacts-${{ inputs.distro-slug }}${{ inputs.fips && '-fips' || '' }}-${{ inputs.nox-session }}-${{ matrix.transport }}-${{ matrix.tests-chunk }}-grp${{ matrix.test-group || '1' }}-${{ env.TIMESTAMP }}
path: |
artifacts/logs
report:
name: Test Reports
if: always() && fromJSON(needs.generate-matrix.outputs.build-reports) && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
runs-on: ubuntu-latest
needs:
- test
- generate-matrix
env:
PIP_INDEX_URL: https://pypi.org/simple
steps:
- name: Checkout Source Code
uses: actions/checkout@v4
- name: "Throttle Builds"
shell: bash
run: |
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
- name: Merge JUnit XML Test Run Artifacts
if: always() && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
continue-on-error: true
uses: actions/upload-artifact/merge@v4
with:
name: testrun-junit-artifacts-${{ inputs.distro-slug }}${{ inputs.fips && '-fips' || '' }}-${{ inputs.nox-session }}
pattern: testrun-junit-artifacts-${{ inputs.distro-slug }}${{ inputs.fips && '-fips' || '' }}-${{ inputs.nox-session }}-*
separate-directories: false
delete-merged: true
- name: Merge Log Test Run Artifacts
if: always() && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
continue-on-error: true
uses: actions/upload-artifact/merge@v4
with:
name: testrun-log-artifacts-${{ inputs.distro-slug }}${{ inputs.fips && '-fips' || '' }}-${{ inputs.nox-session }}
pattern: testrun-log-artifacts-${{ inputs.distro-slug }}${{ inputs.fips && '-fips' || '' }}-${{ inputs.nox-session }}-*
separate-directories: false
delete-merged: true
- name: Merge Code Coverage Test Run Artifacts
if: ${{ inputs.skip-code-coverage == false }}
continue-on-error: true
uses: actions/upload-artifact/merge@v4
with:
name: testrun-coverage-artifacts-${{ inputs.distro-slug }}${{ inputs.fips && '-fips' || '' }}-${{ inputs.nox-session }}
pattern: testrun-coverage-artifacts-${{ inputs.distro-slug }}${{ inputs.fips && '-fips' || '' }}-${{ inputs.nox-session }}-*
separate-directories: false
delete-merged: true
- name: Download Code Coverage Test Run Artifacts
uses: actions/download-artifact@v4
if: ${{ inputs.skip-code-coverage == false }}
id: download-coverage-artifacts
with:
name: testrun-coverage-artifacts-${{ inputs.distro-slug }}${{ inputs.fips && '-fips' || '' }}-${{ inputs.nox-session }}
path: artifacts/coverage/
- name: Show Downloaded Test Run Artifacts
if: ${{ inputs.skip-code-coverage == false }}
run: |
tree -a artifacts
- name: Install Nox
run: |
python3 -m pip install 'nox==${{ inputs.nox-version }}'
- name: Create XML Coverage Reports
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success' && job.status != 'cancelled'
run: |
nox --force-color -e create-xml-coverage-reports
mv artifacts/coverage/salt.xml artifacts/coverage/salt..${{ inputs.distro-slug }}${{ inputs.fips && '..fips' || '' }}..${{ inputs.nox-session }}.xml
mv artifacts/coverage/tests.xml artifacts/coverage/tests..${{ inputs.distro-slug }}${{ inputs.fips && '..fips' || '' }}..${{ inputs.nox-session }}.xml
- name: Report Salt Code Coverage
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
continue-on-error: true
run: |
nox --force-color -e report-coverage -- salt
- name: Report Combined Code Coverage
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
continue-on-error: true
run: |
nox --force-color -e report-coverage
- name: Rename Code Coverage DB
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
continue-on-error: true
run: |
mv artifacts/coverage/.coverage artifacts/coverage/.coverage.${{ inputs.distro-slug }}${{ inputs.fips && '.fips' || '' }}.${{ inputs.nox-session }}
- name: Upload Code Coverage DB
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
uses: actions/upload-artifact@v4
with:
name: all-testrun-coverage-artifacts-${{ inputs.distro-slug }}${{ inputs.fips && '-fips' || '' }}-${{ inputs.nox-session }}
path: artifacts/coverage

View file

@@ -1,437 +0,0 @@
---
name: Test Artifact(macOS)
on:
workflow_call:
inputs:
distro-slug:
required: true
type: string
description: The OS slug to run tests against
runner:
required: true
type: string
description: The GitHub runner name
nox-session:
required: true
type: string
description: The nox session to run
testrun:
required: true
type: string
description: JSON string containing information about what and how to run the test suite
gh-actions-python-version:
required: false
type: string
description: The python version to run tests with
default: "3.11"
salt-version:
type: string
required: true
description: The Salt version to set prior to running tests.
cache-prefix:
required: true
type: string
description: Seed used to invalidate caches
platform:
required: true
type: string
description: The platform being tested
arch:
required: true
type: string
description: The platform arch being tested
nox-version:
required: true
type: string
description: The nox version to install
timeout-minutes:
required: true
type: number
description: Timeout, in minutes, for the test job
package-name:
required: false
type: string
description: The onedir package name to use
default: salt
skip-code-coverage:
required: false
type: boolean
description: Skip code coverage
default: false
workflow-slug:
required: false
type: string
description: Which workflow is running.
default: ci
env:
COLUMNS: 190
PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}
PIP_DISABLE_PIP_VERSION_CHECK: "1"
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
jobs:
generate-matrix:
name: Test Matrix
runs-on: ubuntu-latest
outputs:
matrix-include: ${{ steps.generate-matrix.outputs.matrix }}
build-reports: ${{ steps.generate-matrix.outputs.build-reports }}
steps:
- name: "Throttle Builds"
shell: bash
run: |
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cache-prefix: ${{ inputs.cache-prefix }}
env:
PIP_INDEX_URL: https://pypi.org/simple
- name: Generate Test Matrix
id: generate-matrix
run: |
tools ci matrix --workflow=${{ inputs.workflow-slug }} ${{ inputs.distro-slug }}
test:
name: Test
runs-on: ${{ inputs.runner }}
timeout-minutes: ${{ inputs.timeout-minutes }}
needs:
- generate-matrix
strategy:
fail-fast: false
matrix:
include: ${{ fromJSON(needs.generate-matrix.outputs.matrix-include) }}
env:
SALT_TRANSPORT: ${{ matrix.transport }}
steps:
- name: "Throttle Builds"
shell: bash
run: |
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
- name: "Set `TIMESTAMP` environment variable"
shell: bash
run: |
echo "TIMESTAMP=$(date +%s)" | tee -a "$GITHUB_ENV"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Setup Salt Version
run: |
echo "${{ inputs.salt-version }}" > salt/_version.txt
- name: Download Onedir Tarball as an Artifact
uses: actions/download-artifact@v4
with:
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
path: artifacts/
- name: Decompress Onedir Tarball
shell: bash
run: |
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
cd artifacts
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
- name: Install System Dependencies
run: |
brew install tree
- name: Download nox.macos.${{ inputs.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
uses: actions/download-artifact@v4
with:
name: nox-macos-${{ inputs.arch }}-${{ inputs.nox-session }}
- name: Set up Python ${{ inputs.gh-actions-python-version }}
uses: actions/setup-python@v5
with:
python-version: "${{ inputs.gh-actions-python-version }}"
- name: Install Nox
run: |
python3 -m pip install 'nox==${{ inputs.nox-version }}'
env:
PIP_INDEX_URL: https://pypi.org/simple
- name: Decompress .nox Directory
run: |
nox --force-color -e decompress-dependencies -- macos ${{ inputs.arch }}
- name: Download testrun-changed-files.txt
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' }}
uses: actions/download-artifact@v4
with:
name: testrun-changed-files.txt
- name: Show System Info
env:
SKIP_REQUIREMENTS_INSTALL: "1"
PRINT_SYSTEM_INFO_ONLY: "1"
run: |
sudo -E nox --force-color -e ${{ inputs.nox-session }} -- ${{ matrix.tests-chunk }}
- name: Run Changed Tests
id: run-fast-changed-tests
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' }}
env:
SKIP_REQUIREMENTS_INSTALL: "1"
PRINT_TEST_SELECTION: "0"
PRINT_TEST_PLAN_ONLY: "0"
PRINT_SYSTEM_INFO: "0"
RERUN_FAILURES: "1"
GITHUB_ACTIONS_PIPELINE: "1"
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
SKIP_CODE_COVERAGE: "${{ inputs.skip-code-coverage && '1' || '0' }}"
COVERAGE_CONTEXT: ${{ inputs.distro-slug }}
run: |
sudo -E nox --force-color -e ${{ inputs.nox-session }} -- ${{ matrix.tests-chunk }} -- \
-k "mac or darwin" --core-tests --slow-tests --suppress-no-test-exit-code \
--from-filenames=testrun-changed-files.txt
- name: Run Fast Tests
id: run-fast-tests
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['fast'] }}
env:
SKIP_REQUIREMENTS_INSTALL: "1"
PRINT_TEST_SELECTION: "0"
PRINT_TEST_PLAN_ONLY: "0"
PRINT_SYSTEM_INFO: "0"
RERUN_FAILURES: "1"
GITHUB_ACTIONS_PIPELINE: "1"
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
SKIP_CODE_COVERAGE: "${{ inputs.skip-code-coverage && '1' || '0' }}"
COVERAGE_CONTEXT: ${{ inputs.distro-slug }}
run: |
sudo -E nox --force-color -e ${{ inputs.nox-session }} -- ${{ matrix.tests-chunk }} -- \
-k "mac or darwin" --suppress-no-test-exit-code
- name: Run Slow Tests
id: run-slow-tests
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['slow'] }}
env:
SKIP_REQUIREMENTS_INSTALL: "1"
PRINT_TEST_SELECTION: "0"
PRINT_TEST_PLAN_ONLY: "0"
PRINT_SYSTEM_INFO: "0"
RERUN_FAILURES: "1"
GITHUB_ACTIONS_PIPELINE: "1"
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
SKIP_CODE_COVERAGE: "${{ inputs.skip-code-coverage && '1' || '0' }}"
COVERAGE_CONTEXT: ${{ inputs.distro-slug }}
run: |
sudo -E nox --force-color -e ${{ inputs.nox-session }} -- ${{ matrix.tests-chunk }} -- \
-k "mac or darwin" --suppress-no-test-exit-code --no-fast-tests --slow-tests
- name: Run Core Tests
id: run-core-tests
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['core'] }}
env:
SKIP_REQUIREMENTS_INSTALL: "1"
PRINT_TEST_SELECTION: "0"
PRINT_TEST_PLAN_ONLY: "0"
PRINT_SYSTEM_INFO: "0"
RERUN_FAILURES: "1"
GITHUB_ACTIONS_PIPELINE: "1"
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
SKIP_CODE_COVERAGE: "${{ inputs.skip-code-coverage && '1' || '0' }}"
COVERAGE_CONTEXT: ${{ inputs.distro-slug }}
run: |
sudo -E nox --force-color -e ${{ inputs.nox-session }} -- ${{ matrix.tests-chunk }} -- \
-k "mac or darwin" --suppress-no-test-exit-code --no-fast-tests --core-tests
- name: Run Flaky Tests
id: run-flaky-tests
if: ${{ fromJSON(inputs.testrun)['selected_tests']['flaky'] }}
env:
SKIP_REQUIREMENTS_INSTALL: "1"
PRINT_TEST_SELECTION: "0"
PRINT_TEST_PLAN_ONLY: "0"
PRINT_SYSTEM_INFO: "0"
RERUN_FAILURES: "1"
GITHUB_ACTIONS_PIPELINE: "1"
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
SKIP_CODE_COVERAGE: "${{ inputs.skip-code-coverage && '1' || '0' }}"
COVERAGE_CONTEXT: ${{ inputs.distro-slug }}
run: |
sudo -E nox --force-color -e ${{ inputs.nox-session }} -- ${{ matrix.tests-chunk }} -- \
-k "mac or darwin" --suppress-no-test-exit-code --no-fast-tests --flaky-jail
- name: Run Full Tests
id: run-full-tests
if: ${{ fromJSON(inputs.testrun)['type'] == 'full' }}
env:
SKIP_REQUIREMENTS_INSTALL: "1"
PRINT_TEST_SELECTION: "0"
PRINT_TEST_PLAN_ONLY: "0"
PRINT_SYSTEM_INFO: "0"
RERUN_FAILURES: "1"
GITHUB_ACTIONS_PIPELINE: "1"
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
SKIP_CODE_COVERAGE: "${{ inputs.skip-code-coverage && '1' || '0' }}"
COVERAGE_CONTEXT: ${{ inputs.distro-slug }}
run: |
sudo -E nox --force-color -e ${{ inputs.nox-session }} -- ${{ matrix.tests-chunk }} -- \
--slow-tests --core-tests -k "mac or darwin"
- name: Fix file ownership
run: |
sudo chown -R "$(id -un)" .
- name: Combine Coverage Reports
if: always() && inputs.skip-code-coverage == false
run: |
nox --force-color -e combine-coverage
- name: Prepare Test Run Artifacts
id: download-artifacts-from-vm
if: always()
run: |
# Delete the salt onedir, we won't need it anymore and it will prevent
# it from showing in the tree command below
rm -rf artifacts/salt*
tree -a artifacts
if [ "${{ inputs.skip-code-coverage }}" != "true" ]; then
mv artifacts/coverage/.coverage artifacts/coverage/.coverage.${{ inputs.distro-slug }}.${{ inputs.nox-session }}.${{ matrix.transport }}.${{ matrix.tests-chunk }}
fi
- name: Upload Code Coverage Test Run Artifacts
if: always() && inputs.skip-code-coverage == false && steps.download-artifacts-from-vm.outcome == 'success' && job.status != 'cancelled'
uses: actions/upload-artifact@v4
with:
name: testrun-coverage-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}-${{ matrix.tests-chunk }}-${{ env.TIMESTAMP }}
path: |
artifacts/coverage/
- name: Upload JUnit XML Test Run Artifacts
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
uses: actions/upload-artifact@v4
with:
name: testrun-junit-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}-${{ matrix.tests-chunk }}-${{ env.TIMESTAMP }}
path: |
artifacts/xml-unittests-output/
- name: Upload Test Run Log Artifacts
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
uses: actions/upload-artifact@v4
with:
name: testrun-log-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}-${{ matrix.tests-chunk }}-${{ env.TIMESTAMP }}
path: |
artifacts/logs
report:
name: Test Reports
if: always() && fromJSON(needs.generate-matrix.outputs.build-reports) && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
runs-on: ubuntu-latest
needs:
- test
- generate-matrix
env:
PIP_INDEX_URL: https://pypi.org/simple
steps:
- name: Checkout Source Code
uses: actions/checkout@v4
- name: "Throttle Builds"
shell: bash
run: |
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
- name: Merge JUnit XML Test Run Artifacts
if: always() && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
continue-on-error: true
uses: actions/upload-artifact/merge@v4
with:
name: testrun-junit-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}
pattern: testrun-junit-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-*
separate-directories: false
delete-merged: true
- name: Merge Log Test Run Artifacts
if: always() && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
continue-on-error: true
uses: actions/upload-artifact/merge@v4
with:
name: testrun-log-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}
pattern: testrun-log-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-*
separate-directories: false
delete-merged: true
- name: Merge Code Coverage Test Run Artifacts
if: ${{ inputs.skip-code-coverage == false }}
continue-on-error: true
uses: actions/upload-artifact/merge@v4
with:
name: testrun-coverage-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}
pattern: testrun-coverage-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-*
separate-directories: false
delete-merged: true
- name: Download Code Coverage Test Run Artifacts
uses: actions/download-artifact@v4
if: ${{ inputs.skip-code-coverage == false }}
id: download-coverage-artifacts
with:
name: testrun-coverage-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}
path: artifacts/coverage/
- name: Show Downloaded Test Run Artifacts
if: ${{ inputs.skip-code-coverage == false }}
run: |
tree -a artifacts
- name: Set up Python ${{ inputs.gh-actions-python-version }}
uses: actions/setup-python@v5
with:
python-version: "${{ inputs.gh-actions-python-version }}"
- name: Install Nox
run: |
python3 -m pip install 'nox==${{ inputs.nox-version }}'
- name: Create XML Coverage Reports
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success' && job.status != 'cancelled'
run: |
nox --force-color -e create-xml-coverage-reports
mv artifacts/coverage/salt.xml artifacts/coverage/salt..${{ inputs.distro-slug }}..${{ inputs.nox-session }}.xml
mv artifacts/coverage/tests.xml artifacts/coverage/tests..${{ inputs.distro-slug }}..${{ inputs.nox-session }}.xml
- name: Report Salt Code Coverage
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
continue-on-error: true
run: |
nox --force-color -e report-coverage -- salt
- name: Report Combined Code Coverage
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
continue-on-error: true
run: |
nox --force-color -e report-coverage
- name: Rename Code Coverage DB
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
continue-on-error: true
run: |
mv artifacts/coverage/.coverage artifacts/coverage/.coverage.${{ inputs.distro-slug }}.${{ inputs.nox-session }}
- name: Upload Code Coverage DB
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
uses: actions/upload-artifact@v4
with:
name: all-testrun-coverage-artifacts-${{ inputs.distro-slug }}.${{ inputs.nox-session }}
path: artifacts/coverage


@ -1,403 +0,0 @@
---
name: Test Artifact
on:
workflow_call:
inputs:
distro-slug:
required: true
type: string
description: The OS slug to run tests against
nox-session:
required: true
type: string
description: The nox session to run
testrun:
required: true
type: string
description: JSON string containing information about what and how to run the test suite
salt-version:
type: string
required: true
description: The Salt version to set prior to running tests.
cache-prefix:
required: true
type: string
description: Seed used to invalidate caches
platform:
required: true
type: string
description: The platform being tested
arch:
required: true
type: string
description: The platform arch being tested
nox-version:
required: true
type: string
description: The nox version to install
timeout-minutes:
required: true
type: number
description: Timeout, in minutes, for the test job
gh-actions-python-version:
required: false
type: string
description: The python version to run tests with
default: "3.10"
fips:
required: false
type: boolean
default: false
description: Test run with FIPS enabled
package-name:
required: false
type: string
description: The onedir package name to use
default: salt
skip-code-coverage:
required: false
type: boolean
description: Skip code coverage
default: false
workflow-slug:
required: false
type: string
description: Which workflow is running.
default: ci
env:
COLUMNS: 190
AWS_MAX_ATTEMPTS: "10"
AWS_RETRY_MODE: "adaptive"
PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}
PIP_DISABLE_PIP_VERSION_CHECK: "1"
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
jobs:
generate-matrix:
name: Test Matrix
runs-on: ubuntu-latest
outputs:
matrix-include: ${{ steps.generate-matrix.outputs.matrix }}
build-reports: ${{ steps.generate-matrix.outputs.build-reports }}
steps:
- name: "Throttle Builds"
shell: bash
run: |
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cache-prefix: ${{ inputs.cache-prefix }}
env:
PIP_INDEX_URL: https://pypi.org/simple
- name: Generate Test Matrix
id: generate-matrix
run: |
tools ci matrix --workflow=${{ inputs.workflow-slug }} ${{ fromJSON(inputs.testrun)['type'] == 'full' && '--full ' || '' }}${{ inputs.fips && '--fips ' || '' }}${{ inputs.distro-slug }}
test:
name: Test
runs-on:
- self-hosted
- linux
- bastion
timeout-minutes: ${{ inputs.timeout-minutes }}
needs:
- generate-matrix
strategy:
fail-fast: false
matrix:
include: ${{ fromJSON(needs.generate-matrix.outputs.matrix-include) }}
env:
SALT_TRANSPORT: ${{ matrix.transport }}
TEST_GROUP: ${{ matrix.test-group || 1 }}
steps:
- name: "Throttle Builds"
shell: bash
run: |
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
- name: "Set `TIMESTAMP` environment variable"
shell: bash
run: |
echo "TIMESTAMP=$(date +%s)" | tee -a "$GITHUB_ENV"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Setup Salt Version
run: |
echo "${{ inputs.salt-version }}" > salt/_version.txt
- name: Download Onedir Tarball as an Artifact
uses: actions/download-artifact@v4
with:
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
path: artifacts/
- name: Decompress Onedir Tarball
shell: bash
run: |
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
cd artifacts
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
- name: Download nox.windows.${{ inputs.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
uses: actions/download-artifact@v4
with:
name: nox-windows-${{ inputs.arch }}-${{ inputs.nox-session }}
- name: PyPi Proxy
run: |
sed -i '7s;^;--index-url=${{ vars.PIP_INDEX_URL }} --trusted-host ${{ vars.PIP_TRUSTED_HOST }} --extra-index-url=${{ vars.PIP_EXTRA_INDEX_URL }}\n;' requirements/static/ci/*/*.txt
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cache-prefix: ${{ inputs.cache-prefix }}
- name: Download testrun-changed-files.txt
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' }}
uses: actions/download-artifact@v4
with:
name: testrun-changed-files.txt
- name: Get Salt Project GitHub Actions Bot Environment
run: |
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
- name: Start VM
id: spin-up-vm
env:
TESTS_CHUNK: ${{ matrix.tests-chunk }}
run: |
tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ inputs.distro-slug }}
- name: List Free Space
run: |
tools --timestamps vm ssh ${{ inputs.distro-slug }} -- df -h || true
- name: Upload Checkout To VM
run: |
tools --timestamps vm rsync ${{ inputs.distro-slug }}
- name: Decompress .nox Directory
run: |
tools --timestamps vm decompress-dependencies ${{ inputs.distro-slug }}
- name: Show System Info
run: |
tools --timestamps --timeout-secs=1800 vm test --skip-requirements-install --print-system-information-only \
--nox-session=${{ inputs.nox-session }} ${{ inputs.distro-slug }} \
${{ matrix.tests-chunk }}
- name: Run Changed Tests
id: run-fast-changed-tests
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' }}
run: |
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
${{ matrix.tests-chunk }} -- --core-tests --slow-tests --suppress-no-test-exit-code \
--from-filenames=testrun-changed-files.txt
- name: Run Fast Tests
id: run-fast-tests
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['fast'] }}
run: |
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ (inputs.skip-code-coverage && matrix.tests-chunk != 'unit') && '--skip-code-coverage' || '' }} \
${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} ${{ matrix.tests-chunk }}
- name: Run Slow Tests
id: run-slow-tests
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['slow'] }}
run: |
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
${{ matrix.tests-chunk }} -- --no-fast-tests --slow-tests
- name: Run Core Tests
id: run-core-tests
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['core'] }}
run: |
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
${{ matrix.tests-chunk }} -- --no-fast-tests --core-tests
- name: Run Flaky Tests
id: run-flaky-tests
if: ${{ fromJSON(inputs.testrun)['selected_tests']['flaky'] }}
run: |
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
${{ matrix.tests-chunk }} -- --no-fast-tests --flaky-jail
- name: Run Full Tests
id: run-full-tests
if: ${{ fromJSON(inputs.testrun)['type'] == 'full' }}
run: |
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ (inputs.skip-code-coverage && matrix.tests-chunk != 'unit') && '--skip-code-coverage' || '' }} \
-E TEST_GROUP ${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} ${{ matrix.tests-chunk }} -- --slow-tests --core-tests \
--test-group-count=${{ matrix.test-group-count || 1 }} --test-group=${{ matrix.test-group || 1 }}
- name: Combine Coverage Reports
if: always() && inputs.skip-code-coverage == false && steps.spin-up-vm.outcome == 'success'
run: |
tools --timestamps vm combine-coverage ${{ inputs.distro-slug }}
- name: Download Test Run Artifacts
id: download-artifacts-from-vm
if: always() && steps.spin-up-vm.outcome == 'success'
run: |
tools --timestamps vm download-artifacts ${{ inputs.distro-slug }}
# Delete the salt onedir, we won't need it anymore and it will prevent
# it from showing in the tree command below
rm -rf artifacts/salt*
tree -a artifacts
if [ "${{ inputs.skip-code-coverage }}" != "true" ]; then
mv artifacts/coverage/.coverage artifacts/coverage/.coverage.${{ inputs.distro-slug }}.${{ inputs.nox-session }}.${{ matrix.transport }}.${{ matrix.tests-chunk }}.grp${{ matrix.test-group || '1' }}
fi
- name: Destroy VM
if: always()
run: |
tools --timestamps vm destroy --no-wait ${{ inputs.distro-slug }} || true
- name: Upload Code Coverage Test Run Artifacts
if: always() && inputs.skip-code-coverage == false && steps.download-artifacts-from-vm.outcome == 'success' && job.status != 'cancelled'
uses: actions/upload-artifact@v4
with:
name: testrun-coverage-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}-${{ matrix.tests-chunk }}-grp${{ matrix.test-group || '1' }}-${{ env.TIMESTAMP }}
path: |
artifacts/coverage/
- name: Upload JUnit XML Test Run Artifacts
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
uses: actions/upload-artifact@v4
with:
name: testrun-junit-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}-${{ matrix.tests-chunk }}-grp${{ matrix.test-group || '1' }}-${{ env.TIMESTAMP }}
path: |
artifacts/xml-unittests-output/
- name: Upload Test Run Log Artifacts
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
uses: actions/upload-artifact@v4
with:
name: testrun-log-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}-${{ matrix.tests-chunk }}-grp${{ matrix.test-group || '1' }}-${{ env.TIMESTAMP }}
path: |
artifacts/logs
report:
name: Test Reports
if: always() && fromJSON(needs.generate-matrix.outputs.build-reports) && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
runs-on: ubuntu-latest
needs:
- test
- generate-matrix
env:
PIP_INDEX_URL: https://pypi.org/simple
steps:
- name: Checkout Source Code
uses: actions/checkout@v4
- name: "Throttle Builds"
shell: bash
run: |
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
- name: Merge JUnit XML Test Run Artifacts
if: always() && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
continue-on-error: true
uses: actions/upload-artifact/merge@v4
with:
name: testrun-junit-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}
pattern: testrun-junit-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-*
separate-directories: false
delete-merged: true
- name: Merge Log Test Run Artifacts
if: always() && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
continue-on-error: true
uses: actions/upload-artifact/merge@v4
with:
name: testrun-log-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}
pattern: testrun-log-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-*
separate-directories: false
delete-merged: true
- name: Merge Code Coverage Test Run Artifacts
if: ${{ inputs.skip-code-coverage == false }}
continue-on-error: true
uses: actions/upload-artifact/merge@v4
with:
name: testrun-coverage-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}
pattern: testrun-coverage-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-*
separate-directories: false
delete-merged: true
- name: Download Code Coverage Test Run Artifacts
uses: actions/download-artifact@v4
if: ${{ inputs.skip-code-coverage == false }}
id: download-coverage-artifacts
with:
name: testrun-coverage-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}
path: artifacts/coverage/
- name: Show Downloaded Test Run Artifacts
if: ${{ inputs.skip-code-coverage == false }}
run: |
tree -a artifacts
- name: Install Nox
run: |
python3 -m pip install 'nox==${{ inputs.nox-version }}'
- name: Create XML Coverage Reports
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success' && job.status != 'cancelled'
run: |
nox --force-color -e create-xml-coverage-reports
mv artifacts/coverage/salt.xml artifacts/coverage/salt..${{ inputs.distro-slug }}..${{ inputs.nox-session }}.xml
mv artifacts/coverage/tests.xml artifacts/coverage/tests..${{ inputs.distro-slug }}..${{ inputs.nox-session }}.xml
- name: Report Salt Code Coverage
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
continue-on-error: true
run: |
nox --force-color -e report-coverage -- salt
- name: Report Combined Code Coverage
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
continue-on-error: true
run: |
nox --force-color -e report-coverage
- name: Rename Code Coverage DB
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
continue-on-error: true
run: |
mv artifacts/coverage/.coverage artifacts/coverage/.coverage.${{ inputs.distro-slug }}.${{ inputs.nox-session }}
- name: Upload Code Coverage DB
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
uses: actions/upload-artifact@v4
with:
name: all-testrun-coverage-artifacts-${{ inputs.distro-slug }}.${{ inputs.nox-session }}
path: artifacts/coverage

.github/workflows/test-action.yml (vendored, normal file, 1392 lines)

File diff suppressed because it is too large.


@ -88,9 +88,7 @@ jobs:
needs:
- generate-matrix
runs-on:
- self-hosted
- linux
- bastion
- ubuntu-latest
env:
USE_S3_CACHE: 'true'
environment: ${{ inputs.environment }}
@ -287,7 +285,7 @@ jobs:
with:
name: pkg-testrun-artifacts-${{ matrix.distro-slug }}-${{ matrix.arch }}-${{ matrix.pkg-type }}
path: |
artifacts
artifacts/
!artifacts/salt/*
!artifacts/salt-*.tar.*
@ -485,7 +483,7 @@ jobs:
with:
name: pkg-testrun-artifacts-${{ matrix.distro-slug }}-${{ matrix.arch }}-${{ matrix.pkg-type }}
path: |
artifacts
artifacts/
!artifacts/salt/*
!artifacts/salt-*.tar.*
@ -497,9 +495,7 @@ jobs:
env:
USE_S3_CACHE: 'true'
runs-on:
- self-hosted
- linux
- bastion
- ubuntu-latest
environment: ${{ inputs.environment }}
timeout-minutes: 120 # 2 Hours - More than this and something is wrong
strategy:
@ -688,6 +684,6 @@ jobs:
with:
name: pkg-testrun-artifacts-${{ matrix.distro-slug }}-${{ matrix.arch }}-${{ matrix.pkg-type }}
path: |
artifacts
artifacts/
!artifacts/salt/*
!artifacts/salt-*.tar.*


@ -1,274 +0,0 @@
name: Test Artifact
on:
workflow_call:
inputs:
distro-slug:
required: true
type: string
description: The OS slug to run tests against
platform:
required: true
type: string
description: The platform being tested
arch:
required: true
type: string
description: The platform arch being tested
pkg-type:
required: true
type: string
description: The package type being tested
salt-version:
type: string
required: true
description: The Salt version of the packages to install and test
cache-prefix:
required: true
type: string
description: Seed used to invalidate caches
testing-releases:
required: true
type: string
description: A JSON list of releases to test upgrades against
nox-version:
required: true
type: string
description: The nox version to install
python-version:
required: false
type: string
description: The python version to run tests with
default: "3.10"
fips:
required: false
type: boolean
default: false
description: Test run with FIPS enabled
package-name:
required: false
type: string
description: The onedir package name to use
default: salt
nox-session:
required: false
type: string
description: The nox session to run
default: ci-test-onedir
skip-code-coverage:
required: false
type: boolean
description: Skip code coverage
default: false
env:
COLUMNS: 190
AWS_MAX_ATTEMPTS: "10"
AWS_RETRY_MODE: "adaptive"
PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}
PIP_DISABLE_PIP_VERSION_CHECK: "1"
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
USE_S3_CACHE: 'true'
jobs:
generate-matrix:
name: Generate Matrix
runs-on:
# We need to run on our self-hosted runners because we need proper credentials
# for boto3 to scan through our repositories.
- self-hosted
- linux
- x86_64
outputs:
pkg-matrix-include: ${{ steps.generate-pkg-matrix.outputs.matrix }}
build-reports: ${{ steps.generate-pkg-matrix.outputs.build-reports }}
steps:
- name: "Throttle Builds"
shell: bash
run: |
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cache-prefix: ${{ inputs.cache-prefix }}
- name: Generate Package Test Matrix
id: generate-pkg-matrix
run: |
tools ci pkg-matrix ${{ inputs.distro-slug }} \
${{ inputs.pkg-type }} --testing-releases ${{ join(fromJSON(inputs.testing-releases), ' ') }}
test:
name: Test
runs-on:
- self-hosted
- linux
- bastion
timeout-minutes: 120 # 2 Hours - More than this and something is wrong
needs:
- generate-matrix
strategy:
fail-fast: false
matrix:
include: ${{ fromJSON(needs.generate-matrix.outputs.pkg-matrix-include) }}
steps:
- name: "Throttle Builds"
shell: bash
run: |
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
- name: "Set `TIMESTAMP` environment variable"
shell: bash
run: |
echo "TIMESTAMP=$(date +%s)" | tee -a "$GITHUB_ENV"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Download Packages
uses: actions/download-artifact@v4
with:
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-${{ inputs.arch }}-${{ inputs.pkg-type }}
path: artifacts/pkg/
- name: Download Onedir Tarball as an Artifact
uses: actions/download-artifact@v4
with:
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
path: artifacts/
- name: Decompress Onedir Tarball
shell: bash
run: |
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
cd artifacts
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
- name: List Packages
run: |
tree artifacts/pkg/
- name: Download nox.linux.${{ inputs.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
uses: actions/download-artifact@v4
with:
name: nox-linux-${{ inputs.arch }}-${{ inputs.nox-session }}
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cache-prefix: ${{ inputs.cache-prefix }}
- name: Get Salt Project GitHub Actions Bot Environment
run: |
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
- name: Start VM
id: spin-up-vm
run: |
tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ inputs.distro-slug }}
- name: List Free Space
run: |
tools --timestamps vm ssh ${{ inputs.distro-slug }} -- df -h || true
- name: Upload Checkout To VM
run: |
tools --timestamps vm rsync ${{ inputs.distro-slug }}
- name: Decompress .nox Directory
run: |
tools --timestamps vm decompress-dependencies ${{ inputs.distro-slug }}
- name: Downgrade importlib-metadata
if: ${{ contains(fromJSON('["amazonlinux-2", "centos-7"]'), inputs.distro-slug) && contains(fromJSON('["upgrade-classic", "downgrade-classic"]'), matrix.tests-chunk) }}
run: |
# This step can go away once we stop testing classic packages upgrade/downgrades to/from 3005.x
tools --timestamps vm ssh ${{ inputs.distro-slug }} -- "sudo python3 -m pip install -U 'importlib-metadata<=4.13.0' 'virtualenv<=20.21.1'"
- name: Show System Info
run: |
tools --timestamps --timeout-secs=1800 vm test --skip-requirements-install --print-system-information-only \
--nox-session=${{ inputs.nox-session }}-pkgs ${{ inputs.distro-slug }} -- ${{ matrix.tests-chunk }}
- name: Run Package Tests
run: |
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install ${{ inputs.fips && '--fips ' || '' }}\
--nox-session=${{ inputs.nox-session }}-pkgs --rerun-failures ${{ inputs.distro-slug }} -- ${{ matrix.tests-chunk }} \
${{ matrix.version && format('--prev-version={0}', matrix.version) || ''}}
- name: Download Test Run Artifacts
id: download-artifacts-from-vm
if: always() && steps.spin-up-vm.outcome == 'success'
run: |
tools --timestamps vm download-artifacts ${{ inputs.distro-slug }}
# Delete the salt onedir, we won't need it anymore and it will prevent
# it from showing in the tree command below
rm -rf artifacts/salt*
tree -a artifacts
- name: Destroy VM
if: always()
run: |
tools --timestamps vm destroy --no-wait ${{ inputs.distro-slug }} || true
- name: Upload Test Run Artifacts
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
uses: actions/upload-artifact@v4
with:
name: pkg-testrun-artifacts-${{ inputs.distro-slug }}${{ inputs.fips && '-fips' || '' }}-${{ inputs.pkg-type }}-${{ inputs.arch }}-${{ matrix.tests-chunk }}-${{ matrix.version || 'no-version'}}-${{ env.TIMESTAMP }}
path: |
artifacts
!artifacts/pkg/*
!artifacts/salt/*
!artifacts/salt-*.tar.*
report:
name: Report
runs-on: ubuntu-latest
if: always() && fromJSON(needs.generate-matrix.outputs.build-reports) && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
needs:
- generate-matrix
- test
steps:
- name: Checkout Source Code
uses: actions/checkout@v4
- name: "Throttle Builds"
shell: bash
run: |
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
- name: Merge Test Run Artifacts
continue-on-error: true
uses: actions/upload-artifact/merge@v4
with:
name: pkg-testrun-artifacts-${{ inputs.distro-slug }}${{ inputs.fips && '-fips' || '' }}-${{ inputs.pkg-type }}
pattern: pkg-testrun-artifacts-${{ inputs.distro-slug }}${{ inputs.fips && '-fips' || '' }}-${{ inputs.pkg-type }}-*
separate-directories: true
delete-merged: true
- name: Download Test Run Artifacts
id: download-test-run-artifacts
uses: actions/download-artifact@v4
with:
name: pkg-testrun-artifacts-${{ inputs.distro-slug }}${{ inputs.fips && '-fips' || '' }}-${{ inputs.pkg-type }}
path: artifacts
- name: Show Test Run Artifacts
if: always() && steps.download-test-run-artifacts.outcome == 'success'
run: |
tree -a artifacts


@ -1,270 +0,0 @@
name: Test Artifact
on:
workflow_call:
inputs:
distro-slug:
required: true
type: string
description: The OS slug to run tests against
runner:
required: true
type: string
description: The GitHub runner name
platform:
required: true
type: string
description: The platform being tested
arch:
required: true
type: string
description: The platform arch being tested
pkg-type:
required: true
type: string
description: The package type being tested
salt-version:
type: string
required: true
description: The Salt version of the packages to install and test
cache-prefix:
required: true
type: string
description: Seed used to invalidate caches
testing-releases:
required: true
type: string
description: A JSON list of releases to test upgrades against
nox-version:
required: true
type: string
description: The nox version to install
python-version:
required: false
type: string
description: The python version to run tests with
default: "3.10"
package-name:
required: false
type: string
description: The onedir package name to use
default: salt
nox-session:
required: false
type: string
description: The nox session to run
default: ci-test-onedir
skip-code-coverage:
required: false
type: boolean
description: Skip code coverage
default: false
env:
COLUMNS: 190
PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}
PIP_DISABLE_PIP_VERSION_CHECK: "1"
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
jobs:
generate-matrix:
name: Generate Matrix
runs-on:
# We need to run on our self-hosted runners because we need proper credentials
# for boto3 to scan through our repositories.
- self-hosted
- linux
- x86_64
outputs:
pkg-matrix-include: ${{ steps.generate-pkg-matrix.outputs.matrix }}
build-reports: ${{ steps.generate-pkg-matrix.outputs.build-reports }}
steps:
- name: "Throttle Builds"
shell: bash
run: |
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cache-prefix: ${{ inputs.cache-prefix }}
- name: Generate Package Test Matrix
id: generate-pkg-matrix
run: |
tools ci pkg-matrix ${{ inputs.distro-slug }} ${{ inputs.pkg-type }} --testing-releases ${{ join(fromJSON(inputs.testing-releases), ' ') }}
test:
name: Test
runs-on: ${{ inputs.runner }}
timeout-minutes: 120 # 2 Hours - More than this and something is wrong
needs:
- generate-matrix
strategy:
fail-fast: false
matrix:
include: ${{ fromJSON(needs.generate-matrix.outputs.pkg-matrix-include) }}
steps:
- name: "Throttle Builds"
shell: bash
run: |
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
- name: "Set `TIMESTAMP` environment variable"
shell: bash
run: |
echo "TIMESTAMP=$(date +%s)" | tee -a "$GITHUB_ENV"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Download Packages
uses: actions/download-artifact@v4
with:
name: salt-${{ inputs.salt-version }}-${{ inputs.arch }}-${{ inputs.pkg-type }}
path: artifacts/pkg/
- name: Install System Dependencies
run: |
brew install tree
- name: List Packages
run: |
tree artifacts/pkg/
- name: Download Onedir Tarball as an Artifact
uses: actions/download-artifact@v4
with:
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
path: artifacts/
- name: Decompress Onedir Tarball
shell: bash
run: |
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
cd artifacts
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
- name: Set up Python ${{ inputs.python-version }}
uses: actions/setup-python@v5
with:
python-version: "${{ inputs.python-version }}"
- name: Install Nox
run: |
python3 -m pip install 'nox==${{ inputs.nox-version }}'
env:
PIP_INDEX_URL: https://pypi.org/simple
- name: Download nox.macos.${{ inputs.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
uses: actions/download-artifact@v4
with:
name: nox-macos-${{ inputs.arch }}-${{ inputs.nox-session }}
- name: Decompress .nox Directory
run: |
nox --force-color -e decompress-dependencies -- macos ${{ inputs.arch }}
- name: Show System Info
env:
SKIP_REQUIREMENTS_INSTALL: "1"
PRINT_SYSTEM_INFO_ONLY: "1"
run: |
sudo -E nox --force-color -e ${{ inputs.nox-session }}-pkgs -- ${{ matrix.tests-chunk }}
- name: Run Package Tests
env:
SKIP_REQUIREMENTS_INSTALL: "1"
PRINT_TEST_SELECTION: "0"
PRINT_TEST_PLAN_ONLY: "0"
PRINT_SYSTEM_INFO: "0"
RERUN_FAILURES: "1"
GITHUB_ACTIONS_PIPELINE: "1"
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
COVERAGE_CONTEXT: ${{ inputs.distro-slug }}
run: |
sudo -E nox --force-color -e ${{ inputs.nox-session }}-pkgs -- ${{ matrix.tests-chunk }} \
${{ matrix.version && format('--prev-version={0}', matrix.version) || ''}}
- name: Fix file ownership
run: |
sudo chown -R "$(id -un)" .
- name: Prepare Test Run Artifacts
id: download-artifacts-from-vm
if: always()
run: |
# Delete the salt onedir, we won't need it anymore and it will prevent
# it from showing in the tree command below
rm -rf artifacts/salt*
tree -a artifacts
- name: Upload Test Run Artifacts
if: always()
uses: actions/upload-artifact@v4
with:
name: pkg-testrun-artifacts-${{ inputs.distro-slug }}-${{ inputs.pkg-type }}-${{ inputs.arch }}-${{ matrix.tests-chunk }}-${{ matrix.version || 'no-version'}}-${{ env.TIMESTAMP }}
path: |
artifacts
!artifacts/pkg/*
!artifacts/salt/*
!artifacts/salt-*.tar.*
report:
name: Report
runs-on: ubuntu-latest
if: always() && fromJSON(needs.generate-matrix.outputs.build-reports) && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
needs:
- generate-matrix
- test
steps:
- name: Checkout Source Code
uses: actions/checkout@v4
- name: "Throttle Builds"
shell: bash
run: |
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
- name: Merge Test Run Artifacts
continue-on-error: true
uses: actions/upload-artifact/merge@v4
with:
name: pkg-testrun-artifacts-${{ inputs.distro-slug }}-${{ inputs.pkg-type }}
pattern: pkg-testrun-artifacts-${{ inputs.distro-slug }}-${{ inputs.pkg-type }}-*
separate-directories: true
delete-merged: true
- name: Download Test Run Artifacts
id: download-test-run-artifacts
uses: actions/download-artifact@v4
with:
name: pkg-testrun-artifacts-${{ inputs.distro-slug }}-${{ inputs.pkg-type }}
path: artifacts
- name: Show Test Run Artifacts
if: always() && steps.download-test-run-artifacts.outcome == 'success'
run: |
tree -a artifacts
- name: Set up Python ${{ inputs.python-version }}
uses: actions/setup-python@v5
with:
python-version: "${{ inputs.python-version }}"
- name: Install Nox
run: |
python3 -m pip install 'nox==${{ inputs.nox-version }}'
env:
PIP_INDEX_URL: https://pypi.org/simple


@ -1,273 +0,0 @@
name: Test Artifact
on:
workflow_call:
inputs:
distro-slug:
required: true
type: string
description: The OS slug to run tests against
platform:
required: true
type: string
description: The platform being tested
arch:
required: true
type: string
description: The platform arch being tested
pkg-type:
required: true
type: string
description: The package type being tested
salt-version:
type: string
required: true
description: The Salt version of the packages to install and test
cache-prefix:
required: true
type: string
description: Seed used to invalidate caches
testing-releases:
required: true
type: string
description: A JSON list of releases to test upgrades against
nox-version:
required: true
type: string
description: The nox version to install
python-version:
required: false
type: string
description: The python version to run tests with
default: "3.10"
fips:
required: false
type: boolean
default: false
description: Test run with FIPS enabled
package-name:
required: false
type: string
description: The onedir package name to use
default: salt
nox-session:
required: false
type: string
description: The nox session to run
default: ci-test-onedir
skip-code-coverage:
required: false
type: boolean
description: Skip code coverage
default: false
env:
COLUMNS: 190
AWS_MAX_ATTEMPTS: "10"
AWS_RETRY_MODE: "adaptive"
PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}
PIP_DISABLE_PIP_VERSION_CHECK: "1"
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
jobs:
generate-matrix:
name: Generate Matrix
runs-on:
# We need to run on our self-hosted runners because we need proper credentials
# for boto3 to scan through our repositories.
- self-hosted
- linux
- x86_64
outputs:
pkg-matrix-include: ${{ steps.generate-pkg-matrix.outputs.matrix }}
build-reports: ${{ steps.generate-pkg-matrix.outputs.build-reports }}
steps:
- name: "Throttle Builds"
shell: bash
run: |
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cache-prefix: ${{ inputs.cache-prefix }}
- name: Generate Package Test Matrix
id: generate-pkg-matrix
run: |
tools ci pkg-matrix ${{ inputs.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
${{ inputs.pkg-type }} --testing-releases ${{ join(fromJSON(inputs.testing-releases), ' ') }}
test:
name: Test
runs-on:
- self-hosted
- linux
- bastion
timeout-minutes: 120 # 2 Hours - More than this and something is wrong
needs:
- generate-matrix
strategy:
fail-fast: false
matrix:
include: ${{ fromJSON(needs.generate-matrix.outputs.pkg-matrix-include) }}
steps:
- name: "Throttle Builds"
shell: bash
run: |
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
- name: "Set `TIMESTAMP` environment variable"
shell: bash
run: |
echo "TIMESTAMP=$(date +%s)" | tee -a "$GITHUB_ENV"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Download Packages
uses: actions/download-artifact@v4
with:
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-${{ inputs.arch }}-${{ inputs.pkg-type }}
path: artifacts/pkg/
- name: Download Onedir Tarball as an Artifact
uses: actions/download-artifact@v4
with:
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
path: artifacts/
- name: Decompress Onedir Tarball
shell: bash
run: |
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
cd artifacts
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
- name: List Packages
run: |
tree artifacts/pkg/
- name: Download nox.windows.${{ inputs.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
uses: actions/download-artifact@v4
with:
name: nox-windows-${{ inputs.arch }}-${{ inputs.nox-session }}
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cache-prefix: ${{ inputs.cache-prefix }}
- name: Get Salt Project GitHub Actions Bot Environment
run: |
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
- name: Start VM
id: spin-up-vm
run: |
tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ inputs.distro-slug }}
- name: List Free Space
run: |
tools --timestamps vm ssh ${{ inputs.distro-slug }} -- df -h || true
- name: Upload Checkout To VM
run: |
tools --timestamps vm rsync ${{ inputs.distro-slug }}
- name: Decompress .nox Directory
run: |
tools --timestamps vm decompress-dependencies ${{ inputs.distro-slug }}
- name: Downgrade importlib-metadata
if: ${{ contains(fromJSON('["amazonlinux-2", "centos-7"]'), inputs.distro-slug) && contains(fromJSON('["upgrade-classic", "downgrade-classic"]'), matrix.tests-chunk) }}
run: |
# This step can go away once we stop testing classic packages upgrade/downgrades to/from 3005.x
tools --timestamps vm ssh ${{ inputs.distro-slug }} -- "sudo python3 -m pip install -U 'importlib-metadata<=4.13.0' 'virtualenv<=20.21.1'"
- name: Show System Info
run: |
tools --timestamps --timeout-secs=1800 vm test --skip-requirements-install --print-system-information-only \
--nox-session=${{ inputs.nox-session }}-pkgs ${{ inputs.distro-slug }} -- ${{ matrix.tests-chunk }}
- name: Run Package Tests
run: |
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install ${{ matrix.fips && '--fips ' || '' }}\
--nox-session=${{ inputs.nox-session }}-pkgs --rerun-failures ${{ inputs.distro-slug }} -- ${{ matrix.tests-chunk }} \
${{ matrix.version && format('--prev-version={0}', matrix.version) || ''}}
- name: Download Test Run Artifacts
id: download-artifacts-from-vm
if: always() && steps.spin-up-vm.outcome == 'success'
run: |
tools --timestamps vm download-artifacts ${{ inputs.distro-slug }}
# Delete the salt onedir, we won't need it anymore and it will prevent
# it from showing in the tree command below
rm -rf artifacts/salt*
tree -a artifacts
- name: Destroy VM
if: always()
run: |
tools --timestamps vm destroy --no-wait ${{ inputs.distro-slug }} || true
- name: Upload Test Run Artifacts
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
uses: actions/upload-artifact@v4
with:
name: pkg-testrun-artifacts-${{ inputs.distro-slug }}-${{ inputs.pkg-type }}-${{ inputs.arch }}-${{ matrix.tests-chunk }}-${{ matrix.version || 'no-version'}}-${{ env.TIMESTAMP }}
path: |
artifacts
!artifacts/pkg/*
!artifacts/salt/*
!artifacts/salt-*.tar.*
report:
name: Report
runs-on: ubuntu-latest
if: always() && fromJSON(needs.generate-matrix.outputs.build-reports) && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
needs:
- generate-matrix
- test
steps:
- name: Checkout Source Code
uses: actions/checkout@v4
- name: "Throttle Builds"
shell: bash
run: |
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
- name: Merge Test Run Artifacts
uses: actions/upload-artifact/merge@v4
continue-on-error: true
with:
name: pkg-testrun-artifacts-${{ inputs.distro-slug }}-${{ inputs.pkg-type }}
pattern: pkg-testrun-artifacts-${{ inputs.distro-slug }}-${{ inputs.pkg-type }}-*
separate-directories: true
delete-merged: true
- name: Download Test Run Artifacts
id: download-test-run-artifacts
uses: actions/download-artifact@v4
with:
name: pkg-testrun-artifacts-${{ inputs.distro-slug }}-${{ inputs.pkg-type }}
path: artifacts
- name: Show Test Run Artifacts
if: always() && steps.download-test-run-artifacts.outcome == 'success'
run: |
tree -a artifacts


@ -0,0 +1,516 @@
---
name: Test Packages
on:
workflow_call:
inputs:
salt-version:
type: string
required: true
description: The Salt version of the packages to install and test
cache-prefix:
required: true
type: string
description: Seed used to invalidate caches
testing-releases:
required: true
type: string
description: A JSON list of releases to test upgrades against
nox-version:
required: true
type: string
description: The nox version to install
python-version:
required: false
type: string
description: The python version to run tests with
default: "3.10"
nox-session:
required: false
type: string
description: The nox session to run
default: ci-test-onedir
skip-code-coverage:
required: false
type: boolean
description: Skip code coverage
default: false
package-name:
required: false
type: string
description: The onedir package name to use
default: salt
matrix:
required: true
type: string
description: Json job matrix config
linux_arm_runner:
required: true
type: string
description: The GitHub runner to use for Linux arm64 jobs
env:
COLUMNS: 190
AWS_MAX_ATTEMPTS: "10"
AWS_RETRY_MODE: "adaptive"
PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}
PIP_DISABLE_PIP_VERSION_CHECK: "1"
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
USE_S3_CACHE: 'false'
jobs:
test-linux:
name: ${{ matrix.display_name }} ${{ matrix.tests-chunk }}
runs-on: ${{ matrix.arch == 'x86_64' && 'ubuntu-24.04' || inputs.linux_arm_runner }}
if: ${{ toJSON(fromJSON(inputs.matrix)['linux']) != '[]' }}
timeout-minutes: 120 # 2 Hours - More than this and something is wrong
strategy:
fail-fast: false
matrix:
include: ${{ fromJSON(inputs.matrix)['linux'] }}
steps:
- name: "Throttle Builds"
shell: bash
run: |
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
- name: "Set `TIMESTAMP` environment variable"
shell: bash
run: |
echo "TIMESTAMP=$(date +%s)" | tee -a "$GITHUB_ENV"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Download Packages
uses: actions/download-artifact@v4
with:
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-${{ matrix.arch }}-${{ matrix.pkg_type }}
path: artifacts/pkg/
- name: Download Onedir Tarball as an Artifact
uses: actions/download-artifact@v4
with:
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ matrix.platform }}-${{ matrix.arch }}.tar.xz
path: artifacts/
- name: Decompress Onedir Tarball
shell: bash
run: |
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
cd artifacts
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ matrix.platform }}-${{ matrix.arch }}.tar.xz
- name: Set up Python ${{ inputs.python-version }}
uses: actions/setup-python@v5
with:
python-version: "${{ inputs.python-version }}"
- name: Install Nox
run: |
python3 -m pip install 'nox==${{ inputs.nox-version }}'
env:
PIP_INDEX_URL: https://pypi.org/simple
- name: List Packages
run: |
tree artifacts/pkg/
- name: Download nox.linux.${{ matrix.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
uses: actions/download-artifact@v4
with:
name: nox-linux-${{ matrix.arch }}-${{ inputs.nox-session }}
- name: "Ensure docker is running"
run: |
sudo systemctl start containerd || exit 0
- name: "Pull container ${{ matrix.container }}"
run: |
docker pull ${{ matrix.container }}
- name: "Create container ${{ matrix.container }}"
run: |
/usr/bin/docker create --name ${{ github.run_id }}_salt-test-pkg --workdir /__w/salt/salt --privileged -e "HOME=/github/home" -e GITHUB_ACTIONS=true -e CI=true -v "/var/run/docker.sock":"/var/run/docker.sock" -v "/home/runner/work":"/__w" -v "/home/runner/work/_temp":"/__w/_temp" -v "/home/runner/work/_actions":"/__w/_actions" -v "/opt/hostedtoolcache":"/__t" -v "/home/runner/work/_temp/_github_home":"/github/home" -v "/home/runner/work/_temp/_github_workflow":"/github/workflow" --entrypoint "/usr/lib/systemd/systemd" ${{ matrix.container }} --systemd --unit rescue.target
- name: "Start container ${{ matrix.container }}"
run: |
/usr/bin/docker start ${{ github.run_id }}_salt-test-pkg
- name: Decompress .nox Directory
run: |
docker exec ${{ github.run_id}}_salt-test-pkg python3 -m nox --force-color -e decompress-dependencies -- linux ${{ matrix.arch }}
- name: Setup Python Tools Scripts
uses: ./.github/actions/setup-python-tools-scripts
with:
cache-prefix: ${{ inputs.cache-prefix }}
- name: List Free Space
run: |
df -h || true
- name: Show System Info
env:
SKIP_REQUIREMENTS_INSTALL: "1"
PRINT_SYSTEM_INFO_ONLY: "1"
run: |
docker exec ${{ github.run_id }}_salt-test-pkg python3 -m nox --force-color -e ${{ inputs.nox-session }}-pkgs -- ${{ matrix.tests-chunk }}
- name: Run Package Tests
env:
SKIP_REQUIREMENTS_INSTALL: "1"
RERUN_FAILURES: "1"
GITHUB_ACTIONS_PIPELINE: "1"
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
COVERAGE_CONTEXT: ${{ matrix.slug }}
run: |
/usr/bin/docker exec ${{ github.run_id }}_salt-test-pkg \
python3 -m nox --force-color -e ${{ inputs.nox-session }}-pkgs -- ${{ matrix.tests-chunk }} \
${{ matrix.version && format('--prev-version={0}', matrix.version) || ''}}
- name: Upload Test Run Log Artifacts
if: always()
uses: actions/upload-artifact@v4
with:
name: pkg-testrun-log-artifacts-${{ matrix.slug }}-${{ inputs.nox-session }}${{ matrix.fips && '-fips' || '' }}-${{ matrix.pkg_type }}-${{ matrix.arch }}-${{ matrix.tests-chunk }}-${{ matrix.version || 'no-version'}}-${{ env.TIMESTAMP }}
path: |
artifacts/logs
include-hidden-files: true
- name: Upload Test Run Artifacts
if: always()
uses: actions/upload-artifact@v4
with:
name: pkg-testrun-artifacts-${{ matrix.slug }}${{ matrix.fips && '-fips' || '' }}-${{ matrix.pkg_type }}-${{ matrix.arch }}-${{ matrix.tests-chunk }}-${{ matrix.version || 'no-version'}}-${{ env.TIMESTAMP }}
path: |
artifacts/
!artifacts/pkg/*
!artifacts/salt/*
!artifacts/salt-*.tar.*
include-hidden-files: true
test-macos:
name: ${{ matrix.display_name }} ${{ matrix.tests-chunk }}
runs-on: ${{ matrix.runner }}
if: ${{ toJSON(fromJSON(inputs.matrix)['macos']) != '[]' }}
timeout-minutes: 150 # 2 & 1/2 Hours - More than this and something is wrong (MacOS needs a little more time)
strategy:
fail-fast: false
matrix:
include: ${{ fromJSON(inputs.matrix)['macos'] }}
steps:
- name: "Throttle Builds"
shell: bash
run: |
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
- name: "Set `TIMESTAMP` environment variable"
shell: bash
run: |
echo "TIMESTAMP=$(date +%s)" | tee -a "$GITHUB_ENV"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Download Packages
uses: actions/download-artifact@v4
with:
name: salt-${{ inputs.salt-version }}-${{ matrix.arch }}-macos
path: artifacts/pkg/
- name: Install System Dependencies
run: |
brew install tree
- name: List Packages
run: |
tree artifacts/pkg/
- name: Download Onedir Tarball as an Artifact
uses: actions/download-artifact@v4
with:
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ matrix.platform }}-${{ matrix.arch }}.tar.xz
path: artifacts/
- name: Decompress Onedir Tarball
shell: bash
run: |
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
cd artifacts
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ matrix.platform }}-${{ matrix.arch }}.tar.xz
- name: Set up Python ${{ inputs.python-version }}
uses: actions/setup-python@v5
with:
python-version: "${{ inputs.python-version }}"
- name: Install Nox
run: |
python3 -m pip install 'nox==${{ inputs.nox-version }}'
env:
PIP_INDEX_URL: https://pypi.org/simple
- name: Download nox.macos.${{ matrix.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
uses: actions/download-artifact@v4
with:
name: nox-macos-${{ matrix.arch }}-${{ inputs.nox-session }}
- name: Decompress .nox Directory
run: |
nox --force-color -e decompress-dependencies -- macos ${{ matrix.arch }}
- name: Show System Info
env:
SKIP_REQUIREMENTS_INSTALL: "1"
PRINT_SYSTEM_INFO_ONLY: "1"
run: |
sudo -E nox --force-color -e ${{ inputs.nox-session }}-pkgs -- ${{ matrix.tests-chunk }}
- name: Run Package Tests
env:
SKIP_REQUIREMENTS_INSTALL: "1"
PRINT_TEST_SELECTION: "0"
PRINT_TEST_PLAN_ONLY: "0"
PRINT_SYSTEM_INFO: "0"
RERUN_FAILURES: "1"
GITHUB_ACTIONS_PIPELINE: "1"
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
COVERAGE_CONTEXT: ${{ matrix.slug }}
run: |
sudo -E nox --force-color -e ${{ inputs.nox-session }}-pkgs -- ${{ matrix.tests-chunk }} \
${{ matrix.version && format('--prev-version={0}', matrix.version) || ''}}
- name: Fix file ownership
run: |
sudo chown -R "$(id -un)" .
- name: Prepare Test Run Artifacts
id: download-artifacts-from-vm
if: always()
run: |
# Delete the salt onedir, we won't need it anymore and it will prevent
# it from showing up in the tree command below
rm -rf artifacts/salt*
tree -a artifacts
- name: Upload Test Run Log Artifacts
if: always()
uses: actions/upload-artifact@v4
with:
name: pkg-testrun-log-artifacts-${{ matrix.slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}-${{ matrix.tests-chunk }}-${{ matrix.version || 'no-version'}}-${{ env.TIMESTAMP }}
path: |
artifacts/logs
include-hidden-files: true
- name: Upload Test Run Artifacts
if: always()
uses: actions/upload-artifact@v4
with:
name: pkg-testrun-artifacts-${{ matrix.slug }}-${{ matrix.pkg_type }}-${{ matrix.arch }}-${{ matrix.tests-chunk }}-${{ matrix.version || 'no-version'}}-${{ env.TIMESTAMP }}
path: |
artifacts/
!artifacts/pkg/*
!artifacts/salt/*
!artifacts/salt-*.tar.*
include-hidden-files: true
test-windows:
name: ${{ matrix.display_name }} ${{ matrix.tests-chunk }}
runs-on: ${{ matrix.slug }}
timeout-minutes: 120 # 2 Hours - More than this and something is wrong
if: ${{ toJSON(fromJSON(inputs.matrix)['windows']) != '[]' }}
strategy:
fail-fast: false
matrix:
include: ${{ fromJSON(inputs.matrix)['windows'] }}
steps:
- name: Set up Python ${{ inputs.python-version }}
uses: actions/setup-python@v5
with:
python-version: "${{ inputs.python-version }}"
- name: "Throttle Builds"
shell: bash
run: |
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
- name: "Set `TIMESTAMP` environment variable"
shell: bash
run: |
echo "TIMESTAMP=$(date +%s)" | tee -a "$GITHUB_ENV"
- name: Checkout Source Code
uses: actions/checkout@v4
- name: Download Packages
uses: actions/download-artifact@v4
with:
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-${{ matrix.arch }}-${{ matrix.pkg_type }}
path: ./artifacts/pkg/
- name: Download Onedir Tarball as an Artifact
uses: actions/download-artifact@v4
with:
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ matrix.platform }}-${{ matrix.arch }}.tar.xz
path: ./artifacts/
- name: Decompress Onedir Tarball
shell: bash
run: |
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
cd artifacts
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ matrix.platform }}-${{ matrix.arch }}.tar.xz
- name: Install Nox
run: |
python3 -m pip install 'nox==${{ inputs.nox-version }}'
env:
PIP_INDEX_URL: https://pypi.org/simple
- run: python3 --version
- name: Download nox.windows.${{ matrix.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
uses: actions/download-artifact@v4
with:
name: nox-windows-${{ matrix.arch }}-${{ inputs.nox-session }}
- name: Decompress .nox Directory
run: |
nox --force-color -e decompress-dependencies -- windows ${{ matrix.arch }}
- name: List Important Directories
run: |
dir d:/
dir .
dir artifacts/
dir artifacts/pkg
dir .nox/ci-test-onedir/Scripts
- name: Check onedir python
continue-on-error: true
run: |
artifacts/salt/Scripts/python.exe --version
- name: Check nox python
continue-on-error: true
run: |
.nox/ci-test-onedir/Scripts/python.exe --version
- name: Show System Info
env:
SKIP_REQUIREMENTS_INSTALL: "1"
SKIP_CODE_COVERAGE: "1"
PRINT_SYSTEM_INFO_ONLY: "1"
PYTHONUTF8: "1"
run: |
nox --force-color -f noxfile.py -e "${{ inputs.nox-session }}-pkgs" -- '${{ matrix.tests-chunk }}' --log-cli-level=debug
- name: Run Package Tests
env:
SKIP_REQUIREMENTS_INSTALL: "1"
PRINT_TEST_SELECTION: "0"
PRINT_TEST_PLAN_ONLY: "0"
PRINT_SYSTEM_INFO: "0"
RERUN_FAILURES: "1"
GITHUB_ACTIONS_PIPELINE: "1"
SKIP_INITIAL_ONEDIR_FAILURES: "1"
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
COVERAGE_CONTEXT: ${{ matrix.slug }}
OUTPUT_COLUMNS: "190"
PYTHONUTF8: "1"
run: >
nox --force-color -f noxfile.py -e ${{ inputs.nox-session }}-pkgs -- ${{ matrix.tests-chunk }}
${{ matrix.version && format('--prev-version={0}', matrix.version) || ''}}
- name: Prepare Test Run Artifacts
id: download-artifacts-from-vm
if: always()
shell: bash
run: |
# Delete the salt onedir, we won't need it anymore and it will prevent
# it from showing up in the tree command below
rm -rf artifacts/salt*
if [ "${{ inputs.skip-code-coverage }}" != "true" ]; then
mv artifacts/coverage/.coverage artifacts/coverage/.coverage.${{ matrix.slug }}.${{ inputs.nox-session }}.${{ matrix.transport }}.${{ matrix.tests-chunk }}
fi
- name: Upload Test Run Log Artifacts
if: always()
uses: actions/upload-artifact@v4
with:
name: pkg-testrun-log-artifacts-${{ matrix.slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}-${{ matrix.tests-chunk }}-${{ matrix.version || 'no-version'}}-${{ env.TIMESTAMP }}
path: |
artifacts/logs
include-hidden-files: true
- name: Upload Test Run Artifacts
if: always()
uses: actions/upload-artifact@v4
with:
name: pkg-testrun-artifacts-${{ matrix.slug }}-${{ matrix.pkg_type }}-${{ matrix.arch }}-${{ matrix.tests-chunk }}-${{ matrix.version || 'no-version'}}-${{ env.TIMESTAMP }}
path: |
artifacts/
!artifacts/pkg/*
!artifacts/salt/*
!artifacts/salt-*.tar.*
include-hidden-files: true
report:
name: Report
runs-on: ubuntu-22.04
if: ${{ false }}
needs:
- test-linux
- test-macos
- test-windows
strategy:
matrix:
include: ${{ fromJSON(inputs.matrix)['linux'] }}
steps:
- name: Checkout Source Code
uses: actions/checkout@v4
- name: "Throttle Builds"
shell: bash
run: |
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
- name: Wait For Artifacts
run: |
sleep 60
- name: Merge Test Run Artifacts
continue-on-error: true
uses: actions/upload-artifact/merge@v4
with:
name: pkg-testrun-artifacts-${{ matrix.slug }}${{ matrix.fips && '-fips' || '' }}-${{ matrix.pkg_type }}
pattern: pkg-testrun-artifacts-${{ matrix.slug }}${{ matrix.fips && '-fips' || '' }}-${{ matrix.pkg_type }}-*
separate-directories: true
delete-merged: true
- name: Wait For Artifacts 2
run: |
sleep 60
- name: Download Test Run Artifacts
id: download-test-run-artifacts
uses: actions/download-artifact@v4
with:
path: artifacts/
pattern: pkg-testrun-artifacts-${{ matrix.slug }}${{ matrix.fips && '-fips' || '' }}-${{ matrix.pkg_type }}*
merge-multiple: true
- name: Show Test Run Artifacts
if: always()
run: |
tree -a artifacts

View file

@ -13,7 +13,7 @@ on:
- 3007.x
- 3006.x
types:
- completed
- completed
permissions:
contents: read

1
.gitignore vendored
View file

@ -90,6 +90,7 @@ tests/unit/templates/roots
# Pycharm
.idea
venv/
.venv/
# VS Code
.vscode

View file

@ -59,7 +59,7 @@ repos:
- id: tools
alias: generate-workflows
name: Generate GitHub Workflow Templates
files: ^(cicd/shared-gh-workflows-context\.yml|tools/precommit/workflows\.py|.github/workflows/.*)$
files: ^(cicd/shared-gh-workflows-context\.yml|tools/utils/__init__.py|tools/precommit/workflows\.py|.github/workflows/.*)$
pass_filenames: false
args:
- pre-commit

112
AUTHORS
View file

@ -8,114 +8,28 @@ Whos Who in Salt
The Man With the Plan
----------------------------
Thomas S. Hatch is the main developer of Salt. He is the founder, owner,
maintainer and lead of the Salt project, as well as author of the majority
of the Salt code and documentation.
Thomas S. Hatch is the creator of Salt. He was the founder, owner, and
maintainer who led the Salt project, as well as the author of the majority
of the initial Salt code and documentation.
SaltStack, Inc. was acquired by VMware in 2020. In 2023, VMware was
acquired by Broadcom.
The Salt Project core team of developers are employed by Broadcom.
Documentation System
----------------------------
The documentation system was put together by Seth House, much of the
documentation is being maintained by Seth.
Developers
----------------------------
Aaron Bull Schaefer <aaron@elasticdog.com>
Aaron Toponce <aaron.toponce@gmail.com>
Andrew Hammond <andrew.george.hammond@gmail.com>
Aditya Kulkarni <adi@saltstack.com>
Alexander Pyatkin <asp@thexyz.net>
Andre Sachs <andre@sachs.nom.za>
Andrew Colin Kissa <andrew@topdog.za.net>
Andrew Kuhnhausen <trane@errstr.com>
Antti Kaihola <akaihol+github@ambitone.com>
archme <archme.mail@gmail.com>
Brad Barden <brad@mifflinet.net>
Bret Palsson <bretep@gmail.com>
Brian Wagner <wags@wagsworld.net>
C. R. Oldham <cr@saltstack.com>
Carl Loa Odin <carlodin@gmail.com>
Carlo Pires <carlopires@gmail.com>
Chris Rebert <chris.rebert@hulu.com>
Chris Scheller <schelcj@umich.edu>
Christer Edwards <christer.edwards@gmail.com>
Clint Savage <herlo1@gmail.com>
Colton Myers <cmyers@saltstack.com>
Corey Quinn <corey@sequestered.net>
Corin Kochenower <ckochenower@saltstack.com>
Dan Garthwaite <dan@garthwaite.org>
Daniel Wallace <danielwallace at gtmanfred dot com>
David Boucha <boucha@gmail.com>
David Pravec <alekibango@pravec.tk>
deutsche
Dmitry Kuzmenko <dkuzmenko@saltstack.com>
Doug Renn <renn@nestegg.com>
Eivind Uggedal <eivind@uggedal.com>
epoelke@gmail.com <epoelke@heartflow.com>
Eric Poelke <epoelke@gmail.com>
Erik Nolte <enolte@beyondoblivion.com>
Evan Borgstrom <evan@fatbox.ca>
Forrest Alvarez <forrest.alvarez@gmail.com>
Fred Reimer <freimer@freimer.org>
Henrik Holmboe <henrik@holmboe.se>
Gareth J. Greenaway <gareth@wiked.org>
Jacob Albretsen <jakea@xmission.com>
Jed Glazner <jglazner@coldcrow.com>
Jeff Bauer <jbauer@rubic.com>
Jeff Hutchins <jhutchins@getjive.com>
Jeffrey C. Ollie <jeff@ocjtech.us>
Jeff Schroeder <jeffschroeder@computer.org>
Johnny Bergström
Jonas Buckner <buckner.jonas@gmail.com>
Jonathan Harker <k.jonathan.harker@hp.com>
Joseph Hall <joseph@saltstack.com>
Josmar Dias <josmarnet@gmail.com>
Kent Tenney <ktenney@gmail.com>
lexual
Marat Shakirov
Marc Abramowitz <marc+github@marc-abramowitz.com>
Martin Schnabel <mb0@mb0.org>
Mathieu Le Marec - Pasquet <kiorky@cryptelium.net>
Matt Black
Matthew Printz <hipokrit@gmail.com>
Matthias Teege <matthias-git@mteege.de>
Maxim Burgerhout <maxim@wzzrd.com>
Mickey Malone <mickey.malone@gmail.com>
Michael Steed <msteed@saltstack.com>
Mike Place <mp@saltstack.com>
Mircea Ulinic <ping@mirceaulinic.net>
Mitch Anderson <mitch@metauser.net>
Mostafa Hussein <mostafa.hussein91@gmail.com>
Nathaniel Whiteinge <seth@eseth.com>
Nicolas Delaby <nicolas.delaby@ezeep.com>
Nicole Thomas <nicole@saltstack.com>
Nigel Owen <nigelowen2.gmail.com>
Nitin Madhok <nmadhok@g.clemson.edu>
Oleg Anashkin <oleg.anashkin@gmail.com>
Pedro Algarvio <pedro@algarvio.me>
Peter Baumgartner
Pierre Carrier <pierre@spotify.com>
Rhys Elsmore <me@rhys.io>
Rafael Caricio <rafael@caricio.com>
Robert Fielding
Sean Channel <pentabular@gmail.com>
Seth House <seth@eseth.com>
Seth Vidal <skvidal@fedoraproject.org>
Stas Alekseev <stas.alekseev@gmail.com>
Thibault Cohen <titilambert@gmail.com>
Thomas Schreiber <tom@rizumu.us>
Thomas S Hatch <thatch45@gmail.com>
Tor Hveem <xt@bash.no>
Travis Cline <travis.cline@gmail.com>
Wieland Hoffmann <themineo+github@gmail.com>
The initial documentation system was put together by Seth House.
Documentation is now primarily maintained by the Salt Project core team and
community members.
Growing Community
--------------------------------
Salt is a rapidly growing project with a large community, to view all
contributors please check Github, this file can sometimes be out of date:
Salt is a rapidly growing project with a large community, and has had more than
2,400 contributors over the years. To view all contributors, please check GitHub:
https://github.com/saltstack/salt/graphs/contributors

View file

@ -13,10 +13,23 @@ Versions are `MAJOR.PATCH`.
### Removed
- The ``salt.utils.psutil_compat`` was deprecated and now removed in Salt 3008. Please use the ``psutil`` module directly. [#66160](https://github.com/saltstack/salt/issues/66160)
## 3006.9 (2024-07-29)
### Deprecated
- Drop CentOS 7 support [#66623](https://github.com/saltstack/salt/issues/66623)
- No longer build RPM packages with CentOS Stream 9 [#66624](https://github.com/saltstack/salt/issues/66624)
### Fixed
- Made slsutil.renderer work with salt-ssh [#50196](https://github.com/saltstack/salt/issues/50196)
- Fixed defaults.merge is not available when using salt-ssh [#51605](https://github.com/saltstack/salt/issues/51605)
- Fixed config.get does not support merge option with salt-ssh [#56441](https://github.com/saltstack/salt/issues/56441)
- Update to include croniter in pkg requirements [#57649](https://github.com/saltstack/salt/issues/57649)
- Fixed state.test does not work with salt-ssh [#61100](https://github.com/saltstack/salt/issues/61100)
- Made slsutil.findup work with salt-ssh [#61143](https://github.com/saltstack/salt/issues/61143)
- Fixes multiple issues with the cmd module on Windows. Scripts are called using
the ``-File`` parameter to the ``powershell.exe`` binary. ``CLIXML`` data in
stderr is now removed (only applies to encoded commands). Commands can now be
@ -35,6 +48,38 @@ Versions are `MAJOR.PATCH`.
- Change log level of successful master cluster key exchange from error to info. [#66266](https://github.com/saltstack/salt/issues/66266)
- Made `file.managed` skip download of a remote source if the managed file already exists with the correct hash [#66342](https://github.com/saltstack/salt/issues/66342)
- Fixed nftables.build_rule breaks ipv6 rules by using the wrong syntax for source and destination addresses [#66382](https://github.com/saltstack/salt/issues/66382)
- file.replace and file.search work properly with /proc files [#63102](https://github.com/saltstack/salt/issues/63102)
- Fix utf8 handling in 'pass' renderer [#64300](https://github.com/saltstack/salt/issues/64300)
- Fixed incorrect version argument will be ignored for multiple package targets warning when using pkgs argument to yumpkg module. [#64563](https://github.com/saltstack/salt/issues/64563)
- salt-cloud honors root_dir config setting for log_file location and fixes for root_dir locations on windows. [#64728](https://github.com/saltstack/salt/issues/64728)
- Fixed slsutil.update with salt-ssh during template rendering [#65067](https://github.com/saltstack/salt/issues/65067)
- Fix config.items when called on minion [#65251](https://github.com/saltstack/salt/issues/65251)
- Ensure on rpm and deb systems, that user and group for existing Salt, is maintained on upgrade [#65264](https://github.com/saltstack/salt/issues/65264)
- Fix typo in nftables module to ensure unique nft family values [#65295](https://github.com/saltstack/salt/issues/65295)
- pkg.installed state aggregate does not honor the requires requisite [#65304](https://github.com/saltstack/salt/issues/65304)
- Added SSH wrapper for logmod [#65630](https://github.com/saltstack/salt/issues/65630)
- Fix for GitFS failure to unlock lock file, and resource cleanup for process SIGTERM [#65816](https://github.com/saltstack/salt/issues/65816)
- Corrected x509_v2 CRL creation `last_update` and `next_update` values when system timezone is not UTC [#65837](https://github.com/saltstack/salt/issues/65837)
- Make sure the root minion process handles SIGUSR1 and emits a traceback like its child processes [#66095](https://github.com/saltstack/salt/issues/66095)
- Replaced pyvenv with builtin venv for virtualenv_mod [#66132](https://github.com/saltstack/salt/issues/66132)
- Made `file.managed` skip download of a remote source if the managed file already exists with the correct hash [#66342](https://github.com/saltstack/salt/issues/66342)
- Fix win_task ExecutionTimeLimit and result/error code interpretation [#66347](https://github.com/saltstack/salt/issues/66347), [#66441](https://github.com/saltstack/salt/issues/66441)
- Fixed nftables.build_rule breaks ipv6 rules by using the wrong syntax for source and destination addresses [#66382](https://github.com/saltstack/salt/issues/66382)
- Fixed x509_v2 certificate.managed crash for locally signed certificates if the signing policy defines signing_private_key [#66414](https://github.com/saltstack/salt/issues/66414)
- Fixed parallel state execution with Salt-SSH [#66514](https://github.com/saltstack/salt/issues/66514)
- Fix support for FIPS approved encryption and signing algorithms. [#66579](https://github.com/saltstack/salt/issues/66579)
- Fix relative file_roots paths [#66588](https://github.com/saltstack/salt/issues/66588)
- Fixed an issue with cmd.run with requirements when the shell is not the
default [#66596](https://github.com/saltstack/salt/issues/66596)
- Fix RPM package provides [#66604](https://github.com/saltstack/salt/issues/66604)
- Upgrade relenv to 0.16.1. This release fixes several package installs for salt-pip [#66632](https://github.com/saltstack/salt/issues/66632)
- Upgrade relenv to 0.17.0 (https://github.com/saltstack/relenv/blob/v0.17.0/CHANGELOG.md) [#66663](https://github.com/saltstack/salt/issues/66663)
- Upgrade dependencies due to security issues:
- pymysql>=1.1.1
- requests>=2.32.0
- docker>=7.1.0 [#66666](https://github.com/saltstack/salt/issues/66666)
- Corrected missed line in branch 3006.x when backporting from PR 61620 and 65044 [#66683](https://github.com/saltstack/salt/issues/66683)
- Remove debug output from shell scripts for packaging [#66747](https://github.com/saltstack/salt/issues/66747)
### Added
@ -44,12 +89,18 @@ Versions are `MAJOR.PATCH`.
unbootstrap chocolatey. [#64722](https://github.com/saltstack/salt/issues/64722)
- Add Ubuntu 24.04 support [#66180](https://github.com/saltstack/salt/issues/66180)
- Add Fedora 40 support, replacing Fedora 39 [#66300](https://github.com/saltstack/salt/issues/66300)
- Build RPM packages with Rocky Linux 9 (instead of CentOS Stream 9) [#66624](https://github.com/saltstack/salt/issues/66624)
### Security
- Bump to `pydantic==2.6.4` due to https://github.com/advisories/GHSA-mr82-8j83-vxmv [#66433](https://github.com/saltstack/salt/issues/66433)
- Bump to ``jinja2==3.1.4`` due to https://github.com/advisories/GHSA-h75v-3vvj-5mfj [#66488](https://github.com/saltstack/salt/issues/66488)
- CVE-2024-37088 salt-call will fail with exit code 1 if bad pillar data is
encountered. [#66702](https://github.com/saltstack/salt/issues/66702)
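The CVE-2024-37088 entry above changes `salt-call`'s exit status when bad pillar
data is encountered. A minimal sketch of checking that behavior (`mystate` is a
placeholder state name, and this assumes a minion whose pillar fails to render):

```bash
# Sketch: from 3006.9 on, salt-call exits with code 1 when bad pillar data is encountered.
salt-call state.apply mystate
echo "salt-call exit code: $?"
```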
## 3006.8 (2024-04-29)

View file

@ -60,7 +60,7 @@ representative at an online or offline event.
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at
conduct@saltstack.com.
saltproject.pdl@broadcom.com.
All complaints will be reviewed and investigated promptly and fairly.
All community leaders are obligated to respect the privacy and security of the

View file

@ -1,30 +1,56 @@
============
Contributing
============
==============================================
Contributing to Salt: A Guide for Contributors
==============================================
So you want to contribute to the Salt project? Excellent! You can help
in a number of ways:
So, you want to contribute to the Salt project? That's fantastic! There are many
ways you can help improve Salt:
- Use Salt and open well-written bug reports.
- Join a `working group <https://github.com/saltstack/community>`__.
- Answer questions on `irc <https://web.libera.chat/#salt>`__,
the `community Slack <https://via.vmw.com/salt-slack>`__,
the `salt-users mailing
list <https://groups.google.com/forum/#!forum/salt-users>`__,
`Server Fault <https://serverfault.com/questions/tagged/saltstack>`__,
or `r/saltstack on Reddit <https://www.reddit.com/r/saltstack/>`__.
- Fix bugs.
- `Improve the documentation <https://saltstack.gitlab.io/open/docs/docs-hub/topics/contributing.html>`__.
- Provide workarounds, patches, or other code without tests.
- Tell other people about problems you solved using Salt.
- Use Salt and report bugs with clear, detailed descriptions.
- Join a `working group <https://github.com/saltstack/community>`__ to
collaborate with other contributors.
- Answer questions on platforms like
the `community Discord <https://discord.com/invite/J7b7EscrAs>`__,
the `salt-users mailing list <https://groups.google.com/forum/#!forum/salt-users>`__,
`Server Fault <https://serverfault.com/questions/tagged/saltstack>`__,
or `r/saltstack on Reddit <https://www.reddit.com/r/saltstack/>`__.
- Fix bugs or contribute to the `documentation <https://saltstack.gitlab.io/open/docs/docs-hub/topics/contributing.html>`__.
- Submit workarounds, patches, or code (even without tests).
- Share your experiences and solutions to problems you've solved using Salt.
If you'd like to update docs or fix an issue, you're going to need the
Salt repo. The best way to contribute is using
`Git <https://git-scm.com/>`__.
Choosing the Right Branch for Your Pull Request
===============================================
We appreciate your contributions to the project! To ensure a smooth and
efficient workflow, please follow these guidelines when submitting a Pull
Request. Each type of contribution—whether it's fixing a bug, adding a feature,
updating documentation, or fixing tests—should be targeted at the appropriate
branch. This helps us manage changes effectively and maintain stability across
versions.
- **Bug Fixes:**
Create your Pull Request against the oldest supported branch where the bug
exists. This ensures that the fix can be applied to all relevant versions.
- **New Features**:
For new features or enhancements, create your Pull Request against the master
branch.
- **Documentation Updates:**
Documentation changes should be made against the master branch, unless they
are related to a bug fix, in which case they should follow the same branch as
the bug fix.
- **Test Fixes:**
Pull Requests that fix broken or failing tests should be created against the
oldest supported branch where the issue occurs.
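As a minimal sketch of preparing a bug fix against the oldest supported branch
(the fork URL, the ``upstream`` remote name, and the ``3006.x`` branch are
illustrative assumptions; use whichever supported branch matches your change):

.. code-block:: bash

    # Sketch: preparing a bug-fix pull request against the oldest supported branch.
    git clone https://github.com/<your-username>/salt.git
    cd salt
    git remote add upstream https://github.com/saltstack/salt.git
    git fetch upstream

    # Base the fix on the oldest supported branch where the bug exists.
    git checkout -b my-bug-fix upstream/3006.x

    # ...make your changes, add a changelog entry, and commit...
    git push origin my-bug-fix
    # Then open the pull request against 3006.x on GitHub.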
Setting Up Your Salt Development Environment
============================================
Environment setup
=================
To hack on Salt or the docs you're going to need to set up your
development environment. If you already have a workflow that you're
comfortable with, you can use that, but otherwise this is an opinionated
@ -109,7 +135,7 @@ Then activate it:
Sweet! Now you're ready to clone Salt so you can start hacking away! If
you get stuck at any point, check out the resources at the beginning of
this guide. IRC and Slack are particularly helpful places to go.
this guide. Discord and GitHub Discussions are particularly helpful places to go.
Get the source!
@ -605,7 +631,7 @@ your PR is submitted during the week you should be able to expect some
kind of communication within that business day. If your tests are
passing and we're not in a code freeze, ideally your code will be merged
that week or month. If you haven't heard from your assigned reviewer, ping them
on GitHub, `irc <https://web.libera.chat/#salt>`__, or Community Slack.
on GitHub or `Community Discord <https://discord.com/invite/J7b7EscrAs>`__.
It's likely that your reviewer will leave some comments that need
addressing - it may be a style change, or you forgot a changelog entry,

View file

@ -6,17 +6,9 @@
:alt: PyPi Package Downloads
:target: https://pypi.org/project/salt
.. image:: https://img.shields.io/lgtm/grade/python/github/saltstack/salt
:alt: PyPi Package Downloads
:target: https://lgtm.com/projects/g/saltstack/salt/context:python
.. image:: https://img.shields.io/badge/slack-SaltProject-blue.svg?logo=slack
:alt: Salt Project Slack Community
:target: https://via.vmw.com/salt-slack
.. image:: https://img.shields.io/twitch/status/saltprojectoss
:alt: Salt Project Twitch Channel
:target: https://www.twitch.tv/saltprojectoss
.. image:: https://img.shields.io/badge/discord-SaltProject-blue.svg?logo=discord
:alt: Salt Project Discord Community
:target: https://discord.com/invite/J7b7EscrAs
.. image:: https://img.shields.io/reddit/subreddit-subscribers/saltstack?style=social
:alt: Salt Project subreddit
@ -71,20 +63,21 @@ In addition to configuration management Salt can also:
About our sponsors
==================
Salt powers VMware's `VMware Aria Automation Config`_
(previously vRealize Automation SaltStack Config / SaltStack Enterprise), and can be found
Salt powers VMware by Broadcom's `Tanzu Salt`_
(previously Aria Automation Config / vRealize Automation SaltStack Config / SaltStack Enterprise), and can be found
under the hood of products from Juniper, Cisco, Cloudflare, Nutanix, SUSE, and
Tieto, to name a few.
The original sponsor of our community, SaltStack, was `acquired by VMware in 2020 <https://www.vmware.com/company/acquisitions/saltstack.html>`_.
The Salt Project remains an open source ecosystem that VMware supports and
contributes to. VMware ensures the code integrity and quality of the Salt
The original sponsor of our community, SaltStack, was acquired by VMware in 2020.
`VMware was later acquired by Broadcom in 2023 <https://investors.broadcom.com/news-releases/news-release-details/broadcom-completes-acquisition-vmware>`__.
The Salt Project remains an open source ecosystem that Broadcom supports and
contributes to. Broadcom ensures the code integrity and quality of the Salt
modules by acting as the official sponsor and manager of the Salt project. Many
of the core Salt Project contributors are also VMware employees. This team
of the core Salt Project contributors are also Broadcom employees. This team
carefully reviews and enhances the Salt modules to ensure speed, quality, and
security.
Download and install Salt
=========================
Salt is tested and packaged to run on CentOS, Debian, RHEL, Ubuntu, MacOS,
@ -93,9 +86,11 @@ Windows, and more. Download Salt and get started now. See
for more information.
To download and install Salt, see:
* `The Salt install guide <https://docs.saltproject.io/salt/install-guide/en/latest/index.html>`_
* `Salt Project repository <https://repo.saltproject.io/>`_
* `The Salt install guide <https://docs.saltproject.io/salt/install-guide/en/latest/index.html>`_
* `Salt Project Repository: Linux (RPM) <https://packages.broadcom.com/artifactory/saltproject-rpm>`__ - Where Salt ``rpm`` packages are officially stored and distributed.
* `Salt Project Repository: Linux (DEB) <https://packages.broadcom.com/artifactory/saltproject-deb>`__ - Where Salt ``deb`` packages are officially stored and distributed.
* `Salt Project Repository: GENERIC <https://packages.broadcom.com/artifactory/saltproject-generic>`__ - Where Salt Windows, macOS, etc. (non-rpm, non-deb) packages are officially stored and distributed.
Technical support
=================
@ -103,7 +98,8 @@ Report bugs or problems using Salt by opening an issue: `<https://github.com/sal
To join our community forum where you can exchange ideas, best practices,
discuss technical support questions, and talk to project maintainers, join our
Slack workspace: `Salt Project Community Slack`_
Discord server: `Salt Project Community Discord`_
Salt Project documentation
@ -127,7 +123,7 @@ announcements.
Other channels to receive security announcements include the
`Salt Community mailing list <https://groups.google.com/forum/#!forum/salt-users>`_
and the `Salt Project Community Slack`_.
and the `Salt Project Community Discord`_.
Responsibly reporting security vulnerabilities
@ -152,11 +148,9 @@ Please be sure to review our
`Code of Conduct <https://github.com/saltstack/salt/blob/master/CODE_OF_CONDUCT.md>`_.
Also, check out some of our community resources including:
* `Salt Project Community Wiki <https://github.com/saltstack/community/wiki>`_
* `Salt Project Community Slack`_
* `Salt Project: IRC on LiberaChat <https://web.libera.chat/#salt>`_
* `Salt Project Community Discord`_
* `Salt Project YouTube channel <https://www.youtube.com/channel/UCpveTIucFx9ljGelW63-BWg>`_
* `Salt Project Twitch channel <https://www.twitch.tv/saltprojectoss>`_
* `Salt Project Community Notes and Wiki <https://github.com/saltstack/community/>`_
There are lots of ways to get involved in our community. Every month, there are
around a dozen opportunities to meet with other contributors and the Salt Core
@ -164,10 +158,8 @@ team and collaborate in real time. The best way to keep track is by subscribing
to the **Salt Project Community Events Calendar** on the main
`<https://saltproject.io>`_ website.
If you have additional questions, email us at saltproject@vmware.com or reach out
directly to the Community Manager, Jimmy Chunga via Slack. We'd be glad to
have you join our community!
If you have additional questions, email us at saltproject.pdl@broadcom.com or reach
out on the Community Discord. We'd be glad to have you join our community!
License
=======
@ -180,10 +172,8 @@ used by external modules.
A complete list of attributions and dependencies can be found here:
`salt/DEPENDENCIES.md <https://github.com/saltstack/salt/blob/master/DEPENDENCIES.md>`_
.. _Salt Project Community Slack: https://via.vmw.com/salt-slack
.. _VMware Aria Automation Config: https://www.vmware.com/products/vrealize-automation/saltstack-config.html
.. _Salt Project Community Discord: https://discord.com/invite/J7b7EscrAs
.. _Tanzu Salt: https://www.vmware.com/products/app-platform/tanzu-salt
.. _Latest Salt Documentation: https://docs.saltproject.io/en/latest/
.. _Open an issue: https://github.com/saltstack/salt/issues/new/choose
.. _SECURITY.md: https://github.com/saltstack/salt/blob/master/SECURITY.md
.. _Calendar html: https://outlook.office365.com/owa/calendar/105f69bacd4541baa849529aed37eb2d@vmware.com/434ec2155b2b4cce90144c87f0dd03d56626754050155294962/calendar.html
.. _Calendar ics: https://outlook.office365.com/owa/calendar/105f69bacd4541baa849529aed37eb2d@vmware.com/434ec2155b2b4cce90144c87f0dd03d56626754050155294962/calendar.ics

View file

@ -1,17 +1,10 @@
Get SaltStack Support and Help
==============================
Get Salt Project Support and Help
=================================
**IRC Chat** - Join the vibrant, helpful and positive SaltStack chat room in
LiberaChat at #salt. There is no need to introduce yourself, or ask permission
to join in, just help and be helped! Make sure to wait for an answer, sometimes
it may take a few moments for someone to reply.
**Salt Project Discord** - Join the Salt Project Community Discord!
Use the following link to join the Discord server:
`<https://web.libera.chat/#salt>`_
**SaltStack Slack** - Alongside IRC is our SaltStack Community Slack for the
SaltStack Working groups. Use the following link to request an invitation.
`<https://via.vmw.com/salt-slack>`_
`<https://discord.com/invite/J7b7EscrAs>`_
**Mailing List** - The SaltStack community users mailing list is hosted by
Google groups. Anyone can post to ask questions about SaltStack products and
@ -20,13 +13,13 @@ anyone can help answer. Join the conversation!
`<https://groups.google.com/forum/#!forum/salt-users>`_
You may subscribe to the list without a Google account by emailing
salt-users+subscribe@googlegroups.com and you may post to the list by emailing
salt-users@googlegroups.com
``salt-users+subscribe@googlegroups.com`` and you may post to the list by emailing
``salt-users@googlegroups.com``
**Reporting Issues** - To report an issue with Salt, please follow the
guidelines for filing bug reports:
`<https://docs.saltproject.io/en/master/topics/development/reporting_bugs.html>`_
**SaltStack Support** - If you need dedicated, prioritized support, please
consider a SaltStack Support package that fits your needs:
`<http://www.saltstack.com/support>`_
**Salt Project Support** - If you need dedicated, prioritized support, please
consider taking a look at the Enterprise product:
`Tanzu Salt <https://www.vmware.com/products/app-platform/tanzu-salt>`__

3
changelog/33669.added.md Normal file
View file

@ -0,0 +1,3 @@
Issue #33669: Fixes an issue with the ``ini_managed`` execution module
where it would always wrap the separator with spaces. Adds a new parameter
named ``no_spaces`` that will not wrap the separator with spaces.
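A hypothetical invocation to illustrate the new parameter (the file, section,
and values here are made up, and this assumes the parameter is exposed on the
existing `ini.set_option` execution function):

```bash
# With the default behavior the separator is padded:  mykey = myvalue
# With no_spaces=True it is left unpadded:            mykey=myvalue
salt '*' ini.set_option /etc/myapp.ini "{'mysection': {'mykey': 'myvalue'}}" no_spaces=True
```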

1
changelog/41794.fixed.md Normal file
View file

@ -0,0 +1 @@
Fixed `salt.*.get` shorthand via Salt-SSH

View file

@ -1 +0,0 @@
Made slsutil.renderer work with salt-ssh

View file

@ -1 +0,0 @@
Fixed defaults.merge is not available when using salt-ssh

View file

@ -1 +0,0 @@
Fixed config.get does not support merge option with salt-ssh

View file

@ -1 +0,0 @@
Update to include croniter in pkg requirements

4
changelog/58969.fixed.md Normal file
View file

@ -0,0 +1,4 @@
Issue 58969: Fixes an issue with the `saltclass.expand_classes_in_order`
function where it was losing nested class states during class expansion.
The logic now uses `salt.utils.odict.OrderedDict` to preserve the
inclusion ordering.

2
changelog/61001.fixed.md Normal file
View file

@ -0,0 +1,2 @@
Fixed an issue uninstalling packages on Windows using pkg.removed where there
are multiple versions of the same software installed

View file

@ -1 +0,0 @@
Fixed state.test does not work with salt-ssh

View file

@ -1 +0,0 @@
Made slsutil.findup work with salt-ssh

1
changelog/62501.fixed.md Normal file
View file

@ -0,0 +1 @@
Convert stdin string to bytes regardless of stdin_raw_newlines

4
changelog/63933.fixed.md Normal file
View file

@ -0,0 +1,4 @@
Issue 63933: Fixes an issue with the `saltclass.expanded_dict_from_minion`
function where it passed a reference to the minion `dict`, which was then
overwritten by nested classes during class expansion. The node definition
is now copied with `copy.deepcopy` instead of being passed by reference.

View file

@ -1 +0,0 @@
Fix utf8 handling in 'pass' renderer

View file

@ -1 +0,0 @@
Fixed incorrect version argument will be ignored for multiple package targets warning when using pkgs argument to yumpkg module.

3
changelog/64630.fixed.md Normal file
View file

@ -0,0 +1,3 @@
Fixed an intermittent issue with file.recurse where the state would
report failure even on success. Makes sure symlinks are created
after the target file is created

View file

@ -1 +0,0 @@
salt-cloud honors root_dir config setting for log_file location and fixes for root_dir locations on windows.

View file

@ -1 +0,0 @@
Fixed slsutil.update with salt-ssh during template rendering

1
changelog/65104.fixed.md Normal file
View file

@ -0,0 +1 @@
The 'profile' outputter does not crash with incorrectly formatted data

View file

@ -1 +0,0 @@
Fix config.items when called on minion

2
changelog/65265.fixed.md Normal file
View file

@ -0,0 +1,2 @@
Await on the zmq monitor socket's poll method to fix publish server reliability
in environments with a large number of minions.

View file

@ -1 +0,0 @@
Added SSH wrapper for logmod

View file

@ -1 +0,0 @@
Fix for GitFS failure to unlock lock file, and resource cleanup for process SIGTERM

1
changelog/66213.fixed.md Normal file
View file

@ -0,0 +1 @@
Fix the vault module not respecting the `server.verify` option during unwrap when verify is set to `False` or to a CA file on disk

1
changelog/66228.fixed.md Normal file
View file

@ -0,0 +1 @@
Make sure the master_event_pub.ipc file has correct read/write permissions for the salt group.
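One quick way to verify this on a running master (a sketch; the path assumes
the default `sock_dir` of `/var/run/salt/master`):

```bash
# The event publisher socket should now be group readable/writable by the salt group,
# e.g. permissions along the lines of: srw-rw---- root salt master_event_pub.ipc
ls -l /var/run/salt/master/master_event_pub.ipc
```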

1
changelog/66249.fixed.md Normal file
View file

@ -0,0 +1 @@
Fix batch mode hanging indefinitely in some scenarios

1
changelog/66252.fixed.md Normal file
View file

@ -0,0 +1 @@
Applying `selinux.fcontext_policy_present` to a shorter path than an existing entry now works

View file

@ -1 +0,0 @@
Fix win_task ExecutionTimeLimit and result/error code interpretation

1
changelog/66376.fixed.md Normal file
View file

@ -0,0 +1 @@
Fixed `salt.*.*` attribute syntax for non-Jinja renderers via Salt-SSH

View file

@ -1 +0,0 @@
Fixed x509_v2 certificate.managed crash for locally signed certificates if the signing policy defines signing_private_key

View file

@ -1 +0,0 @@
Fix win_task ExecutionTimeLimit and result/error code interpretation

View file

@ -0,0 +1 @@
Remove the psutil_compat.py file, which should have been removed when RHEL 6 reached EOL

1
changelog/66560.fixed.md Normal file
View file

@ -0,0 +1 @@
Correct bash-completion for Debian / Ubuntu

View file

@ -1 +0,0 @@
Fix support for FIPS approved encryption and signing algorithms.

2
changelog/66596.fixed.md Normal file
View file

@ -0,0 +1,2 @@
Fixed an issue with cmd.run with requirements when the shell is not the
default

1
changelog/66600.fixed.md Normal file
View file

@ -0,0 +1 @@
Fixed accessing wrapper modules in Salt-SSH Jinja templates via attribute syntax

View file

@ -1 +0,0 @@
Fix RPM package provides

Some files were not shown because too many files have changed in this diff.