Mirror of https://github.com/saltstack/salt.git (synced 2025-04-17 10:10:20 +00:00)
Compare commits
3231 commits
Author | SHA1 | Date
---|---|---
(commit rows listed by abbreviated SHA1 only, from f906ca5052 through 82d35f5d5c; the author, date, and message columns are empty)
30354137b0 | ||
![]() |
2cfc5c2fb4 | ||
![]() |
905f3d774d | ||
![]() |
8e27387d3b | ||
![]() |
0d3f49296e | ||
![]() |
b9c565f191 | ||
![]() |
e57901290e | ||
![]() |
17a9837e36 | ||
![]() |
cc834784be | ||
![]() |
07033471bd | ||
![]() |
673e6ecccb | ||
![]() |
ba433a71c7 | ||
![]() |
4afea71ae5 | ||
![]() |
478ed7008c | ||
![]() |
563afaffca | ||
![]() |
c9512a1853 | ||
![]() |
1f7ad41df7 | ||
![]() |
d940aa79b0 | ||
![]() |
ba82644375 | ||
![]() |
8e89a094f1 | ||
![]() |
d3f204e959 | ||
![]() |
cf7bcadedc | ||
![]() |
2b06ea6f65 | ||
![]() |
0f26cf5b93 | ||
![]() |
a6cb942a3b | ||
![]() |
e8a392577d | ||
![]() |
ad4821b87f | ||
![]() |
22107ebd32 | ||
![]() |
4f03b1d6ff | ||
![]() |
245670f37f | ||
![]() |
ea5cd720f1 | ||
![]() |
77b516f6a8 | ||
![]() |
58b3ab13c9 | ||
![]() |
4ac4da9aa4 | ||
![]() |
2199d8503b | ||
![]() |
2e7b0ac875 | ||
![]() |
8e7c528d17 | ||
![]() |
4fddd04ed2 | ||
![]() |
b258510a6d | ||
![]() |
2a522420ca | ||
![]() |
67f0c5a6e5 | ||
![]() |
e0854105d6 | ||
![]() |
584d26d5d7 | ||
![]() |
9a4da7d2e3 | ||
![]() |
cf4e901ba8 | ||
![]() |
85504a91a6 | ||
![]() |
d8d6bf1630 | ||
![]() |
43a3863532 | ||
![]() |
3a84dadfd2 | ||
![]() |
528134ec18 | ||
![]() |
1d1ca7b6cb | ||
![]() |
13190ff89a | ||
![]() |
8a97ed53ef | ||
![]() |
ca93da0d7d | ||
![]() |
2b90a91add | ||
![]() |
a951a88576 | ||
![]() |
dadacf2685 | ||
![]() |
a04fe85ffa | ||
![]() |
d23f40cce9 | ||
![]() |
f5d84f3bd1 | ||
![]() |
d273aef431 | ||
![]() |
ccf6353126 | ||
![]() |
8190791910 | ||
![]() |
3c788180ff | ||
![]() |
ae1bec97b7 | ||
![]() |
8a14947270 | ||
![]() |
16fad0baef | ||
![]() |
3294e14752 | ||
![]() |
9ef3a59698 | ||
![]() |
6320f769ea | ||
![]() |
0b7285b766 | ||
![]() |
9bede19920 | ||
![]() |
d51f3b9cc5 | ||
![]() |
56141835a6 | ||
![]() |
156e75f345 | ||
![]() |
7d50c786e3 | ||
![]() |
7fc6945cba | ||
![]() |
de4ab5fbcf | ||
![]() |
7b374707ac | ||
![]() |
52239d808c | ||
![]() |
1a0f8dd81b | ||
![]() |
fd8cf7875b | ||
![]() |
9765605d46 | ||
![]() |
c1cc4e23af | ||
![]() |
e79e3899e4 | ||
![]() |
8890635449 | ||
![]() |
0ede71eda5 | ||
![]() |
f556db26a9 | ||
![]() |
8035d2418d | ||
![]() |
d52df08f22 | ||
![]() |
5540fd8111 | ||
![]() |
d92df14ecb | ||
![]() |
4e22161bee | ||
![]() |
e0bea13bf2 | ||
![]() |
5f36621afa | ||
![]() |
7b58472599 | ||
![]() |
d931728bfe | ||
![]() |
ec7a44a52c | ||
![]() |
0dc6cfc78f | ||
![]() |
2f6241aaff | ||
![]() |
027a29fd0a | ||
![]() |
7d38fe0b25 | ||
![]() |
353a9c32fa | ||
![]() |
31c59ce450 | ||
![]() |
b4e407a8a9 | ||
![]() |
fe5990536b | ||
![]() |
ea3322b412 | ||
![]() |
b31a98b3a7 | ||
![]() |
4849a3529b | ||
![]() |
a2f428e5b3 | ||
![]() |
fea99b1335 | ||
![]() |
9683260d61 | ||
![]() |
077c253954 | ||
![]() |
7e3a5b10f1 | ||
![]() |
f3522141df | ||
![]() |
eb8328717d | ||
![]() |
cec5aa517d | ||
![]() |
149502ebfc | ||
![]() |
3a4533ee0c | ||
![]() |
fb4ce8a741 | ||
![]() |
e102f8f11e | ||
![]() |
65537f8cc6 | ||
![]() |
cbc40b7d43 | ||
![]() |
04e820258f | ||
![]() |
7ec90b0fcd | ||
![]() |
33c9efda45 | ||
![]() |
cdc2c20f5b | ||
![]() |
2b46917c4a | ||
![]() |
6227a8a407 | ||
![]() |
5ac0683840 | ||
![]() |
49fa6669be | ||
![]() |
69ee788738 | ||
![]() |
630db2b261 | ||
![]() |
fc89d6fe8b | ||
![]() |
c2d2522f1f | ||
![]() |
7d1907cd73 | ||
![]() |
ef367f4450 | ||
![]() |
6ed59e2727 | ||
![]() |
af59576d96 | ||
![]() |
3a42b0ce58 | ||
![]() |
bf6ab537f7 | ||
![]() |
131a389530 | ||
![]() |
a6ff8d0c23 | ||
![]() |
86a8ddc6a5 | ||
![]() |
3e413bd82e | ||
![]() |
2de8317e70 | ||
![]() |
dc07caab95 | ||
![]() |
e01ceb5284 | ||
![]() |
34e3f90035 | ||
![]() |
b49eb84027 | ||
![]() |
fd6be3b6e4 | ||
![]() |
b39278d39f | ||
![]() |
1d85a3ed89 | ||
![]() |
9de739f9dd | ||
![]() |
7674112d14 | ||
![]() |
611944fa7b | ||
![]() |
b2dc649f90 | ||
![]() |
318c198a6f | ||
![]() |
2073e52bc4 | ||
![]() |
a58a741150 | ||
![]() |
2974390263 | ||
![]() |
5f64fbd886 | ||
![]() |
43c7376f69 | ||
![]() |
26001ed8eb | ||
![]() |
ba106493c5 | ||
![]() |
8b6497da72 | ||
![]() |
7e709b11dd | ||
![]() |
61fc2933c4 | ||
![]() |
1826c9cf67 | ||
![]() |
1f1420aa0b | ||
![]() |
70486ca1d4 | ||
![]() |
b87c24ebeb | ||
![]() |
a8b88be486 | ||
![]() |
87715c4dc6 | ||
![]() |
0464a16c2a | ||
![]() |
00b9b20c4c | ||
![]() |
9770229106 | ||
![]() |
1c923149ae | ||
![]() |
aba6f6beae | ||
![]() |
f7b5792b71 | ||
![]() |
516ef95ca8 | ||
![]() |
271cf16d85 | ||
![]() |
d92ba4c8e2 | ||
![]() |
2244d93fb4 | ||
![]() |
22c1766113 | ||
![]() |
ae11e7c061 | ||
![]() |
01451adf6b | ||
![]() |
41b8b8423a | ||
![]() |
0caa05c54f | ||
![]() |
1decaf5711 | ||
![]() |
0a98359776 | ||
![]() |
f7787f2d93 | ||
![]() |
23e96fdc6d | ||
![]() |
c06f94523a | ||
![]() |
112cbda97a | ||
![]() |
75269c4ae2 | ||
![]() |
bea424c18d | ||
![]() |
bf27a19c96 | ||
![]() |
0ab67261b2 | ||
![]() |
0cbee0f790 | ||
![]() |
da2897e9bf | ||
![]() |
33f11e7107 | ||
![]() |
27844363dd | ||
![]() |
6a256a59a1 | ||
![]() |
c81c7b3f70 | ||
![]() |
5fa7a865ab | ||
![]() |
1ef90cbdc7 | ||
![]() |
e8ccb262a6 | ||
![]() |
7b07640b32 | ||
![]() |
11b3b913e8 | ||
![]() |
45e1ee9368 | ||
![]() |
52da6c2b58 | ||
![]() |
fd09ab7efb | ||
![]() |
1d7309188b | ||
![]() |
2450240d5c | ||
![]() |
df09fd980d | ||
![]() |
48749b15bc | ||
![]() |
b45c3191d6 | ||
![]() |
261e3e825d | ||
![]() |
7d6f51f871 | ||
![]() |
3a91a197d4 | ||
![]() |
892aeeda0f | ||
![]() |
46103123e4 | ||
![]() |
51e9fb6e71 | ||
![]() |
d91988fa1c | ||
![]() |
db5a9a1b4a | ||
![]() |
a92ddf5a41 | ||
![]() |
3693a0db42 | ||
![]() |
8540a3fdd4 | ||
![]() |
a2abb1a450 | ||
![]() |
4cd8fe14d4 | ||
![]() |
69696e6fd7 | ||
![]() |
767303a3dc | ||
![]() |
f5861ccb10 | ||
![]() |
e6abf4b2d5 | ||
![]() |
1ef02f5e1d | ||
![]() |
99ce534e0c | ||
![]() |
f97b3f4419 | ||
![]() |
f83d414280 | ||
![]() |
8744762580 | ||
![]() |
3e18925289 | ||
![]() |
f0933cf408 | ||
![]() |
60a97b6236 | ||
![]() |
2ccda334e6 | ||
![]() |
ebe00a73d6 | ||
![]() |
25ad522344 | ||
![]() |
055277d101 | ||
![]() |
9a2102a0e8 | ||
![]() |
405fd40634 | ||
![]() |
2efc71ad12 | ||
![]() |
d5f14ef632 | ||
![]() |
2d02017d6c | ||
![]() |
2e962677b7 | ||
![]() |
819c2dc9e7 | ||
![]() |
c463c94b8d | ||
![]() |
9aeed74e96 | ||
![]() |
31475e74e8 | ||
![]() |
c4987f83ed | ||
![]() |
94763bb50b | ||
![]() |
0892d07684 | ||
![]() |
560bacdaeb | ||
![]() |
1e08e7d006 | ||
![]() |
830232f9e9 | ||
![]() |
31cf73704f | ||
![]() |
c81e8f045b | ||
![]() |
7b0d09bd29 | ||
![]() |
5dc987a630 | ||
![]() |
6ae0c5ad6c | ||
![]() |
38bae082f4 | ||
![]() |
eb5473596a | ||
![]() |
8950f1198b | ||
![]() |
2215d92ab0 | ||
![]() |
b5b1a9794b | ||
![]() |
d12e128981 | ||
![]() |
b0bf7eba9b | ||
![]() |
7b8c3dbacc | ||
![]() |
374f91f1ca | ||
![]() |
1a4a4824ff | ||
![]() |
ddb9c20ec8 | ||
![]() |
4f027308f8 | ||
![]() |
5e28d77365 | ||
![]() |
77482013d6 | ||
![]() |
ae14412da3 | ||
![]() |
c6e2bd18e9 | ||
![]() |
a1861c7871 | ||
![]() |
d2f1990135 | ||
![]() |
14282d69f6 | ||
![]() |
ff42133e81 | ||
![]() |
8331264fdb | ||
![]() |
ad706711d3 | ||
![]() |
3f78a510c0 | ||
![]() |
e814134020 | ||
![]() |
72fc1094ce | ||
![]() |
7d57774710 | ||
![]() |
b8e3a0adca | ||
![]() |
ced7fea9f4 | ||
![]() |
b01c195632 | ||
![]() |
088bd21073 | ||
![]() |
ce356f1737 | ||
![]() |
dbea252572 | ||
![]() |
0f4c789349 | ||
![]() |
60086b0fa0 | ||
![]() |
98422f3848 | ||
![]() |
2478e75dca | ||
![]() |
4cfc91da62 | ||
![]() |
2126a1ed21 | ||
![]() |
c070da586b | ||
![]() |
db228b3430 | ||
![]() |
fea52743f6 | ||
![]() |
5c88202087 | ||
![]() |
5843cb54eb | ||
![]() |
50063ff617 | ||
![]() |
2a65c3e025 | ||
![]() |
efd6fdd24d | ||
![]() |
22c4437580 | ||
![]() |
9ab90fdc63 | ||
![]() |
3506e7fd0e | ||
![]() |
3c415b222f | ||
![]() |
5a03a24ea9 | ||
![]() |
461868daf5 | ||
![]() |
00dce8d08a | ||
![]() |
4d2c3c43c2 | ||
![]() |
756af0299b | ||
![]() |
98433521d0 | ||
![]() |
5069c1f916 | ||
![]() |
4886b6de2d | ||
![]() |
7da18aa93a | ||
![]() |
9539c31d48 | ||
![]() |
96c4c441a2 | ||
![]() |
7916ffb63e | ||
![]() |
d572d1a703 | ||
![]() |
e6c3661b48 | ||
![]() |
dfdb2aff48 | ||
![]() |
096b273fe6 | ||
![]() |
72bedb72d0 | ||
![]() |
1d02c08bad | ||
![]() |
d06e59c7f4 | ||
![]() |
09569cd5fb | ||
![]() |
72a3e74c8e | ||
![]() |
4e99f5053a | ||
![]() |
fa7c345977 | ||
![]() |
222f3017ad | ||
![]() |
dac338bd1c | ||
![]() |
0f529c13fc | ||
![]() |
94bce224fa | ||
![]() |
684b584623 | ||
![]() |
1375b85cba | ||
![]() |
fcfe059402 | ||
![]() |
ddbb95fee4 | ||
![]() |
b381f39024 | ||
![]() |
d05e5469a5 | ||
![]() |
528f345c87 | ||
![]() |
3de03147f1 | ||
![]() |
ced3436053 | ||
![]() |
05d3295eba | ||
![]() |
85e4687c4a | ||
![]() |
9017809e4e | ||
![]() |
3e68d44e56 | ||
![]() |
44cb892333 | ||
![]() |
0df47ecb62 | ||
![]() |
292485e703 | ||
![]() |
36a198935a | ||
![]() |
cd91e01917 | ||
![]() |
d2792d028f | ||
![]() |
7f4947449d | ||
![]() |
8cb1d6d899 | ||
![]() |
72b73eded9 | ||
![]() |
093370c6ab | ||
![]() |
15c1f5f870 | ||
![]() |
1851456b3f | ||
![]() |
156b8dc3f8 | ||
![]() |
e4490e55bb | ||
![]() |
7bbd386a1f | ||
![]() |
ada68ba56c | ||
![]() |
1137f14cc2 | ||
![]() |
bec7b90566 | ||
![]() |
593cd82096 | ||
![]() |
357e02f152 | ||
![]() |
11204d5d12 | ||
![]() |
ddcd66c2d2 | ||
![]() |
a2f0f1d783 | ||
![]() |
b1a1c78a91 | ||
![]() |
6c4c8292f0 | ||
![]() |
d94bb4475f | ||
![]() |
de44f9931a | ||
![]() |
69b890c0f6 | ||
![]() |
a5a7e3b019 | ||
![]() |
5583d8a653 | ||
![]() |
5877cd7a59 | ||
![]() |
c119d82495 | ||
![]() |
5b1031de0d | ||
![]() |
d933a235aa | ||
![]() |
8c268f47ce | ||
![]() |
a413f068c8 | ||
![]() |
c67dba3d30 | ||
![]() |
180ece842c | ||
![]() |
bef7a6ab2e | ||
![]() |
32ae15edb6 | ||
![]() |
243bf299d3 | ||
![]() |
ae37cc1d6b | ||
![]() |
fc1b7b2005 | ||
![]() |
8315107e91 | ||
![]() |
9aeea6bbbf | ||
![]() |
68e4d2e7c3 | ||
![]() |
319730b7a7 | ||
![]() |
0e63e023b9 | ||
![]() |
679651299a | ||
![]() |
cb4a3e7ead | ||
![]() |
679d5f83ea | ||
![]() |
374a084403 | ||
![]() |
723e7a66c7 | ||
![]() |
588cd703af | ||
![]() |
1c0593b4d6 | ||
![]() |
4b74e4612a | ||
![]() |
a302fcc166 | ||
![]() |
017c230f90 | ||
![]() |
023a54beb5 | ||
![]() |
44b3f7015a | ||
![]() |
b763dfa21c | ||
![]() |
ef6d0e93b6 | ||
![]() |
e8c536d1d8 | ||
![]() |
e1f962e0d8 | ||
![]() |
228232f570 | ||
![]() |
defca6ba10 | ||
![]() |
6e26da3c9f | ||
![]() |
542f97bf16 | ||
![]() |
4e4ac4b177 | ||
![]() |
4574d8722d | ||
![]() |
40f86e1898 | ||
![]() |
cbb617d478 | ||
![]() |
9a04e5ba8f | ||
![]() |
6bb2906634 | ||
![]() |
06504a4ec7 | ||
![]() |
410e70bfda | ||
![]() |
0a11311113 | ||
![]() |
dc3b1d615f | ||
![]() |
5ad59794fc | ||
![]() |
8b8c239ed7 | ||
![]() |
b88f6f1800 | ||
![]() |
a5bd72653f | ||
![]() |
9ffbd2103d | ||
![]() |
ef40721949 | ||
![]() |
faf1f267c6 | ||
![]() |
998dd6aa31 | ||
![]() |
6aab93fcf2 | ||
![]() |
499c8ea103 | ||
![]() |
4e7e6d8b3d | ||
![]() |
6ce69654c7 | ||
![]() |
b1ae2106c2 | ||
![]() |
1bcaac0479 | ||
![]() |
f8c54ddcfe | ||
![]() |
636eb42a4a | ||
![]() |
a0ee6df8bc | ||
![]() |
28de02a201 | ||
![]() |
78c8c11787 | ||
![]() |
7c6d550123 | ||
![]() |
608f7acfa2 | ||
![]() |
d9b5ee5429 | ||
![]() |
bc1217c99d | ||
![]() |
e7785cb886 | ||
![]() |
634b405481 | ||
![]() |
2fc5af765b | ||
![]() |
3a50ab534b | ||
![]() |
d3b82ebdcc | ||
![]() |
c84b2df3a9 | ||
![]() |
398ebb961e | ||
![]() |
8dfc923876 | ||
![]() |
098dae15cb | ||
![]() |
f012ab6313 | ||
![]() |
4b8fea8607 | ||
![]() |
a89d53d89c | ||
![]() |
5a10df14ed | ||
![]() |
f5ce55dd47 | ||
![]() |
c15f469b4f | ||
![]() |
cc27e2a19a | ||
![]() |
278368b908 | ||
![]() |
20ac30d1e5 | ||
![]() |
e18d829f36 | ||
![]() |
d988dfb5b2 | ||
![]() |
41d9f33c3f | ||
![]() |
d9a421d014 | ||
![]() |
d50ba4de9b | ||
![]() |
e8c016419b | ||
![]() |
5ac48c5161 | ||
![]() |
e521194579 | ||
![]() |
063bf2959e | ||
![]() |
3239877c78 | ||
![]() |
0ab0177629 | ||
![]() |
a0ca327de7 | ||
![]() |
a8591dd658 | ||
![]() |
8547e63306 | ||
![]() |
ec3b208d76 | ||
![]() |
18cd662eb6 | ||
![]() |
21bb7bdc9e | ||
![]() |
2587905637 | ||
![]() |
c10eb79954 | ||
![]() |
8167cf84b8 | ||
![]() |
fe0083720e | ||
![]() |
264a52b88a | ||
![]() |
51119c26dd | ||
![]() |
bb88243ff9 | ||
![]() |
3cec258fcc | ||
![]() |
3f15094038 | ||
![]() |
f4112d94fb | ||
![]() |
03a4ed63b8 | ||
![]() |
bb5f719e44 | ||
![]() |
235cd93326 | ||
![]() |
23582dce20 | ||
![]() |
356120f91a | ||
![]() |
5ad05c61cf | ||
![]() |
1f145c0f6e | ||
![]() |
117cd6861d | ||
![]() |
2e1097eac6 | ||
![]() |
4baea1a97b | ||
![]() |
e87bc38389 | ||
![]() |
9dcbea0d4b | ||
![]() |
486802f6dd | ||
![]() |
ca241b75d4 | ||
![]() |
094c2c16d8 | ||
![]() |
a2bebacc07 | ||
![]() |
4672a4dcdd | ||
![]() |
b69152f31b | ||
![]() |
6a5e032214 | ||
![]() |
ade9da2703 | ||
![]() |
680301504f | ||
![]() |
7727a43cc2 | ||
![]() |
e237d5525e | ||
![]() |
c86993794f | ||
![]() |
f3bee7f70f | ||
![]() |
38d1df3b46 | ||
![]() |
574e6cd2c2 | ||
![]() |
d1f514ad76 | ||
![]() |
97a77adc32 | ||
![]() |
5c8550de75 | ||
![]() |
4c4d017ddb | ||
![]() |
1b28ce55a6 | ||
![]() |
0c3ebc0795 | ||
![]() |
166c07f6da | ||
![]() |
a560f7c0f3 | ||
![]() |
5edd2259d8 | ||
![]() |
79f240c2e9 | ||
![]() |
41fe2c2b39 | ||
![]() |
bf40680835 | ||
![]() |
a34d42b276 | ||
![]() |
5bb33125b8 | ||
![]() |
fd88928562 | ||
![]() |
6d918e15a3 | ||
![]() |
80ff2f662d | ||
![]() |
f3edefc93a | ||
![]() |
a81f58f37d | ||
![]() |
d933bec989 | ||
![]() |
7247f37e0f | ||
![]() |
e2d55c5322 | ||
![]() |
bd10d94449 | ||
![]() |
eb71862449 | ||
![]() |
9dffea3178 | ||
![]() |
cb396fe805 | ||
![]() |
bd57d085ad | ||
![]() |
83cadc12f5 | ||
![]() |
724fc20824 | ||
![]() |
926270054d | ||
![]() |
db1406a85f | ||
![]() |
a467c04d04 | ||
![]() |
071a65fb10 | ||
![]() |
3c552ecb90 | ||
![]() |
aeaf55815a | ||
![]() |
57608c0067 | ||
![]() |
6503765b3f | ||
![]() |
b91f363951 | ||
![]() |
33e2538aa8 | ||
![]() |
4896c90684 | ||
![]() |
75a93eefc3 | ||
![]() |
b713c3441b | ||
![]() |
82ba2a5da8 | ||
![]() |
df28d77fbc | ||
![]() |
15849a5911 | ||
![]() |
b897734f4a | ||
![]() |
b2090e26bd | ||
![]() |
7fc547faff | ||
![]() |
64def39282 | ||
![]() |
402aa367ed | ||
![]() |
6867accbc4 | ||
![]() |
7ca76a8bea | ||
![]() |
fceb58d543 | ||
![]() |
c76fdd638a | ||
![]() |
b622d96a62 | ||
![]() |
0873ece26c | ||
![]() |
40ce1d1580 | ||
![]() |
780d348bbc | ||
![]() |
0cb3dc87e7 | ||
![]() |
f68e9616b9 | ||
![]() |
4d617dd44a | ||
![]() |
6b1a49e341 | ||
![]() |
bf964ccd25 | ||
![]() |
0b0c29e79c | ||
![]() |
4d2787c6af | ||
![]() |
dfa273bc5e | ||
![]() |
b2e91cccba | ||
![]() |
e2d4a223cc | ||
![]() |
a02127f7db | ||
![]() |
0e89ac8050 | ||
![]() |
5b69f6e5ed | ||
![]() |
5300cc451c | ||
![]() |
894c26780d | ||
![]() |
5a7854908f | ||
![]() |
e8e033a8f7 | ||
![]() |
4fc4450bf0 | ||
![]() |
afcd6a8114 | ||
![]() |
962f708c92 | ||
![]() |
8f0f4f1d3b | ||
![]() |
9a6ab55c3b | ||
![]() |
5be5cf890e | ||
![]() |
c33b56281b | ||
![]() |
cab551c697 | ||
![]() |
98ecf8d45c | ||
![]() |
99cb7c0e08 | ||
![]() |
639ba5613e | ||
![]() |
ad68e6b99d | ||
![]() |
14d62ec971 | ||
![]() |
2f6064de88 | ||
![]() |
5e177c0d90 | ||
![]() |
13437829fc | ||
![]() |
d9e8dfa6b4 | ||
![]() |
da6403b722 | ||
![]() |
3caa552c64 | ||
![]() |
125275ffbf | ||
![]() |
2bd55266c8 | ||
![]() |
7bafef9047 | ||
![]() |
2618bbb5c0 | ||
![]() |
fbac15ee92 | ||
![]() |
9d5b65435b | ||
![]() |
a17239e607 | ||
![]() |
4730bea00b | ||
![]() |
e4ba3fd7d2 | ||
![]() |
b93e9a9e2d | ||
![]() |
9955845052 | ||
![]() |
eaeccf72c6 | ||
![]() |
7160266006 | ||
![]() |
70d6164770 | ||
![]() |
4e8b725794 | ||
![]() |
8a1e4c120f | ||
![]() |
41a39417f8 | ||
![]() |
371a9243c0 | ||
![]() |
8462f04bff | ||
![]() |
017add2474 | ||
![]() |
5f62373cf9 | ||
![]() |
e88b4f7d90 | ||
![]() |
25ad71e934 | ||
![]() |
e195723514 | ||
![]() |
26fe276a4d | ||
![]() |
14a0709271 | ||
![]() |
8da111674a | ||
![]() |
f829deaf63 | ||
![]() |
3306e291a4 | ||
![]() |
0d249eed9a | ||
![]() |
464ff358d6 | ||
![]() |
55b7c000e5 | ||
![]() |
5b152e5979 | ||
![]() |
0b73c80085 | ||
![]() |
d86d7338d8 | ||
![]() |
b6b6ccde7b | ||
![]() |
73c1289490 | ||
![]() |
778d4f4633 | ||
![]() |
8ac339f2be | ||
![]() |
3fe729c471 | ||
![]() |
c9c1e2414e | ||
![]() |
3133b46807 | ||
![]() |
ff921f9b31 | ||
![]() |
56ac61af0d | ||
![]() |
abc5d290e0 | ||
![]() |
57926865d6 | ||
![]() |
585603c50e | ||
![]() |
e6d2e1390c | ||
![]() |
3979168773 | ||
![]() |
3fcb1e225e | ||
![]() |
171a9e33d5 | ||
![]() |
e34e2a8ea7 | ||
![]() |
e786d60d78 | ||
![]() |
cf441d46a9 | ||
![]() |
c475db0b6f | ||
![]() |
6fe80c9ee6 | ||
![]() |
0d2b934823 | ||
![]() |
7ecaca5c9b | ||
![]() |
98b151afc5 | ||
![]() |
b0eb60fe66 | ||
![]() |
30570ad2bd | ||
![]() |
f49a88153b | ||
![]() |
3db7d31358 | ||
![]() |
b4ddd2624b | ||
![]() |
903c1b4bf4 | ||
![]() |
dafba12af6 | ||
![]() |
365aa667e8 | ||
![]() |
ebfdf66845 | ||
![]() |
4f5435b589 | ||
![]() |
ec8ab6d688 | ||
![]() |
a34899f885 | ||
![]() |
2fbb30b668 | ||
![]() |
1d6379a759 | ||
![]() |
6b9463836e | ||
![]() |
cbbcebd78f | ||
![]() |
bd7bdd2eb9 | ||
![]() |
98a56353dd | ||
![]() |
f158710818 | ||
![]() |
fe440b5d49 | ||
![]() |
2e3b27e775 | ||
![]() |
d64b74ff91 | ||
![]() |
01a465fa88 | ||
![]() |
a3293afcca | ||
![]() |
b644e2d25b | ||
![]() |
2dbdea9065 | ||
![]() |
4557dd0157 | ||
![]() |
52e795a0f5 | ||
![]() |
7cf39d8992 | ||
![]() |
f34f21c517 | ||
![]() |
43f4eeb4ee | ||
![]() |
4bed6e90c7 | ||
![]() |
f68568d6c3 | ||
![]() |
b5cec62be3 | ||
![]() |
8cbe82e039 |
4364 changed files with 110970 additions and 642335 deletions
53 .github/CODEOWNERS (vendored)
@@ -9,55 +9,4 @@
 # This file uses an fnmatch-style matching pattern.

 # Team Core
-* @saltstack/team-core
+* @saltstack/salt-core-maintainers
-
-# Team Boto
-salt/*/*boto* @saltstack/team-core
-
-# Team Cloud
-salt/cloud/* @saltstack/team-core
-salt/utils/openstack/* @saltstack/team-core
-salt/utils/aws.py @saltstack/team-core
-salt/*/*cloud* @saltstack/team-core
-
-# Team NetAPI
-salt/cli/api.py @saltstack/team-core
-salt/client/netapi.py @saltstack/team-core
-salt/netapi/* @saltstack/team-core
-
-# Team Network
-salt/proxy/* @saltstack/team-core
-
-# Team SPM
-salt/cli/spm.py @saltstack/team-core
-salt/spm/* @saltstack/team-core
-
-# Team SSH
-salt/cli/ssh.py @saltstack/team-core
-salt/client/ssh/* @saltstack/team-core
-salt/roster/* @saltstack/team-core
-salt/runners/ssh.py @saltstack/team-core
-salt/*/thin.py @saltstack/team-core
-
-# Team State
-salt/state.py @saltstack/team-core
-
-# Team SUSE
-salt/*/*btrfs* @saltstack/team-core
-salt/*/*kubernetes* @saltstack/team-core
-salt/*/*pkg* @saltstack/team-core
-salt/*/*snapper* @saltstack/team-core
-salt/*/*xfs* @saltstack/team-core
-salt/*/*zypper* @saltstack/team-core
-
-# Team Transport
-salt/transport/* @saltstack/team-core
-salt/utils/zeromq.py @saltstack/team-core
-
-# Team Windows
-salt/*/*win* @saltstack/team-core
-salt/modules/reg.py @saltstack/team-core
-salt/states/reg.py @saltstack/team-core
-tests/*/*win* @saltstack/team-core
-tests/*/test_reg.py @saltstack/team-core
-tests/pytests/* @saltstack/team-core @s0undt3ch
7 .github/ISSUE_TEMPLATE/config.yml (vendored)
@@ -1,11 +1,8 @@
 blank_issues_enabled: true
 contact_links:
-- name: Salt Community Slack
+- name: Salt Community Discord
-url: https://saltstackcommunity.slack.com/
+url: https://discord.com/invite/J7b7EscrAs
 about: Please ask and answer questions here.
 - name: Salt-Users Forum
 url: https://groups.google.com/forum/#!forum/salt-users
 about: Please ask and answer questions here.
-- name: Salt on LiberaChat
-url: https://web.libera.chat/#salt
-about: Please ask and answer questions here.
2 .github/ISSUE_TEMPLATE/tech-debt.md (vendored)
@@ -8,7 +8,7 @@ assignees: ''
 ---

 ### Description of the tech debt to be addressed, include links and screenshots
-<!-- Note: Please direct questions to the salt-users google group, IRC or Community Slack. -->
+<!-- Note: Please direct questions to the salt-users google group, GitHub Discussions or Community Discord. -->

 ### Versions Report
 (Provided by running `salt --versions-report`. Please also mention any differences in master/minion versions.)
17 .github/PULL_REQUEST_TEMPLATE.md (vendored)
@@ -1,7 +1,7 @@
 ### What does this PR do?

 ### What issues does this PR fix or reference?
-Fixes:
+Fixes

 ### Previous Behavior
 Remove this section if not relevant
@@ -11,7 +11,9 @@ Remove this section if not relevant

 ### Merge requirements satisfied?
 **[NOTICE] Bug fixes or features added to Salt require tests.**
-<!-- Please review the [test documentation](https://docs.saltproject.io/en/master/topics/tutorials/writing_tests.html) for details on how to implement tests into Salt's test suite. -->
+<!-- Please review the test documentation for details on how to implement tests
+into Salt's test suite:
+https://docs.saltproject.io/en/master/topics/tutorials/writing_tests.html -->
 - [ ] Docs
 - [ ] Changelog - https://docs.saltproject.io/en/master/topics/development/changelog.html
 - [ ] Tests written/updated
@@ -19,6 +21,13 @@ Remove this section if not relevant
 ### Commits signed with GPG?
 Yes/No

-Please review [Salt's Contributing Guide](https://docs.saltproject.io/en/master/topics/development/contributing.html) for best practices.
+<!-- Please review Salt's Contributing Guide for best practices and guidance in
+choosing the right branch:
+https://docs.saltproject.io/en/master/topics/development/contributing.html -->

-See GitHub's [page on GPG signing](https://help.github.com/articles/signing-commits-using-gpg/) for more information about signing commits with GPG.
+<!-- Additional guidance for pull requests can be found here:
+https://docs.saltproject.io/en/master/topics/development/pull_requests.html -->
+
+<!-- See GitHub's page on GPG signing for more information about signing commits
+with GPG:
+https://help.github.com/articles/signing-commits-using-gpg/ -->
13 .github/actionlint.yaml (vendored)
@@ -1,14 +1,5 @@
 self-hosted-runner:
 # Labels of self-hosted runner in array of string
 labels:
-- bastion
+- linux-x86_64
-- x86_64
+- linux-arm64
-- arm64
-- aarch64
-- amd64
-- repo-nightly
-- repo-staging
-- repo-release
-- medium
-- large
-- macos-13-xlarge
18 .github/actions/build-onedir-deps/action.yml (vendored)
@@ -1,45 +1,33 @@
 ---
 name: build-onedir-deps
 description: Build Onedir Dependencies

 inputs:
 platform:
 required: true
-type: string
 description: The platform to build
 arch:
 required: true
-type: string
 description: The platform arch to build
 python-version:
 required: true
-type: string
 description: The python version to build
 package-name:
 required: false
-type: string
 description: The onedir package name to create
 default: salt
 cache-prefix:
 required: true
-type: string
 description: Seed used to invalidate caches


-env:
-COLUMNS: 190
-PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
-PIP_EXTRA_INDEX_URL: https://pypi.org/simple
-RELENV_BUILDENV: 1

 runs:
 using: composite

 steps:

 - name: Cache Deps Onedir Package Directory
 id: onedir-pkg-cache
-uses: actions/cache@v3.3.1
+uses: ./.github/actions/cache
 with:
 path: artifacts/${{ inputs.package-name }}
 key: >
@@ -56,6 +44,8 @@ runs:
 - name: Install Salt Onedir Package Dependencies
 shell: bash
 if: steps.onedir-pkg-cache.outputs.cache-hit != 'true'
+env:
+RELENV_BUILDENV: "1"
 run: |
 tools pkg build onedir-dependencies --arch ${{ inputs.arch }} --python-version ${{ inputs.python-version }} --package-name artifacts/${{ inputs.package-name }} --platform ${{ inputs.platform }}
36 .github/actions/build-onedir-salt/action.yml (vendored)
@@ -1,61 +1,41 @@
 ---
 name: build-onedir-salt
 description: Build Onedir Package

 inputs:
 platform:
 required: true
-type: string
 description: The platform to build
 arch:
 required: true
-type: string
 description: The platform arch to build
 package-name:
 required: false
-type: string
 description: The onedir package name to create
 default: salt
 cache-prefix:
 required: true
-type: string
 description: Seed used to invalidate caches
 python-version:
 required: true
-type: string
 description: The python version to build
 salt-version:
-type: string
 required: true
 description: The Salt version to set prior to building packages.


-env:
-COLUMNS: 190
-PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
-PIP_EXTRA_INDEX_URL: https://pypi.org/simple
-RELENV_BUILDENV: 1

 runs:
 using: composite

 steps:

-- name: Download Cached Deps Onedir Package Directory
+- name: Install Salt Packaging Dependencies into Relenv Onedir
-id: onedir-bare-cache
+uses: ./.github/actions/build-onedir-deps
-uses: actions/cache@v3.3.1
 with:
-path: artifacts/${{ inputs.package-name }}
+platform: ${{ inputs.platform }}
-key: >
+arch: ${{ inputs.arch }}
-${{ inputs.cache-prefix }}|${{ inputs.python-version }}|deps|${{ inputs.platform }}|${{ inputs.arch }}|${{ inputs.package-name }}|${{
+python-version: "${{ inputs.python-version }}"
-hashFiles(
+cache-prefix: ${{ inputs.cache-seed }}|relenv|${{ inputs.salt-version }}
-format('{0}/.relenv/**/*.xz', github.workspace),
-'requirements/static/pkg/*/*.txt',
-'.github/actions/build-onedir-deps/action.yml',
-'.github/workflows/build-deps-onedir-*.yml',
-'cicd/shared-gh-workflows-context.yml'
-)
-}}

 - name: Download Source Tarball
 uses: actions/download-artifact@v4
@@ -64,6 +44,8 @@ runs:

 - name: Install Salt Into Onedir
 shell: bash
+env:
+RELENV_BUILDENV: "1"
 run: |
 tools pkg build salt-onedir salt-${{ inputs.salt-version }}.tar.gz --platform ${{ inputs.platform }} --package-name artifacts/${{ inputs.package-name }}
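For context, the reworked composite action above is meant to be called from a workflow job rather than invoked by hand. The snippet below is a minimal sketch of such a caller, not a workflow taken from the repository: only the action path and the input names (platform, arch, package-name, cache-prefix, python-version, salt-version) come from the diff above; the job layout, trigger, and every concrete value are illustrative assumptions.

```yaml
# Hypothetical caller for the composite action shown above.
# Input names and the action path come from the diff; all values are examples.
name: example-build-onedir
on: workflow_dispatch

jobs:
  build-salt-onedir:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Build the Salt onedir package
        uses: ./.github/actions/build-onedir-salt
        with:
          platform: linux          # example platform
          arch: x86_64             # example architecture
          package-name: salt       # matches the action's default
          cache-prefix: seed-v1    # example cache seed
          python-version: "3.10"   # example Python version
          salt-version: "3007.0"   # example Salt version
```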
@@ -1,24 +1,17 @@
 ---
 name: build-source-tarball
 description: Build Source Tarball

 inputs:
 salt-version:
-type: string
 required: true
 description: The Salt version to set prior to building the tarball.
 nox-version:
 required: false
-type: string
 description: The version of Nox to install
 default: "2022.8.7"


-env:
-COLUMNS: 190
-PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
-PIP_EXTRA_INDEX_URL: https://pypi.org/simple

 runs:
 using: composite
105 .github/actions/cache/action.yml (vendored, new file)
@@ -0,0 +1,105 @@
+---
+name: cache
+description: GitHub Actions Cache
+inputs:
+path:
+description: 'A list of files, directories, and wildcard patterns to cache and restore'
+required: true
+key:
+description: 'An explicit key for restoring and saving the cache'
+required: true
+restore-keys:
+description: 'An ordered list of keys to use for restoring stale cache if no cache hit occurred for key. Note `cache-hit` returns false in this case.'
+required: false
+upload-chunk-size:
+description: 'The chunk size used to split up large files during upload, in bytes'
+required: false
+enableCrossOsArchive:
+description: 'An optional boolean when enabled, allows windows runners to save or restore caches that can be restored or saved respectively on other platforms'
+default: 'false'
+required: false
+fail-on-cache-miss:
+description: 'Fail the workflow if cache entry is not found'
+default: 'false'
+required: false
+lookup-only:
+description: 'Check if a cache entry exists for the given input(s) (key, restore-keys) without downloading the cache'
+default: 'false'
+required: false
+
+outputs:
+cache-hit:
+description: 'A boolean value to indicate an exact match was found for the primary key'
+value: ${{ steps.github-cache.outputs.cache-hit || steps.s3-cache.outputs.cache-hit }}
+
+runs:
+using: composite
+
+steps:
+
+- name: Map inputs to environment variables
+shell: bash
+run: |
+echo "GHA_CACHE_PATH=${{ inputs.path }}" | tee -a "${GITHUB_ENV}"
+echo "GHA_CACHE_KEY=${{ inputs.key }}" | tee -a "${GITHUB_ENV}"
+echo "GHA_CACHE_ENABLE_CROSS_OS_ARCHIVE=${{ inputs.enableCrossOsArchive }}" | tee -a "${GITHUB_ENV}"
+echo "GHA_CACHE_FAIL_ON_CACHE_MISS=${{ inputs.fail-on-cache-miss }}" | tee -a "${GITHUB_ENV}"
+echo "GHA_CACHE_LOOKUP_ONLY=${{ inputs.lookup-only }}" | tee -a "${GITHUB_ENV}"
+echo "GHA_CACHE_RESTORE_KEYS=${{ inputs.restore-keys }}" | tee -a "${GITHUB_ENV}"
+echo "GHA_CACHE_UPLOAD_CHUNK_SIZE=${{ inputs.upload-chunk-size }}" | tee -a "${GITHUB_ENV}"
+
+- name: Cache Provided Path (GitHub Actions)
+id: github-cache
+if: ${{ env.USE_S3_CACHE != 'true' }}
+uses: actions/cache@v4
+with:
+path: ${{ env.GHA_CACHE_PATH }}
+key: ${{ env.GHA_CACHE_KEY }}
+enableCrossOsArchive: ${{ env.GHA_CACHE_ENABLE_CROSS_OS_ARCHIVE }}
+fail-on-cache-miss: ${{ env.GHA_CACHE_FAIL_ON_CACHE_MISS }}
+lookup-only: ${{ env.GHA_CACHE_LOOKUP_ONLY }}
+restore-keys: ${{ env.GHA_CACHE_RESTORE_KEYS }}
+upload-chunk-size: ${{ env.GHA_CACHE_UPLOAD_CHUNK_SIZE }}
+
+- name: Get Salt Project GitHub Actions Bot Environment
+if: ${{ env.USE_S3_CACHE == 'true' }}
+shell: bash
+run: |
+TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
+SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
+echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" | tee -a "$GITHUB_ENV"
+REGION=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/placement/region)
+echo "GHA_CACHE_AWS_REGION=$REGION" | tee -a "$GITHUB_ENV"
+
+- name: Configure AWS Credentials to access cache bucket
+id: creds
+if: ${{ env.USE_S3_CACHE == 'true' }}
+uses: aws-actions/configure-aws-credentials@v4
+with:
+aws-region: ${{ env.GHA_CACHE_AWS_REGION }}
+
+- name: Cache Provided Path (S3)
+if: ${{ env.USE_S3_CACHE == 'true' }}
+id: s3-cache
+env:
+AWS_REGION: ${{ env.GHA_CACHE_AWS_REGION }}
+RUNS_ON_S3_BUCKET_CACHE: salt-project-${{ env.SPB_ENVIRONMENT}}-salt-github-actions-s3-cache
+uses: runs-on/cache@v4
+with:
+path: ${{ env.GHA_CACHE_PATH }}
+key: ${{ env.GHA_CACHE_KEY }}
+enableCrossOsArchive: ${{ env.GHA_CACHE_ENABLE_CROSS_OS_ARCHIVE }}
+fail-on-cache-miss: ${{ env.GHA_CACHE_FAIL_ON_CACHE_MISS }}
+lookup-only: ${{ env.GHA_CACHE_LOOKUP_ONLY }}
+restore-keys: ${{ env.GHA_CACHE_RESTORE_KEYS }}
+upload-chunk-size: ${{ env.GHA_CACHE_UPLOAD_CHUNK_SIZE }}
+
+- name: Verify 'fail-on-cache-miss'
+if: ${{ inputs.fail-on-cache-miss == 'true' }}
+shell: bash
+run: |
+CACHE_HIT="${{ steps.github-cache.outputs.cache-hit || steps.s3-cache.outputs.cache-hit }}"
+if [ "$CACHE_HIT" != "true" ]; then
+echo "No cache hit and fail-on-cache-miss is set to true."
+exit 1
+fi
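The diffs that follow switch each `actions/cache@v3.3.1` step over to this local wrapper. As a quick orientation, here is a minimal sketch of a caller under assumed values: the `path`, `key`, and `restore-keys` inputs and the `cache-hit` output are the ones declared above, while the cached directory, the key expression, and the expectation that `USE_S3_CACHE` is set in the calling workflow's environment are illustrative assumptions. When `USE_S3_CACHE` is true the wrapper uses the `runs-on/cache` S3 backend; otherwise it falls back to the standard GitHub Actions cache.

```yaml
# Minimal sketch of a step using the local cache wrapper defined above.
# Path and key values are examples only; USE_S3_CACHE is assumed to be set
# in the calling workflow's environment (true = S3 backend, otherwise the
# standard GitHub Actions cache backend is used).
- name: Cache example directory
  id: example-cache
  uses: ./.github/actions/cache
  with:
    path: .example-cache-dir                                            # example path
    key: example|${{ runner.os }}|${{ hashFiles('requirements/**') }}   # example key
    restore-keys: example|${{ runner.os }}|
```

A later step can then branch on `steps.example-cache.outputs.cache-hit`, which the wrapper forwards from whichever backend actually ran.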
13 .github/actions/cached-virtualenv/action.yml (vendored)
@@ -5,26 +5,23 @@ description: Setup a cached python virtual environment
 inputs:
 name:
 required: true
-type: string
 description: The Virtualenv Name
 cache-seed:
 required: true
-type: string
 description: Seed used to invalidate caches

 outputs:
 cache-hit:
+description: 'A boolean value to indicate an exact match was found for the primary key'
 value: ${{ steps.cache-virtualenv.outputs.cache-hit }}
 cache-key:
+description: The value of the cache key
 value: ${{ steps.setup-cache-key.outputs.cache-key }}
 python-executable:
+description: The path to the virtualenv's python executable
 value: ${{ steps.define-python-executable.outputs.python-executable }}


-env:
-PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
-PIP_EXTRA_INDEX_URL: https://pypi.org/simple

 runs:
 using: composite
@@ -54,7 +51,7 @@

 - name: Cache VirtualEnv
 id: cache-virtualenv
-uses: actions/cache@v3.3.1
+uses: ./.github/actions/cache
 with:
 key: ${{ steps.setup-cache-key.outputs.cache-key }}
 path: ${{ steps.virtualenv-path.outputs.venv-path }}
41 .github/actions/download-artifact/action.yml (vendored, deleted)
@@ -1,41 +0,0 @@
-# This actions was inspired by https://github.com/alehechka/download-tartifact
----
-name: Download Tar Artifact
-description: >
-Download and extract a tar artifact that was previously uploaded in the
-workflow by the upload-tartifact action
-
-inputs:
-name:
-description: Artifact name
-required: false
-path:
-description: Destination path
-required: false
-archive-name:
-description: >
-By default `inputs.name`(last resort, `archive`) is what's used to name the archive.
-This parameter allows a customizing that archive name. This will allow uploading multiple
-archives under the same 'name', like the underlying official action does
-without overriding the existing archives.
-required: false
-
-runs:
-using: composite
-steps:
-- uses: actions/download-artifact@v3
-# This needs to be actions/download-artifact@v3 because we upload multiple artifacts
-# under the same name something that actions/upload-artifact@v4 does not do.
-with:
-name: ${{ inputs.name }}
-path: ${{ inputs.path }}
-
-- shell: bash
-working-directory: ${{ inputs.path }}
-run: |
-tar -xvf ${{ inputs.archive-name || inputs.name || 'archive' }}.tar.gz
-
-- shell: bash
-working-directory: ${{ inputs.path }}
-run: |
-rm -f ${{ inputs.archive-name || inputs.name || 'archive' }}.tar.gz
@@ -1,19 +1,24 @@
 ---
 name: get-python-version
 description: Setup Relenv

 inputs:
 python-binary:
 required: true
-type: string
 description: The python binary to get the version from

 outputs:
 binary:
+description: The python binary executable
 value: ${{ steps.get-python-version.outputs.binary }}
 version:
+description: The python version
 value: ${{ steps.get-python-version.outputs.version }}
 full-version:
+description: The full python version
 value: ${{ steps.get-python-version.outputs.full-version }}
 version-sha256sum:
+description: The sha256sum of the version
 value: ${{ steps.get-python-version.outputs.version-sha256sum }}
7 .github/actions/setup-actionlint/action.yml (vendored)
@@ -1,21 +1,22 @@
 ---
 name: setup-actionlint
 description: Setup actionlint

 inputs:
 version:
 description: The version of actionlint
-default: 1.6.26
+default: 1.7.7
 cache-seed:
 required: true
-type: string
 description: Seed used to invalidate caches


 runs:
 using: composite
 steps:

 - name: Cache actionlint Binary
-uses: actions/cache@v3.3.1
+uses: ./.github/actions/cache
 with:
 path: /usr/local/bin/actionlint
 key: ${{ inputs.cache-seed }}|${{ runner.os }}|${{ runner.arch }}|actionlint|${{ inputs.version }}
8 .github/actions/setup-pre-commit/action.yml (vendored)
@@ -4,19 +4,13 @@ description: Setup 'pre-commit'

 inputs:
 version:
-type: string
 description: Pre-commit version to install
 required: true
 default: 3.0.3
 cache-seed:
 required: true
-type: string
 description: Seed used to invalidate caches

-env:
-PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
-PIP_EXTRA_INDEX_URL: https://pypi.org/simple

 runs:
 using: composite
@@ -36,7 +30,7 @@
 ${{ steps.pre-commit-virtualenv.outputs.python-executable }} -m pip install pre-commit==${{ inputs.version }}

 - name: Cache Pre-Commit Hooks
-uses: actions/cache@v3.3.1
+uses: ./.github/actions/cache
 id: pre-commit-hooks-cache
 with:
 key: ${{ steps.pre-commit-virtualenv.outputs.cache-key }}|${{ inputs.version }}|${{ hashFiles('.pre-commit-config.yaml') }}
|
@ -5,23 +5,17 @@ description: Setup 'python-tools-scripts'
|
||||||
inputs:
|
inputs:
|
||||||
cache-prefix:
|
cache-prefix:
|
||||||
required: true
|
required: true
|
||||||
type: string
|
|
||||||
description: Seed used to invalidate caches
|
description: Seed used to invalidate caches
|
||||||
cwd:
|
cwd:
|
||||||
type: string
|
|
||||||
description: The directory the salt checkout is located in
|
description: The directory the salt checkout is located in
|
||||||
default: "."
|
default: "."
|
||||||
|
|
||||||
outputs:
|
outputs:
|
||||||
version:
|
version:
|
||||||
|
description: "Return the python-tools-scripts version"
|
||||||
value: ${{ steps.get-version.outputs.version }}
|
value: ${{ steps.get-version.outputs.version }}
|
||||||
|
|
||||||
|
|
||||||
env:
|
|
||||||
PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
|
|
||||||
PIP_EXTRA_INDEX_URL: https://pypi.org/simple
|
|
||||||
|
|
||||||
|
|
||||||
runs:
|
runs:
|
||||||
using: composite
|
using: composite
|
||||||
|
|
||||||
|
@ -50,7 +44,7 @@ runs:
|
||||||
cache-seed: tools|${{ steps.venv-hash.outputs.venv-hash }}
|
cache-seed: tools|${{ steps.venv-hash.outputs.venv-hash }}
|
||||||
|
|
||||||
- name: Restore Python Tools Virtualenvs Cache
|
- name: Restore Python Tools Virtualenvs Cache
|
||||||
uses: actions/cache@v3.3.1
|
uses: ./.github/actions/cache
|
||||||
with:
|
with:
|
||||||
path: ${{ inputs.cwd }}/.tools-venvs
|
path: ${{ inputs.cwd }}/.tools-venvs
|
||||||
key: ${{ inputs.cache-prefix }}|${{ steps.venv-hash.outputs.venv-hash }}
|
key: ${{ inputs.cache-prefix }}|${{ steps.venv-hash.outputs.venv-hash }}
|
||||||
|
@ -60,10 +54,13 @@ runs:
|
||||||
working-directory: ${{ inputs.cwd }}
|
working-directory: ${{ inputs.cwd }}
|
||||||
run: |
|
run: |
|
||||||
PYTHON_EXE=${{ steps.tools-virtualenv.outputs.python-executable }}
|
PYTHON_EXE=${{ steps.tools-virtualenv.outputs.python-executable }}
|
||||||
|
${PYTHON_EXE} -m ensurepip --upgrade
|
||||||
(${PYTHON_EXE} -m pip install --help | grep break-system-packages > /dev/null 2>&1) && exitcode=0 || exitcode=1
|
(${PYTHON_EXE} -m pip install --help | grep break-system-packages > /dev/null 2>&1) && exitcode=0 || exitcode=1
|
||||||
if [ $exitcode -eq 0 ]; then
|
if [ $exitcode -eq 0 ]; then
|
||||||
|
${PYTHON_EXE} -m pip install --break-system-packages --upgrade setuptools
|
||||||
${PYTHON_EXE} -m pip install --break-system-packages -r requirements/static/ci/py${{ steps.get-python-version.outputs.version }}/tools.txt
|
${PYTHON_EXE} -m pip install --break-system-packages -r requirements/static/ci/py${{ steps.get-python-version.outputs.version }}/tools.txt
|
||||||
else
|
else
|
||||||
|
${PYTHON_EXE} -m pip install --upgrade setuptools
|
||||||
${PYTHON_EXE} -m pip install -r requirements/static/ci/py${{ steps.get-python-version.outputs.version }}/tools.txt
|
${PYTHON_EXE} -m pip install -r requirements/static/ci/py${{ steps.get-python-version.outputs.version }}/tools.txt
|
||||||
fi
|
fi
|
||||||
|
|
||||||
|
|
14 .github/actions/setup-relenv/action.yml (vendored)
@@ -1,39 +1,31 @@
 ---
 name: setup-relenv
 description: Setup Relenv

 inputs:
 platform:
 required: true
-type: string
 description: The platform to build
 arch:
 required: true
-type: string
 description: The platform arch to build
 python-version:
 required: true
-type: string
 description: The version of python to build
 cache-seed:
 required: true
-type: string
 description: Seed used to invalidate caches
 version:
 required: false
-type: string
 description: The version of relenv to use
 default: 0.13.2

 outputs:
 version:
+description: The relenv version
 value: ${{ inputs.version }}


-env:
-PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
-PIP_EXTRA_INDEX_URL: https://pypi.org/simple

 runs:
 using: composite
@@ -45,7 +37,7 @@ runs:
 python3 -m pip install relenv==${{ inputs.version }}

 - name: Cache Relenv Data Directory
-uses: actions/cache@v3.3.1
+uses: ./.github/actions/cache
 with:
 path: ${{ github.workspace }}/.relenv
 key: ${{ inputs.cache-seed }}|relenv|${{ inputs.version }}|${{ inputs.python-version }}|${{ inputs.platform }}|${{ inputs.arch }}
13 .github/actions/setup-salt-version/action.yml (vendored)
@@ -1,32 +1,29 @@
 ---
 name: setup-salt-version
 description: Setup Salt Version

 inputs:
 cwd:
-type: string
 default: ""
+description: The current working directory to use
 salt-version:
-type: string
 default: ""
 description: >
 The Salt version to set prior to running tests or building packages.
 If not set, it is discover at run time, like, for example, capturing
 the output of running `python3 salt/version.py`
 validate-version:
-type: boolean
+default: "false"
-default: false
 description: Validate the passed version.
 release:
-type: boolean
+default: "false"
-default: false
 description: This is a release of salt.

 outputs:
 salt-version:
 value: ${{ steps.setup-salt-version.outputs.salt-version }}
 description: The Salt version written to `salt/_version.txt`

-env:
-COLUMNS: 190

 runs:
 using: composite
5 .github/actions/setup-shellcheck/action.yml (vendored)
@@ -1,21 +1,22 @@
 ---
 name: setup-shellcheck
 description: Setup shellcheck

 inputs:
 version:
 description: The version of shellcheck
 default: v0.9.0
 cache-seed:
 required: true
-type: string
 description: Seed used to invalidate caches


 runs:
 using: composite
 steps:

 - name: Cache shellcheck Binary
-uses: actions/cache@v3.3.1
+uses: ./.github/actions/cache
 with:
 path: /usr/local/bin/shellcheck
 key: ${{ inputs.cache-seed }}|${{ runner.os }}|${{ runner.arch }}|shellcheck|${{ inputs.version }}
100 .github/actions/ssh-tunnel/README.md vendored Normal file
@@ -0,0 +1,100 @@
# SSH Tunnel

The ssh-tunnel action will create a reverse tunnel over webrtc to port 22 on the runner.

## Usage

In order to use this action you must have an sdp offer from your local host and an ssh key pair.
Start by creating an sdp offer on your local machine. Provide these values to the ssh-tunnel
action and wait for output from the action with the sdp reply. Provide the reply to the local
rtcforward.py process by pasting it to stdin. If all goes well, the local port on your machine
will be forwarded to the ssh port on the runner.

### Getting an sdp offer

To get an sdp offer, start rtcforward.py on your local machine with the offer command.
You can also specify which port on the local machine will be used for the tunnel.

``` bash
$ python3 .github/actions/ssh-tunnel/rtcforward.py offer --port 5222
```

rtcforward.py will create an offer and display it in your terminal. (This example offer has been truncated.)
After showing the offer, the `rtcforward.py` process will wait for a reply.

```
-- offer --
eyJzZHAiOiAidj0wXHJcbm89LSAzOTQ3Mzg4NjUzIDM5NDczODg2NTMgSU4gSVA0IDAuMC4wLjBcclxu
cz0tXHJcbnQ9MCAwXHJcbmE9Z3JvdXA6QlVORExFIDBcclxuYT1tc2lkLXNlbWFudGljOldNUyAqXHJc
bm09YXBwbGljYXRpb24gMzUyNjkgRFRMUy9TQ1RQIDUwMDBcclxuYz1JTiBJUDQgMTkyLjE2OC4wLjIw
IHVkcCAxNjk0NDk4ODE1IDE4NC4xNzkuMjEwLjE1MiAzNTI2OSB0eXAgc3JmbHggcmFkZHIgMTkyLjE2
OC4wLjIwMSBycG9ydCAzNTI2OVxyXG5hPWNhbmRpZGF0ZTozZWFjMzJiZTZkY2RkMTAwZDcwMTFiNWY0
NTo4Qzo2MDoxMTpFQTo3NzpDMTo5RTo1QTo3QzpDQzowRDowODpFQzo2NDowQToxM1xyXG5hPWZpbmdl
cnByaW50OnNoYS01MTIgNjY6MzI6RUQ6MDA6N0I6QjY6NTQ6NzA6MzE6OTA6M0I6Mjg6Q0I6QTk6REU6
MzQ6QjI6NDY6NzE6NUI6MjM6ODA6Nzg6Njg6RDA6QTA6QTg6MjU6QkY6MDQ6ODY6NUY6OTA6QUY6MUQ6
QjA6QzY6ODA6QUY6OTc6QTI6MkM6NDI6QUU6MkI6Q0Q6Mjk6RUQ6MkI6ODc6NTU6ODg6NDY6QTM6ODk6
OEY6ODk6OTE6QTE6QTI6NDM6NTc6M0E6MjZcclxuYT1zZXR1cDphY3RwYXNzXHJcbiIsICJ0eXBlIjog
Im9mZmVyIn0=
-- end offer --
-- Please enter a message from remote party --
```

### Getting an sdp answer

Provide the offer to the ssh-tunnel action. When the action runs, an answer to the offer will be generated.
In the action output you will see that the offer was received and the reply printed in the output.

```
-- Please enter a message from remote party --
-- Message received --
-- reply --
eyJzZHAiOiAidj0wXHJcbm89LSAzOTQ3Mzg3NDcxIDM5NDczODc0NzEgSU4gSVA0IDAuMC4wLjBcclxu
cz0tXHJcbnQ9MCAwXHJcbmE9Z3JvdXA6QlVORExFIDBcclxuYT1tc2lkLXNlbWFudGljOldNUyAqXHJc
bm09YXBwbGljYXRpb24gNTcwMzkgRFRMUy9TQ1RQIDUwMDBcclxuYz1JTiBJUDQgMTkyLjE2OC42NC4x
MFxyXG5hPW1pZDowXHJcbmE9c2N0cG1hcDo1MDAwIHdlYnJ0Yy1kYXRhY2hhbm5lbCA2NTUzNVxyXG5h
MTc6MEI6RTA6OTA6QUM6RjU6RTk6RUI6Q0E6RUE6NTY6REI6NTA6QTk6REY6NTU6MzY6MkM6REI6OUE6
MDc6Mzc6QTM6NDc6NjlcclxuYT1maW5nZXJwcmludDpzaGEtNTEyIDMyOjRDOjk0OkRDOjNFOkU5OkU3
OjNCOjc5OjI4OjZDOjc5OkFEOkVDOjIzOkJDOjRBOjRBOjE5OjlCOjg5OkE3OkE2OjZBOjAwOjJFOkM5
OkE0OjlEOjAwOjM0OjFFOjRDOkVGOjcwOkY5OkNBOjg0OjlEOjcxOjI5OkVCOkIxOkREOkFEOjg5OjUx
OkZFOjhCOjI3OjFDOjFBOkJEOjUxOjQ2OjE4OjBBOjhFOjVBOjI1OjQzOjQzOjZGOkRBXHJcbmE9c2V0
dXA6YWN0aXZlXHJcbiIsICJ0eXBlIjogImFuc3dlciJ9
-- end reply --
```

## Finalizing the tunnel

Paste the sdp reply from the running action into the running `rtcforward.py` process that created the offer.
After receiving the reply you will see `-- Message received --` and the tunnel will be created.

```
-- offer --
eyJzZHAiOiAidj0wXHJcbm89LSAzOTQ3Mzg4NjUzIDM5NDczODg2NTMgSU4gSVA0IDAuMC4wLjBcclxu
cz0tXHJcbnQ9MCAwXHJcbmE9Z3JvdXA6QlVORExFIDBcclxuYT1tc2lkLXNlbWFudGljOldNUyAqXHJc
bm09YXBwbGljYXRpb24gMzUyNjkgRFRMUy9TQ1RQIDUwMDBcclxuYz1JTiBJUDQgMTkyLjE2OC4wLjIw
IHVkcCAxNjk0NDk4ODE1IDE4NC4xNzkuMjEwLjE1MiAzNTI2OSB0eXAgc3JmbHggcmFkZHIgMTkyLjE2
OC4wLjIwMSBycG9ydCAzNTI2OVxyXG5hPWNhbmRpZGF0ZTozZWFjMzJiZTZkY2RkMTAwZDcwMTFiNWY0
NTo4Qzo2MDoxMTpFQTo3NzpDMTo5RTo1QTo3QzpDQzowRDowODpFQzo2NDowQToxM1xyXG5hPWZpbmdl
cnByaW50OnNoYS01MTIgNjY6MzI6RUQ6MDA6N0I6QjY6NTQ6NzA6MzE6OTA6M0I6Mjg6Q0I6QTk6REU6
MzQ6QjI6NDY6NzE6NUI6MjM6ODA6Nzg6Njg6RDA6QTA6QTg6MjU6QkY6MDQ6ODY6NUY6OTA6QUY6MUQ6
QjA6QzY6ODA6QUY6OTc6QTI6MkM6NDI6QUU6MkI6Q0Q6Mjk6RUQ6MkI6ODc6NTU6ODg6NDY6QTM6ODk6
OEY6ODk6OTE6QTE6QTI6NDM6NTc6M0E6MjZcclxuYT1zZXR1cDphY3RwYXNzXHJcbiIsICJ0eXBlIjog
Im9mZmVyIn0=
-- end offer --
-- Please enter a message from remote party --
eyJzZHAiOiAidj0wXHJcbm89LSAzOTQ3Mzg3NDcxIDM5NDczODc0NzEgSU4gSVA0IDAuMC4wLjBcclxu
cz0tXHJcbnQ9MCAwXHJcbmE9Z3JvdXA6QlVORExFIDBcclxuYT1tc2lkLXNlbWFudGljOldNUyAqXHJc
bm09YXBwbGljYXRpb24gNTcwMzkgRFRMUy9TQ1RQIDUwMDBcclxuYz1JTiBJUDQgMTkyLjE2OC42NC4x
MFxyXG5hPW1pZDowXHJcbmE9c2N0cG1hcDo1MDAwIHdlYnJ0Yy1kYXRhY2hhbm5lbCA2NTUzNVxyXG5h
MTc6MEI6RTA6OTA6QUM6RjU6RTk6RUI6Q0E6RUE6NTY6REI6NTA6QTk6REY6NTU6MzY6MkM6REI6OUE6
MDc6Mzc6QTM6NDc6NjlcclxuYT1maW5nZXJwcmludDpzaGEtNTEyIDMyOjRDOjk0OkRDOjNFOkU5OkU3
OjNCOjc5OjI4OjZDOjc5OkFEOkVDOjIzOkJDOjRBOjRBOjE5OjlCOjg5OkE3OkE2OjZBOjAwOjJFOkM5
OkE0OjlEOjAwOjM0OjFFOjRDOkVGOjcwOkY5OkNBOjg0OjlEOjcxOjI5OkVCOkIxOkREOkFEOjg5OjUx
OkZFOjhCOjI3OjFDOjFBOkJEOjUxOjQ2OjE4OjBBOjhFOjVBOjI1OjQzOjQzOjZGOkRBXHJcbmE9c2V0
dXA6YWN0aXZlXHJcbiIsICJ0eXBlIjogImFuc3dlciJ9
-- Message received --
```

SSH to your local port.

```
ssh -o StrictHostKeyChecking=no -o TCPKeepAlive=no -vv -p 5222 runner@localhost
```
107 .github/actions/ssh-tunnel/action.yml vendored Normal file
@@ -0,0 +1,107 @@
name: ssh-tunnel
description: SSH Reverse Tunnel

inputs:
  public_key:
    required: true
    type: string
    description: Public key to accept for reverse tunnel. Warning, this should not be the public key for the 'private_key' input.
  offer:
    required: true
    type: string
    description: RTC offer
  debug:
    required: false
    type: bool
    default: false
    description: Run sshd with debug enabled.

runs:
  using: composite
  steps:
    - uses: actions/checkout@v4

    - uses: actions/setup-python@v5
      with:
        python-version: '3.10'

    - name: Install ssh
      if: ${{ runner.os == 'Windows' }}
      shell: powershell
      run: |
        python3.exe -m pip install requests
        python3.exe .github/actions/ssh-tunnel/installssh.py

    - name: Start SSH
      shell: bash
      run: |
        if [ "$RUNNER_OS" = "Windows" ]; then
          powershell.exe -command "Start-Service sshd"
        elif [ "$RUNNER_OS" = "macOS" ]; then
          sudo launchctl load -w /System/Library/LaunchDaemons/ssh.plist
        else
          sudo systemctl start ssh
        fi

    - name: Show sshd configuration
      shell: bash
      run: |
        if [ "$RUNNER_OS" = "Linux" ]; then
          cat /etc/ssh/sshd_config
        elif [ "$RUNNER_OS" = "macOS" ]; then
          cat /private/etc/ssh/sshd_config
        else
          cat "C:\ProgramData\ssh\sshd_config"
        fi

    - name: Add ssh public key
      shell: bash
      run: |
        if [ "$RUNNER_OS" = "Linux" ]; then
          mkdir -p /home/runner/.ssh
          chmod 700 /home/runner/.ssh
          touch /home/runner/.ssh/authorized_keys
          echo "${{ inputs.public_key }}" | tee -a /home/runner/.ssh/authorized_keys
        elif [ "$RUNNER_OS" = "macOS" ]; then
          mkdir -p /Users/runner/.ssh
          chmod 700 /Users/runner/.ssh
          touch /Users/runner/.ssh/authorized_keys
          echo "${{ inputs.public_key }}" | tee -a /Users/runner/.ssh/authorized_keys
        else
          echo "${{ inputs.public_key }}" | tee -a "C:\ProgramData\ssh\administrators_authorized_keys"
        fi

    - name: Stop SSHD
      if: ${{ inputs.debug }}
      shell: bash
      run: |
        if [ "${{ inputs.debug }}" = "true" ]; then
          if [ "$RUNNER_OS" = "Windows" ]; then
            powershell.exe -command "Stop-Service sshd"
          elif [ "$RUNNER_OS" = "macOS" ]; then
            sudo launchctl unload /System/Library/LaunchDaemons/ssh.plist
          else
            sudo systemctl stop ssh
          fi
        fi

    - name: Create rtc tunnel
      shell: bash
      run: |
        if [ "${{ inputs.debug }}" = "true" ]; then
          if [ "$RUNNER_OS" = "Windows" ]; then
            ./OpenSSH-Win64/sshd.exe -d &
          elif [ "$RUNNER_OS" = "macOS" ]; then
            sudo /usr/sbin/sshd -d &
          else
            sudo mkdir -p /run/sshd
            sudo chmod 755 /run/sshd
            sudo /usr/sbin/sshd -d &
          fi
        fi
        if [ "$RUNNER_OS" = "Windows" ]; then
          python3 -m pip install aiortc
        else
          python3 -m pip install aiortc uvloop
        fi
        echo '${{ inputs.offer }}' | python .github/actions/ssh-tunnel/rtcforward.py --port 22 answer
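Taken together, the README and the action definition above suggest the following shape for a job that exposes a runner over the tunnel. This is only a minimal sketch: the job name, the secret names, and the idea of carrying the offer and public key in secrets are assumptions, not something the files above prescribe.

```yaml
# Hypothetical debug job wiring up the ssh-tunnel action.
# SSH_TUNNEL_PUBLIC_KEY and SSH_TUNNEL_OFFER are placeholder secret names.
jobs:
  debug-runner:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Open reverse tunnel to this runner
        uses: ./.github/actions/ssh-tunnel
        with:
          public_key: ${{ secrets.SSH_TUNNEL_PUBLIC_KEY }}
          offer: ${{ secrets.SSH_TUNNEL_OFFER }}
          debug: false
```

On the local side you would run `rtcforward.py offer --port 5222` as shown in the README, paste the action's reply back into that process, and then ssh to the forwarded local port.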
44 .github/actions/ssh-tunnel/installssh.py vendored Normal file
@@ -0,0 +1,44 @@
"""
Install and start the OpenSSH server on a Windows runner.
"""

import pathlib
import subprocess
import zipfile

import requests

fwrule = """
New-NetFirewallRule `
  -Name sshd `
  -DisplayName 'OpenSSH SSH Server' `
  -Enabled True `
  -Direction Inbound `
  -Protocol TCP `
  -Action Allow `
  -LocalPort 22 `
  -Program "{}"
"""


def start_ssh_server():
    """
    Download Win32-OpenSSH, install the sshd service and open the firewall port.
    """
    resp = requests.get(
        "https://github.com/PowerShell/Win32-OpenSSH/releases/download/v9.8.1.0p1-Preview/OpenSSH-Win64.zip",
        allow_redirects=True,
    )
    with open("openssh.zip", "wb") as fp:
        fp.write(resp.content)
    with zipfile.ZipFile("openssh.zip") as fp:
        fp.extractall()
    install_script = pathlib.Path("./OpenSSH-Win64/install-sshd.ps1").resolve()
    print(f"{install_script}")
    subprocess.call(["powershell.exe", f"{install_script}"])
    with open("fwrule.ps1", "w") as fp:
        fp.write(fwrule.format(install_script.parent / "sshd.exe"))
    subprocess.call(["powershell.exe", "fwrule.ps1"])


if __name__ == "__main__":
    start_ssh_server()
386 .github/actions/ssh-tunnel/rtcforward.py vendored Normal file
@@ -0,0 +1,386 @@
import argparse
import asyncio
import base64
import concurrent
import io
import json
import logging
import os
import signal
import sys
import textwrap
import time

aiortc = None
try:
    import aiortc.exceptions
    from aiortc import RTCIceCandidate, RTCPeerConnection, RTCSessionDescription
    from aiortc.contrib.signaling import BYE, add_signaling_arguments, create_signaling
except ImportError:
    pass

uvloop = None
try:
    import uvloop
except ImportError:
    pass

if sys.platform == "win32":
    if not aiortc:
        print("Please run 'pip install aiortc' and try again.")
        sys.exit(1)
    asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
else:
    if not aiortc or not uvloop:
        print("Please run 'pip install aiortc uvloop' and try again.")
        sys.exit(1)
    asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())


log = logging.getLogger(__name__)


def object_from_string(message_str):
    message = json.loads(message_str)
    if message["type"] in ["answer", "offer"]:
        return RTCSessionDescription(**message)
    elif message["type"] == "candidate" and message["candidate"]:
        candidate = candidate_from_sdp(message["candidate"].split(":", 1)[1])
        candidate.sdpMid = message["id"]
        candidate.sdpMLineIndex = message["label"]
        return candidate
    elif message["type"] == "bye":
        return BYE


def object_to_string(obj):
    if isinstance(obj, RTCSessionDescription):
        message = {"sdp": obj.sdp, "type": obj.type}
    elif isinstance(obj, RTCIceCandidate):
        message = {
            "candidate": "candidate:" + candidate_to_sdp(obj),
            "id": obj.sdpMid,
            "label": obj.sdpMLineIndex,
            "type": "candidate",
        }
    else:
        assert obj is BYE
        message = {"type": "bye"}
    return json.dumps(message, sort_keys=True)


def print_pastable(data, message="offer"):
    print(f"-- {message} --")
    sys.stdout.flush()
    print(f"{data}")
    sys.stdout.flush()
    print(f"-- end {message} --")
    sys.stdout.flush()


async def read_from_stdin():
    loop = asyncio.get_event_loop()
    line = await loop.run_in_executor(
        None, input, "-- Please enter a message from remote party --\n"
    )
    data = line
    while line:
        try:
            line = await loop.run_in_executor(None, input)
        except EOFError:
            break
        data += line
    print("-- Message received --")
    return data


class Channels:
    def __init__(self, channels=None):
        if channels is None:
            channels = []
        self.channels = channels

    def add(self, channel):
        self.channels.append(channel)

    def close(self):
        for channel in self.channels:
            channel.close()


class ProxyConnection:
    def __init__(self, pc, channel):
        self.pc = pc
        self.channel = channel


class ProxyClient:

    def __init__(self, args, channel):
        self.args = args
        self.channel = channel

    def start(self):
        self.channel.on("message")(self.on_message)

    def on_message(self, message):
        msg = json.loads(message)
        key = msg["key"]
        data = msg["data"]
        log.debug("new connection message %s", key)

        pc = RTCPeerConnection()

        @pc.on("datachannel")
        def on_channel(channel):
            log.info("Sub channel established %s", key)
            asyncio.ensure_future(self.handle_channel(channel))

        async def finalize_connection():
            obj = object_from_string(data)
            if isinstance(obj, RTCSessionDescription):
                await pc.setRemoteDescription(obj)
                if obj.type == "offer":
                    # send answer
                    await pc.setLocalDescription(await pc.createAnswer())
                    msg = {"key": key, "data": object_to_string(pc.localDescription)}
                    self.channel.send(json.dumps(msg))
            elif isinstance(obj, RTCIceCandidate):
                await pc.addIceCandidate(obj)
            elif obj is BYE:
                log.warning("Exiting")

        asyncio.ensure_future(finalize_connection())

    async def handle_channel(self, channel):
        try:
            reader, writer = await asyncio.open_connection("127.0.0.1", self.args.port)
            log.info("opened connection to port %s", self.args.port)

            @channel.on("message")
            def on_message(message):
                log.debug("rtc to socket %r", message)
                writer.write(message)
                asyncio.ensure_future(writer.drain())

            while True:
                data = await reader.read(100)
                if data:
                    log.debug("socket to rtc %r", data)
                    channel.send(data)
        except Exception:
            log.exception("WTF4")


class ProxyServer:

    def __init__(self, args, channel):
        self.args = args
        self.channel = channel
        self.connections = {}

    async def start(self):
        @self.channel.on("message")
        def handle_message(message):
            asyncio.ensure_future(self.handle_message(message))

        self.server = await asyncio.start_server(
            self.new_connection, "127.0.0.1", self.args.port
        )
        log.info("Listening on port %s", self.args.port)
        async with self.server:
            await self.server.serve_forever()

    async def handle_message(self, message):
        msg = json.loads(message)
        key = msg["key"]
        pc = self.connections[key].pc
        channel = self.connections[key].channel
        obj = object_from_string(msg["data"])
        if isinstance(obj, RTCSessionDescription):
            await pc.setRemoteDescription(obj)
            if obj.type == "offer":
                # send answer
                await pc.setLocalDescription(await pc.createAnswer())
                msg = {
                    "key": key,
                    "data": object_to_string(pc.localDescription),
                }
                self.channel.send(json.dumps(msg))
        elif isinstance(obj, RTCIceCandidate):
            await pc.addIceCandidate(obj)
        elif obj is BYE:
            print("Exiting")

    async def new_connection(self, reader, writer):
        try:
            info = writer.get_extra_info("peername")
            key = f"{info[0]}:{info[1]}"
            log.info("Connection from %s", key)
            pc = RTCPeerConnection()
            channel = pc.createDataChannel("{key}")

            async def readerproxy():
                while True:
                    data = await reader.read(100)
                    if data:
                        log.debug("socket to rtc %r", data)
                        try:
                            channel.send(data)
                        except aiortc.exceptions.InvalidStateError:
                            log.error(
                                "Channel was in an invalid state %s, bailing reader coroutine",
                                key,
                            )
                            break

            @channel.on("open")
            def on_open():
                asyncio.ensure_future(readerproxy())

            @channel.on("message")
            def on_message(message):
                log.debug("rtc to socket %r", message)
                writer.write(message)
                asyncio.ensure_future(writer.drain())

            self.connections[key] = ProxyConnection(pc, channel)
            await pc.setLocalDescription(await pc.createOffer())
            msg = {
                "key": key,
                "data": object_to_string(pc.localDescription),
            }
            log.debug("Send new offer")
            self.channel.send(json.dumps(msg, sort_keys=True))
        except Exception:
            log.exception("WTF")


async def run_answer(stop, pc, args):
    """
    Top level offer answer server.
    """

    @pc.on("datachannel")
    def on_datachannel(channel):
        log.info("Channel created")
        client = ProxyClient(args, channel)
        client.start()

    data = await read_from_stdin()
    data = base64.b64decode(data)
    obj = object_from_string(data)
    if isinstance(obj, RTCSessionDescription):
        log.debug("received rtc session description")
        await pc.setRemoteDescription(obj)
        if obj.type == "offer":
            await pc.setLocalDescription(await pc.createAnswer())
            data = object_to_string(pc.localDescription)
            data = base64.b64encode(data.encode())
            data = os.linesep.join(textwrap.wrap(data.decode(), 80))
            print_pastable(data, "reply")
    elif isinstance(obj, RTCIceCandidate):
        log.debug("received rtc ice candidate")
        await pc.addIceCandidate(obj)
    elif obj is BYE:
        print("Exiting")

    while not stop.is_set():
        await asyncio.sleep(0.3)


async def run_offer(stop, pc, args):
    """
    Top level offer server; this will establish a data channel and start a tcp
    server on the port provided. New connections to the server will start the
    creation of a new rtc connection and a new data channel used for proxying
    the client's connection to the remote side.
    """
    control_channel = pc.createDataChannel("main")
    log.info("Created control channel.")

    async def start_server():
        """
        Start the proxy server. The proxy server will create a local port and
        handle creation of additional rtc peer connections for each new client
        to the proxy server port.
        """
        server = ProxyServer(args, control_channel)
        await server.start()

    @control_channel.on("open")
    def on_open():
        """
        Start the proxy server when the control channel is connected.
        """
        asyncio.ensure_future(start_server())

    await pc.setLocalDescription(await pc.createOffer())

    data = object_to_string(pc.localDescription).encode()
    data = base64.b64encode(data)
    data = os.linesep.join(textwrap.wrap(data.decode(), 80))

    print_pastable(data, "offer")

    data = await read_from_stdin()
    data = base64.b64decode(data.encode())
    obj = object_from_string(data)
    if isinstance(obj, RTCSessionDescription):
        log.debug("received rtc session description")
        await pc.setRemoteDescription(obj)
        if obj.type == "offer":
            # send answer
            await pc.setLocalDescription(await pc.createAnswer())
            await signaling.send(pc.localDescription)
    elif isinstance(obj, RTCIceCandidate):
        log.debug("received rtc ice candidate")
        await pc.addIceCandidate(obj)
    elif obj is BYE:
        print("Exiting")

    while not stop.is_set():
        await asyncio.sleep(0.3)


async def signal_handler(stop, pc):
    stop.set()


if __name__ == "__main__":
    if sys.platform == "win32":
        asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
    parser = argparse.ArgumentParser(description="Port proxy")
    parser.add_argument("role", choices=["offer", "answer"])
    parser.add_argument("--port", type=int, default=11224)
    parser.add_argument("--verbose", "-v", action="count", default=None)
    args = parser.parse_args()

    if args.verbose is None:
        logging.basicConfig(level=logging.WARNING)
    elif args.verbose > 1:
        logging.basicConfig(level=logging.DEBUG)
    else:
        logging.basicConfig(level=logging.INFO)
    stop = asyncio.Event()
    pc = RTCPeerConnection()
    if args.role == "offer":
        coro = run_offer(stop, pc, args)
    else:
        coro = run_answer(stop, pc, args)

    # run event loop
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    for signame in ("SIGINT", "SIGTERM"):
        loop.add_signal_handler(
            getattr(signal, signame),
            lambda: asyncio.create_task(signal_handler(stop, pc)),
        )

    try:
        loop.run_until_complete(coro)
    except KeyboardInterrupt:
        pass
    finally:
        loop.run_until_complete(pc.close())
5 .github/actions/upload-artifact/action.yml vendored
@@ -37,6 +37,7 @@ inputs:
      without overriding the existing archives.
    required: false

runs:
  using: composite
  steps:
@@ -45,9 +46,7 @@ runs:
        shopt -s globstar || echo "'globstar' not available"
        tar -cavf ${{ inputs.archive-name || inputs.name || 'archive' }}.tar.gz ${{ inputs.path }}

-   - uses: actions/upload-artifact@v3
-     # This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
-     # under the same name something that actions/upload-artifact@v4 does not do.
+   - uses: actions/upload-artifact@v4
      with:
        name: ${{ inputs.name }}
        path: ${{ inputs.archive-name || inputs.name || 'archive' }}.tar.gz
16 .github/config.yml vendored
@@ -11,18 +11,16 @@ newIssueWelcomeComment: >
  Also, check out some of our community
  resources including:

- - [Community Wiki](https://github.com/saltstack/community/wiki)
  - [Salt’s Contributor Guide](https://docs.saltproject.io/en/master/topics/development/contributing.html)
- - [Join our Community Slack](https://via.vmw.com/salt-slack)
- - [IRC on LiberaChat](https://web.libera.chat/#salt)
+ - [Join our Community Discord](https://discord.com/invite/J7b7EscrAs)
  - [Salt Project YouTube channel](https://www.youtube.com/channel/UCpveTIucFx9ljGelW63-BWg)
- - [Salt Project Twitch channel](https://www.twitch.tv/saltprojectoss)
+ - [Community Wiki](https://github.com/saltstack/community/wiki)

  There are lots of ways to get involved in our community. Every month, there are around a dozen
  opportunities to meet with other contributors and the Salt Core team and collaborate in real
  time. The best way to keep track is by subscribing to the Salt Community Events Calendar.

- If you have additional questions, email us at saltproject@vmware.com. We’re glad
+ If you have additional questions, email us at saltproject.pdl@broadcom.com. We’re glad
  you’ve joined our community and look forward to doing awesome things with
  you!

@@ -37,18 +35,16 @@ newPRWelcomeComment: >
  Also, check out some of our community
  resources including:

- - [Community Wiki](https://github.com/saltstack/community/wiki)
  - [Salt’s Contributor Guide](https://docs.saltproject.io/en/master/topics/development/contributing.html)
- - [Join our Community Slack](https://via.vmw.com/salt-slack)
- - [IRC on LiberaChat](https://web.libera.chat/#salt)
+ - [Join our Community Discord](https://discord.com/invite/J7b7EscrAs)
  - [Salt Project YouTube channel](https://www.youtube.com/channel/UCpveTIucFx9ljGelW63-BWg)
- - [Salt Project Twitch channel](https://www.twitch.tv/saltprojectoss)
+ - [Community Wiki](https://github.com/saltstack/community/wiki)

  There are lots of ways to get involved in our community. Every month, there are around a dozen
  opportunities to meet with other contributors and the Salt Core team and collaborate in real
  time. The best way to keep track is by subscribing to the Salt Community Events Calendar.

- If you have additional questions, email us at saltproject@vmware.com. We’re glad
+ If you have additional questions, email us at saltproject.pdl@broadcom.com. We’re glad
  you’ve joined our community and look forward to doing awesome things with
  you!
34 .github/dependabot.yml vendored Normal file
@@ -0,0 +1,34 @@
version: 2
updates:
  # master branch
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "daily"
    target-branch: master
    labels:
      - "test:full"
    # Don't open PRs for regular version updates
    open-pull-requests-limit: 0

  # 3006.x release branch
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "daily"
    target-branch: 3006.x
    labels:
      - "test:full"
    # Don't open PRs for regular version updates
    open-pull-requests-limit: 0

  # freeze release branch
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "daily"
    target-branch: freeze
    labels:
      - "test:full"
    # Don't open PRs for regular version updates
    open-pull-requests-limit: 0
2 .github/workflows/backport.yml vendored
@@ -20,12 +20,14 @@ jobs:
      github.event.pull_request.merged == true
      && (
        contains(github.event.pull_request.labels.*.name, 'backport:master') ||
+       contains(github.event.pull_request.labels.*.name, 'backport:3007.x') ||
        contains(github.event.pull_request.labels.*.name, 'backport:3006.x') ||
        contains(github.event.pull_request.labels.*.name, 'backport:3005.x')
      )
      && (
        (github.event.action == 'labeled' && (
          contains(github.event.pull_request.labels.*.name, 'backport:master') ||
+         contains(github.event.pull_request.labels.*.name, 'backport:3007.x') ||
          contains(github.event.pull_request.labels.*.name, 'backport:3006.x') ||
          contains(github.event.pull_request.labels.*.name, 'backport:3005.x')
        ))
158 .github/workflows/build-deps-ci-action.yml vendored
@@ -34,14 +34,23 @@ on:
        type: string
        description: The onedir package name to use
        default: salt
+     matrix:
+       required: true
+       type: string
+       description: Json job matrix config
+     linux_arm_runner:
+       required: true
+       type: string
+       description: Json job matrix config

env:
  COLUMNS: 190
  AWS_MAX_ATTEMPTS: "10"
  AWS_RETRY_MODE: "adaptive"
- PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
- PIP_EXTRA_INDEX_URL: https://pypi.org/simple
+ PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
+ PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
+ PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}
  PIP_DISABLE_PIP_VERSION_CHECK: "1"
  RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"

@@ -49,20 +58,20 @@ jobs:

  linux-dependencies:
    name: Linux
+   if: ${{ toJSON(fromJSON(inputs.matrix)['linux']) != '[]' }}
    runs-on:
-     - self-hosted
-     - linux
-     - bastion
+     - ${{ matrix.arch == 'x86_64' && 'ubuntu-24.04' || inputs.linux_arm_runner }}
+   env:
+     USE_S3_CACHE: 'false'
    timeout-minutes: 90
    strategy:
      fail-fast: false
      matrix:
-       include:
-         - distro-slug: centos-7
-           arch: x86_64
-         - distro-slug: centos-7-arm64
-           arch: arm64
+       include: ${{ fromJSON(inputs.matrix)['linux'] }}
    steps:
+     - uses: actions/setup-python@v5
+       with:
+         python-version: '3.10'

      - name: "Throttle Builds"
        shell: bash
@@ -72,9 +81,13 @@ jobs:
      - name: Checkout Source Code
        uses: actions/checkout@v4

+     - uses: actions/setup-python@v5
+       with:
+         python-version: '3.10'

      - name: Cache nox.linux.${{ matrix.arch }}.tar.* for session ${{ inputs.nox-session }}
        id: nox-dependencies-cache
-       uses: actions/cache@v3.3.1
+       uses: ./.github/actions/cache
        with:
          path: nox.linux.${{ matrix.arch }}.tar.*
          key: ${{ inputs.cache-prefix }}|testrun-deps|${{ matrix.arch }}|linux|${{ inputs.nox-session }}|${{ inputs.python-version }}|${{ inputs.nox-archive-hash }}
@@ -97,7 +110,7 @@ jobs:
      - name: PyPi Proxy
        if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
        run: |
-         sed -i '7s;^;--index-url=https://pypi-proxy.saltstack.net/root/local/+simple/ --extra-index-url=https://pypi.org/simple\n;' requirements/static/ci/*/*.txt
+         sed -i '7s;^;--index-url=${{ vars.PIP_INDEX_URL }} --trusted-host ${{ vars.PIP_TRUSTED_HOST }} --extra-index-url=${{ vars.PIP_EXTRA_INDEX_URL }}\n;' requirements/static/ci/*/*.txt

      - name: Setup Python Tools Scripts
        if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
@@ -105,53 +118,34 @@ jobs:
        with:
          cache-prefix: ${{ inputs.cache-prefix }}-build-deps-ci

-     - name: Get Salt Project GitHub Actions Bot Environment
+     - name: Install System Dependencies
        if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
        run: |
-         TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
-         SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
-         echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
+         echo true

-     - name: Start VM
-       if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
-       id: spin-up-vm
-       run: |
-         tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ matrix.distro-slug }}
-
-     - name: List Free Space
+     - name: Install Nox
        if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
        run: |
-         tools --timestamps vm ssh ${{ matrix.distro-slug }} -- df -h || true
-
-     - name: Upload Checkout To VM
-       if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
-       run: |
-         tools --timestamps vm rsync ${{ matrix.distro-slug }}
+         python3 -m pip install 'nox==${{ inputs.nox-version }}'

      - name: Install Dependencies
        if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
+       env:
+         PRINT_TEST_SELECTION: "0"
+         PRINT_SYSTEM_INFO: "0"
+         RELENV_BUILDENV: "1"
        run: |
-         tools --timestamps vm install-dependencies --nox-session=${{ inputs.nox-session }} ${{ matrix.distro-slug }}
+         nox --install-only -e ${{ inputs.nox-session }}

      - name: Cleanup .nox Directory
        if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
        run: |
-         tools --timestamps vm pre-archive-cleanup ${{ matrix.distro-slug }}
+         nox --force-color -e "pre-archive-cleanup(pkg=False)"

      - name: Compress .nox Directory
        if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
        run: |
-         tools --timestamps vm compress-dependencies ${{ matrix.distro-slug }}
+         nox --force-color -e compress-dependencies -- linux ${{ matrix.arch }}

-     - name: Download Compressed .nox Directory
-       if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
-       run: |
-         tools --timestamps vm download-dependencies ${{ matrix.distro-slug }}
-
-     - name: Destroy VM
-       if: always() && steps.nox-dependencies-cache.outputs.cache-hit != 'true'
-       run: |
-         tools --timestamps vm destroy --no-wait ${{ matrix.distro-slug }}

      - name: Upload Nox Requirements Tarball
        uses: actions/upload-artifact@v4
@@ -161,16 +155,15 @@ jobs:

  macos-dependencies:
    name: MacOS
-   runs-on: ${{ matrix.distro-slug }}
+   runs-on: ${{ matrix.arch == 'x86_64' && 'macos-13' || 'macos-14' }}
+   if: ${{ toJSON(fromJSON(inputs.matrix)['macos']) != '[]' }}
    timeout-minutes: 90
    strategy:
      fail-fast: false
      matrix:
-       include:
-         - distro-slug: macos-12
-           arch: x86_64
-         - distro-slug: macos-13-xlarge
-           arch: arm64
+       include: ${{ fromJSON(inputs.matrix)['macos'] }}
+   env:
+     PIP_INDEX_URL: https://pypi.org/simple
    steps:

      - name: "Throttle Builds"
@@ -183,7 +176,7 @@ jobs:

      - name: Cache nox.macos.${{ matrix.arch }}.tar.* for session ${{ inputs.nox-session }}
        id: nox-dependencies-cache
-       uses: actions/cache@v3.3.1
+       uses: ./.github/actions/cache
        with:
          path: nox.macos.${{ matrix.arch }}.tar.*
          key: ${{ inputs.cache-prefix }}|testrun-deps|${{ matrix.arch }}|macos|${{ inputs.nox-session }}|${{ inputs.python-version }}|${{ inputs.nox-archive-hash }}
@@ -247,19 +240,19 @@ jobs:
          name: nox-macos-${{ matrix.arch }}-${{ inputs.nox-session }}
          path: nox.macos.${{ matrix.arch }}.tar.*


  windows-dependencies:
    name: Windows
-   runs-on:
-     - self-hosted
-     - linux
-     - bastion
+   runs-on: windows-latest
+   if: ${{ toJSON(fromJSON(inputs.matrix)['windows']) != '[]' }}
+   env:
+     USE_S3_CACHE: 'false'
+     GITHUB_WORKSPACE: 'C:\Windows\Temp\testing'
    timeout-minutes: 90
    strategy:
      fail-fast: false
      matrix:
-       include:
-         - distro-slug: windows-2022
-           arch: amd64
+       include: ${{ fromJSON(inputs.matrix)['windows'] }}
    steps:

      - name: "Throttle Builds"
@@ -267,12 +260,16 @@ jobs:
        run: |
          t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"

+     - name: "Show environment"
+       run: |
+         env

      - name: Checkout Source Code
        uses: actions/checkout@v4

      - name: Cache nox.windows.${{ matrix.arch }}.tar.* for session ${{ inputs.nox-session }}
        id: nox-dependencies-cache
-       uses: actions/cache@v3.3.1
+       uses: ./.github/actions/cache
        with:
          path: nox.windows.${{ matrix.arch }}.tar.*
          key: ${{ inputs.cache-prefix }}|testrun-deps|${{ matrix.arch }}|windows|${{ inputs.nox-session }}|${{ inputs.python-version }}|${{ inputs.nox-archive-hash }}
@@ -292,10 +289,11 @@ jobs:
          cd artifacts
          tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-windows-${{ matrix.arch }}.tar.xz

-     - name: PyPi Proxy
+     - name: Set up Python ${{ inputs.python-version }}
        if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
-       run: |
-         sed -i '7s;^;--index-url=https://pypi-proxy.saltstack.net/root/local/+simple/ --extra-index-url=https://pypi.org/simple\n;' requirements/static/ci/*/*.txt
+       uses: actions/setup-python@v5
+       with:
+         python-version: "${{ inputs.python-version }}"

      - name: Setup Python Tools Scripts
        if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
@@ -303,53 +301,33 @@ jobs:
        with:
          cache-prefix: ${{ inputs.cache-prefix }}-build-deps-ci

-     - name: Get Salt Project GitHub Actions Bot Environment
+     - name: Install System Dependencies
        if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
        run: |
-         TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
-         SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
-         echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
+         echo true

-     - name: Start VM
-       if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
-       id: spin-up-vm
-       run: |
-         tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ matrix.distro-slug }}
-
-     - name: List Free Space
+     - name: Install Nox
        if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
        run: |
-         tools --timestamps vm ssh ${{ matrix.distro-slug }} -- df -h || true
-
-     - name: Upload Checkout To VM
-       if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
-       run: |
-         tools --timestamps vm rsync ${{ matrix.distro-slug }}
+         python3 -m pip install 'nox==${{ inputs.nox-version }}'

      - name: Install Dependencies
        if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
+       env:
+         PRINT_TEST_SELECTION: "0"
+         PRINT_SYSTEM_INFO: "0"
        run: |
-         tools --timestamps vm install-dependencies --nox-session=${{ inputs.nox-session }} ${{ matrix.distro-slug }}
+         nox --install-only -e ${{ inputs.nox-session }}

      - name: Cleanup .nox Directory
        if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
        run: |
-         tools --timestamps vm pre-archive-cleanup ${{ matrix.distro-slug }}
+         nox --force-color -e "pre-archive-cleanup(pkg=False)"

      - name: Compress .nox Directory
        if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
        run: |
-         tools --timestamps vm compress-dependencies ${{ matrix.distro-slug }}
+         nox --force-color -e compress-dependencies -- windows ${{ matrix.arch }}

-     - name: Download Compressed .nox Directory
-       if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
-       run: |
-         tools --timestamps vm download-dependencies ${{ matrix.distro-slug }}
-
-     - name: Destroy VM
-       if: always() && steps.nox-dependencies-cache.outputs.cache-hit != 'true'
-       run: |
-         tools --timestamps vm destroy --no-wait ${{ matrix.distro-slug }}

      - name: Upload Nox Requirements Tarball
        uses: actions/upload-artifact@v4
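For reference, the new `matrix` input consumed above is a JSON document with one list of `include` entries per platform: the workflow slices it with `fromJSON(inputs.matrix)['linux']` (and `['macos']`, `['windows']`) and reads `matrix.arch` in `runs-on`. A minimal sketch of what a caller might pass follows; the values are placeholders, and any keys carried by each entry beyond `arch` are an assumption rather than something this diff specifies.

```yaml
# Hypothetical caller snippet; input values are placeholders.
jobs:
  build-deps-ci:
    uses: ./.github/workflows/build-deps-ci-action.yml
    with:
      nox-session: ci-test-onedir
      python-version: "3.10"
      cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}
      nox-archive-hash: ${{ needs.prepare-workflow.outputs.nox-archive-hash }}
      linux_arm_runner: ${{ needs.prepare-workflow.outputs.linux-arm-runner }}
      matrix: |
        {
          "linux":   [{"arch": "x86_64"}, {"arch": "arm64"}],
          "macos":   [{"arch": "x86_64"}, {"arch": "arm64"}],
          "windows": [{"arch": "amd64"}]
        }
```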
182 .github/workflows/build-deps-onedir.yml vendored
@@ -1,182 +0,0 @@
---
name: Build Packaging Dependencies Onedir

on:
  workflow_call:
    inputs:
      salt-version:
        type: string
        required: true
        description: The Salt version to set prior to building packages.
      github-hosted-runners:
        type: boolean
        required: true
      self-hosted-runners:
        type: boolean
        required: true
      cache-seed:
        required: true
        type: string
        description: Seed used to invalidate caches
      relenv-version:
        required: true
        type: string
        description: The version of relenv to use
      python-version:
        required: true
        type: string
        description: The version of python to use with relenv

env:
  RELENV_DATA: "${{ github.workspace }}/.relenv"
  COLUMNS: 190
  AWS_MAX_ATTEMPTS: "10"
  AWS_RETRY_MODE: "adaptive"
  PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
  PIP_EXTRA_INDEX_URL: https://pypi.org/simple
  PIP_DISABLE_PIP_VERSION_CHECK: "1"

jobs:

  build-deps-linux:
    name: Linux
    if: ${{ inputs.self-hosted-runners }}
    strategy:
      fail-fast: false
      matrix:
        arch:
          - x86_64
          - arm64
    runs-on:
      - self-hosted
      - linux
      - ${{ matrix.arch }}
    steps:

      - name: "Throttle Builds"
        shell: bash
        run: |
          t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"

      - uses: actions/checkout@v4

      - name: Setup Python Tools Scripts
        uses: ./.github/actions/setup-python-tools-scripts
        with:
          cache-prefix: ${{ inputs.cache-seed }}-build-deps-linux-${{ matrix.arch }}

      - name: Setup Relenv
        id: setup-relenv
        uses: ./.github/actions/setup-relenv
        with:
          platform: linux
          arch: ${{ matrix.arch == 'arm64' && 'aarch64' || matrix.arch }}
          version: ${{ inputs.relenv-version }}
          cache-seed: ${{ inputs.cache-seed }}
          python-version: ${{ inputs.python-version }}

      - name: Install Salt Packaging Dependencies into Relenv Onedir
        uses: ./.github/actions/build-onedir-deps
        with:
          platform: linux
          arch: ${{ matrix.arch }}
          python-version: "${{ inputs.python-version }}"
          cache-prefix: ${{ inputs.cache-seed }}|relenv|${{ steps.setup-relenv.outputs.version }}

  build-deps-macos:
    name: macOS
    if: ${{ inputs.github-hosted-runners }}
    strategy:
      fail-fast: false
      max-parallel: 2
      matrix:
        arch:
          - x86_64
          - arm64
    runs-on:
      - ${{ matrix.arch == 'arm64' && 'macos-13-xlarge' || 'macos-12' }}

    steps:

      - name: "Throttle Builds"
        shell: bash
        run: |
          t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"

      - uses: actions/checkout@v4

      - name: Set up Python 3.10
        uses: actions/setup-python@v5
        with:
          python-version: "3.10"

      - name: Setup Python Tools Scripts
        uses: ./.github/actions/setup-python-tools-scripts
        with:
          cache-prefix: ${{ inputs.cache-seed }}-build-deps-macos

      - name: Setup Relenv
        id: setup-relenv
        uses: ./.github/actions/setup-relenv
        with:
          platform: macos
          arch: ${{ matrix.arch }}
          version: ${{ inputs.relenv-version }}
          cache-seed: ${{ inputs.cache-seed }}
          python-version: ${{ inputs.python-version }}

      - name: Install Salt Packaging Dependencies into Relenv Onedir
        uses: ./.github/actions/build-onedir-deps
        with:
          platform: macos
          arch: ${{ matrix.arch }}
          python-version: "${{ inputs.python-version }}"
          cache-prefix: ${{ inputs.cache-seed }}|relenv|${{ steps.setup-relenv.outputs.version }}

  build-deps-windows:
    name: Windows
    if: ${{ inputs.github-hosted-runners }}
    strategy:
      fail-fast: false
      max-parallel: 2
      matrix:
        arch:
          - x86
          - amd64
    runs-on: windows-latest
    steps:

      - name: "Throttle Builds"
        shell: bash
        run: |
          t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"

      - uses: actions/checkout@v4

      - name: Set up Python 3.10
        uses: actions/setup-python@v5
        with:
          python-version: "3.10"

      - name: Setup Python Tools Scripts
        uses: ./.github/actions/setup-python-tools-scripts
        with:
          cache-prefix: ${{ inputs.cache-seed }}-build-deps-windows-${{ matrix.arch }}

      - name: Setup Relenv
        id: setup-relenv
        uses: ./.github/actions/setup-relenv
        with:
          platform: windows
          arch: ${{ matrix.arch }}
          version: ${{ inputs.relenv-version }}
          cache-seed: ${{ inputs.cache-seed }}
          python-version: ${{ inputs.python-version }}

      - name: Install Salt Packaging Dependencies into Relenv Onedir
        uses: ./.github/actions/build-onedir-deps
        with:
          platform: windows
          arch: ${{ matrix.arch }}
          python-version: "${{ inputs.python-version }}"
          cache-prefix: ${{ inputs.cache-seed }}|relenv|${{ steps.setup-relenv.outputs.version }}
17 .github/workflows/build-docs.yml vendored
@@ -17,27 +17,28 @@ env:
  COLUMNS: 190
  AWS_MAX_ATTEMPTS: "10"
  AWS_RETRY_MODE: "adaptive"
- PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
- PIP_EXTRA_INDEX_URL: https://pypi.org/simple
+ PIP_INDEX_URL: https://pypi.org/simple
  PIP_DISABLE_PIP_VERSION_CHECK: "1"

jobs:
  build:
    name: Build
    runs-on:
-     - ubuntu-latest
+     - ubuntu-22.04
    strategy:
      fail-fast: false
      matrix:
        docs-output:
-         - linkcheck
-         - spellcheck
+         # XXX re-enable lintcheck and spellcheck then fix the errors
+         # - linkcheck
+         # - spellcheck
          - html
-         - epub
-         # - pdf

    steps:
      - uses: actions/checkout@v4
+     - uses: actions/setup-python@v5
+       with:
+         python-version: '3.10'

      - name: Download Release Patch
        if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
@@ -69,7 +70,7 @@ jobs:

      - name: Cache Python Tools Docs Virtualenv
        id: tools-venvs-dependencies-cache
-       uses: actions/cache@v3.3.1
+       uses: ./.github/actions/cache
        with:
          path: .tools-venvs/docs
          key: ${{ inputs.cache-seed }}|${{ github.workflow }}|${{ github.job }}|tools-venvs|${{ steps.python-tools-scripts.outputs.version }}|docs|${{ steps.get-python-version.outputs.version }}|${{ hashFiles('requirements/**/docs.txt') }}
.github/workflows/build-packages.yml (vendored): 393 changed lines

@@ -36,29 +36,217 @@ on:
         required: true
         type: string
         description: Seed used to invalidate caches
+      matrix:
+        required: true
+        type: string
+        description: Json job matrix config
+      linux_arm_runner:
+        required: true
+        type: string
+        description: Json job matrix config

 env:
   COLUMNS: 190
-  PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
+  PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
-  PIP_EXTRA_INDEX_URL: https://pypi.org/simple
+  PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
+  PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}
   PIP_DISABLE_PIP_VERSION_CHECK: "1"

 jobs:

+  build-deb-packages:
+    name: DEB
+    if: ${{ toJSON(fromJSON(inputs.matrix)['linux']) != '[]' }}
+    runs-on:
+      - ${{ matrix.arch == 'x86_64' && 'ubuntu-24.04' || inputs.linux_arm_runner }}
+    strategy:
+      fail-fast: false
+      matrix:
+        include: ${{ fromJSON(inputs.matrix)['linux'] }}
+
+    container:
+      image: ghcr.io/saltstack/salt-ci-containers/packaging:debian-12
+
+    steps:
+      # Checkout here so we can easily use custom actions
+      - uses: actions/checkout@v4
+
+      # We need a more recent rustc
+      - name: Install a more recent `rustc`
+        if: ${{ inputs.source == 'src' }}
+        uses: actions-rust-lang/setup-rust-toolchain@v1
+
+      - name: Set rust environment variables
+        if: ${{ inputs.source == 'src' }}
+        run: |
+          CARGO_HOME=${CARGO_HOME:-${HOME}/.cargo}
+          export CARGO_HOME
+          echo "CARGO_HOME=${CARGO_HOME}" | tee -a "${GITHUB_ENV}"
+          echo "${CARGO_HOME}/bin" | tee -a "${GITHUB_PATH}"
+
+      # Checkout here for the build process
+      - name: Checkout in build directory
+        uses: actions/checkout@v4
+        with:
+          path:
+            pkgs/checkout/
+
+      - name: Download Onedir Tarball as an Artifact
+        uses: actions/download-artifact@v4
+        with:
+          name: salt-${{ inputs.salt-version }}-onedir-linux-${{ matrix.arch }}.tar.xz
+          path: pkgs/checkout/artifacts/
+
+      - name: Download Release Patch
+        if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
+        uses: actions/download-artifact@v4
+        with:
+          name: salt-${{ inputs.salt-version }}.patch
+          path: pkgs/checkout/
+
+      - name: Setup Python Tools Scripts
+        uses: ./.github/actions/setup-python-tools-scripts
+        with:
+          cwd: pkgs/checkout/
+          cache-prefix: ${{ inputs.cache-prefix }}
+
+      - name: Setup Salt Version
+        id: setup-salt-version
+        uses: ./.github/actions/setup-salt-version
+        with:
+          salt-version: "${{ inputs.salt-version }}"
+          cwd: pkgs/checkout/
+
+      - name: Configure Git
+        if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
+        working-directory: pkgs/checkout/
+        run: |
+          tools pkg configure-git
+
+      - name: Apply release patch
+        if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
+        working-directory: pkgs/checkout/
+        run: |
+          tools pkg apply-release-patch salt-${{ inputs.salt-version }}.patch --delete
+
+      - name: Build Deb
+        working-directory: pkgs/checkout/
+        run: |
+          tools pkg build deb --relenv-version=${{ inputs.relenv-version }} --python-version=${{ inputs.python-version }} ${{
+            inputs.source == 'onedir' &&
+            format('--onedir=salt-{0}-onedir-linux-{1}.tar.xz', inputs.salt-version, matrix.arch)
+            ||
+            format('--arch={0}', matrix.arch)
+          }}
+
+      - name: Cleanup
+        run: |
+          rm -rf pkgs/checkout/
+
+      - name: Set Artifact Name
+        id: set-artifact-name
+        run: |
+          if [ "${{ inputs.source }}" != "src" ]; then
+            echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-deb" >> "$GITHUB_OUTPUT"
+          else
+            echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-deb-from-src" >> "$GITHUB_OUTPUT"
+          fi
+
+      - name: Upload DEBs
+        uses: actions/upload-artifact@v4
+        with:
+          name: ${{ steps.set-artifact-name.outputs.artifact-name }}
+          path: ${{ github.workspace }}/pkgs/*
+          retention-days: 7
+          if-no-files-found: error
+
+  build-rpm-packages:
+    name: RPM
+    if: ${{ toJSON(fromJSON(inputs.matrix)['linux']) != '[]' }}
+    runs-on:
+      - ${{ matrix.arch == 'x86_64' && 'ubuntu-24.04' || inputs.linux_arm_runner }}
+    strategy:
+      fail-fast: false
+      matrix:
+        include: ${{ fromJSON(inputs.matrix)['linux'] }}
+
+    container:
+      image: ghcr.io/saltstack/salt-ci-containers/packaging:rockylinux-9
+
+    steps:
+      - uses: actions/checkout@v4
+
+      - name: Download Onedir Tarball as an Artifact
+        uses: actions/download-artifact@v4
+        with:
+          name: salt-${{ inputs.salt-version }}-onedir-linux-${{ matrix.arch }}.tar.xz
+          path: artifacts/
+
+      - name: Download Release Patch
+        if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
+        uses: actions/download-artifact@v4
+        with:
+          name: salt-${{ inputs.salt-version }}.patch
+
+      - name: Setup Python Tools Scripts
+        uses: ./.github/actions/setup-python-tools-scripts
+        with:
+          cache-prefix: ${{ inputs.cache-prefix }}
+
+      - name: Setup Salt Version
+        id: setup-salt-version
+        uses: ./.github/actions/setup-salt-version
+        with:
+          salt-version: "${{ inputs.salt-version }}"
+
+      - name: Configure Git
+        if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
+        run: |
+          tools pkg configure-git
+
+      - name: Apply release patch
+        if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
+        run: |
+          tools pkg apply-release-patch salt-${{ inputs.salt-version }}.patch --delete
+
+      - name: Build RPM
+        run: |
+          tools pkg build rpm --relenv-version=${{ inputs.relenv-version }} --python-version=${{ inputs.python-version }} ${{
+            inputs.source == 'onedir' &&
+            format('--onedir=salt-{0}-onedir-linux-{1}.tar.xz', inputs.salt-version, matrix.arch)
+            ||
+            format('--arch={0}', matrix.arch)
+          }}
+
+      - name: Set Artifact Name
+        id: set-artifact-name
+        run: |
+          if [ "${{ inputs.source }}" != "src" ]; then
+            echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-rpm" >> "$GITHUB_OUTPUT"
+          else
+            echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-rpm-from-src" >> "$GITHUB_OUTPUT"
+          fi
+
+      - name: Upload RPMs
+        uses: actions/upload-artifact@v4
+        with:
+          name: ${{ steps.set-artifact-name.outputs.artifact-name }}
+          path: ~/rpmbuild/RPMS/${{ matrix.arch == 'arm64' && 'aarch64' || matrix.arch }}/*.rpm
+          retention-days: 7
+          if-no-files-found: error

   build-macos-pkgs:
     name: macOS
+    if: ${{ toJSON(fromJSON(inputs.matrix)['macos']) != '[]' }}
     environment: ${{ inputs.environment }}
     strategy:
       fail-fast: false
       matrix:
-        arch:
+        include: ${{ fromJSON(inputs.matrix)['macos'] }}
-          - x86_64
+    env:
-          - arm64
+      PIP_INDEX_URL: https://pypi.org/simple
-        source:
-          - ${{ inputs.source }}

     runs-on:
-      - ${{ matrix.arch == 'arm64' && 'macos-13-xlarge' || 'macos-12' }}
+      - ${{ matrix.arch == 'arm64' && 'macos-14' || 'macos-13' }}

     steps:
       - name: Check Package Signing Enabled

@@ -162,197 +350,15 @@ jobs:
           retention-days: 7
           if-no-files-found: error

-  build-deb-packages:
-    name: DEB
-    runs-on:
-      - self-hosted
-      - linux
-      - ${{ matrix.arch }}
-    strategy:
-      fail-fast: false
-      matrix:
-        arch:
-          - x86_64
-          - arm64
-        source:
-          - ${{ inputs.source }}
-
-    container:
-      image: ghcr.io/saltstack/salt-ci-containers/packaging:debian-12
-
-    steps:
-      # Checkout here so we can easily use custom actions
-      - uses: actions/checkout@v4
-
-      # Checkout here for the build process
-      - name: Checkout in build directory
-        uses: actions/checkout@v4
-        with:
-          path:
-            pkgs/checkout/
-
-      - name: Download Onedir Tarball as an Artifact
-        uses: actions/download-artifact@v4
-        with:
-          name: salt-${{ inputs.salt-version }}-onedir-linux-${{ matrix.arch }}.tar.xz
-          path: pkgs/checkout/artifacts/
-
-      - name: Download Release Patch
-        if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
-        uses: actions/download-artifact@v4
-        with:
-          name: salt-${{ inputs.salt-version }}.patch
-          path: pkgs/checkout/
-
-      - name: Setup Python Tools Scripts
-        uses: ./.github/actions/setup-python-tools-scripts
-        with:
-          cwd: pkgs/checkout/
-          cache-prefix: ${{ inputs.cache-prefix }}
-
-      - name: Setup Salt Version
-        id: setup-salt-version
-        uses: ./.github/actions/setup-salt-version
-        with:
-          salt-version: "${{ inputs.salt-version }}"
-          cwd: pkgs/checkout/
-
-      - name: Configure Git
-        if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
-        working-directory: pkgs/checkout/
-        run: |
-          tools pkg configure-git
-
-      - name: Apply release patch
-        if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
-        working-directory: pkgs/checkout/
-        run: |
-          tools pkg apply-release-patch salt-${{ inputs.salt-version }}.patch --delete
-
-      - name: Build Deb
-        working-directory: pkgs/checkout/
-        run: |
-          tools pkg build deb --relenv-version=${{ inputs.relenv-version }} --python-version=${{ inputs.python-version }} ${{
-            inputs.source == 'onedir' &&
-            format('--onedir=salt-{0}-onedir-linux-{1}.tar.xz', inputs.salt-version, matrix.arch)
-            ||
-            format('--arch={0}', matrix.arch)
-          }}
-
-      - name: Cleanup
-        run: |
-          rm -rf pkgs/checkout/
-
-      - name: Set Artifact Name
-        id: set-artifact-name
-        run: |
-          if [ "${{ inputs.source }}" != "src" ]; then
-            echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-deb" >> "$GITHUB_OUTPUT"
-          else
-            echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-deb-from-src" >> "$GITHUB_OUTPUT"
-          fi
-
-      - name: Upload DEBs
-        uses: actions/upload-artifact@v4
-        with:
-          name: ${{ steps.set-artifact-name.outputs.artifact-name }}
-          path: ${{ github.workspace }}/pkgs/*
-          retention-days: 7
-          if-no-files-found: error
-
-  build-rpm-packages:
-    name: RPM
-    runs-on:
-      - self-hosted
-      - linux
-      - ${{ matrix.arch }}
-    strategy:
-      fail-fast: false
-      matrix:
-        arch:
-          - x86_64
-          - arm64
-        source:
-          - ${{ inputs.source }}
-
-    container:
-      image: ghcr.io/saltstack/salt-ci-containers/packaging:centosstream-9
-
-    steps:
-      - uses: actions/checkout@v4
-
-      - name: Download Onedir Tarball as an Artifact
-        uses: actions/download-artifact@v4
-        with:
-          name: salt-${{ inputs.salt-version }}-onedir-linux-${{ matrix.arch }}.tar.xz
-          path: artifacts/
-
-      - name: Download Release Patch
-        if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
-        uses: actions/download-artifact@v4
-        with:
-          name: salt-${{ inputs.salt-version }}.patch
-
-      - name: Setup Python Tools Scripts
-        uses: ./.github/actions/setup-python-tools-scripts
-        with:
-          cache-prefix: ${{ inputs.cache-prefix }}
-
-      - name: Setup Salt Version
-        id: setup-salt-version
-        uses: ./.github/actions/setup-salt-version
-        with:
-          salt-version: "${{ inputs.salt-version }}"
-
-      - name: Configure Git
-        if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
-        run: |
-          tools pkg configure-git
-
-      - name: Apply release patch
-        if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
-        run: |
-          tools pkg apply-release-patch salt-${{ inputs.salt-version }}.patch --delete
-
-      - name: Build RPM
-        run: |
-          tools pkg build rpm --relenv-version=${{ inputs.relenv-version }} --python-version=${{ inputs.python-version }} ${{
-            inputs.source == 'onedir' &&
-            format('--onedir=salt-{0}-onedir-linux-{1}.tar.xz', inputs.salt-version, matrix.arch)
-            ||
-            format('--arch={0}', matrix.arch)
-          }}
-
-      - name: Set Artifact Name
-        id: set-artifact-name
-        run: |
-          if [ "${{ inputs.source }}" != "src" ]; then
-            echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-rpm" >> "$GITHUB_OUTPUT"
-          else
-            echo "artifact-name=salt-${{ inputs.salt-version }}-${{ matrix.arch }}-rpm-from-src" >> "$GITHUB_OUTPUT"
-          fi
-
-      - name: Upload RPMs
-        uses: actions/upload-artifact@v4
-        with:
-          name: ${{ steps.set-artifact-name.outputs.artifact-name }}
-          path: ~/rpmbuild/RPMS/${{ matrix.arch == 'arm64' && 'aarch64' || matrix.arch }}/*.rpm
-          retention-days: 7
-          if-no-files-found: error

   build-windows-pkgs:
     name: Windows
+    if: ${{ toJSON(fromJSON(inputs.matrix)['windows']) != '[]' }}
     environment: ${{ inputs.environment }}
     strategy:
       fail-fast: false
       max-parallel: 2
       matrix:
-        arch:
+        include: ${{ fromJSON(inputs.matrix)['windows'] }}
-          - x86
-          - amd64
-        source:
-          - ${{ inputs.source }}

     runs-on:
       - windows-latest
     env:

@@ -362,6 +368,7 @@ jobs:
       SM_CLIENT_CERT_PASSWORD: "${{ secrets.WIN_SIGN_CERT_PASSWORD }}"
       SM_CLIENT_CERT_FILE_B64: "${{ secrets.WIN_SIGN_CERT_FILE_B64 }}"
       WIN_SIGN_CERT_SHA1_HASH: "${{ secrets.WIN_SIGN_CERT_SHA1_HASH }}"
+      PIP_INDEX_URL: https://pypi.org/simple

     steps:
       - name: Check Package Signing Enabled
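The new packaging jobs above gate on expressions such as toJSON(fromJSON(inputs.matrix)['linux']) != '[]' and feed fromJSON(inputs.matrix)['linux'] directly into matrix.include, but the diff itself never shows what that JSON input looks like. A minimal sketch of the shape those expressions assume is below; the keys and arch values are illustrative assumptions only, not taken from this diff (the real config is generated elsewhere, e.g. by the tools ci workflow-config step that appears later in release.yml).

    # Hypothetical matrix input, per-platform lists of matrix entries (illustration only):
    # fromJSON(inputs.matrix)['linux'] would then expand to the two Linux entries,
    # and an empty list for a platform disables that platform's jobs via the if: guard.
    matrix:
      linux:
        - { arch: x86_64 }
        - { arch: arm64 }
      macos:
        - { arch: x86_64 }
        - { arch: arm64 }
      windows:
        - { arch: amd64 }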
.github/workflows/build-salt-onedir.yml (vendored): 109 changed lines

@@ -8,12 +8,6 @@ on:
         type: string
         required: true
         description: The Salt version to set prior to building packages.
-      github-hosted-runners:
-        type: boolean
-        required: true
-      self-hosted-runners:
-        type: boolean
-        required: true
       cache-seed:
         required: true
         type: string

@@ -26,31 +20,39 @@ on:
         required: true
         type: string
         description: The version of python to use with relenv
+      matrix:
+        type: string
+        required: true
+        description: Json config for build matrix
+      linux_arm_runner:
+        required: true
+        type: string
+        description: Json job matrix config

 env:
   RELENV_DATA: "${{ github.workspace }}/.relenv"
   COLUMNS: 190
   AWS_MAX_ATTEMPTS: "10"
   AWS_RETRY_MODE: "adaptive"
-  PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
+  PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
-  PIP_EXTRA_INDEX_URL: https://pypi.org/simple
+  PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
+  PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}
   PIP_DISABLE_PIP_VERSION_CHECK: "1"

 jobs:

   build-salt-linux:
     name: Linux
-    if: ${{ inputs.self-hosted-runners }}
+    if: ${{ toJSON(fromJSON(inputs.matrix)['linux']) != '[]' }}
+    env:
+      USE_S3_CACHE: 'false'
+    runs-on:
+      - ${{ matrix.arch == 'x86_64' && 'ubuntu-24.04' || inputs.linux_arm_runner }}
     strategy:
       fail-fast: false
       matrix:
-        arch:
+        include: ${{ fromJSON(inputs.matrix)['linux'] }}
-          - x86_64
-          - arm64
-    runs-on:
-      - self-hosted
-      - linux
-      - ${{ matrix.arch }}
     steps:

       - name: "Throttle Builds"

@@ -60,10 +62,14 @@ jobs:

       - uses: actions/checkout@v4

+      - uses: actions/setup-python@v5
+        with:
+          python-version: '3.10'

       - name: Setup Python Tools Scripts
         uses: ./.github/actions/setup-python-tools-scripts
         with:
-          cache-prefix: ${{ inputs.cache-seed }}-build-salt-onedir-windows
+          cache-prefix: ${{ inputs.cache-seed }}|build-salt-onedir|linux

       - name: Setup Salt Version
         id: setup-salt-version

@@ -92,18 +98,22 @@ jobs:

   build-salt-macos:
     name: macOS
-    if: ${{ inputs.github-hosted-runners }}
+    if: ${{ toJSON(fromJSON(inputs.matrix)['macos']) != '[]' }}
     strategy:
       fail-fast: false
       max-parallel: 2
       matrix:
-        arch:
+        include: ${{ fromJSON(inputs.matrix)['macos'] }}
-          - x86_64
-          - arm64
     runs-on:
-      - ${{ matrix.arch == 'arm64' && 'macos-13-xlarge' || 'macos-12' }}
+      - ${{ matrix.arch == 'arm64' && 'macos-14' || 'macos-13' }}
+    env:
+      PIP_INDEX_URL: https://pypi.org/simple
+      USE_S3_CACHE: 'false'
     steps:
+      - name: "Check cores"
+        shell: bash
+        run: sysctl -n hw.ncpu

       - name: "Throttle Builds"
         shell: bash
         run: |

@@ -115,6 +125,17 @@ jobs:
         with:
           python-version: "3.10"

+      - name: Setup Python Tools Scripts
+        uses: ./.github/actions/setup-python-tools-scripts
+        with:
+          cache-prefix: ${{ inputs.cache-seed }}|build-salt-onedir|macos
+
+      - name: Setup Salt Version
+        id: setup-salt-version
+        uses: ./.github/actions/setup-salt-version
+        with:
+          salt-version: "${{ inputs.salt-version }}"

       - name: Setup Relenv
         id: setup-relenv
         uses: ./.github/actions/setup-relenv

@@ -125,17 +146,6 @@ jobs:
           cache-seed: ${{ inputs.cache-seed }}
           python-version: ${{ inputs.python-version }}
-
-      - name: Setup Python Tools Scripts
-        uses: ./.github/actions/setup-python-tools-scripts
-        with:
-          cache-prefix: ${{ inputs.cache-seed }}-build-salt-onedir-macos
-
-      - name: Setup Salt Version
-        id: setup-salt-version
-        uses: ./.github/actions/setup-salt-version
-        with:
-          salt-version: "${{ inputs.salt-version }}"

       - name: Install Salt into Relenv Onedir
         uses: ./.github/actions/build-onedir-salt
         with:

@@ -147,15 +157,16 @@ jobs:

   build-salt-windows:
     name: Windows
-    if: ${{ inputs.github-hosted-runners }}
+    if: ${{ toJSON(fromJSON(inputs.matrix)['windows']) != '[]' }}
     strategy:
       fail-fast: false
       max-parallel: 2
       matrix:
-        arch:
+        include: ${{ fromJSON(inputs.matrix)['windows'] }}
-          - x86
-          - amd64
     runs-on: windows-latest
+    env:
+      PIP_INDEX_URL: https://pypi.org/simple
+      USE_S3_CACHE: 'false'
     steps:

       - name: "Throttle Builds"

@@ -170,6 +181,17 @@ jobs:
         with:
           python-version: "3.10"

+      - name: Setup Python Tools Scripts
+        uses: ./.github/actions/setup-python-tools-scripts
+        with:
+          cache-prefix: ${{ inputs.cache-seed }}|build-salt-onedir|windows
+
+      - name: Setup Salt Version
+        id: setup-salt-version
+        uses: ./.github/actions/setup-salt-version
+        with:
+          salt-version: "${{ inputs.salt-version }}"

       - name: Setup Relenv
         id: setup-relenv
         uses: ./.github/actions/setup-relenv

@@ -180,17 +202,6 @@ jobs:
           cache-seed: ${{ inputs.cache-seed }}
           python-version: ${{ inputs.python-version }}
-
-      - name: Setup Python Tools Scripts
-        uses: ./.github/actions/setup-python-tools-scripts
-        with:
-          cache-prefix: ${{ inputs.cache-seed }}-build-salt-onedir-macos
-
-      - name: Setup Salt Version
-        id: setup-salt-version
-        uses: ./.github/actions/setup-salt-version
-        with:
-          salt-version: "${{ inputs.salt-version }}"

       - name: Install Salt into Relenv Onedir
         uses: ./.github/actions/build-onedir-salt
         with:
.github/workflows/ci.yml (vendored): 1695 changed lines (diff suppressed because it is too large; not shown here).
.github/workflows/draft-release.yml (vendored, new file): 132 lines

@@ -0,0 +1,132 @@
+---
+name: Draft Github Release
+
+on:
+  workflow_call:
+    inputs:
+      salt-version:
+        type: string
+        required: true
+        description: The Salt version to set prior to building packages.
+      matrix:
+        required: true
+        type: string
+        description: Json job matrix config
+      build-matrix:
+        required: true
+        type: string
+        description: Json job matrix config
+
+env:
+  COLUMNS: 190
+  PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
+  PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
+  PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}
+  PIP_DISABLE_PIP_VERSION_CHECK: "1"
+
+jobs:
+
+  list-artifacts:
+    name: List Artifacts
+    runs-on: ubuntu-22.04
+    steps:
+      # Checkout here so we can easily use custom actions
+      - uses: actions/download-artifact@v4
+        with:
+          path: artifacts/
+      - name: List Directory Structure
+        run: ls -R artifacts/
+
+  create-github-release:
+    name: Draft Release v${{ inputs.salt-version }}
+    runs-on: ubuntu-22.04
+    outputs:
+      upload_url: ${{ steps.create_release.outputs.upload_url }}
+    steps:
+      - name: Create Release
+        id: create_release
+        uses: actions/create-release@v1
+        env:
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+        with:
+          release_name: "Release v${{ inputs.salt-version }}"
+          tag_name: v${{ inputs.salt-version }}
+          draft: true
+          prerelease: false
+      - name: Release Output
+        run: echo "upload_url=${{ steps.create_release.outputs.upload_url }}" >> "$GITHUB_OUTPUT"
+
+  upload-source-tarball:
+    needs:
+      - create-github-release
+    uses: ./.github/workflows/release-artifact.yml
+    with:
+      name: salt-${{ inputs.salt-version }}.tar.gz
+      upload_url: ${{ needs.create-github-release.outputs.upload_url }}
+
+  upload-onedir:
+    needs:
+      - create-github-release
+    strategy:
+      matrix:
+        include: ${{ fromJSON(inputs.matrix) }}
+    uses: ./.github/workflows/release-artifact.yml
+    with:
+      name: salt-${{ inputs.salt-version }}-onedir-${{ matrix.platform }}-${{ matrix.arch }}.${{ matrix.platform == 'windows' && 'zip' || 'tar.xz' }}
+      upload_url: ${{ needs.create-github-release.outputs.upload_url }}
+
+  upload-deb-packages:
+    needs:
+      - create-github-release
+    strategy:
+      matrix:
+        include: ${{ fromJSON(inputs.build-matrix)['linux'] }}
+    uses: ./.github/workflows/release-artifact.yml
+    with:
+      name: salt-${{ inputs.salt-version }}-${{ matrix.arch }}-deb
+      upload_url: ${{ needs.create-github-release.outputs.upload_url }}
+      pattern: "*.deb"
+
+  upload-rpm-packages:
+    needs:
+      - create-github-release
+    strategy:
+      matrix:
+        include: ${{ fromJSON(inputs.build-matrix)['linux'] }}
+    uses: ./.github/workflows/release-artifact.yml
+    with:
+      name: salt-${{ inputs.salt-version }}-${{ matrix.arch }}-rpm
+      upload_url: ${{ needs.create-github-release.outputs.upload_url }}
+
+  upload-mac-packages:
+    needs:
+      - create-github-release
+    strategy:
+      matrix:
+        include: ${{ fromJSON(inputs.build-matrix)['macos'] }}
+    uses: ./.github/workflows/release-artifact.yml
+    with:
+      name: salt-${{ inputs.salt-version }}-${{ matrix.arch }}-macos
+      upload_url: ${{ needs.create-github-release.outputs.upload_url }}
+
+  upload-windows-msi-packages:
+    needs:
+      - create-github-release
+    strategy:
+      matrix:
+        include: ${{ fromJSON(inputs.build-matrix)['windows'] }}
+    uses: ./.github/workflows/release-artifact.yml
+    with:
+      name: salt-${{ inputs.salt-version }}-${{ matrix.arch }}-MSI
+      upload_url: ${{ needs.create-github-release.outputs.upload_url }}
+
+  upload-windows-nsis-packages:
+    needs:
+      - create-github-release
+    strategy:
+      matrix:
+        include: ${{ fromJSON(inputs.build-matrix)['windows'] }}
+    uses: ./.github/workflows/release-artifact.yml
+    with:
+      name: salt-${{ inputs.salt-version }}-${{ matrix.arch }}-NSIS
+      upload_url: ${{ needs.create-github-release.outputs.upload_url }}
.github/workflows/lint-action.yml (vendored): 20 changed lines

@@ -11,25 +11,20 @@ on:

 env:
-  PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
+  PIP_INDEX_URL: https://pypi.org/simple
-  PIP_EXTRA_INDEX_URL: https://pypi.org/simple
   PIP_DISABLE_PIP_VERSION_CHECK: "1"

 jobs:
   Salt:
     name: Lint Salt's Source Code
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04
     if: ${{ contains(fromJSON('["push", "schedule", "workflow_dispatch"]'), github.event_name) || fromJSON(inputs.changed-files)['salt'] || fromJSON(inputs.changed-files)['lint'] }}

     container:
-      image: ghcr.io/saltstack/salt-ci-containers/python:3.8
+      image: ghcr.io/saltstack/salt-ci-containers/python:3.10

     steps:
-      - name: Install System Deps
-        run: |
-          apt-get update
-          apt-get install -y enchant-2 git gcc make zlib1g-dev libc-dev libffi-dev g++ libxml2 libxml2-dev libxslt-dev libcurl4-openssl-dev libssl-dev libgnutls28-dev
-
       - name: Add Git Safe Directory
         run: |

@@ -63,18 +58,13 @@ jobs:

   Tests:
     name: Lint Salt's Test Suite
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04
     if: ${{ contains(fromJSON('["push", "schedule", "workflow_dispatch"]'), github.event_name) || fromJSON(inputs.changed-files)['tests'] || fromJSON(inputs.changed-files)['lint'] }}

     container:
-      image: ghcr.io/saltstack/salt-ci-containers/python:3.8
+      image: ghcr.io/saltstack/salt-ci-containers/python:3.10

     steps:
-      - name: Install System Deps
-        run: |
-          echo "deb http://deb.debian.org/debian bookworm-backports main" >> /etc/apt/sources.list
-          apt-get update
-          apt-get install -y enchant-2 git gcc make zlib1g-dev libc-dev libffi-dev g++ libxml2 libxml2-dev libxslt-dev libcurl4-openssl-dev libssl-dev libgnutls28-dev
-
       - name: Add Git Safe Directory
         run: |
.github/workflows/nightly.yml (vendored): 2704 changed lines (diff suppressed because it is too large; not shown here).
.github/workflows/nsis-tests.yml (vendored, new file): 67 lines

@@ -0,0 +1,67 @@
+---
+name: Test NSIS Installer
+
+on:
+  workflow_call:
+    inputs:
+      changed-files:
+        required: true
+        type: string
+        description: JSON string containing information about changed files
+
+jobs:
+  Test-NSIS-Logic:
+    name: Logic Tests
+    runs-on:
+      - windows-latest
+    if: ${{ contains(fromJSON('["push", "schedule", "workflow_dispatch"]'), github.event_name) || fromJSON(inputs.changed-files)['nsis_tests'] }}
+
+    steps:
+
+      - name: Checkout Salt
+        uses: actions/checkout@v4
+
+      - name: Set Up Python 3.10
+        uses: actions/setup-python@v5
+        with:
+          python-version: "3.10"
+
+      - name: Install NSIS
+        run: .\pkg\windows\install_nsis.cmd -CICD
+        shell: cmd
+
+      - name: Build Test Installer
+        run: .\pkg\windows\nsis\tests\setup.cmd -CICD
+        shell: cmd
+
+      - name: Run Config Tests
+        run: .\pkg\windows\nsis\tests\test.cmd -CICD .\config_tests
+        shell: cmd
+
+  Test-NSIS-Stress:
+    name: Stress Tests
+    runs-on:
+      - windows-latest
+    if: ${{ contains(fromJSON('["push", "schedule", "workflow_dispatch"]'), github.event_name) || fromJSON(inputs.changed-files)['nsis_tests'] }}
+
+    steps:
+
+      - name: Checkout Salt
+        uses: actions/checkout@v4
+
+      - name: Set Up Python 3.10
+        uses: actions/setup-python@v5
+        with:
+          python-version: "3.10"
+
+      - name: Install NSIS
+        run: .\pkg\windows\install_nsis.cmd -CICD
+        shell: cmd
+
+      - name: Build Test Installer
+        run: .\pkg\windows\nsis\tests\setup.cmd -CICD
+        shell: cmd
+
+      - name: Run Stress Test
+        run: .\pkg\windows\nsis\tests\test.cmd -CICD .\stress_tests
+        shell: cmd
.github/workflows/pre-commit-action.yml (vendored): 17 changed lines

@@ -21,21 +21,16 @@ jobs:
   Pre-Commit:
     name: Run Pre-Commit Against Salt

-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04

     container:
-      image: ghcr.io/saltstack/salt-ci-containers/python:3.10
+      image: ghcr.io/saltstack/salt-ci-containers/testing:ubuntu-22.04

     env:
       PRE_COMMIT_COLOR: always

     steps:

-      - name: Install System Deps
-        run: |
-          apt-get update
-          apt-get install -y wget curl enchant-2 git gcc make zlib1g-dev libc-dev libffi-dev g++ libxml2 libxml2-dev libxslt-dev libcurl4-openssl-dev libssl-dev libgnutls28-dev rustc
-
       - name: Add Git Safe Directory
         run: |
           git config --global --add safe.directory "$(pwd)"

@@ -53,14 +48,14 @@ jobs:
           cache-seed: ${{ inputs.cache-seed }}

       - name: Check ALL Files On Branch
-        if: github.event_name != 'pull_request'
+        if: ${{ !cancelled() && github.event_name != 'pull_request' }}
         env:
-          SKIP: lint-salt,lint-tests,remove-import-headers,rstcheck
+          SKIP: lint-salt,lint-tests,remove-import-headers,pyupgrade
         run: |
           pre-commit run --show-diff-on-failure --color=always --all-files

       - name: Check Changed Files On PR
-        if: github.event_name == 'pull_request' && fromJSON(inputs.changed-files)['repo']
+        if: ${{ !cancelled() && github.event_name == 'pull_request' && fromJSON(inputs.changed-files)['repo'] }}
         env:
           SKIP: lint-salt,lint-tests
           GH_ACTIONS_ANNOTATE: "1"

@@ -68,6 +63,6 @@ jobs:
           pre-commit run --show-diff-on-failure --color=always --files ${{ join(fromJSON(inputs.changed-files)['repo_files'], ' ') }}

       - name: Check Docs On Deleted Files
-        if: github.event_name == 'pull_request' && fromJSON(inputs.changed-files)['deleted']
+        if: ${{ !cancelled() && github.event_name == 'pull_request' && fromJSON(inputs.changed-files)['deleted'] }}
         run: |
           pre-commit run --show-diff-on-failure --color=always check-docs --files ${{ join(fromJSON(inputs.changed-files)['deleted_files'], ' ') }}
.github/workflows/release-artifact.yml (vendored, new file): 69 lines

@@ -0,0 +1,69 @@
+---
+name: Upload Release Artifact
+
+on:
+  workflow_call:
+    inputs:
+      name:
+        type: string
+        required: true
+        description: The Salt version to set prior to building packages.
+      upload_url:
+        type: string
+        required: true
+        description: Release's upload url.
+      pattern:
+        type: string
+        required: false
+        description: Pattern of files to upload
+
+
+jobs:
+
+  list-files:
+    name: List ${{ inputs.name }}
+    runs-on: ubuntu-22.04
+    outputs:
+      files: ${{ steps.list-files.outputs.files }}
+    steps:
+      - uses: actions/download-artifact@v4
+        with:
+          name: ${{ inputs.name }}
+          path: artifacts
+      - run: find artifacts -maxdepth 1 -type f -printf '%f\n'
+      - id: list-files
+        run: |
+          if [ "${{ inputs.pattern }}" != "" ]; then
+            echo files="$(find artifacts -maxdepth 1 -type f -name '${{ inputs.pattern }}' -printf '%f\n' | jq -Rnc '[inputs | { file: "\(.)" }]')" >> "$GITHUB_OUTPUT"
+          else
+            echo files="$(find artifacts -maxdepth 1 -type f -printf '%f\n' | jq -Rnc '[inputs | { file: "\(.)" }]')" >> "$GITHUB_OUTPUT"
+          fi
+
+  upload-files:
+    name: Upload ${{ matrix.file }} from ${{ inputs.name }}
+    runs-on: ubuntu-22.04
+    needs:
+      - list-files
+    strategy:
+      matrix:
+        include: ${{ fromJSON(needs.list-files.outputs.files) }}
+    steps:
+      - uses: actions/download-artifact@v4
+        with:
+          name: ${{ inputs.name }}
+          path: artifacts
+
+      - name: Detect type of ${{ matrix.file }}
+        id: file-type
+        run: echo "file_type=$( file --mime-type artifacts/${{ matrix.file }} )" >> "$GITHUB_OUTPUT"
+
+      - name: Upload ${{ matrix.file }}
+        id: upload-release-asset-source
+        uses: actions/upload-release-asset@v1
+        env:
+          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+        with:
+          upload_url: ${{ inputs.upload_url }}  # This pulls from the CREATE RELEASE step above, referencing it's ID to get its outputs object, which include a `upload_url`. See this blog post for more info: https://jasonet.co/posts/new-features-of-github-actions/#passing-data-to-future-steps
+          asset_path: artifacts/${{ matrix.file }}
+          asset_name: ${{ matrix.file }}
+          asset_content_type: ${{ steps.file-type.outputs.file_type }}
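The list-files job above turns a flat directory listing into a JSON array that upload-files then fans out over with matrix.include. A rough sketch of that data flow is below; it reuses the same find/jq pipeline from the file, but the artifact file names in the comments are invented for illustration.

    # Illustrative re-run of the list-files pipeline (file names are made up):
    - id: list-files
      run: |
        # find prints one artifact name per line, e.g.
        #   salt-3008.0-amd64.msi
        #   salt-3008.0-amd64-setup.exe
        # jq -Rnc '[inputs | { file: "\(.)" }]' wraps each line into an object, producing
        #   [{"file":"salt-3008.0-amd64.msi"},{"file":"salt-3008.0-amd64-setup.exe"}]
        # which upload-files expands via matrix.include, one upload job per file.
        echo files="$(find artifacts -maxdepth 1 -type f -printf '%f\n' | jq -Rnc '[inputs | { file: "\(.)" }]')" >> "$GITHUB_OUTPUT"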
.github/workflows/release-tag.yml (vendored): 7 changed lines

@@ -19,8 +19,9 @@ on:

 env:
-  PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
+  PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
-  PIP_EXTRA_INDEX_URL: https://pypi.org/simple
+  PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
+  PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}

 permissions:

@@ -31,7 +32,7 @@ jobs:
     permissions:
       contents: write  # for dev-drprasad/delete-tag-and-release to delete tags or releases
     name: Generate Tag and Github Release
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04
     steps:
       - uses: dev-drprasad/delete-tag-and-release@v0.2.0
         if: github.event.inputs.reTag == 'true'
.github/workflows/release-update-winrepo.yml (vendored): 9 changed lines

@@ -19,7 +19,7 @@ permissions:
 jobs:
   update-winrepo:
     name: Update Winrepo
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04
     steps:

       - name: Checkout Salt

@@ -31,7 +31,7 @@ jobs:
         uses: actions/checkout@v4
         with:
           path: winrepo
-          repository: twangboy/salt-winrepo-ng
+          repository: saltstack/salt-winrepo-ng

       - name: Set Up Python 3.10
         uses: actions/setup-python@v5

@@ -41,9 +41,12 @@ jobs:
       - name: Add Version to Minion Definition File
         working-directory: salt
         run: |
+          pwd
+          ls -al ../winrepo/salt-minion.sls
           python .github/workflows/scripts/update_winrepo.py \
             --file ../winrepo/salt-minion.sls \
             --version ${{ inputs.salt-version || github.ref_name }}
+          grep ${{ inputs.salt-version || github.ref_name }} ../winrepo/salt-minion.sls

       - name: Commit Changes
         working-directory: winrepo

@@ -56,7 +59,7 @@ jobs:

       - name: Create Pull Request
         id: cpr
-        uses: peter-evans/create-pull-request@v4
+        uses: peter-evans/create-pull-request@v7
         with:
           path: winrepo
           push-to-fork: saltbot-open/salt-winrepo-ng
@@ -20,8 +20,9 @@ env:
   COLUMNS: 190
   AWS_MAX_ATTEMPTS: "10"
   AWS_RETRY_MODE: "adaptive"
-  PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
+  PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
-  PIP_EXTRA_INDEX_URL: https://pypi.org/simple
+  PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
+  PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}

 jobs:
   upload-virustotal:

@@ -30,7 +31,6 @@ jobs:
     runs-on:
       - self-hosted
       - linux
-      - repo-release
     steps:

       - name: Checkout Salt
.github/workflows/release.yml (vendored): 89 changed lines

@@ -21,7 +21,7 @@ on:

 env:
   COLUMNS: 190
-  CACHE_SEED: SEED-7  # Bump the number to invalidate all caches
+  CACHE_SEED: SEED-1  # Bump the number to invalidate all caches
   RELENV_DATA: "${{ github.workspace }}/.relenv"
   PIP_DISABLE_PIP_VERSION_CHECK: "1"
   RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"

@@ -37,7 +37,7 @@ jobs:

   check-requirements:
     name: Check Requirements
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04
     environment: release-check
     steps:
       - name: Check For Admin Permission

@@ -49,9 +49,9 @@ jobs:
   prepare-workflow:
     name: Prepare Workflow Run
     runs-on:
-      - self-hosted
+      - linux-x86_64
-      - linux
+    env:
-      - repo-release
+      USE_S3_CACHE: 'false'
     environment: release
     needs:
       - check-requirements

@@ -61,6 +61,7 @@ jobs:
       latest-release: ${{ steps.get-salt-releases.outputs.latest-release }}
       releases: ${{ steps.get-salt-releases.outputs.releases }}
       nox-archive-hash: ${{ steps.nox-archive-hash.outputs.nox-archive-hash }}
+      config: ${{ steps.workflow-config.outputs.config }}
     steps:
       - uses: actions/checkout@v4
        with:

@@ -119,12 +120,17 @@ jobs:
         run: |
           echo "nox-archive-hash=${{ hashFiles('requirements/**/*.txt', 'cicd/golden-images.json', 'noxfile.py', 'pkg/common/env-cleanup-rules.yml', '.github/workflows/build-deps-ci-action.yml') }}" | tee -a "$GITHUB_OUTPUT"

+      - name: Define workflow config
+        id: workflow-config
+        run: |
+          tools ci workflow-config${{ inputs.skip-salt-pkg-download-test-suite && ' --skip-pkg-download-tests' || '' }} ${{ steps.setup-salt-version.outputs.salt-version }} ${{ github.event_name }} changed-files.json

   download-onedir-artifact:
     name: Download Staging Onedir Artifact
     runs-on:
-      - self-hosted
+      - linux-x86_64
-      - linux
+    env:
-      - repo-release
+      USE_S3_CACHE: 'true'
     environment: release
     needs:
       - prepare-workflow

@@ -180,17 +186,19 @@ jobs:
       nox-version: 2022.8.7
       python-version: "3.10"
       salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
-      cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|3.10.13
+      cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|3.10.16
       nox-archive-hash: "${{ needs.prepare-workflow.outputs.nox-archive-hash }}"
+      matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['build-matrix']) }}
+      linux_arm_runner: ${{ fromJSON(needs.prepare-workflow.outputs.config)['linux_arm_runner'] }}

   backup:
     name: Backup
     runs-on:
-      - self-hosted
+      - linux-x86_64
-      - linux
-      - repo-release
     needs:
       - prepare-workflow
+    env:
+      USE_S3_CACHE: 'true'
     environment: release
     outputs:
       backup-complete: ${{ steps.backup.outputs.backup-complete }}

@@ -217,15 +225,14 @@ jobs:
   publish-repositories:
     name: Publish Repositories
     runs-on:
-      - self-hosted
+      - linux-x86_64
-      - linux
+    env:
-      - repo-release
+      USE_S3_CACHE: 'true'
     needs:
       - prepare-workflow
       - backup
       - download-onedir-artifact
     environment: release

     steps:
       - name: Clone The Salt Repository
         uses: actions/checkout@v4

@@ -248,38 +255,17 @@ jobs:
         run: |
           tools pkg repo publish release ${{ needs.prepare-workflow.outputs.salt-version }}

-  pkg-download-tests:
-    name: Package Downloads
-    if: ${{ inputs.skip-salt-pkg-download-test-suite == false }}
-    needs:
-      - prepare-workflow
-      - publish-repositories
-      - build-ci-deps
-      - download-onedir-artifact
-    uses: ./.github/workflows/test-package-downloads-action.yml
-    with:
-      nox-session: ci-test-onedir
-      cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|3.10.13
-      salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
-      environment: release
-      nox-version: 2022.8.7
-      python-version: "3.10"
-      skip-code-coverage: true
-      latest-release: "${{ needs.prepare-workflow.outputs.latest-release }}"
-    secrets: inherit
-
   release:
     name: Release v${{ needs.prepare-workflow.outputs.salt-version }}
     if: ${{ always() && ! failure() && ! cancelled() }}
     runs-on:
-      - self-hosted
+      - linux-x86_64
-      - linux
+    env:
-      - repo-release
+      USE_S3_CACHE: 'true'
     needs:
       - prepare-workflow
       - backup
       - publish-repositories
-      - pkg-download-tests
     environment: release
     steps:
       - name: Clone The Salt Repository

@@ -354,7 +340,7 @@ jobs:
           branch: ${{ github.ref }}

       - name: Create Github Release
-        uses: ncipollo/release-action@v1.12.0
+        uses: ncipollo/release-action@v1
        with:
           artifactErrorsFailBuild: true
           artifacts: ${{ steps.prepare-release.outputs.release-artifacts }}

@@ -386,9 +372,9 @@ jobs:
       - release
     environment: release
     runs-on:
-      - self-hosted
+      - linux-x86_64
-      - linux
+    env:
-      - repo-release
+      USE_S3_CACHE: 'true'
     steps:
       - uses: actions/checkout@v4

@@ -432,25 +418,23 @@ jobs:
           TWINE_PASSWORD: "${{ steps.get-secrets.outputs.twine-password }}"
         run: |
           tools pkg pypi-upload artifacts/release/salt-${{ needs.prepare-workflow.outputs.salt-version }}.tar.gz

   set-pipeline-exit-status:
     # This step is just so we can make github require this step, to pass checks
     # on a pull request instead of requiring all
     name: Set the ${{ github.workflow }} Pipeline Exit Status
-    if: always()
+    if: ${{ !cancelled() && always() }}
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04
     needs:
       - check-requirements
       - prepare-workflow
       - publish-repositories
-      - pkg-download-tests
       - release
       - publish-pypi
       - build-ci-deps
     steps:
       - name: Get workflow information
         id: get-workflow-info
-        uses: technote-space/workflow-conclusion-action@v3
+        uses: im-open/workflow-conclusion@v2

       - run: |
          # shellcheck disable=SC2129

@@ -464,13 +448,8 @@ jobs:
       - name: Set Pipeline Exit Status
         shell: bash
         run: |
-          if [ "${{ steps.get-workflow-info.outputs.conclusion }}" != "success" ]; then
+          if [ "${{ steps.get-workflow-info.outputs.workflow_conclusion }}" != "success" ]; then
             exit 1
           else
             exit 0
           fi
-
-      - name: Done
-        if: always()
-        run:
-          echo "All worflows finished"
.github/workflows/run-nightly.yml (vendored, new file, 70 lines)

@@ -0,0 +1,70 @@
+ name: Run Nightly Builds
+
+ on:
+ workflow_dispatch: {}
+ schedule:
+ # https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#onschedule
+ - cron: '0 0 * * *' # Every day at 0AM
+
+ permissions:
+ contents: read # for dorny/paths-filter to fetch a list of changed files
+ pull-requests: read # for dorny/paths-filter to read pull requests
+ actions: write # to trigger branch nightly builds
+
+ jobs:
+
+ workflow-requirements:
+ name: Check Workflow Requirements
+ runs-on: ubuntu-22.04
+ outputs:
+ requirements-met: ${{ steps.check-requirements.outputs.requirements-met }}
+ steps:
+ - name: Check Requirements
+ id: check-requirements
+ run: |
+ if [ "${{ vars.RUN_SCHEDULED_BUILDS }}" = "1" ]; then
+ MSG="Running workflow because RUN_SCHEDULED_BUILDS=1"
+ echo "${MSG}"
+ echo "${MSG}" >> "${GITHUB_STEP_SUMMARY}"
+ echo "requirements-met=true" >> "${GITHUB_OUTPUT}"
+ elif [ "${{ github.event.repository.fork }}" = "true" ]; then
+ MSG="Not running workflow because ${{ github.repository }} is a fork"
+ echo "${MSG}"
+ echo "${MSG}" >> "${GITHUB_STEP_SUMMARY}"
+ echo "requirements-met=false" >> "${GITHUB_OUTPUT}"
+ elif [ "${{ github.event.repository.private }}" = "true" ]; then
+ MSG="Not running workflow because ${{ github.repository }} is a private repository"
+ echo "${MSG}"
+ echo "${MSG}" >> "${GITHUB_STEP_SUMMARY}"
+ echo "requirements-met=false" >> "${GITHUB_OUTPUT}"
+ else
+ MSG="Running workflow because ${{ github.repository }} is not a fork"
+ echo "${MSG}"
+ echo "${MSG}" >> "${GITHUB_STEP_SUMMARY}"
+ echo "requirements-met=true" >> "${GITHUB_OUTPUT}"
+ fi
+
+ trigger-branch-nightly-builds:
+ name: Trigger Branch Workflows
+ if: ${{ fromJSON(needs.workflow-requirements.outputs.requirements-met) }}
+ runs-on: ubuntu-24.04
+ needs:
+ - workflow-requirements
+ environment: workflow-restart
+ strategy:
+ matrix:
+ branch: [3006.x, 3007.x, master]
+ steps:
+
+ - name: Generate a token
+ id: generate-token
+ uses: actions/create-github-app-token@v1
+ with:
+ app-id: ${{ vars.APP_ID }}
+ private-key: ${{ secrets.APP_PRIVATE_KEY }}
+
+ - name: Trigger ${{ matrix.branch }} branch
+ env:
+ GH_TOKEN: ${{ steps.generate-token.outputs.token }}
+ run: |
+ gh workflow run nightly.yml --repo ${{ github.repository }} --ref ${{ matrix.branch }}
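For a one-off test of this new workflow, the dispatch performed by the matrix step above can also be issued by hand with the GitHub CLI. This is only a sketch, and it assumes `gh` is authenticated with a token that has `actions: write` on the target repository:

    # manual equivalent of one matrix leg of "Trigger Branch Workflows"
    gh workflow run nightly.yml --repo saltstack/salt --ref 3006.x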
.github/workflows/scheduled.yml (vendored, 1796 lines changed — file diff suppressed because it is too large)
.github/workflows/scripts/update_winrepo.py (vendored, 17 lines changed)

@@ -1,8 +1,10 @@
 import argparse
 import os

+ print("Update winrepo script")
+
 # Where are we
- print(os.getcwd())
+ print(f"Current working directory: {os.getcwd()}")

 arg_parser = argparse.ArgumentParser()
 arg_parser.add_argument("-f", "--file", help="the winrepo file to edit")

@@ -12,10 +14,15 @@ args = arg_parser.parse_args()
 file = args.file
 version = args.version

+ print("Args:")
+ print(f"- file: {file}")
+ print(f"- version: {version}")
+
 if version.startswith("v"):
 version = version[1:]

 with open(file) as f:
+ print(f"Opening file: {file}")
 current_contents = f.readlines()

 new_contents = []

@@ -23,9 +30,13 @@ new_contents = []
 added = False
 for line in current_contents:
 new_contents.append(line)
- if "for version in [" in line and not added:
- new_contents.append(f" '{version}',\n")
+ if "load_yaml as versions_relenv" in line and not added:
+ print(f"Adding version: {version}")
+ new_contents.append(f"- {version}\n")
 added = True

 with open(file, "w") as f:
+ print(f"Writing file: {file}")
 f.writelines(new_contents)
+
+ print("Update winrepo script complete")
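The net effect of the script change is that the new version is no longer inserted into a Python-style `for version in [...]` list but directly under the line that opens the `load_yaml as versions_relenv` block. A minimal before/after sketch with hypothetical file contents (the real winrepo template, and the flag used to pass the version, are not shown in this hunk):

    # before (hypothetical excerpt)
    {% load_yaml as versions_relenv %}
    - 3007.1
    {% endload %}

    # after running update_winrepo.py against that file with version v3007.2
    {% load_yaml as versions_relenv %}
    - 3007.2
    - 3007.1
    {% endload %}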
.github/workflows/ssh-debug.yml (vendored, new file, 56 lines)

@@ -0,0 +1,56 @@
+ name: SSH Debug
+ run-name: "SSH Debug ${{ inputs.runner }}"
+ on:
+ workflow_dispatch:
+ inputs:
+ runner:
+ type: string
+ required: True
+ description: The runner to start a tunnel on.
+ offer:
+ type: string
+ required: True
+ description: SDP Offer
+ public_key:
+ type: string
+ required: True
+ description: Your public key for ssh access.
+ debug:
+ required: false
+ type: boolean
+ default: false
+ description: Run sshd with debug enabled.
+
+
+
+ jobs:
+ debug:
+ runs-on: ${{ inputs.runner }}
+ if: ${{ inputs.runner }}
+ environment: ci
+ steps:
+
+ - name: Checkout Source Code
+ uses: actions/checkout@v4
+
+ - name: Set up Python 3.10
+ uses: actions/setup-python@v5
+ with:
+ python-version: "3.10"
+
+ - name: Setup Python Tools Scripts
+ uses: ./.github/actions/setup-python-tools-scripts
+ with:
+ cache-prefix: ssh-debug
+
+ - name: Install Nox
+ run: |
+ python3 -m pip install 'nox==2022.8.7'
+ env:
+ PIP_INDEX_URL: https://pypi.org/simple
+
+ - uses: ./.github/actions/ssh-tunnel
+ with:
+ public_key: ${{ inputs.public_key }}
+ offer: ${{ inputs.offer }}
+ debug: ${{ inputs.debug }}
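Since the new SSH Debug workflow only exposes a `workflow_dispatch` trigger, it has to be started with explicit inputs, for example through the GitHub CLI. The following is an illustrative sketch only; the runner label, SDP offer file and key path are placeholders:

    gh workflow run ssh-debug.yml \
      --repo saltstack/salt \
      -f runner=ubuntu-22.04 \
      -f offer="$(cat offer.sdp)" \
      -f public_key="$(cat ~/.ssh/id_ed25519.pub)" \
      -f debug=false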
.github/workflows/staging.yml (vendored, 2549 lines changed — file diff suppressed because it is too large)
@@ -1,9 +1,10 @@

 build-ci-deps:
 <%- do test_salt_needs.append("build-ci-deps") %>
+ <%- do test_salt_linux_needs.append("build-ci-deps") %>
 name: CI Deps
 <%- if workflow_slug != 'release' %>
- if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['build-deps-ci'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
+ if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['build-deps-ci'] }}
 <%- endif %>
 needs:
 - prepare-workflow

@@ -20,3 +21,5 @@
 salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
 cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|<{ python_version }>
 nox-archive-hash: "${{ needs.prepare-workflow.outputs.nox-archive-hash }}"
+ matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['build-matrix']) }}
+ linux_arm_runner: ${{ fromJSON(needs.prepare-workflow.outputs.config)['linux_arm_runner'] }}
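As the template above shows, job gating and matrix data now come from the single `config` output of prepare-workflow instead of the old `jobs`/`runners` outputs. Purely as an illustration of the shape those expressions expect, a config payload would look roughly like this (the keys are taken from the expressions in this diff, the values are made up):

    {
      "jobs": {"build-deps-ci": true, "build-pkgs": true, "lint": true},
      "build-matrix": {"linux": [{"arch": "x86_64"}, {"arch": "arm64"}]},
      "linux_arm_runner": "ubuntu-24.04-arm",
      "skip_code_coverage": true
    }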
@@ -23,12 +23,6 @@
 with:
 cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}
-
- - name: Get Salt Project GitHub Actions Bot Environment
- run: |
- TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
- SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
- echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"

 - name: Download DEB Packages
 uses: actions/download-artifact@v4
 with:

@@ -78,7 +72,7 @@
 - name: Upload Repository As An Artifact
 uses: ./.github/actions/upload-artifact
 with:
- name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-repo
+ name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-repo-${{ matrix.pkg-type }}-${{ matrix.distro }}-${{ matrix.version }}-${{ matrix.arch }}
 path: artifacts/pkgs/repo/*
 retention-days: 7
 if-no-files-found: error
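The repository artifact uploaded by this template now carries per-matrix suffixes. That lines up with actions/upload-artifact@v4, where artifact names must be unique within a workflow run, so every matrix leg needs its own name; a rendered name would look roughly like the following (all values hypothetical):

    salt-3007.2-nightly-repo-deb-debian-12-x86_64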
@@ -1,360 +0,0 @@
- ---
- name: Install Test Dependencies
-
- on:
- workflow_call:
- inputs:
- nox-session:
- required: true
- type: string
- description: The nox session to run
- salt-version:
- type: string
- required: true
- description: The Salt version to set prior to running tests.
- cache-prefix:
- required: true
- type: string
- description: Seed used to invalidate caches
- nox-version:
- required: true
- type: string
- description: The nox version to install
- nox-archive-hash:
- required: true
- type: string
- description: Nox Tarball Cache Hash
- python-version:
- required: false
- type: string
- description: The python version to run tests with
- default: "3.10"
- package-name:
- required: false
- type: string
- description: The onedir package name to use
- default: salt
-
-
- env:
- COLUMNS: 190
- AWS_MAX_ATTEMPTS: "10"
- AWS_RETRY_MODE: "adaptive"
- PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
- PIP_EXTRA_INDEX_URL: https://pypi.org/simple
- PIP_DISABLE_PIP_VERSION_CHECK: "1"
- RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
-
- jobs:
-
- linux-dependencies:
- name: Linux
- runs-on:
- - self-hosted
- - linux
- - bastion
- timeout-minutes: 90
- strategy:
- fail-fast: false
- matrix:
- include:
- <%- for arch, build_distro_slug in build_ci_deps_listing["linux"] %>
- - distro-slug: <{ build_distro_slug }>
- arch: <{ arch }>
- <%- endfor %>
- steps:
-
- - name: "Throttle Builds"
- shell: bash
- run: |
- t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
-
- - name: Checkout Source Code
- uses: actions/checkout@v4
-
- - name: Cache nox.linux.${{ matrix.arch }}.tar.* for session ${{ inputs.nox-session }}
- id: nox-dependencies-cache
- uses: actions/cache@v3.3.1
- with:
- path: nox.linux.${{ matrix.arch }}.tar.*
- key: ${{ inputs.cache-prefix }}|testrun-deps|${{ matrix.arch }}|linux|${{ inputs.nox-session }}|${{ inputs.python-version }}|${{ inputs.nox-archive-hash }}
-
- - name: Download Onedir Tarball as an Artifact
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- uses: actions/download-artifact@v4
- with:
- name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-linux-${{ matrix.arch }}.tar.xz
- path: artifacts/
-
- - name: Decompress Onedir Tarball
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- shell: bash
- run: |
- python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
- cd artifacts
- tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-linux-${{ matrix.arch }}.tar.xz
-
- - name: PyPi Proxy
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- sed -i '7s;^;--index-url=https://pypi-proxy.saltstack.net/root/local/+simple/ --extra-index-url=https://pypi.org/simple\n;' requirements/static/ci/*/*.txt
-
- - name: Setup Python Tools Scripts
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- uses: ./.github/actions/setup-python-tools-scripts
- with:
- cache-prefix: ${{ inputs.cache-prefix }}-build-deps-ci
-
- - name: Get Salt Project GitHub Actions Bot Environment
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
- SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
- echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
-
- - name: Start VM
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- id: spin-up-vm
- run: |
- tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ matrix.distro-slug }}
-
- - name: List Free Space
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- tools --timestamps vm ssh ${{ matrix.distro-slug }} -- df -h || true
-
- - name: Upload Checkout To VM
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- tools --timestamps vm rsync ${{ matrix.distro-slug }}
-
- - name: Install Dependencies
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- tools --timestamps vm install-dependencies --nox-session=${{ inputs.nox-session }} ${{ matrix.distro-slug }}
-
- - name: Cleanup .nox Directory
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- tools --timestamps vm pre-archive-cleanup ${{ matrix.distro-slug }}
-
- - name: Compress .nox Directory
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- tools --timestamps vm compress-dependencies ${{ matrix.distro-slug }}
-
- - name: Download Compressed .nox Directory
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- tools --timestamps vm download-dependencies ${{ matrix.distro-slug }}
-
- - name: Destroy VM
- if: always() && steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- tools --timestamps vm destroy --no-wait ${{ matrix.distro-slug }}
-
- - name: Upload Nox Requirements Tarball
- uses: actions/upload-artifact@v4
- with:
- name: nox-linux-${{ matrix.arch }}-${{ inputs.nox-session }}
- path: nox.linux.${{ matrix.arch }}.tar.*
-
- macos-dependencies:
- name: MacOS
- runs-on: ${{ matrix.distro-slug }}
- timeout-minutes: 90
- strategy:
- fail-fast: false
- matrix:
- include:
- <%- for arch, build_distro_slug in build_ci_deps_listing["macos"] %>
- - distro-slug: <{ build_distro_slug }>
- arch: <{ arch }>
- <%- endfor %>
- steps:
-
- - name: "Throttle Builds"
- shell: bash
- run: |
- t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
-
- - name: Checkout Source Code
- uses: actions/checkout@v4
-
- - name: Cache nox.macos.${{ matrix.arch }}.tar.* for session ${{ inputs.nox-session }}
- id: nox-dependencies-cache
- uses: actions/cache@v3.3.1
- with:
- path: nox.macos.${{ matrix.arch }}.tar.*
- key: ${{ inputs.cache-prefix }}|testrun-deps|${{ matrix.arch }}|macos|${{ inputs.nox-session }}|${{ inputs.python-version }}|${{ inputs.nox-archive-hash }}
-
- - name: Download Onedir Tarball as an Artifact
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- uses: actions/download-artifact@v4
- with:
- name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-macos-${{ matrix.arch }}.tar.xz
- path: artifacts/
-
- - name: Decompress Onedir Tarball
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- shell: bash
- run: |
- python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
- cd artifacts
- tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-macos-${{ matrix.arch }}.tar.xz
-
- - name: Set up Python ${{ inputs.python-version }}
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- uses: actions/setup-python@v5
- with:
- python-version: "${{ inputs.python-version }}"
-
- - name: Install System Dependencies
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- brew install openssl@3
-
- - name: Install Nox
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- python3 -m pip install 'nox==${{ inputs.nox-version }}'
-
- - name: Install Dependencies
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- env:
- PRINT_TEST_SELECTION: "0"
- PRINT_SYSTEM_INFO: "0"
- run: |
- export PYCURL_SSL_LIBRARY=openssl
- export LDFLAGS="-L/usr/local/opt/openssl@3/lib"
- export CPPFLAGS="-I/usr/local/opt/openssl@3/include"
- export PKG_CONFIG_PATH="/usr/local/opt/openssl@3/lib/pkgconfig"
- nox --install-only -e ${{ inputs.nox-session }}
-
- - name: Cleanup .nox Directory
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- nox --force-color -e "pre-archive-cleanup(pkg=False)"
-
- - name: Compress .nox Directory
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- nox --force-color -e compress-dependencies -- macos ${{ matrix.arch }}
-
- - name: Upload Nox Requirements Tarball
- uses: actions/upload-artifact@v4
- with:
- name: nox-macos-${{ matrix.arch }}-${{ inputs.nox-session }}
- path: nox.macos.${{ matrix.arch }}.tar.*
-
- windows-dependencies:
- name: Windows
- runs-on:
- - self-hosted
- - linux
- - bastion
- timeout-minutes: 90
- strategy:
- fail-fast: false
- matrix:
- include:
- <%- for arch, build_distro_slug in build_ci_deps_listing["windows"] %>
- - distro-slug: <{ build_distro_slug }>
- arch: <{ arch }>
- <%- endfor %>
- steps:
-
- - name: "Throttle Builds"
- shell: bash
- run: |
- t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
-
- - name: Checkout Source Code
- uses: actions/checkout@v4
-
- - name: Cache nox.windows.${{ matrix.arch }}.tar.* for session ${{ inputs.nox-session }}
- id: nox-dependencies-cache
- uses: actions/cache@v3.3.1
- with:
- path: nox.windows.${{ matrix.arch }}.tar.*
- key: ${{ inputs.cache-prefix }}|testrun-deps|${{ matrix.arch }}|windows|${{ inputs.nox-session }}|${{ inputs.python-version }}|${{ inputs.nox-archive-hash }}
-
- - name: Download Onedir Tarball as an Artifact
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- uses: actions/download-artifact@v4
- with:
- name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-windows-${{ matrix.arch }}.tar.xz
- path: artifacts/
-
- - name: Decompress Onedir Tarball
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- shell: bash
- run: |
- python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
- cd artifacts
- tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-windows-${{ matrix.arch }}.tar.xz
-
- - name: PyPi Proxy
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- sed -i '7s;^;--index-url=https://pypi-proxy.saltstack.net/root/local/+simple/ --extra-index-url=https://pypi.org/simple\n;' requirements/static/ci/*/*.txt
-
- - name: Setup Python Tools Scripts
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- uses: ./.github/actions/setup-python-tools-scripts
- with:
- cache-prefix: ${{ inputs.cache-prefix }}-build-deps-ci
-
- - name: Get Salt Project GitHub Actions Bot Environment
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
- SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
- echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
-
- - name: Start VM
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- id: spin-up-vm
- run: |
- tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ matrix.distro-slug }}
-
- - name: List Free Space
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- tools --timestamps vm ssh ${{ matrix.distro-slug }} -- df -h || true
-
- - name: Upload Checkout To VM
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- tools --timestamps vm rsync ${{ matrix.distro-slug }}
-
- - name: Install Dependencies
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- tools --timestamps vm install-dependencies --nox-session=${{ inputs.nox-session }} ${{ matrix.distro-slug }}
-
- - name: Cleanup .nox Directory
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- tools --timestamps vm pre-archive-cleanup ${{ matrix.distro-slug }}
-
- - name: Compress .nox Directory
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- tools --timestamps vm compress-dependencies ${{ matrix.distro-slug }}
-
- - name: Download Compressed .nox Directory
- if: steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- tools --timestamps vm download-dependencies ${{ matrix.distro-slug }}
-
- - name: Destroy VM
- if: always() && steps.nox-dependencies-cache.outputs.cache-hit != 'true'
- run: |
- tools --timestamps vm destroy --no-wait ${{ matrix.distro-slug }}
-
- - name: Upload Nox Requirements Tarball
- uses: actions/upload-artifact@v4
- with:
- name: nox-windows-${{ matrix.arch }}-${{ inputs.nox-session }}
- path: nox.windows.${{ matrix.arch }}.tar.*
@@ -13,12 +13,6 @@
 with:
 cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}
-
- - name: Get Salt Project GitHub Actions Bot Environment
- run: |
- TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
- SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
- echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"

 - name: Download macOS x86_64 Packages
 uses: actions/download-artifact@v4
 with:

@@ -26,6 +20,7 @@
 path: artifacts/pkgs/incoming

 - name: Download macOS Arch64 Packages
+ if: ${{ ! github.event.repository.fork }}
 uses: actions/download-artifact@v4
 with:
 name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-arm64-macos

@@ -73,7 +68,7 @@
 - name: Upload Repository As An Artifact
 uses: ./.github/actions/upload-artifact
 with:
- name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-repo
+ name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-repo-macos
 path: artifacts/pkgs/repo/*
 retention-days: 7
 if-no-files-found: error
@@ -13,12 +13,6 @@
 with:
 cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}
-
- - name: Get Salt Project GitHub Actions Bot Environment
- run: |
- TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
- SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
- echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"

 - name: Download Linux x86_64 Onedir Archive
 uses: actions/download-artifact@v4
 with:

@@ -38,6 +32,7 @@
 path: artifacts/pkgs/incoming

 - name: Download macOS arm64 Onedir Archive
+ if: ${{ ! github.event.repository.fork }}
 uses: actions/download-artifact@v4
 with:
 name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-onedir-macos-arm64.tar.xz

@@ -109,7 +104,7 @@
 - name: Upload Repository As An Artifact
 uses: ./.github/actions/upload-artifact
 with:
- name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-repo
+ name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-repo-onedir
 path: artifacts/pkgs/repo/*
 retention-days: 7
 if-no-files-found: error
@@ -1,4 +1,9 @@
- <%- for backend in ("onedir", "src") %>
+ <%- if gh_environment != "ci" -%>
+ <%- set pkg_types = ("onedir", "src") %>
+ <%- else -%>
+ <%- set pkg_types = ("onedir",) %>
+ <%- endif -%>
+ <%- for backend in pkg_types %>
 <%- set job_name = "build-pkgs-{}".format(backend) %>
 <%- if backend == "src" %>
 <%- do conclusion_needs.append(job_name) %>

@@ -6,7 +11,7 @@

 <{ job_name }>:
 name: Build Packages
- if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['build-pkgs'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
+ if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['build-pkgs'] }}
 needs:
 - prepare-workflow
 - build-salt-onedir

@@ -17,11 +22,14 @@
 relenv-version: "<{ relenv_version }>"
 python-version: "<{ python_version }>"
 source: "<{ backend }>"
- <%- if gh_environment %>
+ matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['build-matrix']) }}
+ linux_arm_runner: ${{ fromJSON(needs.prepare-workflow.outputs.config)['linux_arm_runner'] }}
+ <%- if gh_environment != "ci" %>
 environment: <{ gh_environment }>
- sign-macos-packages: true
+ sign-macos-packages: false
 sign-windows-packages: <% if gh_environment == 'nightly' -%> false <%- else -%> ${{ inputs.sign-windows-packages }} <%- endif %>
 secrets: inherit

 <%- endif %>

 <%- endfor %>
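Rendered, the new `pkg_types` guard means the source package build job is only emitted outside the plain CI workflow; a rough sketch of the generated job names per environment (derived from the `build-pkgs-{}` format string above, with the non-ci environments shown purely as examples):

    # gh_environment == "ci"      -> build-pkgs-onedir
    # gh_environment == "nightly" -> build-pkgs-onedir, build-pkgs-src
    # gh_environment == "staging" -> build-pkgs-onedir, build-pkgs-src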
@@ -1,32 +0,0 @@
- <%- for type, display_name in (
- ("src", "Source"),
- ("deb", "DEB"),
- ("rpm", "RPM"),
- ("windows", "Windows"),
- ("macos", "macOS"),
- ("onedir", "Onedir"),
- ) %>
-
- <%- set job_name = "build-{}-repo".format(type) %>
- <%- do build_repo_needs.append(job_name) %>
-
- <{ job_name }>:
- name: Build Repository
- environment: <{ gh_environment }>
- runs-on:
- - self-hosted
- - linux
- - repo-<{ gh_environment }>
- needs:
- - prepare-workflow
- <%- if type not in ("src", "onedir") %>
- - build-pkgs-onedir
- <%- elif type == 'onedir' %>
- - build-salt-onedir
- <%- elif type == 'src' %>
- - build-source-tarball
- <%- endif %>
-
- <%- include "build-{}-repo.yml.jinja".format(type) %>
-
- <%- endfor %>
@@ -23,12 +23,6 @@
 with:
 cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}
-
- - name: Get Salt Project GitHub Actions Bot Environment
- run: |
- TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
- SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
- echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"

 - name: Download RPM Packages
 uses: actions/download-artifact@v4
 with:

@@ -85,7 +79,7 @@
 - name: Upload Repository As An Artifact
 uses: ./.github/actions/upload-artifact
 with:
- name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-repo
+ name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-repo-${{ matrix.pkg-type }}-${{ matrix.distro }}-${{ matrix.version }}-${{ matrix.arch }}
 path: artifacts/pkgs/repo/*
 retention-days: 7
 if-no-files-found: error
@@ -13,12 +13,6 @@
 with:
 cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}
-
- - name: Get Salt Project GitHub Actions Bot Environment
- run: |
- TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
- SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
- echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"

 - name: Download Source Tarball
 uses: actions/download-artifact@v4
 with:

@@ -83,7 +77,7 @@
 - name: Upload Repository As An Artifact
 uses: ./.github/actions/upload-artifact
 with:
- name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-repo
+ name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-repo-src
 path: artifacts/pkgs/repo/*
 retention-days: 7
 if-no-files-found: error
@@ -85,7 +85,7 @@
 - name: Upload Repository As An Artifact
 uses: ./.github/actions/upload-artifact
 with:
- name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-repo
+ name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-repo-windows
 path: artifacts/pkgs/repo/*
 retention-days: 7
 if-no-files-found: error
.github/workflows/templates/ci.yml.jinja (vendored, 117 lines changed)

@@ -1,3 +1,5 @@
+ <%- set gh_environment = gh_environment|default("ci") %>
+
 <%- extends 'layout.yml.jinja' %>
 <%- set pre_commit_version = "3.0.4" %>

@@ -10,7 +12,6 @@
 <{ job_name }>:
 <%- do conclusion_needs.append(job_name) %>
 name: Pre-Commit
- if: ${{ fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
 uses: ./.github/workflows/pre-commit-action.yml
 needs:
 - prepare-workflow

@@ -28,7 +29,7 @@
 lint:
 <%- do conclusion_needs.append('lint') %>
 name: Lint
- if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['<{ job_name }>'] && fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
+ if: ${{ !cancelled() && fromJSON(needs.prepare-workflow.outputs.config)['jobs']['<{ job_name }>'] }}
 uses: ./.github/workflows/lint-action.yml
 needs:
 - prepare-workflow

@@ -37,37 +38,37 @@

 <%- endif %>

+ <%- set job_name = "nsis-tests" %>
+ <%- if includes.get(job_name, True) %>
+ <{ job_name }>:
+ <%- do conclusion_needs.append(job_name) %>
+ name: NSIS Tests
+ uses: ./.github/workflows/nsis-tests.yml
+ needs:
+ - prepare-workflow
+ with:
+ changed-files: ${{ needs.prepare-workflow.outputs.changed-files }}
+
+ <%- endif %>
+
 <%- set job_name = "prepare-release" %>
 <%- if includes.get(job_name, True) %>

 <{ job_name }>:
 name: "Prepare Release: ${{ needs.prepare-workflow.outputs.salt-version }}"
- <%- if prepare_actual_release %>
- if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['<{ job_name }>'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
 runs-on:
- - self-hosted
- - linux
- - medium
- - x86_64
- <%- else %>
- if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['<{ job_name }>'] && fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
- runs-on: ubuntu-latest
- <%- endif %>
+ - ubuntu-22.04
+ if: ${{ !cancelled() && fromJSON(needs.prepare-workflow.outputs.config)['jobs']['<{ job_name }>'] }}
 needs:
 - prepare-workflow
 steps:
 - uses: actions/checkout@v4

- <%- if not prepare_actual_release %>

 - name: Set up Python 3.10
 uses: actions/setup-python@v5
 with:
 python-version: "3.10"

- <%- endif %>

 - name: Setup Python Tools Scripts
 uses: ./.github/actions/setup-python-tools-scripts
 with:

@@ -96,22 +97,27 @@
 tools changelog update-rpm --draft
 tools changelog update-rpm

+ - name: Create Release Notes Template
+ shell: bash
+ if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
+ run: |
+ if [ "${{ needs.prepare-workflow.outputs.release-changelog-target }}" == "next-major-release" ]; then
+ tools changelog update-release-notes --next-release --template-only
+ else
+ tools changelog update-release-notes --template-only
+ fi
+
 - name: Update Release Notes
 shell: bash
 if: ${{ startsWith(github.event.ref, 'refs/tags') == false }}
 run: |
- <%- if gh_environment == 'nightly' %>
- if [ "${{ contains(fromJSON('["master"]'), github.ref_name) }}" == "true" ]; then
+ if [ "${{ needs.prepare-workflow.outputs.release-changelog-target }}" == "next-major-release" ]; then
 tools changelog update-release-notes --draft <%- if prepare_actual_release %> --release <%- endif %> --next-release
 tools changelog update-release-notes <%- if prepare_actual_release %> --release <%- endif %> --next-release
 else
 tools changelog update-release-notes --draft <%- if prepare_actual_release %> --release <%- endif %>
 tools changelog update-release-notes <%- if prepare_actual_release %> --release <%- endif %>
 fi
- <%- else %>
- tools changelog update-release-notes --draft <%- if prepare_actual_release %> --release <%- endif %>
- tools changelog update-release-notes <%- if prepare_actual_release %> --release <%- endif %>
- <%- endif %>

 - name: Generate MAN Pages
 shell: bash

@@ -184,7 +190,7 @@
 <{ job_name }>:
 <%- do conclusion_needs.append(job_name) %>
 name: Documentation
- if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['<{ job_name }>'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
+ if: ${{ !cancelled() && fromJSON(needs.prepare-workflow.outputs.config)['jobs']['<{ job_name }>'] }}
 needs:
 - prepare-workflow
 - build-source-tarball

@@ -201,11 +207,11 @@

 <{ job_name }>:
 name: Build Source Tarball
- if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['<{ job_name }>'] && fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
+ if: ${{ !cancelled() && fromJSON(needs.prepare-workflow.outputs.config)['jobs']['<{ job_name }>'] }}
 needs:
 - prepare-workflow
 - prepare-release
- runs-on: ubuntu-latest
+ runs-on: ubuntu-22.04
 steps:
 - uses: actions/checkout@v4

@@ -232,51 +238,27 @@

 <%- endif %>


- <%- set job_name = "build-deps-onedir" %>
- <%- if includes.get(job_name, True) %>
-
- <{ job_name }>:
- <%- do conclusion_needs.append(job_name) %>
- name: Build Dependencies Onedir
- if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['<{ job_name }>'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
- needs:
- - prepare-workflow
- uses: ./.github/workflows/build-deps-onedir.yml
- with:
- cache-seed: ${{ needs.prepare-workflow.outputs.cache-seed }}
- salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
- self-hosted-runners: ${{ fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
- github-hosted-runners: ${{ fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
- relenv-version: "<{ relenv_version }>"
- python-version: "<{ python_version }>"
-
- <%- endif %>
-
 <%- set job_name = "build-salt-onedir" %>
 <%- if includes.get(job_name, True) %>

 <{ job_name }>:
 <%- do conclusion_needs.append(job_name) %>
 name: Build Salt Onedir
- if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['<{ job_name }>'] }}
+ if: ${{ !cancelled() && fromJSON(needs.prepare-workflow.outputs.config)['jobs']['<{ job_name }>'] }}
 needs:
 - prepare-workflow
- - build-deps-onedir
 - build-source-tarball
 uses: ./.github/workflows/build-salt-onedir.yml
 with:
 cache-seed: ${{ needs.prepare-workflow.outputs.cache-seed }}
 salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
- self-hosted-runners: ${{ fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
- github-hosted-runners: ${{ fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
 relenv-version: "<{ relenv_version }>"
 python-version: "<{ python_version }>"
+ matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['build-matrix']) }}
+ linux_arm_runner: ${{ fromJSON(needs.prepare-workflow.outputs.config)['linux_arm_runner'] }}

 <%- endif %>

 <%- set job_name = "build-pkgs" %>
 <%- if includes.get(job_name, True) %>
 <%- include "build-packages.yml.jinja" %>

@@ -302,8 +284,10 @@
 combine-all-code-coverage:
 <%- do conclusion_needs.append("combine-all-code-coverage") %>
 name: Combine Code Coverage
- if: ${{ fromJSON(needs.prepare-workflow.outputs.testrun)['skip_code_coverage'] == false }}
- runs-on: ubuntu-latest
+ if: ${{ !cancelled() && fromJSON(needs.prepare-workflow.outputs.config)['skip_code_coverage'] == false }}
+ runs-on: ubuntu-22.04
+ env:
+ PIP_INDEX_URL: https://pypi.org/simple
 needs:
 - prepare-workflow
 <%- for need in test_salt_needs.iter(consume=False) %>

@@ -340,14 +324,22 @@

 #}

- - name: Get coverage reports
- id: get-coverage-reports
- uses: actions/download-artifact@v3
- # This needs to be actions/download-artifact@v3 because we upload multiple artifacts
- # under the same name something that actions/upload-artifact@v4 does not do.
+ - name: Merge All Code Coverage Test Run Artifacts
+ continue-on-error: true
+ uses: actions/upload-artifact/merge@v4
 with:
 name: all-testrun-coverage-artifacts
+ pattern: all-testrun-coverage-artifacts-*
+ separate-directories: false
+ delete-merged: true
+
+ - name: Get coverage reports
+ id: get-coverage-reports
+ uses: actions/download-artifact@v4
+ with:
 path: artifacts/coverage/
+ pattern: all-testrun-coverage-artifacts*
+ merge-multiple: true

 - name: Display structure of downloaded files
 run: tree -a artifacts/

@@ -371,7 +363,9 @@
 nox --force-color -e create-xml-coverage-reports

 - name: Upload Code Coverage To Codecov
- if: ${{ ! github.event.repository.private }}
+ if: ${{ ! github.event.repository.private && ! github.event.repository.fork }}
+ env:
+ CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
 run: |
 tools ci upload-coverage --commit-sha=${{ github.event.pull_request.head.sha || github.sha }} artifacts/coverage/

@@ -398,6 +392,7 @@
 path: artifacts/coverage/html/salt
 retention-days: 7
 if-no-files-found: error
+ include-hidden-files: true

 - name: Report Combined Code Coverage
 run: |

@@ -414,6 +409,7 @@
 path: artifacts/coverage/coverage.json
 retention-days: 7
 if-no-files-found: error
+ include-hidden-files: true

 - name: Create Combined Code Coverage HTML Report
 run: |

@@ -426,6 +422,7 @@
 path: artifacts/coverage/html/full
 retention-days: 7
 if-no-files-found: error
+ include-hidden-files: true
 <%- endif %>

 <%- endblock jobs %>
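Two details in the coverage changes above are easy to miss: the per-testrun coverage uploads now use unique `all-testrun-coverage-artifacts-*` names and are merged back into a single artifact before download, and coverage data files are dotfiles, which newer upload-artifact releases skip unless `include-hidden-files: true` is set. A stripped-down sketch of the merge-then-download pattern, using only parameters that appear in the diff:

    - uses: actions/upload-artifact/merge@v4
      with:
        name: all-testrun-coverage-artifacts
        pattern: all-testrun-coverage-artifacts-*
        delete-merged: true
    - uses: actions/download-artifact@v4
      with:
        path: artifacts/coverage/
        pattern: all-testrun-coverage-artifacts*
        merge-multiple: true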
96
.github/workflows/templates/layout.yml.jinja
vendored
96
.github/workflows/templates/layout.yml.jinja
vendored
|
@ -5,7 +5,7 @@
|
||||||
<%- set prepare_workflow_skip_pkg_test_suite = prepare_workflow_skip_pkg_test_suite|default("") %>
|
<%- set prepare_workflow_skip_pkg_test_suite = prepare_workflow_skip_pkg_test_suite|default("") %>
|
||||||
<%- set prepare_workflow_skip_pkg_download_test_suite = prepare_workflow_skip_pkg_download_test_suite|default("") %>
|
<%- set prepare_workflow_skip_pkg_download_test_suite = prepare_workflow_skip_pkg_download_test_suite|default("") %>
|
||||||
<%- set prepare_workflow_salt_version_input = prepare_workflow_salt_version_input|default("") %>
|
<%- set prepare_workflow_salt_version_input = prepare_workflow_salt_version_input|default("") %>
|
||||||
<%- set skip_test_coverage_check = skip_test_coverage_check|default("${{ fromJSON(needs.prepare-workflow.outputs.testrun)['skip_code_coverage'] }}") %>
|
<%- set skip_test_coverage_check = skip_test_coverage_check|default("${{ fromJSON(needs.prepare-workflow.outputs.config)['skip_code_coverage'] }}") %>
|
||||||
<%- set gpg_key_id = "64CBBC8173D76B3F" %>
|
<%- set gpg_key_id = "64CBBC8173D76B3F" %>
|
||||||
<%- set prepare_actual_release = prepare_actual_release | default(False) %>
|
<%- set prepare_actual_release = prepare_actual_release | default(False) %>
|
||||||
<%- set gh_actions_workflows_python_version = "3.10" %>
|
<%- set gh_actions_workflows_python_version = "3.10" %>
|
||||||
|
@ -34,7 +34,7 @@ on:
|
||||||
|
|
||||||
env:
|
env:
|
||||||
COLUMNS: 190
|
COLUMNS: 190
|
||||||
CACHE_SEED: SEED-7 # Bump the number to invalidate all caches
|
CACHE_SEED: SEED-1 # Bump the number to invalidate all caches
|
||||||
RELENV_DATA: "${{ github.workspace }}/.relenv"
|
RELENV_DATA: "${{ github.workspace }}/.relenv"
|
||||||
PIP_DISABLE_PIP_VERSION_CHECK: "1"
|
PIP_DISABLE_PIP_VERSION_CHECK: "1"
|
||||||
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
|
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
|
||||||
|
@ -50,6 +50,7 @@ permissions:
|
||||||
actions: read # for technote-space/workflow-conclusion-action to get the job statuses
|
actions: read # for technote-space/workflow-conclusion-action to get the job statuses
|
||||||
<%- endif %>
|
<%- endif %>
|
||||||
|
|
||||||
|
|
||||||
<%- endblock permissions %>
|
<%- endblock permissions %>
|
||||||
|
|
||||||
<%- block concurrency %>
|
<%- block concurrency %>
|
||||||
|
@ -77,7 +78,8 @@ jobs:
|
||||||
|
|
||||||
prepare-workflow:
|
prepare-workflow:
|
||||||
name: Prepare Workflow Run
|
name: Prepare Workflow Run
|
||||||
runs-on: ubuntu-latest
|
runs-on: ubuntu-22.04
|
||||||
|
environment: ci
|
||||||
<%- if prepare_workflow_if_check %>
|
<%- if prepare_workflow_if_check %>
|
||||||
if: <{ prepare_workflow_if_check }>
|
if: <{ prepare_workflow_if_check }>
|
||||||
<%- endif %>
|
<%- endif %>
|
||||||
|
@ -88,17 +90,19 @@ jobs:
|
||||||
<%- endfor %>
|
<%- endfor %>
|
||||||
<%- endif %>
|
<%- endif %>
|
||||||
outputs:
|
outputs:
|
||||||
jobs: ${{ steps.define-jobs.outputs.jobs }}
|
|
||||||
runners: ${{ steps.runner-types.outputs.runners }}
|
|
||||||
changed-files: ${{ steps.process-changed-files.outputs.changed-files }}
|
changed-files: ${{ steps.process-changed-files.outputs.changed-files }}
|
||||||
pull-labels: ${{ steps.get-pull-labels.outputs.labels }}
|
|
||||||
testrun: ${{ steps.define-testrun.outputs.testrun }}
|
|
||||||
salt-version: ${{ steps.setup-salt-version.outputs.salt-version }}
|
salt-version: ${{ steps.setup-salt-version.outputs.salt-version }}
|
||||||
cache-seed: ${{ steps.set-cache-seed.outputs.cache-seed }}
|
cache-seed: ${{ steps.set-cache-seed.outputs.cache-seed }}
|
||||||
latest-release: ${{ steps.get-salt-releases.outputs.latest-release }}
|
latest-release: ${{ steps.get-salt-releases.outputs.latest-release }}
|
||||||
releases: ${{ steps.get-salt-releases.outputs.releases }}
|
releases: ${{ steps.get-salt-releases.outputs.releases }}
|
||||||
|
release-changelog-target: ${{ steps.get-release-changelog-target.outputs.release-changelog-target }}
|
||||||
testing-releases: ${{ steps.get-testing-releases.outputs.testing-releases }}
|
testing-releases: ${{ steps.get-testing-releases.outputs.testing-releases }}
|
||||||
nox-archive-hash: ${{ steps.nox-archive-hash.outputs.nox-archive-hash }}
|
nox-archive-hash: ${{ steps.nox-archive-hash.outputs.nox-archive-hash }}
|
||||||
|
config: ${{ steps.workflow-config.outputs.config }}
|
||||||
|
env:
|
||||||
|
LINUX_ARM_RUNNER: ${{ vars.LINUX_ARM_RUNNER }}
|
||||||
|
FULL_TESTRUN_SLUGS: ${{ vars.FULL_TESTRUN_SLUGS }}
|
||||||
|
PR_TESTRUN_SLUGS: ${{ vars.PR_TESTRUN_SLUGS }}
|
||||||
steps:
|
steps:
|
||||||
- uses: actions/checkout@v4
|
- uses: actions/checkout@v4
|
||||||
with:
|
with:
|
||||||
@@ -107,7 +111,7 @@ jobs:
- name: Get Changed Files
if: ${{ github.event_name == 'pull_request'}}
id: changed-files
-uses: dorny/paths-filter@v2
+uses: dorny/paths-filter@v3
with:
token: ${{ github.token }}
list-files: json

@@ -174,6 +178,9 @@ jobs:
- pkg/**
- *pkg_requirements
- *salt_added_modified
+nsis_tests:
+- added|modified: &nsis_tests
+- pkg/windows/nsis/**
testrun:
- added|modified:
- *pkg_requirements

@@ -208,14 +215,6 @@ jobs:
salt-version: "<{ prepare_workflow_salt_version_input }>"
validate-version: true

-- name: Get Pull Request Test Labels
-id: get-pull-labels
-if: ${{ github.event_name == 'pull_request'}}
-env:
-GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-run: |
-tools ci get-pr-test-labels --repository ${{ github.repository }}
-
- name: Get Hash For Nox Tarball Cache
id: nox-archive-hash
run: |

@@ -254,26 +253,6 @@ jobs:
run: |
echo '${{ steps.process-changed-files.outputs.changed-files }}' | jq -C '.'

-- name: Define Runner Types
-id: runner-types
-run: |
-tools ci runner-types ${{ github.event_name }}
-
-- name: Check Defined Runners
-run: |
-echo '${{ steps.runner-types.outputs.runners }}' | jq -C '.'
-
-- name: Define Jobs
-id: define-jobs
-run: |
-tools ci define-jobs<{ prepare_workflow_skip_test_suite }><{
-prepare_workflow_skip_pkg_test_suite }><{ prepare_workflow_skip_pkg_download_test_suite
-}> ${{ github.event_name }} changed-files.json
-
-- name: Check Defined Jobs
-run: |
-echo '${{ steps.define-jobs.outputs.jobs }}' | jq -C '.'
-
- name: Get Salt Releases
id: get-salt-releases
env:
@@ -288,42 +267,45 @@ jobs:
run: |
tools ci get-testing-releases ${{ join(fromJSON(steps.get-salt-releases.outputs.releases), ' ') }} --salt-version ${{ steps.setup-salt-version.outputs.salt-version }}

-- name: Define Testrun
-id: define-testrun
+- name: Define workflow config
+id: workflow-config
run: |
-tools ci define-testrun ${{ github.event_name }} changed-files.json
-
-- name: Check Defined Test Run
-run: |
-echo '${{ steps.define-testrun.outputs.testrun }}' | jq -C '.'
+tools ci workflow-config<{ prepare_workflow_skip_test_suite }><{
+prepare_workflow_skip_pkg_test_suite }><{ prepare_workflow_skip_pkg_download_test_suite
+}> ${{ steps.setup-salt-version.outputs.salt-version }} ${{ github.event_name }} changed-files.json

- name: Check Contents of generated testrun-changed-files.txt
-if: ${{ fromJSON(steps.define-testrun.outputs.testrun)['type'] != 'full' }}
+if: ${{ fromJSON(steps.workflow-config.outputs.config)['testrun']['type'] != 'full' }}
run: |
cat testrun-changed-files.txt || true

- name: Upload testrun-changed-files.txt
-if: ${{ fromJSON(steps.define-testrun.outputs.testrun)['type'] != 'full' }}
+if: ${{ fromJSON(steps.workflow-config.outputs.config)['testrun']['type'] != 'full' }}
uses: actions/upload-artifact@v4
with:
name: testrun-changed-files.txt
path: testrun-changed-files.txt

+- name: Get Release Changelog Target
+id: get-release-changelog-target
+run: |
+tools ci get-release-changelog-target ${{ github.event_name }}
+
{# We can't yet use tokenless uploads with the codecov CLI

- name: Install Codecov CLI
-if: ${{ fromJSON(steps.define-testrun.outputs.testrun)['skip_code_coverage'] == false }}
+if: ${{ fromJSON(steps.define-testrun.outputs.config)['skip_code_coverage'] == false }}
run: |
python3 -m pip install codecov-cli

- name: Save Commit Metadata In Codecov
-if: ${{ fromJSON(steps.define-testrun.outputs.testrun)['skip_code_coverage'] == false }}
+if: ${{ fromJSON(steps.define-testrun.outputs.config)['skip_code_coverage'] == false }}
run: |
codecovcli --auto-load-params-from GithubActions --verbose --token ${{ secrets.CODECOV_TOKEN }} \
create-commit --git-service github --sha ${{ github.sha }}

- name: Create Codecov Coverage Report
-if: ${{ fromJSON(steps.define-testrun.outputs.testrun)['skip_code_coverage'] == false }}
+if: ${{ fromJSON(steps.define-testrun.outputs.config)['skip_code_coverage'] == false }}
run: |
codecovcli --auto-load-params-from GithubActions --verbose --token ${{ secrets.CODECOV_TOKEN }} \
create-report --git-service github --sha ${{ github.sha }}
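The old Check Defined Runners / Check Defined Jobs debug steps are dropped together with the commands that fed them. If the same visibility were wanted against the consolidated output, a step along these lines could be added; it is not part of the diff above, only a sketch:

      - name: Check workflow config
        run: |
          echo '${{ steps.workflow-config.outputs.config }}' | jq -C '.'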
@@ -334,13 +316,12 @@ jobs:
<%- endif %>
-
<%- endblock jobs %>

set-pipeline-exit-status:
# This step is just so we can make github require this step, to pass checks
# on a pull request instead of requiring all
name: Set the ${{ github.workflow }} Pipeline Exit Status
-if: always()
+if: ${{ !cancelled() && always() }}
-runs-on: ubuntu-latest
+runs-on: ubuntu-22.04
<%- if workflow_slug == "nightly" %>
environment: <{ workflow_slug }>
<%- endif %>

@@ -360,10 +341,14 @@ jobs:
<%- for need in test_repo_needs.iter(consume=True) %>
- <{ need }>
<%- endfor %>
+<%- if workflow_slug != "release" %>
+- test-packages
+- test
+<%- endif %>
steps:
- name: Get workflow information
id: get-workflow-info
-uses: technote-space/workflow-conclusion-action@v3
+uses: im-open/workflow-conclusion@v2

<%- block set_pipeline_exit_status_extra_steps %>
<%- endblock set_pipeline_exit_status_extra_steps %>

@@ -371,13 +356,8 @@ jobs:
- name: Set Pipeline Exit Status
shell: bash
run: |
-if [ "${{ steps.get-workflow-info.outputs.conclusion }}" != "success" ]; then
+if [ "${{ steps.get-workflow-info.outputs.workflow_conclusion }}" != "success" ]; then
exit 1
else
exit 0
fi
-
-- name: Done
-if: always()
-run:
-echo "All worflows finished"
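With technote-space/workflow-conclusion-action@v3 swapped for im-open/workflow-conclusion@v2, the aggregated result moves from the conclusion output to workflow_conclusion. A condensed sketch of the resulting gate, assembled from the lines above rather than an additional change:

      - name: Get workflow information
        id: get-workflow-info
        uses: im-open/workflow-conclusion@v2

      - name: Set Pipeline Exit Status
        shell: bash
        run: |
          # Fail the required check unless every needed job succeeded.
          if [ "${{ steps.get-workflow-info.outputs.workflow_conclusion }}" != "success" ]; then
            exit 1
          fi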
.github/workflows/templates/nightly.yml.jinja (215 changes)

@@ -1,8 +1,7 @@
<%- set gh_environment = gh_environment|default("nightly") %>
-<%- set skip_test_coverage_check = skip_test_coverage_check|default("false") %>
+<%- set skip_test_coverage_check = skip_test_coverage_check|default("true") %>
<%- set prepare_workflow_skip_test_suite = "${{ inputs.skip-salt-test-suite && ' --skip-tests' || '' }}" %>
<%- set prepare_workflow_skip_pkg_test_suite = "${{ inputs.skip-salt-pkg-test-suite && ' --skip-pkg-tests' || '' }}" %>
-<%- set prepare_workflow_if_check = prepare_workflow_if_check|default("${{ fromJSON(needs.workflow-requirements.outputs.requirements-met) }}") %>
<%- extends 'ci.yml.jinja' %>

<%- block name %>

@@ -25,9 +24,6 @@ on:
type: boolean
default: false
description: Skip running the Salt packages test suite.
-schedule:
-# https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#onschedule
-- cron: '0 0 * * *' # Every day at 0AM

<%- endblock on %>

@@ -48,219 +44,10 @@ concurrency:

<%- block pre_jobs %>

-<%- include "workflow-requirements-check.yml.jinja" %>
-<%- include "trigger-branch-workflows.yml.jinja" %>
-
-{#- When we start using a slack app, we can update messages, not while using incoming webhooks
-<%- if workflow_slug == "nightly" %>
-
-<%- do conclusion_needs.append('notify-slack') %>
-notify-slack:
-name: Notify Slack
-runs-on: ubuntu-latest
-environment: <{ gh_environment }>
-needs:
-<%- for need in prepare_workflow_needs.iter(consume=False) %>
-- <{ need }>
-<%- endfor %>
-outputs:
-update-ts: ${{ steps.slack.outputs.update-ts }}
-steps:
-- name: Notify Slack
-id: slack
-uses: slackapi/slack-github-action@v1.24.0
-with:
-payload: |
-{
-"attachments": [
-{
-"color": "ffca28",
-"fields": [
-{
-"title": "Workflow",
-"short": true,
-"value": "${{ github.workflow }}",
-"type": "mrkdwn"
-},
-{
-"title": "Workflow Run",
-"short": true,
-"value": "<${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}|${{ github.run_id }}>",
-"type": "mrkdwn"
-},
-{
-"title": "Branch",
-"short": true,
-"value": "${{ github.ref_name }}",
-"type": "mrkdwn"
-},
-{
-"title": "Commit",
-"short": true,
-"value": "<${{ github.server_url }}/${{ github.repository }}/commit/${{ github.sha }}|${{ github.sha }}>",
-"type": "mrkdwn"
-},
-{
-"title": "Attempt",
-"short": true,
-"value": "${{ github.run_attempt }}",
-"type": "mrkdwn"
-},
-{
-"title": "Status",
-"short": true,
-"value": "running",
-"type": "mrkdwn"
-}
-],
-"author_name": "${{ github.event.sender.login }}",
-"author_link": "${{ github.event.sender.html_url }}",
-"author_icon": "${{ github.event.sender.avatar_url }}"
-}
-]
-}
-env:
-SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
-SLACK_WEBHOOK_TYPE: INCOMING_WEBHOOK
-
-<%- endif %>
-#}
-
<%- endblock pre_jobs %>

<%- block jobs %>
<{- super() }>

-<%- if includes.get("build-repos", True) %>
-<%- include "build-repos.yml.jinja" %>
-<%- endif %>
-
-publish-repositories:
-<%- do conclusion_needs.append('publish-repositories') %>
-name: Publish Repositories
-if: ${{ always() && ! failure() && ! cancelled() }}
-runs-on:
-- self-hosted
-- linux
-- repo-<{ gh_environment }>
-environment: <{ gh_environment }>
-needs:
-- prepare-workflow
-<%- for need in build_repo_needs.iter(consume=True) %>
-- <{ need }>
-<%- endfor %>
-<%- if workflow_slug == "nightly" %>
-<%- for need in test_salt_needs.iter(consume=True) %>
-- <{ need }>
-<%- endfor %>
-<%- endif %>
-
-steps:
-- uses: actions/checkout@v4
-
-- name: Get Salt Project GitHub Actions Bot Environment
-run: |
-TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
-SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
-echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
-
-- name: Setup Python Tools Scripts
-uses: ./.github/actions/setup-python-tools-scripts
-with:
-cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}
-
-- name: Download Repository Artifact
-uses: actions/download-artifact@v3
-# This needs to be actions/download-artifact@v3 because we upload multiple artifacts
-# under the same name something that actions/upload-artifact@v4 does not do.
-with:
-name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-repo
-path: repo/
-
-- name: Decompress Repository Artifacts
-run: |
-find repo/ -type f -name '*.tar.gz' -print -exec tar xvf {} \;
-find repo/ -type f -name '*.tar.gz' -print -exec rm -f {} \;
-
-- name: Show Repository
-run: |
-tree -a artifacts/pkgs/repo/
-
-- name: Upload Repository Contents (<{ gh_environment }>)
-env:
-SALT_REPO_DOMAIN_RELEASE: ${{ vars.SALT_REPO_DOMAIN_RELEASE || 'repo.saltproject.io' }}
-SALT_REPO_DOMAIN_STAGING: ${{ vars.SALT_REPO_DOMAIN_STAGING || 'staging.repo.saltproject.io' }}
-run: |
-tools pkg repo publish <{ gh_environment }> --salt-version=${{ needs.prepare-workflow.outputs.salt-version }} artifacts/pkgs/repo/
-
<%- endblock jobs %>

-<%- block set_pipeline_exit_status_extra_steps %>
-
-<%- if workflow_slug == "nightly" %>
-
-- name: Notify Slack
-id: slack
-if: always()
-uses: slackapi/slack-github-action@v1.24.0
-with:
-{#- When we start using a slack app, we can update messages, not while using incoming webhooks
-update-ts: ${{ needs.notify-slack.outputs.update-ts }}
-#}
-payload: |
-{
-"attachments": [
-{
-"fallback": "${{ github.workflow }} Workflow build result for the `${{ github.ref_name }}` branch(attempt: ${{ github.run_attempt }}): `${{ steps.get-workflow-info.outputs.conclusion }}`\n${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}",
-"color": "${{ steps.get-workflow-info.outputs.conclusion != 'success' && 'ff3d00' || '00e676' }}",
-"fields": [
-{
-"title": "Workflow",
-"short": true,
-"value": "${{ github.workflow }}",
-"type": "mrkdwn"
-},
-{
-"title": "Workflow Run",
-"short": true,
-"value": "<${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}|${{ github.run_id }}>",
-"type": "mrkdwn"
-},
-{
-"title": "Branch",
-"short": true,
-"value": "${{ github.ref_name }}",
-"type": "mrkdwn"
-},
-{
-"title": "Commit",
-"short": true,
-"value": "<${{ github.server_url }}/${{ github.repository }}/commit/${{ github.sha }}|${{ github.sha }}>",
-"type": "mrkdwn"
-},
-{
-"title": "Attempt",
-"short": true,
-"value": "${{ github.run_attempt }}",
-"type": "mrkdwn"
-},
-{
-"title": "Status",
-"short": true,
-"value": "${{ steps.get-workflow-info.outputs.conclusion }}",
-"type": "mrkdwn"
-}
-],
-"author_name": "${{ github.event.sender.login }}",
-"author_link": "${{ github.event.sender.html_url }}",
-"author_icon": "${{ github.event.sender.avatar_url }}"
-}
-]
-}
-env:
-SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
-SLACK_WEBHOOK_TYPE: INCOMING_WEBHOOK
-
-<%- endif %>
-
-<%- endblock set_pipeline_exit_status_extra_steps %>
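The nightly template keeps the workflow_dispatch skip inputs and maps them to CLI flags through GitHub's ternary-style expression (condition && value || fallback). A small self-contained sketch of that pattern; the input name and flag come from the lines above, the job itself is hypothetical:

    on:
      workflow_dispatch:
        inputs:
          skip-salt-test-suite:
            type: boolean
            default: false
            description: Skip running the Salt test suite.

    jobs:
      show-flag:  # hypothetical job, for illustration only
        runs-on: ubuntu-22.04
        steps:
          - name: Echo the computed flag
            run: |
              echo "flag='${{ inputs.skip-salt-test-suite && '--skip-tests' || '' }}'"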
.github/workflows/templates/release.yml.jinja (62 changes)

@@ -52,7 +52,7 @@ permissions:
<{ job_name }>:
<%- do prepare_workflow_needs.append(job_name) %>
name: Check Requirements
-runs-on: ubuntu-latest
+runs-on: ubuntu-22.04
environment: <{ gh_environment }>-check
steps:
- name: Check For Admin Permission

@@ -71,9 +71,9 @@ permissions:
prepare-workflow:
name: Prepare Workflow Run
runs-on:
-- self-hosted
+- linux-x86_64
-- linux
+env:
-- repo-<{ gh_environment }>
+USE_S3_CACHE: 'false'
environment: <{ gh_environment }>
<%- if prepare_workflow_needs %>
needs:

@@ -87,6 +87,7 @@ permissions:
latest-release: ${{ steps.get-salt-releases.outputs.latest-release }}
releases: ${{ steps.get-salt-releases.outputs.releases }}
nox-archive-hash: ${{ steps.nox-archive-hash.outputs.nox-archive-hash }}
+config: ${{ steps.workflow-config.outputs.config }}
steps:
- uses: actions/checkout@v4
with:

@@ -145,6 +146,14 @@ permissions:
run: |
echo "nox-archive-hash=<{ nox_archive_hashfiles }>" | tee -a "$GITHUB_OUTPUT"

+- name: Define workflow config
+id: workflow-config
+run: |
+tools ci workflow-config<{ prepare_workflow_skip_test_suite }><{
+prepare_workflow_skip_pkg_test_suite }><{ prepare_workflow_skip_pkg_download_test_suite
+}> ${{ steps.setup-salt-version.outputs.salt-version }} ${{ github.event_name }} changed-files.json
+
+
<%- endblock prepare_workflow_job %>
<%- endif %>

@@ -154,9 +163,9 @@ permissions:
download-onedir-artifact:
name: Download Staging Onedir Artifact
runs-on:
-- self-hosted
+- linux-x86_64
-- linux
+env:
-- repo-<{ gh_environment }>
+USE_S3_CACHE: 'true'
environment: <{ gh_environment }>
needs:
- prepare-workflow

@@ -207,11 +216,11 @@ permissions:
backup:
name: Backup
runs-on:
-- self-hosted
+- linux-x86_64
-- linux
-- repo-<{ gh_environment }>
needs:
- prepare-workflow
+env:
+USE_S3_CACHE: 'true'
environment: <{ gh_environment }>
outputs:
backup-complete: ${{ steps.backup.outputs.backup-complete }}

@@ -239,15 +248,14 @@ permissions:
<%- do conclusion_needs.append('publish-repositories') %>
name: Publish Repositories
runs-on:
-- self-hosted
+- linux-x86_64
-- linux
+env:
-- repo-<{ gh_environment }>
+USE_S3_CACHE: 'true'
needs:
- prepare-workflow
- backup
- download-onedir-artifact
environment: <{ gh_environment }>
-
steps:
- name: Clone The Salt Repository
uses: actions/checkout@v4

@@ -270,18 +278,14 @@ permissions:
run: |
tools pkg repo publish <{ gh_environment }> ${{ needs.prepare-workflow.outputs.salt-version }}

-<%- if includes.get("test-pkg-downloads", True) %>
-<%- include "test-salt-pkg-repo-downloads.yml.jinja" %>
-<%- endif %>
-
release:
<%- do conclusion_needs.append('release') %>
name: Release v${{ needs.prepare-workflow.outputs.salt-version }}
if: ${{ always() && ! failure() && ! cancelled() }}
runs-on:
-- self-hosted
+- linux-x86_64
-- linux
+env:
-- repo-<{ gh_environment }>
+USE_S3_CACHE: 'true'
needs:
- prepare-workflow
- backup

@@ -363,7 +367,7 @@ permissions:
branch: ${{ github.ref }}

- name: Create Github Release
-uses: ncipollo/release-action@v1.12.0
+uses: ncipollo/release-action@v1
with:
artifactErrorsFailBuild: true
artifacts: ${{ steps.prepare-release.outputs.release-artifacts }}

@@ -393,9 +397,9 @@ permissions:
name: Restore Release Bucket From Backup
if: ${{ always() && needs.backup.outputs.backup-complete == 'true' && (failure() || cancelled()) }}
runs-on:
-- self-hosted
+- linux-x86_64
-- linux
+env:
-- repo-<{ gh_environment }>
+USE_S3_CACHE: 'true'
needs:
- backup
- release

@@ -434,9 +438,9 @@ permissions:
- restore #}
environment: <{ gh_environment }>
runs-on:
-- self-hosted
+- linux-x86_64
-- linux
+env:
-- repo-<{ gh_environment }>
+USE_S3_CACHE: 'true'
steps:
- uses: actions/checkout@v4

@@ -494,3 +498,5 @@ permissions:
echo '```' >> "${GITHUB_STEP_SUMMARY}"
fi
<%- endblock set_pipeline_exit_status_extra_steps %>
+<%- block retry %>
+<%- endblock retry %>

@@ -1,5 +1,5 @@
<%- set prepare_workflow_if_check = "${{ fromJSON(needs.workflow-requirements.outputs.requirements-met) }}" %>
-<%- set skip_test_coverage_check = "false" %>
+<%- set skip_test_coverage_check = "true" %>
<%- extends 'ci.yml.jinja' %>

|
|
89
.github/workflows/templates/staging.yml.jinja
vendored
89
.github/workflows/templates/staging.yml.jinja
vendored
|
@ -51,9 +51,9 @@ on:
|
||||||
|
|
||||||
<%- block concurrency %>
|
<%- block concurrency %>
|
||||||
|
|
||||||
concurrency:
|
#concurrency:
|
||||||
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.repository }}
|
# group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.repository }}
|
||||||
cancel-in-progress: false
|
# cancel-in-progress: false
|
||||||
|
|
||||||
<%- endblock concurrency %>
|
<%- endblock concurrency %>
|
||||||
|
|
||||||
|
@ -65,7 +65,7 @@ concurrency:
|
||||||
<{ job_name }>:
|
<{ job_name }>:
|
||||||
<%- do prepare_workflow_needs.append(job_name) %>
|
<%- do prepare_workflow_needs.append(job_name) %>
|
||||||
name: Check Requirements
|
name: Check Requirements
|
||||||
runs-on: ubuntu-latest
|
runs-on: ubuntu-22.04
|
||||||
environment: <{ gh_environment }>-check
|
environment: <{ gh_environment }>-check
|
||||||
steps:
|
steps:
|
||||||
- name: Check For Admin Permission
|
- name: Check For Admin Permission
|
||||||
|
@ -86,21 +86,12 @@ concurrency:
|
||||||
needs:
|
needs:
|
||||||
- prepare-workflow
|
- prepare-workflow
|
||||||
- build-docs
|
- build-docs
|
||||||
- build-src-repo
|
|
||||||
environment: <{ gh_environment }>
|
environment: <{ gh_environment }>
|
||||||
runs-on:
|
runs-on:
|
||||||
- self-hosted
|
- ubuntu-22.04
|
||||||
- linux
|
|
||||||
- repo-<{ gh_environment }>
|
|
||||||
steps:
|
steps:
|
||||||
- uses: actions/checkout@v4
|
- uses: actions/checkout@v4
|
||||||
|
|
||||||
- name: Get Salt Project GitHub Actions Bot Environment
|
|
||||||
run: |
|
|
||||||
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
|
|
||||||
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
|
|
||||||
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
|
|
||||||
|
|
||||||
- name: Setup Python Tools Scripts
|
- name: Setup Python Tools Scripts
|
||||||
uses: ./.github/actions/setup-python-tools-scripts
|
uses: ./.github/actions/setup-python-tools-scripts
|
||||||
with:
|
with:
|
||||||
|
@ -112,60 +103,20 @@ concurrency:
|
||||||
name: salt-${{ needs.prepare-workflow.outputs.salt-version }}.patch
|
name: salt-${{ needs.prepare-workflow.outputs.salt-version }}.patch
|
||||||
path: artifacts/release
|
path: artifacts/release
|
||||||
|
|
||||||
- name: Download Source Repository
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-<{ gh_environment }>-src-repo
|
|
||||||
path: artifacts/release
|
|
||||||
|
|
||||||
- name: Download Release Documentation (HTML)
|
- name: Download Release Documentation (HTML)
|
||||||
uses: actions/download-artifact@v4
|
uses: actions/download-artifact@v4
|
||||||
with:
|
with:
|
||||||
name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-docs-html.tar.xz
|
name: salt-${{ needs.prepare-workflow.outputs.salt-version }}-docs-html.tar.xz
|
||||||
path: artifacts/release
|
path: artifacts/release
|
||||||
|
|
||||||
- name: Download Release Documentation (ePub)
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: Salt-${{ needs.prepare-workflow.outputs.salt-version }}.epub
|
|
||||||
path: artifacts/release
|
|
||||||
|
|
||||||
- name: Show Release Artifacts
|
- name: Show Release Artifacts
|
||||||
run: |
|
run: |
|
||||||
tree -a artifacts/release
|
tree -a artifacts/release
|
||||||
|
|
||||||
{#-
|
|
||||||
|
|
||||||
- name: Download Release Documentation (PDF)
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: Salt-${{ needs.prepare-workflow.outputs.salt-version }}.pdf
|
|
||||||
path: artifacts/release
|
|
||||||
|
|
||||||
#}
|
|
||||||
|
|
||||||
- name: Upload Release Artifacts
|
|
||||||
run: |
|
|
||||||
tools release upload-artifacts ${{ needs.prepare-workflow.outputs.salt-version }} artifacts/release
|
|
||||||
|
|
||||||
- name: Upload PyPi Artifacts
|
|
||||||
uses: actions/upload-artifact@v4
|
|
||||||
with:
|
|
||||||
name: pypi-artifacts
|
|
||||||
path: |
|
|
||||||
artifacts/release/salt-${{ needs.prepare-workflow.outputs.salt-version }}.tar.gz
|
|
||||||
artifacts/release/salt-${{ needs.prepare-workflow.outputs.salt-version }}.tar.gz.asc
|
|
||||||
retention-days: 7
|
|
||||||
if-no-files-found: error
|
|
||||||
|
|
||||||
<%- if includes.get("test-pkg-downloads", True) %>
|
|
||||||
<%- include "test-salt-pkg-repo-downloads.yml.jinja" %>
|
|
||||||
<%- endif %>
|
|
||||||
|
|
||||||
publish-pypi:
|
publish-pypi:
|
||||||
<%- do conclusion_needs.append('publish-pypi') %>
|
<%- do conclusion_needs.append('publish-pypi') %>
|
||||||
name: Publish to PyPi(test)
|
name: Publish to PyPi(test)
|
||||||
if: ${{ inputs.skip-test-pypi-publish != true && github.event.repository.fork != true }}
|
if: ${{ !cancelled() && inputs.skip-test-pypi-publish != true && github.event.repository.fork != true }}
|
||||||
needs:
|
needs:
|
||||||
- prepare-workflow
|
- prepare-workflow
|
||||||
- upload-release-artifacts
|
- upload-release-artifacts
|
||||||
|
@ -180,9 +131,7 @@ concurrency:
|
||||||
<%- endfor %>
|
<%- endfor %>
|
||||||
environment: <{ gh_environment }>
|
environment: <{ gh_environment }>
|
||||||
runs-on:
|
runs-on:
|
||||||
- self-hosted
|
- ubuntu-22.04
|
||||||
- linux
|
|
||||||
- repo-<{ gh_environment }>
|
|
||||||
steps:
|
steps:
|
||||||
- uses: actions/checkout@v4
|
- uses: actions/checkout@v4
|
||||||
|
|
||||||
|
@ -227,4 +176,28 @@ concurrency:
|
||||||
run: |
|
run: |
|
||||||
tools pkg pypi-upload --test artifacts/release/salt-${{ needs.prepare-workflow.outputs.salt-version }}.tar.gz
|
tools pkg pypi-upload --test artifacts/release/salt-${{ needs.prepare-workflow.outputs.salt-version }}.tar.gz
|
||||||
|
|
||||||
|
draft-release:
|
||||||
|
name: Draft Github Release
|
||||||
|
if: ${{ !cancelled() && (needs.test.result == 'success' || needs.test.result == 'skipped') &&
|
||||||
|
(needs.test-packages.result == 'success' || needs.test-packages.result == 'skipped') &&
|
||||||
|
needs.prepare-workflow.result == 'success' && needs.build-salt-onedir.result == 'success' &&
|
||||||
|
needs.build-pkgs-onedir.result == 'success' && needs.pre-commit.result == 'success' }}
|
||||||
|
needs:
|
||||||
|
- prepare-workflow
|
||||||
|
- pre-commit
|
||||||
|
- build-salt-onedir
|
||||||
|
- build-pkgs-onedir
|
||||||
|
- test-packages
|
||||||
|
- test
|
||||||
|
permissions:
|
||||||
|
contents: write
|
||||||
|
pull-requests: read
|
||||||
|
id-token: write
|
||||||
|
uses: ./.github/workflows/draft-release.yml
|
||||||
|
with:
|
||||||
|
salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
|
||||||
|
matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['artifact-matrix']) }}
|
||||||
|
build-matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['build-matrix']) }}
|
||||||
|
|
||||||
|
|
||||||
<%- endblock jobs %>
|
<%- endblock jobs %>
|
||||||
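The new draft-release job passes two JSON matrices pulled out of prepare-workflow's config output into the reusable draft-release.yml workflow. That workflow's contents are not part of this diff; the sketch below only illustrates, with hypothetical names, how a called workflow can turn such a JSON input into a build matrix:

    # hypothetical reusable workflow, e.g. .github/workflows/example-called.yml
    on:
      workflow_call:
        inputs:
          matrix:
            type: string
            required: true
    jobs:
      fan-out:
        strategy:
          fail-fast: false
          matrix:
            include: ${{ fromJSON(inputs.matrix) }}
        runs-on: ubuntu-22.04
        steps:
          - run: echo "building for ${{ toJSON(matrix) }}"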

@@ -1,673 +0,0 @@
name: Test Download Packages
|
|
||||||
|
|
||||||
on:
|
|
||||||
workflow_call:
|
|
||||||
inputs:
|
|
||||||
salt-version:
|
|
||||||
type: string
|
|
||||||
required: true
|
|
||||||
description: The Salt version of the packages to install and test
|
|
||||||
cache-prefix:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: Seed used to invalidate caches
|
|
||||||
environment:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The environment to run tests against
|
|
||||||
latest-release:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The latest salt release
|
|
||||||
nox-version:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The nox version to install
|
|
||||||
python-version:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The python version to run tests with
|
|
||||||
default: "3.10"
|
|
||||||
package-name:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The onedir package name to use
|
|
||||||
default: salt
|
|
||||||
skip-code-coverage:
|
|
||||||
required: false
|
|
||||||
type: boolean
|
|
||||||
description: Skip code coverage
|
|
||||||
default: false
|
|
||||||
nox-session:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The nox session to run
|
|
||||||
default: ci-test-onedir
|
|
||||||
|
|
||||||
env:
|
|
||||||
COLUMNS: 190
|
|
||||||
AWS_MAX_ATTEMPTS: "10"
|
|
||||||
AWS_RETRY_MODE: "adaptive"
|
|
||||||
PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
|
|
||||||
PIP_EXTRA_INDEX_URL: https://pypi.org/simple
|
|
||||||
PIP_DISABLE_PIP_VERSION_CHECK: "1"
|
|
||||||
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
|
|
||||||
|
|
||||||
jobs:
|
|
||||||
|
|
||||||
linux:
|
|
||||||
name: Linux
|
|
||||||
runs-on:
|
|
||||||
- self-hosted
|
|
||||||
- linux
|
|
||||||
- bastion
|
|
||||||
environment: ${{ inputs.environment }}
|
|
||||||
timeout-minutes: 120 # 2 Hours - More than this and something is wrong
|
|
||||||
strategy:
|
|
||||||
fail-fast: false
|
|
||||||
matrix:
|
|
||||||
include:
|
|
||||||
<%- for slug, arch, pkg_type in test_salt_pkg_downloads_listing["linux"] %>
|
|
||||||
- distro-slug: <{ slug }>
|
|
||||||
arch: <{ arch }>
|
|
||||||
pkg-type: <{ pkg_type }>
|
|
||||||
<%- endfor %>
|
|
||||||
|
|
||||||
steps:
|
|
||||||
|
|
||||||
- name: "Throttle Builds"
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
|
|
||||||
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Download Onedir Tarball as an Artifact
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-linux-${{ matrix.arch == 'aarch64' && 'arm64' || matrix.arch }}.tar.xz
|
|
||||||
path: artifacts/
|
|
||||||
|
|
||||||
- name: Decompress Onedir Tarball
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
|
|
||||||
cd artifacts
|
|
||||||
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-linux-${{ matrix.arch == 'aarch64' && 'arm64' || matrix.arch }}.tar.xz
|
|
||||||
|
|
||||||
- name: Download nox.linux.${{ matrix.arch == 'aarch64' && 'arm64' || matrix.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: nox-linux-${{ matrix.arch == 'aarch64' && 'arm64' || matrix.arch }}-${{ inputs.nox-session }}
|
|
||||||
|
|
||||||
- name: Setup Python Tools Scripts
|
|
||||||
uses: ./.github/actions/setup-python-tools-scripts
|
|
||||||
with:
|
|
||||||
cache-prefix: ${{ inputs.cache-prefix }}-pkg-download-linux
|
|
||||||
|
|
||||||
- name: Get Salt Project GitHub Actions Bot Environment
|
|
||||||
run: |
|
|
||||||
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
|
|
||||||
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
|
|
||||||
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
|
|
||||||
|
|
||||||
- name: Start VM
|
|
||||||
id: spin-up-vm
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ matrix.distro-slug }}
|
|
||||||
|
|
||||||
- name: List Free Space
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm ssh ${{ matrix.distro-slug }} -- df -h || true
|
|
||||||
|
|
||||||
- name: Upload Checkout To VM
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm rsync ${{ matrix.distro-slug }}
|
|
||||||
|
|
||||||
- name: Decompress .nox Directory
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm decompress-dependencies ${{ matrix.distro-slug }}
|
|
||||||
|
|
||||||
- name: Show System Info
|
|
||||||
run: |
|
|
||||||
tools --timestamps --timeout-secs=1800 vm test --skip-requirements-install --print-system-information-only \
|
|
||||||
--nox-session=${{ inputs.nox-session }}-pkgs ${{ matrix.distro-slug }} -- download-pkgs
|
|
||||||
|
|
||||||
- name: Run Package Download Tests
|
|
||||||
env:
|
|
||||||
SALT_RELEASE: "${{ inputs.salt-version }}"
|
|
||||||
SALT_REPO_ARCH: ${{ matrix.arch }}
|
|
||||||
SALT_REPO_TYPE: ${{ inputs.environment }}
|
|
||||||
SALT_REPO_USER: ${{ secrets.SALT_REPO_USER }}
|
|
||||||
SALT_REPO_PASS: ${{ secrets.SALT_REPO_PASS }}
|
|
||||||
SALT_REPO_DOMAIN_RELEASE: ${{ vars.SALT_REPO_DOMAIN_RELEASE || 'repo.saltproject.io' }}
|
|
||||||
SALT_REPO_DOMAIN_STAGING: ${{ vars.SALT_REPO_DOMAIN_STAGING || 'staging.repo.saltproject.io' }}
|
|
||||||
SKIP_CODE_COVERAGE: "${{ inputs.skip-code-coverage && '1' || '0' }}"
|
|
||||||
LATEST_SALT_RELEASE: "${{ inputs.latest-release }}"
|
|
||||||
DOWNLOAD_TEST_PACKAGE_TYPE: ${{ matrix.pkg-type }}
|
|
||||||
run: |
|
|
||||||
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
|
|
||||||
-E SALT_RELEASE -E SALT_REPO_ARCH -E SALT_REPO_TYPE -E SALT_REPO_USER -E SALT_REPO_PASS \
|
|
||||||
-E SALT_REPO_DOMAIN_RELEASE -E SALT_REPO_DOMAIN_STAGING -E LATEST_SALT_RELEASE -E DOWNLOAD_TEST_PACKAGE_TYPE \
|
|
||||||
--nox-session=${{ inputs.nox-session }}-pkgs --rerun-failures ${{ matrix.distro-slug }} -- download-pkgs
|
|
||||||
|
|
||||||
- name: Combine Coverage Reports
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.spin-up-vm.outcome == 'success' && job.status != 'cancelled'
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm combine-coverage ${{ matrix.distro-slug }}
|
|
||||||
|
|
||||||
- name: Download Test Run Artifacts
|
|
||||||
id: download-artifacts-from-vm
|
|
||||||
if: always() && steps.spin-up-vm.outcome == 'success'
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm download-artifacts ${{ matrix.distro-slug }}
|
|
||||||
# Delete the salt onedir, we won't need it anymore and it will prevent
|
|
||||||
# from it showing in the tree command below
|
|
||||||
rm -rf artifacts/salt*
|
|
||||||
tree -a artifacts
|
|
||||||
|
|
||||||
- name: Destroy VM
|
|
||||||
if: always()
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm destroy --no-wait ${{ matrix.distro-slug }} || true
|
|
||||||
|
|
||||||
- name: Fix file ownership
|
|
||||||
run: |
|
|
||||||
sudo chown -R "$(id -un)" .
|
|
||||||
|
|
||||||
- name: Install Codecov CLI
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-artifacts-from-vm.outcome == 'success' && job.status != 'cancelled'
|
|
||||||
run: |
|
|
||||||
# We can't yet use tokenless uploads with the codecov CLI
|
|
||||||
# python3 -m pip install codecov-cli
|
|
||||||
#
|
|
||||||
curl https://keybase.io/codecovsecurity/pgp_keys.asc | gpg --no-default-keyring --import
|
|
||||||
curl -Os https://uploader.codecov.io/latest/linux/codecov
|
|
||||||
curl -Os https://uploader.codecov.io/latest/linux/codecov.SHA256SUM
|
|
||||||
curl -Os https://uploader.codecov.io/latest/linux/codecov.SHA256SUM.sig
|
|
||||||
gpg --verify codecov.SHA256SUM.sig codecov.SHA256SUM
|
|
||||||
shasum -a 256 -c codecov.SHA256SUM
|
|
||||||
chmod +x codecov
|
|
||||||
|
|
||||||
- name: Upload Source Code Coverage To Codecov
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-artifacts-from-vm.outcome == 'success' && job.status != 'cancelled'
|
|
||||||
run: |
|
|
||||||
if [ ! -s artifacts/coverage/salt.xml ]; then
|
|
||||||
echo "The artifacts/coverage/salt.xml file does not exist"
|
|
||||||
exit 1
|
|
||||||
fi
|
|
||||||
# We can't yet use tokenless uploads with the codecov CLI
|
|
||||||
#codecovcli --auto-load-params-from GithubActions --verbose --token ${{ secrets.CODECOV_TOKEN }} \
|
|
||||||
# do-upload --git-service github --sha ${{ github.sha }} \
|
|
||||||
# --file artifacts/coverage/salt.xml \
|
|
||||||
# --flag salt --flag ${{ matrix.distro-slug }} --flag pkg \
|
|
||||||
# --name salt.${{ matrix.distro-slug }}.${{ inputs.nox-session }}.download-pkgs
|
|
||||||
n=0
|
|
||||||
until [ "$n" -ge 5 ]
|
|
||||||
do
|
|
||||||
if ./codecov --file artifacts/coverage/salt.xml \
|
|
||||||
--sha ${{ github.event.pull_request.head.sha || github.sha }} ${{ github.event_name == 'pull_request' && format('--parent {0}', github.event.pull_request.base.sha) }} \
|
|
||||||
--flags salt,${{ matrix.distro-slug }},pkg \
|
|
||||||
--name salt.${{ matrix.distro-slug }}.${{ inputs.nox-session }}.download-pkgs --nonZero; then
|
|
||||||
rc=$?
|
|
||||||
break
|
|
||||||
fi
|
|
||||||
rc=$?
|
|
||||||
n=$((n+1))
|
|
||||||
sleep 15
|
|
||||||
done
|
|
||||||
if [ "$rc" -ne 0 ]; then
|
|
||||||
echo "Failed to upload codecov stats"
|
|
||||||
exit 1
|
|
||||||
fi
|
|
||||||
|
|
||||||
- name: Upload Tests Code Coverage To Codecov
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-artifacts-from-vm.outcome == 'success' && job.status != 'cancelled'
|
|
||||||
run: |
|
|
||||||
if [ ! -s artifacts/coverage/tests.xml ]; then
|
|
||||||
echo "The artifacts/coverage/tests.xml file does not exist"
|
|
||||||
exit 1
|
|
||||||
fi
|
|
||||||
# We can't yet use tokenless uploads with the codecov CLI
|
|
||||||
#codecovcli --auto-load-params-from GithubActions --verbose --token ${{ secrets.CODECOV_TOKEN }} \
|
|
||||||
# do-upload --git-service github --sha ${{ github.sha }} \
|
|
||||||
# --file artifacts/coverage/tests.xml \
|
|
||||||
# --flag tests --flag ${{ matrix.distro-slug }} --flag pkg \
|
|
||||||
# --name tests.${{ matrix.distro-slug }}.${{ inputs.nox-session }}.download-pkgs
|
|
||||||
n=0
|
|
||||||
until [ "$n" -ge 5 ]
|
|
||||||
do
|
|
||||||
if ./codecov --file artifacts/coverage/tests.xml \
|
|
||||||
--sha ${{ github.event.pull_request.head.sha || github.sha }} ${{ github.event_name == 'pull_request' && format('--parent {0}', github.event.pull_request.base.sha) }} \
|
|
||||||
--flags tests,${{ matrix.distro-slug }},pkg \
|
|
||||||
--name tests.${{ matrix.distro-slug }}.${{ inputs.nox-session }}.download-pkgs --nonZero; then
|
|
||||||
rc=$?
|
|
||||||
break
|
|
||||||
fi
|
|
||||||
rc=$?
|
|
||||||
n=$((n+1))
|
|
||||||
sleep 15
|
|
||||||
done
|
|
||||||
if [ "$rc" -ne 0 ]; then
|
|
||||||
echo "Failed to upload codecov stats"
|
|
||||||
exit 1
|
|
||||||
fi
|
|
||||||
|
|
||||||
- name: Upload Test Run Artifacts
|
|
||||||
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
|
|
||||||
uses: actions/upload-artifact@v3
|
|
||||||
# This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: pkg-testrun-artifacts-${{ matrix.distro-slug }}-${{ matrix.arch }}
|
|
||||||
path: |
|
|
||||||
artifacts
|
|
||||||
!artifacts/salt/*
|
|
||||||
!artifacts/salt-*.tar.*
|
|
||||||
|
|
||||||
|
|
||||||
macos:
|
|
||||||
name: MacOS
|
|
||||||
runs-on: ${{ matrix.distro-slug }}
|
|
||||||
environment: ${{ inputs.environment }}
|
|
||||||
timeout-minutes: 120 # 2 Hours - More than this and something is wrong
|
|
||||||
strategy:
|
|
||||||
fail-fast: false
|
|
||||||
matrix:
|
|
||||||
include:
|
|
||||||
<%- for slug, arch, pkg_type in test_salt_pkg_downloads_listing["macos"] %>
|
|
||||||
- distro-slug: <{ slug }>
|
|
||||||
arch: <{ arch }>
|
|
||||||
pkg-type: <{ pkg_type }>
|
|
||||||
<%- endfor %>
|
|
||||||
|
|
||||||
steps:
|
|
||||||
|
|
||||||
- name: "Throttle Builds"
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
|
|
||||||
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Download Onedir Tarball as an Artifact
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-macos-${{ matrix.arch }}.tar.xz
|
|
||||||
path: artifacts/
|
|
||||||
|
|
||||||
- name: Install System Dependencies
|
|
||||||
run: |
|
|
||||||
brew install tree
|
|
||||||
|
|
||||||
- name: Decompress Onedir Tarball
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
|
|
||||||
cd artifacts
|
|
||||||
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-macos-${{ matrix.arch }}.tar.xz
|
|
||||||
|
|
||||||
- name: Set up Python ${{ inputs.python-version }}
|
|
||||||
uses: actions/setup-python@v5
|
|
||||||
with:
|
|
||||||
python-version: "${{ inputs.python-version }}"
|
|
||||||
update-environment: true
|
|
||||||
|
|
||||||
- name: Install Nox
|
|
||||||
run: |
|
|
||||||
python3 -m pip install 'nox==${{ inputs.nox-version }}'
|
|
||||||
|
|
||||||
- name: Download nox.macos.${{ matrix.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: nox-macos-${{ matrix.arch }}-${{ inputs.nox-session }}
|
|
||||||
|
|
||||||
- name: Decompress .nox Directory
|
|
||||||
run: |
|
|
||||||
nox --force-color -e decompress-dependencies -- macos ${{ matrix.arch }}
|
|
||||||
|
|
||||||
- name: Show System Info
|
|
||||||
env:
|
|
||||||
SKIP_REQUIREMENTS_INSTALL: "1"
|
|
||||||
PRINT_SYSTEM_INFO_ONLY: "1"
|
|
||||||
run: |
|
|
||||||
sudo -E nox --force-color -e ${{ inputs.nox-session }}-pkgs -- download-pkgs
|
|
||||||
|
|
||||||
- name: Run Package Download Tests
|
|
||||||
env:
|
|
||||||
SKIP_REQUIREMENTS_INSTALL: "1"
|
|
||||||
PRINT_TEST_SELECTION: "0"
|
|
||||||
PRINT_TEST_PLAN_ONLY: "0"
|
|
||||||
PRINT_SYSTEM_INFO: "0"
|
|
||||||
RERUN_FAILURES: "1"
|
|
||||||
GITHUB_ACTIONS_PIPELINE: "1"
|
|
||||||
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
|
|
||||||
SKIP_CODE_COVERAGE: "${{ inputs.skip-code-coverage && '1' || '0' }}"
|
|
||||||
COVERAGE_CONTEXT: ${{ matrix.distro-slug }}
|
|
||||||
SALT_RELEASE: "${{ inputs.salt-version }}"
|
|
||||||
SALT_REPO_ARCH: ${{ matrix.arch }}
|
|
||||||
LATEST_SALT_RELEASE: "${{ inputs.latest-release }}"
|
|
||||||
SALT_REPO_TYPE: ${{ inputs.environment }}
|
|
||||||
SALT_REPO_USER: ${{ secrets.SALT_REPO_USER }}
|
|
||||||
SALT_REPO_PASS: ${{ secrets.SALT_REPO_PASS }}
|
|
||||||
SALT_REPO_DOMAIN_RELEASE: ${{ vars.SALT_REPO_DOMAIN_RELEASE || 'repo.saltproject.io' }}
|
|
||||||
SALT_REPO_DOMAIN_STAGING: ${{ vars.SALT_REPO_DOMAIN_STAGING || 'staging.repo.saltproject.io' }}
|
|
||||||
DOWNLOAD_TEST_PACKAGE_TYPE: ${{ matrix.pkg-type }}
|
|
||||||
run: |
|
|
||||||
sudo -E nox --force-color -e ${{ inputs.nox-session }}-pkgs -- download-pkgs
|
|
||||||
|
|
||||||
- name: Fix file ownership
|
|
||||||
run: |
|
|
||||||
sudo chown -R "$(id -un)" .
|
|
||||||
|
|
||||||
- name: Combine Coverage Reports
|
|
||||||
if: always() && inputs.skip-code-coverage == false && job.status != 'cancelled'
|
|
||||||
run: |
|
|
||||||
nox --force-color -e combine-coverage
|
|
||||||
|
|
||||||
- name: Prepare Test Run Artifacts
|
|
||||||
id: download-artifacts-from-vm
|
|
||||||
if: always() && job.status != 'cancelled'
|
|
||||||
run: |
|
|
||||||
# Delete the salt onedir, we won't need it anymore and it will prevent
|
|
||||||
# from it showing in the tree command below
|
|
||||||
rm -rf artifacts/salt*
|
|
||||||
tree -a artifacts
|
|
||||||
|
|
||||||
- name: Install Codecov CLI
|
|
||||||
if: always() && inputs.skip-code-coverage == false && job.status != 'cancelled'
|
|
||||||
run: |
|
|
||||||
# We can't yet use tokenless uploads with the codecov CLI
|
|
||||||
# python3 -m pip install codecov-cli
|
|
||||||
#
|
|
||||||
curl https://keybase.io/codecovsecurity/pgp_keys.asc | gpg --no-default-keyring --import
|
|
||||||
curl -Os https://uploader.codecov.io/latest/macos/codecov
|
|
||||||
curl -Os https://uploader.codecov.io/latest/macos/codecov.SHA256SUM
|
|
||||||
curl -Os https://uploader.codecov.io/latest/macos/codecov.SHA256SUM.sig
|
|
||||||
gpg --verify codecov.SHA256SUM.sig codecov.SHA256SUM
|
|
||||||
shasum -a 256 -c codecov.SHA256SUM
|
|
||||||
chmod +x codecov
|
|
||||||
|
|
||||||
- name: Upload Source Code Coverage To Codecov
|
|
||||||
if: always() && inputs.skip-code-coverage == false && job.status != 'cancelled'
|
|
||||||
run: |
|
|
||||||
if [ ! -s artifacts/coverage/salt.xml ]; then
|
|
||||||
echo "The artifacts/coverage/salt.xml file does not exist"
|
|
||||||
exit 1
|
|
||||||
fi
|
|
||||||
# We can't yet use tokenless uploads with the codecov CLI
|
|
||||||
#codecovcli --auto-load-params-from GithubActions --verbose --token ${{ secrets.CODECOV_TOKEN }} \
|
|
||||||
# do-upload --git-service github --sha ${{ github.sha }} \
|
|
||||||
# --file artifacts/coverage/salt.xml \
|
|
||||||
# --flag salt --flag ${{ matrix.distro-slug }} --flag pkg \
|
|
||||||
# --name salt.${{ matrix.distro-slug }}.${{ inputs.nox-session }}.download-pkgs
|
|
||||||
n=0
|
|
||||||
until [ "$n" -ge 5 ]
|
|
||||||
do
|
|
||||||
if ./codecov --file artifacts/coverage/salt.xml \
|
|
||||||
--sha ${{ github.event.pull_request.head.sha || github.sha }} ${{ github.event_name == 'pull_request' && format('--parent {0}', github.event.pull_request.base.sha) }} \
|
|
||||||
--flags salt,${{ matrix.distro-slug }},pkg \
|
|
||||||
--name salt.${{ matrix.distro-slug }}.${{ inputs.nox-session }}.download-pkgs --nonZero; then
|
|
||||||
rc=$?
|
|
||||||
break
|
|
||||||
fi
|
|
||||||
rc=$?
|
|
||||||
n=$((n+1))
|
|
||||||
sleep 15
|
|
||||||
done
|
|
||||||
if [ "$rc" -ne 0 ]; then
|
|
||||||
echo "Failed to upload codecov stats"
|
|
||||||
exit 1
|
|
||||||
fi
|
|
||||||
|
|
||||||
- name: Upload Tests Code Coverage To Codecov
|
|
||||||
if: always() && inputs.skip-code-coverage == false && job.status != 'cancelled'
|
|
||||||
run: |
|
|
||||||
if [ ! -s artifacts/coverage/tests.xml ]; then
|
|
||||||
echo "The artifacts/coverage/tests.xml file does not exist"
|
|
||||||
exit 1
|
|
||||||
fi
|
|
||||||
# We can't yet use tokenless uploads with the codecov CLI
|
|
||||||
#codecovcli --auto-load-params-from GithubActions --verbose --token ${{ secrets.CODECOV_TOKEN }} \
|
|
||||||
# do-upload --git-service github --sha ${{ github.sha }} \
|
|
||||||
# --file artifacts/coverage/tests.xml \
|
|
||||||
# --flag tests --flag ${{ matrix.distro-slug }} --flag pkg \
|
|
||||||
# --name tests.${{ matrix.distro-slug }}.${{ inputs.nox-session }}.download-pkgs
|
|
||||||
n=0
|
|
||||||
until [ "$n" -ge 5 ]
|
|
||||||
do
|
|
||||||
if ./codecov --file artifacts/coverage/tests.xml \
|
|
||||||
              --sha ${{ github.event.pull_request.head.sha || github.sha }} ${{ github.event_name == 'pull_request' && format('--parent {0}', github.event.pull_request.base.sha) }} \
              --flags tests,${{ matrix.distro-slug }},pkg \
              --name tests.${{ matrix.distro-slug }}.${{ inputs.nox-session }}.download-pkgs --nonZero; then
              rc=$?
              break
            fi
            rc=$?
            n=$((n+1))
            sleep 15
          done
          if [ "$rc" -ne 0 ]; then
            echo "Failed to upload codecov stats"
            exit 1
          fi

      - name: Upload Test Run Artifacts
        if: always()
        uses: actions/upload-artifact@v3
        # This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
        # under the same name, something that actions/upload-artifact@v4 does not do.
        with:
          name: pkg-testrun-artifacts-${{ matrix.distro-slug }}-${{ matrix.arch }}
          path: |
            artifacts
            !artifacts/salt/*
            !artifacts/salt-*.tar.*

  windows:
    name: Windows
    runs-on:
      - self-hosted
      - linux
      - bastion
    environment: ${{ inputs.environment }}
    timeout-minutes: 120  # 2 Hours - More than this and something is wrong
    strategy:
      fail-fast: false
      matrix:
        include:
          <%- for slug, arch, pkg_type in test_salt_pkg_downloads_listing["windows"] %>
          - distro-slug: <{ slug }>
            arch: <{ arch }>
            pkg-type: <{ pkg_type }>
          <%- endfor %>

    steps:
      - name: Checkout Source Code
        uses: actions/checkout@v4

      - name: Download Onedir Tarball as an Artifact
        uses: actions/download-artifact@v4
        with:
          name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-windows-${{ matrix.arch }}.tar.xz
          path: artifacts/

      - name: Decompress Onedir Tarball
        shell: bash
        run: |
          python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
          cd artifacts
          tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-windows-${{ matrix.arch }}.tar.xz

      - name: Download nox.windows.${{ matrix.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
        uses: actions/download-artifact@v4
        with:
          name: nox-windows-${{ matrix.arch }}-${{ inputs.nox-session }}

      - name: Setup Python Tools Scripts
        uses: ./.github/actions/setup-python-tools-scripts
        with:
          cache-prefix: ${{ inputs.cache-prefix }}-pkg-download-windows

      - name: Get Salt Project GitHub Actions Bot Environment
        run: |
          TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
          SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
          echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"

      - name: Start VM
        id: spin-up-vm
        run: |
          tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ matrix.distro-slug }}

      - name: List Free Space
        run: |
          tools --timestamps vm ssh ${{ matrix.distro-slug }} -- df -h || true

      - name: Upload Checkout To VM
        run: |
          tools --timestamps vm rsync ${{ matrix.distro-slug }}

      - name: Decompress .nox Directory
        run: |
          tools --timestamps vm decompress-dependencies ${{ matrix.distro-slug }}

      - name: Show System Info
        run: |
          tools --timestamps --timeout-secs=1800 vm test --skip-requirements-install --print-system-information-only \
            --nox-session=${{ inputs.nox-session }}-pkgs ${{ matrix.distro-slug }} -- download-pkgs

      - name: Run Package Download Tests
        env:
          SALT_RELEASE: "${{ inputs.salt-version }}"
          SALT_REPO_ARCH: ${{ matrix.arch }}
          LATEST_SALT_RELEASE: "${{ inputs.latest-release }}"
          SALT_REPO_TYPE: ${{ inputs.environment }}
          SALT_REPO_USER: ${{ secrets.SALT_REPO_USER }}
          SALT_REPO_PASS: ${{ secrets.SALT_REPO_PASS }}
          SALT_REPO_DOMAIN_RELEASE: ${{ vars.SALT_REPO_DOMAIN_RELEASE || 'repo.saltproject.io' }}
          SALT_REPO_DOMAIN_STAGING: ${{ vars.SALT_REPO_DOMAIN_STAGING || 'staging.repo.saltproject.io' }}
          SKIP_CODE_COVERAGE: "${{ inputs.skip-code-coverage && '1' || '0' }}"
          DOWNLOAD_TEST_PACKAGE_TYPE: ${{ matrix.pkg-type }}
        run: |
          tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
            -E SALT_RELEASE -E SALT_REPO_ARCH -E SALT_REPO_TYPE -E SALT_REPO_USER -E SALT_REPO_PASS \
            -E SALT_REPO_DOMAIN_RELEASE -E SALT_REPO_DOMAIN_STAGING -E LATEST_SALT_RELEASE -E DOWNLOAD_TEST_PACKAGE_TYPE \
            --nox-session=${{ inputs.nox-session }}-pkgs --rerun-failures ${{ matrix.distro-slug }} -- download-pkgs

      - name: Combine Coverage Reports
        if: always() && inputs.skip-code-coverage == false && steps.spin-up-vm.outcome == 'success' && job.status != 'cancelled'
        run: |
          tools --timestamps vm combine-coverage ${{ matrix.distro-slug }}

      - name: Download Test Run Artifacts
        id: download-artifacts-from-vm
        if: always() && steps.spin-up-vm.outcome == 'success'
        run: |
          tools --timestamps vm download-artifacts ${{ matrix.distro-slug }}
          # Delete the salt onedir, we won't need it anymore and it would prevent
          # it from showing in the tree command below
          rm -rf artifacts/salt*
          tree -a artifacts

      - name: Destroy VM
        if: always()
        run: |
          tools --timestamps vm destroy --no-wait ${{ matrix.distro-slug }} || true

      - name: Fix file ownership
        run: |
          sudo chown -R "$(id -un)" .

      - name: Install Codecov CLI
        if: always() && inputs.skip-code-coverage == false && steps.download-artifacts-from-vm.outcome == 'success' && job.status != 'cancelled'
        run: |
          # We can't yet use tokenless uploads with the codecov CLI
          # python3 -m pip install codecov-cli
          #
          curl https://keybase.io/codecovsecurity/pgp_keys.asc | gpg --no-default-keyring --import
          curl -Os https://uploader.codecov.io/latest/linux/codecov
          curl -Os https://uploader.codecov.io/latest/linux/codecov.SHA256SUM
          curl -Os https://uploader.codecov.io/latest/linux/codecov.SHA256SUM.sig
          gpg --verify codecov.SHA256SUM.sig codecov.SHA256SUM
          shasum -a 256 -c codecov.SHA256SUM
          chmod +x codecov

      - name: Upload Source Code Coverage To Codecov
        if: always() && inputs.skip-code-coverage == false && steps.download-artifacts-from-vm.outcome == 'success' && job.status != 'cancelled'
        run: |
          if [ ! -s artifacts/coverage/salt.xml ]; then
            echo "The artifacts/coverage/salt.xml file does not exist"
            exit 1
          fi
          # We can't yet use tokenless uploads with the codecov CLI
          #codecovcli --auto-load-params-from GithubActions --verbose --token ${{ secrets.CODECOV_TOKEN }} \
          #  do-upload --git-service github --sha ${{ github.sha }} \
          #  --file artifacts/coverage/salt.xml \
          #  --flag salt --flag ${{ matrix.distro-slug }} --flag pkg \
          #  --name salt.${{ matrix.distro-slug }}.${{ inputs.nox-session }}.download-pkgs
          n=0
          until [ "$n" -ge 5 ]
          do
            if ./codecov --file artifacts/coverage/salt.xml \
              --sha ${{ github.event.pull_request.head.sha || github.sha }} ${{ github.event_name == 'pull_request' && format('--parent {0}', github.event.pull_request.base.sha) }} \
              --flags salt,${{ matrix.distro-slug }},pkg \
              --name salt.${{ matrix.distro-slug }}.${{ inputs.nox-session }}.download-pkgs --nonZero; then
              rc=$?
              break
            fi
            rc=$?
            n=$((n+1))
            sleep 15
          done
          if [ "$rc" -ne 0 ]; then
            echo "Failed to upload codecov stats"
            exit 1
          fi

      - name: Upload Tests Code Coverage To Codecov
        if: always() && inputs.skip-code-coverage == false && steps.download-artifacts-from-vm.outcome == 'success' && job.status != 'cancelled'
        run: |
          if [ ! -s artifacts/coverage/tests.xml ]; then
            echo "The artifacts/coverage/tests.xml file does not exist"
            exit 1
          fi
          # We can't yet use tokenless uploads with the codecov CLI
          #codecovcli --auto-load-params-from GithubActions --verbose --token ${{ secrets.CODECOV_TOKEN }} \
          #  do-upload --git-service github --sha ${{ github.sha }} \
          #  --file artifacts/coverage/tests.xml \
          #  --flag tests --flag ${{ matrix.distro-slug }} --flag pkg \
          #  --name tests.${{ matrix.distro-slug }}.${{ inputs.nox-session }}.download-pkgs
          n=0
          until [ "$n" -ge 5 ]
          do
            if ./codecov --file artifacts/coverage/tests.xml \
              --sha ${{ github.event.pull_request.head.sha || github.sha }} ${{ github.event_name == 'pull_request' && format('--parent {0}', github.event.pull_request.base.sha) }} \
              --flags tests,${{ matrix.distro-slug }},pkg \
              --name tests.${{ matrix.distro-slug }}.${{ inputs.nox-session }}.download-pkgs --nonZero; then
              rc=$?
              break
            fi
            rc=$?
            n=$((n+1))
            sleep 15
          done
          if [ "$rc" -ne 0 ]; then
            echo "Failed to upload codecov stats"
            exit 1
          fi

      - name: Upload Test Run Artifacts
        if: always() && steps.download-artifacts-from-vm.outcome == 'success'
        uses: actions/upload-artifact@v3
        # This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
        # under the same name, something that actions/upload-artifact@v4 does not do.
        with:
          name: pkg-testrun-artifacts-${{ matrix.distro-slug }}-${{ matrix.arch }}
          path: |
            artifacts
            !artifacts/salt/*
            !artifacts/salt-*.tar.*
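The codecov upload steps above all share the same retry shape: up to five attempts, a 15-second pause between attempts, and a hard failure only if every attempt failed. A standalone sketch of that pattern, outside the workflow context; the wrapper function name and report path are illustrative, not part of the workflow:

```bash
#!/usr/bin/env bash
# Condensed sketch of the retry pattern used by the codecov upload steps above.
# Assumes the ./codecov uploader downloaded by the "Install Codecov CLI" step.
upload_with_retries() {
  local report="$1"
  local rc=1
  local n=0
  until [ "$n" -ge 5 ]; do
    # --nonZero makes the uploader exit non-zero on failure, so the if works.
    if ./codecov --file "$report" --nonZero; then
      rc=0
      break
    fi
    rc=$?
    n=$((n + 1))
    sleep 15
  done
  if [ "$rc" -ne 0 ]; then
    echo "Failed to upload codecov stats"
    return 1
  fi
}

upload_with_retries artifacts/coverage/tests.xml
```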
@@ -6,16 +6,13 @@
     <%- do conclusion_needs.append(job_name) %>
     name: Package Downloads
   <%- if gh_environment == "staging" %>
-    if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test-pkg-download'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
+    if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['test-pkg-download'] }}
   <%- else %>
     if: ${{ inputs.skip-salt-pkg-download-test-suite == false }}
   <%- endif %>
     needs:
       - prepare-workflow
-      - publish-repositories
-    <%- for slug in test_salt_pkg_downloads_needs_slugs %>
-      - <{ slug }>
-    <%- endfor %>
+      - build-ci-deps
     <%- if gh_environment == "release" %>
       - download-onedir-artifact
     <%- else %>
@@ -1,88 +1,19 @@
-  <%- for slug, display_name, arch, pkg_type, fips in test_salt_pkg_listing["linux"] %>
-  <%- set job_name = "{}-pkg-tests".format(slug.replace(".", "")) %>
+  <%- set job_name = "test-packages" %>
 
   <{ job_name }>:
-    <%- do test_salt_pkg_needs.append(job_name) %>
-    name: <{ display_name }> Package Test
-    if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test-pkg'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
+    name: Test Package
+    if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['test-pkg'] }}
     needs:
       - prepare-workflow
       - build-pkgs-onedir
       - build-ci-deps
-    uses: ./.github/workflows/test-packages-action-linux.yml
+    uses: ./.github/workflows/test-packages-action.yml
     with:
-      distro-slug: <{ slug }>
       nox-session: ci-test-onedir
-      platform: linux
-      arch: <{ arch }>
       salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
-      pkg-type: <{ pkg_type }>
       nox-version: <{ nox_version }>
       python-version: "<{ gh_actions_workflows_python_version }>"
      cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|<{ python_version }>
      skip-code-coverage: <{ skip_test_coverage_check }>
      testing-releases: ${{ needs.prepare-workflow.outputs.testing-releases }}
-    <%- if fips == "fips" %>
-      fips: true
-    <%- endif %>
+      matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['pkg-test-matrix']) }}
+      linux_arm_runner: ${{ fromJSON(needs.prepare-workflow.outputs.config)['linux_arm_runner'] }}
-
-  <%- endfor %>
-
-
-  <%- for slug, display_name, arch in test_salt_pkg_listing["macos"] %>
-  <%- set job_name = "{}-pkg-tests".format(slug.replace(".", "")) %>
-
-  <{ job_name }>:
-    <%- do test_salt_pkg_needs.append(job_name) %>
-    name: <{ display_name }> Package Test
-    if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test-pkg'] && fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
-    needs:
-      - prepare-workflow
-      - build-pkgs-onedir
-      - build-ci-deps
-    uses: ./.github/workflows/test-packages-action-macos.yml
-    with:
-      distro-slug: <{ slug }>
-      nox-session: ci-test-onedir
-      platform: macos
-      arch: <{ arch }>
-      salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
-      pkg-type: macos
-      nox-version: <{ nox_version }>
-      python-version: "<{ gh_actions_workflows_python_version }>"
-      cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|<{ python_version }>
-      skip-code-coverage: <{ skip_test_coverage_check }>
-      testing-releases: ${{ needs.prepare-workflow.outputs.testing-releases }}
-
-  <%- endfor %>
-
-
-  <%- for slug, display_name, arch in test_salt_pkg_listing["windows"] %>
-  <%- for pkg_type in ("NSIS", "MSI") %>
-  <%- set job_name = "{}-{}-pkg-tests".format(slug.replace(".", ""), pkg_type.lower()) %>
-
-  <{ job_name }>:
-    <%- do test_salt_pkg_needs.append(job_name) %>
-    name: <{ display_name }> <{ pkg_type }> Package Test
-    if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test-pkg'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
-    needs:
-      - prepare-workflow
-      - build-pkgs-onedir
-      - build-ci-deps
-    uses: ./.github/workflows/test-packages-action-windows.yml
-    with:
-      distro-slug: <{ slug }>
-      nox-session: ci-test-onedir
-      platform: windows
-      arch: <{ arch }>
-      salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
-      pkg-type: <{ pkg_type }>
-      nox-version: <{ nox_version }>
-      python-version: "<{ gh_actions_workflows_python_version }>"
-      cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|<{ python_version }>
-      skip-code-coverage: <{ skip_test_coverage_check }>
-      testing-releases: ${{ needs.prepare-workflow.outputs.testing-releases }}
-
-  <%- endfor %>
-  <%- endfor %>
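The consolidated job above reads everything from the single prepare-workflow `config` output instead of the separate `jobs` and `runners` outputs. A minimal sketch of how that JSON is consumed, assuming an invented sample payload; only the key names (`jobs`, `pkg-test-matrix`, `linux_arm_runner`) are taken from the expressions in the diff, the values are made up for the example:

```bash
#!/usr/bin/env bash
# Hedged illustration only: sample config payload shaped like the expressions
# in the new template; not output captured from a real prepare-workflow run.
config='{"jobs":{"test-pkg":true},"pkg-test-matrix":[{"distro-slug":"ubuntu-22.04","arch":"x86_64"}],"linux_arm_runner":"ubuntu-24.04-arm"}'

# Shell equivalent of fromJSON(config)['jobs']['test-pkg'] in the job "if:".
echo "$config" | jq -r '.jobs["test-pkg"]'

# Shell equivalent of toJSON(fromJSON(config)['pkg-test-matrix']): re-emit the
# matrix as compact JSON, which is what the reusable workflow receives.
echo "$config" | jq -c '."pkg-test-matrix"'
```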
.github/workflows/templates/test-salt.yml.jinja (vendored, 79 changed lines)
@@ -3,85 +3,22 @@
   <%- else %>
     <%- set timeout_value = 180 %>
   <%- endif %>
 
-  <%- for slug, display_name, arch in test_salt_listing["windows"] %>
-
-  <{ slug.replace(".", "") }>:
-    <%- do test_salt_needs.append(slug.replace(".", "")) %>
-    name: <{ display_name }> Test
-    if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
+  test:
+    name: Test Salt
+    if: ${{ fromJSON(needs.prepare-workflow.outputs.config)['jobs']['test'] }}
     needs:
       - prepare-workflow
       - build-ci-deps
-    uses: ./.github/workflows/test-action-windows.yml
+    uses: ./.github/workflows/test-action.yml
     with:
-      distro-slug: <{ slug }>
       nox-session: ci-test-onedir
-      platform: windows
-      arch: amd64
       nox-version: <{ nox_version }>
-      gh-actions-python-version: "<{ gh_actions_workflows_python_version }>"
-      testrun: ${{ needs.prepare-workflow.outputs.testrun }}
+      python-version: "<{ gh_actions_workflows_python_version }>"
+      testrun: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['testrun']) }}
       salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
       cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|<{ python_version }>
       skip-code-coverage: <{ skip_test_coverage_check }>
       workflow-slug: <{ workflow_slug }>
       default-timeout: <{ timeout_value }>
+      matrix: ${{ toJSON(fromJSON(needs.prepare-workflow.outputs.config)['test-matrix']) }}
+      linux_arm_runner: ${{ fromJSON(needs.prepare-workflow.outputs.config)['linux_arm_runner'] }}
-
-  <%- endfor %>
-
-
-  <%- for slug, display_name, arch in test_salt_listing["macos"] %>
-
-  <{ slug.replace(".", "") }>:
-    <%- do test_salt_needs.append(slug.replace(".", "")) %>
-    name: <{ display_name }> Test
-    if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test'] && fromJSON(needs.prepare-workflow.outputs.runners)['github-hosted'] }}
-    needs:
-      - prepare-workflow
-      - build-ci-deps
-    uses: ./.github/workflows/test-action-macos.yml
-    with:
-      distro-slug: <{ slug }>
-      nox-session: ci-test-onedir
-      platform: macos
-      arch: <{ arch }>
-      nox-version: <{ nox_version }>
-      gh-actions-python-version: "<{ gh_actions_workflows_python_version }>"
-      testrun: ${{ needs.prepare-workflow.outputs.testrun }}
-      salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
-      cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|<{ python_version }>
-      skip-code-coverage: <{ skip_test_coverage_check }>
-      workflow-slug: <{ workflow_slug }>
-      default-timeout: <{ timeout_value }>
-
-  <%- endfor %>
-
-  <%- for slug, display_name, arch, fips in test_salt_listing["linux"] %>
-
-  <{ slug.replace(".", "") }>:
-    <%- do test_salt_needs.append(slug.replace(".", "")) %>
-    name: <{ display_name }> Test
-    if: ${{ fromJSON(needs.prepare-workflow.outputs.jobs)['test'] && fromJSON(needs.prepare-workflow.outputs.runners)['self-hosted'] }}
-    needs:
-      - prepare-workflow
-      - build-ci-deps
-    uses: ./.github/workflows/test-action-linux.yml
-    with:
-      distro-slug: <{ slug }>
-      nox-session: ci-test-onedir
-      platform: linux
-      arch: <{ arch }>
-      nox-version: <{ nox_version }>
-      gh-actions-python-version: "<{ gh_actions_workflows_python_version }>"
-      testrun: ${{ needs.prepare-workflow.outputs.testrun }}
-      salt-version: "${{ needs.prepare-workflow.outputs.salt-version }}"
-      cache-prefix: ${{ needs.prepare-workflow.outputs.cache-seed }}|<{ python_version }>
-      skip-code-coverage: <{ skip_test_coverage_check }>
-      workflow-slug: <{ workflow_slug }>
-      default-timeout: <{ timeout_value }>
-    <%- if fips == "fips" %>
-      fips: true
-    <%- endif %>
-
-  <%- endfor %>
@@ -1,17 +1,18 @@
 
 
 <%- set job_name = "trigger-branch-{}-builds".format(workflow_slug) %>
-<%- set branches = ["3006.x"] %>
 
   <{ job_name }>:
     <%- do conclusion_needs.append(job_name) %>
     name: Trigger Branch Workflows
     if: ${{ github.event_name == 'schedule' && fromJSON(needs.workflow-requirements.outputs.requirements-met) }}
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04
     needs:
       - workflow-requirements
 
     steps:
-      <%- for branch in branches %>
+      <%- for branch in release_branches %>
 
       - name: Trigger <{ branch }> branch
         env:
           GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
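The body of the trigger step is not part of this hunk, so the following is only an illustrative sketch of how a scheduled job of this shape can dispatch a branch workflow with the GitHub CLI; the workflow file name, repository, and token handling are assumptions, not taken from the diff:

```bash
#!/usr/bin/env bash
# Illustrative only: not the actual step body from the template above.
# GH_TOKEN is how the gh CLI authenticates inside GitHub Actions.
export GH_TOKEN="<token with workflow scope>"   # placeholder
branch="3006.x"                                 # one entry of release_branches

# Dispatch the branch's CI workflow on the given ref (workflow name assumed).
gh workflow run ci.yml --repo saltstack/salt --ref "$branch"
```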
@@ -4,7 +4,7 @@
   <{ job_name }>:
     <%- do prepare_workflow_needs.append(job_name) %>
     name: Check Workflow Requirements
-    runs-on: ubuntu-latest
+    runs-on: ubuntu-22.04
     outputs:
       requirements-met: ${{ steps.check-requirements.outputs.requirements-met }}
     steps:
@@ -21,6 +21,11 @@
             echo "${MSG}"
             echo "${MSG}" >> "${GITHUB_STEP_SUMMARY}"
             echo "requirements-met=false" >> "${GITHUB_OUTPUT}"
+          elif [ "${{ github.event.repository.private }}" = "true" ]; then
+            MSG="Not running workflow because ${{ github.repository }} is a private repository"
+            echo "${MSG}"
+            echo "${MSG}" >> "${GITHUB_STEP_SUMMARY}"
+            echo "requirements-met=false" >> "${GITHUB_OUTPUT}"
           else
             MSG="Running workflow because ${{ github.repository }} is not a fork"
             echo "${MSG}"
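After this change the requirements gate has three branches: fork, private repository, and the normal case. A condensed sketch of how it reads, assuming plain environment variables in place of the GitHub expressions; the opening fork check and the final `requirements-met=true` line sit outside the hunk and are assumptions:

```bash
#!/usr/bin/env bash
# Placeholders standing in for the ${{ ... }} expressions in the real step.
REPO="saltstack/salt"
REPO_FORK="false"
REPO_PRIVATE="false"
# Fall back to /dev/null so the sketch also runs outside GitHub Actions.
: "${GITHUB_STEP_SUMMARY:=/dev/null}"
: "${GITHUB_OUTPUT:=/dev/null}"

if [ "${REPO_FORK}" = "true" ]; then
  MSG="Not running workflow because ${REPO} is a fork"
  echo "${MSG}"
  echo "${MSG}" >> "${GITHUB_STEP_SUMMARY}"
  echo "requirements-met=false" >> "${GITHUB_OUTPUT}"
elif [ "${REPO_PRIVATE}" = "true" ]; then
  MSG="Not running workflow because ${REPO} is a private repository"
  echo "${MSG}"
  echo "${MSG}" >> "${GITHUB_STEP_SUMMARY}"
  echo "requirements-met=false" >> "${GITHUB_OUTPUT}"
else
  MSG="Running workflow because ${REPO} is not a fork"
  echo "${MSG}"
  echo "${MSG}" >> "${GITHUB_STEP_SUMMARY}"
  # The "true" output is assumed; the hunk above only shows the message lines.
  echo "requirements-met=true" >> "${GITHUB_OUTPUT}"
fi
```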
.github/workflows/test-action-linux.yml (vendored, 367 changed lines, file removed)
@ -1,367 +0,0 @@
|
||||||
---
|
|
||||||
name: Test Artifact
|
|
||||||
|
|
||||||
on:
|
|
||||||
workflow_call:
|
|
||||||
inputs:
|
|
||||||
distro-slug:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The OS slug to run tests against
|
|
||||||
nox-session:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The nox session to run
|
|
||||||
testrun:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: JSON string containing information about what and how to run the test suite
|
|
||||||
salt-version:
|
|
||||||
type: string
|
|
||||||
required: true
|
|
||||||
description: The Salt version to set prior to running tests.
|
|
||||||
cache-prefix:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: Seed used to invalidate caches
|
|
||||||
platform:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The platform being tested
|
|
||||||
arch:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The platform arch being tested
|
|
||||||
nox-version:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The nox version to install
|
|
||||||
gh-actions-python-version:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The python version to run tests with
|
|
||||||
default: "3.10"
|
|
||||||
fips:
|
|
||||||
required: false
|
|
||||||
type: boolean
|
|
||||||
default: false
|
|
||||||
description: Test run with FIPS enabled
|
|
||||||
package-name:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The onedir package name to use
|
|
||||||
default: salt
|
|
||||||
skip-code-coverage:
|
|
||||||
required: false
|
|
||||||
type: boolean
|
|
||||||
description: Skip code coverage
|
|
||||||
default: false
|
|
||||||
workflow-slug:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: Which workflow is running.
|
|
||||||
default: ci
|
|
||||||
default-timeout:
|
|
||||||
required: false
|
|
||||||
type: number
|
|
||||||
description: Timeout, in minutes, for the test job(Default 360, 6 hours).
|
|
||||||
default: 360
|
|
||||||
|
|
||||||
env:
|
|
||||||
COLUMNS: 190
|
|
||||||
AWS_MAX_ATTEMPTS: "10"
|
|
||||||
AWS_RETRY_MODE: "adaptive"
|
|
||||||
PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
|
|
||||||
PIP_EXTRA_INDEX_URL: https://pypi.org/simple
|
|
||||||
PIP_DISABLE_PIP_VERSION_CHECK: "1"
|
|
||||||
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
|
|
||||||
|
|
||||||
jobs:
|
|
||||||
|
|
||||||
generate-matrix:
|
|
||||||
name: Test Matrix
|
|
||||||
runs-on: ubuntu-latest
|
|
||||||
outputs:
|
|
||||||
matrix-include: ${{ steps.generate-matrix.outputs.matrix }}
|
|
||||||
steps:
|
|
||||||
|
|
||||||
- name: "Throttle Builds"
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
|
|
||||||
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Setup Python Tools Scripts
|
|
||||||
uses: ./.github/actions/setup-python-tools-scripts
|
|
||||||
with:
|
|
||||||
cache-prefix: ${{ inputs.cache-prefix }}
|
|
||||||
|
|
||||||
- name: Generate Test Matrix
|
|
||||||
id: generate-matrix
|
|
||||||
run: |
|
|
||||||
tools ci matrix --workflow=${{ inputs.workflow-slug }} ${{ fromJSON(inputs.testrun)['type'] == 'full' && '--full ' || '' }}${{ inputs.fips && '--fips ' || '' }}${{ inputs.distro-slug }}
|
|
||||||
|
|
||||||
test:
|
|
||||||
name: Test
|
|
||||||
runs-on:
|
|
||||||
- self-hosted
|
|
||||||
- linux
|
|
||||||
- bastion
|
|
||||||
# Full test runs. Each chunk should never take more than 2 hours.
|
|
||||||
# Partial test runs(no chunk parallelization), 6 Hours
|
|
||||||
timeout-minutes: ${{ fromJSON(inputs.testrun)['type'] == 'full' && inputs.default-timeout || 360 }}
|
|
||||||
needs:
|
|
||||||
- generate-matrix
|
|
||||||
strategy:
|
|
||||||
fail-fast: false
|
|
||||||
matrix:
|
|
||||||
include: ${{ fromJSON(needs.generate-matrix.outputs.matrix-include) }}
|
|
||||||
env:
|
|
||||||
SALT_TRANSPORT: ${{ matrix.transport }}
|
|
||||||
TEST_GROUP: ${{ matrix.test-group || 1 }}
|
|
||||||
|
|
||||||
steps:
|
|
||||||
|
|
||||||
- name: "Throttle Builds"
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
|
|
||||||
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Setup Salt Version
|
|
||||||
run: |
|
|
||||||
echo "${{ inputs.salt-version }}" > salt/_version.txt
|
|
||||||
|
|
||||||
- name: Download Onedir Tarball as an Artifact
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
|
|
||||||
path: artifacts/
|
|
||||||
|
|
||||||
- name: Decompress Onedir Tarball
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
|
|
||||||
cd artifacts
|
|
||||||
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
|
|
||||||
|
|
||||||
- name: Download nox.linux.${{ inputs.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: nox-linux-${{ inputs.arch }}-${{ inputs.nox-session }}
|
|
||||||
|
|
||||||
- name: PyPi Proxy
|
|
||||||
run: |
|
|
||||||
sed -i '7s;^;--index-url=https://pypi-proxy.saltstack.net/root/local/+simple/ --extra-index-url=https://pypi.org/simple\n;' requirements/static/ci/*/*.txt
|
|
||||||
|
|
||||||
- name: Setup Python Tools Scripts
|
|
||||||
uses: ./.github/actions/setup-python-tools-scripts
|
|
||||||
with:
|
|
||||||
cache-prefix: ${{ inputs.cache-prefix }}
|
|
||||||
|
|
||||||
- name: Download testrun-changed-files.txt
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' }}
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: testrun-changed-files.txt
|
|
||||||
|
|
||||||
- name: Get Salt Project GitHub Actions Bot Environment
|
|
||||||
run: |
|
|
||||||
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
|
|
||||||
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
|
|
||||||
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
|
|
||||||
|
|
||||||
- name: Start VM
|
|
||||||
id: spin-up-vm
|
|
||||||
env:
|
|
||||||
TESTS_CHUNK: ${{ matrix.tests-chunk }}
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ inputs.distro-slug }}
|
|
||||||
|
|
||||||
- name: List Free Space
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm ssh ${{ inputs.distro-slug }} -- df -h || true
|
|
||||||
|
|
||||||
- name: Upload Checkout To VM
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm rsync ${{ inputs.distro-slug }}
|
|
||||||
|
|
||||||
- name: Decompress .nox Directory
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm decompress-dependencies ${{ inputs.distro-slug }}
|
|
||||||
|
|
||||||
- name: Show System Info
|
|
||||||
run: |
|
|
||||||
tools --timestamps --timeout-secs=1800 vm test --skip-requirements-install --print-system-information-only \
|
|
||||||
--nox-session=${{ inputs.nox-session }} ${{ inputs.distro-slug }} \
|
|
||||||
${{ matrix.tests-chunk }}
|
|
||||||
|
|
||||||
- name: Run Changed Tests
|
|
||||||
id: run-fast-changed-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' }}
|
|
||||||
run: |
|
|
||||||
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
|
|
||||||
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
|
|
||||||
${{ matrix.tests-chunk }} -- --core-tests --slow-tests --suppress-no-test-exit-code \
|
|
||||||
--from-filenames=testrun-changed-files.txt
|
|
||||||
|
|
||||||
- name: Run Fast Tests
|
|
||||||
id: run-fast-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['fast'] }}
|
|
||||||
run: |
|
|
||||||
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
|
|
||||||
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ (inputs.skip-code-coverage && matrix.tests-chunk != 'unit') && '--skip-code-coverage' || '' }} \
|
|
||||||
${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} ${{ matrix.tests-chunk }}
|
|
||||||
|
|
||||||
- name: Run Slow Tests
|
|
||||||
id: run-slow-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['slow'] }}
|
|
||||||
run: |
|
|
||||||
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
|
|
||||||
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
|
|
||||||
${{ matrix.tests-chunk }} -- --no-fast-tests --slow-tests
|
|
||||||
|
|
||||||
- name: Run Core Tests
|
|
||||||
id: run-core-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['core'] }}
|
|
||||||
run: |
|
|
||||||
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
|
|
||||||
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
|
|
||||||
${{ matrix.tests-chunk }} -- --no-fast-tests --core-tests
|
|
||||||
|
|
||||||
- name: Run Flaky Tests
|
|
||||||
id: run-flaky-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['selected_tests']['flaky'] }}
|
|
||||||
run: |
|
|
||||||
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
|
|
||||||
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
|
|
||||||
${{ matrix.tests-chunk }} -- --no-fast-tests --flaky-jail
|
|
||||||
|
|
||||||
- name: Run Full Tests
|
|
||||||
id: run-full-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] == 'full' }}
|
|
||||||
run: |
|
|
||||||
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
|
|
||||||
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ (inputs.skip-code-coverage && matrix.tests-chunk != 'unit') && '--skip-code-coverage' || '' }} \
|
|
||||||
-E TEST_GROUP ${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} ${{ matrix.tests-chunk }} -- --slow-tests --core-tests \
|
|
||||||
--test-group-count=${{ matrix.test-group-count || 1 }} --test-group=${{ matrix.test-group || 1 }}
|
|
||||||
|
|
||||||
- name: Combine Coverage Reports
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.spin-up-vm.outcome == 'success'
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm combine-coverage ${{ inputs.distro-slug }}
|
|
||||||
|
|
||||||
- name: Download Test Run Artifacts
|
|
||||||
id: download-artifacts-from-vm
|
|
||||||
if: always() && steps.spin-up-vm.outcome == 'success'
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm download-artifacts ${{ inputs.distro-slug }}
|
|
||||||
# Delete the salt onedir, we won't need it anymore and it will prevent
|
|
||||||
# from it showing in the tree command below
|
|
||||||
rm -rf artifacts/salt*
|
|
||||||
tree -a artifacts
|
|
||||||
if [ "${{ inputs.skip-code-coverage }}" != "true" ]; then
|
|
||||||
mv artifacts/coverage/.coverage artifacts/coverage/.coverage.${{ inputs.distro-slug }}.${{ inputs.nox-session }}.${{ matrix.transport }}.${{ matrix.tests-chunk }}.grp${{ matrix.test-group || '1' }}
|
|
||||||
fi
|
|
||||||
|
|
||||||
- name: Destroy VM
|
|
||||||
if: always()
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm destroy --no-wait ${{ inputs.distro-slug }} || true
|
|
||||||
|
|
||||||
- name: Upload Code Coverage Test Run Artifacts
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-artifacts-from-vm.outcome == 'success' && job.status != 'cancelled'
|
|
||||||
uses: actions/upload-artifact@v3
|
|
||||||
# This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: testrun-coverage-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}
|
|
||||||
path: |
|
|
||||||
artifacts/coverage/
|
|
||||||
|
|
||||||
- name: Upload JUnit XML Test Run Artifacts
|
|
||||||
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
|
|
||||||
uses: actions/upload-artifact@v3
|
|
||||||
# This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: testrun-junit-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}
|
|
||||||
path: |
|
|
||||||
artifacts/xml-unittests-output/
|
|
||||||
|
|
||||||
- name: Upload Test Run Log Artifacts
|
|
||||||
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
|
|
||||||
uses: actions/upload-artifact@v3
|
|
||||||
# This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: testrun-log-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}
|
|
||||||
path: |
|
|
||||||
artifacts/logs
|
|
||||||
|
|
||||||
report:
|
|
||||||
name: Test Reports
|
|
||||||
if: always() && inputs.skip-code-coverage == false && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
|
|
||||||
runs-on: ubuntu-latest
|
|
||||||
needs:
|
|
||||||
- test
|
|
||||||
|
|
||||||
steps:
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Download Code Coverage Test Run Artifacts
|
|
||||||
uses: actions/download-artifact@v3
|
|
||||||
# This needs to be actions/download-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
if: ${{ inputs.skip-code-coverage == false }}
|
|
||||||
id: download-coverage-artifacts
|
|
||||||
with:
|
|
||||||
name: testrun-coverage-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}
|
|
||||||
path: artifacts/coverage/
|
|
||||||
|
|
||||||
- name: Show Downloaded Test Run Artifacts
|
|
||||||
run: |
|
|
||||||
tree -a artifacts
|
|
||||||
|
|
||||||
- name: Install Nox
|
|
||||||
run: |
|
|
||||||
python3 -m pip install 'nox==${{ inputs.nox-version }}'
|
|
||||||
|
|
||||||
- name: Create XML Coverage Reports
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success' && job.status != 'cancelled'
|
|
||||||
run: |
|
|
||||||
nox --force-color -e create-xml-coverage-reports
|
|
||||||
mv artifacts/coverage/salt.xml artifacts/coverage/salt..${{ inputs.distro-slug }}..${{ inputs.nox-session }}.xml
|
|
||||||
mv artifacts/coverage/tests.xml artifacts/coverage/tests..${{ inputs.distro-slug }}..${{ inputs.nox-session }}.xml
|
|
||||||
|
|
||||||
- name: Report Salt Code Coverage
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
|
|
||||||
continue-on-error: true
|
|
||||||
run: |
|
|
||||||
nox --force-color -e report-coverage -- salt
|
|
||||||
|
|
||||||
- name: Report Combined Code Coverage
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
|
|
||||||
continue-on-error: true
|
|
||||||
run: |
|
|
||||||
nox --force-color -e report-coverage
|
|
||||||
|
|
||||||
- name: Rename Code Coverage DB
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
|
|
||||||
continue-on-error: true
|
|
||||||
run: |
|
|
||||||
mv artifacts/coverage/.coverage artifacts/coverage/.coverage.${{ inputs.distro-slug }}.${{ inputs.nox-session }}
|
|
||||||
|
|
||||||
- name: Upload Code Coverage DB
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
|
|
||||||
uses: actions/upload-artifact@v3
|
|
||||||
# This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: all-testrun-coverage-artifacts
|
|
||||||
path: artifacts/coverage
|
|
.github/workflows/test-action-macos.yml (vendored, 396 changed lines, file removed)
@ -1,396 +0,0 @@
|
||||||
---
|
|
||||||
name: Test Artifact(macOS)
|
|
||||||
|
|
||||||
on:
|
|
||||||
workflow_call:
|
|
||||||
inputs:
|
|
||||||
distro-slug:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The OS slug to run tests against
|
|
||||||
nox-session:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The nox session to run
|
|
||||||
testrun:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: JSON string containing information about what and how to run the test suite
|
|
||||||
gh-actions-python-version:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The python version to run tests with
|
|
||||||
default: "3.11"
|
|
||||||
salt-version:
|
|
||||||
type: string
|
|
||||||
required: true
|
|
||||||
description: The Salt version to set prior to running tests.
|
|
||||||
cache-prefix:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: Seed used to invalidate caches
|
|
||||||
platform:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The platform being tested
|
|
||||||
arch:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The platform arch being tested
|
|
||||||
nox-version:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The nox version to install
|
|
||||||
package-name:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The onedir package name to use
|
|
||||||
default: salt
|
|
||||||
skip-code-coverage:
|
|
||||||
required: false
|
|
||||||
type: boolean
|
|
||||||
description: Skip code coverage
|
|
||||||
default: false
|
|
||||||
workflow-slug:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: Which workflow is running.
|
|
||||||
default: ci
|
|
||||||
default-timeout:
|
|
||||||
required: false
|
|
||||||
type: number
|
|
||||||
description: Timeout, in minutes, for the test job(Default 360, 6 hours).
|
|
||||||
default: 360
|
|
||||||
|
|
||||||
env:
|
|
||||||
COLUMNS: 190
|
|
||||||
PIP_INDEX_URL: "https://pypi-proxy.saltstack.net/root/local/+simple/"
|
|
||||||
PIP_EXTRA_INDEX_URL: "https://pypi.org/simple"
|
|
||||||
PIP_DISABLE_PIP_VERSION_CHECK: "1"
|
|
||||||
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
|
|
||||||
|
|
||||||
jobs:
|
|
||||||
|
|
||||||
generate-matrix:
|
|
||||||
name: Test Matrix
|
|
||||||
runs-on: ubuntu-latest
|
|
||||||
outputs:
|
|
||||||
matrix-include: ${{ steps.generate-matrix.outputs.matrix }}
|
|
||||||
steps:
|
|
||||||
|
|
||||||
- name: "Throttle Builds"
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
|
|
||||||
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Setup Python Tools Scripts
|
|
||||||
uses: ./.github/actions/setup-python-tools-scripts
|
|
||||||
with:
|
|
||||||
cache-prefix: ${{ inputs.cache-prefix }}
|
|
||||||
|
|
||||||
- name: Generate Test Matrix
|
|
||||||
id: generate-matrix
|
|
||||||
run: |
|
|
||||||
tools ci matrix --workflow=${{ inputs.workflow-slug }} ${{ inputs.distro-slug }}
|
|
||||||
|
|
||||||
test:
|
|
||||||
name: Test
|
|
||||||
runs-on: ${{ inputs.distro-slug }}
|
|
||||||
# Full test runs. Each chunk should never take more than 2 hours.
|
|
||||||
# Partial test runs(no chunk parallelization), 6 Hours
|
|
||||||
timeout-minutes: ${{ fromJSON(inputs.testrun)['type'] == 'full' && inputs.default-timeout || 360 }}
|
|
||||||
needs:
|
|
||||||
- generate-matrix
|
|
||||||
strategy:
|
|
||||||
fail-fast: false
|
|
||||||
matrix:
|
|
||||||
include: ${{ fromJSON(needs.generate-matrix.outputs.matrix-include) }}
|
|
||||||
env:
|
|
||||||
SALT_TRANSPORT: ${{ matrix.transport }}
|
|
||||||
|
|
||||||
steps:
|
|
||||||
|
|
||||||
- name: "Throttle Builds"
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
|
|
||||||
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Setup Salt Version
|
|
||||||
run: |
|
|
||||||
echo "${{ inputs.salt-version }}" > salt/_version.txt
|
|
||||||
|
|
||||||
- name: Download Onedir Tarball as an Artifact
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
|
|
||||||
path: artifacts/
|
|
||||||
|
|
||||||
- name: Decompress Onedir Tarball
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
|
|
||||||
cd artifacts
|
|
||||||
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
|
|
||||||
|
|
||||||
- name: Install System Dependencies
|
|
||||||
run: |
|
|
||||||
brew install tree
|
|
||||||
|
|
||||||
- name: Download nox.macos.${{ inputs.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: nox-macos-${{ inputs.arch }}-${{ inputs.nox-session }}
|
|
||||||
|
|
||||||
- name: Set up Python ${{ inputs.gh-actions-python-version }}
|
|
||||||
uses: actions/setup-python@v5
|
|
||||||
with:
|
|
||||||
python-version: "${{ inputs.gh-actions-python-version }}"
|
|
||||||
|
|
||||||
- name: Install Nox
|
|
||||||
run: |
|
|
||||||
python3 -m pip install 'nox==${{ inputs.nox-version }}'
|
|
||||||
|
|
||||||
- name: Decompress .nox Directory
|
|
||||||
run: |
|
|
||||||
nox --force-color -e decompress-dependencies -- macos ${{ inputs.arch }}
|
|
||||||
|
|
||||||
- name: Download testrun-changed-files.txt
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' }}
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: testrun-changed-files.txt
|
|
||||||
|
|
||||||
- name: Show System Info
|
|
||||||
env:
|
|
||||||
SKIP_REQUIREMENTS_INSTALL: "1"
|
|
||||||
PRINT_SYSTEM_INFO_ONLY: "1"
|
|
||||||
run: |
|
|
||||||
sudo -E nox --force-color -e ${{ inputs.nox-session }} -- ${{ matrix.tests-chunk }}
|
|
||||||
|
|
||||||
- name: Run Changed Tests
|
|
||||||
id: run-fast-changed-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['fast'] == false }}
|
|
||||||
env:
|
|
||||||
SKIP_REQUIREMENTS_INSTALL: "1"
|
|
||||||
PRINT_TEST_SELECTION: "0"
|
|
||||||
PRINT_TEST_PLAN_ONLY: "0"
|
|
||||||
PRINT_SYSTEM_INFO: "0"
|
|
||||||
RERUN_FAILURES: "1"
|
|
||||||
GITHUB_ACTIONS_PIPELINE: "1"
|
|
||||||
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
|
|
||||||
SKIP_CODE_COVERAGE: "${{ inputs.skip-code-coverage && '1' || '0' }}"
|
|
||||||
COVERAGE_CONTEXT: ${{ inputs.distro-slug }}
|
|
||||||
run: |
|
|
||||||
sudo -E nox --force-color -e ${{ inputs.nox-session }} -- ${{ matrix.tests-chunk }} -- \
|
|
||||||
-k "mac or darwin" --core-tests --slow-tests --suppress-no-test-exit-code \
|
|
||||||
--from-filenames=testrun-changed-files.txt
|
|
||||||
|
|
||||||
- name: Run Fast Tests
|
|
||||||
id: run-fast-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['fast'] }}
|
|
||||||
env:
|
|
||||||
SKIP_REQUIREMENTS_INSTALL: "1"
|
|
||||||
PRINT_TEST_SELECTION: "0"
|
|
||||||
PRINT_TEST_PLAN_ONLY: "0"
|
|
||||||
PRINT_SYSTEM_INFO: "0"
|
|
||||||
RERUN_FAILURES: "1"
|
|
||||||
GITHUB_ACTIONS_PIPELINE: "1"
|
|
||||||
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
|
|
||||||
SKIP_CODE_COVERAGE: "${{ inputs.skip-code-coverage && '1' || '0' }}"
|
|
||||||
COVERAGE_CONTEXT: ${{ inputs.distro-slug }}
|
|
||||||
run: |
|
|
||||||
sudo -E nox --force-color -e ${{ inputs.nox-session }} -- ${{ matrix.tests-chunk }} -- \
|
|
||||||
-k "mac or darwin" --suppress-no-test-exit-code
|
|
||||||
|
|
||||||
- name: Run Slow Tests
|
|
||||||
id: run-slow-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['slow'] }}
|
|
||||||
env:
|
|
||||||
SKIP_REQUIREMENTS_INSTALL: "1"
|
|
||||||
PRINT_TEST_SELECTION: "0"
|
|
||||||
PRINT_TEST_PLAN_ONLY: "0"
|
|
||||||
PRINT_SYSTEM_INFO: "0"
|
|
||||||
RERUN_FAILURES: "1"
|
|
||||||
GITHUB_ACTIONS_PIPELINE: "1"
|
|
||||||
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
|
|
||||||
SKIP_CODE_COVERAGE: "${{ inputs.skip-code-coverage && '1' || '0' }}"
|
|
||||||
COVERAGE_CONTEXT: ${{ inputs.distro-slug }}
|
|
||||||
run: |
|
|
||||||
sudo -E nox --force-color -e ${{ inputs.nox-session }} -- ${{ matrix.tests-chunk }} -- \
|
|
||||||
-k "mac or darwin" --suppress-no-test-exit-code --no-fast-tests --slow-tests
|
|
||||||
|
|
||||||
- name: Run Core Tests
|
|
||||||
id: run-core-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['core'] }}
|
|
||||||
env:
|
|
||||||
SKIP_REQUIREMENTS_INSTALL: "1"
|
|
||||||
PRINT_TEST_SELECTION: "0"
|
|
||||||
PRINT_TEST_PLAN_ONLY: "0"
|
|
||||||
PRINT_SYSTEM_INFO: "0"
|
|
||||||
RERUN_FAILURES: "1"
|
|
||||||
GITHUB_ACTIONS_PIPELINE: "1"
|
|
||||||
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
|
|
||||||
SKIP_CODE_COVERAGE: "${{ inputs.skip-code-coverage && '1' || '0' }}"
|
|
||||||
COVERAGE_CONTEXT: ${{ inputs.distro-slug }}
|
|
||||||
run: |
|
|
||||||
sudo -E nox --force-color -e ${{ inputs.nox-session }} -- ${{ matrix.tests-chunk }} -- \
|
|
||||||
-k "mac or darwin" --suppress-no-test-exit-code --no-fast-tests --core-tests
|
|
||||||
|
|
||||||
- name: Run Flaky Tests
|
|
||||||
id: run-flaky-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['selected_tests']['flaky'] }}
|
|
||||||
env:
|
|
||||||
SKIP_REQUIREMENTS_INSTALL: "1"
|
|
||||||
PRINT_TEST_SELECTION: "0"
|
|
||||||
PRINT_TEST_PLAN_ONLY: "0"
|
|
||||||
PRINT_SYSTEM_INFO: "0"
|
|
||||||
RERUN_FAILURES: "1"
|
|
||||||
GITHUB_ACTIONS_PIPELINE: "1"
|
|
||||||
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
|
|
||||||
SKIP_CODE_COVERAGE: "${{ inputs.skip-code-coverage && '1' || '0' }}"
|
|
||||||
COVERAGE_CONTEXT: ${{ inputs.distro-slug }}
|
|
||||||
run: |
|
|
||||||
sudo -E nox --force-color -e ${{ inputs.nox-session }} -- ${{ matrix.tests-chunk }} -- \
|
|
||||||
-k "mac or darwin" --suppress-no-test-exit-code --no-fast-tests --flaky-jail
|
|
||||||
|
|
||||||
- name: Run Full Tests
|
|
||||||
id: run-full-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] == 'full' }}
|
|
||||||
env:
|
|
||||||
SKIP_REQUIREMENTS_INSTALL: "1"
|
|
||||||
PRINT_TEST_SELECTION: "0"
|
|
||||||
PRINT_TEST_PLAN_ONLY: "0"
|
|
||||||
PRINT_SYSTEM_INFO: "0"
|
|
||||||
RERUN_FAILURES: "1"
|
|
||||||
GITHUB_ACTIONS_PIPELINE: "1"
|
|
||||||
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
|
|
||||||
SKIP_CODE_COVERAGE: "${{ inputs.skip-code-coverage && '1' || '0' }}"
|
|
||||||
COVERAGE_CONTEXT: ${{ inputs.distro-slug }}
|
|
||||||
run: |
|
|
||||||
sudo -E nox --force-color -e ${{ inputs.nox-session }} -- ${{ matrix.tests-chunk }} -- \
|
|
||||||
--slow-tests --core-tests -k "mac or darwin"
|
|
||||||
|
|
||||||
- name: Fix file ownership
|
|
||||||
run: |
|
|
||||||
sudo chown -R "$(id -un)" .
|
|
||||||
|
|
||||||
- name: Combine Coverage Reports
|
|
||||||
if: always() && inputs.skip-code-coverage == false
|
|
||||||
run: |
|
|
||||||
nox --force-color -e combine-coverage
|
|
||||||
|
|
||||||
- name: Prepare Test Run Artifacts
|
|
||||||
id: download-artifacts-from-vm
|
|
||||||
if: always()
|
|
||||||
run: |
|
|
||||||
# Delete the salt onedir, we won't need it anymore and it will prevent
|
|
||||||
# from it showing in the tree command below
|
|
||||||
rm -rf artifacts/salt*
|
|
||||||
tree -a artifacts
|
|
||||||
if [ "${{ inputs.skip-code-coverage }}" != "true" ]; then
|
|
||||||
mv artifacts/coverage/.coverage artifacts/coverage/.coverage.${{ inputs.distro-slug }}.${{ inputs.nox-session }}.${{ matrix.transport }}.${{ matrix.tests-chunk }}
|
|
||||||
fi
|
|
||||||
|
|
||||||
- name: Upload Code Coverage Test Run Artifacts
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-artifacts-from-vm.outcome == 'success' && job.status != 'cancelled'
|
|
||||||
uses: actions/upload-artifact@v3
|
|
||||||
# This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: testrun-coverage-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}
|
|
||||||
path: |
|
|
||||||
artifacts/coverage/
|
|
||||||
|
|
||||||
- name: Upload JUnit XML Test Run Artifacts
|
|
||||||
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
|
|
||||||
uses: actions/upload-artifact@v3
|
|
||||||
# This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: testrun-junit-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}
|
|
||||||
path: |
|
|
||||||
artifacts/xml-unittests-output/
|
|
||||||
|
|
||||||
- name: Upload Test Run Log Artifacts
|
|
||||||
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
|
|
||||||
uses: actions/upload-artifact@v3
|
|
||||||
# This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: testrun-log-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}
|
|
||||||
path: |
|
|
||||||
artifacts/logs
|
|
||||||
|
|
||||||
report:
|
|
||||||
name: Test Reports
|
|
||||||
if: always() && inputs.skip-code-coverage == false && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
|
|
||||||
runs-on: ubuntu-latest
|
|
||||||
needs:
|
|
||||||
- test
|
|
||||||
|
|
||||||
steps:
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Download Code Coverage Test Run Artifacts
|
|
||||||
uses: actions/download-artifact@v3
|
|
||||||
# This needs to be actions/download-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
if: ${{ inputs.skip-code-coverage == false }}
|
|
||||||
id: download-coverage-artifacts
|
|
||||||
with:
|
|
||||||
name: testrun-coverage-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}
|
|
||||||
path: artifacts/coverage/
|
|
||||||
|
|
||||||
- name: Show Downloaded Test Run Artifacts
|
|
||||||
run: |
|
|
||||||
tree -a artifacts
|
|
||||||
|
|
||||||
- name: Set up Python ${{ inputs.gh-actions-python-version }}
|
|
||||||
uses: actions/setup-python@v5
|
|
||||||
with:
|
|
||||||
python-version: "${{ inputs.gh-actions-python-version }}"
|
|
||||||
|
|
||||||
- name: Install Nox
|
|
||||||
run: |
|
|
||||||
python3 -m pip install 'nox==${{ inputs.nox-version }}'
|
|
||||||
|
|
||||||
- name: Create XML Coverage Reports
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success' && job.status != 'cancelled'
|
|
||||||
run: |
|
|
||||||
nox --force-color -e create-xml-coverage-reports
|
|
||||||
mv artifacts/coverage/salt.xml artifacts/coverage/salt..${{ inputs.distro-slug }}..${{ inputs.nox-session }}.xml
|
|
||||||
mv artifacts/coverage/tests.xml artifacts/coverage/tests..${{ inputs.distro-slug }}..${{ inputs.nox-session }}.xml
|
|
||||||
|
|
||||||
- name: Report Salt Code Coverage
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
|
|
||||||
continue-on-error: true
|
|
||||||
run: |
|
|
||||||
nox --force-color -e report-coverage -- salt
|
|
||||||
|
|
||||||
- name: Report Combined Code Coverage
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
|
|
||||||
continue-on-error: true
|
|
||||||
run: |
|
|
||||||
nox --force-color -e report-coverage
|
|
||||||
|
|
||||||
- name: Rename Code Coverage DB
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
|
|
||||||
continue-on-error: true
|
|
||||||
run: |
|
|
||||||
mv artifacts/coverage/.coverage artifacts/coverage/.coverage.${{ inputs.distro-slug }}.${{ inputs.nox-session }}
|
|
||||||
|
|
||||||
- name: Upload Code Coverage DB
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
|
|
||||||
uses: actions/upload-artifact@v3
|
|
||||||
# This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: all-testrun-coverage-artifacts
|
|
||||||
path: artifacts/coverage
|
|
.github/workflows/test-action-windows.yml (vendored, 368 changed lines, file removed)
@ -1,368 +0,0 @@
|
||||||
---
|
|
||||||
name: Test Artifact
|
|
||||||
|
|
||||||
on:
|
|
||||||
workflow_call:
|
|
||||||
inputs:
|
|
||||||
distro-slug:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The OS slug to run tests against
|
|
||||||
nox-session:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The nox session to run
|
|
||||||
testrun:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: JSON string containing information about what and how to run the test suite
|
|
||||||
salt-version:
|
|
||||||
type: string
|
|
||||||
required: true
|
|
||||||
description: The Salt version to set prior to running tests.
|
|
||||||
cache-prefix:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: Seed used to invalidate caches
|
|
||||||
platform:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The platform being tested
|
|
||||||
arch:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The platform arch being tested
|
|
||||||
nox-version:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The nox version to install
|
|
||||||
gh-actions-python-version:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The python version to run tests with
|
|
||||||
default: "3.10"
|
|
||||||
fips:
|
|
||||||
required: false
|
|
||||||
type: boolean
|
|
||||||
default: false
|
|
||||||
description: Test run with FIPS enabled
|
|
||||||
package-name:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The onedir package name to use
|
|
||||||
default: salt
|
|
||||||
skip-code-coverage:
|
|
||||||
required: false
|
|
||||||
type: boolean
|
|
||||||
description: Skip code coverage
|
|
||||||
default: false
|
|
||||||
workflow-slug:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: Which workflow is running.
|
|
||||||
default: ci
|
|
||||||
default-timeout:
|
|
||||||
required: false
|
|
||||||
type: number
|
|
||||||
description: Timeout, in minutes, for the test job(Default 360, 6 hours).
|
|
||||||
default: 360
|
|
||||||
|
|
||||||
env:
|
|
||||||
COLUMNS: 190
|
|
||||||
AWS_MAX_ATTEMPTS: "10"
|
|
||||||
AWS_RETRY_MODE: "adaptive"
|
|
||||||
PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
|
|
||||||
PIP_EXTRA_INDEX_URL: https://pypi.org/simple
|
|
||||||
PIP_DISABLE_PIP_VERSION_CHECK: "1"
|
|
||||||
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
|
|
||||||
|
|
||||||
jobs:
|
|
||||||
|
|
||||||
generate-matrix:
|
|
||||||
name: Test Matrix
|
|
||||||
runs-on: ubuntu-latest
|
|
||||||
outputs:
|
|
||||||
matrix-include: ${{ steps.generate-matrix.outputs.matrix }}
|
|
||||||
steps:
|
|
||||||
|
|
||||||
- name: "Throttle Builds"
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
|
|
||||||
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Setup Python Tools Scripts
|
|
||||||
uses: ./.github/actions/setup-python-tools-scripts
|
|
||||||
with:
|
|
||||||
cache-prefix: ${{ inputs.cache-prefix }}
|
|
||||||
|
|
||||||
- name: Generate Test Matrix
|
|
||||||
id: generate-matrix
|
|
||||||
run: |
|
|
||||||
tools ci matrix --workflow=${{ inputs.workflow-slug }} ${{ fromJSON(inputs.testrun)['type'] == 'full' && '--full ' || '' }}${{ inputs.fips && '--fips ' || '' }}${{ inputs.distro-slug }}
|
|
||||||
|
|
||||||
test:
|
|
||||||
name: Test
|
|
||||||
runs-on:
|
|
||||||
- self-hosted
|
|
||||||
- linux
|
|
||||||
- bastion
|
|
||||||
# Full test runs. Each chunk should never take more than 2 hours.
|
|
||||||
# Partial test runs(no chunk parallelization), 6 Hours
|
|
||||||
timeout-minutes: ${{ fromJSON(inputs.testrun)['type'] == 'full' && inputs.default-timeout || 360 }}
|
|
||||||
needs:
|
|
||||||
- generate-matrix
|
|
||||||
strategy:
|
|
||||||
fail-fast: false
|
|
||||||
matrix:
|
|
||||||
include: ${{ fromJSON(needs.generate-matrix.outputs.matrix-include) }}
|
|
||||||
env:
|
|
||||||
SALT_TRANSPORT: ${{ matrix.transport }}
|
|
||||||
TEST_GROUP: ${{ matrix.test-group || 1 }}
|
|
||||||
|
|
||||||
steps:
|
|
||||||
|
|
||||||
- name: "Throttle Builds"
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
|
|
||||||
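Both throttle variants (shuf on the matrix generator, the python3 one-liner here) simply sleep for a random handful of seconds so that simultaneously scheduled matrix jobs do not all hit the provisioning backend at the same instant. An equivalent jitter step using only bash built-ins, shown purely as an illustration, could be:

      - name: "Throttle Builds"
        shell: bash
        run: |
          # $RANDOM is a bash builtin; sleep between 1 and 15 seconds
          t=$(( (RANDOM % 15) + 1 )); echo "Sleeping $t seconds"; sleep "$t"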
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Setup Salt Version
|
|
||||||
run: |
|
|
||||||
echo "${{ inputs.salt-version }}" > salt/_version.txt
|
|
||||||
|
|
||||||
- name: Download Onedir Tarball as an Artifact
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
|
|
||||||
path: artifacts/
|
|
||||||
|
|
||||||
- name: Decompress Onedir Tarball
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
|
|
||||||
cd artifacts
|
|
||||||
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
|
|
||||||
|
|
||||||
- name: Download nox.windows.${{ inputs.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: nox-windows-${{ inputs.arch }}-${{ inputs.nox-session }}
|
|
||||||
|
|
||||||
- name: PyPi Proxy
|
|
||||||
run: |
|
|
||||||
sed -i '7s;^;--index-url=https://pypi-proxy.saltstack.net/root/local/+simple/ --extra-index-url=https://pypi.org/simple\n;' requirements/static/ci/*/*.txt
|
|
||||||
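The sed expression above injects an extra line ahead of line 7 of every compiled requirements file so that pip resolves packages through the internal proxy first and falls back to PyPI. Assuming a typical requirements/static/ci file (the exact position, line 7, is an assumption baked into the sed address), the inserted line is simply:

--index-url=https://pypi-proxy.saltstack.net/root/local/+simple/ --extra-index-url=https://pypi.org/simple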
|
|
||||||
- name: Setup Python Tools Scripts
|
|
||||||
uses: ./.github/actions/setup-python-tools-scripts
|
|
||||||
with:
|
|
||||||
cache-prefix: ${{ inputs.cache-prefix }}
|
|
||||||
|
|
||||||
- name: Download testrun-changed-files.txt
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' }}
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: testrun-changed-files.txt
|
|
||||||
|
|
||||||
- name: Get Salt Project GitHub Actions Bot Environment
|
|
||||||
run: |
|
|
||||||
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
|
|
||||||
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
|
|
||||||
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
|
|
||||||
|
|
||||||
- name: Start VM
|
|
||||||
id: spin-up-vm
|
|
||||||
env:
|
|
||||||
TESTS_CHUNK: ${{ matrix.tests-chunk }}
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ inputs.distro-slug }}
|
|
||||||
|
|
||||||
- name: List Free Space
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm ssh ${{ inputs.distro-slug }} -- df -h || true
|
|
||||||
|
|
||||||
- name: Upload Checkout To VM
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm rsync ${{ inputs.distro-slug }}
|
|
||||||
|
|
||||||
- name: Decompress .nox Directory
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm decompress-dependencies ${{ inputs.distro-slug }}
|
|
||||||
|
|
||||||
- name: Show System Info
|
|
||||||
run: |
|
|
||||||
tools --timestamps --timeout-secs=1800 vm test --skip-requirements-install --print-system-information-only \
|
|
||||||
--nox-session=${{ inputs.nox-session }} ${{ inputs.distro-slug }} \
|
|
||||||
${{ matrix.tests-chunk }}
|
|
||||||
|
|
||||||
- name: Run Changed Tests
|
|
||||||
id: run-fast-changed-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['fast'] == false }}
|
|
||||||
run: |
|
|
||||||
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
|
|
||||||
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
|
|
||||||
${{ matrix.tests-chunk }} -- --core-tests --slow-tests --suppress-no-test-exit-code \
|
|
||||||
--from-filenames=testrun-changed-files.txt
|
|
||||||
|
|
||||||
- name: Run Fast Tests
|
|
||||||
id: run-fast-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['fast'] }}
|
|
||||||
run: |
|
|
||||||
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
|
|
||||||
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ (inputs.skip-code-coverage && matrix.tests-chunk != 'unit') && '--skip-code-coverage' || '' }} \
|
|
||||||
${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} ${{ matrix.tests-chunk }}
|
|
||||||
|
|
||||||
- name: Run Slow Tests
|
|
||||||
id: run-slow-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['slow'] }}
|
|
||||||
run: |
|
|
||||||
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
|
|
||||||
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
|
|
||||||
${{ matrix.tests-chunk }} -- --no-fast-tests --slow-tests
|
|
||||||
|
|
||||||
- name: Run Core Tests
|
|
||||||
id: run-core-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] != 'full' && fromJSON(inputs.testrun)['selected_tests']['core'] }}
|
|
||||||
run: |
|
|
||||||
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
|
|
||||||
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
|
|
||||||
${{ matrix.tests-chunk }} -- --no-fast-tests --core-tests
|
|
||||||
|
|
||||||
- name: Run Flaky Tests
|
|
||||||
id: run-flaky-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['selected_tests']['flaky'] }}
|
|
||||||
run: |
|
|
||||||
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
|
|
||||||
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
|
|
||||||
${{ matrix.tests-chunk }} -- --no-fast-tests --flaky-jail
|
|
||||||
|
|
||||||
- name: Run Full Tests
|
|
||||||
id: run-full-tests
|
|
||||||
if: ${{ fromJSON(inputs.testrun)['type'] == 'full' }}
|
|
||||||
run: |
|
|
||||||
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install \
|
|
||||||
--nox-session=${{ inputs.nox-session }} --rerun-failures -E SALT_TRANSPORT ${{ (inputs.skip-code-coverage && matrix.tests-chunk != 'unit') && '--skip-code-coverage' || '' }} \
|
|
||||||
-E TEST_GROUP ${{ matrix.fips && '--fips ' || '' }}${{ inputs.distro-slug }} ${{ matrix.tests-chunk }} -- --slow-tests --core-tests \
|
|
||||||
--test-group-count=${{ matrix.test-group-count || 1 }} --test-group=${{ matrix.test-group || 1 }}
|
|
||||||
|
|
||||||
- name: Combine Coverage Reports
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.spin-up-vm.outcome == 'success'
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm combine-coverage ${{ inputs.distro-slug }}
|
|
||||||
|
|
||||||
- name: Download Test Run Artifacts
|
|
||||||
id: download-artifacts-from-vm
|
|
||||||
if: always() && steps.spin-up-vm.outcome == 'success'
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm download-artifacts ${{ inputs.distro-slug }}
|
|
||||||
# Delete the salt onedir, we won't need it anymore and it will prevent
|
|
||||||
# from it showing in the tree command below
|
|
||||||
rm -rf artifacts/salt*
|
|
||||||
tree -a artifacts
|
|
||||||
if [ "${{ inputs.skip-code-coverage }}" != "true" ]; then
|
|
||||||
mv artifacts/coverage/.coverage artifacts/coverage/.coverage.${{ inputs.distro-slug }}.${{ inputs.nox-session }}.${{ matrix.transport }}.${{ matrix.tests-chunk }}.grp${{ matrix.test-group || '1' }}
|
|
||||||
fi
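The unique .coverage.<distro-slug>.<nox-session>.<transport>.<tests-chunk>.grp<N> suffix matters because coverage.py merges every data file matching .coverage.* when combining; without distinct names, parallel runs uploaded into the shared artifact would clobber each other. A minimal sketch of what the later combine/report stage boils down to (illustrative only; the real pipeline drives this through the nox sessions shown elsewhere in this diff):

# Merge all per-run data files found in the directory into one .coverage database
python3 -m coverage combine artifacts/coverage/
# ...and emit an XML report from the merged data
python3 -m coverage xml -o artifacts/coverage/salt.xml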
|
|
||||||
|
|
||||||
- name: Destroy VM
|
|
||||||
if: always()
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm destroy --no-wait ${{ inputs.distro-slug }} || true
|
|
||||||
|
|
||||||
- name: Upload Code Coverage Test Run Artifacts
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-artifacts-from-vm.outcome == 'success' && job.status != 'cancelled'
|
|
||||||
uses: actions/upload-artifact@v3
|
|
||||||
# This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: testrun-coverage-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}
|
|
||||||
path: |
|
|
||||||
artifacts/coverage/
|
|
||||||
|
|
||||||
- name: Upload JUnit XML Test Run Artifacts
|
|
||||||
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
|
|
||||||
uses: actions/upload-artifact@v3
|
|
||||||
# This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: testrun-junit-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}
|
|
||||||
path: |
|
|
||||||
artifacts/xml-unittests-output/
|
|
||||||
|
|
||||||
- name: Upload Test Run Log Artifacts
|
|
||||||
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
|
|
||||||
uses: actions/upload-artifact@v3
|
|
||||||
# This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: testrun-log-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}
|
|
||||||
path: |
|
|
||||||
artifacts/logs
|
|
||||||
|
|
||||||
|
|
||||||
report:
|
|
||||||
name: Test Reports
|
|
||||||
if: always() && inputs.skip-code-coverage == false && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
|
|
||||||
runs-on: ubuntu-latest
|
|
||||||
needs:
|
|
||||||
- test
|
|
||||||
|
|
||||||
steps:
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Download Code Coverage Test Run Artifacts
|
|
||||||
uses: actions/download-artifact@v3
|
|
||||||
# This needs to be actions/download-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
if: ${{ inputs.skip-code-coverage == false }}
|
|
||||||
id: download-coverage-artifacts
|
|
||||||
with:
|
|
||||||
name: testrun-coverage-artifacts-${{ inputs.distro-slug }}-${{ inputs.nox-session }}
|
|
||||||
path: artifacts/coverage/
|
|
||||||
|
|
||||||
- name: Show Downloaded Test Run Artifacts
|
|
||||||
run: |
|
|
||||||
tree -a artifacts
|
|
||||||
|
|
||||||
- name: Install Nox
|
|
||||||
run: |
|
|
||||||
python3 -m pip install 'nox==${{ inputs.nox-version }}'
|
|
||||||
|
|
||||||
- name: Create XML Coverage Reports
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success' && job.status != 'cancelled'
|
|
||||||
run: |
|
|
||||||
nox --force-color -e create-xml-coverage-reports
|
|
||||||
mv artifacts/coverage/salt.xml artifacts/coverage/salt..${{ inputs.distro-slug }}..${{ inputs.nox-session }}.xml
|
|
||||||
mv artifacts/coverage/tests.xml artifacts/coverage/tests..${{ inputs.distro-slug }}..${{ inputs.nox-session }}.xml
|
|
||||||
|
|
||||||
- name: Report Salt Code Coverage
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
|
|
||||||
continue-on-error: true
|
|
||||||
run: |
|
|
||||||
nox --force-color -e report-coverage -- salt
|
|
||||||
|
|
||||||
- name: Report Combined Code Coverage
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
|
|
||||||
continue-on-error: true
|
|
||||||
run: |
|
|
||||||
nox --force-color -e report-coverage
|
|
||||||
|
|
||||||
- name: Rename Code Coverage DB
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
|
|
||||||
continue-on-error: true
|
|
||||||
run: |
|
|
||||||
mv artifacts/coverage/.coverage artifacts/coverage/.coverage.${{ inputs.distro-slug }}.${{ inputs.nox-session }}
|
|
||||||
|
|
||||||
- name: Upload Code Coverage DB
|
|
||||||
if: always() && inputs.skip-code-coverage == false && steps.download-coverage-artifacts.outcome == 'success'
|
|
||||||
uses: actions/upload-artifact@v3
|
|
||||||
# This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: all-testrun-coverage-artifacts
|
|
||||||
path: artifacts/coverage
.github/workflows/test-action.yml (vendored, new file, 1387 lines): diff suppressed because it is too large.

.github/workflows/test-package-downloads-action.yml (vendored, 256 lines changed)
@@ -48,175 +48,55 @@ env:
   COLUMNS: 190
   AWS_MAX_ATTEMPTS: "10"
   AWS_RETRY_MODE: "adaptive"
-  PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
-  PIP_EXTRA_INDEX_URL: https://pypi.org/simple
+  PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
+  PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
+  PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}
   PIP_DISABLE_PIP_VERSION_CHECK: "1"
   RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"

 jobs:

+  generate-matrix:
+    name: Generate Matrix
+    runs-on: ubuntu-latest
+    outputs:
+      matrix-include: ${{ steps.generate-matrix.outputs.matrix }}
+    steps:
+
+      - name: "Throttle Builds"
+        shell: bash
+        run: |
+          t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
+
+      - name: Checkout Source Code
+        uses: actions/checkout@v4
+
+      - name: Setup Python Tools Scripts
+        uses: ./.github/actions/setup-python-tools-scripts
+        with:
+          cache-prefix: ${{ inputs.cache-prefix }}
+        env:
+          PIP_INDEX_URL: https://pypi.org/simple
+
+      - name: Generate Test Matrix
+        id: generate-matrix
+        run: |
+          tools ci pkg-downloads-matrix
+
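The generate-matrix job added above exposes whatever tools ci pkg-downloads-matrix writes to the step's matrix output, and the per-platform jobs below consume it through fromJSON. A self-contained sketch of that produce/consume pattern, with a hypothetical hand-written JSON payload standing in for the real tools output:

  generate-matrix:
    runs-on: ubuntu-latest
    outputs:
      matrix-include: ${{ steps.generate-matrix.outputs.matrix }}
    steps:
      - id: generate-matrix
        run: |
          # Hypothetical payload; the real one comes from `tools ci pkg-downloads-matrix`
          echo 'matrix={"linux": [{"distro-slug": "ubuntu-22.04", "arch": "x86_64", "pkg-type": "package"}]}' >> "$GITHUB_OUTPUT"

  consume:
    needs: generate-matrix
    runs-on: ubuntu-latest
    strategy:
      matrix:
        include: ${{ fromJSON(needs.generate-matrix.outputs.matrix-include)['linux'] }}
    steps:
      - run: echo "${{ matrix.distro-slug }} / ${{ matrix.arch }} / ${{ matrix.pkg-type }}"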
   linux:
     name: Linux
+    needs:
+      - generate-matrix
     runs-on:
-      - self-hosted
-      - linux
-      - bastion
+      - ubuntu-latest
+    env:
+      USE_S3_CACHE: 'true'
     environment: ${{ inputs.environment }}
     timeout-minutes: 120  # 2 Hours - More than this and something is wrong
     strategy:
       fail-fast: false
       matrix:
-        include:
+        include: ${{ fromJSON(needs.generate-matrix.outputs.matrix-include)['linux'] }}
- distro-slug: almalinux-8
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: almalinux-8-arm64
|
|
||||||
arch: aarch64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: almalinux-8-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: almalinux-9
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: almalinux-9-arm64
|
|
||||||
arch: aarch64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: almalinux-9-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: amazonlinux-2
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: amazonlinux-2-arm64
|
|
||||||
arch: aarch64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: amazonlinux-2-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: amazonlinux-2023
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: amazonlinux-2023-arm64
|
|
||||||
arch: aarch64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: amazonlinux-2023-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: centos-7
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: centos-7-arm64
|
|
||||||
arch: aarch64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: centos-7-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: centosstream-8
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: centosstream-8-arm64
|
|
||||||
arch: aarch64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: centosstream-8-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: centosstream-9
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: centosstream-9-arm64
|
|
||||||
arch: aarch64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: centosstream-9-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: debian-10
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: debian-10-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: debian-11
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: debian-11-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: debian-12
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: debian-12-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: fedora-37
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: fedora-37-arm64
|
|
||||||
arch: aarch64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: fedora-37-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: fedora-38
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: fedora-38-arm64
|
|
||||||
arch: aarch64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: fedora-38-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: photonos-3
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: photonos-3-arm64
|
|
||||||
arch: aarch64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: photonos-3-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: photonos-4
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: photonos-4-arm64
|
|
||||||
arch: aarch64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: photonos-4-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: photonos-5
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: photonos-5-arm64
|
|
||||||
arch: aarch64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: photonos-5-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: ubuntu-20.04
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: ubuntu-20.04-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: ubuntu-22.04
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: ubuntu-22.04
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: onedir
|
|
||||||
- distro-slug: ubuntu-22.04-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: ubuntu-22.04-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: onedir
|
|
||||||
- distro-slug: ubuntu-23.04
|
|
||||||
arch: x86_64
|
|
||||||
pkg-type: package
|
|
||||||
- distro-slug: ubuntu-23.04-arm64
|
|
||||||
arch: arm64
|
|
||||||
pkg-type: package
|
|
||||||

     steps:

@@ -401,38 +281,29 @@

       - name: Upload Test Run Artifacts
         if: always() && steps.download-artifacts-from-vm.outcome == 'success'
-        uses: actions/upload-artifact@v3
-        # This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
-        # under the same name something that actions/upload-artifact@v4 does not do.
+        uses: actions/upload-artifact@v4
         with:
-          name: pkg-testrun-artifacts-${{ matrix.distro-slug }}-${{ matrix.arch }}
+          name: pkg-testrun-artifacts-${{ matrix.distro-slug }}-${{ matrix.arch }}-${{ matrix.pkg-type }}
           path: |
-            artifacts
+            artifacts/
             !artifacts/salt/*
             !artifacts/salt-*.tar.*

   macos:
     name: MacOS
-    runs-on: ${{ matrix.distro-slug }}
+    needs:
+      - generate-matrix
+    runs-on: ${{ matrix.distro-slug == 'macos-13-arm64' && 'macos-13-xlarge' || matrix.distro-slug }}
+    env:
+      USE_S3_CACHE: 'false'
+      PIP_INDEX_URL: https://pypi.org/simple
     environment: ${{ inputs.environment }}
     timeout-minutes: 120  # 2 Hours - More than this and something is wrong
     strategy:
       fail-fast: false
       matrix:
-        include:
-          - distro-slug: macos-12
-            arch: x86_64
-            pkg-type: package
-          - distro-slug: macos-13
-            arch: x86_64
-            pkg-type: package
-          - distro-slug: macos-13-xlarge
-            arch: arm64
-            pkg-type: package
-          - distro-slug: macos-13-xlarge
-            arch: arm64
-            pkg-type: onedir
+        include: ${{ fromJSON(needs.generate-matrix.outputs.matrix-include)['macos'] }}

     steps:

@@ -608,38 +479,29 @@
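The new runs-on expression for the macos job uses the condition && value-if-true || value-if-false idiom of GitHub Actions expressions to map the macos-13-arm64 slug onto the macos-13-xlarge runner label while leaving every other slug unchanged. A tiny standalone illustration of the same idiom (hypothetical job, not part of this diff):

  example:
    runs-on: ${{ matrix.distro-slug == 'macos-13-arm64' && 'macos-13-xlarge' || matrix.distro-slug }}
    strategy:
      matrix:
        include:
          - distro-slug: macos-13         # resolves to runs-on: macos-13
          - distro-slug: macos-13-arm64   # resolves to runs-on: macos-13-xlarge
    steps:
      - run: echo "running on ${{ runner.os }} ${{ runner.arch }}"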

       - name: Upload Test Run Artifacts
         if: always()
-        uses: actions/upload-artifact@v3
-        # This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
-        # under the same name something that actions/upload-artifact@v4 does not do.
+        uses: actions/upload-artifact@v4
         with:
-          name: pkg-testrun-artifacts-${{ matrix.distro-slug }}-${{ matrix.arch }}
+          name: pkg-testrun-artifacts-${{ matrix.distro-slug }}-${{ matrix.arch }}-${{ matrix.pkg-type }}
           path: |
-            artifacts
+            artifacts/
             !artifacts/salt/*
             !artifacts/salt-*.tar.*

   windows:
     name: Windows
+    needs:
+      - generate-matrix
+    env:
+      USE_S3_CACHE: 'true'
     runs-on:
-      - self-hosted
-      - linux
-      - bastion
+      - ubuntu-latest
     environment: ${{ inputs.environment }}
     timeout-minutes: 120  # 2 Hours - More than this and something is wrong
     strategy:
       fail-fast: false
       matrix:
-        include:
-          - distro-slug: windows-2022
-            arch: amd64
-            pkg-type: nsis
-          - distro-slug: windows-2022
-            arch: amd64
-            pkg-type: msi
-          - distro-slug: windows-2022
-            arch: amd64
-            pkg-type: onedir
+        include: ${{ fromJSON(needs.generate-matrix.outputs.matrix-include)['windows'] }}

     steps:
       - name: Checkout Source Code

@@ -818,12 +680,10 @@ jobs:

       - name: Upload Test Run Artifacts
         if: always() && steps.download-artifacts-from-vm.outcome == 'success'
-        uses: actions/upload-artifact@v3
-        # This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
-        # under the same name something that actions/upload-artifact@v4 does not do.
+        uses: actions/upload-artifact@v4
         with:
-          name: pkg-testrun-artifacts-${{ matrix.distro-slug }}-${{ matrix.arch }}
+          name: pkg-testrun-artifacts-${{ matrix.distro-slug }}-${{ matrix.arch }}-${{ matrix.pkg-type }}
           path: |
-            artifacts
+            artifacts/
             !artifacts/salt/*
             !artifacts/salt-*.tar.*

.github/workflows/test-packages-action-linux.yml (vendored, 260 lines deleted)
@@ -1,260 +0,0 @@
name: Test Artifact
|
|
||||||
|
|
||||||
on:
|
|
||||||
workflow_call:
|
|
||||||
inputs:
|
|
||||||
distro-slug:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The OS slug to run tests against
|
|
||||||
platform:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The platform being tested
|
|
||||||
arch:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The platform arch being tested
|
|
||||||
pkg-type:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The platform arch being tested
|
|
||||||
salt-version:
|
|
||||||
type: string
|
|
||||||
required: true
|
|
||||||
description: The Salt version of the packages to install and test
|
|
||||||
cache-prefix:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: Seed used to invalidate caches
|
|
||||||
testing-releases:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: A JSON list of releases to test upgrades against
|
|
||||||
nox-version:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The nox version to install
|
|
||||||
python-version:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The python version to run tests with
|
|
||||||
default: "3.10"
|
|
||||||
fips:
|
|
||||||
required: false
|
|
||||||
type: boolean
|
|
||||||
default: false
|
|
||||||
description: Test run with FIPS enabled
|
|
||||||
package-name:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The onedir package name to use
|
|
||||||
default: salt
|
|
||||||
nox-session:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The nox session to run
|
|
||||||
default: ci-test-onedir
|
|
||||||
skip-code-coverage:
|
|
||||||
required: false
|
|
||||||
type: boolean
|
|
||||||
description: Skip code coverage
|
|
||||||
default: false
|
|
||||||
|
|
||||||
env:
|
|
||||||
COLUMNS: 190
|
|
||||||
AWS_MAX_ATTEMPTS: "10"
|
|
||||||
AWS_RETRY_MODE: "adaptive"
|
|
||||||
PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
|
|
||||||
PIP_EXTRA_INDEX_URL: https://pypi.org/simple
|
|
||||||
PIP_DISABLE_PIP_VERSION_CHECK: "1"
|
|
||||||
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
|
|
||||||
|
|
||||||
jobs:
|
|
||||||
|
|
||||||
generate-matrix:
|
|
||||||
name: Generate Matrix
|
|
||||||
runs-on:
|
|
||||||
# We need to run on our self-hosted runners because we need proper credentials
|
|
||||||
# for boto3 to scan through our repositories.
|
|
||||||
- self-hosted
|
|
||||||
- linux
|
|
||||||
- x86_64
|
|
||||||
outputs:
|
|
||||||
pkg-matrix-include: ${{ steps.generate-pkg-matrix.outputs.matrix }}
|
|
||||||
steps:
|
|
||||||
|
|
||||||
- name: "Throttle Builds"
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
|
|
||||||
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Setup Python Tools Scripts
|
|
||||||
uses: ./.github/actions/setup-python-tools-scripts
|
|
||||||
with:
|
|
||||||
cache-prefix: ${{ inputs.cache-prefix }}
|
|
||||||
|
|
||||||
- name: Generate Package Test Matrix
|
|
||||||
id: generate-pkg-matrix
|
|
||||||
run: |
|
|
||||||
tools ci pkg-matrix ${{ inputs.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
|
|
||||||
${{ inputs.pkg-type }} --testing-releases ${{ join(fromJSON(inputs.testing-releases), ' ') }}
|
|
||||||
|
|
||||||
|
|
||||||
test:
|
|
||||||
name: Test
|
|
||||||
runs-on:
|
|
||||||
- self-hosted
|
|
||||||
- linux
|
|
||||||
- bastion
|
|
||||||
timeout-minutes: 120 # 2 Hours - More than this and something is wrong
|
|
||||||
needs:
|
|
||||||
- generate-matrix
|
|
||||||
strategy:
|
|
||||||
fail-fast: false
|
|
||||||
matrix:
|
|
||||||
include: ${{ fromJSON(needs.generate-matrix.outputs.pkg-matrix-include) }}
|
|
||||||
|
|
||||||
steps:
|
|
||||||
|
|
||||||
- name: "Throttle Builds"
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
|
|
||||||
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Download Packages
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-${{ inputs.arch }}-${{ inputs.pkg-type }}
|
|
||||||
path: artifacts/pkg/
|
|
||||||
|
|
||||||
- name: Download Onedir Tarball as an Artifact
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
|
|
||||||
path: artifacts/
|
|
||||||
|
|
||||||
- name: Decompress Onedir Tarball
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
|
|
||||||
cd artifacts
|
|
||||||
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
|
|
||||||
|
|
||||||
- name: List Packages
|
|
||||||
run: |
|
|
||||||
tree artifacts/pkg/
|
|
||||||
|
|
||||||
- name: Download nox.linux.${{ inputs.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: nox-linux-${{ inputs.arch }}-${{ inputs.nox-session }}
|
|
||||||
|
|
||||||
- name: Setup Python Tools Scripts
|
|
||||||
uses: ./.github/actions/setup-python-tools-scripts
|
|
||||||
with:
|
|
||||||
cache-prefix: ${{ inputs.cache-prefix }}
|
|
||||||
|
|
||||||
- name: Get Salt Project GitHub Actions Bot Environment
|
|
||||||
run: |
|
|
||||||
TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
|
|
||||||
SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
|
|
||||||
echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"
|
|
||||||
|
|
||||||
- name: Start VM
|
|
||||||
id: spin-up-vm
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ inputs.distro-slug }}
|
|
||||||
|
|
||||||
- name: List Free Space
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm ssh ${{ inputs.distro-slug }} -- df -h || true
|
|
||||||
|
|
||||||
- name: Upload Checkout To VM
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm rsync ${{ inputs.distro-slug }}
|
|
||||||
|
|
||||||
- name: Decompress .nox Directory
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm decompress-dependencies ${{ inputs.distro-slug }}
|
|
||||||
|
|
||||||
- name: Downgrade importlib-metadata
|
|
||||||
if: ${{ contains(fromJSON('["amazonlinux-2", "centos-7", "debian-10"]'), inputs.distro-slug) && contains(fromJSON('["upgrade-classic", "downgrade-classic"]'), matrix.tests-chunk) }}
|
|
||||||
run: |
|
|
||||||
# This step can go away once we stop testing classic packages upgrade/downgrades to/from 3005.x
|
|
||||||
tools --timestamps vm ssh ${{ inputs.distro-slug }} -- "sudo python3 -m pip install -U 'importlib-metadata<=4.13.0' 'virtualenv<=20.21.1'"
|
|
||||||
|
|
||||||
- name: Show System Info
|
|
||||||
run: |
|
|
||||||
tools --timestamps --timeout-secs=1800 vm test --skip-requirements-install --print-system-information-only \
|
|
||||||
--nox-session=${{ inputs.nox-session }}-pkgs ${{ inputs.distro-slug }} -- ${{ matrix.tests-chunk }}
|
|
||||||
|
|
||||||
- name: Run Package Tests
|
|
||||||
run: |
|
|
||||||
tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install ${{ matrix.fips && '--fips ' || '' }}\
|
|
||||||
--nox-session=${{ inputs.nox-session }}-pkgs --rerun-failures ${{ inputs.distro-slug }} -- ${{ matrix.tests-chunk }} \
|
|
||||||
${{ matrix.version && format('--prev-version {0}', matrix.version) || ''}}
|
|
||||||
|
|
||||||
- name: Download Test Run Artifacts
|
|
||||||
id: download-artifacts-from-vm
|
|
||||||
if: always() && steps.spin-up-vm.outcome == 'success'
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm download-artifacts ${{ inputs.distro-slug }}
|
|
||||||
# Delete the salt onedir, we won't need it anymore and it will prevent
|
|
||||||
# from it showing in the tree command below
|
|
||||||
rm -rf artifacts/salt*
|
|
||||||
tree -a artifacts
|
|
||||||
|
|
||||||
- name: Destroy VM
|
|
||||||
if: always()
|
|
||||||
run: |
|
|
||||||
tools --timestamps vm destroy --no-wait ${{ inputs.distro-slug }} || true
|
|
||||||
|
|
||||||
- name: Upload Test Run Artifacts
|
|
||||||
if: always() && steps.download-artifacts-from-vm.outcome == 'success'
|
|
||||||
uses: actions/upload-artifact@v3
|
|
||||||
# This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: pkg-testrun-artifacts-${{ inputs.distro-slug }}-${{ matrix.tests-chunk }}
|
|
||||||
path: |
|
|
||||||
artifacts
|
|
||||||
!artifacts/pkg/*
|
|
||||||
!artifacts/salt/*
|
|
||||||
!artifacts/salt-*.tar.*
|
|
||||||
|
|
||||||
report:
|
|
||||||
name: Report
|
|
||||||
runs-on: ubuntu-latest
|
|
||||||
if: always() && inputs.skip-code-coverage == false && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
|
|
||||||
needs:
|
|
||||||
- test
|
|
||||||
- generate-matrix
|
|
||||||
strategy:
|
|
||||||
fail-fast: false
|
|
||||||
matrix:
|
|
||||||
include: ${{ fromJSON(needs.generate-matrix.outputs.pkg-matrix-include) }}
|
|
||||||
|
|
||||||
steps:
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Download Test Run Artifacts
|
|
||||||
id: download-test-run-artifacts
|
|
||||||
uses: actions/download-artifact@v3
|
|
||||||
# This needs to be actions/download-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: pkg-testrun-artifacts-${{ inputs.distro-slug }}-${{ matrix.tests-chunk }}
|
|
||||||
path: artifacts
|
|
||||||
|
|
||||||
- name: Show Test Run Artifacts
|
|
||||||
if: always() && steps.download-test-run-artifacts.outcome == 'success'
|
|
||||||
run: |
|
|
||||||
tree -a artifacts
|
|
.github/workflows/test-packages-action-macos.yml (vendored, 249 lines deleted)
@@ -1,249 +0,0 @@
name: Test Artifact
|
|
||||||
|
|
||||||
on:
|
|
||||||
workflow_call:
|
|
||||||
inputs:
|
|
||||||
distro-slug:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The OS slug to run tests against
|
|
||||||
platform:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The platform being tested
|
|
||||||
arch:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The platform arch being tested
|
|
||||||
pkg-type:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The platform arch being tested
|
|
||||||
salt-version:
|
|
||||||
type: string
|
|
||||||
required: true
|
|
||||||
description: The Salt version of the packages to install and test
|
|
||||||
cache-prefix:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: Seed used to invalidate caches
|
|
||||||
testing-releases:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: A JSON list of releases to test upgrades against
|
|
||||||
nox-version:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The nox version to install
|
|
||||||
python-version:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The python version to run tests with
|
|
||||||
default: "3.10"
|
|
||||||
package-name:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The onedir package name to use
|
|
||||||
default: salt
|
|
||||||
nox-session:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The nox session to run
|
|
||||||
default: ci-test-onedir
|
|
||||||
skip-code-coverage:
|
|
||||||
required: false
|
|
||||||
type: boolean
|
|
||||||
description: Skip code coverage
|
|
||||||
default: false
|
|
||||||
|
|
||||||
env:
|
|
||||||
COLUMNS: 190
|
|
||||||
PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
|
|
||||||
PIP_EXTRA_INDEX_URL: https://pypi.org/simple
|
|
||||||
PIP_DISABLE_PIP_VERSION_CHECK: "1"
|
|
||||||
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
|
|
||||||
|
|
||||||
jobs:
|
|
||||||
|
|
||||||
generate-matrix:
|
|
||||||
name: Generate Matrix
|
|
||||||
runs-on:
|
|
||||||
# We need to run on our self-hosted runners because we need proper credentials
|
|
||||||
# for boto3 to scan through our repositories.
|
|
||||||
- self-hosted
|
|
||||||
- linux
|
|
||||||
- x86_64
|
|
||||||
outputs:
|
|
||||||
pkg-matrix-include: ${{ steps.generate-pkg-matrix.outputs.matrix }}
|
|
||||||
steps:
|
|
||||||
|
|
||||||
- name: "Throttle Builds"
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
|
|
||||||
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Setup Python Tools Scripts
|
|
||||||
uses: ./.github/actions/setup-python-tools-scripts
|
|
||||||
with:
|
|
||||||
cache-prefix: ${{ inputs.cache-prefix }}
|
|
||||||
|
|
||||||
- name: Generate Package Test Matrix
|
|
||||||
id: generate-pkg-matrix
|
|
||||||
run: |
|
|
||||||
tools ci pkg-matrix ${{ inputs.distro-slug }} ${{ inputs.pkg-type }} --testing-releases ${{ join(fromJSON(inputs.testing-releases), ' ') }}
|
|
||||||
|
|
||||||
|
|
||||||
test:
|
|
||||||
name: Test
|
|
||||||
runs-on: ${{ inputs.distro-slug }}
|
|
||||||
timeout-minutes: 120 # 2 Hours - More than this and something is wrong
|
|
||||||
needs:
|
|
||||||
- generate-matrix
|
|
||||||
strategy:
|
|
||||||
fail-fast: false
|
|
||||||
matrix:
|
|
||||||
include: ${{ fromJSON(needs.generate-matrix.outputs.pkg-matrix-include) }}
|
|
||||||
|
|
||||||
steps:
|
|
||||||
|
|
||||||
- name: "Throttle Builds"
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
|
|
||||||
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Download Packages
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: salt-${{ inputs.salt-version }}-${{ inputs.arch }}-${{ inputs.pkg-type }}
|
|
||||||
path: artifacts/pkg/
|
|
||||||
|
|
||||||
- name: Install System Dependencies
|
|
||||||
run: |
|
|
||||||
brew install tree
|
|
||||||
|
|
||||||
- name: List Packages
|
|
||||||
run: |
|
|
||||||
tree artifacts/pkg/
|
|
||||||
|
|
||||||
- name: Download Onedir Tarball as an Artifact
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
|
|
||||||
path: artifacts/
|
|
||||||
|
|
||||||
- name: Decompress Onedir Tarball
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
|
|
||||||
cd artifacts
|
|
||||||
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
|
|
||||||
|
|
||||||
- name: Set up Python ${{ inputs.python-version }}
|
|
||||||
uses: actions/setup-python@v5
|
|
||||||
with:
|
|
||||||
python-version: "${{ inputs.python-version }}"
|
|
||||||
|
|
||||||
- name: Install Nox
|
|
||||||
run: |
|
|
||||||
python3 -m pip install 'nox==${{ inputs.nox-version }}'
|
|
||||||
|
|
||||||
- name: Download nox.macos.${{ inputs.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: nox-macos-${{ inputs.arch }}-${{ inputs.nox-session }}
|
|
||||||
|
|
||||||
- name: Decompress .nox Directory
|
|
||||||
run: |
|
|
||||||
nox --force-color -e decompress-dependencies -- macos ${{ inputs.arch }}
|
|
||||||
|
|
||||||
- name: Show System Info
|
|
||||||
env:
|
|
||||||
SKIP_REQUIREMENTS_INSTALL: "1"
|
|
||||||
PRINT_SYSTEM_INFO_ONLY: "1"
|
|
||||||
run: |
|
|
||||||
sudo -E nox --force-color -e ${{ inputs.nox-session }}-pkgs -- ${{ matrix.tests-chunk }}
|
|
||||||
|
|
||||||
- name: Run Package Tests
|
|
||||||
env:
|
|
||||||
SKIP_REQUIREMENTS_INSTALL: "1"
|
|
||||||
PRINT_TEST_SELECTION: "0"
|
|
||||||
PRINT_TEST_PLAN_ONLY: "0"
|
|
||||||
PRINT_SYSTEM_INFO: "0"
|
|
||||||
RERUN_FAILURES: "1"
|
|
||||||
GITHUB_ACTIONS_PIPELINE: "1"
|
|
||||||
SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
|
|
||||||
COVERAGE_CONTEXT: ${{ inputs.distro-slug }}
|
|
||||||
run: |
|
|
||||||
sudo -E nox --force-color -e ${{ inputs.nox-session }}-pkgs -- ${{ matrix.tests-chunk }} \
|
|
||||||
${{ matrix.version && format('--prev-version {0}', matrix.version) || ''}}
|
|
||||||
|
|
||||||
- name: Fix file ownership
|
|
||||||
run: |
|
|
||||||
sudo chown -R "$(id -un)" .
|
|
||||||
|
|
||||||
- name: Prepare Test Run Artifacts
|
|
||||||
id: download-artifacts-from-vm
|
|
||||||
if: always()
|
|
||||||
run: |
|
|
||||||
# Delete the salt onedir, we won't need it anymore and it will prevent
|
|
||||||
# from it showing in the tree command below
|
|
||||||
rm -rf artifacts/salt*
|
|
||||||
tree -a artifacts
|
|
||||||
|
|
||||||
- name: Upload Test Run Artifacts
|
|
||||||
if: always()
|
|
||||||
uses: actions/upload-artifact@v3
|
|
||||||
# This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: pkg-testrun-artifacts-${{ inputs.distro-slug }}-${{ matrix.tests-chunk }}
|
|
||||||
path: |
|
|
||||||
artifacts
|
|
||||||
!artifacts/pkg/*
|
|
||||||
!artifacts/salt/*
|
|
||||||
!artifacts/salt-*.tar.*
|
|
||||||
|
|
||||||
report:
|
|
||||||
name: Report
|
|
||||||
runs-on: ubuntu-latest
|
|
||||||
if: always() && inputs.skip-code-coverage == false && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
|
|
||||||
needs:
|
|
||||||
- test
|
|
||||||
- generate-matrix
|
|
||||||
strategy:
|
|
||||||
fail-fast: false
|
|
||||||
matrix:
|
|
||||||
include: ${{ fromJSON(needs.generate-matrix.outputs.pkg-matrix-include) }}
|
|
||||||
|
|
||||||
steps:
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Download Test Run Artifacts
|
|
||||||
id: download-test-run-artifacts
|
|
||||||
uses: actions/download-artifact@v3
|
|
||||||
# This needs to be actions/download-artifact@v3 because we upload multiple artifacts
|
|
||||||
# under the same name something that actions/upload-artifact@v4 does not do.
|
|
||||||
with:
|
|
||||||
name: pkg-testrun-artifacts-${{ inputs.distro-slug }}-${{ matrix.tests-chunk }}
|
|
||||||
path: artifacts
|
|
||||||
|
|
||||||
- name: Show Test Run Artifacts
|
|
||||||
if: always() && steps.download-test-run-artifacts.outcome == 'success'
|
|
||||||
run: |
|
|
||||||
tree -a artifacts
|
|
||||||
|
|
||||||
- name: Set up Python ${{ inputs.python-version }}
|
|
||||||
uses: actions/setup-python@v5
|
|
||||||
with:
|
|
||||||
python-version: "${{ inputs.python-version }}"
|
|
||||||
|
|
||||||
- name: Install Nox
|
|
||||||
run: |
|
|
||||||
python3 -m pip install 'nox==${{ inputs.nox-version }}'
|
|
.github/workflows/test-packages-action-windows.yml (vendored, 260 lines deleted)
@@ -1,260 +0,0 @@
name: Test Artifact
|
|
||||||
|
|
||||||
on:
|
|
||||||
workflow_call:
|
|
||||||
inputs:
|
|
||||||
distro-slug:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The OS slug to run tests against
|
|
||||||
platform:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The platform being tested
|
|
||||||
arch:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The platform arch being tested
|
|
||||||
pkg-type:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The platform arch being tested
|
|
||||||
salt-version:
|
|
||||||
type: string
|
|
||||||
required: true
|
|
||||||
description: The Salt version of the packages to install and test
|
|
||||||
cache-prefix:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: Seed used to invalidate caches
|
|
||||||
testing-releases:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: A JSON list of releases to test upgrades against
|
|
||||||
nox-version:
|
|
||||||
required: true
|
|
||||||
type: string
|
|
||||||
description: The nox version to install
|
|
||||||
python-version:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The python version to run tests with
|
|
||||||
default: "3.10"
|
|
||||||
fips:
|
|
||||||
required: false
|
|
||||||
type: boolean
|
|
||||||
default: false
|
|
||||||
description: Test run with FIPS enabled
|
|
||||||
package-name:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The onedir package name to use
|
|
||||||
default: salt
|
|
||||||
nox-session:
|
|
||||||
required: false
|
|
||||||
type: string
|
|
||||||
description: The nox session to run
|
|
||||||
default: ci-test-onedir
|
|
||||||
skip-code-coverage:
|
|
||||||
required: false
|
|
||||||
type: boolean
|
|
||||||
description: Skip code coverage
|
|
||||||
default: false
|
|
||||||
|
|
||||||
env:
|
|
||||||
COLUMNS: 190
|
|
||||||
AWS_MAX_ATTEMPTS: "10"
|
|
||||||
AWS_RETRY_MODE: "adaptive"
|
|
||||||
PIP_INDEX_URL: https://pypi-proxy.saltstack.net/root/local/+simple/
|
|
||||||
PIP_EXTRA_INDEX_URL: https://pypi.org/simple
|
|
||||||
PIP_DISABLE_PIP_VERSION_CHECK: "1"
|
|
||||||
RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
|
|
||||||
|
|
||||||
jobs:
|
|
||||||
|
|
||||||
generate-matrix:
|
|
||||||
name: Generate Matrix
|
|
||||||
runs-on:
|
|
||||||
# We need to run on our self-hosted runners because we need proper credentials
|
|
||||||
# for boto3 to scan through our repositories.
|
|
||||||
- self-hosted
|
|
||||||
- linux
|
|
||||||
- x86_64
|
|
||||||
outputs:
|
|
||||||
pkg-matrix-include: ${{ steps.generate-pkg-matrix.outputs.matrix }}
|
|
||||||
steps:
|
|
||||||
|
|
||||||
- name: "Throttle Builds"
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"
|
|
||||||
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Setup Python Tools Scripts
|
|
||||||
uses: ./.github/actions/setup-python-tools-scripts
|
|
||||||
with:
|
|
||||||
cache-prefix: ${{ inputs.cache-prefix }}
|
|
||||||
|
|
||||||
- name: Generate Package Test Matrix
|
|
||||||
id: generate-pkg-matrix
|
|
||||||
run: |
|
|
||||||
tools ci pkg-matrix ${{ inputs.fips && '--fips ' || '' }}${{ inputs.distro-slug }} \
|
|
||||||
${{ inputs.pkg-type }} --testing-releases ${{ join(fromJSON(inputs.testing-releases), ' ') }}
|
|
||||||
|
|
||||||
|
|
||||||
test:
|
|
||||||
name: Test
|
|
||||||
runs-on:
|
|
||||||
- self-hosted
|
|
||||||
- linux
|
|
||||||
- bastion
|
|
||||||
timeout-minutes: 120 # 2 Hours - More than this and something is wrong
|
|
||||||
needs:
|
|
||||||
- generate-matrix
|
|
||||||
strategy:
|
|
||||||
fail-fast: false
|
|
||||||
matrix:
|
|
||||||
include: ${{ fromJSON(needs.generate-matrix.outputs.pkg-matrix-include) }}
|
|
||||||
|
|
||||||
steps:
|
|
||||||
|
|
||||||
- name: "Throttle Builds"
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"
|
|
||||||
|
|
||||||
- name: Checkout Source Code
|
|
||||||
uses: actions/checkout@v4
|
|
||||||
|
|
||||||
- name: Download Packages
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-${{ inputs.arch }}-${{ inputs.pkg-type }}
|
|
||||||
path: artifacts/pkg/
|
|
||||||
|
|
||||||
- name: Download Onedir Tarball as an Artifact
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
|
|
||||||
path: artifacts/
|
|
||||||
|
|
||||||
- name: Decompress Onedir Tarball
|
|
||||||
shell: bash
|
|
||||||
run: |
|
|
||||||
python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
|
|
||||||
cd artifacts
|
|
||||||
tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ inputs.platform }}-${{ inputs.arch }}.tar.xz
|
|
||||||
|
|
||||||
- name: List Packages
|
|
||||||
run: |
|
|
||||||
tree artifacts/pkg/
|
|
||||||
|
|
||||||
- name: Download nox.windows.${{ inputs.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
|
|
||||||
uses: actions/download-artifact@v4
|
|
||||||
with:
|
|
||||||
              name: nox-windows-${{ inputs.arch }}-${{ inputs.nox-session }}

      - name: Setup Python Tools Scripts
        uses: ./.github/actions/setup-python-tools-scripts
        with:
          cache-prefix: ${{ inputs.cache-prefix }}

      - name: Get Salt Project GitHub Actions Bot Environment
        run: |
          TOKEN=$(curl -sS -f -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 30")
          SPB_ENVIRONMENT=$(curl -sS -f -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/tags/instance/spb:environment)
          echo "SPB_ENVIRONMENT=$SPB_ENVIRONMENT" >> "$GITHUB_ENV"

      - name: Start VM
        id: spin-up-vm
        run: |
          tools --timestamps vm create --environment "${SPB_ENVIRONMENT}" --retries=2 ${{ inputs.distro-slug }}

      - name: List Free Space
        run: |
          tools --timestamps vm ssh ${{ inputs.distro-slug }} -- df -h || true

      - name: Upload Checkout To VM
        run: |
          tools --timestamps vm rsync ${{ inputs.distro-slug }}

      - name: Decompress .nox Directory
        run: |
          tools --timestamps vm decompress-dependencies ${{ inputs.distro-slug }}

      - name: Downgrade importlib-metadata
        if: ${{ contains(fromJSON('["amazonlinux-2", "centos-7", "debian-10"]'), inputs.distro-slug) && contains(fromJSON('["upgrade-classic", "downgrade-classic"]'), matrix.tests-chunk) }}
        run: |
          # This step can go away once we stop testing classic packages upgrade/downgrades to/from 3005.x
          tools --timestamps vm ssh ${{ inputs.distro-slug }} -- "sudo python3 -m pip install -U 'importlib-metadata<=4.13.0' 'virtualenv<=20.21.1'"

      - name: Show System Info
        run: |
          tools --timestamps --timeout-secs=1800 vm test --skip-requirements-install --print-system-information-only \
            --nox-session=${{ inputs.nox-session }}-pkgs ${{ inputs.distro-slug }} -- ${{ matrix.tests-chunk }}

      - name: Run Package Tests
        run: |
          tools --timestamps --no-output-timeout-secs=1800 --timeout-secs=14400 vm test --skip-requirements-install ${{ matrix.fips && '--fips ' || '' }}\
            --nox-session=${{ inputs.nox-session }}-pkgs --rerun-failures ${{ inputs.distro-slug }} -- ${{ matrix.tests-chunk }} \
            ${{ matrix.version && format('--prev-version {0}', matrix.version) || ''}}

      - name: Download Test Run Artifacts
        id: download-artifacts-from-vm
        if: always() && steps.spin-up-vm.outcome == 'success'
        run: |
          tools --timestamps vm download-artifacts ${{ inputs.distro-slug }}
          # Delete the salt onedir, we won't need it anymore and it will prevent
          # from it showing in the tree command below
          rm -rf artifacts/salt*
          tree -a artifacts

      - name: Destroy VM
        if: always()
        run: |
          tools --timestamps vm destroy --no-wait ${{ inputs.distro-slug }} || true

      - name: Upload Test Run Artifacts
        if: always() && steps.download-artifacts-from-vm.outcome == 'success'
        uses: actions/upload-artifact@v3
        # This needs to be actions/upload-artifact@v3 because we upload multiple artifacts
        # under the same name something that actions/upload-artifact@v4 does not do.
        with:
          name: pkg-testrun-artifacts-${{ inputs.distro-slug }}-${{ matrix.tests-chunk }}
          path: |
            artifacts
            !artifacts/pkg/*
            !artifacts/salt/*
            !artifacts/salt-*.tar.*

  report:
    name: Report
    runs-on: ubuntu-latest
    if: always() && inputs.skip-code-coverage == false && needs.test.result != 'cancelled' && needs.test.result != 'skipped'
    needs:
      - test
      - generate-matrix
    strategy:
      fail-fast: false
      matrix:
        include: ${{ fromJSON(needs.generate-matrix.outputs.pkg-matrix-include) }}

    steps:
      - name: Checkout Source Code
        uses: actions/checkout@v4

      - name: Download Test Run Artifacts
        id: download-test-run-artifacts
        uses: actions/download-artifact@v3
        # This needs to be actions/download-artifact@v3 because we upload multiple artifacts
        # under the same name something that actions/upload-artifact@v4 does not do.
        with:
          name: pkg-testrun-artifacts-${{ inputs.distro-slug }}-${{ matrix.tests-chunk }}
          path: artifacts

      - name: Show Test Run Artifacts
        if: always() && steps.download-test-run-artifacts.outcome == 'success'
        run: |
          tree -a artifacts
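The @v3 pin in the upload/download steps above exists because several matrix chunks publish artifacts under one shared name, which upload-artifact@v4 refuses. For reference, here is a minimal sketch of the v4-era pattern that the new workflow later in this diff adopts instead: unique per-chunk names that a later job merges back together. The artifact name used in this sketch is illustrative only, not taken from the repository.

      - name: Upload Test Run Artifacts
        if: always()
        uses: actions/upload-artifact@v4
        with:
          # A name made unique per matrix chunk keeps upload-artifact@v4 happy.
          name: pkg-testrun-artifacts-example-${{ matrix.tests-chunk }}
          path: artifacts/

      # In a follow-up job, the per-chunk uploads can be combined again:
      - name: Merge Test Run Artifacts
        uses: actions/upload-artifact/merge@v4
        with:
          name: pkg-testrun-artifacts-example
          pattern: pkg-testrun-artifacts-example-*
          delete-merged: true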
511 .github/workflows/test-packages-action.yml vendored Normal file
@@ -0,0 +1,511 @@
---
name: Test Packages

on:
  workflow_call:
    inputs:
      salt-version:
        type: string
        required: true
        description: The Salt version of the packages to install and test
      cache-prefix:
        required: true
        type: string
        description: Seed used to invalidate caches
      testing-releases:
        required: true
        type: string
        description: A JSON list of releases to test upgrades against
      nox-version:
        required: true
        type: string
        description: The nox version to install
      python-version:
        required: false
        type: string
        description: The python version to run tests with
        default: "3.10"
      nox-session:
        required: false
        type: string
        description: The nox session to run
        default: ci-test-onedir
      skip-code-coverage:
        required: false
        type: boolean
        description: Skip code coverage
        default: false
      package-name:
        required: false
        type: string
        description: The onedir package name to use
        default: salt
      matrix:
        required: true
        type: string
        description: Json job matrix config
      linux_arm_runner:
        required: true
        type: string
        description: Json job matrix config

env:
  COLUMNS: 190
  AWS_MAX_ATTEMPTS: "10"
  AWS_RETRY_MODE: "adaptive"
  PIP_INDEX_URL: ${{ vars.PIP_INDEX_URL }}
  PIP_TRUSTED_HOST: ${{ vars.PIP_TRUSTED_HOST }}
  PIP_EXTRA_INDEX_URL: ${{ vars.PIP_EXTRA_INDEX_URL }}
  PIP_DISABLE_PIP_VERSION_CHECK: "1"
  RAISE_DEPRECATIONS_RUNTIME_ERRORS: "1"
  USE_S3_CACHE: 'false'

jobs:

  test-linux:
    name: ${{ matrix.display_name }} ${{ matrix.tests-chunk }}
    runs-on: ${{ matrix.arch == 'x86_64' && 'ubuntu-24.04' || inputs.linux_arm_runner }}
    if: ${{ !cancelled() && toJSON(fromJSON(inputs.matrix)['linux']) != '[]' }}
    timeout-minutes: 120  # 2 Hours - More than this and something is wrong
    strategy:
      fail-fast: false
      matrix:
        include: ${{ fromJSON(inputs.matrix)['linux'] }}
    steps:

      - name: "Throttle Builds"
        shell: bash
        run: |
          t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"

      - name: "Set `TIMESTAMP` environment variable"
        shell: bash
        run: |
          echo "TIMESTAMP=$(date +%s)" | tee -a "$GITHUB_ENV"

      - name: Checkout Source Code
        uses: actions/checkout@v4

      - name: Set up Python ${{ inputs.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: "${{ inputs.python-version }}"

      - name: Setup Python Tools Scripts
        uses: ./.github/actions/setup-python-tools-scripts
        with:
          cache-prefix: ${{ inputs.cache-prefix }}

      - name: Download Packages
        uses: actions/download-artifact@v4
        with:
          name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-${{ matrix.arch }}-${{ matrix.pkg_type }}
          path: artifacts/pkg/

      - name: Download Onedir Tarball as an Artifact
        uses: actions/download-artifact@v4
        with:
          name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ matrix.platform }}-${{ matrix.arch }}.tar.xz
          path: artifacts/

      - name: Decompress Onedir Tarball
        shell: bash
        run: |
          python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
          cd artifacts
          tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ matrix.platform }}-${{ matrix.arch }}.tar.xz

      - name: Install Nox
        run: |
          python3 -m pip install 'nox==${{ inputs.nox-version }}'
        env:
          PIP_INDEX_URL: https://pypi.org/simple

      - name: List Packages
        run: |
          tree artifacts/pkg/

      - name: Download nox.linux.${{ matrix.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
        uses: actions/download-artifact@v4
        with:
          name: nox-linux-${{ matrix.arch }}-${{ inputs.nox-session }}

      - name: "Ensure docker is running"
        run: |
          sudo systemctl start containerd || exit 0

      - name: "Pull container ${{ matrix.container }}"
        run: |
          docker pull ${{ matrix.container }}

      - name: "Create container ${{ matrix.container }}"
        run: |
          tools container create ${{ matrix.container }} --name ${{ github.run_id }}_salt-test-pkg

      - name: "Start container ${{ matrix.container }}"
        run: |
          /usr/bin/docker start ${{ github.run_id }}_salt-test-pkg

      - name: Decompress .nox Directory
        run: |
          docker exec ${{ github.run_id}}_salt-test-pkg python3 -m nox --force-color -e decompress-dependencies -- linux ${{ matrix.arch }}

      - name: List Free Space
        run: |
          df -h || true

      - name: Show System Info
        run: |
          docker exec \
            -e SKIP_REQUIREMENTS_INSTALL=1 \
            -e PRINT_SYSTEM_INFO_ONLY=1 \
            ${{ github.run_id }}_salt-test-pkg python3 -m nox --force-color -e ${{ inputs.nox-session }}-pkgs -- ${{ matrix.tests-chunk }}

      - name: Run Package Tests
        run: |
          docker exec \
            ${{ github.run_id }}_salt-test-pkg \
            python3 -m nox --force-color -e ${{ inputs.nox-session }}-pkgs -- ${{ matrix.tests-chunk }} \
            ${{ matrix.version && format('--prev-version={0}', matrix.version) || ''}}

      - name: Upload Test Run Log Artifacts
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: pkg-testrun-log-artifacts-${{ matrix.slug }}-${{ inputs.nox-session }}${{ matrix.fips && '-fips' || '' }}-${{ matrix.pkg_type }}-${{ matrix.arch }}-${{ matrix.tests-chunk }}-${{ matrix.version || 'no-version'}}-${{ env.TIMESTAMP }}
          path: |
            artifacts/logs
          include-hidden-files: true

      - name: Upload Test Run Artifacts
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: pkg-testrun-artifacts-${{ matrix.slug }}${{ matrix.fips && '-fips' || '' }}-${{ matrix.pkg_type }}-${{ matrix.arch }}-${{ matrix.tests-chunk }}-${{ matrix.version || 'no-version'}}-${{ env.TIMESTAMP }}
          path: |
            artifacts/
            !artifacts/pkg/*
            !artifacts/salt/*
            !artifacts/salt-*.tar.*
          include-hidden-files: true

  test-macos:
    name: ${{ matrix.display_name }} ${{ matrix.tests-chunk }}
    runs-on: ${{ matrix.runner }}
    if: ${{ !cancelled() && toJSON(fromJSON(inputs.matrix)['macos']) != '[]' }}
    timeout-minutes: 150  # 2 & 1/2 Hours - More than this and something is wrong (MacOS needs a little more time)
    strategy:
      fail-fast: false
      matrix:
        include: ${{ fromJSON(inputs.matrix)['macos'] }}
    steps:

      - name: "Throttle Builds"
        shell: bash
        run: |
          t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"

      - name: "Set `TIMESTAMP` environment variable"
        shell: bash
        run: |
          echo "TIMESTAMP=$(date +%s)" | tee -a "$GITHUB_ENV"

      - name: Checkout Source Code
        uses: actions/checkout@v4

      - name: Download Packages
        uses: actions/download-artifact@v4
        with:
          name: salt-${{ inputs.salt-version }}-${{ matrix.arch }}-macos
          path: artifacts/pkg/

      - name: Install System Dependencies
        run: |
          brew install tree pkg-config mysql

      - name: List Packages
        run: |
          tree artifacts/pkg/

      - name: Download Onedir Tarball as an Artifact
        uses: actions/download-artifact@v4
        with:
          name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ matrix.platform }}-${{ matrix.arch }}.tar.xz
          path: artifacts/

      - name: Decompress Onedir Tarball
        shell: bash
        run: |
          python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
          cd artifacts
          tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ matrix.platform }}-${{ matrix.arch }}.tar.xz

      - name: Set up Python ${{ inputs.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: "${{ inputs.python-version }}"

      - name: Install Nox
        run: |
          python3 -m pip install 'nox==${{ inputs.nox-version }}'
        env:
          PIP_INDEX_URL: https://pypi.org/simple

      - name: Download nox.macos.${{ matrix.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
        uses: actions/download-artifact@v4
        with:
          name: nox-macos-${{ matrix.arch }}-${{ inputs.nox-session }}

      - name: Decompress .nox Directory
        run: |
          nox --force-color -e decompress-dependencies -- macos ${{ matrix.arch }}

      - name: Show System Info
        env:
          SKIP_REQUIREMENTS_INSTALL: "1"
          PRINT_SYSTEM_INFO_ONLY: "1"
        run: |
          sudo -E nox --force-color -e ${{ inputs.nox-session }}-pkgs -- ${{ matrix.tests-chunk }}

      - name: Run Package Tests
        env:
          SKIP_REQUIREMENTS_INSTALL: "1"
          PRINT_TEST_SELECTION: "0"
          PRINT_TEST_PLAN_ONLY: "0"
          PRINT_SYSTEM_INFO: "0"
          RERUN_FAILURES: "1"
          GITHUB_ACTIONS_PIPELINE: "1"
          SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
          COVERAGE_CONTEXT: ${{ matrix.slug }}
        run: |
          sudo -E nox --force-color -e ${{ inputs.nox-session }}-pkgs -- ${{ matrix.tests-chunk }} \
            ${{ matrix.version && format('--prev-version={0}', matrix.version) || ''}}

      - name: Fix file ownership
        run: |
          sudo chown -R "$(id -un)" .

      - name: Prepare Test Run Artifacts
        id: download-artifacts-from-vm
        if: always()
        run: |
          # Delete the salt onedir, we won't need it anymore and it will prevent
          # from it showing in the tree command below
          rm -rf artifacts/salt*
          tree -a artifacts

      - name: Upload Test Run Log Artifacts
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: pkg-testrun-log-artifacts-${{ matrix.slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}-${{ matrix.tests-chunk }}-${{ matrix.version || 'no-version'}}-${{ env.TIMESTAMP }}
          path: |
            artifacts/logs
          include-hidden-files: true

      - name: Upload Test Run Artifacts
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: pkg-testrun-artifacts-${{ matrix.slug }}-${{ matrix.pkg_type }}-${{ matrix.arch }}-${{ matrix.tests-chunk }}-${{ matrix.version || 'no-version'}}-${{ env.TIMESTAMP }}
          path: |
            artifacts/
            !artifacts/pkg/*
            !artifacts/salt/*
            !artifacts/salt-*.tar.*
          include-hidden-files: true


  test-windows:
    name: ${{ matrix.display_name }} ${{ matrix.tests-chunk }}
    runs-on: ${{ matrix.slug }}
    timeout-minutes: 120  # 2 Hours - More than this and something is wrong
    if: ${{ !cancelled() && toJSON(fromJSON(inputs.matrix)['windows']) != '[]' }}
    strategy:
      fail-fast: false
      matrix:
        include: ${{ fromJSON(inputs.matrix)['windows'] }}
    steps:

      - name: Set up Python ${{ inputs.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: "${{ inputs.python-version }}"

      - name: "Throttle Builds"
        shell: bash
        run: |
          t=$(python3 -c 'import random, sys; sys.stdout.write(str(random.randint(1, 15)))'); echo "Sleeping $t seconds"; sleep "$t"

      - name: "Set `TIMESTAMP` environment variable"
        shell: bash
        run: |
          echo "TIMESTAMP=$(date +%s)" | tee -a "$GITHUB_ENV"

      - name: Checkout Source Code
        uses: actions/checkout@v4

      - name: Download Packages
        uses: actions/download-artifact@v4
        with:
          name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-${{ matrix.arch }}-${{ matrix.pkg_type }}
          path: ./artifacts/pkg/

      - name: Download Onedir Tarball as an Artifact
        uses: actions/download-artifact@v4
        with:
          name: ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ matrix.platform }}-${{ matrix.arch }}.tar.xz
          path: ./artifacts/

      - name: Decompress Onedir Tarball
        shell: bash
        run: |
          python3 -c "import os; os.makedirs('artifacts', exist_ok=True)"
          cd artifacts
          tar xvf ${{ inputs.package-name }}-${{ inputs.salt-version }}-onedir-${{ matrix.platform }}-${{ matrix.arch }}.tar.xz

      - name: Install Nox
        run: |
          python3 -m pip install 'nox==${{ inputs.nox-version }}'
        env:
          PIP_INDEX_URL: https://pypi.org/simple

      - run: python3 --version

      - name: Download nox.windows.${{ matrix.arch }}.tar.* artifact for session ${{ inputs.nox-session }}
        uses: actions/download-artifact@v4
        with:
          name: nox-windows-${{ matrix.arch }}-${{ inputs.nox-session }}

      - name: Decompress .nox Directory
        run: |
          nox --force-color -e decompress-dependencies -- windows ${{ matrix.arch }}

      - name: List Important Directories
        run: |
          dir d:/
          dir .
          dir artifacts/
          dir artifacts/pkg
          dir .nox/ci-test-onedir/Scripts

      - name: Check onedir python
        continue-on-error: true
        run: |
          artifacts/salt/Scripts/python.exe --version

      - name: Check nox python
        continue-on-error: true
        run: |
          .nox/ci-test-onedir/Scripts/python.exe --version

      - name: Show System Info
        env:
          SKIP_REQUIREMENTS_INSTALL: "1"
          SKIP_CODE_COVERAGE: "1"
          PRINT_SYSTEM_INFO_ONLY: "1"
          PYTHONUTF8: "1"
        run: |
          nox --force-color -f noxfile.py -e "${{ inputs.nox-session }}-pkgs" -- '${{ matrix.tests-chunk }}' --log-cli-level=debug

      - name: Run Package Tests
        env:
          SKIP_REQUIREMENTS_INSTALL: "1"
          PRINT_TEST_SELECTION: "0"
          PRINT_TEST_PLAN_ONLY: "0"
          PRINT_SYSTEM_INFO: "0"
          RERUN_FAILURES: "1"
          GITHUB_ACTIONS_PIPELINE: "1"
          SKIP_INITIAL_ONEDIR_FAILURES: "1"
          SKIP_INITIAL_GH_ACTIONS_FAILURES: "1"
          COVERAGE_CONTEXT: ${{ matrix.slug }}
          OUTPUT_COLUMNS: "190"
          PYTHONUTF8: "1"
        run: >
          nox --force-color -f noxfile.py -e ${{ inputs.nox-session }}-pkgs -- ${{ matrix.tests-chunk }}
          ${{ matrix.version && format('--prev-version={0}', matrix.version) || ''}}

      - name: Prepare Test Run Artifacts
        id: download-artifacts-from-vm
        if: always()
        shell: bash
        run: |
          # Delete the salt onedir, we won't need it anymore and it will prevent
          # from it showing in the tree command below
          rm -rf artifacts/salt*
          if [ "${{ inputs.skip-code-coverage }}" != "true" ]; then
            mv artifacts/coverage/.coverage artifacts/coverage/.coverage.${{ matrix.slug }}.${{ inputs.nox-session }}.${{ matrix.transport }}.${{ matrix.tests-chunk }}
          fi

      - name: Upload Test Run Log Artifacts
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: pkg-testrun-log-artifacts-${{ matrix.slug }}-${{ inputs.nox-session }}-${{ matrix.transport }}-${{ matrix.tests-chunk }}-${{ matrix.version || 'no-version'}}-${{ env.TIMESTAMP }}
          path: |
            artifacts/logs
          include-hidden-files: true

      - name: Upload Test Run Artifacts
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: pkg-testrun-artifacts-${{ matrix.slug }}-${{ matrix.pkg_type }}-${{ matrix.arch }}-${{ matrix.tests-chunk }}-${{ matrix.version || 'no-version'}}-${{ env.TIMESTAMP }}
          path: |
            artifacts/
            !artifacts/pkg/*
            !artifacts/salt/*
            !artifacts/salt-*.tar.*
          include-hidden-files: true

  report:
    name: Report
    runs-on: ubuntu-22.04
    if: ${{ false }}
    needs:
      - test-linux
      - test-macos
      - test-windows
    strategy:
      matrix:
        include: ${{ fromJSON(inputs.matrix)['linux'] }}

    steps:
      - name: Checkout Source Code
        uses: actions/checkout@v4

      - name: "Throttle Builds"
        shell: bash
        run: |
          t=$(shuf -i 1-30 -n 1); echo "Sleeping $t seconds"; sleep "$t"

      - name: Wait For Artifacts
        run: |
          sleep 60

      - name: Merge Test Run Artifacts
        continue-on-error: true
        uses: actions/upload-artifact/merge@v4
        with:
          name: pkg-testrun-artifacts-${{ matrix.slug }}${{ matrix.fips && '-fips' || '' }}-${{ matrix.pkg_type }}
          pattern: pkg-testrun-artifacts-${{ matrix.slug }}${{ matrix.fips && '-fips' || '' }}-${{ matrix.pkg_type }}-*
          separate-directories: true
          delete-merged: true

      - name: Wait For Artifacts 2
        run: |
          sleep 60

      - name: Download Test Run Artifacts
        id: download-test-run-artifacts
        uses: actions/download-artifact@v4
        with:
          path: artifacts/
          pattern: pkg-testrun-artifacts-${{ matrix.slug }}${{ matrix.fips && '-fips' || '' }}-${{ matrix.pkg_type }}*
          merge-multiple: true

      - name: Show Test Run Artifacts
        if: always()
        run: |
          tree -a artifacts
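Because the file above is a reusable workflow exposed through workflow_call, another workflow has to invoke it and supply the declared inputs. A minimal, hypothetical caller sketch follows; the job name, the literal version strings and the prepare-workflow outputs referenced here are placeholders for illustration, not values taken from this diff.

  test-packages:
    name: Test Packages
    uses: ./.github/workflows/test-packages-action.yml
    with:
      salt-version: "3007.1"            # placeholder Salt version
      cache-prefix: example-cache-seed  # placeholder cache seed
      testing-releases: '["3006.9"]'    # JSON list of releases to test upgrades against
      nox-version: "2022.8.7"           # placeholder nox version
      python-version: "3.10"
      nox-session: ci-test-onedir
      skip-code-coverage: true
      package-name: salt
      matrix: ${{ needs.prepare-workflow.outputs.matrix }}                      # hypothetical upstream job output
      linux_arm_runner: ${{ needs.prepare-workflow.outputs.linux_arm_runner }}  # hypothetical upstream job output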
2 .github/workflows/triage.yml vendored
@@ -38,7 +38,7 @@ jobs:
       - name: Download last assignment cache
         continue-on-error: true
-        uses: dawidd6/action-download-artifact@v2
+        uses: dawidd6/action-download-artifact@v8
         with:
           workflow: triage.yml
           name: last-assignment
26 .github/workflows/workflow-finished.yml vendored Normal file
@@ -0,0 +1,26 @@
name: Workflow Finished
run-name: Workflow Finished ${{ github.event.workflow_run.display_title }} (${{ github.event.workflow_run.conclusion }})

on:
  workflow_run:
    workflows: [Nightly, Scheduled, Stage Release]
    types:
      - completed

permissions:
  contents: read
  pull-requests: read
  actions: write

jobs:

  restart-failed-jobs:
    runs-on: ubuntu-latest
    if: ${{ github.event.workflow_run.conclusion == 'failure' && github.event.workflow_run.run_attempt < 5 }}
    steps:
      - name: Restart failed jobs
        env:
          GH_REPO: ${{ github.repository }}
          GH_TOKEN: ${{ github.token }}
        run: |
          gh run rerun ${{ github.event.workflow_run.id }} --failed
27 .github/workflows/workflow-pr-finished.yml vendored Normal file
@@ -0,0 +1,27 @@
name: Workflow PR Finished
run-name: Workflow PR Finished ${{ github.event.workflow_run.display_title }} (${{ github.event.workflow_run.conclusion }})

on:
  workflow_run:
    workflows:
      - CI
    types:
      - completed

permissions:
  contents: read
  pull-requests: read
  actions: write

jobs:

  restart-failed-jobs:
    runs-on: ubuntu-latest
    if: ${{ github.event.workflow_run.conclusion == 'failure' && github.event.workflow_run.run_attempt < 5 }}
    steps:
      - name: Restart failed jobs
        env:
          GH_REPO: ${{ github.repository }}
          GH_TOKEN: ${{ github.token }}
        run: |
          gh run rerun ${{ github.event.workflow_run.id }} --failed
13 .gitignore vendored
@@ -16,13 +16,14 @@ MANIFEST
 .pytest_cache
 Pipfile.lock
 .mypy_cache/*
+.tools-venvs/
 
 # virtualenv
 # - ignores directories of a virtualenv when you create it right on
 # top of salt such as
 # - /some/path$ git clone https://github.com/thatch45/salt.git
 # - /some/path$ virtualenv --python=/usr/bin/python2.6 salt
-/env/
+/.?env/
 /bin/
 /etc/
 /include/
@@ -35,7 +36,7 @@ Pipfile.lock
 /tests/cachedir/
 /tests/unit/templates/roots/
 /var/
-/venv/
+/.?venv/
 
 # setuptools stuff
 *.egg-info
@@ -89,6 +90,7 @@ tests/unit/templates/roots
 # Pycharm
 .idea
 venv/
+.venv/
 
 # VS Code
 .vscode
@@ -110,12 +112,6 @@ tests/integration/cloud/providers/pki/minions
 # Ignore pyenv files
 .python-version
 
-# Kitchen tests files
-.kitchen.local.yml
-kitchen.local.yml
-.kitchen/
-.bundle/
-Gemfile.lock
 /artifacts/
 requirements/static/*/py*/*.log
 
@@ -127,6 +123,7 @@ Session.vim
 
 # Nox requirements archives
 nox.*.tar.bzip2
+nox.*.tar.gz
 nox.*.tar.xz
 
 # Debian packages
108 .gitlab-ci.yml
@@ -1,108 +0,0 @@
---
stages:
  - lint
  - test

include:
  - local: 'cicd/kitchen_template.yml'
  - local: 'cicd/kitchen_testruns.yml'

# pre-commit-run-all:
#   image:
#     name: registry.gitlab.com/saltstack/pop/cicd/containers/ubuntu1804:latest
#     entrypoint: [""]
#   stage: lint
#   variables:
#     PRE_COMMIT_HOME: "${CI_PROJECT_DIR}/pre-commit-cache"
#   only:
#     refs:
#       - merge_requests
#   cache:
#     key: pre-commit-cache
#     paths:
#       - pre-commit-cache/
#   script:
#     - pip3 install pre-commit
#     - pre-commit run -a -v --color always

lint-salt-full:
  image: registry.gitlab.com/saltstack/pop/cicd/containers/ubuntu1804:latest
  stage: lint
  tags:
    - saltstack-internal
  cache:
    key: nox-lint-cache
    paths:
      - .nox
  only:
    refs:
      - schedules
  script:
    - python --version
    - pip3 install -U nox-py2==2019.6.25
    - nox --version
    - nox --install-only -e lint-salt
    - EC=254
    - export PYLINT_REPORT=pylint-report-salt-full.log
    - nox -e lint-salt
    - EC=$?
    - exit $EC

lint-tests-full:
  image: registry.gitlab.com/saltstack/pop/cicd/containers/ubuntu1804:latest
  stage: lint
  tags:
    - saltstack-internal
  cache:
    key: nox-lint-cache
    paths:
      - .nox
  only:
    refs:
      - schedules
  script:
    - python --version
    - pip3 install -U nox-py2==2019.6.25
    - nox --version
    - nox --install-only -e lint-tests
    - EC=254
    - export PYLINT_REPORT=pylint-report-tests-full.log
    - nox -e lint-tests
    - EC=$?
    - exit $EC

docs-build-html:
  image: registry.gitlab.com/saltstack/pop/cicd/containers/ubuntu1804:latest
  stage: test
  tags:
    - saltstack-internal
  cache:
    key: nox-docs-cache
    paths:
      - .nox
  only:
    refs:
      - schedules
  script:
    - python --version
    - pip install -U nox-py2==2019.6.25
    - nox --version
    - nox -e 'docs-html(compress=True)'

docs-build-man-pages:
  image: registry.gitlab.com/saltstack/pop/cicd/containers/ubuntu1804:latest
  stage: test
  tags:
    - saltstack-internal
  cache:
    key: nox-docs-cache
    paths:
      - .nox
  only:
    refs:
      - schedules
  script:
    - python --version
    - pip install -U nox-py2==2019.6.25
    - nox --version
    - nox -e 'docs-man(compress=True, update=False)'
File diff suppressed because it is too large (Load diff)

@@ -1,27 +0,0 @@
[rstcheck]
ignore_directives=
    automodule,
    autoclass,
    autofunction,
    conf_proxy,
    conf_log,
    conf_master,
    conf_minion,
    releasestree,
    jinja_ref,
    salt:event
ignore_roles=
    conf_master,
    conf_minion,
    conf_proxy,
    conf_log,
    formula_url,
    issue,
    pull,
    blob,
    jinja_ref
ignore_substitutions=
    saltrepo,
    repo_primary_branch,
    windownload,
    osxdownloadpy3
112 AUTHORS
@@ -8,114 +8,28 @@ Whos Who in Salt
 The Man With the Plan
 ----------------------------
 
-Thomas S. Hatch is the main developer of Salt. He is the founder, owner,
-maintainer and lead of the Salt project, as well as author of the majority
-of the Salt code and documentation.
+Thomas S. Hatch is the creator of Salt. He was the founder, owner,
+maintainer that lead Salt project, as well as author of the majority
+of initial Salt code and documentation.
 
+SaltStack, Inc. was acquired by VMware in 2020. In 2023, VMware was
+acquired by Broadcom.
+
+The Salt Project core team of developers are employed by Broadcom.
 
 Documentation System
 ----------------------------
 
-The documentation system was put together by Seth House, much of the
-documentation is being maintained by Seth.
+The initial documentation system was put together by Seth House.
 
-Developers
-----------------------------
-
-Aaron Bull Schaefer <aaron@elasticdog.com>
-Aaron Toponce <aaron.toponce@gmail.com>
-Andrew Hammond <andrew.george.hammond@gmail.com>
-Aditya Kulkarni <adi@saltstack.com>
-Alexander Pyatkin <asp@thexyz.net>
-Andre Sachs <andre@sachs.nom.za>
-Andrew Colin Kissa <andrew@topdog.za.net>
-Andrew Kuhnhausen <trane@errstr.com>
-Antti Kaihola <akaihol+github@ambitone.com>
-archme <archme.mail@gmail.com>
-Brad Barden <brad@mifflinet.net>
-Bret Palsson <bretep@gmail.com>
-Brian Wagner <wags@wagsworld.net>
-C. R. Oldham <cr@saltstack.com>
-Carl Loa Odin <carlodin@gmail.com>
-Carlo Pires <carlopires@gmail.com>
-Chris Rebert <chris.rebert@hulu.com>
-Chris Scheller <schelcj@umich.edu>
-Christer Edwards <christer.edwards@gmail.com>
-Clint Savage <herlo1@gmail.com>
-Colton Myers <cmyers@saltstack.com>
-Corey Quinn <corey@sequestered.net>
-Corin Kochenower <ckochenower@saltstack.com>
-Dan Garthwaite <dan@garthwaite.org>
-Daniel Wallace <danielwallace at gtmanfred dot com>
-David Boucha <boucha@gmail.com>
-David Pravec <alekibango@pravec.tk>
-deutsche
-Dmitry Kuzmenko <dkuzmenko@saltstack.com>
-Doug Renn <renn@nestegg.com>
-Eivind Uggedal <eivind@uggedal.com>
-epoelke@gmail.com <epoelke@heartflow.com>
-Eric Poelke <epoelke@gmail.com>
-Erik Nolte <enolte@beyondoblivion.com>
-Evan Borgstrom <evan@fatbox.ca>
-Forrest Alvarez <forrest.alvarez@gmail.com>
-Fred Reimer <freimer@freimer.org>
-Henrik Holmboe <henrik@holmboe.se>
-Gareth J. Greenaway <gareth@wiked.org>
-Jacob Albretsen <jakea@xmission.com>
-Jed Glazner <jglazner@coldcrow.com>
-Jeff Bauer <jbauer@rubic.com>
-Jeff Hutchins <jhutchins@getjive.com>
-Jeffrey C. Ollie <jeff@ocjtech.us>
-Jeff Schroeder <jeffschroeder@computer.org>
-Johnny Bergström
-Jonas Buckner <buckner.jonas@gmail.com>
-Jonathan Harker <k.jonathan.harker@hp.com>
-Joseph Hall <joseph@saltstack.com>
-Josmar Dias <josmarnet@gmail.com>
-Kent Tenney <ktenney@gmail.com>
-lexual
-Marat Shakirov
-Marc Abramowitz <marc+github@marc-abramowitz.com>
-Martin Schnabel <mb0@mb0.org>
-Mathieu Le Marec - Pasquet <kiorky@cryptelium.net>
-Matt Black
-Matthew Printz <hipokrit@gmail.com>
-Matthias Teege <matthias-git@mteege.de>
-Maxim Burgerhout <maxim@wzzrd.com>
-Mickey Malone <mickey.malone@gmail.com>
-Michael Steed <msteed@saltstack.com>
-Mike Place <mp@saltstack.com>
-Mircea Ulinic <ping@mirceaulinic.net>
-Mitch Anderson <mitch@metauser.net>
-Mostafa Hussein <mostafa.hussein91@gmail.com>
-Nathaniel Whiteinge <seth@eseth.com>
-Nicolas Delaby <nicolas.delaby@ezeep.com>
-Nicole Thomas <nicole@saltstack.com>
-Nigel Owen <nigelowen2.gmail.com>
-Nitin Madhok <nmadhok@g.clemson.edu>
-Oleg Anashkin <oleg.anashkin@gmail.com>
-Pedro Algarvio <pedro@algarvio.me>
-Peter Baumgartner
-Pierre Carrier <pierre@spotify.com>
-Rhys Elsmore <me@rhys.io>
-Rafael Caricio <rafael@caricio.com>
-Robert Fielding
-Sean Channel <pentabular@gmail.com>
-Seth House <seth@eseth.com>
-Seth Vidal <skvidal@fedoraproject.org>
-Stas Alekseev <stas.alekseev@gmail.com>
-Thibault Cohen <titilambert@gmail.com>
-Thomas Schreiber <tom@rizumu.us>
-Thomas S Hatch <thatch45@gmail.com>
-Tor Hveem <xt@bash.no>
-Travis Cline <travis.cline@gmail.com>
-Wieland Hoffmann <themineo+github@gmail.com>
-
+Documentation is now primarily maintained by the Salt Project core team and
+community members.
 
 Growing Community
 --------------------------------
 
-Salt is a rapidly growing project with a large community, to view all
-contributors please check Github, this file can sometimes be out of date:
+Salt is a rapidly growing project with a large community, and has had more than
+2,400 contributors over the years. To view all contributors, please check Github:
 
 https://github.com/saltstack/salt/graphs/contributors
 
529 CHANGELOG.md
@@ -7,6 +7,535 @@ Versions are `MAJOR.PATCH`.
|
|
||||||
# Changelog
|
# Changelog
|
||||||
|
|
||||||
|
## 3007.1 (2024-05-19)
|
||||||
|
|
||||||
|
|
||||||
|
### Removed
|
||||||
|
|
||||||
|
- The ``salt.utils.psutil_compat`` was deprecated and now removed in Salt 3008. Please use the ``psutil`` module directly. [#66160](https://github.com/saltstack/salt/issues/66160)
|
||||||
|
## 3006.9 (2024-07-29)
|
||||||
|
|
||||||
|
|
||||||
|
### Deprecated
|
||||||
|
|
||||||
|
- Drop CentOS 7 support [#66623](https://github.com/saltstack/salt/issues/66623)
|
||||||
|
- No longer build RPM packages with CentOS Stream 9 [#66624](https://github.com/saltstack/salt/issues/66624)
|
||||||
|
|
||||||
|
|
||||||
|
### Fixed
|
||||||
|
|
||||||
|
- Made slsutil.renderer work with salt-ssh [#50196](https://github.com/saltstack/salt/issues/50196)
|
||||||
|
- Fixed defaults.merge is not available when using salt-ssh [#51605](https://github.com/saltstack/salt/issues/51605)
|
||||||
|
- Fixed config.get does not support merge option with salt-ssh [#56441](https://github.com/saltstack/salt/issues/56441)
|
||||||
|
- Update to include croniter in pkg requirements [#57649](https://github.com/saltstack/salt/issues/57649)
|
||||||
|
- Fixed state.test does not work with salt-ssh [#61100](https://github.com/saltstack/salt/issues/61100)
|
||||||
|
- Made slsutil.findup work with salt-ssh [#61143](https://github.com/saltstack/salt/issues/61143)
|
||||||
|
- Fixes multiple issues with the cmd module on Windows. Scripts are called using
|
||||||
|
the ``-File`` parameter to the ``powershell.exe`` binary. ``CLIXML`` data in
|
||||||
|
stderr is now removed (only applies to encoded commands). Commands can now be
|
||||||
|
sent to ``cmd.powershell`` as a list. Makes sure JSON data returned is valid.
|
||||||
|
Strips whitespace from the return when using ``runas``. [#61166](https://github.com/saltstack/salt/issues/61166)
|
||||||
|
- Fixed the win_lgpo_netsh salt util to handle non-English systems. This was a
|
||||||
|
rewrite to use PowerShell instead of netsh to make the changes on the system [#61534](https://github.com/saltstack/salt/issues/61534)
|
||||||
|
- Fix typo in nftables module to ensure unique nft family values [#65295](https://github.com/saltstack/salt/issues/65295)
|
||||||
|
- Corrected x509_v2 CRL creation `last_update` and `next_update` values when system timezone is not UTC [#65837](https://github.com/saltstack/salt/issues/65837)
|
||||||
|
- Fix for NoneType can't be used in 'await' expression error. [#66177](https://github.com/saltstack/salt/issues/66177)
|
||||||
|
- Log "Publish server binding pub to" messages to debug instead of error level. [#66179](https://github.com/saltstack/salt/issues/66179)
|
||||||
|
- Fix syndic startup by making payload handler a coroutine [#66237](https://github.com/saltstack/salt/issues/66237)
|
||||||
|
- Fixed `aptpkg.remove` "unable to locate package" error for non-existent package [#66260](https://github.com/saltstack/salt/issues/66260)
|
||||||
|
- Fixed pillar.ls doesn't accept kwargs [#66262](https://github.com/saltstack/salt/issues/66262)
|
||||||
|
- Fix cache directory setting in Master Cluster tutorial [#66264](https://github.com/saltstack/salt/issues/66264)
|
||||||
|
- Change log level of successful master cluster key exchange from error to info. [#66266](https://github.com/saltstack/salt/issues/66266)
|
||||||
|
- Made `file.managed` skip download of a remote source if the managed file already exists with the correct hash [#66342](https://github.com/saltstack/salt/issues/66342)
|
||||||
|
- Fixed nftables.build_rule breaks ipv6 rules by using the wrong syntax for source and destination addresses [#66382](https://github.com/saltstack/salt/issues/66382)
|
||||||
|
- file.replace and file.search work properly with /proc files [#63102](https://github.com/saltstack/salt/issues/63102)
|
||||||
|
- Fix utf8 handling in 'pass' renderer [#64300](https://github.com/saltstack/salt/issues/64300)
|
||||||
|
- Fixed incorrect version argument will be ignored for multiple package targets warning when using pkgs argument to yumpkg module. [#64563](https://github.com/saltstack/salt/issues/64563)
|
||||||
|
- salt-cloud honors root_dir config setting for log_file location and fixes for root_dir locations on windows. [#64728](https://github.com/saltstack/salt/issues/64728)
|
||||||
|
- Fixed slsutil.update with salt-ssh during template rendering [#65067](https://github.com/saltstack/salt/issues/65067)
|
||||||
|
- Fix config.items when called on minion [#65251](https://github.com/saltstack/salt/issues/65251)
|
||||||
|
- Ensure on rpm and deb systems, that user and group for existing Salt, is maintained on upgrade [#65264](https://github.com/saltstack/salt/issues/65264)
|
||||||
|
- Fix typo in nftables module to ensure unique nft family values [#65295](https://github.com/saltstack/salt/issues/65295)
|
||||||
|
- pkg.installed state aggregate does not honors requires requisite [#65304](https://github.com/saltstack/salt/issues/65304)
|
||||||
|
- Added SSH wrapper for logmod [#65630](https://github.com/saltstack/salt/issues/65630)
|
||||||
|
- Fix for GitFS failure to unlock lock file, and resource cleanup for process SIGTERM [#65816](https://github.com/saltstack/salt/issues/65816)
|
||||||
|
- Corrected x509_v2 CRL creation `last_update` and `next_update` values when system timezone is not UTC [#65837](https://github.com/saltstack/salt/issues/65837)
|
||||||
|
- Make sure the root minion process handles SIGUSR1 and emits a traceback like it's child processes [#66095](https://github.com/saltstack/salt/issues/66095)
|
||||||
|
- Replaced pyvenv with builtin venv for virtualenv_mod [#66132](https://github.com/saltstack/salt/issues/66132)
|
||||||
|
- Made `file.managed` skip download of a remote source if the managed file already exists with the correct hash [#66342](https://github.com/saltstack/salt/issues/66342)
|
||||||
|
- Fix win_task ExecutionTimeLimit and result/error code interpretation [#66347](https://github.com/saltstack/salt/issues/66347), [#66441](https://github.com/saltstack/salt/issues/66441)
|
||||||
|
- Fixed nftables.build_rule breaks ipv6 rules by using the wrong syntax for source and destination addresses [#66382](https://github.com/saltstack/salt/issues/66382)
|
||||||
|
- Fixed x509_v2 certificate.managed crash for locally signed certificates if the signing policy defines signing_private_key [#66414](https://github.com/saltstack/salt/issues/66414)
|
||||||
|
- Fixed parallel state execution with Salt-SSH [#66514](https://github.com/saltstack/salt/issues/66514)
|
||||||
|
- Fix support for FIPS approved encryption and signing algorithms. [#66579](https://github.com/saltstack/salt/issues/66579)
|
||||||
|
- Fix relative file_roots paths [#66588](https://github.com/saltstack/salt/issues/66588)
|
||||||
|
- Fixed an issue with cmd.run with requirements when the shell is not the
|
||||||
|
default [#66596](https://github.com/saltstack/salt/issues/66596)
|
||||||
|
- Fix RPM package provides [#66604](https://github.com/saltstack/salt/issues/66604)
|
||||||
|
- Upgrade relAenv to 0.16.1. This release fixes several package installs for salt-pip [#66632](https://github.com/saltstack/salt/issues/66632)
|
||||||
|
- Upgrade relenv to 0.17.0 (https://github.com/saltstack/relenv/blob/v0.17.0/CHANGELOG.md) [#66663](https://github.com/saltstack/salt/issues/66663)
|
||||||
|
- Upgrade dependencies due to security issues:
|
||||||
|
- pymysql>=1.1.1
|
||||||
|
- requests>=2.32.0
|
||||||
|
- docker>=7.1.0 [#66666](https://github.com/saltstack/salt/issues/66666)
|
||||||
|
- Corrected missed line in branch 3006.x when backporting from PR 61620 and 65044 [#66683](https://github.com/saltstack/salt/issues/66683)
|
||||||
|
- Remove debug output from shell scripts for packaging [#66747](https://github.com/saltstack/salt/issues/66747)
|
||||||
|
|
||||||
|
|
||||||
|
### Added
|
||||||
|
|
||||||
|
- Added the ability to pass a version of chocolatey to install to the
|
||||||
|
chocolatey.bootstrap function. Also added states to bootstrap and
|
||||||
|
unbootstrap chocolatey. [#64722](https://github.com/saltstack/salt/issues/64722)
|
||||||
|
- Add Ubuntu 24.04 support [#66180](https://github.com/saltstack/salt/issues/66180)
|
||||||
|
- Add Fedora 40 support, replacing Fedora 39 [#66300](https://github.com/saltstack/salt/issues/66300)
|
||||||
|
- Add Ubuntu 24.04 support [#66180](https://github.com/saltstack/salt/issues/66180)
|
||||||
|
- Add Fedora 40 support, replacing Fedora 39 [#66300](https://github.com/saltstack/salt/issues/66300)
|
||||||
|
- Build RPM packages with Rocky Linux 9 (instead of CentOS Stream 9) [#66624](https://github.com/saltstack/salt/issues/66624)
|
||||||
|
|
||||||
|
|
||||||
|
### Security
|
||||||
|
|
||||||
|
- Bump to `pydantic==2.6.4` due to https://github.com/advisories/GHSA-mr82-8j83-vxmv [#66433](https://github.com/saltstack/salt/issues/66433)
|
||||||
|
- Bump to ``jinja2==3.1.4`` due to https://github.com/advisories/GHSA-h75v-3vvj-5mfj [#66488](https://github.com/saltstack/salt/issues/66488)
|
||||||
|
- Bump to ``jinja2==3.1.4`` due to https://github.com/advisories/GHSA-h75v-3vvj-5mfj [#66488](https://github.com/saltstack/salt/issues/66488)
|
||||||
|
- CVE-2024-37088 salt-call will fail with exit code 1 if bad pillar data is
|
||||||
|
encountered. [#66702](https://github.com/saltstack/salt/issues/66702)
|
||||||
|
|
||||||
|
|
||||||
|
## 3006.8 (2024-04-29)
|
||||||
|
|
||||||
|
|
||||||
|
### Removed
|
||||||
|
|
||||||
|
- Removed deprecated code scheduled to be removed on 2024-01-01:
|
||||||
|
|
||||||
|
* ``TemporaryLoggingHandler`` and ``QueueHandler`` in ``salt/_logging/handlers.py``
|
||||||
|
* All of the ``salt/log`` package.
|
||||||
|
* The ``salt/modules/cassandra_mod.py`` module.
|
||||||
|
* The ``salt/returners/cassandra_return.py`` returner.
|
||||||
|
* The ``salt/returners/django_return.py`` returner. [#66147](https://github.com/saltstack/salt/issues/66147)
|
||||||
|
|
||||||
|
|
||||||
|
### Deprecated
|
||||||
|
|
||||||
|
- Drop Fedora 37 and Fedora 38 support [#65860](https://github.com/saltstack/salt/issues/65860)
|
||||||
|
- Drop CentOS Stream 8 and 9 from CI/CD [#66104](https://github.com/saltstack/salt/issues/66104)
|
||||||
|
- Drop Photon OS 3 support [#66105](https://github.com/saltstack/salt/issues/66105)
|
||||||
|
- The ``salt.utils.psutil_compat`` module has been deprecated and will be removed in Salt 3008. Please use the ``psutil`` module directly. [#66139](https://github.com/saltstack/salt/issues/66139)
|
||||||
|
|
||||||
|
|
||||||
|
### Fixed
|
||||||
|
|
||||||
|
- ``user.add`` on Windows now allows you to add user names that contain all
|
||||||
|
numeric characters [#53363](https://github.com/saltstack/salt/issues/53363)
|
||||||
|
- Fix an issue with the win_system module detecting established connections on
|
||||||
|
non-Windows systems. Uses psutils instead of parsing the return of netstat [#60508](https://github.com/saltstack/salt/issues/60508)
|
||||||
|
- pkg.refresh_db on Windows now honors saltenv [#61807](https://github.com/saltstack/salt/issues/61807)
|
||||||
|
- Fixed an issue with adding new machine policies and applying those same
|
||||||
|
policies in the same state by adding a ``refresh_cache`` option to the
|
||||||
|
``lgpo.set`` state. [#62734](https://github.com/saltstack/salt/issues/62734)
|
||||||
|
- file.managed correctly handles file path with '#' [#63060](https://github.com/saltstack/salt/issues/63060)
|
||||||
|
- Fix master ip detection when DNS records change [#63654](https://github.com/saltstack/salt/issues/63654)
|
||||||
|
- Fix user and group management on Windows to handle the Everyone group [#63667](https://github.com/saltstack/salt/issues/63667)
|
||||||
|
- Fixes an issue in pkg.refresh_db on Windows where new package definition
|
||||||
|
files were not being picked up on the first run [#63848](https://github.com/saltstack/salt/issues/63848)
|
||||||
|
- Display a proper error when pki commands fail in the win_pki module [#64933](https://github.com/saltstack/salt/issues/64933)
|
||||||
|
- Prevent full system upgrade on single package install for Arch Linux [#65200](https://github.com/saltstack/salt/issues/65200)
|
||||||
|
- When using s3fs, if files are deleted from the bucket, they were not deleted in
|
||||||
|
the master or minion local cache, which could lead to unexpected file copies or
|
||||||
|
even state applications. This change makes the local cache consistent with the
|
||||||
|
remote bucket by deleting files locally that are deleted from the bucket.
|
||||||
|
|
||||||
|
**NOTE** this could lead to **breakage** on your affected systems if it was
|
||||||
|
inadvertently depending on previously deleted files. [#65611](https://github.com/saltstack/salt/issues/65611)
|
||||||
|
- Fixed an issue with file.directory state where paths would be modified in test
|
||||||
|
mode if backupname is used. [#66049](https://github.com/saltstack/salt/issues/66049)
|
||||||
|
- Execution modules have access to regular fileclient durring pillar rendering. [#66124](https://github.com/saltstack/salt/issues/66124)
|
||||||
|
- Fixed a issue with server channel where a minion's public key
|
||||||
|
would be rejected if it contained a final newline character. [#66126](https://github.com/saltstack/salt/issues/66126)
|
||||||
|
- Fix content type backwards compatablity with http proxy post requests in the http utils module. [#66127](https://github.com/saltstack/salt/issues/66127)
|
||||||
|
- Fix systemctl with "try-restart" instead of "retry-restart" within the RPM spec, properly restarting upgraded services [#66143](https://github.com/saltstack/salt/issues/66143)
|
||||||
|
- Auto discovery of ssh, scp and ssh-keygen binaries. [#66205](https://github.com/saltstack/salt/issues/66205)
|
||||||
|
- Add leading slash to salt helper file paths as per dh_links requirement [#66280](https://github.com/saltstack/salt/issues/66280)
|
||||||
|
- Fixed x509.certificate_managed - ca_server did not return a certificate [#66284](https://github.com/saltstack/salt/issues/66284)
|
||||||
|
- removed log line that did nothing. [#66289](https://github.com/saltstack/salt/issues/66289)
|
||||||
|
- Chocolatey: Make sure the return dictionary from ``chocolatey.version``
|
||||||
|
contains lowercase keys [#66290](https://github.com/saltstack/salt/issues/66290)
|
||||||
|
- fix cacheing inline pillar, by not rendering inline pillar during cache save function. [#66292](https://github.com/saltstack/salt/issues/66292)
|
||||||
|
- The file module correctly perserves file permissions on link target. [#66400](https://github.com/saltstack/salt/issues/66400)
|
||||||
|
- Upgrade relenv to 0.16.0 and python to 3.10.14 [#66402](https://github.com/saltstack/salt/issues/66402)
|
||||||
|
- backport the fix from #66164 to fix #65703. use OrderedDict to fix bad indexing. [#66705](https://github.com/saltstack/salt/issues/66705)
|
||||||
|
|
||||||
|
|
||||||
|
### Added
|
||||||
|
|
||||||
|
- Add Fedora 39 support [#65859](https://github.com/saltstack/salt/issues/65859)
|
||||||
|
|
||||||
|
|
||||||
|
### Security
|
||||||
|
|
||||||
|
- Upgrade to `cryptography==42.0.5` due to a few security issues:
|
||||||
|
|
||||||
|
* https://github.com/advisories/GHSA-9v9h-cgj8-h64p
|
||||||
|
* https://github.com/advisories/GHSA-3ww4-gg4f-jr7f
|
||||||
|
* https://github.com/advisories/GHSA-6vqw-3v5j-54x4 [#66141](https://github.com/saltstack/salt/issues/66141)
|
||||||
|
- Bump to `idna==3.7` due to https://github.com/advisories/GHSA-jjg7-2v4v-x38h [#66377](https://github.com/saltstack/salt/issues/66377)
|
||||||
|
- Bump to `aiohttp==3.9.4` due to https://github.com/advisories/GHSA-7gpw-8wmc-pm8g [#66411](https://github.com/saltstack/salt/issues/66411)
|
||||||
|
|
||||||
|
|
||||||
|
## 3007.0 (2024-03-03)
|
||||||
|
|
||||||
|
|
||||||
|
### Removed
|
||||||
|
|
||||||
|
- Removed RHEL 5 support since long since end-of-lifed [#62520](https://github.com/saltstack/salt/issues/62520)
|
||||||
|
- Removing Azure-Cloud modules from the code base. [#64322](https://github.com/saltstack/salt/issues/64322)
|
||||||
|
- Dropped Python 3.7 support since it's EOL in 27 Jun 2023 [#64417](https://github.com/saltstack/salt/issues/64417)
|
||||||
|
- Remove salt.payload.Serial [#64459](https://github.com/saltstack/salt/issues/64459)
|
||||||
|
- Remove netmiko_conn and pyeapi_conn from salt.modules.napalm_mod [#64460](https://github.com/saltstack/salt/issues/64460)
|
||||||
|
- Removed 'transport' arg from salt.utils.event.get_event [#64461](https://github.com/saltstack/salt/issues/64461)
|
||||||
|
- Removed the usage of retired Linode API v3 from Salt Cloud [#64517](https://github.com/saltstack/salt/issues/64517)
|
||||||
|
|
||||||
|
|
||||||
|
### Deprecated
|
||||||
|
|
||||||
|
- Deprecate all Proxmox cloud modules [#64224](https://github.com/saltstack/salt/issues/64224)
|
||||||
|
- Deprecate all the Vault modules in favor of the Vault Salt Extension https://github.com/salt-extensions/saltext-vault. The Vault modules will be removed in Salt core in 3009.0. [#64893](https://github.com/saltstack/salt/issues/64893)
|
||||||
|
- Deprecate all the Docker modules in favor of the Docker Salt Extension https://github.com/saltstack/saltext-docker. The Docker modules will be removed in Salt core in 3009.0. [#64894](https://github.com/saltstack/salt/issues/64894)
|
||||||
|
- Deprecate all the Zabbix modules in favor of the Zabbix Salt Extension https://github.com/salt-extensions/saltext-zabbix. The Zabbix modules will be removed in Salt core in 3009.0. [#64896](https://github.com/saltstack/salt/issues/64896)
|
||||||
|
- Deprecate all the Apache modules in favor of the Apache Salt Extension https://github.com/salt-extensions/saltext-apache. The Apache modules will be removed in Salt core in 3009.0. [#64909](https://github.com/saltstack/salt/issues/64909)
|
||||||
|
- Deprecation warning for Salt's backport of ``OrderedDict`` class which will be removed in 3009 [#65542](https://github.com/saltstack/salt/issues/65542)
|
||||||
|
- Deprecate Kubernetes modules for move to saltext-kubernetes in version 3009 [#65565](https://github.com/saltstack/salt/issues/65565)
|
||||||
|
- Deprecated all Pushover modules in favor of the Salt Extension at https://github.com/salt-extensions/saltext-pushover. The Pushover modules will be removed from Salt core in 3009.0 [#65567](https://github.com/saltstack/salt/issues/65567)
|
||||||
|
- Removed deprecated code:
|
||||||
|
|
||||||
|
* All of ``salt/log/`` which has been on a deprecation path for a long time.
|
||||||
|
* Some of the logging handlers found in ``salt/_logging/handlers`` have been removed since the standard library provides
|
||||||
|
them.
|
||||||
|
* Removed the deprecated ``salt/modules/cassandra_mod.py`` module and any tests for it.
|
||||||
|
* Removed the deprecated ``salt/returners/cassandra_return.py`` module and any tests for it.
|
||||||
|
* Removed the deprecated ``salt/returners/django_return.py`` module and any tests for it. [#65986](https://github.com/saltstack/salt/issues/65986)
|
||||||
|
|
||||||
|
|
||||||
|
### Changed
|
||||||
|
|
||||||
|
- Masquerade property will not default to false turning off masquerade if not specified. [#53120](https://github.com/saltstack/salt/issues/53120)
|
||||||
|
- Addressed Python 3.11 deprecations:
|
||||||
|
|
||||||
|
* Switch to `FullArgSpec` since Py 3.11 no longer has `ArgSpec`, deprecated since Py 3.0
|
||||||
|
* Stopped using the deprecated `cgi` module.
|
||||||
|
* Stopped using the deprecated `pipes` module
|
||||||
|
* Stopped using the deprecated `imp` module [#64457](https://github.com/saltstack/salt/issues/64457)
|
||||||
|
- changed 'gpg_decrypt_must_succeed' default from False to True [#64462](https://github.com/saltstack/salt/issues/64462)

### Fixed

- When an NFS or FUSE mount fails to unmount when mount options have changed, try again with a lazy umount before mounting again. [#18907](https://github.com/saltstack/salt/issues/18907)
- Fix auto-accepting GPG keys by supporting it in the refresh_db module [#42039](https://github.com/saltstack/salt/issues/42039)
- Made cmd.script work with files from the fileserver via salt-ssh [#48067](https://github.com/saltstack/salt/issues/48067)
- Made slsutil.renderer work with salt-ssh [#50196](https://github.com/saltstack/salt/issues/50196)
- Fixed defaults.merge is not available when using salt-ssh [#51605](https://github.com/saltstack/salt/issues/51605)
- Fix extfs.mkfs missing parameter handling for -C, -d, and -e [#51858](https://github.com/saltstack/salt/issues/51858)
- Fixed Salt master does not renew token [#51986](https://github.com/saltstack/salt/issues/51986)
- Fixed salt-ssh continues state/pillar rendering with incorrect data when an exception is raised by a module on the target [#52452](https://github.com/saltstack/salt/issues/52452)
- Fix extfs.tune has 'reserved' documented twice and is missing the 'reserved_percentage' keyword argument [#54426](https://github.com/saltstack/salt/issues/54426)
- Fix the ability of the 'selinux.port_policy_present' state to modify. [#55687](https://github.com/saltstack/salt/issues/55687)
- Fixed config.get does not support merge option with salt-ssh [#56441](https://github.com/saltstack/salt/issues/56441)
- Removed an unused assignment in file.patch [#57204](https://github.com/saltstack/salt/issues/57204)
- Fixed vault module fetching more than one secret in one run with single-use tokens [#57561](https://github.com/saltstack/salt/issues/57561)
- Use the brew path from `which` in the mac_brew_pkg module and rely on _homebrew_bin() every time [#57946](https://github.com/saltstack/salt/issues/57946)
- Fixed Vault verify option to work on minions when only specified in master config [#58174](https://github.com/saltstack/salt/issues/58174)
- Fixed vault command errors configured locally [#58580](https://github.com/saltstack/salt/issues/58580)
- Fixed issue with basic auth causing invalid header error and 401 Bad Request, by using HTTPBasicAuthHandler instead of header. [#58936](https://github.com/saltstack/salt/issues/58936)
- Make the LXD module work with pyLXD > 2.10 [#59514](https://github.com/saltstack/salt/issues/59514)
- Return an error if the patch file passed to the state file.patch is malformed. [#59806](https://github.com/saltstack/salt/issues/59806)
- Handle failure and error information from tuned module/state [#60500](https://github.com/saltstack/salt/issues/60500)
- Fixed sdb.get_or_set_hash with Vault single-use tokens [#60779](https://github.com/saltstack/salt/issues/60779)
- Fixed state.test does not work with salt-ssh [#61100](https://github.com/saltstack/salt/issues/61100)
- Made slsutil.findup work with salt-ssh [#61143](https://github.com/saltstack/salt/issues/61143)
- Allow all primitive grain types for autosign_grains [#61416](https://github.com/saltstack/salt/issues/61416), [#63708](https://github.com/saltstack/salt/issues/63708)
- `ipset.new_set` no longer fails when creating a set type that uses the `family` create option [#61620](https://github.com/saltstack/salt/issues/61620)
- Fixed Vault session storage to allow unlimited use tokens [#62380](https://github.com/saltstack/salt/issues/62380)
- Fix the efi grain on FreeBSD [#63052](https://github.com/saltstack/salt/issues/63052)
- Fixed gpg.receive_keys returns success on failed import [#63144](https://github.com/saltstack/salt/issues/63144)
- Fixed GPG state module always reports success without changes [#63153](https://github.com/saltstack/salt/issues/63153)
- Fixed GPG state module does not respect test mode [#63156](https://github.com/saltstack/salt/issues/63156)
- Fixed gpg.absent with gnupghome/user, fixed gpg.delete_key with gnupghome [#63159](https://github.com/saltstack/salt/issues/63159)
- Fixed service module does not handle enable/disable if systemd service is an alias [#63214](https://github.com/saltstack/salt/issues/63214)
- Made x509_v2 compound match detection use the new runner instead of peer publishing [#63278](https://github.com/saltstack/salt/issues/63278)
- Ensure __pillar__ is updated during a pillar refresh so that process_beacons has the updated beacons loaded from pillar. [#63583](https://github.com/saltstack/salt/issues/63583)
- Implement the vpc_uuid parameter when creating a droplet; it selects the correct virtual private cloud (private network interface). [#63714](https://github.com/saltstack/salt/issues/63714)
- pkg.installed no longer reports failure when installing packages that are installed via the task manager [#63767](https://github.com/saltstack/salt/issues/63767)
- mac_xattr.list and mac_xattr.read will replace undecodable bytes to avoid raising CommandExecutionError. [#63779](https://github.com/saltstack/salt/issues/63779)
- Fix aptpkg.latest_version performance, reducing the number of times to 'shell out' [#63982](https://github.com/saltstack/salt/issues/63982)
- Added option to use a fresh connection for mysql cache [#63991](https://github.com/saltstack/salt/issues/63991)
- [lxd] Fixed a bug in `container_create` which prevented devices which are not of type `disk` from being correctly created and added to the container when passed via the `devices` parameter. [#63996](https://github.com/saltstack/salt/issues/63996)
- Skipped the `isfile` check to greatly increase the speed of reading minion keys for systems with a large number of minions on slow file storage [#64260](https://github.com/saltstack/salt/issues/64260)
- Fix utf8 handling in 'pass' renderer [#64300](https://github.com/saltstack/salt/issues/64300)
- Upgrade tornado to 6.3.2 [#64305](https://github.com/saltstack/salt/issues/64305)
- Prevent errors due to missing 'transactional_update.apply' on SLE Micro and MicroOS. [#64369](https://github.com/saltstack/salt/issues/64369)
- Fix 'unable to unmount' failure to return a False result instead of None [#64420](https://github.com/saltstack/salt/issues/64420)
- Fixed issue uninstalling duplicate packages in ``win_appx`` execution module [#64450](https://github.com/saltstack/salt/issues/64450)
- Clean up tech debt: IPC now uses the tcp transport. [#64488](https://github.com/saltstack/salt/issues/64488)
- Made salt-ssh more strict when handling unexpected situations and state.* wrappers treat a remote exception as failure, excluded salt-ssh error returns from mine [#64531](https://github.com/saltstack/salt/issues/64531)
- Fix flaky test for LazyLoader with isolated mocking of threading.RLock [#64567](https://github.com/saltstack/salt/issues/64567)
- Fix possible `KeyError` exceptions in `salt.utils.user.get_group_dict` while reading an improper duplicated GID assigned for the user. [#64599](https://github.com/saltstack/salt/issues/64599)
- Changed vm_config() to deep-merge the vm_overrides of a specific VM, instead of simple-merging the whole vm_overrides [#64610](https://github.com/saltstack/salt/issues/64610)
- Fix the way Salt tries to get the Homebrew prefix. The first attempt is to look for the `HOMEBREW_PREFIX` environment variable. If it's not set, Salt tries to get the prefix from the `brew` command. However, the `brew` command can fail, so a last attempt is made to get the prefix by guessing the installation path (see the sketch after this list). [#64924](https://github.com/saltstack/salt/issues/64924)
- Add missing MySQL Grant SERVICE_CONNECTION_ADMIN to mysql module. [#64934](https://github.com/saltstack/salt/issues/64934)
- Fixed slsutil.update with salt-ssh during template rendering [#65067](https://github.com/saltstack/salt/issues/65067)
- Keep track when an included file only includes sls files but is a requisite. [#65080](https://github.com/saltstack/salt/issues/65080)
- Fixed `gpg.present` succeeds when the keyserver is unreachable [#65169](https://github.com/saltstack/salt/issues/65169)
- Fix typo in nftables module to ensure unique nft family values [#65295](https://github.com/saltstack/salt/issues/65295)
- Dereference symlinks to set the proper __cli opt [#65435](https://github.com/saltstack/salt/issues/65435)
- Made salt-ssh merge master top returns for the same environment [#65480](https://github.com/saltstack/salt/issues/65480)
- Account for the situation where the metadata grain fails because the AWS environment requires an authentication token to query the metadata URL. [#65513](https://github.com/saltstack/salt/issues/65513)
- Improve the condition for overriding the target for pip with the VENV_PIP_TARGET environment variable. [#65562](https://github.com/saltstack/salt/issues/65562)
- Added SSH wrapper for logmod [#65630](https://github.com/saltstack/salt/issues/65630)
- Include changes in the results when the schedule.present state is run with test=True. [#65652](https://github.com/saltstack/salt/issues/65652)
- Fix extfs.tune doesn't pass retcode to module.run [#65686](https://github.com/saltstack/salt/issues/65686)
- Return an error message when the DNS plugin is not supported [#65739](https://github.com/saltstack/salt/issues/65739)
- Execution modules have access to the regular fileclient during pillar rendering. [#66124](https://github.com/saltstack/salt/issues/66124)
- Fixed an issue with the server channel where a minion's public key would be rejected if it contained a final newline character. [#66126](https://github.com/saltstack/salt/issues/66126)
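
The Homebrew prefix entry above describes a three-step fallback. The following is a minimal sketch of that order only; the function name and structure are illustrative and are not Salt's actual implementation.

```python
import os
import shutil
import subprocess

def guess_homebrew_prefix():
    # 1. Prefer the HOMEBREW_PREFIX environment variable when it is set.
    prefix = os.environ.get("HOMEBREW_PREFIX")
    if prefix:
        return prefix
    # 2. Otherwise ask the brew binary itself; this can fail, so guard it.
    brew = shutil.which("brew")
    if brew:
        try:
            out = subprocess.run(
                [brew, "--prefix"], capture_output=True, text=True, check=True
            )
            return out.stdout.strip()
        except subprocess.CalledProcessError:
            pass
    # 3. Fall back to guessing the conventional install locations
    #    (Apple Silicon vs. Intel defaults).
    for candidate in ("/opt/homebrew", "/usr/local"):
        if os.path.isdir(os.path.join(candidate, "bin")):
            return candidate
    return None
```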

### Added

- Allowed publishing to regular minions from the SSH wrapper [#40943](https://github.com/saltstack/salt/issues/40943)
- Added syncing of custom salt-ssh wrappers [#45450](https://github.com/saltstack/salt/issues/45450)
- Made salt-ssh sync custom utils [#53666](https://github.com/saltstack/salt/issues/53666)
- Add ability to use file.managed-style check_cmd in file.serialize [#53982](https://github.com/saltstack/salt/issues/53982)
- Revised use of deprecated net-tools and added support for ip neighbour with IPv4 ip_neighs, IPv6 ip_neighs6 [#57541](https://github.com/saltstack/salt/issues/57541)
- Added password support to the Redis returner. [#58044](https://github.com/saltstack/salt/issues/58044)
- Added a state (win_task) for managing scheduled tasks on Windows [#59037](https://github.com/saltstack/salt/issues/59037)
- Added keyring param to gpg modules [#59783](https://github.com/saltstack/salt/issues/59783)
- Added a new grain to detect the Salt package type: onedir, pip or system [#62589](https://github.com/saltstack/salt/issues/62589)
- Added Vault AppRole and identity issuance to minions [#62823](https://github.com/saltstack/salt/issues/62823)
- Added Vault AppRole auth mount path configuration option [#62825](https://github.com/saltstack/salt/issues/62825)
- Added distribution of Vault authentication details via response wrapping [#62828](https://github.com/saltstack/salt/issues/62828)
- Add salt package type information: either onedir, pip or system. [#62961](https://github.com/saltstack/salt/issues/62961)
- Added signature verification to file.managed/archive.extracted [#63143](https://github.com/saltstack/salt/issues/63143)
- Added signed_by_any/signed_by_all parameters to gpg.verify [#63166](https://github.com/saltstack/salt/issues/63166)
- Added match runner [#63278](https://github.com/saltstack/salt/issues/63278)
- Added Vault token lifecycle management [#63406](https://github.com/saltstack/salt/issues/63406)
- Added a new call for openscap xccdf eval supporting new parameters [#63416](https://github.com/saltstack/salt/issues/63416)
- Added Vault lease management utility [#63440](https://github.com/saltstack/salt/issues/63440)
- Implement removal of ptf packages in the zypper pkg module [#63442](https://github.com/saltstack/salt/issues/63442)
- Add JUnit output for saltcheck [#63463](https://github.com/saltstack/salt/issues/63463)
- Add ability for file.keyvalue to create a file if it doesn't exist [#63545](https://github.com/saltstack/salt/issues/63545)
- Added cleanup of the temporary mountpoint dir for the macpackage installed state [#63905](https://github.com/saltstack/salt/issues/63905)
- Add pkg.installed show installable version in test mode [#63985](https://github.com/saltstack/salt/issues/63985)
- Added patch option to the Vault SDB driver [#64096](https://github.com/saltstack/salt/issues/64096)
- Added flags to create local users and groups [#64256](https://github.com/saltstack/salt/issues/64256)
- Added inline specification of trusted CA root certificate for Vault [#64379](https://github.com/saltstack/salt/issues/64379)
- Add ability to return a False result in test mode of configurable_test_state [#64418](https://github.com/saltstack/salt/issues/64418)
- Switched Salt's onedir Python version to 3.11 [#64457](https://github.com/saltstack/salt/issues/64457)
- Added support for dnf5 and its new command syntax [#64532](https://github.com/saltstack/salt/issues/64532)
- Added a new decorator to indicate when a module is deprecated in favor of a Salt extension. [#64569](https://github.com/saltstack/salt/issues/64569)
- Add jq-esque to_entries and from_entries functions (see the sketch after this list) [#64600](https://github.com/saltstack/salt/issues/64600)
- Added ability to use PYTHONWARNINGS=ignore to silence deprecation warnings. [#64660](https://github.com/saltstack/salt/issues/64660)
- Add follow_symlinks to the file.symlink exec module to switch to os.path.lexists when False [#64665](https://github.com/saltstack/salt/issues/64665)
- Strengthen Salt's HA capabilities with master clustering. [#64939](https://github.com/saltstack/salt/issues/64939)
- Added win_appx state and execution modules for managing Microsoft Store apps and deprovisioning them from systems [#64978](https://github.com/saltstack/salt/issues/64978)
- Add support for the show_jid master config option to salt-run, so its behaviour matches the salt CLI command. [#65008](https://github.com/saltstack/salt/issues/65008)
- Add ability to remove packages by wildcard via the apt execution module [#65220](https://github.com/saltstack/salt/issues/65220)
- Added support for master top modules on masterless minions [#65479](https://github.com/saltstack/salt/issues/65479)
- Allowed accessing the regular mine from the SSH wrapper [#65645](https://github.com/saltstack/salt/issues/65645)
- Allow enabling backup for Linode in Salt Cloud [#65697](https://github.com/saltstack/salt/issues/65697)
- Add a backup schedule setter function for Linode VMs [#65713](https://github.com/saltstack/salt/issues/65713)
- Add acme support for manual plugin hooks [#65744](https://github.com/saltstack/salt/issues/65744)
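
For the jq-esque helpers above, this is a plain-Python sketch of the `to_entries`/`from_entries` semantics. The real Salt functions may differ in name, module, and signature; treat this only as an illustration of the behavior.

```python
def to_entries(mapping):
    # {"a": 1, "b": 2} -> [{"key": "a", "value": 1}, {"key": "b", "value": 2}]
    return [{"key": k, "value": v} for k, v in mapping.items()]

def from_entries(entries):
    # The inverse: a list of {"key": ..., "value": ...} dicts back to a mapping.
    return {e["key"]: e["value"] for e in entries}

print(to_entries({"a": 1, "b": 2}))
print(from_entries([{"key": "a", "value": 1}]))
```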

### Security

- Upgrade to `tornado>=6.3.3` due to https://github.com/advisories/GHSA-qppv-j76h-2rpx [#64989](https://github.com/saltstack/salt/issues/64989)
- Update to `gitpython>=3.1.35` due to https://github.com/advisories/GHSA-wfm5-v35h-vwf4 and https://github.com/advisories/GHSA-cwvm-v4w8-q58c [#65137](https://github.com/saltstack/salt/issues/65137)

## 3007.0rc1 (2024-01-02)

### Removed

- Removed RHEL 5 support, since it has long since been end-of-lifed [#62520](https://github.com/saltstack/salt/issues/62520)
- Removed Azure-Cloud modules from the code base. [#64322](https://github.com/saltstack/salt/issues/64322)
- Dropped Python 3.7 support, since it reached EOL on 27 Jun 2023 [#64417](https://github.com/saltstack/salt/issues/64417)
- Remove salt.payload.Serial [#64459](https://github.com/saltstack/salt/issues/64459)
- Remove netmiko_conn and pyeapi_conn from salt.modules.napalm_mod [#64460](https://github.com/saltstack/salt/issues/64460)
- Removed the 'transport' arg from salt.utils.event.get_event [#64461](https://github.com/saltstack/salt/issues/64461)
- Removed the usage of the retired Linode API v3 from Salt Cloud [#64517](https://github.com/saltstack/salt/issues/64517)

### Deprecated

- Deprecate all Proxmox cloud modules [#64224](https://github.com/saltstack/salt/issues/64224)
- Deprecate all the Vault modules in favor of the Vault Salt Extension https://github.com/salt-extensions/saltext-vault. The Vault modules will be removed in Salt core in 3009.0. [#64893](https://github.com/saltstack/salt/issues/64893)
- Deprecate all the Docker modules in favor of the Docker Salt Extension https://github.com/saltstack/saltext-docker. The Docker modules will be removed in Salt core in 3009.0. [#64894](https://github.com/saltstack/salt/issues/64894)
- Deprecate all the Zabbix modules in favor of the Zabbix Salt Extension https://github.com/salt-extensions/saltext-zabbix. The Zabbix modules will be removed in Salt core in 3009.0. [#64896](https://github.com/saltstack/salt/issues/64896)
- Deprecate all the Apache modules in favor of the Apache Salt Extension https://github.com/salt-extensions/saltext-apache. The Apache modules will be removed in Salt core in 3009.0. [#64909](https://github.com/saltstack/salt/issues/64909)
- Deprecation warning for Salt's backport of the ``OrderedDict`` class, which will be removed in 3009 [#65542](https://github.com/saltstack/salt/issues/65542)
- Deprecate Kubernetes modules for move to saltext-kubernetes in version 3009 [#65565](https://github.com/saltstack/salt/issues/65565)
- Deprecated all Pushover modules in favor of the Salt Extension at https://github.com/salt-extensions/saltext-pushover. The Pushover modules will be removed from Salt core in 3009.0 [#65567](https://github.com/saltstack/salt/issues/65567)

### Changed

- The masquerade property will no longer default to false (turning off masquerade) if not specified. [#53120](https://github.com/saltstack/salt/issues/53120)
- Addressed Python 3.11 deprecations:

  * Switch to `FullArgSpec` since Py 3.11 no longer has `ArgSpec`, deprecated since Py 3.0
  * Stopped using the deprecated `cgi` module.
  * Stopped using the deprecated `pipes` module.
  * Stopped using the deprecated `imp` module. [#64457](https://github.com/saltstack/salt/issues/64457)
- Changed the 'gpg_decrypt_must_succeed' default from False to True [#64462](https://github.com/saltstack/salt/issues/64462)

### Fixed

- When an NFS or FUSE mount fails to unmount when mount options have changed, try again with a lazy umount before mounting again. [#18907](https://github.com/saltstack/salt/issues/18907)
- Fix auto-accepting GPG keys by supporting it in the refresh_db module [#42039](https://github.com/saltstack/salt/issues/42039)
- Made cmd.script work with files from the fileserver via salt-ssh [#48067](https://github.com/saltstack/salt/issues/48067)
- Made slsutil.renderer work with salt-ssh [#50196](https://github.com/saltstack/salt/issues/50196)
- Fixed defaults.merge is not available when using salt-ssh [#51605](https://github.com/saltstack/salt/issues/51605)
- Fix extfs.mkfs missing parameter handling for -C, -d, and -e [#51858](https://github.com/saltstack/salt/issues/51858)
- Fixed Salt master does not renew token [#51986](https://github.com/saltstack/salt/issues/51986)
- Fixed salt-ssh continues state/pillar rendering with incorrect data when an exception is raised by a module on the target [#52452](https://github.com/saltstack/salt/issues/52452)
- Fix extfs.tune has 'reserved' documented twice and is missing the 'reserved_percentage' keyword argument [#54426](https://github.com/saltstack/salt/issues/54426)
- Fix the ability of the 'selinux.port_policy_present' state to modify. [#55687](https://github.com/saltstack/salt/issues/55687)
- Fixed config.get does not support merge option with salt-ssh [#56441](https://github.com/saltstack/salt/issues/56441)
- Removed an unused assignment in file.patch [#57204](https://github.com/saltstack/salt/issues/57204)
- Fixed vault module fetching more than one secret in one run with single-use tokens [#57561](https://github.com/saltstack/salt/issues/57561)
- Use the brew path from `which` in the mac_brew_pkg module and rely on _homebrew_bin() every time [#57946](https://github.com/saltstack/salt/issues/57946)
- Fixed Vault verify option to work on minions when only specified in master config [#58174](https://github.com/saltstack/salt/issues/58174)
- Fixed vault command errors configured locally [#58580](https://github.com/saltstack/salt/issues/58580)
- Fixed issue with basic auth causing invalid header error and 401 Bad Request, by using HTTPBasicAuthHandler instead of header. [#58936](https://github.com/saltstack/salt/issues/58936)
- Make the LXD module work with pyLXD > 2.10 [#59514](https://github.com/saltstack/salt/issues/59514)
- Return an error if the patch file passed to the state file.patch is malformed. [#59806](https://github.com/saltstack/salt/issues/59806)
- Handle failure and error information from tuned module/state [#60500](https://github.com/saltstack/salt/issues/60500)
- Fixed sdb.get_or_set_hash with Vault single-use tokens [#60779](https://github.com/saltstack/salt/issues/60779)
- Fixed state.test does not work with salt-ssh [#61100](https://github.com/saltstack/salt/issues/61100)
- Made slsutil.findup work with salt-ssh [#61143](https://github.com/saltstack/salt/issues/61143)
- Allow all primitive grain types for autosign_grains [#61416](https://github.com/saltstack/salt/issues/61416), [#63708](https://github.com/saltstack/salt/issues/63708)
- `ipset.new_set` no longer fails when creating a set type that uses the `family` create option [#61620](https://github.com/saltstack/salt/issues/61620)
- Fixed Vault session storage to allow unlimited use tokens [#62380](https://github.com/saltstack/salt/issues/62380)
- Fix the efi grain on FreeBSD [#63052](https://github.com/saltstack/salt/issues/63052)
- Fixed gpg.receive_keys returns success on failed import [#63144](https://github.com/saltstack/salt/issues/63144)
- Fixed GPG state module always reports success without changes [#63153](https://github.com/saltstack/salt/issues/63153)
- Fixed GPG state module does not respect test mode [#63156](https://github.com/saltstack/salt/issues/63156)
- Fixed gpg.absent with gnupghome/user, fixed gpg.delete_key with gnupghome [#63159](https://github.com/saltstack/salt/issues/63159)
- Fixed service module does not handle enable/disable if systemd service is an alias [#63214](https://github.com/saltstack/salt/issues/63214)
- Made x509_v2 compound match detection use the new runner instead of peer publishing [#63278](https://github.com/saltstack/salt/issues/63278)
- Ensure __pillar__ is updated during a pillar refresh so that process_beacons has the updated beacons loaded from pillar. [#63583](https://github.com/saltstack/salt/issues/63583)
- Implement the vpc_uuid parameter when creating a droplet; it selects the correct virtual private cloud (private network interface). [#63714](https://github.com/saltstack/salt/issues/63714)
- pkg.installed no longer reports failure when installing packages that are installed via the task manager [#63767](https://github.com/saltstack/salt/issues/63767)
- mac_xattr.list and mac_xattr.read will replace undecodable bytes to avoid raising CommandExecutionError. [#63779](https://github.com/saltstack/salt/issues/63779)
- Fix aptpkg.latest_version performance, reducing the number of times to 'shell out' [#63982](https://github.com/saltstack/salt/issues/63982)
- Added option to use a fresh connection for mysql cache [#63991](https://github.com/saltstack/salt/issues/63991)
- [lxd] Fixed a bug in `container_create` which prevented devices which are not of type `disk` from being correctly created and added to the container when passed via the `devices` parameter. [#63996](https://github.com/saltstack/salt/issues/63996)
- Skipped the `isfile` check to greatly increase the speed of reading minion keys for systems with a large number of minions on slow file storage [#64260](https://github.com/saltstack/salt/issues/64260)
- Fix utf8 handling in 'pass' renderer [#64300](https://github.com/saltstack/salt/issues/64300)
- Upgrade tornado to 6.3.2 [#64305](https://github.com/saltstack/salt/issues/64305)
- Prevent errors due to missing 'transactional_update.apply' on SLE Micro and MicroOS. [#64369](https://github.com/saltstack/salt/issues/64369)
- Fix 'unable to unmount' failure to return a False result instead of None [#64420](https://github.com/saltstack/salt/issues/64420)
- Fixed issue uninstalling duplicate packages in ``win_appx`` execution module [#64450](https://github.com/saltstack/salt/issues/64450)
- Clean up tech debt: IPC now uses the tcp transport. [#64488](https://github.com/saltstack/salt/issues/64488)
- Made salt-ssh more strict when handling unexpected situations and state.* wrappers treat a remote exception as failure, excluded salt-ssh error returns from mine [#64531](https://github.com/saltstack/salt/issues/64531)
- Fix flaky test for LazyLoader with isolated mocking of threading.RLock [#64567](https://github.com/saltstack/salt/issues/64567)
- Fix possible `KeyError` exceptions in `salt.utils.user.get_group_dict` while reading an improper duplicated GID assigned for the user. [#64599](https://github.com/saltstack/salt/issues/64599)
- Changed vm_config() to deep-merge the vm_overrides of a specific VM, instead of simple-merging the whole vm_overrides [#64610](https://github.com/saltstack/salt/issues/64610)
- Fix the way Salt tries to get the Homebrew prefix. The first attempt is to look for the `HOMEBREW_PREFIX` environment variable. If it's not set, Salt tries to get the prefix from the `brew` command. However, the `brew` command can fail, so a last attempt is made to get the prefix by guessing the installation path. [#64924](https://github.com/saltstack/salt/issues/64924)
- Add missing MySQL Grant SERVICE_CONNECTION_ADMIN to mysql module. [#64934](https://github.com/saltstack/salt/issues/64934)
- Fixed slsutil.update with salt-ssh during template rendering [#65067](https://github.com/saltstack/salt/issues/65067)
- Keep track when an included file only includes sls files but is a requisite. [#65080](https://github.com/saltstack/salt/issues/65080)
- Fixed `gpg.present` succeeds when the keyserver is unreachable [#65169](https://github.com/saltstack/salt/issues/65169)
- Fix issue with openscap when the error was outside the expected scope. It now returns failed with the error code and the error. [#65193](https://github.com/saltstack/salt/issues/65193)
- Fix typo in nftables module to ensure unique nft family values [#65295](https://github.com/saltstack/salt/issues/65295)
- Dereference symlinks to set the proper __cli opt [#65435](https://github.com/saltstack/salt/issues/65435)
- Made salt-ssh merge master top returns for the same environment [#65480](https://github.com/saltstack/salt/issues/65480)
- Account for the situation where the metadata grain fails because the AWS environment requires an authentication token to query the metadata URL (see the sketch after this list). [#65513](https://github.com/saltstack/salt/issues/65513)
- Improve the condition for overriding the target for pip with the VENV_PIP_TARGET environment variable. [#65562](https://github.com/saltstack/salt/issues/65562)
- Added SSH wrapper for logmod [#65630](https://github.com/saltstack/salt/issues/65630)
- Include changes in the results when the schedule.present state is run with test=True. [#65652](https://github.com/saltstack/salt/issues/65652)
- Fixed Salt-SSH pillar rendering and state rendering with nested SSH calls when called via saltutil.cmd or in an orchestration [#65670](https://github.com/saltstack/salt/issues/65670)
- Fix extfs.tune doesn't pass retcode to module.run [#65686](https://github.com/saltstack/salt/issues/65686)
- Fix boto execution module loading [#65691](https://github.com/saltstack/salt/issues/65691)
- Removed the PR 65185 changes since they were an incomplete solution [#65692](https://github.com/saltstack/salt/issues/65692)
- Return an error message when the DNS plugin is not supported [#65739](https://github.com/saltstack/salt/issues/65739)
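
The AWS metadata entry above concerns token-gated instance metadata (IMDSv2-style), where the endpoint only answers after a short-lived session token has been requested first. Below is a generic illustration of that flow using the standard EC2 metadata endpoints and plain `urllib`; this is not the grain's actual code.

```python
import urllib.request

IMDS = "http://169.254.169.254"

def fetch_metadata(path, timeout=2):
    # Step 1: request a short-lived session token with a PUT call.
    token_req = urllib.request.Request(
        f"{IMDS}/latest/api/token",
        method="PUT",
        headers={"X-aws-ec2-metadata-token-ttl-seconds": "60"},
    )
    token = urllib.request.urlopen(token_req, timeout=timeout).read().decode()
    # Step 2: present the token when querying the metadata URL.
    req = urllib.request.Request(
        f"{IMDS}/latest/meta-data/{path}",
        headers={"X-aws-ec2-metadata-token": token},
    )
    return urllib.request.urlopen(req, timeout=timeout).read().decode()
```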

### Added

- Allowed publishing to regular minions from the SSH wrapper [#40943](https://github.com/saltstack/salt/issues/40943)
- Added syncing of custom salt-ssh wrappers [#45450](https://github.com/saltstack/salt/issues/45450)
- Made salt-ssh sync custom utils [#53666](https://github.com/saltstack/salt/issues/53666)
- Add ability to use file.managed-style check_cmd in file.serialize [#53982](https://github.com/saltstack/salt/issues/53982)
- Revised use of deprecated net-tools and added support for ip neighbour with IPv4 ip_neighs, IPv6 ip_neighs6 [#57541](https://github.com/saltstack/salt/issues/57541)
- Added password support to the Redis returner. [#58044](https://github.com/saltstack/salt/issues/58044)
- Added keyring param to gpg modules [#59783](https://github.com/saltstack/salt/issues/59783)
- Added a new grain to detect the Salt package type: onedir, pip or system [#62589](https://github.com/saltstack/salt/issues/62589)
- Added Vault AppRole and identity issuance to minions [#62823](https://github.com/saltstack/salt/issues/62823)
- Added Vault AppRole auth mount path configuration option [#62825](https://github.com/saltstack/salt/issues/62825)
- Added distribution of Vault authentication details via response wrapping [#62828](https://github.com/saltstack/salt/issues/62828)
- Add salt package type information: either onedir, pip or system. [#62961](https://github.com/saltstack/salt/issues/62961)
- Added signature verification to file.managed/archive.extracted [#63143](https://github.com/saltstack/salt/issues/63143)
- Added signed_by_any/signed_by_all parameters to gpg.verify [#63166](https://github.com/saltstack/salt/issues/63166)
- Added match runner [#63278](https://github.com/saltstack/salt/issues/63278)
- Added Vault token lifecycle management [#63406](https://github.com/saltstack/salt/issues/63406)
- Added a new call for openscap xccdf eval supporting new parameters [#63416](https://github.com/saltstack/salt/issues/63416)
- Added Vault lease management utility [#63440](https://github.com/saltstack/salt/issues/63440)
- Implement removal of ptf packages in the zypper pkg module [#63442](https://github.com/saltstack/salt/issues/63442)
- Add JUnit output for saltcheck [#63463](https://github.com/saltstack/salt/issues/63463)
- Add ability for file.keyvalue to create a file if it doesn't exist [#63545](https://github.com/saltstack/salt/issues/63545)
- Added cleanup of the temporary mountpoint dir for the macpackage installed state [#63905](https://github.com/saltstack/salt/issues/63905)
- Add pkg.installed show installable version in test mode [#63985](https://github.com/saltstack/salt/issues/63985)
- Added patch option to the Vault SDB driver [#64096](https://github.com/saltstack/salt/issues/64096)
- Added flags to create local users and groups [#64256](https://github.com/saltstack/salt/issues/64256)
- Added inline specification of trusted CA root certificate for Vault [#64379](https://github.com/saltstack/salt/issues/64379)
- Add ability to return a False result in test mode of configurable_test_state [#64418](https://github.com/saltstack/salt/issues/64418)
- Switched Salt's onedir Python version to 3.11 [#64457](https://github.com/saltstack/salt/issues/64457)
- Added support for dnf5 and its new command syntax [#64532](https://github.com/saltstack/salt/issues/64532)
- Added a new decorator to indicate when a module is deprecated in favor of a Salt extension. [#64569](https://github.com/saltstack/salt/issues/64569)
- Add jq-esque to_entries and from_entries functions [#64600](https://github.com/saltstack/salt/issues/64600)
- Added ability to use PYTHONWARNINGS=ignore to silence deprecation warnings. [#64660](https://github.com/saltstack/salt/issues/64660)
- Add follow_symlinks to the file.symlink exec module to switch to os.path.lexists when False (see the sketch after this list) [#64665](https://github.com/saltstack/salt/issues/64665)
- Added win_appx state and execution modules for managing Microsoft Store apps and deprovisioning them from systems [#64978](https://github.com/saltstack/salt/issues/64978)
- Add support for the show_jid master config option to salt-run, so its behaviour matches the salt CLI command. [#65008](https://github.com/saltstack/salt/issues/65008)
- Add ability to remove packages by wildcard via the apt execution module [#65220](https://github.com/saltstack/salt/issues/65220)
- Added support for master top modules on masterless minions [#65479](https://github.com/saltstack/salt/issues/65479)
- Allowed accessing the regular mine from the SSH wrapper [#65645](https://github.com/saltstack/salt/issues/65645)
- Allow enabling backup for Linode in Salt Cloud [#65697](https://github.com/saltstack/salt/issues/65697)
- Add a backup schedule setter function for Linode VMs [#65713](https://github.com/saltstack/salt/issues/65713)
- Add acme support for manual plugin hooks [#65744](https://github.com/saltstack/salt/issues/65744)
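
For the `follow_symlinks` entry above, the practical difference is that `os.path.lexists` reports a dangling symlink as present, while `os.path.exists` follows the link and reports it as missing. A small stdlib-only demonstration (not the exec module's code):

```python
import os
import tempfile

d = tempfile.mkdtemp()
link = os.path.join(d, "dangling")
# Create a symlink whose target does not exist (POSIX assumed).
os.symlink(os.path.join(d, "missing-target"), link)

print(os.path.exists(link))   # False - follows the link, target is absent
print(os.path.lexists(link))  # True  - checks the link itself
```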

### Security

- Upgrade to `tornado>=6.3.3` due to https://github.com/advisories/GHSA-qppv-j76h-2rpx [#64989](https://github.com/saltstack/salt/issues/64989)
- Update to `gitpython>=3.1.35` due to https://github.com/advisories/GHSA-wfm5-v35h-vwf4 and https://github.com/advisories/GHSA-cwvm-v4w8-q58c [#65137](https://github.com/saltstack/salt/issues/65137)

## 3006.7 (2024-02-20)

@@ -60,7 +60,7 @@ representative at an online or offline event.

 Instances of abusive, harassing, or otherwise unacceptable behavior may be
 reported to the community leaders responsible for enforcement at
-conduct@saltstack.com.
+saltproject.pdl@broadcom.com.
 All complaints will be reviewed and investigated promptly and fairly.

 All community leaders are obligated to respect the privacy and security of the
@@ -1,30 +1,56 @@
-============
-Contributing
-============
-
-So you want to contribute to the Salt project? Excellent! You can help
-in a number of ways:
-
-- Use Salt and open well-written bug reports.
-- Join a `working group <https://github.com/saltstack/community>`__.
-- Answer questions on `irc <https://web.libera.chat/#salt>`__,
-  the `community Slack <https://via.vmw.com/salt-slack>`__,
-  the `salt-users mailing
-  list <https://groups.google.com/forum/#!forum/salt-users>`__,
-  `Server Fault <https://serverfault.com/questions/tagged/saltstack>`__,
-  or `r/saltstack on Reddit <https://www.reddit.com/r/saltstack/>`__.
-- Fix bugs.
-- `Improve the documentation <https://saltstack.gitlab.io/open/docs/docs-hub/topics/contributing.html>`__.
-- Provide workarounds, patches, or other code without tests.
-- Tell other people about problems you solved using Salt.
-
-If you'd like to update docs or fix an issue, you're going to need the
-Salt repo. The best way to contribute is using
-`Git <https://git-scm.com/>`__.
-
-Environment setup
-=================
+==============================================
+Contributing to Salt: A Guide for Contributors
+==============================================
+
+So, you want to contribute to the Salt project? That's fantastic! There are many
+ways you can help improve Salt:
+
+- Use Salt and report bugs with clear, detailed descriptions.
+- Join a `working group <https://github.com/saltstack/community>`__ to
+  collaborate with other contributors.
+- Answer questions on platforms like
+  the `community Discord <https://discord.com/invite/J7b7EscrAs>`__,
+  the `salt-users mailing list <https://groups.google.com/forum/#!forum/salt-users>`__,
+  `Server Fault <https://serverfault.com/questions/tagged/saltstack>`__,
+  or `r/saltstack on Reddit <https://www.reddit.com/r/saltstack/>`__.
+- Fix bugs or contribute to the `documentation <https://saltstack.gitlab.io/open/docs/docs-hub/topics/contributing.html>`__.
+- Submit workarounds, patches, or code (even without tests).
+- Share your experiences and solutions to problems you've solved using Salt.
+
+Choosing the Right Branch for Your Pull Request
+===============================================
+
+We appreciate your contributions to the project! To ensure a smooth and
+efficient workflow, please follow these guidelines when submitting a Pull
+Request. Each type of contribution—whether it's fixing a bug, adding a feature,
+updating documentation, or fixing tests—should be targeted at the appropriate
+branch. This helps us manage changes effectively and maintain stability across
+versions.
+
+- **Bug Fixes:**
+
+  Create your Pull Request against the oldest supported branch where the bug
+  exists. This ensures that the fix can be applied to all relevant versions.
+
+- **New Features**:
+
+  For new features or enhancements, create your Pull Request against the master
+  branch.
+
+- **Documentation Updates:**
+
+  Documentation changes should be made against the master branch, unless they
+  are related to a bug fix, in which case they should follow the same branch as
+  the bug fix.
+
+- **Test Fixes:**
+
+  Pull Requests that fix broken or failing tests should be created against the
+  oldest supported branch where the issue occurs.
+
+Setting Up Your Salt Development Environment
+============================================

 To hack on Salt or the docs you're going to need to set up your
 development environment. If you already have a workflow that you're
 comfortable with, you can use that, but otherwise this is an opinionated
@@ -109,7 +135,7 @@ Then activate it:

 Sweet! Now you're ready to clone Salt so you can start hacking away! If
 you get stuck at any point, check out the resources at the beginning of
-this guide. IRC and Slack are particularly helpful places to go.
+this guide. Discord and GitHub Discussions are particularly helpful places to go.


 Get the source!
@@ -562,12 +588,12 @@ But that advice is backwards for the changelog. We follow the
 `keepachangelog <https://keepachangelog.com/en/1.0.0/>`__ approach for
 our changelog, and use towncrier to generate it for each release. As a
 contributor, all that means is that you need to add a file to the
-``salt/changelog`` directory, using the ``<issue #>.<type>`` format. For
+``salt/changelog`` directory, using the ``<issue #>.<type>.md`` format. For
 instance, if you fixed issue 123, you would do:

 ::

-   echo "Made sys.doc inform when no minions return" > changelog/123.fixed
+   echo "Made sys.doc inform when no minions return" > changelog/123.fixed.md

 And that's all that would go into your file. When it comes to your
 commit message, it's usually a good idea to add other information, such as
@@ -605,7 +631,7 @@ your PR is submitted during the week you should be able to expect some
 kind of communication within that business day. If your tests are
 passing and we're not in a code freeze, ideally your code will be merged
 that week or month. If you haven't heard from your assigned reviewer, ping them
-on GitHub, `irc <https://web.libera.chat/#salt>`__, or Community Slack.
+on GitHub or `Community Discord <https://discord.com/invite/J7b7EscrAs>`__.

 It's likely that your reviewer will leave some comments that need
 addressing - it may be a style change, or you forgot a changelog entry,
@@ -1,5 +1,5 @@
 | **OSS Software Name** | **Version** | **Primary License** | **Source Code Download URL** | **Author** | **Copyright Year** |
-| --- | :--- | --- | --- | --- | ---: |
+| ------------------------- | :---------- | --------------------------------------- | ---------------------------------------------- | ------------------------------------------------------------------------ | -----------------: |
 | | | | | | |
 | Cheetah3 | 3.1.0 | MIT/X11 | https://pypi.org/project/Cheetah3/ | Travis Rudd | 2017-2019 |
 | CherryPy | 17.3.0 | BSD | https://pypi.org/project/CherryPy/ | CherryPy Team | 2004-2019 |
|
||||||
|
@ -12,95 +12,12 @@
|
||||||
| PyNaCl | 1.4.0 | Apache License, V2.0 | https://pypi.org/project/PyNaCl/ | The PyNaCl developers | 2004 |
|
| PyNaCl | 1.4.0 | Apache License, V2.0 | https://pypi.org/project/PyNaCl/ | The PyNaCl developers | 2004 |
|
||||||
| PyYAML | 5.3.1 | MIT/X11 | https://pypi.org/project/PyYAML/ | Kirill Simonov | 2006-2019 |
|
| PyYAML | 5.3.1 | MIT/X11 | https://pypi.org/project/PyYAML/ | Kirill Simonov | 2006-2019 |
|
||||||
| WerkZeug | 1.0.1 | BSD | https://pypi.org/project/Werkzeug/ | Armin Ronacher | 2007 |
|
| WerkZeug | 1.0.1 | BSD | https://pypi.org/project/Werkzeug/ | Armin Ronacher | 2007 |
|
||||||
| adal | 1.2.4 | MIT/X11 | https://pypi.org/project/adal | Microsoft Corporation | 2015 |
|
|
||||||
| apache-libcloud | 2.0.0 | Apache License, V2.0 | https://pypi.org/project/apache-libcloud/ | Apache Software Foundation | 2004 |
|
| apache-libcloud | 2.0.0 | Apache License, V2.0 | https://pypi.org/project/apache-libcloud/ | Apache Software Foundation | 2004 |
|
||||||
| appdirs | 1.4.4 | MIT/X11 | https://pypi.org/project/appdirs/ | Trent Mick | 2010 |
|
| appdirs | 1.4.4 | MIT/X11 | https://pypi.org/project/appdirs/ | Trent Mick | 2010 |
|
||||||
| asn1crypto | 1.3.0 | MIT/X11 | https://pypi.org/project/asn1crypto/ | wbond | 2015-2019 |
|
| asn1crypto | 1.3.0 | MIT/X11 | https://pypi.org/project/asn1crypto/ | wbond | 2015-2019 |
|
||||||
| attrs | 19.3.1 | MIT/X11 | https://pypi.org/project/attrs/ | Hynek Schlawack | 2015 |
|
| attrs | 19.3.1 | MIT/X11 | https://pypi.org/project/attrs/ | Hynek Schlawack | 2015 |
|
||||||
| aws-sam-translator | 1.25.0 | Apache License, V2.0 | https://pypi.org/project/aws-sam-translator/ | Amazon Web Services | 2004 |
|
| aws-sam-translator | 1.25.0 | Apache License, V2.0 | https://pypi.org/project/aws-sam-translator/ | Amazon Web Services | 2004 |
|
||||||
| aws-xray-sdk | 2.6.0 | Apache License, V2.0 | https://pypi.org/project/aws-xray-sdk/ | Amazon Web Services | 2004 |
|
| aws-xray-sdk | 2.6.0 | Apache License, V2.0 | https://pypi.org/project/aws-xray-sdk/ | Amazon Web Services | 2004 |
|
||||||
| azure | 4.0.0 | MIT/X11 | https://pypi.org/project/azure | Microsoft Corporation | 2016 |
|
|
||||||
| azure-applicationinsights | 0.1.0 | MIT/X11 | https://pypi.org/project/azure-applicationinsights/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-batch | 4.1.3 | MIT/X11 | https://pypi.org/project/azure-batch/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-common | 1.1.25 | MIT/X11 | https://pypi.org/project/azure-common/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-cosmosdb-nspkg | 2.0.2 | Apache License, V2.0 | https://pypi.org/project/azure-cosmosdb-nspkg | Microsoft Corporation | 2004 |
|
|
||||||
| azure-cosmosdb-table | 1.0.6 | Apache License, V2.0 | https://pypi.org/project/azure-cosmosdb-table/ | Microsoft Corporation | 2004 |
|
|
||||||
| azure-datalake-store | 0.0.48 | MIT/X11 | https://pypi.org/project/azure-datalake-store/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-eventgrid | 1.3.0 | MIT/X11 | https://pypi.org/project/azure-eventgrid/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-graphrbac | 0.40.0 | MIT/X11 | https://pypi.org/project/azure-graphrbac/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-keyvault | 1.1.0 | MIT/X11 | https://pypi.org/project/azure-keyvault/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-loganalytics | 0.1.0 | MIT/X11 | https://pypi.org/project/azure-loganalytics/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt | 4.0.0 | MIT/X11 | https://pypi.org/project/azure-mgmt/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-advisor | 1.0.1 | MIT/X11 | https://pypi.org/project/azure-mgmt-advisor/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-applicationinsights | 0.1.1 | MIT/X11 | https://pypi.org/project/azure-mgmt-applicationinsights/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-authorization | 0.50.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-authorization/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-batch | 5.0.1 | MIT/X11 | https://pypi.org/project/azure-mgmt-batch/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-batchai | 2.0.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-batchai/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-billing | 0.2.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-billing/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-cdn | 3.1.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-cdn/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-cognitiveservices | 3.0.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-cognitiveservices/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-commerce | 1.0.1 | MIT/X11 | https://pypi.org/project/azure-mgmt-commerce/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-compute | 4.6.2 | MIT/X11 | https://pypi.org/project/azure-mgmt-compute/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-consumption | 2.0.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-consumption/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-containerinstance | 1.5.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-containerinstance/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-containerregistry | 2.8.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-containerregistry/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-containerservice | 4.4.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-containerservice/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-cosmosdb | 0.4.1 | MIT/X11 | https://pypi.org/project/azure-mgmt-cosmosdb/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-datafactory | 0.6.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-datafactory/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-datalake-analytics | 0.6.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-datalake-analytics/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-datalake-nspkg | 3.0.1 | MIT/X11 | https://pypi.org/project/azure-mgmt-datalake-nspkg/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-datalake-store | 0.5.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-datalake-store/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-datamigration | 1.0.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-datamigration/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-devspaces | 0.1.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-devspaces/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-devtestlabs | 2.2.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-devtestlabs/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-dns | 2.1.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-dns/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-eventgrid | 1.0.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-eventgrid/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-eventhub | 2.6.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-eventhub/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-hanaonazure | 0.1.1 | MIT/X11 | https://pypi.org/project/azure-mgmt-hanaonazure/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-iotcentral | 0.1.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-iotcentral/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-iothub | 0.5.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-iothub/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-iothubprovisioningservices | 0.2.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-iothubprovisioningservices/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-keyvault | 1.1.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-keyvault/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-loganalytics | 0.2.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-loganalytics/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-logic | 3.0.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-logic/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-machinelearningcompute | 0.4.1 | MIT/X11 | https://pypi.org/project/azure-mgmt-machinelearningcompute/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-managementgroups | 0.1.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-managementgroups/ | Microsoft Corporation | 2016 |
|
|
||||||
| azure-mgmt-managementpartner | 0.1.1 | MIT/X11 | https://pypi.org/project/azure-mgmt-managementpartner/ | Microsoft Corporation | 2016 |
|
|
||||||
-| azure-mgmt-maps | 0.1.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-maps/ | Microsoft Corporation | 2016 |
-| azure-mgmt-marketplaceordering | 0.1.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-marketplaceordering/ | Microsoft Corporation | 2016 |
-| azure-mgmt-media | 1.0.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-media/ | Microsoft Corporation | 2016 |
-| azure-mgmt-monitor | 0.5.2 | MIT/X11 | https://pypi.org/project/azure-mgmt-monitor/ | Microsoft Corporation | 2016 |
-| azure-mgmt-msi | 0.2.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-msi/ | Microsoft Corporation | 2016 |
-| azure-mgmt-network | 2.7.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-network/ | Microsoft Corporation | 2016 |
-| azure-mgmt-notificationhubs | 2.1.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-notificationhubs/ | Microsoft Corporation | 2016 |
-| azure-mgmt-nspkg | 3.0.2 | MIT/X11 | https://pypi.org/project/azure-mgmt-nspkg/ | Microsoft Corporation | 2016 |
-| azure-mgmt-policyinsights | 0.1.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-policyinsights/ | Microsoft Corporation | 2016 |
-| azure-mgmt-powerbiembedded | 2.0.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-powerbiembedded/ | Microsoft Corporation | 2016 |
-| azure-mgmt-rdbms | 1.9.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-rdbms/ | Microsoft Corporation | 2016 |
-| azure-mgmt-recoveryservices | 0.3.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-recoveryservices/ | Microsoft Corporation | 2016 |
-| azure-mgmt-recoveryservicesbackup | 0.3.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-recoveryservicesbackup/ | Microsoft Corporation | 2016 |
-| azure-mgmt-redis | 5.0.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-redis/ | Microsoft Corporation | 2016 |
-| azure-mgmt-relay | 0.1.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-relay/ | Microsoft Corporation | 2016 |
-| azure-mgmt-reservations | 0.2.1 | MIT/X11 | https://pypi.org/project/azure-mgmt-reservations/ | Microsoft Corporation | 2016 |
-| azure-mgmt-resource | 2.2.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-resource/ | Microsoft Corporation | 2016 |
-| azure-mgmt-scheduler | 2.0.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-scheduler/ | Microsoft Corporation | 2016 |
-| azure-mgmt-search | 2.1.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-search/ | Microsoft Corporation | 2016 |
-| azure-mgmt-servicebus | 0.5.3 | MIT/X11 | https://pypi.org/project/azure-mgmt-servicebus/ | Microsoft Corporation | 2016 |
-| azure-mgmt-servicefabric | 0.2.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-servicefabric/ | Microsoft Corporation | 2016 |
-| azure-mgmt-signalr | 0.1.1 | MIT/X11 | https://pypi.org/project/azure-mgmt-signalr/ | Microsoft Corporation | 2016 |
-| azure-mgmt-sql | 0.9.1 | MIT/X11 | https://pypi.org/project/azure-mgmt-sql/ | Microsoft Corporation | 2016 |
-| azure-mgmt-storage | 2.0.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-storage/ | Microsoft Corporation | 2016 |
-| azure-mgmt-subscription | 0.2.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-subscription/ | Microsoft Corporation | 2016 |
-| azure-mgmt-trafficmanager | 0.50.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-trafficmanager/ | Microsoft Corporation | 2016 |
-| azure-mgmt-web | 0.35.0 | MIT/X11 | https://pypi.org/project/azure-mgmt-web/ | Microsoft Corporation | 2016 |
-| azure-nspkg | 3.0.2 | MIT/X11 | https://pypi.org/project/azure-nspkg/ | Microsoft Corporation | 2016 |
-| azure-servicebus | 0.21.1 | MIT/X11 | https://pypi.org/project/azure-servicebus/ | Microsoft Corporation | 2016 |
-| azure-servicefabric | 6.3.0.0 | MIT/X11 | https://pypi.org/project/azure-servicefabric/ | Microsoft Corporation | 2016 |
-| azure-servicemanagement-legacy | 0.20.7 | Apache License, V2.0 | https://pypi.org/project/azure-servicemanagement-legacy/ | Microsoft Corporation | 2016 |
-| azure-storage-blob | 1.5.0 | MIT/X11 | https://pypi.org/project/azure-storage-blob/ | Microsoft Corporation | 2017 |
-| azure-storage-common | 1.4.2 | MIT/X11 | https://pypi.org/project/azure-storage-common/ | Microsoft Corporation | 2017 |
-| azure-storage-file | 1.4.0 | MIT/X11 | https://pypi.org/project/azure-storage-file/ | Microsoft Corporation | 2016 |
-| azure-storage-queue | 1.4.0 | MIT/X11 | https://pypi.org/project/azure-storage-queue/ | Microsoft Corporation | 2017 |
 | bcrypt | 3.1.7 | Apache License, V2.0 | https://pypi.org/project/bcrypt/ | The Python Cryptographic Authority | 2013 |
 | boto | 2.49.0 | MIT/X11 | https://pypi.org/project/boto/ | Mitch Garnaatt | 2013 |
 | boto3 | 1.14.16 | Apache License, V2.0 | https://pypi.org/project/boto3/ | AWS | 2019 |
@@ -151,8 +68,6 @@
 | mock | 4.0.2 | BSD | https://pypi.org/project/mock/ | Testing Cabal | 2003-2013 |
 | more-itertools | 5.0.0 | MIT/X11 | https://pypi.org/project/more-itertools/ | Eric Rose | 2012 |
 | moto | 1.3.14 | Apache License, V2.0 | https://pypi.org/project/moto/ | Steve Pulec | 2004 |
-| msrest | 0.6.17 | MIT/X11 | https://pypi.org/project/msrest/ | Microsoft | 2016 |
-| msrestazure | 0.6.4 | MIT/X11 | https://pypi.org/project/msrestazure/ | Microsoft | 2016 |
 | natsort | 7.0.1 | MIT/X11 | https://pypi.org/project/natsort/ | Seth M. Morton | 2012-2020 |
 | ncclient | 0.6.7 | Apache License, V2.0 | https://pypi.org/project/ncclient/ | Shikhar Bhushan, Leonidas Poulopoulos, Ebben Aries, Einar Nilsen-Nygaard | 2004 |
 | netaddr | 0.8.0 | BSD | https://pypi.org/project/netaddr/ | Author: David P. D. Moss, Stefan Nordhausen et al | 2008 |
Gemfile (27 lines removed)
@@ -1,27 +0,0 @@
-# This file is only used for running the test suite with kitchen-salt.
-
-source 'https://rubygems.org'
-
-gem 'test-kitchen', '>=2.11.1'
-gem 'kitchen-salt', :git => 'https://github.com/saltstack/kitchen-salt.git'
-gem 'kitchen-sync'
-gem 'git'
-
-group :docker do
-  gem 'kitchen-docker', :git => 'https://github.com/test-kitchen/kitchen-docker.git', :branch => 'main'
-end
-
-group :windows do
-  gem 'winrm', '~>2.0'
-  # gem 'winrm-fs', '~>1.3.1'
-  gem 'winrm-fs', :git => 'https://github.com/s0undt3ch/winrm-fs.git', :branch => 'hotfix/saltstack-ci'
-end
-
-group :ec2 do
-  gem 'kitchen-ec2', '>=3.8'
-end
-
-group :vagrant do
-  gem 'vagrant-wrapper'
-  gem 'kitchen-vagrant'
-end
README.rst (58 lines changed)
@@ -6,17 +6,9 @@
    :alt: PyPi Package Downloads
    :target: https://pypi.org/project/salt

-.. image:: https://img.shields.io/lgtm/grade/python/github/saltstack/salt
-   :alt: PyPi Package Downloads
-   :target: https://lgtm.com/projects/g/saltstack/salt/context:python
+.. image:: https://img.shields.io/badge/discord-SaltProject-blue.svg?logo=discord
+   :alt: Salt Project Discord Community
+   :target: https://discord.com/invite/J7b7EscrAs

-.. image:: https://img.shields.io/badge/slack-SaltProject-blue.svg?logo=slack
-   :alt: Salt Project Slack Community
-   :target: https://via.vmw.com/salt-slack
-
-.. image:: https://img.shields.io/twitch/status/saltprojectoss
-   :alt: Salt Project Twitch Channel
-   :target: https://www.twitch.tv/saltprojectoss
-
 .. image:: https://img.shields.io/reddit/subreddit-subscribers/saltstack?style=social
    :alt: Salt Project subreddit
@@ -71,20 +63,21 @@ In addition to configuration management Salt can also:

 About our sponsors
 ==================
-Salt powers VMware's `VMware Aria Automation Config`_
-(previously vRealize Automation SaltStack Config / SaltStack Enterprise), and can be found
+Salt powers VMware by Broadcom's `Tanzu Salt`_
+(previously Aria Automation Config / vRealize Automation SaltStack Config / SaltStack Enterprise), and can be found
 under the hood of products from Juniper, Cisco, Cloudflare, Nutanix, SUSE, and
 Tieto, to name a few.

-The original sponsor of our community, SaltStack, was `acquired by VMware in 2020 <https://www.vmware.com/company/acquisitions/saltstack.html>`_.
-The Salt Project remains an open source ecosystem that VMware supports and
-contributes to. VMware ensures the code integrity and quality of the Salt
+The original sponsor of our community, SaltStack, was acquired by VMware in 2020.
+`VMware was later acquired by Broadcom in 2023 <https://investors.broadcom.com/news-releases/news-release-details/broadcom-completes-acquisition-vmware>`__.
+The Salt Project remains an open source ecosystem that Broadcom supports and
+contributes to. Broadcom ensures the code integrity and quality of the Salt
 modules by acting as the official sponsor and manager of the Salt project. Many
-of the core Salt Project contributors are also VMware employees. This team
+of the core Salt Project contributors are also Broadcom employees. This team
 carefully reviews and enhances the Salt modules to ensure speed, quality, and
 security.


 Download and install Salt
 =========================
 Salt is tested and packaged to run on CentOS, Debian, RHEL, Ubuntu, MacOS,
@@ -93,9 +86,11 @@ Windows, and more. Download Salt and get started now. See
 for more information.

 To download and install Salt, see:
-* `The Salt install guide <https://docs.saltproject.io/salt/install-guide/en/latest/index.html>`_
-* `Salt Project repository <https://repo.saltproject.io/>`_

+* `The Salt install guide <https://docs.saltproject.io/salt/install-guide/en/latest/index.html>`_
+* `Salt Project Repository: Linux (RPM) <https://packages.broadcom.com/artifactory/saltproject-rpm>`__ - Where Salt ``rpm`` packages are officially stored and distributed.
+* `Salt Project Repository: Linux (DEB) <https://packages.broadcom.com/artifactory/saltproject-deb>`__ - Where Salt ``deb`` packages are officially stored and distributed.
+* `Salt Project Repository: GENERIC <https://packages.broadcom.com/artifactory/saltproject-generic>`__ - Where Salt Windows, macOS, etc. (non-rpm, non-deb) packages are officially stored and distributed.

 Technical support
 =================
@@ -103,7 +98,8 @@ Report bugs or problems using Salt by opening an issue: `<https://github.com/sal

 To join our community forum where you can exchange ideas, best practices,
 discuss technical support questions, and talk to project maintainers, join our
-Slack workspace: `Salt Project Community Slack`_
+Discord server: `Salt Project Community Discord`_



 Salt Project documentation
@@ -127,7 +123,7 @@ announcements.

 Other channels to receive security announcements include the
 `Salt Community mailing list <https://groups.google.com/forum/#!forum/salt-users>`_
-and the `Salt Project Community Slack`_.
+and the `Salt Project Community Discord`_.


 Responsibly reporting security vulnerabilities
@@ -152,11 +148,9 @@ Please be sure to review our
 `Code of Conduct <https://github.com/saltstack/salt/blob/master/CODE_OF_CONDUCT.md>`_.
 Also, check out some of our community resources including:

-* `Salt Project Community Wiki <https://github.com/saltstack/community/wiki>`_
-* `Salt Project Community Slack`_
-* `Salt Project: IRC on LiberaChat <https://web.libera.chat/#salt>`_
+* `Salt Project Community Discord`_
 * `Salt Project YouTube channel <https://www.youtube.com/channel/UCpveTIucFx9ljGelW63-BWg>`_
-* `Salt Project Twitch channel <https://www.twitch.tv/saltprojectoss>`_
+* `Salt Project Community Notes and Wiki <https://github.com/saltstack/community/>`_

 There are lots of ways to get involved in our community. Every month, there are
 around a dozen opportunities to meet with other contributors and the Salt Core
@@ -164,10 +158,8 @@ team and collaborate in real time. The best way to keep track is by subscribing
 to the **Salt Project Community Events Calendar** on the main
 `<https://saltproject.io>`_ website.

-If you have additional questions, email us at saltproject@vmware.com or reach out
-directly to the Community Manager, Jimmy Chunga via Slack. We'd be glad to
-have you join our community!
+If you have additional questions, email us at saltproject.pdl@broadcom.com or reach out
+directly to the Community Discord. We'd be glad to have you join our community!


 License
 =======
@@ -180,10 +172,8 @@ used by external modules.
 A complete list of attributions and dependencies can be found here:
 `salt/DEPENDENCIES.md <https://github.com/saltstack/salt/blob/master/DEPENDENCIES.md>`_

-.. _Salt Project Community Slack: https://via.vmw.com/salt-slack
-.. _VMware Aria Automation Config: https://www.vmware.com/products/vrealize-automation/saltstack-config.html
+.. _Salt Project Community Discord: https://discord.com/invite/J7b7EscrAs
+.. _Tanzu Salt: https://www.vmware.com/products/app-platform/tanzu-salt
 .. _Latest Salt Documentation: https://docs.saltproject.io/en/latest/
 .. _Open an issue: https://github.com/saltstack/salt/issues/new/choose
 .. _SECURITY.md: https://github.com/saltstack/salt/blob/master/SECURITY.md
-.. _Calendar html: https://outlook.office365.com/owa/calendar/105f69bacd4541baa849529aed37eb2d@vmware.com/434ec2155b2b4cce90144c87f0dd03d56626754050155294962/calendar.html
-.. _Calendar ics: https://outlook.office365.com/owa/calendar/105f69bacd4541baa849529aed37eb2d@vmware.com/434ec2155b2b4cce90144c87f0dd03d56626754050155294962/calendar.ics
SECURITY.md (114 lines changed)
@@ -4,77 +4,65 @@
 - saltproject-security.pdl@broadcom.com

 **GPG key ID:**
-- 4EA0793D
+- 37654A06

 **GPG key fingerprint:**
-- `8ABE 4EFC F0F4 B24B FF2A AF90 D570 F2D3 4EA0 793D`
+- `99EF 26F2 6469 2D24 973A 7007 E8BF 76A7 3765 4A06`

 **GPG Public Key**

 ```
 -----BEGIN PGP PUBLIC KEY BLOCK-----

-mQINBFO15mMBEADa3CfQwk5ED9wAQ8fFDku277CegG3U1hVGdcxqKNvucblwoKCb
-hRK6u9ihgaO9V9duV2glwgjytiBI/z6lyWqdaD37YXG/gTL+9Md+qdSDeaOa/9eg
-7y+g4P+FvU9HWUlujRVlofUn5Dj/IZgUywbxwEybutuzvvFVTzsn+DFVwTH34Qoh
-QIuNzQCSEz3Lhh8zq9LqkNy91ZZQO1ZIUrypafspH6GBHHcE8msBFgYiNBnVcUFH
-u0r4j1Rav+621EtD5GZsOt05+NJI8pkaC/dDKjURcuiV6bhmeSpNzLaXUhwx6f29
-Vhag5JhVGGNQxlRTxNEM86HEFp+4zJQ8m/wRDrGX5IAHsdESdhP+ljDVlAAX/ttP
-/Ucl2fgpTnDKVHOA00E515Q87ZHv6awJ3GL1veqi8zfsLaag7rw1TuuHyGLOPkDt
-t5PAjsS9R3KI7pGnhqI6bTOi591odUdgzUhZChWUUX1VStiIDi2jCvyoOOLMOGS5
-AEYXuWYP7KgujZCDRaTNqRDdgPd93Mh9JI8UmkzXDUgijdzVpzPjYgFaWtyK8lsc
-Fizqe3/Yzf9RCVX/lmRbiEH+ql/zSxcWlBQd17PKaL+TisQFXcmQzccYgAxFbj2r
-QHp5ABEu9YjFme2Jzun7Mv9V4qo3JF5dmnUk31yupZeAOGZkirIsaWC3hwARAQAB
-tDBTYWx0U3RhY2sgU2VjdXJpdHkgVGVhbSA8c2VjdXJpdHlAc2FsdHN0YWNrLmNv
-bT6JAj4EEwECACgFAlO15mMCGwMFCQeGH4AGCwkIBwMCBhUIAgkKCwQWAgMBAh4B
-AheAAAoJENVw8tNOoHk9z/MP/2vzY27fmVxU5X8joiiturjlgEqQw41IYEmWv1Bw
-4WVXYCHP1yu/1MC1uuvOmOd5BlI8YO2C2oyW7d1B0NorguPtz55b7jabCElekVCh
-h/H4ZVThiwqgPpthRv/2npXjIm7SLSs/kuaXo6Qy2JpszwDVFw+xCRVL0tH9KJxz
-HuNBeVq7abWD5fzIWkmGM9hicG/R2D0RIlco1Q0VNKy8klG+pOFOW886KnwkSPc7
-JUYp1oUlHsSlhTmkLEG54cyVzrTP/XuZuyMTdtyTc3mfgW0adneAL6MARtC5UB/h
-q+v9dqMf4iD3wY6ctu8KWE8Vo5MUEsNNO9EA2dUR88LwFZ3ZnnXdQkizgR/Aa515
-dm17vlNkSoomYCo84eN7GOTfxWcq+iXYSWcKWT4X+h/ra+LmNndQWQBRebVUtbKE
-ZDwKmiQz/5LY5EhlWcuU4lVmMSFpWXt5FR/PtzgTdZAo9QKkBjcv97LYbXvsPI69
-El1BLAg+m+1UpE1L7zJT1il6PqVyEFAWBxW46wXCCkGssFsvz2yRp0PDX8A6u4yq
-rTkt09uYht1is61joLDJ/kq3+6k8gJWkDOW+2NMrmf+/qcdYCMYXmrtOpg/wF27W
-GMNAkbdyzgeX/MbUBCGCMdzhevRuivOI5bu4vT5s3KdshG+yhzV45bapKRd5VN+1
-mZRqiQJVBBMBCAA/AhsDBgsJCAcDAgYVCAIJCgsEFgIDAQIeAQIXgBYhBIq+Tvzw
-9LJL/yqvkNVw8tNOoHk9BQJe1uRXBQkPoTz0AAoJENVw8tNOoHk9akAQANKIDIBY
-J3DmWH3g6rWURdREQcBVfMkw6j5MHlIEwlGrN3whSaPv2KR3tatRccBCQ0olQeYb
-ZeFtPuf0Du+LqGaAePo5DkPNU7GHoba2+ZE/sJ4wZ4CzAQM6+LvH2iLHeLZ1VLlu
-ZEftxD1RFKTqpnav8KiyYGkeFuEn4eMSIhbudp/8wkN40sCWL22D141EhVSRvLlO
-BMUpTWdtSYTg0F2pgQL5U2A56syuiwUwPXzQb45JEJILmG8zkeJB9s8kGtErypIH
-P+qxJXq24woGUFeJjiLdiOhI6/YoVBACUkKmig36CGf/DH5NAeQECeZq3YBNp7XK
-tsF1dPitxuTM/UkOHoHUnGhDlBcQMWe9WuBK4rA+7GH9NT8o7M6+2OKhk181tJ+s
-Y2kP7RSXOV162thRsNvVImXajAIFTR3ksEDFGVq/4jh85jFoIbNH3x27NxOu6e2p
-OIkXNXmSFXLUmwbfEfIk06gqP3xzkaj+eWHcLDkn9bUKblBJhHdhf9Vsy/N2NRW2
-23c64qDutw1NX7msDuN3KXisim+isBzPVVzymkkhkXK+UpjrRR0ePvph3fnGf1bc
-NipVtn1KKM7kurSrSjFVLwLi52SGnEHKJnbbhh+AKV09SNYi6IaKL8yw8c1d0K80
-PlBaJEvkC6myzaaRtYcna4pbiIysBaZtwDOOuQINBFO15mMBEAC5UuLii9ZLz6qH
-fIJp35IOW9U8SOf7QFhzXR7NZ3DmJsd3f6Nb/habQFIHjm3K9wbpj+FvaW2oWRlF
-VvYdzjUq6c82GUUjW1dnqgUvFwdmM8351n0YQ2TonmyaF882RvsRZrbJ65uvy7SQ
-xlouXaAYOdqwLsPxBEOyOnMPSktW5V2UIWyxsNP3sADchWIGq9p5D3Y/loyIMsS1
-dj+TjoQZOKSj7CuRT98+8yhGAY8YBEXu9r3I9o6mDkuPpAljuMc8r09Im6az2egt
-K/szKt4Hy1bpSSBZU4W/XR7XwQNywmb3wxjmYT6Od3Mwj0jtzc3gQiH8hcEy3+BO
-+NNmyzFVyIwOLziwjmEcw62S57wYKUVnHD2nglMsQa8Ve0e6ABBMEY7zGEGStva5
-9rfgeh0jUMJiccGiUDTMs0tdkC6knYKbu/fdRqNYFoNuDcSeLEw4DdCuP01l2W4y
-Y+fiK6hAcL25amjzc+yYo9eaaqTn6RATbzdhHQZdpAMxY+vNT0+NhP1Zo5gYBMR6
-5Zp/VhFsf67ijb03FUtdw9N8dHwiR2m8vVA8kO/gCD6wS2p9RdXqrJ9JhnHYWjiV
-uXR+f755ZAndyQfRtowMdQIoiXuJEXYw6XN+/BX81gJaynJYc0uw0MnxWQX+A5m8
-HqEsbIFUXBYXPgbwXTm7c4IHGgXXdwARAQABiQI8BBgBCAAmAhsMFiEEir5O/PD0
-skv/Kq+Q1XDy006geT0FAl7W5K0FCQ+hPUoACgkQ1XDy006geT1Q0Q//atnw1D4J
-13nL8Mygk+ANY4Xljub/TeZqKtzmnWGso843XysErLH1adCu1KDX1Dj4/o3WoPOt
-0O78uSS81N428ocOPKx+fA63n7q1mRqHHy6pLLVKoT66tmvE1ZN0ObaiPK9IxZkB
-ThGlHJk9VaUg0vzAaRznogWeBh1dyZktVrtbUO5u4xDX9iql/unVmCWm+U1R7t4q
-fqPEbk8ZnWc7x4bAZf8/vSQ93mAbpnRRuJdDK9tsiuhl8pRz7OyzvMS81rVF75ja
-7CcShPofrW4yZ7FqAUMwTbfrvsAraWmDjW17Ao7C2dUA9ViwSKJ6u6Pd5no/hwbm
-jVoxtO2RvjGOBxKneD36uENAUMBExjDTkSHmOxUYSknrEKUy7P1OL2ZHLG8/rouN
-5ZvIxHiMkz12ukSt29IHvCngn1UB4/7+tvDHqug4ZAZPuwH7TC5Hk6WO0OoK8Eb2
-sQa2QoehQjwK0IakGd5kFEqKgbrwYPPa3my7l58nOZmPHdMcTOzgKvUEYAITjsT4
-oOtocs9Nj+cfCfp6YUn6JeYfiHs+Xhze5igdWIl0ZO5rTmbqcD8A1URKBds0WA+G
-FLP9shPC0rS/L3Y1fKhqAc0h+znWBU6xjipTkmzh3FdM8gGT6g9YwGQNbi/x47k5
-vtBIWO4LPeGEvb2Gs65PL2eouOqU6yvBr5Y=
-=F/97
+mQINBGZpxDsBEACz8yoRBXaJiifaWz3wd4FLSO18mgH7H/+0iNTbV1ZwhgGEtWTF
+Z31HfrsbxVgICoMgFYt8WKnc4MHZLIgDfTuCFQpf7PV/VqRBAknZwQKEAjHfrYNz
+Q1vy3CeKC1qcKQISEQr7VFf58sOC8GJ54jLLc2rCsg9cXI6yvUFtGwL9Qv7g/NZn
+rtLjc4NZIKdIvSt+/PtooQtsz0jfLMdMpMFa41keH3MknIbydBUnGj7eC8ANN/iD
+Re2QHAW2KfQh3Ocuh/DpJ0/dwbzXmXfMWHk30E+s31TfdLiFt1Iz5kZDF8iHrDMq
+x39/GGmF10y5rfq43V1Ucxm+1tl5Km0JcX6GpPUtgRpfUYAxwxfGfezt4PjYRYH2
+mNxXXPLsnVTvdWPTvS0msSrcTHmnU5His38I6goXI7dLZm0saqoWi3sqEQ8TPS6/
+DkLtYjpb/+dql+KrXD7erd3j8KKflIXn7AEsv+luNk6czGOKgdG9agkklzOHfEPc
+xOGmaFfe/1mu8HxgaCuhNAQWlk79ZC+GAm0sBZIQAQRtABgag5vWr16hVix7BPMG
+Fp8+caOVv6qfQ7gBmJ3/aso6OzyOxsluVxQRt94EjPTm0xuwb1aYNJOhEj9cPkjQ
+XBjo3KN0rwcAViR/fdUzrIV1sn2hms0v5WZ+TDtz1w0OpLZOwe23BDE1+QARAQAB
+tEJTYWx0IFByb2plY3QgU2VjdXJpdHkgVGVhbSA8c2FsdHByb2plY3Qtc2VjdXJp
+dHkucGRsQGJyb2FkY29tLmNvbT6JAlcEEwEKAEEWIQSZ7ybyZGktJJc6cAfov3an
+N2VKBgUCZmnEOwIbAwUJB4TOAAULCQgHAgIiAgYVCgkICwIEFgIDAQIeBwIXgAAK
+CRDov3anN2VKBk7rD/9QdcYdNGfk96W906HlVpb3JCwT0t9T7ElP97Ot0YN6LqMj
+vVQpxWYi7riUSyt1FtlCAM+hmghImzILF9LKDRCZ1H5UStI/u9T53cZpUZtVW/8R
+bUNBCl495UcgioIZG5DsfZ/GdBOgY+hQfdgh7HC8a8A/owCt2hHbnth970NQ+LHb
+/0ERLfOHRxozgPBhze8Vqf939KlteM5ljgTw/IkJJIsxJi4C6pQntSHvB3/Bq/Nw
+Kf3vk3XYFtVibeQODSVvc6useo+SNGV/wsK/6kvh/vfP9Trv/GMOn/89Bj2aL1PR
+M382E6sDB9d22p4ehVgbcOpkwHtr9DGerK9xzfG4aUjLu9qVD5Ep3gqKSsCe+P8z
+bpADdVCnk+Vdp3Bi+KI7buSkqfbZ0m9vCY3ei1fMiDiTTjvNliL5QCO6PvYNYiDw
++LLImrQThv55ZRQsRRT7J6A94kwDoI6zcBEalv/aPws0nQHJtgWRUpmy5RcbVu9Z
+QBXlUpCzCB+gGaGRE1u0hCfuvkbcG1pXFFBdSUuAK4o4ktiRALVUndELic/PU1nR
+jwo/+j0SGw/jTwqVChUfLDZbiAQ2JICoVpZ+e1zQfsxa/yDu2e4D543SvNFHDsxh
+bsBeCsopzJSA0n2HAdYvPxOPoWVvZv+U8ZV3EEVOUgsO5//cRJddCgLU89Q4DrkC
+DQRmacQ7ARAAsz8jnpfw3DCRxdCVGiqWAtgj8r2gx5n1wJsKsgvyGQdKUtPwlX04
+7w13lIDT2DwoXFozquYsTn9XkIoWbVckqo0NN/V7/QxIZIYTqRcFXouHTbXDJm5C
+tsvfDlnTsaplyRawPU2mhYg39/lzIt8zIjvy5zo/pElkRP5m03nG+ItrsHN6CCvf
+ZiRxme6EQdn+aoHh2GtICL8+c3HvQzTHYKxFn84Ibt3uNxwt+Mu6YhG9tkYMQQk5
+SkYA4CYAaw2Lc/g0ee36iqw/5d79M8YcQtHhy5zzqgdEvExjFPdowV1hhFIEkNkM
+uqIAknXVesqLLw2hPeYmyhYQqeBKIrWmBhBKX9c0vMYkDDH3T/sSylVhH0QAXP6E
+WmLja3E1ov6pt6j7j/wWzC9LSMFDJI2yWCeOE1oea5D89tH6XvsGRTiog62zF/9a
+77197iIa0+o91chp4iLkzDvuK8pVujPx8bNsK8jlJ+OW73NmliCVg+hecoFLNsri
+/TsBngFNVcu79Q1XfyvoDdR2C09ItCBEZGt6LOlq/+ATUw1aBz6L1hvLBtiR3Hfu
+X31YlbxdvVPjlzg6O6GXSfnokNTWv2mVXWTRIrP0RrKvMyiNPXVW7EunUuXI0Axk
+Xg3E5kAjKXkBXzoCTCVz/sXPLjvjI0x3Z7obgPpcTi9h5DIX6PFyK/kAEQEAAYkC
+PAQYAQoAJhYhBJnvJvJkaS0klzpwB+i/dqc3ZUoGBQJmacQ7AhsMBQkHhM4AAAoJ
+EOi/dqc3ZUoGDeAQAKbyiHA1sl0fnvcZxoZ3mWA/Qesddp7Nv2aEW8I3hAJoTVml
+ZvMxk8leZgsQJtSsVDNnxeyW+WCIUkhxmd95UlkTTj5mpyci1YrxAltPJ2TWioLe
+F2doP8Y+4iGnaV+ApzWG33sLr95z37RKVdMuGk/O5nLMeWnSPA7HHWJCxECMm0SH
+uI8aby8w2aBZ1kOMFB/ToEEzLBu9fk+zCzG3uH8QhdciMENVhsyBSULIrmwKglyI
+VQwj2dXHyekQh7QEHV+CdKMfs3ZOANwm52OwjaK0dVb3IMFGvlUf4UXXfcXwLAkj
+vW+Ju4kLGxVQpOlh1EBain9WOaHZGh6EGuTpjJO32PyRq8iSMNb8coeonoPFWrE/
+A5dy3z5x5CZhJ6kyNwYs/9951r30Ct9qNZo9WZwp8AGQVs+J9XEYnZIWXnO1hdKs
+dRStPvY7VqS500t8eWqWRfCLgofZAb9Fv7SwTPQ2G7bOuTXmQKAIEkU9vzo5XACu
+AtR/9bC9ghNnlNuH4xiViBclrq2dif/I2ZwItpQHjuCDeMKz9kdADRI0tuNPpRHe
+QP1YpURW+I+PYZzNgbnwzl6Bxo7jCHFgG6BQ0ih5sVwEDhlXjSejd8CNMYEy3ElL
+xJLUpltwXLZSrJEXYjtJtnh0om71NXes0OyWE1cL4+U6WA9Hho6xedjk2bai
+=pPmt
 -----END PGP PUBLIC KEY BLOCK-----
 ```
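For anyone preparing an encrypted vulnerability report to the address above, the following is a hedged sketch of checking that the imported key matches the published fingerprint. It assumes the new public key block has been saved locally as ``salt-security.asc`` (a hypothetical filename) and that the third-party ``python-gnupg`` package is installed; it is an illustration, not an officially documented procedure.

```python
# Sketch only: assumes `pip install python-gnupg` and that the public key
# block published above was saved locally as salt-security.asc.
import gnupg

gpg = gnupg.GPG()
with open("salt-security.asc", encoding="utf-8") as fh:
    result = gpg.import_keys(fh.read())

# The imported fingerprint should match the one published in SECURITY.md.
expected = "99EF26F264692D24973A7007E8BF76A737654A06"
imported = {fp.replace(" ", "") for fp in result.fingerprints}
print("key fingerprint matches:", expected in imported)
```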
SUPPORT.rst (27 lines changed)
@@ -1,17 +1,10 @@
-Get SaltStack Support and Help
-==============================
+Get Salt Project Support and Help
+=================================

-**IRC Chat** - Join the vibrant, helpful and positive SaltStack chat room in
-LiberaChat at #salt. There is no need to introduce yourself, or ask permission
-to join in, just help and be helped! Make sure to wait for an answer, sometimes
-it may take a few moments for someone to reply.
+**Salt Project Discord** - Join the Salt Project Community Discord!
+Use the following link to join the Discord server:

-`<https://web.libera.chat/#salt>`_
+`<https://discord.com/invite/J7b7EscrAs>`_

-**SaltStack Slack** - Alongside IRC is our SaltStack Community Slack for the
-SaltStack Working groups. Use the following link to request an invitation.
-
-`<https://via.vmw.com/salt-slack>`_
-
 **Mailing List** - The SaltStack community users mailing list is hosted by
 Google groups. Anyone can post to ask questions about SaltStack products and
@@ -20,13 +13,13 @@ anyone can help answer. Join the conversation!
 `<https://groups.google.com/forum/#!forum/salt-users>`_

 You may subscribe to the list without a Google account by emailing
-salt-users+subscribe@googlegroups.com and you may post to the list by emailing
-salt-users@googlegroups.com
+``salt-users+subscribe@googlegroups.com`` and you may post to the list by emailing
+``salt-users@googlegroups.com``

 **Reporting Issues** - To report an issue with Salt, please follow the
 guidelines for filing bug reports:
 `<https://docs.saltproject.io/en/master/topics/development/reporting_bugs.html>`_

-**SaltStack Support** - If you need dedicated, prioritized support, please
-consider a SaltStack Support package that fits your needs:
-`<http://www.saltstack.com/support>`_
+**Salt Project Support** - If you need dedicated, prioritized support, please
+consider taking a look at the Enterprise product:
+`Tanzu Salt <https://www.vmware.com/products/app-platform/tanzu-salt>`__
changelog/33669.added.md (new file, 3 lines)
@@ -0,0 +1,3 @@
+Issue #33669: Fixes an issue with the ``ini_managed`` execution module
+where it would always wrap the separator with spaces. Adds a new parameter
+named ``no_spaces`` that will not wrap the separator with spaces.
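To make the behaviour change concrete, here is a minimal, hypothetical sketch (not Salt's actual implementation; the helper name ``format_option`` is invented for illustration) of what the new ``no_spaces`` flag is meant to control when an option is written back to an INI file:

```python
def format_option(key, value, separator="=", no_spaces=False):
    """Render one INI option line.

    Previously the separator was always padded ("key = value"); with
    no_spaces=True it is written without padding ("key=value").
    """
    sep = separator if no_spaces else " {} ".format(separator.strip())
    return "{}{}{}".format(key, sep, value)


print(format_option("enabled", "1"))                  # enabled = 1
print(format_option("enabled", "1", no_spaces=True))  # enabled=1
```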
changelog/41794.fixed.md (new file, 1 line)
@@ -0,0 +1 @@
+Fixed `salt.*.get` shorthand via Salt-SSH
changelog/44736.fixed.md (new file, 2 lines)
@@ -0,0 +1,2 @@
+Commands on Windows are now prefixed with ``cmd /c`` so that compound
+commands (commands separated by ``&&``) run properly when using ``runas``
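As a rough illustration of why the prefix matters (a simplified sketch, not the actual code path in Salt; ``build_runas_command`` is a made-up helper): without ``cmd /c``, the ``&&`` operator is handed to the first executable as a literal argument instead of being interpreted by the Windows shell.

```python
import sys


def build_runas_command(cmd, runas=None):
    """Sketch of the fix: wrap the command in ``cmd /c`` on Windows so the
    shell interprets operators such as ``&&`` before the command is handed
    to the runas machinery."""
    if sys.platform == "win32" and runas:
        return 'cmd /c "{}"'.format(cmd)
    return cmd


print(build_runas_command("whoami && hostname", runas="Administrator"))
# On Windows this prints: cmd /c "whoami && hostname"
```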
changelog/47154.fixed.md (new file, 1 line)
@@ -0,0 +1 @@
+Fixed erroneous recursive requisite error when a prereq is used in combination with onchanges_any.
changelog/58931.added.md (new file, 1 line)
@@ -0,0 +1 @@
+Added metalink to mod_repo in yumpkg and documented in pkgrepo state
Some files were not shown because too many files have changed in this diff.