Re: Modulesync Config 4.0.0

Hi people,

Thanks for all the awesome work, Ewoud!

Tiny update here: you might see failing tests on Ubuntu 20.04 with
Puppet 5. That's because Puppet does not provide packages for this
distro. The same applies to Fedora 33 on Puppet 5/6 and Fedora 32 on
Puppet 5, but we rarely have those in our metadata.json.

We already merged a few PRs where *only* acceptance tests fail for the
above-mentioned platforms. IMO it's fine to continue like this until
Puppet provides packages or we figure out how to exclude those OS/Puppet
combinations from our test matrix.

cheers, Tim

On 26.11.20 13:37, Ewoud Kohl van Wijngaarden wrote:
Hello everyone,

You may have seen a wave of activity yesterday. Bastelfreak and I
released our msync config 4.0.0.

The major change was the move from Travis CI to GitHub Actions. In doing
so, the test matrix has become dynamic: metadata.json drives it. If you
list Puppet >= 5.5.8 < 7.0.0, it will run the unit tests on Puppet 5 & 6.
To add Puppet 7 support, just update metadata.json and it'll run those
tests too. Puppet 5 can be dropped in the same way. We found some modules
which didn't declare any supported Puppet version, or still only listed
Puppet 4. Most of these have been cleaned up.
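
For anyone who hasn't touched this before: the Puppet version range lives
in the requirements section of metadata.json. A minimal sketch (the upper
bound here is only an example of what adding Puppet 7 support could look
like):

    {
      "requirements": [
        {
          "name": "puppet",
          "version_requirement": ">= 5.5.8 < 8.0.0"
        }
      ]
    }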

Similarly, if there are acceptance tests it will run them. That means
that if you list CentOS 6 in metadata.json and there are acceptance
tests, they will run against CentOS 6. We found a few broken modules, and
those have dropped CentOS 6. There are also a number of modules where the
acceptance tests all fail because previously they simply didn't run. A
bunch of those are still open now.
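
The operating systems come from the operatingsystem_support block in
metadata.json. A minimal sketch (the OS list is only an example), where
dropping CentOS 6 simply means removing it from the release list:

    {
      "operatingsystem_support": [
        {
          "operatingsystem": "CentOS",
          "operatingsystemrelease": ["7", "8"]
        },
        {
          "operatingsystem": "Ubuntu",
          "operatingsystemrelease": ["18.04", "20.04"]
        }
      ]
    }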

Now there are 52 open modulesync PRs.

https://github.com/pulls?q=is%3Aopen+is%3Apr+archived%3Afalse+user%3Avoxpupuli+label%3Amodulesync


For each one I left a message explaining why it failed, but I don't have
time to fix all the issues. This is where you come in.

For a few of them, GitHub Actions didn't trigger for some reason. I'll
see if I can work on that.

Then there are modules with problems that need code changes, either in
the modules themselves or in the tests. These have the tests-fail label.

I'd like to ask everyone to look at the modules they care about and fix
them. I'll be happy to advise but often I don't even know the software.

If modules remain broken for a longer time and nobody steps up, I'd
suggest mothballing them. This means removing them from the modulesync
config so they're not holding us back. They should also get an open
issue which describes the state of the module.

Regards,
Ewoud Kohl van Wijngaarden



