SourceUrls

TL;DR: URLs in the Source tags are a first step towards automatically verifying the authenticity of sources used for building in the Open Build Service and openSUSE Factory. GPG signature verification is the next possible step. More elaborate verification mechanisms are needed when tarballs are recompressed.

While we cannot enforce the use of URLs in every case, whenever a proper source URL is available, it must be used in the Source and Patch tags of the .spec file.

Overview

The Source and Patch tags within .spec files can contain not just local filenames but also full URLs. However, rpmbuild does not download the files automatically; instead, it expects a local file whose name matches the last component of the Source URL.
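
For illustration, a spec file might reference its sources like this (package name, version, URL and patch name are made up for this example); rpmbuild would then look for foo-1.0.tar.xz and fix-build.patch in the package directory:

Name:           foo
Version:        1.0
Source0:        https://example.org/releases/foo-%{version}.tar.xz
Patch0:         fix-build.patch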

Automated integrity checking

Specifying the upstream locations makes it possible to introduce more automation, and especially verification, into the openSUSE project's package handling in the future.

Archives that carry a GPG signature can already be verified by way of gpg-offline (available since openSUSE 12.3), and have also been verified by the source_validator service for openSUSE Factory since mid-2013. Where no signature is provided, we want at least a rudimentary check (e.g. comparing SHA checksums) for equality with upstream, so that a potential security compromise on either the upstream or the openSUSE side can be detected.
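
A manual check of this kind could look as follows (file names are illustrative; the upstream signing key must already be in the local keyring):

# Verify the detached GPG signature shipped by upstream
gpg --verify foo-1.0.tar.xz.asc foo-1.0.tar.xz
# Without a signature, at least compare the checksum with the one published by upstream
sha256sum foo-1.0.tar.xz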

When to use them

The short answer is: always, if available.

Exceptions can be made for:

  • generated tarballs from obs_scm/tar_scm or similar services
  • non-existent download hosts, e.g. when the project's host went away. The question is whether we should/could reference file archives from other distributions in that case. In any case, we want to keep the old download URL as a comment.
  • upstream modifies released tarballs in place. This is something that should be clarified with upstream and documented in the spec file; we are not the only distribution that does source verification.
  • patches found elsewhere that had to be adapted to our code version. The URL should still be provided in the patch comment, as required by our patch guidelines.

Semi-automatic updates

To update a package that uses source URLs, you just need to increase the version, remove the old tarballs, then run osc service localrun, osc addremove, osc vc, and osc ci, and you are almost done.
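
Sketched as a shell session (the tarball name is illustrative, and it is assumed the Source tag carries the full upstream URL):

# after bumping the Version: tag in the .spec file
rm foo-1.0.tar.xz        # drop the old tarball
osc service localrun     # run the configured source services, e.g. to fetch the new tarball
osc addremove            # record added and removed files
osc vc                   # write the changelog entry
osc ci                   # commit the update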

Issues with recompression

openSUSE Factory development very strongly recommends using the upstream-provided tarballs unmodified, to allow for automated verification mechanisms and to increase packaging efficiency.

Certain packages are built from SCM snapshots instead of released tarballs and thus may not have a common downloadable HTTP URI for the archive (although some web frontends do offer tarball/zip downloads). There are also maintainers who wish to recompress tarballs that use a poor algorithm with something better for space conservation, although in times of fast Internet connections, size differences are usually a non-issue. Recompression makes tarball verification difficult, if not impossible.

Because many released files use a stream compression that is independent of, and applied on top of, the actual file container (such as .tar.gz, .tar.bz2, etc.), a validator can still check the authenticity of an archive by looking at the checksum of the uncompressed objects. This still allows flagging archives whose underlying tar archive was re-created the wrong way, without penalizing maintainers who merely changed the on-top compression; see the example after the list below.

  • WRONG: tar -xf foo.tar.bz2; tar -cJf foo.tar.xz foo/;
  • Better: bzip2 -d foo.tar.bz2; xz foo.tar;
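
A validator can exploit this by comparing the decompressed streams directly. As a minimal sketch (file names are illustrative), both commands must print the same checksum if only the on-top compression was changed:

bzcat foo.tar.bz2 | sha256sum
xzcat foo.tar.xz | sha256sum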

In fact, if you are the upstream maintainer of some software and sign your archives with a key, consider signing the .tar archive (producing a .tar.sign/.tar.asc) rather than a particular compression variant (.tar.gz.sig), as the Linux kernel project already does. This makes it possible to verify the archive's authenticity irrespective of the chosen compression algorithm(s).
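
With such a scheme, verification works identically regardless of which compression variant was downloaded; sketched here following the kernel's naming convention (the version is illustrative):

# decompress to recover the signed .tar, then check the detached signature
xz -d linux-4.9.tar.xz
gpg --verify linux-4.9.tar.sign linux-4.9.tar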

The URL to the original archive can easily be supplied in an extra line in the .spec file, which is in fact what has been done in a handful of packages, where a #DL-URL: line is present. Alternative ways of storing the URI to the original are conceivable; this is up to the implementers of the final solution. As such, there is no justification for requiring URLs in the Source tag.
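
Such a package could then look like this (names are illustrative; only the #DL-URL: comment convention is taken from existing packages):

# recompressed from the upstream .tar.bz2, see #DL-URL for the original
#DL-URL: https://example.org/releases/foo-1.0.tar.bz2
Source0:        foo-1.0.tar.xz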

A potential way of pulling all of this together is via Source Services; a _service file like

<services>
        <service name="download_url" mode="localonly">
                <param name="protocol">ftp</param>
                <param name="host">ftp.gap-system.org</param>
                <param name="path">/pub/gap/gap4/tar.bz2/packages/Alnuth-3.0.0.tar.bz2</param>
        </service>
        <service name="recompress" mode="localonly">
                <param name="file">*.tar</param>
                <param name="compression">none</param>
        </service>
        <service name="verify_file" mode="localonly">
                <param name="file">_service:recompress:Alnuth-3.0.0.tar</param>
                <param name="verifier">sha256</param>
                <param name="checksum">7203be33535135af16ba1a1479b057ec5fe4048c628d6d9bd2926824a017b477</param>
        </service>
        <service name="recompress" mode="localonly">
                <param name="file">*.tar</param>
                <param name="compression">xz</param>
        </service>
</services>

will do this. However, the drawback is that this would cause osc to download the files every time someone attempts a build, which increases build time and puts additional load on the origin servers (and, where applicable, their bills). In addition, Factory does not permit server-side service runs, necessitating mode="buildtime", mode="localonly" or mode="disabled", which renders the validation quite moot.

More planning is therefore needed.
