Scap3
For information about deploying new software with Scap3 see the Scap3 Deployment Guide.
For information about migrating existing services from Trebuchet to Scap3, see the Scap3 Migration Guide.
Scap3 Documentation
This is a collection of documentation taken from the commit messages of merged commits that were part of the 'scap3' sprint. It is collected here for ease of access and searchability.
The primary documentation for scap is the auto-generated HTML documentation; see the scap documentation index.
Basic Use
The first step to deploying new code from the deployment_host is to use git
to bring the repository on the deployment host into the state that you want deployed to your targets.
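For example, bringing the repository into the desired state might look like the following sketch; the branch name and the presence of submodules are assumptions about your repository, not requirements of scap:
cd /srv/deployment/foo/deploy             # path taken from the session below
git fetch origin
git checkout origin/master                # or whatever commit/branch you want deployed
git submodule update --init --recursive   # only needed if the repository uses submodules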
Once the repository state is correct, use the deploy command to release the finished code: scap deploy -v
deployer@tin:/srv/deployment/foo/deploy$ scap deploy
20:46:12 Started deploy_foo/deploy
Entering 'foo'
20:46:12
== DEFAULT ==
* scap-target-07
* scap-target-08
* scap-target-09
* scap-target-04
* scap-target-05
* scap-target-06
* scap-target-10
* scap-target-01
* scap-target-02
* scap-target-03
deploy_foo/deploy_config_deploy: 100% (ok: 10; fail: 0; left: 0)
deploy_foo/deploy_fetch: 100% (ok: 10; fail: 0; left: 0)
deploy_foo/deploy_promote: 100% (ok: 10; fail: 0; left: 0)
20:46:42 Finished deploy_foo/deploy (duration: 00m 29s)
Service Checks
NRPE checks
The nrpe check type allows reusing existing Icinga/NRPE checks in deployments. By default, checks are loaded and registered from all NRPE definitions in /etc/nagios/nrpe.d and can be referenced in checks.yaml using type: nrpe. For example:
checks:
  service_endpoints:
    type: nrpe
    command: check_service_endpoints
    stage: promote
Config Deployments
What happens on the deployment server
- Looks for an environment-specific ./scap/config-files.yaml with target files in the format (a hypothetical example of these files follows this list):
  /path/to/target:
    template: env-specific-template.yaml.j2
    remote_vars: /optional/remote/variable/file.yaml
- Looks for an environment-specific ./scap/vars.yaml that includes variables used to render the template. These variables will be overridden by any conflicting variables in the file specified by remote_vars.
- Variables from any environment-specific vars.yaml file are combined with variables from the root vars.yaml file.
- A json file is created at [repo]/.git/config-files/[revision].json that contains the final path to any environment-specific templates as well as a final list of combined variables.
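As a concrete illustration of the two files above, here is a minimal, hypothetical layout; the target path, template name, and variables (port, log_level) are invented for the example:
cat > scap/config-files.yaml <<'EOF'
/etc/foo/config.yaml:
  template: config.yaml.j2
  remote_vars: /etc/foo/remote-vars.yaml
EOF
cat > scap/vars.yaml <<'EOF'
port: 8080
log_level: info
EOF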
What happens on targets
- Download the file from [deployment-server]/[repo]/.git/config-files/[revision].json
- Loop through the config files and render each template using the variables from the downloaded json file and the variables from the (now) local remote_vars file (see the rendering sketch after this list)
- Link the rendered file (in [repo]/.git/config-files/[path]) to its final location
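To make the rendering step concrete, here is a hypothetical Jinja2 template matching the example files from the previous section, together with the output it would produce; the template and variable names are assumptions, not part of scap:
cat > scap/config.yaml.j2 <<'EOF'
port: {{ port }}
log_level: {{ log_level }}
EOF
# Rendered with port: 8080 and log_level: info, the target /etc/foo/config.yaml would contain:
#   port: 8080
#   log_level: info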
Grouping and Filtering deployment targets
Filter deploy hosts with --limit-hosts
--limit-hosts or -l accepts a pattern in one of the following formats:
- ~[regex] - if the pattern starts with ~ it is interpreted as a regular expression; this is not combined with any other option.
- ![pattern] - can be combined with any other pattern to negate that pattern. May be combined with other pattern-matching options.
- host[01:10] - range matching; works for ascii chars and numbers, including numbers with a leading 0. May be combined with other pattern-matching options.
- host* - matches 0 or more characters in the set A-z, '_', '.', or '-'. May be combined with other pattern-matching options.
The pattern is applied to the dsh_targets file to return a sub-set of hosts to use as deployment targets; example invocations are sketched below.
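A few illustrative invocations, reusing the scap-target-* host names from the Basic Use example above (the patterns and host names are only examples):
scap deploy -l 'scap-target-[01:05]'    # range match: hosts 01 through 05
scap deploy -l '~scap-target-0[13]'     # regular expression match
scap deploy -l 'scap-target-*'          # wildcard match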
Deployment groups
This feature was introduced in differential revision D16
In addition to the dsh_targets config variable, scap looks for multiple [anything]_dsh_targets config variables. This enables, for example, canary_dsh_targets. All additional deployment groups will be executed before the primary deployment group (defined by the dsh_targets variable).
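A minimal sketch of how the groups might be configured, assuming the ini-style scap.cfg that scap reads from the repository's scap/ directory; the section name and target file names are illustrative assumptions:
cat > scap/scap.cfg <<'EOF'
[global]
# primary deployment group (deployed last)
dsh_targets: targets
# canary group, deployed before the primary group
canary_dsh_targets: targets-canary
EOF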
Additionally, checks now can be scoped to a specific deployment group using:
check_name:
  stage: promote
  group: dsh-group-name
  command: touch /tmp/hi-there
The group name is optional in a check. If no group name is specified, the check runs for all deploy groups.
Structured Logging
Overview
This feature was introduced in differential revision D18
The main deploy application now sends all structured log output to a file under scap/log/{git-tag}.log which the new deploy-log utility can tail and filter using a given free-form expression. By default the latter utility will periodically scan the scap/log directory for new files and immediately begin tailing them. It can also be given an explicit log file to parse via the --file option, or the latest log file by using --latest; in that case, it will simply filter the entire file for matching records and exit.
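For reference, the options described above correspond to invocations like the following; the expressions and file names are placeholders in the same {...} style used in the examples below:
deploy-log '{expr}'                                  # scan scap/log, tail new files, filter by {expr}
deploy-log --latest '{expr}'                         # filter the latest log file and exit
deploy-log --file scap/log/{git-tag}.log '{expr}'    # filter one specific log file and exit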
Examples
Tail behavior
- Run deploy-log {expr}
- Run deploy in a separate terminal
- Verify that deploy-log in the first terminal starts reading the new log file. It should say -- Opening log file: {file}.
- Verify that only log messages matching the given expression are output.
Latest log file behavior
- Run deploy.
- Run deploy-log -l {expr}
- Verify that only log messages from the latest log file matching the given expression are output.
Single log file behavior
- Run deploy a couple of times.
- Run deploy-log -f {log-file} {expr}
- Verify that only log messages from the given log file matching the given expression are output.
Production Upgrade
Building
To prepare a new release of the Debian package for Scap, the Release Engineering Team needs to follow the steps in RELEASE.md (in the Scap git repository), and then the SRE team needs to build and deploy the package; those steps are listed below.
The Debian package for scap can be built with git-buildpackage. More specifically, for production the standard procedure is to have a Debian source package built on the package_builder machine from the release branch:
git clone https://gerrit.wikimedia.org/r/mediawiki/tools/scap
pushd scap && git checkout release
WIKIMEDIA=yes gbp buildpackage -sa -us -uc --git-pbuilder --git-no-pbuilder-autoconf --git-dist=stretch
If building fails, you can clean up your working directory, check out scap again, and perform the following magic trick:
echo "1.0" > debian/source/format WIKIMEDIA=yes;DIST=stretch-wikimedia pdebuild
Uploading to apt repos
The resulting package will be in /var/cache/pbuilder/result/stretch-amd64/ and needs to be uploaded to the apt repo (e.g. from apt1001.wikimedia.org)
export DIST="stretch" rsync -vaz deneb.codfw.wmnet::pbuilder-result/$DIST-amd64/*scap* deb/ sudo -i reprepro --ignore=wrongdistribution include $DIST-wikimedia $(pwd)/deb/scap_<VERSION>_amd64.changes # scap is compatible as-is with jessie, stretch and buster, just copy the package there sudo -i reprepro copy jessie-wikimedia $DIST-wikimedia scap sudo -i reprepro copy buster-wikimedia $DIST-wikimedia scap
If you get errors about a missing GPG key (and therefore indices are not exported), you need to import the right key as described on Reprepro#Adding_a_new_external_repository.
If signing still fails after you have imported the key, ensure reprepro looks for keys in the right home directory; see Reprepro#If_signing_fails.
If this has already happened and you can't get it to export indices by repeating the "include" command, you can work around it by copying the package back from another distro (reprepro copy stretch-wikimedia buster-wikimedia scap).
Not having exported indices manifests as "reprepro ls shows the right version but it is not found on a client even after running apt-get update".
Roll out to production
You will then need to use Debdeploy to deploy the new package in production.
- Start by deploying to mw-api-canary, mw-canary, mw-jobrunner-canary (cumin aliases), then log in to mwdebug* servers and check there that a scap pull still works (a spot-check is sketched after this list).
- Later, or the next day, you can roll out to all.
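A hypothetical spot-check for the first step, assuming you have access to a cumin master and that the aliases above resolve as expected; the exact commands are assumptions, not a documented procedure:
sudo cumin 'A:mw-canary' 'dpkg -s scap | grep Version'   # confirm the new version landed on the canaries (assumed check)
# then, on an mwdebug host, verify deployments still work as noted above:
scap pull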
Resources
There are Debian versioning guidelines; see the Standards-Version field in the Debian Policy Manual's control fields chapter (controlfields.html#s-f-Standards-Version).