HTTPS/Domains
What's included here
This is a list of service hostnames in wikimedia.org which have HTTP[S] listeners and are not terminated by our standard Traffic infrastructure (which enforces HTTPS-related behavior consistently for everything behind it). These are all services which have their own one-off, direct public-facing server hosts, or are hosted by a 3rd party, usually for some valid reason or other. They have to be audited and taken care of individually, and are the most difficult case for our HTTPS standardization efforts. (A quick spot-check example follows the exclusion list below.)
This list specifically excludes:
- Domains other than wikimedia.org (e.g. wmflabs.org)
- Anything terminated by our standardized Traffic cache clusters (known-good)
- Hostnames with no HTTP or HTTPS listeners
- Hosts which are servers rather than services, for which we can verify the mapping and audit the corresponding service hostname(s)
  - (e.g. silver.wikimedia.org would otherwise be in this list, but it's not a service hostname; it was just the host for the wikitech.wikimedia.org service hostname, which is in this list)
- A few other special cases which aren't meant to be functionally consumed as valid HTTP(S) services (e.g. Tor relay, RIPE Atlas nodes, etc.)
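For a sense of what the per-host audit actually looks at, here is a minimal spot-check of the two properties the Traffic layer would otherwise enforce automatically: an HTTP→HTTPS redirect and an HSTS header. This is only an illustrative sketch, not part of the recorded audit process; the hostname is just an example taken from the table below, and some hosts (noted there) intentionally do not redirect.

```bash
# Illustrative spot-check only (hostname is an example from the table below).
host=apt.wikimedia.org

# Plain HTTP: expect a 30x response with a Location: https://... header,
# unless the host intentionally can't redirect (see the Info/Issues column).
curl -sI "http://${host}/" | egrep -i '^(HTTP|Location)'

# HTTPS: expect a Strict-Transport-Security header with a long max-age.
curl -sI "https://${host}/" | grep -i '^strict-transport-security'
```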
List of wikimedia.org one-off domains' HTTPS support status
Domain | HTTP | HTTPS | SSL Labs test | Last audit | SSL Labs grade | Info/Issues (red row if big problems) |
---|---|---|---|---|---|---|
apt.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | Can't redirect, might break old software |
archiva.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | |
blog.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | 3rd party, supports 3DES |
civicrm.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | Fundraising |
civicrm.frdev.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | Fundraising Dev |
dash.frdev.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | Fundraising Dev |
dumps.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | |
frdata.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | Fundraising |
fundraising.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | Fundraising |
gerrit.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | |
gerrit-slave.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | |
icinga.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | |
librenms.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | |
lists.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | |
mirrors.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | Can't redirect, might break old software |
netbox.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | |
payments.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | Fundraising |
payments-listener.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | Fundraising |
policy.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | 3rd party, supports 3DES |
reports.frdev.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | Fundraising Dev |
store.wikimedia.org | http | https | ssllabs | 2018-05-15 | A | 3rd party - Insufficient HSTS task T128559 |
tendril.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | |
wikitech-static.wikimedia.org | http | https | ssllabs | 2018-05-15 | A+ | Externally hosted, but managed by us |
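The notes in the Info/Issues column can be re-checked from any client host without waiting for a full SSL Labs run. The commands below are a rough sketch of how one might verify the "supports 3DES" and "Insufficient HSTS" notes; the hostnames come from the table above, and a reasonably recent openssl and curl are assumed.

```bash
# "supports 3DES": try to negotiate a 3DES cipher suite explicitly (forcing TLS 1.2 so the
# -cipher list applies). A completed handshake means 3DES is still accepted; a handshake
# failure is the desired outcome.
openssl s_client -connect blog.wikimedia.org:443 -tls1_2 -cipher '3DES' </dev/null

# "Insufficient HSTS" (task T128559): inspect the Strict-Transport-Security header and its
# max-age, which factors into the SSL Labs grade.
curl -sI https://store.wikimedia.org/ | grep -i '^strict-transport-security'
```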
Audit method
Hostname list last audited for new entries ~2018-05-15
The list of hostnames in the left column above was generated semi-manually, with a fairly awful process for filtering out exceptions that don't matter, etc. The rest of the columns are manually audited. The code paste below contains the commands used to generate the hostname lists from the wikimedia.org zonefile and shows how they're processed. It's not ideal, but at least it's recorded for future improvement or automation...
```bash
### Documenting an imperfect and somewhat-manual audit of non-GeoIP (cache cluster)
### HTTP[S]-responding hostnames from the wikimedia.org zonefile:

# Generate a broad hostlist for wikimedia.org from the zonefile in our DNS repo:
cat templates/wikimedia.org | egrep -w 'A|AAAA|CNAME' | grep -v '^ ' | grep -v ' 10\.' | awk '{print $1}' | sort | uniq > hlist

# Run them all through curl for exit codes about connecting to http:// + https://
(for hn in `cat hlist`; do
    hnf="${hn}.wikimedia.org"
    curl -m 2.0 -Iv https://${hnf} >/dev/null 2>&1; https=$?
    curl -m 2.0 -Iv http://${hnf} >/dev/null 2>&1; http=$?
    echo === $hn https:$https http:$http
done) > curlres

# Filter out hostnames with status 6|7|28 for both ports (basic connect fail/timeout,
# or DNS CNAME into .eqiad.wmnet...) - the remainder need some kind of audit check:
egrep -v ' https:(6|7|28) http:(6|7|28)$' curlres > curlres-toaudit

# Those with exit code zero for https have working public certs; these need a full normal audit (e.g. ssllabs):
grep https:0 curlres-toaudit | awk '{print $2".wikimedia.org"}' > audit-auto

# The opposite set will *mostly* be HTTPS cert mismatches for a server hostname which hosts
# a service listed in audit-auto but listens on the any-address. We're ok with these for now,
# so we'll try to filter out all such easy cases in order to leave behind a minimal list that
# needs manual investigation...
for h in $(grep -v https:0 curlres-toaudit | cut -d" " -f2); do
    cn=$(curl -vI https://${h}.wikimedia.org/ 2>&1 | grep CN= | sed -n -e 's/^.*CN=//p')
    grep -q "^$cn\$" audit-auto || echo ${h}.wikimedia.org
done > audit-manual

# Now you have two files to investigate, but may want to manually filter them a little
# for cases that are obvious:

# audit-manual ---
# ^ should have names that need manual investigation. Depending on the case, you may want
#   to list them in audit data.
# Known hostnames appearing in this file that can be safely ignored at last check:
#   frdev-eqiad.wikimedia.org - can be ignored (wildcard issue puts it in this list, but shouldn't be)
#   google<hexdigits>.wikimedia.org - google site verification crap, not ours, ok to ignore
#   ripe-atlas-codfw.wikimedia.org - RIPE Atlas, ok to ignore
#   ripe-atlas-eqiad.wikimedia.org - RIPE Atlas, ok to ignore
#   ripe-atlas-ulsfo.wikimedia.org - RIPE Atlas, ok to ignore
#   tor-eqiad-1.wikimedia.org - Tor relay stuff, not legit, ok to ignore
#   radium.wikimedia.org - Tor relay stuff, not legit, ok to ignore
#   install2002.wikimedia.org - backup apt.wikimedia.org server, invalid LE cert until manual switchover, ok to ignore
#   ms1001.wikimedia.org - apparently similar to above: backup dumps.wikimedia.org server (to dataset1001), invalid LE cert until manual switchover, ok to ignore

# audit-auto ---
# ^ should have names that need to be listed and audited via ssllabs or some such, and go
#   into the audit list on wikitech. cp1008.wikimedia.org may need manual removal from this
#   list (corner-case server issue, covered by "pinkunicorn").
```
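As one possible direction for the automation mentioned above, the grade column could be pulled from the SSL Labs assessment API instead of being read off the web UI by hand. The loop below is only a sketch, not part of the current process: it assumes jq is installed, that querying the public api.ssllabs.com service for these hostnames is acceptable, and it reads hostnames from the audit-auto file generated above.

```bash
#!/bin/bash
# Sketch only: fetch an SSL Labs grade for each hostname in audit-auto.
# fromCache=on prefers cached assessments so we don't trigger a fresh scan for every host.
api='https://api.ssllabs.com/api/v3/analyze'
while read -r host; do
    # Poll until the assessment is READY (or ERROR); cached results usually return immediately.
    while :; do
        res=$(curl -s "${api}?host=${host}&fromCache=on")
        status=$(echo "$res" | jq -r '.status')
        case "$status" in READY|ERROR) break ;; esac
        sleep 30
    done
    grade=$(echo "$res" | jq -r '.endpoints[0].grade // "n/a"')
    echo "${host} ${grade}"
done < audit-auto
```

The API enforces its own rate limits, so even for a short list like the one above this is best run slowly and with cached results where possible.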