New Digs

So I am moving, and took the opportunity to change my desk setup to something a little more focused than my previously insane configuration. Behold: an actual matched pair of 24″ screens on a mount, where I will hopefully be churning out all sorts of new projects (like YeRP – Yet another Ripping Portal, which uses Flask and youtube-dl to present a mobile-friendly interface).

Patching CentOS 7 (and overcoming problems)

So I was working on patching some of my Icinga infrastructure at work, and it seems that sometimes libyajl breaks things, as illustrated below:

[root@icingasatellite ~]# yum update
Loaded plugins: fastestmirror, rhnplugin
This system is receiving updates from RHN Classic or Red Hat Satellite.
Loading mirror speeds from cached hostfile
 * epel: mirror.optus.net
Resolving Dependencies
--> Running transaction check
---> Package icinga2.x86_64 0:2.10.4-1.el7.icinga will be updated
---> Package icinga2.x86_64 0:2.10.5-1.el7.icinga will be an update
---> Package icinga2-bin.x86_64 0:2.10.4-1.el7.icinga will be updated
---> Package icinga2-bin.x86_64 0:2.10.5-1.el7.icinga will be an update
--> Processing Dependency: libyajl.so.2()(64bit) for package: icinga2-bin-2.10.5-1.el7.icinga.x86_64
Traceback (most recent call last):
  File "/bin/yum", line 29, in <module>
    yummain.user_main(sys.argv[1:], exit_code=True)
  File "/usr/share/yum-cli/yummain.py", line 375, in user_main
    errcode = main(args)
  File "/usr/share/yum-cli/yummain.py", line 239, in main
    (result, resultmsgs) = base.buildTransaction()
  File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 1198, in buildTransaction
    (rescode, restring) = self.resolveDeps()
  File "/usr/lib/python2.7/site-packages/yum/depsolve.py", line 893, in resolveDeps
    CheckDeps, checkinstalls, checkremoves, missing = self._resolveRequires(errors)
  File "/usr/lib/python2.7/site-packages/yum/depsolve.py", line 1025, in _resolveRequires
    (checkdep, missing, errormsgs) = self._processReq(po, dep)
  File "/usr/lib/python2.7/site-packages/yum/depsolve.py", line 350, in _processReq
    CheckDeps, missingdep = self._requiringFromTransaction(po, requirement, errormsgs)
  File "/usr/lib/python2.7/site-packages/yum/depsolve.py", line 680, in _requiringFromTransaction
    rel=pkg.rel)
  File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 5280, in update
    availpkgs = self._compare_providers(availpkgs, requiringPo)
  File "/usr/lib/python2.7/site-packages/yum/depsolve.py", line 1648, in _compare_providers
    bestnum = max(pkgresults.values())
ValueError: max() arg is an empty sequence

Turns out the secret is simply to install yajl and yajl-devel, and then I can patch successfully. I'm really surprised nobody else out there has run into this yet, but it's the second time in a month I have had it happen while patching.
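In case anyone else hits it, the workaround boils down to two commands (package names as they appear in the standard CentOS 7 / EPEL repos):

```shell
# pull in the yajl library and headers that the depsolver trips over
yum install -y yajl yajl-devel
# then the update goes through cleanly
yum update
```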

Project: hockey-info

So I don’t always have great cell phone service; sometimes it’s weak 4G, or not 4G at all, so modern design-heavy apps suffer when bandwidth is a trickle at best. I would be away from home trying to find out what’s going on with a Caps game, and the NHL app would be painfully slow or sometimes not work at all. Eventually I decided the only reasonable thing a hockey nerd such as myself could do was write something to fill this void: ideally something simple and effective to get me the information I wanted, without a lot of overhead and frilly extra stuff I didn’t really care about.

The repos are still in high flux right now (I don’t even have a readme file yet for the main one), but this page can serve to explain the bits and pieces.

nhlapi – This is what started it all for me, really. I wanted more information about games (for an IRC bot) and threw myself into pulling together various bits of information about the NHL API in an easy-to-read, easy-to-access way, so others didn’t have to spend the hours I did figuring out how to do things.

hockey-info – A super simple website written in Python using the Flask framework. The focus is on being fast, simple and mobile friendly. It queries the NHL API directly for all its information, and is formatted in a way that works well on mobile.

hockey-info-docker – A bare-bones Dockerfile to deploy the latest release of hockey-info. The container is based on Alpine and is as trimmed down as possible; it makes deployment super simple and lets anybody run their own instance with only a few commands (provided you already have a Docker host to run it on).
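To give a flavor of the kind of formatting hockey-info does, here is a minimal sketch in Python; the field names follow my recollection of the NHL statsapi schedule feed, so treat them as assumptions rather than a guaranteed schema:

```python
def scoreline(game):
    """Render one game from the NHL schedule feed as a compact,
    mobile-friendly one-liner (field names are assumptions)."""
    away = game["teams"]["away"]
    home = game["teams"]["home"]
    return "%s %d @ %s %d (%s)" % (
        away["team"]["name"], away["score"],
        home["team"]["name"], home["score"],
        game["status"]["detailedState"],
    )

if __name__ == "__main__":
    # a hand-built sample in the assumed feed shape
    sample = {
        "teams": {
            "away": {"team": {"name": "Washington Capitals"}, "score": 4},
            "home": {"team": {"name": "Boston Bruins"}, "score": 2},
        },
        "status": {"detailedState": "Final"},
    }
    print(scoreline(sample))
```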

Naturally I make no warranty about this app or any of the code I have written; it’s purely something neat I built in my spare time and am tossing out there for others to enjoy, modify and extend to their hearts’ content. If you do happen to have input, ideas, or feedback, hit me up on Twitter, or just open an issue on Gitlab if it’s a purely technical issue to address with the code.

Hockey Records

The NHL was kind enough to release records.nhl.com to the public to browse more interesting stats than just game-by-game data: things like players that have hit the 1000-point milestone and other trivia-friendly factoids. Naturally the spidey senses started tingling as soon as I saw the news on Reddit, so I ran off to start poking at it, and lo and behold it actually hits what appears to be the same data source as nhl.com/stats/rest, but with all sorts of extra endpoints to try out. This time around I attempted to be slightly clever and looked at https://records.nhl.com/static/js/client.bundle.js to save myself the trial-and-error process I used on a lot of the Stats API. Turns out this was actually a smart move, and probably would have let me document a lot of this stuff sooner had I thought to spend some time poking around the code of the stats website. No matter; what counts is that now there is a rough outline of the Records API, and it has been rolled into the NHLAPI repo on Gitlab. Just like before, if you see something I missed feel free to open a PR, and if I don’t happen to see it right away @ me on Twitter; I try to respond fairly quickly.
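As a quick sketch of how these endpoints get queried: the Records site serves JSON from under /site/api and accepts the same cayenneExp filter syntax the stats REST API uses. The helper below just builds request URLs; the "franchise" endpoint and the filter shown are illustrative, so check the NHLAPI repo for the real endpoint list:

```python
import urllib.parse

RECORDS_BASE = "https://records.nhl.com/site/api"

def records_url(endpoint, cayenne=None):
    """Build a Records API URL, optionally with a cayenneExp filter
    (the same filter syntax the stats REST API uses)."""
    url = "%s/%s" % (RECORDS_BASE, endpoint)
    if cayenne:
        url += "?cayenneExp=" + urllib.parse.quote(cayenne)
    return url

if __name__ == "__main__":
    print(records_url("franchise"))
    print(records_url("franchise", "id=24"))
```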

Footnote: https://beautifier.io/ is fantastic; it let me unmangle the client.bundle.js file so it was readable.

Simple CI with Chef

So I needed to work out a way to deploy a script I wrote recently across a whole host of systems, and it turns out the only option is Chef, so I had to dive into it and read a bunch of stuff. I also had to try a bunch of things, and ended up with my own Chef server in the lab to test against. Several hours of clicking and clacking later I have my task worked out, so here it is.

First we need to create a new cookbook and drop a pretty simple default recipe in; all it does is make sure git is installed, then clone a repo to /opt/nhlapi.
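A minimal default recipe along those lines looks like this (the cookbook name and repository URL here are placeholders, not necessarily what I used):

```ruby
# cookbooks/nhlapi/recipes/default.rb
# Make sure git is present, then keep /opt/nhlapi synced to the repo.
package 'git'

git '/opt/nhlapi' do
  repository 'https://gitlab.com/example/nhlapi.git' # placeholder URL
  action :sync
end
```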

Once we have the recipe we need a role to tell it what to do.
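A role file along these lines does the trick (repo-update.json; the cookbook name in the run_list is a placeholder):

```json
{
  "name": "repo-update",
  "description": "Keep the nhlapi checkout up to date",
  "json_class": "Chef::Role",
  "chef_type": "role",
  "run_list": [
    "recipe[nhlapi]"
  ]
}
```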

Create the role with # knife role from file repo-update.json  (or whatever you named the file to create the role from).

Now all that is left is to assign the role to the node, so use # knife node edit itsj-cheftest.itscum.local and assign the role to the node we want.

That is enough to get it working; you can kick back and watch it with # while :; do knife status 'role:repo-update' --run-list; sleep 120; done and expect to see it run within about 30 minutes, based on the interval and splay values. Speaking of which, Interval is pretty self-explanatory, but Splay not so much; Splay is used to keep a bunch of nodes from all running at once, basically so they don't overwhelm a system they might be checking into or otherwise digitally assaulting.
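For reference, both values live in the client's /etc/chef/client.rb; the numbers below are a guess that lines up with seeing a run inside about 30 minutes:

```ruby
# /etc/chef/client.rb
interval 1800   # seconds between chef-client runs (30 minutes)
splay 300       # random 0-300s added to each run so nodes don't pile on at once
```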

Simple Icinga2 Plugin

I’ve seen bits and pieces of the process of creating an Icinga2 (or Nagios) plugin, so here are my notes dumped straight from my brain.

First and foremost we need a script to call from Icinga, in this case I created a very simple Python script to simply get the version of LibreNMS running on my monitoring system.
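The script is along these lines. This is a sketch: the LibreNMS API endpoint, token handling, and JSON field names are my assumptions about how the version gets fetched, so adjust to taste:

```python
#!/usr/bin/env python3
# check_lnms_ver.py - fetch the LibreNMS version and report it Nagios-style.
import argparse
import json
import sys
import urllib.request

# standard Nagios/Icinga plugin exit codes
OK, WARNING, CRITICAL, UNKNOWN = 0, 1, 2, 3

def fetch_version(host, token):
    """Ask the LibreNMS API for the installed version (assumed endpoint)."""
    req = urllib.request.Request(
        "http://%s/api/v0/system" % host,
        headers={"X-Auth-Token": token},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        data = json.load(resp)
    return data["system"][0]["local_ver"]

def plugin_result(version):
    """Turn a version string (or None) into a (message, exit code) pair."""
    if version:
        return ("OK - LibreNMS version %s" % version, OK)
    return ("UNKNOWN - could not determine LibreNMS version", UNKNOWN)

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("-H", "--host", required=True)
    parser.add_argument("-t", "--token", default="")
    args = parser.parse_args()
    try:
        version = fetch_version(args.host, args.token)
    except Exception:
        version = None
    message, code = plugin_result(version)
    print(message)
    sys.exit(code)

# extra arg check so pasting/importing without arguments doesn't bail out
if __name__ == "__main__" and len(sys.argv) > 1:
    main()
```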

This is a pretty simple script; you could call it with ./check_lnms_ver.py -H 192.168.1.100 to see how it works. With the script working, the next portion is done on the command line. First, create the directory that will later be referenced as CustomPluginDir

# mkdir -p /opt/monitoring/plugins

Now we need to tell Icinga2 about the directory; this is done in two different places.

In /etc/icinga2/constants.conf add the following

const CustomPluginDir = "/opt/monitoring/plugins"

and in /etc/icinga2/conf.d/commands.conf we add the following block
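The block looked something like this; the command and variable names are my reconstruction of it, keyed off the script name:

```
object CheckCommand "check_lnms_ver" {
  command = [ CustomPluginDir + "/check_lnms_ver.py" ]
  arguments = {
    "-H" = "$lnms_host$"
  }
}
```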

The block above defines the custom command, specifies the script we created first, and passes the correct flags. Now it's time to add the check to the hosts.conf file, so place the following block into /etc/icinga2/conf.d/hosts.conf
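Something like the following does it; the service name, host name, and address are placeholders for your own setup:

```
object Service "librenms-version" {
  host_name = "monitoring.example.local"   // placeholder host
  check_command = "check_lnms_ver"
  vars.lnms_host = "192.168.1.100"
}
```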

And with that we wait for the next polling cycle and should see something like the screenshot below

This is a highly simplistic example, but figuring it out was necessary for me: I had to port some existing code from Ruby to Python, so I wanted to know exactly how a plugin is created, what values are returned, and how it all fits together.

Homelab: Synology failure post-mortem

I take my homelab very seriously; it's modeled after several production environments I have worked on over the years. What follows is my recap of the events over a few weeks leading up to the total failure of my central storage system, my beloved Synology DS1515 hosting 5.5TB of redundant network storage. The first signs of problems cropped up on May 31st and culminated over the last week of June.


So long Github!

So Microsoft bought Github for a moderate mountain of money, and now everyone is fleeing it before the deal has even been approved by regulatory bodies. Some folks are calling it overreacting, but the reality is that Microsoft has a terrible track record (Nokia, Skype, Codeplex) and has at times been outright antagonistic to Open Source as a whole. Given that purchases lately are often about getting access to data, I really don't feel like providing useful metrics to Microsoft about the projects I work on, no matter how small and insignificant they may be, so all new work will appear on my Gitlab account. I went through by hand and tried to find all the places I linked my code here, but if I happened to miss something, either leave a comment or hit me up on Twitter and I will update the links.

How fast does the NATO phonetic alphabet go through letters?

So I saw a thread on Reddit about people using phrases like the usual "quick brown fox" one to test out fountain pens, and it got me thinking: I normally use the NATO phonetic alphabet to test my pens out, but how fast does that go through all the letters of the alphabet? After some banging around I came up with code that figures it out, without all the hassle of actually trying to time it.

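The idea can be sketched in a few lines of Python. I'm assuming the ICAO spellings here (alfa, juliett, xray); other spellings shift the details slightly but not the conclusion:

```python
import string

# ICAO spellings of the NATO phonetic alphabet, in order
NATO = [
    "alfa", "bravo", "charlie", "delta", "echo", "foxtrot", "golf",
    "hotel", "india", "juliett", "kilo", "lima", "mike", "november",
    "oscar", "papa", "quebec", "romeo", "sierra", "tango", "uniform",
    "victor", "whiskey", "xray", "yankee", "zulu",
]

def letters_after(words):
    """Walk the alphabet word by word, recording the cumulative set of
    distinct letters seen after each word."""
    seen = set()
    progress = []
    for word in words:
        seen |= set(word)
        progress.append((word, set(seen)))
    return progress

if __name__ == "__main__":
    alphabet = set(string.ascii_lowercase)
    for word, seen in letters_after(NATO):
        missing = alphabet - seen
        print("%-9s %2d letters used, missing: %s"
              % (word, len(seen), "".join(sorted(missing)) or "none"))
```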

Turns out that it really doesn't use up everything until the very end, but by the time you get to the word papa all but 4 letters have been used already.

So as a way to work through all the letters of the alphabet it's really not the most efficient way to go, but perhaps there are better phrase combinations than the quick brown fox?

Fakes abound!

It's already making the rounds in various online news outlets that Reddit banned deepfakes (AI-assisted fake pornographic videos), and naturally it's causing all manner of consternation as people on every side of the issue get all twisted up and yell at each other incoherently. What's slipping through the cracks, however, is that there is also technology out there to fake voices as well, and while it's not great, it's not as absolutely terrible as one might expect. Since I have no desire to see myself superimposed on the body of another, I figured I might as well see how good a computer was at faking my voice, since so many things take only very brief conversations to authorize these days.

In order to prime the software you have to record yourself reading a bunch of sentences, enough material for at least 30 seconds according to the prompts. Once you have that corpus of material ready, you tell the service to go build your voice (I was imagining Bene Gesserit Voice training while it processed), and when it's done you can type in anything you want and the synthesized version of your voice spits out the phrase, for better or for worse.

Non-generated

Generated with Lyrebird.ai


Naturally there are some modulated sounds in the generated one; however, having reviewed recorded phone calls of myself, it sure could pass for me if the phone mic was bad. What is scary is that it correctly hit the emphasis I naturally put on some words, enough that I suspect had I not been sick when recording this, and had a better soundproofed room to do it in, it might have done even better. For around 30 minutes of screwing around on the site re-recording my various gaffes, I think it did an admirable job of spoofing my voice, and I suspect that given enough time to refine the software it could get pretty good. Fortunately I'm broke compared to the Hollywood folks, who are turning coal into diamonds right now worrying about faking technology producing sex tapes that they never actually starred in.

Bitnami