So long, GitHub!

So Microsoft bought GitHub for a moderate mountain of money, and now everyone is fleeing before the deal has even been approved by regulators. Some folks are calling it an overreaction, but the reality is that Microsoft has a terrible track record (Nokia, Skype, CodePlex) and has at times been outright antagonistic to open source as a whole. Given that acquisitions lately are often about getting access to data, I really don't feel like providing useful metrics to Microsoft about the projects I work on, no matter how small and insignificant they may be, so all new work will appear on my GitLab account. I went through by hand and tried to find all the places I linked my code here, but if I happened to miss something, either leave a comment or hit me up on Twitter and I will update the links.

How fast does the NATO phonetic alphabet go through letters?

So I saw a thread on reddit about people using phrases like the usual "quick brown fox" one to test out fountain pens, and it got me thinking: I normally use the NATO phonetic alphabet to test my pens, but how fast does that go through all the letters of the alphabet? After some banging around I came up with code that figures it out, without all the hassle of actually trying to time it.


alphabet = 'abcdefghijklmnopqrstuvwxyz'

chars = list(alphabet)

words = ['alpha','bravo','charlie','delta','echo','foxtrot','golf','hotel','india','juliett','kilo','lima','mike','november','oscar','papa','quebec','romeo','sierra','tango','uniform','victor','whiskey','xray','yankee','zulu']

for w in words:
    # loop through each word specified
    for cw in w:
        # now we work through each character in the word
        if cw in chars:
            # cross the letter off the first time we see it
            chars.remove(cw)
    # show which letters are still unused after this word
    print(chars)

Turns out that it really doesn't use up everything until the very end, but by the time you get to the word papa all but four letters have already been used.

['q', 's', 'w', 'y', 'z']
['q', 'w', 'y', 'z']
['q', 'w', 'y', 'z']
['q', 'w', 'y', 'z']
['q', 'w', 'y', 'z']
['q', 'w', 'y', 'z']
['q', 'w', 'y', 'z']
['q', 'w', 'y', 'z']
['q', 'w', 'y', 'z']

So as a way to work through all the letters of the alphabet it's really not the most efficient way to go, but perhaps there are better phrase combinations than the quick brown fox?
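The same check generalizes to any candidate phrase. Here is a minimal sketch (the helper name is my own invention) that tracks the unused letters after each word, so different phrases can be compared:

```python
def unused_letters(words):
    """Return the list of still-unused letters after each word is processed."""
    remaining = set('abcdefghijklmnopqrstuvwxyz')
    history = []
    for w in words:
        # drop every letter this word covers
        remaining -= set(w.lower())
        history.append(sorted(remaining))
    return history

# the classic pangram covers everything by its last word
history = unused_letters('the quick brown fox jumps over the lazy dog'.split())
print(history[-1])  # []
```

Running the same function over the NATO word list reproduces the output above: after papa only q, w, y, and z remain.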

Fakes abound!

It's already making the rounds in various online news outlets that Reddit banned deepfakes (AI-assisted fake pornographic videos), and naturally it's causing all manner of consternation as people on every side of the issue get all twisted up and yell at each other incoherently. What's slipping through the cracks, however, is that there is also technology out there to fake voices, and while it's not great, it's not as absolutely terrible as one might expect. Since I have no desire to see myself superimposed on the body of another, I figured I might as well see how good a computer was at faking my voice, since so many things take only very brief conversations to authorize these days.

In order to prime the software you have to record yourself reading a bunch of sentences, enough material for at least 30 seconds according to the prompts. Once you have that corpus of material ready you tell the service to go build your voice (I was imagining Bene Gesserit Voice training while it processed), and when it's done you can type in anything you want and the synthesized version of your voice spits out the phrase, for better or for worse.


Generated with


Naturally there are some modulated sounds in the generated version; however, having reviewed recorded phone calls of myself, it sure could pass for me if the phone mic was bad. What is scary is that it correctly hit the emphasis I naturally put on some words, enough that I suspect it might have done better had I not been sick when recording this and had a proper soundproof room to do it in. For about 30 minutes of screwing around on the site re-recording my various gaffes, I think it did an admirable job of spoofing my voice, and I suspect that given enough time to refine the software it could get pretty good. Fortunately I'm broke compared to the Hollywood folks who are turning coal into diamonds right now worrying about faking technology producing sex tapes they never actually starred in.

NHL API Documentation

In the process of working on the project mentioned in my previous post, I decided to try to round up all the bits and pieces of information I had found into a single place. Some of the endpoints I was using even turned out to be undocumented anywhere else as far as I can tell (such as the one for standings), so I created a repository on Gitlab to collect it all, along with examples of returned data in cases where it was feasible to include them. The information is by no means complete, as it's largely a trial-and-error process to find new endpoints; however, here is what I have been able to document so far in the NHL API.


If you happen to find something that wasn't included or is otherwise incorrect, feel free to open a PR and I will get it updated.

hockeystats for sopel

During a bout of boredom recently I set out to create a module for the Sopel bot framework to gather information about hockey games, since a bunch of the guys on my IRC server are hockey fans to one degree or another. Apparently there are lots of betting and analytics services out there that provide APIs, but they all seem to charge or have some gimped-up scheme for accessing the data; I easily spent a few hours searching around trying to find something that didn't suck. Finally I stumbled upon the work of Kevin Sidwar, who partially documented some of the actual NHL API that provides game data. Fast forward through a few days of clicking and clacking about in vim, and I managed to churn out a semi-functional module that will drop right into Sopel without any configuration at all.

Hockeystats is about as simple as it can get, having a grand total of four callable functions to grab previous game details, next game details, team stats, and division standings. I made sure to make it a little picky about user input so the risk of abuse is fairly low, and I also tossed in docstring documentation so that the !help command works for each function in the module. A word of warning, however: this is one of my fastest projects so far, so it hasn't gone beyond a smoke test, and there aren't any rate limits on commands, so user beware and all that.
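The "picky about user input" part mostly boils down to validating team abbreviations before anything ever touches the API. A minimal sketch of the idea (the helper name and the abbreviated team list are illustrative, not the module's actual code):

```python
import re

# a few example NHL team abbreviations -- illustrative, not the full list
VALID_TEAMS = {'BOS', 'CHI', 'DET', 'NYR', 'PIT', 'TOR'}

def parse_team(arg):
    """Return a normalized team abbreviation, or None if the input looks bogus."""
    if arg is None:
        return None
    arg = arg.strip().upper()
    # reject anything that isn't 2-3 letters before checking the whitelist
    if not re.fullmatch(r'[A-Z]{2,3}', arg):
        return None
    return arg if arg in VALID_TEAMS else None

print(parse_team(' bos '))       # BOS
print(parse_team('DROP TABLE'))  # None
```

Anything that fails validation gets a polite error in the channel instead of becoming part of an API request.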

A cautionary tale of Git and Virtualbox

I have been trying to keep the code chops sharp since work doesn't require it very often anymore, usually by polishing up projects I've started in the past but let fall by the wayside. Last night was a bugfix session on imgur2pdf, which I have been neglecting for a while now, specifically working on the resizing logic, which was hosed up and created some ugly PDFs. All told I think I spent about two and a half hours on the code, testing it over and over with galleries to make sure sizes were right and that it made appropriate changes to large-dimension images. Once I got things where I wanted them, I pushed my commit up to the repo and took a break for a while, grabbing some tea and having a look at why the VirtualBox VM I was working on wasn't letting me copy/paste between it and my parent OS.

Some quick poking around and I realized I hadn't installed the Guest Additions software, so I loaded that up and rebooted the VM, only to be graced with a solid black screen that was unresponsive. I rebuilt the VM since it was mostly empty and tried a second time, and the exact same thing happened when rebooting after the install. I asked around on IRC and a buddy pointed out that this is a known issue with a potential fix out for it, so I guess I know what I am doing with my evening tonight.

Thinking about this today made me realize that the old way of doing code would have probably cost me several hours of work and resulted in a great deal of profanity had I wiped out everything. So let this be a lesson to anybody just getting started with Git: commit and push often, because it beats the alternative of losing hours of work.

Log parsing for cell phone records


Update: I totally screwed up the code previously; it didn't have any logic to look at only the target phone number, so it was just running calculations on the entire log. I also tossed in some average calculations for number of calls and time per day for the hell of it.

I was curious how much time I spend on the phone with certain people, so I decided to write some quick Python to figure it out.


import csv
import sys
import datetime

# day of the month, used below to compute per-day averages
today ="%d")
total_monthly = 0
count = 0
call_log = sys.argv[1]
target_number = '123-456-7890'
with open(call_log) as csvDataFile:
    csvReader = csv.reader(csvDataFile)
    for row in csvReader:
        # this is the old format
        #(timestamp, caller, locale, code, duration, cost) = row

        # new format
        (datestamp, timestamp, caller, direction, duration) = row
        if caller == target_number:
            count += 1
            # strip spaces and the trailing unit suffix, leaving just the digits
            new_duration = duration.replace(' ', '')[:-2]
            total_monthly = total_monthly + int(new_duration)

# lets calculate total hours/minutes values
total_hours = total_monthly // 60
remaining_minutes = total_monthly - (total_hours * 60)

print(str(total_hours)+" hours and "+str(remaining_minutes)+" minutes")
print(str(count)+" total calls")
avg_calls_day = count / int(today)
avg_time_day = total_monthly / int(today)

print("avg per day: "+str(round(avg_calls_day,2))+" calls")
print("avg time/day: "+str(round(avg_time_day,2))+" minutes")


This is pretty straightforward: change the value of target_number to whatever number in the logs you want to look for, then let it rip like so

./ file.csv

low tech Salt deployment

So I have been tearing down and rebuilding a lot of crap in the lab lately (Kubernetes clusters, ELK stack, etc.) and have constantly had to re-add Salt to the VMs because salt-cloud doesn't yet play nice with Xen. After about the third time of manually installing epel-release and salt-minion and then changing the config, I got tired of it and wrote perhaps the worst script ever to do all that work for me remotely, and possibly to be reused later when I finally get salt-cloud working with Xen.


#!/bin/bash
# usage: <hostname>
HOST=$1
MASTER=""   # address of your salt master; yours will differ

echo "deploying salt -> $HOST"
ssh root@$HOST "yum -y install epel-release && yum -y install salt-minion"
ssh root@$HOST "sed -i 's/#master: salt/master: $MASTER/' /etc/salt/minion"
ssh root@$HOST "systemctl start salt-minion && systemctl enable salt-minion"
echo "salt successfully deployed on host: $HOST"

Granted, this still relies on me manually running ssh-copy-id so I don't have to keep typing in passwords, but it's a lot fewer commands. Maybe if I get the time I will add some logic to auto-accept the key in Salt so that I don't have to do that manually either.

Strange behavior from Postman

I was changing my Saltstack configuration to work with LibreNMS and adding devices via the API as opposed to using auto-discovery, and realized that basically the same query works fine in curl, but when I tried it with Postman it doesn't work and acts like I never passed some of the values. Observe!

as opposed to when done in curl

dword@DESKTOP:~$ curl -X POST -d '{"hostname":"","version":"v2c","community":"public"}' -H 'X-Auth-Token: 286755fad04869ca523320acce0dc6a4'
{
    "status": "ok",
    "message": "Device (18) has been added successfully"
}

The only thing I can figure, since this is such an absurdly simple API query, is that Postman does some kind of magic that's not plainly visible and changes how the data is received by the API. This is moderately troubling because it gets me wondering what else they are doing with data and whether there is some kind of underhanded snooping going on, not that I'm working on anything terribly sensitive other than helping myself become lazier in the lab. If I were tossing in a pile of headers I could see where the room for mistakes exists, but with only three key/value pairs passed as data and the X-Auth-Token passed in headers, I can't see any place I could have messed up. Sure enough, though, we get the error about not specifying the SNMP version for the add-device call, so something is definitely hosed up somewhere.
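My best guess, and it is only a guess, is a difference in how the body gets encoded: form-encoding and a raw JSON body produce very different payloads for the same three key/value pairs, which would explain the API acting like values were never passed. A quick way to see the difference (the hostname here is a placeholder):

```python
import json
from urllib.parse import urlencode

payload = {"hostname": "", "version": "v2c", "community": "public"}

# what curl -d with a JSON string sends: a raw JSON body
as_json = json.dumps(payload)

# what a client sends if the same pairs get form-encoded instead
as_form = urlencode(payload)

print(as_json)
print(as_form)
```

If Postman was set to form-data rather than a raw JSON body, the API would receive the second shape while expecting the first, and "missing" values would be exactly the symptom.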

SNMP, Remotely!

So I have been building up a bit of a Windows environment in the lab (a DC, two clients, and a SQL server so far), and I wanted to push out SNMP to the environment because that's how I monitor things in the lab. Unfortunately I have seen absolutely no reliable way to do so with Group Policy alone, so in comes the glory that PowerShell has become lately. First we need to figure out which servers we are going to target, so let's whip up servers.txt with the shortnames and make a note of its path. Once we have that list, we whip up a quick little loop that works on each line of the servers.txt file to make things happen.

$Servers = Get-Content C:\Users\dword\servers.txt
foreach($Server in $Servers){
    Invoke-Command -ComputerName $Server -FilePath C:\Users\dword\InstallSNMP.ps1
    Invoke-GPUpdate -ComputerName $Server -RandomDelayInMinutes 0 -Force
}

Of course, if you only need to do a single server for a one-off reason you could just run Invoke-Command manually, but where is the fun in that when we can push to every system in a hurry AND kick off the gpupdate that pulls down the settings to enable SNMP across the lab?
