Mar 08 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

Since I mostly deal with Windows computers, and partly because I couldn’t figure out why phantomjs on my Debian netinstall rendered fonts so UGLY, I figured I’d use one of the many Windows computers to render and email the reports. I’m using my gmail account for testing this.



That looks pretty awesome, right? The secret sauce here is PhantomJS. It’s pretty much a headless WebKit: a browser without a display. And it’s cross-platform!

I use PhantomJS to take a screenshot for PNG output and to print to PDF. I then use ImageMagick to crop the PNG to a reasonable size. I use sendEmail to, well, send an email with the files attached.
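The whole pipeline boils down to three commands. A sketch, not my exact script: the dashboard URL, addresses, and file names are placeholders, and rasterize.js is the example script that ships with PhantomJS (give it a .pdf output name to get a PDF instead):

```
# Render the Kibana dashboard to a PNG
phantomjs rasterize.js "http://kibana.example/dashboard" report.png

# Crop the PNG to a reasonable size with ImageMagick
convert report.png -crop 1280x1024+0+0 report-cropped.png

# Send it through gmail's SMTP server with the file attached
sendEmail -f me@gmail.com -t you@example.com -s smtp.gmail.com:587 \
  -xu me@gmail.com -xp "password" -o tls=yes \
  -u "Kibana report" -m "Report attached." -a report-cropped.png
```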

Enough jibber-jabber.

Feb 16 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

Part 4 of 4 – Part 1 | Part 2 | Part 3

Now that you’ve got all your logs flying through logstash into elasticsearch, how do you remove old records that do nothing but consume disk space and RAM for the index?

These are all functions of elasticsearch. Deleting is pretty easy, as is closing an index.
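Both are single REST calls against elasticsearch. A sketch, assuming a node on localhost:9200 and daily logstash-YYYY.MM.DD indexes:

```
# Delete an old daily index outright
curl -XDELETE 'http://localhost:9200/logstash-2014.01.01'

# Close an index: it stays on disk but stops consuming heap
curl -XPOST 'http://localhost:9200/logstash-2014.01.15/_close'
```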

The awesome people working on elasticsearch already have the solution! It’s called curator.


I like the idea of being able to let a cron job kick off the cleanup so I don’t forget.

To install, we’ll have to install pip.

sudo apt-get install python-pip

Then use pip to install elasticsearch-curator:

pip install elasticsearch-curator

When making a cron job, I always use full paths:

which curator

Edit the crontab. Any user should have access, so I’ll run this under my user.

crontab -e

Add the following line to run curator at 20 minutes past midnight (system time). It connects to the elasticsearch node, deletes all indexes older than 120 days, and closes all indexes older than 90 days.

20 0 * * * /usr/local/bin/curator --host -d 120 -c 90

If you prefer an alternative, here’s one written in perl.

Feb 16 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

That’s quite a title. I work with an OnSSI Ocularis CS setup, originally installed with NetDVMS but since upgraded to RC-C.

This post builds upon a couple of earlier posts.

What does all this mean? This heavily redacted screenshot should give some idea.
Number of overall motion events over time, same for failure events. Top list of cameras with motion events, top list of cameras with failure events.

You can see we’ve got a few failed cameras. Likely a power surge or network failure. Having this information will lower the time to repair, minimizing camera down time!

Feb 16 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

To make sure I understood how to find data using Kibana3, I started collecting input from IRC.


I have a ZNC bouncer set up on my network.

I have it set to Keep Buffer, Prepend Timestamps.
Timestamp Format:

[%Y-%m-%d %H:%M:%S]
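To make these lines searchable, a logstash grok filter along these lines would parse out the timestamp (a sketch; the field names are my own):

```
filter {
  grok {
    match => [ "message", "^\[%{TIMESTAMP_ISO8601:znc_time}\] %{GREEDYDATA:irc_message}" ]
  }
  date {
    match => [ "znc_time", "yyyy-MM-dd HH:mm:ss" ]
  }
}
```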


Feb 16 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

I have been on a logging kick (or obsession) lately. See the previous series of posts.

I’ll start with a picture. This is seriously cool. If you’re running pfsense, you want this.

My home network is pretty boring. Network is Router is Logstash is installed on

Feb 16 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

Part 3 of 4 – Part 1 | Part 2 | Part 4
This is a continuation of

Again, I took a lot of inspiration from

The nxlog reference manual is surprisingly well written with excellent examples.

Loggly has some examples I found useful, even if I’m not using their service.

There are other options.

Feb 16 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

Part 2 of 4 – Part 1 | Part 3 | Part 4
This is a continuation from

The great folks working on Kibana have been so awesome as to provide an example nginx configuration!

Kibana prompting for login to save changes to the dashboard

Before I start, I’ve got a tip of the hat to the resources that helped me figure this out

Feb 16 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

Part 1 of 4 – Part 2 | Part 3 | Part 4

Have you heard of Logstash / ElasticSearch / Kibana? I don’t wanna oversell it, but it’s AMAZING!

I’ll start with a screenshot. You know you want this. I have to blur a few things to keep some 53cr375 about my environment.

This is my configuration for collecting Windows event logs. I’m still working out the differences between the Windows XP, Server 2008R2, and Windows 7 computers I’m collecting logs from, but this has already proven very useful.

If you don’t know about it yet, you should really go watch this webinar. I’ll wait.

Nov 21 2013

It’s that time again! Another construction project at work!

It was decided that I would run another time-lapse webcam of the construction process. This was a little last-minute. Our electrician mounted an analog camera and plugged it into a network video encoder.

I haven’t had time to set up a proper linux box to manage all this, so I threw together a batch file and a scheduled task to run on my workstation every 5 minutes until I get the rest set up.

I love how easy it is to get a jpg off the Axis M7001 Video Encoder.

I’m familiar with (and a fan of) wget. The win32 version works pretty well too.
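Grabbing a snapshot from the encoder is a single HTTP GET. A sketch, assuming the standard VAPIX still-image path and a placeholder address:

```
# One-shot jpg from the Axis encoder (VAPIX still-image endpoint)
wget -O snapshot.jpg "http://192.0.2.10/axis-cgi/jpg/image.cgi"
```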

This uses my favorite method of generating zero-padded time & date stamp filenames. It concatenates 0 and the variable, then selects the rightmost 2 characters.


@echo off
REM Create the date and time elements.
for /f "tokens=1-7 delims=:/-, " %%i in ('echo exit^|cmd /q /k"prompt $d $t"') do (
   for /f "tokens=2-4 delims=/-,() skip=1" %%a in ('echo.^|date') do (
      set %%a=%%j
      set %%b=%%k
      set %%c=%%l
      set hh=%%m
      set min=%%n
   )
)

REM zero-pad and see the result
set mm=0%mm%
set mm=%mm:~-2%
set dd=0%dd%
set dd=%dd:~-2%
set hh=0%hh%
set hh=%hh:~-2%
set min=0%min%
set min=%min:~-2%

echo %yy%%mm%%dd%%hh%%min%

set outFile=C:\construction_cam\%yy%%mm%%dd%%hh%%min%.jpg

"C:\Program Files (x86)\GnuWin32\bin\wget.exe" -O %outFile%
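For the curious, the same concatenate-then-slice trick in bash (a sketch; `pad2` is my own name for it):

```shell
# Same idea as the batch file: prepend "0", then keep the rightmost 2 characters.
pad2() {
  local v="0$1"
  echo "${v: -2}"   # bash substring expansion: last 2 characters, like %mm:~-2%
}

pad2 5    # -> 05
pad2 12   # -> 12
```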
Oct 20 2013

Maybe someone will find my notes helpful.

Configuration Overview

Win 8
Classic Shell
Installed SABnzbd/Sickbeard/CouchPotato/uTorrent/Plex Media Server

SABnzbd/Sickbeard/CouchPotato in startup folder

Internal 3TB drive mounted as Z:

Video stored in

I just move movies around after CouchPotato renames/adds metadata

I have Plex Media Server set to scan/index media files every 15 minutes. Sickbeard is also set to notify Plex on new downloads.

Downloaded files are initially on the C: drive

C:\Downloads\Audio - Reserved for when I get around to setting up Headphones
C:\Downloads\Complete - Default folder for downloads without a category
C:\Downloads\Incomplete - Working folder for SABnzbd
C:\Downloads\Movie - Destination for movie category - where CouchPotato looks for downloaded files
C:\Downloads\Torrent - Working folder for uTorrent
C:\Downloads\TV - Destination for TV category

Installed taking default settings


port 8080
Make sure the API key is available; it’s required for Sick Beard and CouchPotato.

Temporary Download Folder: C:\Downloads\incomplete
Completed Download Folder: C:\Downloads\complete
Post-Processing Scripts Folder: C:\SickBeard\autoProcessTV

Launch Browser on Startup: Unchecked

Add Server – Follow config from usenet provider

audio Default Default Default C:\Downloads\Audio
movie Default Default Default C:\Downloads\Movie
tv Default Default sabToSickBeard.exe C:\Downloads\TV

Add DOGnzb bookmark feed

Extracted to C:\SickBeard
made shortcut to SickBeard.exe in startup folder

Port 8081
Enable API key

Search Settings
NZB Method: SABnzbd
SABnzbd URL: http://localhost:8080/
SABnzbd API Key: *** The Key Copied from SABnzbd ***
SABnzbd Category: tv

Search Providers
Configure Custom Newznab Providers
Name / URL / API Key
Provider Priorities
Sick Beard Index
Womble’s Index

Everything else unchecked

Post Processing
TV Download Dir: Z:\Video\TV
Rename Episodes: Checked
Name Pattern: Custom…
Season %0S/%SN – S%0SE%0E – %EN
Multi-Episode Style: Duplicate

Custom Air-By-Date: checked
Show Name – 2011-03-09 – Ep Name

Metadata Type: XBMC
All boxes checked
Use Banners: Unchecked

Plex Media Server
Enable: Checked
Update Library: Checked
Plex Media Server IP:Port: Machine IP Address

Enable: Checked
Notify on Download: Checked
Prowl API key: *** API Key from ***

Installed taking defaults

Check Show Advanced Settings

port 8082

Preferred Words: x264
Ignored Words: german, dutch, french, truefrench, danish, swedish, spanish, italian, korean, dubbed, swesub, korsub, dksubs, GERMAN, DUBBED, R5, R5BR, R5BD, DVDScr, CAM, TS, TV-rip, cd1, cd2

Newznab: checked
Add nmatrix and DOGnzb URL and API key
Nzbindex: Checked

I made a Custom quality profile named “Custom”
I prefer in order

1080p 600 2400
720p 600 2400
BR-Rip 600 2400
DVD-Rip 600 2400

Sabnzbd: Checked
Host: *** IP Address of machine ***
API Key: *** API Key copied from SABnzbd ***
Category: movie

Run Every: 10
To: Z:\Video\Movies
Folder Naming: ()
File Naming: .
Cleanup: checked
From: C:\Downloads\Movie\
Force Every: 1

Rename .NFO: Checked
NFO Naming: .orig.

XBMC: Checked
NFO: checked

Prowl: Checked
Api Key: *** API Key from ***

Plex: checked
Host: *** IP Address of machine ***

IMDB: Checked
URL: *** RSS URL of IMDB Watchlist ***

Movie Library Manager: Checked
Movie Folder: one for each of the following-

Installed latest PlexPass version (details in plexpass forum)
Took default settings

In the Plex Web app that popped up after the install

signed in with username

Add Section
TV Shows - Z:\Video\TV
Movies - Z:\Video\Movies
Anime - Z:\Video\MoviesAnime
Children's - Z:\Video\MoviesChildrens
Classic Movies - Z:\Video\MoviesClassic
Documentary - Z:\Video\MoviesDocumentary

Settings (wrench and screwdriver in upper right)
Plex Media Server
Update my library periodically: checked
Library update interval: every 15 minutes

Local Quality: 3Mbps, 720p
Remote Quality: 1.5Mbps, 480p
Direct Play: checked
Direct Stream: checked