Mar 29, 2015
 

Like many people, I’ve been trying to find an easier way to archive all of my movies and TV shows to hard disk instead of DVD disc. My previous attempt at semi-automated DVD ripping used an autoloader and some scripts, but it still required a lot of intervention because I couldn’t find a way to correctly name and tag my movies automatically. I wished there were something like CDDB or FreeDB for DVDs that would look up metadata from the disc in the drive.

Windows Media Player does a lookup of the DVD metadata so I figured it had to be possible.
[Image: WindowsMediaPlayer]

Thankfully, I stumbled across an awesome utility that will get the DVD’s discid in the format Microsoft uses for their media player and media centers: http://dvdid.cjkey.org.uk/
[Image: dvdid-command]

Time to fire up Wireshark (or Ethereal if you’ve been doing this forever) and see where that request is going. I stuck the disc in my DVD drive and opened Windows Media Player. A couple of HTTP 302 redirects later, I had an HTTP 200 and the actual request and reply.
[Image: redirects-and-success]

Looking at the discid and the URL, it looks like it strips out the vertical bar (pipe) character, |, that was used in previous versions.
[Image: dvdid-string]

Opening the request gives me the URL. It looks like it’s using “User-Agent: Windows-Media-Player/12.0.7601.18741”. To avoid being ridiculously obvious while abusing this API, it might be a good idea to spoof your user agent as well.
[Image: request-url]

The response is an XML file. It looks really straightforward to parse.
[Image: response-xml]

The request URL is super straightforward too. Just replace everything after CRC= with the output from dvdid, removing the | character.
http://metaservices.windowsmedia.com/pas_dvd_B/template/GetMDRDVDByCRC.xml?CRC=92918cc3b7506a7e
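Putting the pieces together, here’s a quick Python sketch of the lookup. The endpoint and User-Agent string are the ones from the Wireshark capture above; the discid is whatever dvdid prints.

```python
# Look up DVD metadata the way Windows Media Player does.
# Endpoint and User-Agent are from the capture above.
import urllib.request

BASE_URL = ("http://metaservices.windowsmedia.com"
            "/pas_dvd_B/template/GetMDRDVDByCRC.xml")
USER_AGENT = "Windows-Media-Player/12.0.7601.18741"

def build_lookup_url(discid):
    """dvdid prints the id as two hex words separated by a vertical bar
    (e.g. 92918cc3|b7506a7e); the service wants them concatenated."""
    return BASE_URL + "?CRC=" + discid.replace("|", "")

def fetch_metadata(discid):
    """Fetch the metadata XML, spoofing WMP's user agent."""
    req = urllib.request.Request(build_lookup_url(discid),
                                 headers={"User-Agent": USER_AGENT})
    with urllib.request.urlopen(req) as resp:  # follows the 302 redirects
        return resp.read().decode("utf-8")     # XML, ready to parse
```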

There doesn’t appear to be much of anything preventing this from being abused. It opened in a regular browser without issue. I used IE in this screenshot.
[Image: response-in-IE]

Apr 13, 2014
 

Sometimes I feel a need to automate some meme generation. Luckily, I found this incredibly helpful blog post from jackmeme: http://www.jackmeme.com/blog.php?id=1

I used it as inspiration and did something similar in Perl. Of course, the ImageMagick PerlMagick reference was invaluable: http://www.imagemagick.org/script/perl-magick.php
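The caption part of meme generation mostly boils down to uppercasing the text and wrapping it to fit the image before drawing it. A minimal sketch (in Python rather than my Perl, and the column width is just an example standing in for however many characters fit at your font size):

```python
# Wrap a meme caption into uppercase lines that fit the image width.
import textwrap

def wrap_caption(text, width=16):
    # width=16 is an arbitrary example; pick it to match your font size
    return textwrap.wrap(text.upper(), width)

lines = wrap_caption("partly cloudy with a chance of rain")
# Each line would then be drawn onto the image, e.g. with
# PerlMagick's Annotate or ImageMagick's -annotate on the CLI.
```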

The first thing I thought of with automated meme generation? The weather! See my previous post about getting data from the Weather Underground API using Perl.

This naturally led to the forecast overlaid on the current radar.
[Image: 68127-radar]

Inspired by http://thefuckingweather.com/, I decided I want my daily forecast in meme form. The first weather-related meme I think of is Ollie Williams.

[Image: OllieForecast]


But that doesn’t really fit with the meme. Change fucking to motherfucking and Samuel L. Jackson is PERFECT!

[Image: 68127-memetemp]

Smoosh those memes together and I’ve got my morning weather!

[Image: 68127]

I found another source of amusement while playing with generating image macros. There is a Pyborg bot in one of the IRC channels I frequent. “She” says some pretty funny stuff, so we keep a quote list. Of course, this is ripe for meme-ifying.

[Image: meme25]

More Evequotes are available in the Imgur album.

Keep reading to see the secret sauce.

Apr 13, 2014
 

Weather data is pretty cool to play with. Especially when playing with home automation or just turning the current weather conditions into a meme.

For example:
Weather for Omaha, Nebraska on 4/13/2014 at 16:00

raging@virtualbox3:~/weather$ perl grabweather.pl
Forecast - Today:
weekday:        Sunday
high:   61
low:    27
precip: 100
cond:   Thunderstorm
month:  4
day:    13
hour:   22

Forecast - Tomorrow:
weekday:        Monday
high:   43
low:    25
precip: 10
cond:   Partly Cloudy
month:  4
day:    14
hour:   22

Forecast - Day After Tomorrow:
weekday:        Tuesday
high:   66
low:    37
precip: 0
cond:   Partly Cloudy
month:  4
day:    15
hour:   22

Current Conditions:
conditions:     Overcast
temp:   34
relative_humidity:      86
wind:   10.9
wind_gust:      0
pressure_trend: -
feelslike:      26
precip_1hr:     0

Radar Static Image:     /tmp/68127-radar.png
Animated Radar Image:   /tmp/68127-amimatedradar.gif

More info about Weather Underground’s API: http://www.wunderground.com/weather/api
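My script is Perl, but just to show the shape of the data, here’s a Python sketch that pulls the current-conditions fields shown in the output above out of the API’s JSON. The key names are from my recollection of the (since-retired) Weather Underground API, so treat them as assumptions.

```python
# Extract the current-conditions fields shown in the output above
# from a Weather Underground API JSON response (key names assumed).
def current_conditions(api_response):
    obs = api_response["current_observation"]
    return {
        "conditions": obs["weather"],
        "temp": obs["temp_f"],
        "relative_humidity": obs["relative_humidity"],
        "wind": obs["wind_mph"],
        "feelslike": obs["feelslike_f"],
    }

# A made-up sample matching the console output above:
sample = {"current_observation": {
    "weather": "Overcast", "temp_f": 34, "relative_humidity": "86%",
    "wind_mph": 10.9, "feelslike_f": "26"}}
```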

Keep reading for my bit of Perl to grab it and make it usable.


Mar 08, 2014
 

Since I mostly deal with Windows computers, and partially because I couldn’t figure out why PhantomJS on my Debian netinstall rendered fonts to be UGLY, I figured I’d use one of the many Windows computers to render and email the reports. I’m using my Gmail account for testing this.

[Image: gmail-message]

Samples:
overview.pdf
overview.png

That looks pretty awesome, right? The secret sauce here is PhantomJS. It’s pretty much a headless WebKit: a browser without a display. And it’s cross-platform!

I use PhantomJS to take a screenshot for the PNG output and to print to PDF. I then use ImageMagick to crop the PNG to a reasonable size, and sendEmail to, well, send an email with the files attached.
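From a command prompt, that pipeline looks roughly like this. The URL, filenames, crop geometry, and credentials are made-up examples; rasterize.js is the sample script that ships in PhantomJS’s examples directory, and the sendEmail flags are its usual -f/-t/-u/-m/-s/-a set.

```shell
# Render the dashboard to PNG and PDF with PhantomJS (example URL).
phantomjs examples/rasterize.js http://dashboard.example/overview overview.png
phantomjs examples/rasterize.js http://dashboard.example/overview overview.pdf

# Crop the PNG to a reasonable size with ImageMagick (example geometry).
convert overview.png -crop 1280x720+0+0 +repage overview-cropped.png

# Mail both files; -xu/-xp are the SMTP credentials (example values).
sendEmail -f me@gmail.com -t me@gmail.com -u "Morning report" \
  -m "Reports attached" -s smtp.gmail.com:587 \
  -xu me@gmail.com -xp secret -o tls=yes \
  -a overview-cropped.png -a overview.pdf
```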

Enough jibber-jabber.

Feb 16, 2014
 

Part 4 of 4 – Part 1, Part 2, Part 3

Now that you’ve got all your logs flying through logstash into elasticsearch, how do you remove old records that are no longer doing anything but consuming disk space and RAM for the index?

These are all functions of elasticsearch. Deleting is pretty easy, as is closing an index.
http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/indices-delete-index.html
http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/indices-open-close.html
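Both operations are one-liners with curl if you ever want to do them by hand. The index name here is just an example; logstash names its indexes by day.

```shell
# Delete an index outright (example index name):
curl -XDELETE 'http://127.0.0.1:9200/logstash-2013.10.01'

# Or close it, keeping the data on disk but freeing the RAM:
curl -XPOST 'http://127.0.0.1:9200/logstash-2013.10.01/_close'
```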

The awesome people working on elasticsearch already have the solution! It’s called curator.
https://github.com/elasticsearch/curator
https://logstash.jira.com/browse/LOGSTASH-211

I like the idea of being able to let a cron job kick off the cleanup so I don’t forget.

To install it, we’ll first have to install pip.

sudo apt-get install python-pip

Then use pip to install elasticsearch-curator:

pip install elasticsearch-curator

When making a cron job, I always use full paths:

which curator
/usr/local/bin/curator

Edit the crontab. Any user should have access, so I’ll run this under my user:

crontab -e

Add the following line to run curator at 20 minutes past midnight (system time); it will connect to the elasticsearch node on 127.0.0.1, delete all indexes older than 120 days, and close all indexes older than 90 days.

20 0 * * * /usr/local/bin/curator --host 127.0.0.1 -d 120 -c 90

If you prefer an alternative, here’s one written in Perl:
https://github.com/bloonix/logstash-delete-index

Feb 16, 2014
 

That’s quite a title. I work with an ONSSI Ocularis CS setup, originally installed as NetDVMS but upgraded to RC-C.

This post builds upon a couple of earlier posts:
http://www.ragingcomputer.com/2014/02/logstash-elasticsearch-kibana-for-windows-event-logs
http://www.ragingcomputer.com/2014/02/sending-windows-event-logs-to-logstash-elasticsearch-kibana-with-nxlog

What does all this mean? This heavily redacted screenshot should give some idea.
[Image: kibana-ocularis-logs]
Number of overall motion events over time, same for failure events. Top list of cameras with motion events, top list of cameras with failure events.

You can see we’ve got a few failed cameras, likely a power surge or network failure. Having this information will lower the time to repair, minimizing camera downtime!

Feb 16, 2014
 

To make sure I understood how to find data using Kibana3, I started collecting input from IRC.

[Image: kibana-irc]

I have a ZNC bouncer set up on my network at 192.168.1.10.

http://wiki.znc.in/ZNC

I have it set to Keep Buffer, Prepend Timestamps.
Timestamp Format:

[%Y-%m-%d %H:%M:%S]
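That timestamp format is easy to split back off the message when ZNC plays the buffer back. A quick Python sketch of the idea (logstash would do the same job with a grok filter):

```python
# Split a ZNC-prepended timestamp off a message, matching the
# Timestamp Format above.
import re
from datetime import datetime

STAMP = re.compile(r"^\[(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\] (.*)$")

def split_stamp(text):
    """Return (datetime, message); datetime is None if no stamp found."""
    m = STAMP.match(text)
    if not m:
        return None, text
    return datetime.strptime(m.group(1), "%Y-%m-%d %H:%M:%S"), m.group(2)
```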


Feb 16, 2014
 

I have been on a logging kick (or obsession) lately. See the previous series of posts.

I’ll start with a picture. This is seriously cool. If you’re running pfSense, you want this.
[Image: pfsense-kibana]

BACKGROUND
My home network is pretty boring. Network is 192.168.1.0/24. Router is 192.168.1.254. Logstash is installed on 192.168.1.126.

Feb 16, 2014
 

Part 3 of 4 – Part 1, Part 2, Part 4
This is a continuation of http://www.ragingcomputer.com/2014/02/logstash-elasticsearch-kibana-for-windows-event-logs

Again, I took a lot of inspiration from http://sysxfit.com/blog/2013/07/18/logging-with-logstash-part-3/

The nxlog reference manual is surprisingly well written, with excellent examples.
http://nxlog.org/nxlog-docs/en/nxlog-reference-manual.pdf

Loggly has some examples I found useful, even if I’m not using their service.
http://community.loggly.com/customer/portal/articles/1266344-nxlog-windows-configuration
https://www.loggly.com/docs/logging-from-windows/

There are other options.
http://www.canopsis.org/2013/05/windows-eventlog-snare-logstash/
http://docs.fluentd.org/articles/windows
http://cookbook.logstash.net/recipes/log-shippers/
http://cookbook.logstash.net/recipes/windows-service/

Feb 16, 2014
 

Part 2 of 4 – Part 1, Part 3, Part 4
This is a continuation from http://www.ragingcomputer.com/2014/02/logstash-elasticsearch-kibana-for-windows-event-logs

The great folks working on Kibana have been so awesome as to provide an example nginx configuration!
https://github.com/elasticsearch/kibana/blob/master/sample/nginx.conf

Kibana prompting for login to save changes to the dashboard
[Image: kibana-login]

Before I start, I’ve got to tip my hat to the resources that helped me figure this out:
https://www.digitalocean.com/community/articles/how-to-create-a-ssl-certificate-on-nginx-for-ubuntu-12-04/
http://nginx.org/en/docs/http/configuring_https_servers.html

https://www.digitalocean.com/community/articles/how-to-set-up-http-authentication-with-nginx-on-ubuntu-12-10
http://nginx.org/en/docs/http/ngx_http_auth_basic_module.html

http://stackoverflow.com/questions/15503455/elasticsearch-allow-only-local-requests
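Stitched together, those pieces give a server block shaped roughly like the following. The certificate paths, htpasswd file, server name, and Kibana root are placeholder examples, and the location block follows the idea in Kibana’s sample nginx.conf of only proxying the elasticsearch endpoints Kibana actually needs.

```nginx
server {
    listen 443 ssl;
    server_name kibana.example.com;

    # self-signed cert per the DigitalOcean article (example paths)
    ssl_certificate     /etc/nginx/ssl/nginx.crt;
    ssl_certificate_key /etc/nginx/ssl/nginx.key;

    # basic auth so random visitors can't touch the dashboards
    auth_basic           "Restricted";
    auth_basic_user_file /etc/nginx/.htpasswd;

    # Kibana 3 is just static files
    root /var/www/kibana;

    # proxy elasticsearch, which is bound to localhost only
    location ~ ^/_aliases$|/_search$|/_mapping$ {
        proxy_pass http://127.0.0.1:9200;
    }
}
```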