Mar 08 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

Since I mostly deal with Windows computers, and partially because I couldn’t figure out why phantomjs on my debian netinstall rendered fonts to be UGLY, I figured I’d use one of the many windows computers to render and email the reports. I’m using my gmail account for testing this.

gmail-message

Samples:
overview.pdf
overview.png

That looks pretty awesome, right? The secret sauce here is PhantomJS. It's pretty much a headless WebKit: a browser without a display. And it's cross-platform!

I use PhantomJS to take a screenshot for the PNG output and to print to PDF. I then use ImageMagick to crop the PNG to a reasonable size, and sendEmail to, well, send an email with the files attached.
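For the curious, the whole pipeline fits in a few commands. This is a sketch, not my exact script: the dashboard URL, crop geometry, filenames, and Gmail address are placeholders, and rasterize.js is the example script that ships in PhantomJS's examples folder. On Windows the same commands work with phantomjs.exe and sendEmail.exe.

```shell
# Render the Kibana dashboard to PDF and PNG using PhantomJS's bundled
# examples/rasterize.js (URL and output filenames are placeholders)
phantomjs rasterize.js "http://localhost/kibana/#/dashboard/overview" overview.pdf
phantomjs rasterize.js "http://localhost/kibana/#/dashboard/overview" overview.png

# Crop the PNG to a reasonable size with ImageMagick (geometry is a placeholder)
convert overview.png -crop 1280x900+0+0 +repage overview-crop.png

# Mail both files through Gmail with sendEmail (address/password are placeholders)
sendEmail -s smtp.gmail.com:587 -o tls=yes \
  -xu someone@gmail.com -xp PASSWORD \
  -f someone@gmail.com -t someone@gmail.com \
  -u "Kibana overview report" -m "Nightly dashboard render attached." \
  -a overview.pdf -a overview-crop.png
```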

Enough jibber-jabber.
Continue reading »

Feb 16 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

Part 4 of 4 – Part 1 | Part 2 | Part 3

Now that you’ve got all your logs flying through logstash into elasticsearch, how do you remove old records that are no longer doing anything but consuming disk space and RAM for the index?

These are all functions of elasticsearch. Deleting is pretty easy, as is closing an index.
http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/indices-delete-index.html
http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/indices-open-close.html

The awesome people working on elasticsearch already have the solution! It’s called curator.
https://github.com/elasticsearch/curator
https://logstash.jira.com/browse/LOGSTASH-211

I like the idea of being able to let a cron job kick off the cleanup so I don’t forget.

To install curator, we’ll first have to install pip.

sudo apt-get install python-pip

Then use pip to install elasticsearch-curator:

pip install elasticsearch-curator

When making a cron job, I always use full paths:

which curator
/usr/local/bin/curator

Edit the crontab. Any user should have access, so I’ll run this under my user.

crontab -e

Add the following line to run curator at 20 minutes past midnight (system time). It connects to the elasticsearch node on 127.0.0.1, deletes all indexes older than 120 days, and closes all indexes older than 90 days.

20 0 * * * /usr/local/bin/curator --host 127.0.0.1 -d 120 -c 90
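Before trusting the cron job, it's worth eyeballing which indexes exist so you know what curator is about to close or delete. A quick check, assuming the node is on 127.0.0.1 and your Elasticsearch is recent enough to have the cat API (it arrived in 1.0):

```shell
# List every index with its size and document count.
# Logstash indexes are named logstash-YYYY.MM.DD, so age is easy to spot.
curl -s 'http://127.0.0.1:9200/_cat/indices?v'
```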

If you prefer an alternative, here’s one written in perl.
https://github.com/bloonix/logstash-delete-index

Feb 16 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

That’s quite a title. I work with an ONSSI Ocularis CS setup. Originally installed with NetDVMS, but upgraded to RC-C.

This post builds upon a couple of earlier posts:
http://www.ragingcomputer.com/2014/02/logstash-elasticsearch-kibana-for-windows-event-logs
http://www.ragingcomputer.com/2014/02/sending-windows-event-logs-to-logstash-elasticsearch-kibana-with-nxlog

What does all this mean? This heavily redacted screenshot should give some idea.
kibana-ocularis-logs
Number of overall motion events over time, same for failure events. Top list of cameras with motion events, top list of cameras with failure events.

You can see we’ve got a few failed cameras, likely from a power surge or network failure. Having this information will lower the time to repair, minimizing camera downtime!
Continue reading »

Feb 16 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

To make sure I understood how to find data using Kibana3, I started collecting input from IRC.

kibana-irc

I have a ZNC bouncer set up on my network at 192.168.1.10.

http://wiki.znc.in/ZNC

I have it set to Keep Buffer and Prepend Timestamps.
Timestamp Format:

[%Y-%m-%d %H:%M:%S]
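With that format, logstash can pull the timestamp back out of each line. A sketch of the grok/date filter pair that would match it (the irc_message field name is my own, not from the post):

```
filter {
  grok {
    # ZNC prepends "[%Y-%m-%d %H:%M:%S] " to each buffered line
    match => [ "message", "\[%{TIMESTAMP_ISO8601:timestamp}\] %{GREEDYDATA:irc_message}" ]
  }
  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss" ]
  }
}
```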

Continue reading »

Feb 16 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

I have been on a logging kick (or obsession) lately. See the previous series of posts.

I’ll start with a picture. This is seriously cool. If you’re running pfsense, you want this.
pfsense-kibana

BACKGROUND
My home network is pretty boring. Network is 192.168.1.0/24. Router is 192.168.1.254. Logstash is installed on 192.168.1.126.
Continue reading »

Feb 16 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

Part 3 of 4 – Part 1 | Part 2 | Part 4
This is a continuation of http://www.ragingcomputer.com/2014/02/logstash-elasticsearch-kibana-for-windows-event-logs

Again, I took a lot of inspiration from http://sysxfit.com/blog/2013/07/18/logging-with-logstash-part-3/

The nxlog reference manual is surprisingly well written with excellent examples.
http://nxlog.org/nxlog-docs/en/nxlog-reference-manual.pdf

Loggly has some examples I found useful, even if I’m not using their service.
http://community.loggly.com/customer/portal/articles/1266344-nxlog-windows-configuration
https://www.loggly.com/docs/logging-from-windows/

There are other options.
http://www.canopsis.org/2013/05/windows-eventlog-snare-logstash/
http://docs.fluentd.org/articles/windows
http://cookbook.logstash.net/recipes/log-shippers/
http://cookbook.logstash.net/recipes/windows-service/
Continue reading »

Feb 16 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

Part 2 of 4 – Part 1 | Part 3 | Part 4
This is a continuation from http://www.ragingcomputer.com/2014/02/logstash-elasticsearch-kibana-for-windows-event-logs

The great folks working on Kibana have been so awesome as to provide an example nginx configuration!
https://github.com/elasticsearch/kibana/blob/master/sample/nginx.conf

Kibana prompting for login to save changes to the dashboard
kibana-login

Before I start, I’ve got to tip my hat to the resources that helped me figure this out:
https://www.digitalocean.com/community/articles/how-to-create-a-ssl-certificate-on-nginx-for-ubuntu-12-04/
http://nginx.org/en/docs/http/configuring_https_servers.html

https://www.digitalocean.com/community/articles/how-to-set-up-http-authentication-with-nginx-on-ubuntu-12-10
http://nginx.org/en/docs/http/ngx_http_auth_basic_module.html

http://stackoverflow.com/questions/15503455/elasticsearch-allow-only-local-requests
Continue reading »

Feb 16 2014

Edit: This post is pretty old and Elasticsearch/Logstash/Kibana have evolved a lot since it was written.

Part 1 of 4 – Part 2 | Part 3 | Part 4

Have you heard of Logstash / ElasticSearch / Kibana? I don’t wanna oversell it, but it’s AMAZING!

I’ll start with a screenshot. You know you want this. I have to blur a few things to keep some 53cr375 about my environment.
kibana-windows

This is my configuration for collecting Windows event logs. I’m still working out the differences between the Windows XP, Server 2008R2, and Windows 7 computers I’m collecting logs from, but this has already proven very useful.

If you don’t know about it yet, you should really go watch this webinar. http://www.elasticsearch.org/webinars/introduction-to-logstash/ I’ll wait.
Continue reading »

Nov 21 2013

It’s that time again! Another construction project at work!

It was decided that I would run another time-lapse webcam of the construction process. This was a little last-minute. Our electrician mounted an analog camera and plugged it into a network video encoder. http://www.axis.com/products/cam_m7001/

I haven’t had time to set up a proper linux box to manage all this, so I threw together a batch file and a scheduled task to run on my workstation every 5 minutes until I get the rest set up.

I love how easy it is to get a jpg off the Axis M7001 Video Encoder: http://xxx.xxx.xxx.xxx/jpg/1/image.jpg

I’m familiar with (and a fan of) wget. The win32 version works pretty well too.
http://gnuwin32.sourceforge.net/packages/wget.htm

This uses my favorite method of generating zero-padded time & date stamped files: it concatenates 0 and the variable, then takes the rightmost 2 characters.

constructioncamgrab.bat

@echo off
REM Create the date and time elements.
for /f "tokens=1-7 delims=:/-, " %%i in ('echo exit^|cmd /q /k"prompt $d $t"') do (
   for /f "tokens=2-4 delims=/-,() skip=1" %%a in ('echo.^|date') do (
      set %%a=%%j
      set %%b=%%k
      set %%c=%%l
      set hh=%%m
      set min=%%n
   )
)

REM zero-pad and see the result
set mm=0%mm%
set mm=%mm:~-2%
set dd=0%dd%
set dd=%dd:~-2%
set hh=0%hh%
set hh=%hh:~-2%
set min=0%min%
set min=%min:~-2%

echo %yy%%mm%%dd%%hh%%min%

set outFile=C:\construction_cam\%yy%%mm%%dd%%hh%%min%.jpg

"C:\Program Files (x86)\GnuWin32\bin\wget.exe" http://xxx.xxx.xxx.xxx/jpg/1/image.jpg -O "%outFile%"
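Once that proper linux box materializes, the same grab gets much shorter, since date(1) zero-pads every field itself. A sketch, with a placeholder output path of my own, and the wget line left commented because the camera URL is redacted above:

```shell
#!/bin/sh
# Linux take on constructioncamgrab.bat: date(1) zero-pads every field,
# so no "concatenate 0 and take the right 2 characters" tricks are needed.
stamp=$(date +%Y%m%d%H%M)                      # e.g. 201311211405
outfile="/var/construction_cam/${stamp}.jpg"   # placeholder path, not from the post
echo "$outfile"

# The actual grab (camera URL left redacted, as in the batch file):
# wget -q "http://xxx.xxx.xxx.xxx/jpg/1/image.jpg" -O "$outfile"

# And instead of a scheduled task, a crontab entry every 5 minutes:
# */5 * * * * /usr/local/bin/constructioncamgrab.sh
```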

Oct 20 2013

Maybe someone will find my notes helpful.

Configuration Overview

Win 8
Classic Shell http://www.classicshell.net/
autologin http://www.howtogeek.com/112919/how-to-make-your-windows-8-computer-logon-automatically/
Installed SABnzbd/Sickbeard/CouchPotato/uTorrent/Plex Media Server

SABnzbd/Sickbeard/CouchPotato in startup folder

Internal 3TB drive mounted as Z:

Video stored in
Z:\Video\TV
Z:\Video\Movies
Z:\Video\MoviesAnime
Z:\Video\MoviesChildrens
Z:\Video\MoviesClassic
Z:\Video\MoviesDocumentary

I just move movies around after CouchPotato renames/adds metadata

I have plex media server set to scan/index media files every 15 minutes. Sickbeard is also set to notify Plex on new download

Downloaded files are initially on the C: drive

C:\Downloads\Audio - Reserved for when I get around to setting up Headphones
C:\Downloads\Complete - Default folder for downloads without a category
C:\Downloads\Incomplete - Working folder for SABnzbd
C:\Downloads\Movie - Destination for movie category - where CouchPotato looks for downloaded files
C:\Downloads\Torrent - Working folder for uTorrent
C:\Downloads\TV - Destination for TV category

SABnzbd
Installed taking default settings

Configuration

General
port 8080
make sure the API key is available. It's required for Sick Beard and CouchPotato

Folders
Temporary Download Folder: C:\Downloads\incomplete
Completed Download Folder: C:\Downloads\complete
Post-Processing Scripts Folder: C:\SickBeard\autoProcessTV

Switches
Launch Browser on Startup: Unchecked

Servers
Add Server – Follow config from usenet provider

Categories (Category / Priority / Processing / Script / Folder)
audio Default Default Default C:\Downloads\Audio
movie Default Default Default C:\Downloads\Movie
tv Default Default sabToSickBeard.exe C:\Downloads\TV

RSS
Add DOGnzb bookmark feed

Sickbeard
Extracted to C:\SickBeard
made shortcut to SickBeard.exe in startup folder

Config
General
Port 8081
Enable API key

Search Settings
NZB Method: SABnzbd
SABnzbd URL: http://localhost:8080/
SABnzbd API Key: *** The Key Copied from SABnzbd ***
SABnzbd Category: tv

Search Providers
Configure Custom Newznab Providers
Name / URL / API Key
Provider Priorities
dognzb.cr
Sick Beard Index
Womble’s Index
SmackDownOnYou

Everything else unchecked

Post Processing
TV Download Dir: Z:\Video\TV
Rename Episodes: Checked
Name Pattern: Custom…
Season %0S/%SN – S%0SE%0E – %EN
Multi-Episode Style: Duplicate

Custom Air-By-Date: checked
Show Name – 2011-03-09 – Ep Name

Metadata Type: XBMC
All boxes checked
Use Banners: Unchecked

Notifications
Plex Media Server
Enable: Checked
Update Library: Checked
Plex Media Server IP:Port: Machine IP Address

Prowl
Enable: Checked
Notify on Download: Checked
Prowl API key: *** API Key from prowlapp.com ***

CouchPotato
Installed taking defaults

Check Show Advanced Settings

General
port 8082

Searcher
Preferred Words: x264
Ignored Words: german, dutch, french, truefrench, danish, swedish, spanish, italian, korean, dubbed, swesub, korsub, dksubs, GERMAN, DUBBED, R5, R5BR, R5BD, DVDScr, CAM, TS, TV-ripm, cd1, cd2

Providers
Newznab: checked
Add nmatrix and DOGnzb URL and API key
Nzbindex: Checked

Quality
I made a Custom quality profile named “Custom”
I prefer in order
BR-Rip
DVD-Rip
720p
1080p

Sizes
1080p 600 2400
720p 600 2400
BR-Rip 600 2400
DVD-Rip 600 2400

Downloaders
Sabnzbd: Checked
Host: *** IP Address of machine ***
API Key: *** API Key copied from SABnzbd ***
Category: movie

Renamer
Run Every: 10
To: Z:\Video\Movies
Folder Naming: ()
File Naming: .
Cleanup: checked
From: C:\Downloads\Movie\
Force Every: 1

Rename .NFO: Checked
NFO Naming: .orig.

XBMC: Checked
NFO: checked

Notifications
Prowl: Checked
Api Key: *** API Key from prowlapp.com ***

Plex: checked
Host: *** IP Address of machine ***

Automation
Watchlists
IMDB: Checked
URL: *** RSS URL of IMDB Watchlist ***

Manage
Movie Library Manager: Checked
Movie Folder: one for each of the following-
Z:\Video\Movies
Z:\Video\MoviesAnime
Z:\Video\MoviesChildrens
Z:\Video\MoviesClassic
Z:\Video\MoviesDocumentary

Plex
Installed latest PlexPass version (details in plexpass forum)
Took default settings

In plex web that popped up after install

signed in with my.plexapp.com username

Add Section
TV Shows - Z:\Video\TV
Movies - Z:\Video\Movies
Anime - Z:\Video\MoviesAnime
Children's - Z:\Video\MoviesChildrens
Classic Movies - Z:\Video\MoviesClassic
Documentary - Z:\Video\MoviesDocumentary

Settings (wrench and screwdriver in upper right)
Plex Media Server
Library
Update my library periodically: checked
Library update interval: every 15 minutes

Plex/Web
player
Local Quality: 3Mbps, 720p
Remote Quality: 1.5Mbps, 480p
Direct Play: checked
Direct Stream: checked