Archive for the ‘Programming’ Category


Late last week I noticed that my new Nagios server was not responding anymore. Well, I checked it and it was down. Not only that, it was a VM on my test server and the entire server was down as well. Arrrgh.

Usually I use this as a segue to tell you all to remember to do your backups. Well, in this case I didn’t do them either. Hey, it’s a test VM server, right? Yeah, well, I am kicking myself about that anyhow. I just got Nagios working really well, the way I wanted it. Oh well, I guess I get to practice some more, right :-)

Well, as it turns out, my server had a catastrophic drive failure. I did EVERYTHING to try and resuscitate this thing. To start with, it had no partition table at all. Luckily I bought 2 of these servers and they were identically configured, so I checked out the partition table of the one and used fdisk to apply it to the broken one. After that I was able to fsck one partition, but as it would happen, that partition was only /boot. Feh. The other partition had lost all its superblock info. I couldn’t even use a backup superblock. Nada. I noticed that mkfs has a command line switch, -S, which writes the superblock info on a partition without formatting or touching the inodes. I tried that and it appeared to be successful. At least I could run fsck on the partition now and it was fixing the inodes. YAY! Except that after a few hours of fixing, I still got nothing but a few system files in a pile under the lost+found directory. Shortly thereafter the drive lost its partition info again anyway. That’s life I guess.

So, it was off to Microcenter to get a new hdd. I brought that home and did a fresh CentOS 5.3 32 bit install and played with it a bit and thought to myself, hey, maybe I should run some kind of burn-in test on this server before I go investing a lot of time into it again.

That is where Sys_Basher comes in. Sys_Basher is a multithreaded memory and disk exerciser. That’s what the website says, anyway. It makes a pretty good burn-in program by continually testing your memory and disk (which pushes on your CPU as well) for any length of time you specify. I kinda like it actually, and that is a good thing because there are woefully few burn-in or stress-test type programs available to the Linux community. In fact, if you are a programmer looking for a great project, you could generate a lot of traffic and interest by making one. Not that I don’t like Sys_Basher, mind you, but variety is the spice of life and certainly the way of open source!

Anyway, I ran Sys_Basher overnight on my new machine which passed with flying colors. Then, this morning, I decided that maybe I should run 64bit Linux on this box. Some days I am so fickle, but I decided it would be in my best interest to change up the OS before building a bunch of new test vms on there :-)

Maybe this time I’ll even back the darn thing up too! Wish me luck and, btw, do your backups!

Sunday, July 12th, 2009

Command Line Mail

Here’s one for the book:

I have a script that monitors a process and I want it to email my cellphone (to page me) if things don’t look just right. The problem is that just using “mail” or “mailx” in a script fails because my carrier divines whether or not my return address is real. Obviously a from field that looks like “root@localhost” is just not getting through.

What’s the solution? Enter “mutt”.

Mutt, it seems, will let you specify your from field in the ~/.muttrc file. Also, it works pretty much the same on the command line as mail or mailx. So, I set up my ~/.muttrc like so:

set realname = "menotyou"
set from = ""
set hostname = ""
set use_from = yes

And then, in the script I send mails like so:

echo "Wow I can send mail!" | /usr/bin/mutt -s "A present for you"

All in one line of course, but BINGO, all of a sudden my cell phone springs to life at all hours of the night with information I don’t want to know :-)
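For reference, the whole watchdog-plus-mutt pattern boils down to something like this. This is just a sketch, not my actual script: the process name, the carrier address, and the DRYRUN switch are all placeholders I made up so you can eyeball the message without actually sending anything.

```shell
#!/bin/bash
# Sketch of the process-check-then-page pattern; "somedaemon" and the
# address below are made-up placeholders, not my real setup.
page() {
    subject="$1"
    body="$2"
    to="5551234567@carrier.example.com"
    if [ -n "${DRYRUN:-}" ]; then
        # Dry run: print what would be mailed instead of sending it
        printf 'To: %s\nSubject: %s\n\n%s\n' "$to" "$subject" "$body"
    else
        # mutt picks up the realname/from/use_from overrides from ~/.muttrc,
        # so the carrier sees a real-looking sender and lets it through
        printf '%s\n' "$body" | /usr/bin/mutt -s "$subject" "$to"
    fi
}

# Page me if the watched process isn't running
if ! pgrep -x somedaemon > /dev/null 2>&1; then
    # DRYRUN=1 here so this example just prints; drop it to really send
    DRYRUN=1 page "somedaemon down" "somedaemon is not running on $(hostname)"
fi
```

Drop that in cron every few minutes and you have a poor man’s pager.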


Thursday, June 4th, 2009


Even though I wrote and use OSM, I also use Nagios at work (alongside OSM). Actually, I administer Nagios there; however, I have never actually installed and configured it. It was in place before I started there.

That being said, my manager asked me how to get it installed and running today, as he wants to try using it at home. This sort of spurred me into setting it up at home tonight. It’s really nice having a server that can handle a few test VMs, by the way :-)

I decided I would install it on CentOS, because I need to be able to get it running on RedHat for work, so off to Google I went. After a bit of searching I finally came across a WONDERFUL site which provides a quick and dirty script for getting Nagios installed and working lickety split. It works perfectly and the only adjustment I made to the script, other than changing the passwords in it, was to comment out the SELinux lines because I already have SELinux disabled.

That really was it. Pretty simple. Of course the rub here is actually getting Nagios to monitor your systems, and that is probably beyond the scope of this post, which was really meant as a reference for that install script. Configuring Nagios from the command line is not for the faint of heart. The files you need to pay attention to end up in /usr/local/nagios/etc and /usr/local/nagios/etc/objects. Just keep in mind that the configs seem to reference each other in a cyclical way and you really need to pay attention. I found a good bit of starter help at the bottom of this website for adding your first non-local machine. Once you get that working you’ll understand how to add more, but I still found it a bit of a frustrating experience for a few minutes.
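To give you a flavor of what lands in those object files, here is a minimal host/service pair of the sort you would drop into /usr/local/nagios/etc/objects. The host name and address are made up for illustration; the linux-server and generic-service templates are the stock ones that ship with Nagios.

```
define host {
    use        linux-server      ; inherit the stock host template
    host_name  testbox           ; made-up example host
    alias      Test VM
    address    192.168.1.50
}

define service {
    use                  generic-service
    host_name            testbox
    service_description  PING
    check_command        check_ping!100.0,20%!500.0,60%
}
```

Remember that any new .cfg file also has to be referenced from nagios.cfg via a cfg_file line, or Nagios will never read it.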

I did note, however, that there are quite a few projects out there which claim to configure Nagios for you via a web interface. I hope to give them a shot or two in the coming days/nights. Let me know if any of you have tried any and how they fare.

Monday, May 4th, 2009

CentOS 5.x and SVN server



Just a quickie tutorial on how to set up a web-based Subversion server on a CentOS box in a hurry.

To start with you need a CentOS 5.x install with a working web server.
# yum install subversion mod_dav_svn
Pick a directory where you want to house your repo. We’ll say for argument mine is /home/svn.
# vim /etc/httpd/conf.d/subversion.conf

&lt;Location /svn&gt;
DAV svn
SVNParentPath /home/svn
AuthType Basic
AuthName "Subversion"
AuthUserFile /etc/svn-auth
Require valid-user
&lt;/Location&gt;

Add yourself a user into your auth file:
# htpasswd -cm /etc/svn-auth yourusername
You’ll be prompted for your password a couple of times.
Add your directory and fix permissions:
# mkdir /home/svn
# chown apache:apache /home/svn
Create your first repo:
# svnadmin create /home/svn/test
Restart your web service:
# service httpd restart

That’s it! If you point your web browser to http://yourwebserver/svn/test you should get a “Revision 0” notice.

Monday, April 6th, 2009

Building an rpm to install script files

On an rpm based system, say CentOS, first make sure that the rpm-build package is installed.

In your user account, not as root (bad form and all), make the following directories:

mkdir -p ~/rpm
mkdir -p ~/rpm/BUILD
mkdir -p ~/rpm/RPMS
mkdir -p ~/rpm/SOURCES
mkdir -p ~/rpm/SPECS
mkdir -p ~/rpm/SRPMS
mkdir -p ~/rpm/tmp
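Incidentally, since the build tree is just a handful of sibling directories, bash brace expansion will do all of that in one shot:

```shell
#!/bin/bash
# One-liner equivalent of the mkdir calls above, via bash brace expansion
mkdir -p ~/rpm/{BUILD,RPMS,SOURCES,SPECS,SRPMS,tmp}
```

Same result, less typing.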

And create an ~/.rpmmacros file with the following in it:

%packager Your Name
%_topdir /home/YOUR HOME DIR/rpm
%_tmppath /home/YOUR HOME DIR/rpm/tmp

And now comes the fun part. Go to the ~/rpm/SOURCES directory and create a working package directory under that, named with the package name, a dash, and the major revision number. For example, ~/rpm/SOURCES/linc-1. Now into that directory you copy all the scripts/files that you wish to have in your package. For example, I might have a script in that directory that I want installed as part of the linc package.

Once that is done, make a tarball of that directory in the ~/rpm/SOURCES directory named programname-revision.tar.gz. Using my previous example it would be:

tar czvf linc-1.tar.gz linc-1/

Now for the glue that makes this all stick together. Go to your ~/rpm/SPECS directory and create a spec file for your package. We’ll call mine linc.spec and it’ll look like this:

Summary: My first rpm script package
Name: linc
Version: 1
Release: 1
Source0: linc-1.tar.gz
License: GPL
Group: MyJunk
BuildArch: noarch
BuildRoot: %{_tmppath}/%{name}-buildroot

%description
Make some relevant package description here

%prep
%setup -q

%install
install -m 0755 -d $RPM_BUILD_ROOT/opt/linc
install -m 0755 $RPM_BUILD_ROOT/opt/linc/

%post
echo " "
echo "This will display after rpm installs the package!"

%files
%dir /opt/linc

A lot of that file is pretty self-explanatory except the install lines and the lines after %files. The install lines tell rpm what to install where and with what permissions. You also have to do any directory creation there as well (the line with the -d in it). The entries after %files are similar in that they tell rpm’s database which files are attached to this package. The %dir marks a directory; otherwise the files are listed with their complete paths.

Now that you have all that together, the last thing you need to do is create the package. Just go to ~/rpm and run “rpmbuild -ba SPECS/linc.spec”. You will end up with ~/rpm/RPMS/noarch/linc-1-1.noarch.rpm if all goes well.

Monday, February 16th, 2009


I mentioned previously that I was thinking of adding some new functionality to my BashPodder pages. I wanted something to help me keep track of information and new user-contrib scripts there instead of me manually taking care of it all. The best solution I could think of was a forum. Then I got thinking that I might just be able to use that forum to help me manage multiple software projects. A good idea was forming when Dann started to demand that I set everything up that way. Enough procrastination, he insisted. He wanted to jump right in and be the first to post HIS BashPodder modifications. I couldn’t go fast enough. If you know Dann like I know Dann, you realize you just don’t tell him no or “bad things may happen”.

I conceded and set everything up for Dann. The results can be found at

Seriously though, I set up a nice forum for a few projects that I work on regularly. You will be able to find information there for BashPodder, OSM, Rackspace and TivoGrab as well as others. Please wander over there and register if you use any of those projects. It will be a lot easier to keep things straight this way, not to mention being able to easily share information with not only me, but each other as well.

Monday, January 19th, 2009

Catch up

I know I haven’t posted in a while again. It’s getting to be one of those things where I have so much going on I don’t know where to start :-)

I have been working on my RackSpace application a bit. Well, enough to get it registered on Sourceforge. Not sure what else to do with it there yet as it’s my first time on Sourceforge with anything and I haven’t had a lot of time to investigate.

I have been playing with my New Ferrets. Cute little fuzzies, they are. Puff is absolutely full of energy all the time. I also think he *may* be deaf – or at least hard of hearing. Teddy is very lovie, however, he has taken to play biting your arm if you hold him long enough. He thinks it’s playing, but he occasionally draws blood. Those fuzzies have darn sharp teeth!

I have been working on my OpenServerMon program a bit lately. Again, this is more of a monitoring framework that takes advantage of the telnet command for tcp port checks and takes advantage of the great bash scripting language. It’s amazing what you can do with a few bash modules! I am so lucky to be able to work on this at work as well. That just means that I will actually have something more substantial (and tested/vetted) to offer here. It’s also monitoring my personal servers just fine so far.

Finally got all my hardware here at the homestead running the way it should. I recently replaced both my router and my cable modem. Installed Ubuntu Intrepid on my wife’s laptop as she really needed a clean install. Just how *do* people get their systems crudded up like that? Anyhow, it looks like it runs really nicely and I would like to do the same on my main systems, but no time yet.

I wrote some backup scripts that will help me get my backups automated. Of course I didn’t add them into cron yet because I am stupid and a glutton for punishment, but they are written and work. Here’s your friendly reminder to do YOUR backups before you lose all your important stuff like I will.

Had to run to Staples today to buy a labeler. $20, ya can’t beat it, and it seems to do the job, although I haven’t had a lot of time to play with it yet. While I was there I picked up a Cyber Acoustics USB headset/microphone. I picked it up hoping that there would be some info somewhere letting me know if it worked with Linux. There wasn’t. I took the dive anyhow and opened it up to test, and it appears to work just fine. Of course I thought this would be a great thing for the TechShow since everybody pisses about our audio, and mine always seems to be the worst. Of course, there were immediate complaints about how crappy I sounded while I was on the show using the headset; however, I *think* I may have that nailed down to the codec I was using by default. No time to test again tonight so it’ll have to be next time.

Lastly (for right now anyhow) I have been working on my timelogger program. It’s been a long time since I have touched it, but I have been using it literally for over a year every day. Basically it’s a project/time tracker program written, of course, in bash. I am such a command line junkie. It also uses sqlite, which is a nice little DB that I think I will wrap into some logging on the OpenServerMon. And that’s how I tie all this together.

OH, and I also have a book or two from my friends at APRESS to review. Stay tuned for that too!

Wednesday, December 3rd, 2008

New Modem

I wrote a little while ago about my little service monitor script, OpenServerMon (yes, I know the name has changed already). Well, the reason I wrote the script initially is that I wanted to be notified when my home internet service went down. Wouldn’t you know it, it worked…. A LOT!

Apparently, I put this thing into place at just the right time, because my cable modem started to go on the fritz. My service cut out, sometimes several times a day, and each time I was paged. Obviously, this means my script works great; however, I quickly became annoyed with my crappy cable modem, and it was only a year and a half old. Well, tonight after resetting it twice, I finally hit Walmart, where I bought a nice Linksys cable modem.

To my amazement, this is the very first time I called Comcast and dealt with someone who wasn’t a complete idiot. I got a nice fellow on the phone who simply added my MAC address and I was up and running. Now hopefully this will be the end of my home internet problems, and of the annoying 3am pages about my service going down too!

Now as far as the OpenServerMon, I promise, I will be putting that up to share as well soon :-) I would really like to see someone else use/test it before I make it available to the masses though (if you want to beta for me, shoot me an email).

Sunday, November 30th, 2008


Apparently the code tags here don’t really like some code. This is made notable by the complete omission of some of the code that belonged in the sed commands of the script. So, in order for you all to actually get the complete code, I put the code up here. Have at it and enjoy!

Friday, October 31st, 2008


I was reading my RSS feeds today and ran across this great article telling how to use wget or curl to make a Twitter post. Well, I just happen to use an identica (free and all) server that my friend Dann set up, so that stuff didn’t quite work. Drat. Well, when I saw Dann pop online I suggested that he check into it, because it sure would be cool to be able to microblog straight from the command line. After all, commandline=good! Dann said he would look into it.

Well, as it turns out, I am terrible at waiting, and after cruising the net for some more specific information about how identica does things, I stuck together a nice little bash script myself to read from and post to that particular identica server. Here’s what the code looks like:

#!/bin/bash
# Based in part on article found at

# Change these to suit your own account and server. The values below are
# placeholders; point the URLs at your own identica/Twitter-style API.
user="yourusername"
pass="yourpassword"
maxitems=5
puburl="http://yourserver.example/api/statuses/public_timeline.xml"
posturl="http://yourserver.example/api/statuses/update.xml"

function readpub() {
    count=0
    curl -s "$puburl" > /tmp/tweet.tmp
    echo "-------------------------------------------"
    while read line; do
        if echo "${line}" | grep -q "text>"; then
            echo "${line}" | sed 's/<text>//g' | sed 's/<\/text>//g'
        fi
        if echo "${line}" | grep -q "created_at>"; then
            echo -n "${line}" | sed 's/<created_at>/ @: /g' | sed 's/<\/created_at>//g'
        fi
        if echo "${line}" | grep -q "screen_name>"; then
            echo "${line}" | sed 's/<screen_name>/ by: /g' | sed 's/<\/screen_name>//g'
        fi
        if echo "${line}" | grep -q "</status>"; then
            echo "-------------------------------------------"
            count=$((count + 1))
            if [ $count -eq $maxitems ]; then
                break
            fi
        fi
    done < /tmp/tweet.tmp
}

function postmsg() {
    curl -s -u "${user}:${pass}" -d status="$msg" "${posturl}" > /dev/null 2>&1
}

getopts "rp:" flag
case "${flag}" in
    r) readpub ;;
    p) msg=${OPTARG}; postmsg; readpub ;;
esac

If you would like to use this for identica (or Twitter, for that matter), there are only a couple of changes you need to make. You have to change the variables at the beginning of the script to reflect your own settings. user is obviously your username, and I understand that some services make you use your email address for this. pass is simply your password. Note, however, that if your username or password contains weird characters like spaces, you probably want to enclose it in double quotes. The maxitems variable tells the script how many posts to display; five items seems to fit nicely into a standard xterm window. Lastly, you need to alter the URLs there, or simply join the identica server in our own little corner of the web. The URL of the article included in the bash source code will give you the correct URLs for Twitter, and if you are using identica, you should be able to figure it out from the URLs I have provided in the script that connect to the server I use.

That’s really about it. The script is easy to use. Use the -r option to read the recent tweets. Use the -p “some text here” option to post “some text here” to your account. Do note, however, that at this time the program does not do any URL encoding/decoding, so watch your punctuation. If your post does not seem to actually post, that’s probably why.
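If the missing URL encoding bites you, here is one way you could bolt it on. This helper is my sketch, not part of the original script; it walks the message a character at a time and only handles single-byte characters, but that covers the punctuation problem described above.

```shell
#!/bin/bash
# Hypothetical percent-encoding helper; run the message through this
# before handing it to curl's -d status=... argument.
urlencode() {
    local s="$1" out="" c i
    for (( i = 0; i < ${#s}; i++ )); do
        c="${s:i:1}"
        case "$c" in
            [a-zA-Z0-9.~_-]) out+="$c" ;;          # unreserved characters pass through
            *) out+=$(printf '%%%02X' "'$c") ;;    # everything else becomes %XX
        esac
    done
    printf '%s\n' "$out"
}

urlencode "hello there, world!"   # prints hello%20there%2C%20world%21
```

In postmsg you would then send status="$(urlencode "$msg")" instead of the raw message.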

As always, let me know what you think and if there is any interest, I’ll be happy to post up a project page for it and we can make it work a bit better. In fact, I keep thinking that it’d work really well as a command line php script, but that limits the userbase a lot more than bash does :-)

Friday, October 31st, 2008