Archive for the ‘System Administration’ Category

Replace HD in Dell Inspiron N5110

I am no stranger to replacing bad equipment in servers, desktops and laptops, but some laptops don’t make it easy. This was one.

A couple of years ago I swapped the aging hdd in an older Dell Inspiron for a new ssd and, boy, the performance improved drastically. Lately I have been using a new(er) Inspiron, an N5110, and noticed that things like bootup and Chrome’s initial load sure took a while. It was really starting to annoy me, so I looked up the specs on the original hdd and found that there was a squirrel in there pounding out the bits with a chisel, so I decided it was high time for a modern drive and splurged on a 240Gb ssd. I assumed that this was a simple pull-the-panel-off-the-bottom-and-swap kind of procedure like the old Dell, so I pulled off the hdd-sized panel and boom. The only thing under there was more plastic and a small memory slot???!!

Not to be outdone, I turned to YouTube, just like any self-respecting techie would, and was pleased to find some instruction there. You can find the video I used here if you are interested:

That is where it starts to get fun. Apparently you have to disassemble THE ENTIRE LAPTOP to get the hdd out. You have to pull out the battery, memory, all the screws on the bottom, the dvd drive, then flip the machine over and pull off the keyboard, unscrew and pull off the top plate and all the ribbon cables, then unscrew and remove the entire motherboard and one of the monitor mounts. The hdd is underneath the motherboard. Unreal.

Believe it or not, after all that I only had one extra screw(?) and the laptop booted up on the first try. Now came the good part: how to get my existing Linux Mint install onto the new ssd. Normally I would have just used a disk cloning program or dd to do it, but the old hdd was 500Gb and this new ssd is only 240Gb. There are also some complicated tutorials on the web on how to accomplish this task, but let me share the easy way with you.

Do a clean install of your OS. Really. With Linux it takes 15 minutes tops. Don’t bother with any of your configs or personalization. It’s a dummy install that not only gets the partitioning correct on your ssd but also generates the correct /etc/fstab file (or, put another way, gets the new uuids in place and makes the correct partitions bootable).
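
If you want to sanity-check what that dummy install produced, blkid will show you the uuids it assigned (a quick sketch; my ssd showed up as /dev/sda, yours may use a different device name):

# list every partition and the uuid assigned to it
blkid
# or just the root partition of the new ssd
blkid /dev/sda1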

Once you are done, boot into your install media again (I used USB because it was faster) and mount your new installation AND your old hdd (I used an external usb drive case for this). I made the directories I needed by doing (as root) “mkdir -p /mnt/newdisk ; mkdir -p /mnt/olddisk” and then putting things in place with “mount /dev/sda1 /mnt/newdisk ; mount /dev/sdc1 /mnt/olddisk”. I should mention here that my partitions were the default Mint layout with a big Linux partition first, then an extended partition, then swap, on both drives.
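
For reference, here is that whole mount step in one spot (a sketch of what I ran, as root from the live USB; sda1 was the root partition on my new ssd and sdc1 the root partition on the old hdd in its usb case, so adjust the device names for your setup):

# create the mount points for both drives
mkdir -p /mnt/newdisk /mnt/olddisk
# mount the root partition of the new ssd and of the old hdd
mount /dev/sda1 /mnt/newdisk
mount /dev/sdc1 /mnt/olddisk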

Once mounted I made a backup copy of the /etc/fstab on my olddisk (the hdd) and then I copied the /etc/fstab from the newdisk to the /etc/fstab on the olddisk. Now the fun part. Go to (cd) the /mnt/newdisk directory. MAKE SURE IT’S THE NEWDISK DIRECTORY, and “rm -rf *”. That is going to delete all the files you just installed. It’ll only take a second.
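
Spelled out as commands, that step looks roughly like this (paths and device names are from my setup; the one thing you absolutely must get right is being inside /mnt/newdisk before the rm):

# keep a safety copy of the old drive's fstab
cp /mnt/olddisk/etc/fstab /mnt/olddisk/etc/fstab.bak
# put the new fstab (with the new uuids) onto the old drive so the copy step carries it back over
cp /mnt/newdisk/etc/fstab /mnt/olddisk/etc/fstab
# wipe the dummy install; triple-check your current directory first!
cd /mnt/newdisk
rm -rf *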

Next is the long part. I used rsync to copy all my old files over. If you aren’t a hoarder like me with six linux dvd isos in your download directory and 50Gb of music files, it’ll go a lot faster, but all the same, it’s pretty cool to watch. I did a “rsync -rvlpogdstHEAX /mnt/olddisk/ /mnt/newdisk”. Make note of those /’s in there or you’ll end up having to move stuff around afterwards. In retrospect, I think you could use just rsync -av, but ymmv. What you will see is every file on your old drive being copied to the new one. Like I mentioned, this takes a few minutes, so just sit back or grab a coffee. Once it’s done you are *almost* ready.
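
Here is the copy command on its own line so the slashes are easy to see (the first command is the one I actually ran; the shorter archive form is the one I suspect would also work, so ymmv):

# copy everything from the old drive onto the new one, preserving permissions,
# ownership, timestamps, hard links, ACLs and extended attributes
rsync -rvlpogdstHEAX /mnt/olddisk/ /mnt/newdisk
# probably good enough for most setups (untested by me):
# rsync -av /mnt/olddisk/ /mnt/newdisk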

The very last thing you’ll need to fix is your grub.cfg file. These days everyone wants to use uuids to assign devices, and your boot file is still looking for your old hdd. Open up a couple of terminals. In one, vi /mnt/newdisk/boot/grub/grub.cfg and in the other vi /mnt/newdisk/etc/fstab. In the fstab file you will see the uuid for your new ssd drive. It’s the first uuid mentioned and it is mounted at /. You need to replace the old uuid in grub.cfg with the new one from your fstab. It’s easier than you think in vi. Just do a “:g/olduuidstring/s//newuuidstring/g” and hit enter, where olduuidstring is your old uuid and newuuidstring is your new uuid from the fstab file. Once it is finished replacing, you will probably need to save with a “:wq!” because your system will undoubtedly say it’s a read only file. Then reboot! You should be greeted shortly with a much faster but very familiar linux install, complete with all your goodies.
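
If vi is not your thing, sed can do the same substitution; this is only a sketch with placeholder uuid strings, not something lifted from my terminal:

# show the new root uuid from the fstab the installer wrote
grep UUID /mnt/newdisk/etc/fstab
# swap every occurrence of the old uuid in grub.cfg for the new one
sed -i 's/OLD-UUID-GOES-HERE/NEW-UUID-GOES-HERE/g' /mnt/newdisk/boot/grub/grub.cfg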

One last note. You may want to increase the life of your ssd by adding a couple of options to your /etc/fstab file. Those options are discard and noatime. These options deal with extra disk writes that you really don’t need on an ssd. Your / line options in the fstab should look something like “ext4 discard,noatime,errors=remount-ro 0 1”.
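
As an example, a full root entry with those options added would look something like this (the uuid below is just a placeholder; keep the one already in your fstab):

# /etc/fstab root entry for the ssd with discard and noatime added
UUID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx  /  ext4  discard,noatime,errors=remount-ro  0  1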

Enjoy!

Wednesday, March 16th, 2016

It’s NOT Telecommuting!


OK, so it is telecommuting – but hear me out for just a second..

I have been involved in a job search as a Linux admin for a few months now and one of the barriers I keep running into is (get this) physical location, or company location. WHY? Business owners, let me reason with you for a moment here.

Your servers are “in the cloud”:
There are a LOT of companies these days who are using cloud servers and services. Buzzwords like PaaS, SaaS and IaaS are all the rage now, along with their providers AWS, Rackspace, Azure, Google and the like. These services that you use locally for your business are not actually located at your business. Likely, they are not even in the same time zone, and, in some cases, country. Every time one of your server administrators or users accesses those services and systems, they are doing so remotely, even if they are sitting at a desk next to you in your corporate headquarters.

You have “datacenters”:
For those of you who have your own datacenters for your machines, you have the same issue. Most companies have at least two such facilities for redundancy, and either one or both of them are typically located away from your corporate campus. This, again, means that when you are working on them in any capacity, you are doing so remotely, or “telecommuting”, whether it be from your corporate campus, from home, or from across the world.

So you see, in almost every scenario in these modern times, you are already telecommuting to use your own resources. I am here to implore you to consider expanding your employment pool by letting computer workers do their jobs remotely. Save yourself some real estate space. Use conference calls, instant messaging, emails and video chats (free) for your office communications. Dramatically lower your corporate utility bills and *paper costs*. And give someone like myself a shot. You’ll be happy you did!

Tuesday, March 1st, 2016

“Fixing” an old laptop

Dell Inspiron 1545

A few years ago when I was in the market for a new laptop I picked up one of the then wildly popular and cheap Dell Inspiron 1545s. There are gobs of these running around now and you can find them cheap if you look (click the pic for links to Amazon). I used this for, it seems, forever. I only ever had one problem with it – a small plastic chip in one of the corners that I repaired with superglue (you would never notice). Lately, though, it has been running noticeably slow. I don’t know if it’s because it’s actually getting slower, the software is just getting fatter, my work computer is blazing fast in comparison, or a combination of any/all of those. Either way, it’s really been bugging me so much lately that I had considered just getting a new lappy. Before I did, I decided to look over the specs to see what I actually had here. Mine is a core duo 2.2Ghz with 4Gb ram and a 320Gb HDD. Running Linux, this thing *should* run like it was on fire. So why so freaking slow?

A quick look at “top” revealed what had to be the problem. I was at almost 0% CPU and using only 1.5Gb of ram. It HAD to be the slow-as-pencil-and-paper hard drive reads and writes. A quick search showed that somewhere between now and the last time I came up for air at work, SSD prices dropped dramatically, so I stopped by a bigbox store, picked up a 240Gb SSD for <$100, screwed it in and WHAMO! It’s like I have a brand new laptop! Seriously! Not only is the difference noticeable, it’s amazing, so much so that I needed to break my blogging silence to tell you about it. If any of you have an aging laptop like mine that runs but is “meh”, it’s totally worth it to spend the 15 minutes it takes to do this upgrade. It certainly just saved me $500 and I am now, once again, perfectly happy with my trusty old (but well kept) Dell Inspiron 1545.

Sunday, April 26th, 2015

Review: Penetration Testing with the Bash shell by Keith Makan – Packt Pub.

Penetration Testing with the Bash shell

I’ll have to say that, for some reason, I thought this book was going to be some kind of guide to using only bash itself to do penetration testing. It’s not that at all. It’s really more like doing penetration testing FROM the bash shell, or command line if you like.

The first 2 chapters take you through a solid amount of background bash shell information. You cover topics like directory manipulation, grep, find and some regular expressions, all the sorts of things you will appreciate knowing, or at least having a good topical smattering of, if you are going to be spending some time at the command line. There is also some time spent on customization of your environment, like prompts and colorization and that sort of thing. I am not sure it’s really terribly relevant to the book topic, but still, as I mentioned before, if you are going to be spending time at the command line, this is stuff that’s nice to know. I’ll admit that I got a little charge out of it because my foray into the command line was long ago on an amber phosphor serial terminal. We’ve come a long way, Baby 🙂

The remainder of the book deals with some command line utilities and how to use them in penetration testing. At this point I really need to mention that you should be using Kali Linux or BackTrack Linux because some of the utilities they reference are not immediately available as packages in other distributions. If you are into this topic, then you probably already know that, but I just happened to be reviewing this book on a Mint system while away from my test machine and could not immediately find a package for dnsmap.

The book gets topically heavier as you go through, which is a good thing IMHO, and by the time you are nearing the end you have covered standard bash arsenal commands like dig and nmap. You have spent some significant time with metasploit and you end up with the really technical subjects of disassembly (reverse engineering code) and debugging. Once you are through that you dive right into network monitoring, attacks and spoofs. I think the networking info should have come before the code hacking but I can also see their logic in this roadmap as well. Either way, the information is solid and sensical, it’s well written and the examples work. You are also given plenty of topical reference information should you care to continue your research, and this is something I think people will really appreciate.

To sum it up, I like the book. Again, it wasn’t what I thought it was going to be, but it surely will prove to be a valuable reference, especially combined with some of Packt’s other fine books like those on BackTrack. Buy your copy today!

Wednesday, July 16th, 2014

Linux System Administration LiveLessons By Ben Whaley (Pearson)

http://www.informit.com/store/linux-system-administration-livelessons-video-training-9780133551310

Wow, where do I even start? This is a LOT of material and, really, my first review of a lengthy video series. The series consists of 9 downloadable .mov files which total approximately 1.3Gb of space and around 350 minutes of video, or about 5.5 hours according to my video player’s calculations.

The first noticeable bonus from a video series as opposed to a book, is, well, video. You get to watch commands and examples in real time along with the information. Of course, the inverse is also true and if you are looking for quick reference or brevity then a book is really the way to go. Somehow, however, it almost seems as though I tend to get less distracted from the content with video than with a book. That can indeed be a bonus!

There are 9 video sections or selections in this series and they are as follows: Where to start, The Shell, Booting and Shutting Down, Access Controls and Root Powers, Controlling Processes, The File System, Log Files, TCP/IP Networking and finally, Security. This really is an exceptionally wide range of information to cover, I think, and that brings me to my review.

This video series says it is aimed at Linux beginners, administrators familiar with other OSes and anyone interested in learning about Linux. All in all, I think that covers exactly everybody, everywhere. If you combine that with the enormous amount of information that needs to be covered in the subject material, it just makes the objective impossible. I found the information good in some areas, too advanced for general and new users in others, and completely missing in places as well. Even topically it seems a bit disjointed to me, for instance talking about how to “start out” without ever stepping through an actual Linux install, just using some pre-built virtual machine copy. You hear a lot about running Linux via Vagrant and Virtualbox but as an actual System Administrator, I can assure you, that is not how most people run it. I realize we are talking nuts and bolts OS stuff here but I also found the content a bit dry. Some user or admin stories would have helped a great deal in that area. I would think finding a way to keep the interest of your audience would be even more paramount when dealing with dry technical content.

Now, does this mean it was all bad? Not at all, and don’t walk away from this review with that impression. There is some genuinely good information buried in there for most Administrator levels, just realize that if something sounds too advanced or technical for you, skip to the next video chapter, much like you would in a book. Ben seems to not only know what he’s talking about but I don’t think I noticed him saying “er” or “ah” or “um” in nearly 6 hours of video 🙂 Usable as it is, the perfect fix for this would be to split the info up into 2 *much* shorter general videos. Aim one of them at the total beginner and aim the other at the advanced user. You may even want to break off some of the heavier topics for their own videos where they can get more specialized attention. Networking would be a great candidate for that.

I love Pearson to death as they have some of the best techie content out there, but this one needs some work I think.

Monday, June 9th, 2014

Linux Recruiting

I get a LOT of emails from headhunters, many asking me to come work for them doing every-damn-thing for no money as a “consultant” on (only) a 6 month contract 🙂 I am sure all tech people do. Occasionally I get email from a recruiter who is actually asking me for help looking for a decent Linux person. I got one of those this afternoon. In summary, the email went like this:

I am looking for (Linux Admin) and you probably aren’t looking but I am having a hard time and could you help point me somewhere I can find one?

I always respond to those emails, and, for posterity and for any recruiters watching, here’s the answer:

Not necessarily true. I am always looking 😉
I get a lot of requests and offers, and I’ll tell you what turns me off; that may help you find someone. Linux guys with any experience are in really short supply and they are a unique breed of techie. Most are driven to Linux by the premise of free software and/or open source ideals, and as such they do not necessarily have (current) windows skills and are even more likely to not be interested in using any that they do have. I fit into that category. Also, not every Linux guy is a java programmer/desktop technician/helpdesk/printer mechanic/insert other required skill set jumble here. I see a lot of those. “We need a Linux guy that will fix our windows desktops, program new device drivers, fix our mainframe and telephone system, sweep floors and wash cars” kind of things. Those kinds of people do not exist 🙂 Lastly is the compensation. Most companies have dealt with the influx of paper-certed, dime-a-dozen MCSEs for their technical needs and they truly believe that anyone out of grade school can “do tech” for them. It has greatly devalued the industry as a whole. They do not understand that the real high-skilled people are rare and expensive and can *easily* find work, which is why most Linux/Unix people have not been affected by the technical recession.

So I guess in short,
Linux guys are almost always staunch Linux guys (and if they are not, be suspicious).
Be specific in what you need, but remember that these kinds of tech guys are quick at catching on to related technologies, so try and be general where you can. For example, there are a bunch of scripting languages and all of them are capable of getting the job done, so say you need a scripter instead of a perl scripter.
Be prepared to offer more compensation for a rarer Linux tech than you would an unemployed Windows tech.
Advertise in the right circles. When I get offers, I often send them out to some of the mailing lists of Linux techs I am on, and there are some great Linux groups on Facebook and Google Plus. There are also websites like Linuxquestions.org where Linux geeks hang out.
Lastly, if all else fails, try a few less experienced Linux guys.

Thursday, December 12th, 2013

Advanced Programming in the Unix Environment 3/ed

Good gracious this is a big book! What’s funny is I KNOW I have read and reviewed a previous edition of this book and I spent half an hour looking for that review this morning, but it must have been before I moved and on my old Blog. That being the case, well, it’s high time you heard about this monster!

This book, Advanced Programming in the Unix Environment, by Stevens and Rago, is the 3rd edition of what is, essentially, the Unix Programming Bible. In fact, so much so that I cannot imagine any serious Unix/Linux/**ux contributor that doesn’t own a copy or at least know what it is.

This is *not* light reading. It is a reference book. This is the stuff geek dreams are coded in and you are going to want to be familiar with the C language to get a lot of this.

All the internal workings and ideas about this kind of operating system, how it works, or is supposed to work, and code examples are included here. The least technical chapter in here is the 1st, which is the overview chapter. This goes over things like input/output, files/directories, processes, error handling, and system calls. From there, the chapters narrow in on more specific subjects like Process control, Daemons, Signals, Threading, etc. Like I said, there is a LOT of very specific information in here. That being said, if you are developing anything more than some scripting, this has what you want to know. This is not to say that those are the only folks that can get anything out of this book, though. Even without understanding the code examples, a person could get a good understanding and overview of how this fantastic type of operating system works, and why. This is the category I find myself in more than any other. Although I have done some C programming, I find myself using this book to help me conceptualize how things are working in the background.

No self respecting Unix/Linux geek should be without this book in one format or another. The hard copy I have was sent to me by Pearson Education for the purpose of review. They sell this book in dead tree format for $70 and $45 for the electronic version. That may sound like a fair bit of money, however, remember this is not a story book you read once, this is going to be something you turn to for the right information when you need it. I almost always give away my review books after I read through them, but this one is sticking around. In fact, I am just going to take it to work with me so I can have it handy where I would normally need the information anyway.

Saturday, September 7th, 2013

BackTrack 5 Cookbook: Quick answers to common problems

BackTrack 5 Cookbook

You know, sometimes, just sometimes something fortuitous happens to me. This was one of those times.

I was contacted by my friends over at Packt Publishing to review their new book on BackTrack. Of course I said sure. Hey, I am a Linux junkie after all! It had actually been quite a while since I had played with BackTrack and this gave me *just* the incentive I needed, but let me tell you a bit about the book…

The book is a “cookbook” style book which gives you “recipes”, or guided examples, of common problems/scenarios and their fixes. The book is well written, a good reference for a pro, and a great tutorial for the beginner, and by beginner I am assuming that the person *does* have Linux experience, just not BackTrack experience, as some command line comfort is pretty much a necessity for this kind of work. The first 2 chapters start you out exactly the way they should, by installing and customizing the distribution. What they don’t tell you is that it takes a good while to actually download the distro, but that is beside the point.

Once you actually get things running well, you can follow the book through some really decent examples from Information Gathering all the way through Forensics. The book covers all manner of subjects and applications in between, such as NMAP, Nessus, Metasploit, UCSniff and more. I mentioned that this was fortuitous for me and that was because one of the things the book covered was the Hydra program, and, as it turns out, that was the perfect tool for me to use in remediating some password synchronization issues across several hundred servers.

Anyone using a computer should have at least a basic understanding about keeping their valuable data safe, whether that data is for a multi-million dollar company or your own invaluable family photographs. This book goes to great efforts to not only explain how to detect, analyze and remedy such issues, but also gives important background about just how systems become vulnerable to begin with. If only for that reason alone, it’s worth the read. If you are actually a sysadmin, this information is a must. For $23 for the ebook version, it’s a no brainer. Good book. It helped me out and I’ll wager that if you give it a read it’ll do the same for you!

Monday, February 18th, 2013

Screenshots

I have long been fascinated by different people’s computing environments. Somehow I believe it shows a little glimpse into someone’s mind. With that in mind, I thought it might be interesting to other people as well, so I polled a group of my friends, who are some of the most influential computing buddies I have. Here is what they sent:

Name: A.W.
What do you do?:
I’m a NetApp Wrangler and Windows Sysadmin by trade. Looking to add storage admin as well (EMC/Cisco).
Tell me about your DE?:
My main workstation is my MacBook. I identify with this machine the most and my desktop environments tend to show my personality and style choices. I like IBM style green on black terminals which I have been addicted to ever since I installed my first AIX machine (a POWERStation 320 that I got for free from my ex-girlfriend’s office). The desktop is a stylized Sylvanas the Banshee Queen of the Undead from World of Warcraft. I don’t currently play the game but I’m into zombies and undead stuff as art and game play (and hot pale powerful gothy women). My Windows 7 machine is a gaming machine and also used to do my work as it’s the best machine to log into our VPN with. It’s an Alienware with the Phobos Red theme and the LEDs are currently all set to red with a pulsating skull on the front. It’s kind of Darth Vader. Alienware does nice themes and some of the nicest pre-installs I’ve ever seen (yes, the first time I didn’t wipe the OS that came with the system)… It has no shovelware. I’ve owned the Powermac G5 Quad for years and bought it to be the last and best PowerPC machine. Eventually I was no longer using it as I supplanted its use with my MBP which I can carry all over the house and use wireless N with. Wanting to breathe new life into it, it became a PPC Linux test box and I’ve found the best environment with Fedora Core 17 Beefy Miracle. I’ve replaced the desktop graphic with something nicer than the default fireworks that is still Fedora themed. The Firefox window is a shot of my home file server control panel. It’s a red aluminum cased custom AMD A4 build with 8 GB of RAM, 6 x 2TB Seagates (SATA3) in ZFS RAID6 and a memory stick to hold FREENAS 8.0.4 x64 MULTIMEDIA. Since it’s red I named it after my favorite discontinued Mott’s beverage: Beefamato.
aw2
aw3
aw1

Name: D.C.
What do you do?:
Programmer and professional Bearded Curmudgeon.
Tell me about your DE?:
vim is my IDE, and I have a window open full screen, split into up to eight or so buffers on my main screen. On a second screen I have terminals for running my code’s tests, viewing logs, and for talking to colleagues who work all over the world – my team is split between Utah, the UK, Moscow, and anywhere else that we can find good people. My windows are all slightly transparent when inactive, as it makes it easier to find stuff if I can see it when it’s behind something else. I do, of course, use focus-follows-pointer and click to bring to front, but almost all my navigation is via the keyboard. When I do need to move the pointer, I use a trackball. Desktop? Yeah, there’s one under there somewhere, but I hardly ever see it. It’s a plain neutral colour with no icons on it so it doesn’t interfere with window transparency.
dc

Name: J.B.
What do you do?:
Senior Software Engineer working on cloud managed digital media systems for the retail environment.
Tell me about your DE?:
Windows 7. I run Linux on my desktop, but I never felt like having the distribution to work to change what’s on my laptop, and I use the laptop the vast majority of the time.
jb

Name: J.F.
What do you do?:
Solutions Architect, Enterprise Services, HP.
Tell me about your DE?:
I alternate between a black desktop and this photo of my favorite car. A friend collects vintage gas station equipment and provided the setting when I took this picture. I try to keep my desktop clean and maintain a folder called “desktop-stuff” for all the junk that would normally accumulate.
jf

Name: J.S.
What do you do?:
Retired network engineer now part time Asterisk/VOIP and wireless consultant.
Tell me about your DE?:
Windows 7 for the most part, but I have a Ubuntu 12 VM running X11RDP so I use Remote Desktop rather than VNC. That’s where I do the majority of my compiling & code editing in Xemacs.
js

Name: K.H.
What do you do?:
I’m a senior engineer on the Enterprise Infrastructure Team for a state government. I wrangle Tivoli Storage Manager, VMWARE, DNS, Linux/Apache/MySQL/PHP, legacy and modern UNIX/Linux, SANs, some LAN/WAN, provide support to the CISO in all areas of infosec as needed, and function as troubleshooter of last resort for any given problem.
Tell me about your DE?:
Windows 7 would not be my first choice, but since I have to use Windows-only apps in the execution of my duties, it is the best for the job. I run two monitors, which have different resolutions, but this is the best that can be managed on a restricted budget. Ideally there would be two 23″ monitors, but if we’re dealing in ideals, I would have an Alienware laptop instead of a Dell. The theme is a transparent space-based theme courtesy of NASA, but the background is an image from Stickman featuring some of my favorite tools. Rather than hide the start bar, I leave it up all the time for quick access.
kh

Name: L.F.
What do you do?:
I.T.Manager and Senior Linux Admin, LAMP developer, scripter and all miscellaneous duties as assigned.
Tell me about your DE?:
Mint #newest_version running my usual slew of apps and xterms on 2 dual monitor machines. Dark wallpaper is currently a “black leather”. I like dark unobtrusive wallpapers best to avoid distraction. Windows running in a vm, where it belongs. Just can’t have enough desktop real estate you know! And, yes, that’s mutt for email – best client out there.
lf

Name: M.H.
What do you do?:
I’m an I/T support specialist and dispatcher.
Tell me about your DE?:
I have quite a number of different desktops really. In fact I always have had. When they get cluttered I throw things into folders and eventually archive them if I don’t want to delete them. (My folder structures in my home directories are horrible.) Each system I use has a different purpose. The desktop here is my home daily driver. Multiple screens often dictate what wallpaper I use, though frustratingly it’s hard to span wallpaper across multiple monitors. At home I usually use single displays but at the office I use four screens total. Working on adding another one. 😉 As for colors I prefer a darker theme with light lettering. For terminals I prefer a black background with amber text or as close as I can get using a color picker. Green if I don’t have amber as a choice. Translucent terminals look nice initially but are a pain for me to focus on.
mh

Monday, September 24th, 2012

PHP and stuff

Lately I have been working so hard that I haven’t even had any desire to do any fun computering at home. Today that changed a bit.

I decided this morning that it was high time I upgraded my all time favorite rss feed reader, tiny tiny rss. Well, wouldn’t you know it, after I did the install I found it required a version of php higher than I had available on my server. Time to upgrade.

I run CentOS 5 on my main server and, by default, that carries php 5.1.x. I needed 5.2 or greater. As it happens, php 5.3 is available in the repos, so I did the upgrade. For the uninitiated, that entails doing a “yum list installed | grep php”, which gives you a list of what you *have* installed. Next you remove php by doing “yum remove <and name all the packages from the prior list here>”. This is followed by “yum install <the list of php 5.3 packages>”. For example, I had php-common.i386 and php.i386 installed, so I did a “yum remove php-common php” and then “yum install php53-common php53” to get all my php 5.3 packages on there. This was followed by a quick “service httpd restart” to make sure my webserver was using the new version.
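
Put together, the whole swap looked something like this on my box (your package list will differ, so go by what the yum list output actually shows rather than my two packages):

# see which php packages are currently installed
yum list installed | grep php
# remove the old php 5.1 packages (mine were just these two)
yum remove php-common php
# install their php 5.3 equivalents from the stock repos
yum install php53-common php53
# restart apache so it loads the new module, then confirm the version
service httpd restart
php -v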

Murphy’s law states that “something will go wrong if it can”. Well, *MY* law states that “something will go wrong”, and it did. As it turns out, I had built a whole bunch of php applications maybe 7 years ago that my wife uses almost daily. In the olden days of php, you could declare a php script at the top by doing a “<?”. NOW, you need to declare it by doing “<?php”. Consequently, nothing I had written worked. It only took me a minute or two to identify why the problem was occurring, but fixing it was another story.

So, how do you find all the files you have to fix? Well, I used the “grep” command. More specifically, egrep. I went to my html root directory and searched by doing “egrep -r "<\?" * | egrep -vi "<\?php" | egrep -vi "<\?xml" | grep -v inary”. What does all that do? The first stanza looks recursively through the directory structure at every file and outputs the ones that have any “<?”’s in them. The second takes that output but does NOT pass through any that are “<?php”. Why? Because they would already be ok! The third takes the results and doesn’t pass through any that contain “<?xml”. The last one doesn’t pass through results from binary files. The end result is I had a list of directory / file / line information of all the files I had to change / update. A few minutes later, after using vim, the best text editor around, I was back up and running!
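
Here is that search pipeline on its own line, with plain quotes so it survives copy and paste (add -n to the first egrep if you also want line numbers in the output):

# list every file under the web root that still uses the bare "<?" open tag,
# filtering out proper "<?php" tags, xml declarations and binary-file noise
egrep -r "<\?" * | egrep -vi "<\?php" | egrep -vi "<\?xml" | grep -v inary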

Saturday, July 14th, 2012