Taking over from a failure of an IT company


Taking on a new client is a fairly routine occurrence most of the time. It usually goes decently smoothly: getting domain and hardware passwords transferred over, sharing knowledge collected over time, making notes of any gotchas or unique issues with the client. Every once in a while, though, taking over a client leads to a complete horror of horrors as we discover just how many things were done wrong and what a dangerous position the previous company left their now-former client in.
I’ve been doing this for a decade now and thought I’d seen it all, but a recent case proved to me that you should never underestimate someone’s ability to royally hose things up.
The original reason we were called in was a complaint about their server freezing up. They had called their IT people two weeks earlier and kept getting put off. Tired of the server freezing, they called us in. What did we find on arrival? A failing hard drive. Something that could have taken down their entire business, and the former IT company put it off for who knows what reason?! The good news was that the disk was in a RAID array, so they had some redundancy, but the failing disk was still causing the server to hang quite frequently. So we replaced the disk and rebuilt the array.
The next issue we discovered during the array maintenance: a completely dead battery on the RAID controller. So we replaced the battery.
Next up, the server wasn’t even on a UPS. It was plugged into the “surge” side (not the battery side) of a UPS, and the UPS wasn’t big enough to handle the server anyway. So we got them an appropriately sized UPS.
So, what if the array had died? What if they had lost power and ended up with corruption thanks to a dead array battery and no UPS? Well, they could have restored from backups, right? HAHAHA! No, no they couldn’t have. The “cloud” backup their previous company was charging them for wasn’t backing up any of their shared files. All of the business’s proprietary data would’ve been GONE. Their cloud backup was only configured to back up the “Program Files” directory, which would’ve been god damn useless in a disaster recovery situation.
While we’re on the subject of billing for services not provided, we also found they were being charged for website hosting. The problem? Their IT company wasn’t hosting their website; it was hosted at another provider in town. The ONLY thing their IT company was hosting was public DNS for the site, yet they were billing at full website-hosting prices. Nice little scam they had going there, don’t you think?
I wish I could tell you the horrors stopped here, but they don’t.


A word about Ubiquiti EdgeRouters


Well, it is 2016 and the Linksys E2000 router I’ve been using since 2010, running DD-WRT, was still in service.  It was still a fine router for what I was using it for, but it was starting to show its age.  For one, it didn’t support dual-band WiFi (2.4GHz and 5GHz).  A month or so ago I replaced its WiFi duties with a Ubiquiti UniFi AP AC Pro to get better coverage and better wireless speeds, both of which were accomplished.  I was impressed enough with it and Ubiquiti’s controller software that I became interested in some of their other products.

Coincidentally, my local fiber-to-the-home ISP announced that they will be rolling out gigabit access.  Previously, you could only get up to 250Mbit.  I was on the 50Mbit package, but for the gigabit rollout they’re running a promotion where you can lock in gigabit speeds, for as long as you have service with them, for only $10 more a month than I was paying for the 50Mbit fiber.  So, 20x the bandwidth for $10 more a month is a no-brainer for me.  This meant I had to upgrade my router, though.  The Linksys E2000 running DD-WRT was only capable of about 60Mbit of throughput on the WAN interface due to its aging CPU.  I was already pushing it close with my 50Mbit connection, and gigabit would be far too much for it to handle.

So I did some research and ended up selecting the Ubiquiti EdgeRouter Lite.  These are powerful little machines running Ubiquiti’s EdgeMAX OS, a customized fork of the Linux-based Vyatta routing suite.  It seemed to have the best bang-for-the-buck features, it was from Ubiquiti, whose gear I was already interested in and using, and most importantly it can push FULL gigabit line speed through the WAN!
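If you’re curious what configuring EdgeMAX OS actually looks like, here’s a minimal sketch of a basic WAN/LAN setup in its Vyatta-style CLI. The port assignments (eth0 as WAN, eth1 as LAN), the 192.168.1.0/24 subnet, and the hardware offload line (which is what lets the ERLite hit gigabit speeds on newer firmware) are assumptions for illustration, so check your own port layout and EdgeOS version before committing anything:

configure

# WAN on eth0, pulling an address from the ISP via DHCP (assumed port layout)
set interfaces ethernet eth0 description WAN
set interfaces ethernet eth0 address dhcp

# LAN on eth1 with an example RFC 1918 subnet
set interfaces ethernet eth1 description LAN
set interfaces ethernet eth1 address 192.168.1.1/24

# Masquerade LAN traffic out the WAN interface
set service nat rule 5010 description "LAN to WAN"
set service nat rule 5010 outbound-interface eth0
set service nat rule 5010 type masquerade

# Hardware offload is what gets the Cavium chip to full gigabit line rate
set system offload ipv4 forwarding enable

commit ; save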


Let’s Encrypt!


Thanks to the Let’s Encrypt project you can now browse my website in glorious HTTPS using a free Let’s Encrypt certificate.  It was fairly simple to get the certificate issued and to set up the automatic renewal job on the web server.  Let’s Encrypt is providing an interesting service to the masses by making basic HTTPS (TLS) encryption free and easily accessible.  In years past, SSL certificates would cost hundreds of dollars.  That has changed in recent years, with basic certificates coming down to just $5.  But for a site like this one, which I run for fun and don’t make money from, even $5 seemed like an unnecessary cost and hassle.  Well, Let’s Encrypt virtually eliminates both of those final barriers.  By providing free certificates issued through their client, there is practically no reason NOT to be running HTTPS on your site now.  Thanks, Let’s Encrypt 🙂
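For reference, here’s roughly what issuance and renewal look like with the Certbot client. The webroot path, domains, and Apache reload hook below are placeholders I’m using for illustration, so adjust them for your own server:

# Issue a certificate using the webroot method (paths and domains are examples)
certbot certonly --webroot -w /var/www/example -d example.com -d www.example.com

# Renewal cron job (root's crontab) -- certificates expire after 90 days, and
# "certbot renew" only renews the ones that are actually close to expiring
0 3,15 * * * certbot renew --quiet --post-hook "service apache2 reload"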

Why isn’t everyone doing 2-factor Auth?


Seriously, it is 2015 now.  Every big service provider should be supporting some form of two-factor authentication.  Google is a prime example of the right way to implement it, and everyone should be following their lead.  This weekend an email account I hadn’t used in over a year had its password cracked.  The bot then pulled my extremely outdated online address book and sent spam links out to everyone in it.  Fantastic!  So I changed the password and deleted all of the contacts from the address book.  Had this provider (cough… AOL …cough) had a 2FA implementation, this never could have happened.  Their service wouldn’t have been used to send out spam, and I wouldn’t look like a doofus with an apparently weak password on that old account.

I’ll also add: if you use a service like Google and you’re NOT using 2FA, you need to go set that shit up right now.  It makes your account nearly IMPOSSIBLE to get into unless the attacker also has your physical device (usually your phone running an authenticator app; I recommend Authenticator Plus).  Knowing your login name and password alone will never get them in.
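If you’re wondering what those authenticator apps are actually doing, they’re just computing time-based one-time passwords (TOTP, RFC 6238) from a secret you share with the service when you enroll. As a rough illustration, the oathtool utility can generate the same codes from a shell; the base32 secret here is a made-up example, not a real account key:

# Print the current 6-digit TOTP code for a base32 secret (example secret only)
oathtool --totp -b JBSWY3DPEHPK3PXP

Run it twice 30 seconds apart and you’ll get two different codes, which is exactly why a stolen password alone isn’t enough.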

Wondering if a service you use supports 2FA or not?  Well, check out this nifty website: https://twofactorauth.org/

Fix Synology Sickbeard Shortcut When Using https


I have been working on getting SickBeard set up on my Synology DS1512+ NAS, and I’ve got pretty much everything worked out.  One of the final things I wanted to get working properly was HTTPS support using the self-signed certificate I set up for my Synology.  I know, not really very important since I’ll only ever access it over my LAN or via a VPN, but still… I went through the trouble of getting the self-signed certificate working on the Synology, so I wanted it to work here too.  It was a little tricky in a couple of regards.

First, I had to get it to use my certificate and key. I tried linking straight to the existing ones the Synology uses in the /usr/syno/etc/ssl sub-directories, but SickBeard just refused. I figured it was a permissions issue, since those certs were owned by root. I decided the easiest way was to just copy the two files I needed into SickBeard’s directory and change their owner:

Andromeda> cd /usr/syno/etc/ssl
Andromeda> cp ssl.crt/server.crt /usr/local/sickbeard-custom/var/server.crt
Andromeda> cp ssl.key/server.key /usr/local/sickbeard-custom/var/server.key
Andromeda> cd /usr/local/sickbeard-custom/var
Andromeda> chown sickbeard-custom server.crt
Andromeda> chown sickbeard-custom server.key

Then, at last, I was able to point SickBeard at server.crt and server.key, restart it, and have it use my certs!  Woot!

I was pretty happy with myself until I clicked the SickBeard shortcut in the Synology menu and was greeted with my second issue:

The client sent a plain HTTP request, but this server only speaks HTTPS on this port.

Oh you son of a…

I’m way too anal about my things all working properly to live with that atrocity, so after a minute of poking around I quickly found the config for it in the following file: /usr/local/sickbeard-custom/app/config

Pop that file open in vi and change the protocol line from http to https.  Save and quit, then simply reload your Synology web interface, and bam! Your shortcut will work once again, launching SickBeard via https! Yay!
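If you’d rather skip vi entirely, something like the following should work too. I’m assuming the protocol value sits at the end of its own line as a bare http token, which may not match your file exactly, so peek at it first and keep a backup:

# See how the protocol line is actually written before touching anything
grep -n "http" /usr/local/sickbeard-custom/app/config

# If it's a bare http at the end of the line, flip it to https (backup first)
cp /usr/local/sickbeard-custom/app/config /usr/local/sickbeard-custom/app/config.bak
sed -i 's/http$/https/' /usr/local/sickbeard-custom/app/config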

[Screenshot: the SickBeard shortcut in the Synology menu, now launching over https]

MS Surface Pro Tablet Redux


Five months is an eternity in the technology industry.  Five months ago I wrote about how excited I was for the Microsoft Surface Pro tablets to hit the market.  Today, I couldn’t care less about them.  A few key factors have caused my 180° turnaround from my previous excitement.

First off, I ended up buying an HP Ultrabook, the Envy 4 model.  I couldn’t take the sluggish Atom-powered netbook any more (I’m surprised I lived with it as long as I did), and I needed something portable with some power.  I snagged one with a third-gen Core i5, 8GB of RAM, a 32GB SSD alongside a 500GB hard drive, and built-in WiFi and Bluetooth.  It was pretty much exactly what I wanted, and it performs close to my desktop gaming rig (aside from games, of course, though the Intel HD 3000 graphics actually do an OK job).  With the Ultrabook acquired, I don’t really feel the need for a Surface Pro tablet any more.

Second, Windows 8 is an absolute train wreck.  I haven’t written anything about Windows 8 yet, but I think it is a disaster.  The new “start menu” GUI is just poorly done.  Microsoft breaks GUI Design 101 at every turn in Windows 8.  Microsoft is trying way too hard to be like Apple, but they’re forgetting one thing: they’re not Apple, and that is why they control 90% of the desktop market.  Abandoning clean user interfaces for cluttered-up disasters is absolutely the wrong direction to take Windows, and it is going to cost Microsoft dearly.  Now that I have spent time with Windows 8, I want no part of it.

Third, the limited storage options are a disappointment.  I know it is a tablet and so it should use an SSD, but 64 or 128GB?  That’s it?  When the Windows 8 install eats up 16GB of that?  Downright pathetic.  For a device that is supposed to replace a laptop, crippling it with such minuscule storage is going to prevent it from ever taking hold.

Anything Microsoft Surface is a PASS on my list.

Tricky php.ini settings


There are several ways PHP settings can be adjusted.  The two most common places are the Apache config files or, more typically, the php.ini file.

Typically you can use phpinfo(); to see the current PHP settings.  This is extremely useful, as it also shows EXACTLY which php.ini file the configuration is being pulled from.
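As a quick sketch (the docroot path here is just an example), a throwaway phpinfo page gives you the web server’s view, while the CLI has flags for its own view.  Keep in mind the two can report different things, which turns out to matter below:

# Web server's view: drop a phpinfo page in the docroot (remove it when you're done!)
echo '<?php phpinfo();' > /var/www/html/info.php

# CLI's view: which ini file the command-line PHP is loading
php --ini
php -i | grep "Loaded Configuration File"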

I recently ran into a particularly frustrating situation trying to adjust some settings.  I had already grep’ed through all the Apache config files and knew there were no PHP configuration settings hidden in there, so I knew it was pulling its config from the php.ini file.  In the past it has been a simple matter to adjust the value in the ini file and then reload Apache (which causes PHP to reload).  Well, this time it wasn’t working.  No matter how many times I killed Apache’s processes and restarted it, it was STILL pulling the old values.

I was really about to lose my mind when I ran phpinfo via the PHP CLI and it showed my new, correct values!  Arggg!!!

After a shamefully long amount of time trying to figure this out, I FINALLY found the solution.  Unbeknownst to me, this server was using something called php-fpm.  Php-fpm is the FastCGI Process Manager, an alternative to PHP’s standard FastCGI (which, had it been in use, would have hung me up in the same way).  It basically runs PHP as a separate service so it can work faster.  And since PHP is running as its own process, you have to restart the php-fpm service to get Apache’s PHP to reload the config file!

[root@myserver php53]# /etc/init.d/php-fpm restart
Gracefully shutting down php-fpm . done
Starting php-fpm done

And bam, I finally got the new settings to load!  Note: if you’re using the standard FastCGI module, you’d restart the php-fastcgi service instead.
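If you’re not sure which setup you’re dealing with, a quick look at the process list and the Apache configs will usually tell you.  The paths below assume a Red Hat-style layout like this server’s, so adjust them for your distro:

# Is a separate PHP process manager running?
ps aux | grep -iE "php-fpm|fcgi" | grep -v grep

# Is Apache handing PHP off to FastCGI/FPM instead of mod_php?
grep -RiE "fastcgi|fpm" /etc/httpd/conf /etc/httpd/conf.d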

I feel silly for not having known about php-fpm before… but I’m used to Apache just calling PHP directly itself.  This was a setup I hadn’t dealt with before, so it was time for me to learn something new.

Happy PHP’ing folks!

Why can’t Winamp stay on top?!


Seriously, this pisses me off to no end.  Winamp has had this problem for years.  I have the option set:

[Screenshot: the “always on top” option enabled in Winamp’s preferences]

After a couple of songs:

[Screenshot: Winamp buried behind other windows, no longer on top]

I love Winamp as an audio player, I really do.  But it drives me crazy when I go to pause the music or skip to the next track and Winamp is not where it is supposed to be, and I end up having to minimize my 3,847 windows to find it.