
Net Neutrality

On the topic of “Net Neutrality”, this is one case where I believe a truly neutral Internet is something the US should mandate. I won’t write a lot here – use your favorite search engine for “net neutrality” and look at some of the sites that are backing it (Google, Wikipedia, Netflix, Amazon, etc.).

Without it, our Internet service providers can slow or block access to sites on the Internet that either don’t pay them directly or have a competing product. You might have noticed this in your cell phone advertisements – notice how many offer “free music streaming” if you use their service? Cell companies aren’t held to the current net neutrality rules and can do that; they can pick and choose the music that doesn’t use up your data allowance for the month, and by extension are choosing which music services “win”. Now imagine that Google pays your ISP to make their site the only search engine that your home computer can access.
Or worse yet, what if the local WalMart paid the local ISPs to block access to Amazon.com? They probably couldn’t pay your ISP to block it outright, but I’m sure that for the right-sized check they could get the ISP to make Amazon.com take minutes to come up while WalMart.com would be instant. Who would you buy from then?

I don’t like a heavy degree of government oversight, especially in rapidly evolving technical areas, but this is one case where a basic law that says “treat all data as equal” has won me over.
If you’re interested in taking action, head over to “Battle for the Net” at https://www.battleforthenet.com/ and sign up to call your members of Congress and ask them to stand with us.

Increasing storage on a RHEL 6 virtual machine with LVM

Recently I had to increase the available space on a virtual machine running RedHat Linux 6.   The filesystem was configured with LVM, so the quick answer that came to mind was to add another virtual hard drive and expand.  While this would work, it was “messy” as future expansion in this fashion would get out of hand.

Thankfully the solution was easier (a consolidated sketch of the commands follows these steps):

  1. Confirm your starting state to verify these steps work
    1. Check the free LVM space assuming “/dev/sdb” is your disk in that volume group, and “MyVG” is the name of the volume group.
      1. vgdisplay MyVG | grep Free
  2. Power down your virtual machine.
    1. In my experimenting, I could do this while the OS was running, but it wasn’t 100% consistent.
  3. Expand the HDD size within the virtual system host environment
    1. In my case I am using VMware vSphere and increased from 60GB to 300GB.
  4. Boot the virtual machine
    1. It should boot without any difference from before.
    2. Login as root
  5. Run the “pvresize” command to tell the LVM subsystem to look at the new size of the HDD
    1. Assuming the drive in question is “/dev/sdb”, run this command:
      1. pvresize /dev/sdb
  6. You can confirm your LVM has additional space by noting the “Free” lines in the output of “vgdisplay” for the volume group that /dev/sdb is associated with.
    1. vgdisplay MyVG | grep Free
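
For reference, here is the whole sequence as a minimal shell sketch, again assuming the disk is /dev/sdb and the volume group is MyVG (adjust both for your system):

  # Before: note the free space in the volume group
  vgdisplay MyVG | grep Free

  # Power off the VM, grow the virtual disk in the host environment
  # (vSphere in my case), then boot and log in as root.

  # Tell LVM to re-read the now-larger physical volume
  pvresize /dev/sdb

  # After: confirm the volume group picked up the additional space
  vgdisplay MyVG | grep Free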


Happy New Year!

Happy New Year!


Free Ansible training videos from RedHat

My RedHat rep sent me a link to this Ansible on-line training.  It’s not the standard 60-90 minute live walkthrough; it’s a set of pre-recorded training videos.

Title: “Ansible Essentials: Simplicity in Automation Technical Overview”

Link: https://www.redhat.com/en/services/training/do007-ansible-essentials-simplicity-automation-technical-overview

As I just received this information today, I haven’t had time to look at them, but the chapter titles look like a good overview of the entire Ansible suite.


Easier Subversion to Git experience.

A couple weeks ago I posted about converting some Subversion repositories to Git.

Since then I’ve found a different write-up that seemed to work a bit better:

http://john.albin.net/git/convert-subversion-to-git


Bad user experience…

My VMware support engineer forwarded on the current VMware knowledge base weekly digest, and one of the new KB article titles caught my eye:

Upgrading VMware Tools using PowerCLI or vSphere API (2147641)

Hey!  That sounds like something to look over and possibly provide to my operations team to help them upgrade older VMs now running on newer VMware hosts!  Clicking on the URL for that KB article (KB 2147641) brought up the usual Details and Solution sections, but they were strangely lacking.

The “Details” section usually explains the subject of the KB article in detail.  This one just said:

VMware Tools can be upgraded using vSphere PowerCLI, vSphere API and other methods. For optimum performance, upgrade VMware Tools in not more than five virtual machines simultaneously.

And the “Solution” section was even less helpful:

None

Talking with my support engineer, he thinks the article may have been posted simply to note that it is possible to upgrade VMware Tools using either of those methods – the actual steps for doing so just were not documented.

I know from a quick search that there are plenty of examples on non-VMware.com sites showing how to do this.
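
For illustration only – this is my own sketch, not the KB’s content – upgrading Tools across a set of VMs with PowerCLI might look something like the following, honoring the KB’s advice to upgrade no more than five VMs at a time (the vCenter and cluster names are placeholders):

  # Hypothetical PowerCLI sketch; the server and cluster names are placeholders.
  Connect-VIServer -Server vcenter.example.com

  # Collect the powered-on VMs in a cluster.
  $vms = Get-Cluster "MyCluster" | Get-VM | Where-Object { $_.PowerState -eq "PoweredOn" }

  # Per the KB, upgrade no more than five VMs at a time.
  for ($i = 0; $i -lt $vms.Count; $i += 5) {
      $batch = $vms[$i..([Math]::Min($i + 4, $vms.Count - 1))]
      # -NoReboot upgrades Tools without automatically rebooting the guest OS.
      $batch | Update-Tools -NoReboot
  }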
Ironically, there are even some VMware.com pages outside of that KB that address this.

I’ve seen odd VMware KB articles in the past – hopefully the addition of a “Tip of the Week” flag, or at least a sentence in the Solution field noting that the article is not a fully fleshed-out solution, would save a lot of confusion.


Bad administrator, no cookie…

Well, it had to happen.  I finally got my new site up and planned to restore my old blog posts to it.  My backup files from various past sites were all in place – I had set up a backup script to dutifully collect the data monthly (I didn’t update the sites all that often) and also clean up after itself, keeping only three months of backups.

The script ran, the backup files appeared and automatically cleaned themselves up after 90 days.  When I first ran it I verified that the files were complete – I didn’t restore them anywhere, but the blog text was there.  Success!  Add it to cron on my desktop and let it run.

And run.   And run.   And run.  Unattended.  For the past couple years.

I had been lax and wasn’t blogging much, so I let my SquareSpace site go away a year ago.  Recently I decided to resume my ramblings…er, um, blogging, so I installed WordPress on my site and went looking through the backups.

The good news: my cron job continued to work, dutifully backing up all the blog posts.  But when I canceled my SquareSpace account, it kept “backing up” the site – except now the files it saved were essentially “the site does not exist” messages.  (Insert sad face here…)  Thankfully I was able to restore some of the older text using Archive.org, and I’m still combing through other old sources.  But much of it is a loss.

So, what did I learn (or re-learn) from all this?

  1. A single copy of a backup is not a backup.  Use the “Three/Two/One” rule.
  2. Don’t clean up archives when it’s not necessary.  The backup files were small enough (less than a megabyte after compression) that I could have kept many years’ worth in a couple gigabytes on my server.
  3. Keep track of services and the processes associated with them – I didn’t need to keep the backup script running after cancelling the service.  This didn’t have a real expense associated with it, but how often have we looked at our budget and realized that we’ve continued to pay for something well after we stopped using it?  A simple sanity check in the script, sketched below, would have caught the problem too.
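
For what it’s worth, a check as simple as this would have kept the rotation from destroying the good copies.  This is a hypothetical sketch, not my original script – the paths, URL, and marker phrase are all made up for illustration:

  # Hypothetical backup with a content sanity check before rotating.
  BACKUP=/home/me/backups/site-$(date +%Y%m%d).xml.gz
  wget -q -O - https://example.com/blog-export | gzip > "$BACKUP"

  # Only delete old archives if the new file actually contains blog text.
  if zcat "$BACKUP" | grep -q "a phrase from a real post"; then
      find /home/me/backups -name 'site-*.xml.gz' -mtime +90 -delete
  else
      echo "Backup looks wrong; keeping old archives" >&2
  fi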

Subversion to Git

I have a Subversion project that I’m migrating to Git, but I don’t want to lose the history if possible.  I found these steps, and they mostly worked for me with one exception (below):

https://git-scm.com/book/en/v2/Git-and-Other-Systems-Migrating-to-Git

The only problem was that during the export I got this error message:

Author: (no author) not defined in authors file

After a bit of searching I found a workaround.

In short I had to add this line to my users.txt file:

(no author) = no_author <no_author@no_author>
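
Putting it together, the clone looks something like this (the repository URL is a placeholder, and --stdlayout assumes the usual trunk/branches/tags layout):

  # users.txt maps Subversion usernames to Git identities, including
  # the special "(no author)" entry for revisions without an author.
  git svn clone https://svn.example.com/repo/myproject \
      --authors-file=users.txt --stdlayout myproject-git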


Successful backups in three, two, one…

Let me start off by saying that I didn’t come up with this backup mnemonic; rather, Peter Krogh first wrote it up (to my knowledge) in this blog post.

As I recently re-learned, even backups done right are hard to do well.  In my case there’s a chance that I still would have lost my data anyway, but you can’t account for human error in every case.

The “Three, Two, One” backup strategy is pretty simple:

  • Three – A file isn’t backed up until there are at least three copies of it: the original and two other copies not on that machine.
  • Two – The backups must be on two different media types.  For example, a hard drive and a DVD, or a tape backup.
  • One – Finally, one of those copies should be stored off-site, or at least off-line – a cloud storage service such as Carbonite, Amazon Cloud Drive, or Google Drive, or even a copy stored at a friend’s house.

In my case (backing up my website), one copy would have been the site itself, a second would have been stored on my home computer, and a third would have been burned to a DVD (not every month, but once every six months or so) or copied up to my Google Drive.

Sadly, I didn’t take those precautions and now I’m paying the price (thankfully a small one).

And I’ll add one more thing – be sure to VERIFY the backups you create periodically.  A backup does you no good if the restore process fails or isn’t documented for someone else to perform.
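
Even a quick automated check helps.  Here is a minimal shell sketch (the path and size threshold are my own assumptions) that confirms an archive is at least readable and not suspiciously small:

  # Verify a compressed tar archive is listable and non-trivial in size.
  ARCHIVE=/home/me/backups/site-20161101.tar.gz
  if tar -tzf "$ARCHIVE" > /dev/null && [ "$(stat -c%s "$ARCHIVE")" -gt 10000 ]; then
      echo "Archive $ARCHIVE passed verification"
  else
      echo "Archive $ARCHIVE FAILED verification" >&2
  fi

Of course, nothing beats periodically performing an actual test restore and documenting the procedure for someone else.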


An upgraded Apple for sale.

This article hit home for me:

https://medium.com/charged-tech/apple-just-told-the-world-it-has-no-idea-who-the-mac-is-for-722a2438389b#.6q18so27a

I haven’t been a Mac user for a long time.  I worked on early (1990s vintage) Macintosh “box” computers, but never really wanted much to do with them until Apple ditched the “System” operating system and went to “OSX” (now “MacOS”).  I’ve been a long-time Unix guy – I really wanted a NeXT computer back in the day but didn’t have nearly the cash for it – and once OSX paired a nice user interface with the power of a Unix command line for all the tools and automation/scripting goodness, I really wanted one.

I was lucky that my sister gave my dad her previous MacBook Pro 15″ (Mid 2010) when she upgraded a couple years later.  He liked it, but he is a Windows guy through and through, so when Boot Camp started giving him fits he was about to toss it in the dumpster.  I offered to trade him my 2-year-old HP laptop (which runs Windows just fine) and he took me up on it in a heartbeat.

In the past 18 months I’ve really grown fond of the MBP and OSX.  The software upgrades have gone smoothly for me, the battery life is still excellent, and the hardware fit and finish is still solid and continues to look “current” even at 6 years old.

Knowing that the entire system is getting a bit long in the tooth and has the occasional unexplained power issue, I was waiting for the next Apple MacBook Pro announcement.

In a word, I was disheartened.

My 6-year-old 15″ MBP has an Intel Core i7 @ 2.66GHz, 8GB of RAM, and a dedicated NVidia GeForce GT 330M video card.  The latest MBP has an upgraded i7 CPU, but the performance difference between the two is barely noticeable in regular use.  Same goes for video – I don’t do any high-end editing or video processing, nor do I do much gaming anymore.  What I was looking forward to was a system that officially supported 32GB of RAM and had an SSD that I could upgrade over time.  The primary tasks on my current MBP are playing with virtual machines (VirtualBox or VMware), and RAM is the biggest constraint.

Instead, the big “new things” that the new MBP brings us:

  • “Thinner” – Really?  You couldn’t have kept the same thickness and given us 25-50% more battery life?  And kept at least one USB 3 port?
  • “Touch function row buttons” – OK, neat, but it’s not a good solution if you touch type (which I do), or if you’re vision impaired.
  • And if touch is such a good thing, why not make a touch-screen?  After all, it’s been such a failure for everyone else…not.
  • Removed the “MagSafe” power connector – Really?  As a parent, that was one of the great things I liked about my current MBP.  I’ve fixed a few laptop screens after a child or pet ran by, caught the power cord, and sent the unit flying to the floor.
  • And from what those knowledgeable about USB-C tell me, the ports themselves are good, but the clean lines of the Apple brand are lost when I need multiple dongles for my other devices.  I’ll have a clean-looking laptop, but my laptop bag will be a jumble of adapters.

The good thing is that there are enough people who will jump on the “latest, greatest” train and resell their older MBPs.  I just hope I can luck into one for a decent price, especially once they realize their new devices need all those adapters.

Maybe I’ll be lucky and my sister will want to dump hers on me again.  No, she’s too smart to fall for that again.

(Edited Nov 1, 2016)