finally giving up on macports, adopting homebrew

I give. After having yet another broken install (this time maven), I give up. I’m so tired of ports being missing, outdated, broken, or just dumb. I’m tired of the complex maintenance.

So I’m giving in and moving on to homebrew like all the cool kids did forever ago. /usr/local makes my BSD side so much happier than the bizarro /opt/local anyway. I can’t even remember why I felt like homebrew was unclean in the first place.

I’m also thrilled that I could bootstrap myself sooo freaking quickly. Vim, a recent bash, golang, git, python, and a slew of essentials were installed super fast and correctly the first time (brew options is so reasonable).

I’m hooked.

Python for Real Programmers

A quick list of Python awesomeness and oddities for practitioners of real programming languages like Java, C++, etc.

The interactive interpreter

You can run python from the command line and quickly type code in and evaluate it as you type. In the LISP world this is referred to as a REPL or Read-Eval-Print Loop. Examples with the >>> and ... prompts come from this. It is super useful and you will come to love it and wonder how you ever did without it. It’s not unique to Python, but compiled-language folks (other than people who speak with a LISP) are typically not used to it. Oftentimes it is easier to test ideas this way than to stare at a debugger. It’s also a great way to generate exceptions on purpose.

Some useful built in stuff is always available to you here. First is the dir() function which returns a list of things (attributes and methods) in a given object. These are essentially namespace entries.
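For example (the exact list varies a bit by Python version; at the >>> prompt you would just type dir(1)):

```python
# dir() on the integer 1 lists its attributes and methods
print(dir(1))
# ['__abs__', '__add__', '__and__', ... '__doc__', ... 'real']
```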

That is the list of namespace entries for an integer (1).

One useful property you will note in this list is __doc__. That’s the documentation attribute for whatever you are looking at.
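For instance:

```python
# __doc__ holds the documentation string for whatever you are looking at
print((1).__doc__)   # the docs for int
print(dir.__doc__)   # even builtins like dir() carry their docs around
```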

There are plenty of others (eg. __class__), but I’ll leave that as an exercise to the reader :).


There are no primitive types in python. Everything is an object. Functions, classes, integers, strings, lists, dictionaries are all objects.

There are some basic built-in types that should be used whenever possible. Love them.

The next note is that anything that implements the interface of a built-in type can be used like a built in type. It is easy to define your own list-like object for example by implementing the list interface.
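A sketch of the idea (ShoutyList is a made-up example class):

```python
class ShoutyList:
    """A hypothetical list-like object that upper-cases strings on the way in."""

    def __init__(self):
        self._items = []

    def append(self, item):
        self._items.append(item.upper())

    def __getitem__(self, index):
        return self._items[index]

    def __len__(self):
        return len(self._items)

words = ShoutyList()
words.append("hello")
words.append("world")
print(len(words))   # len() works: 2
print(words[0])     # indexing works: HELLO
for w in words:     # even iteration works, via __getitem__
    print(w)
```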


2 Rules:

  1. Do things simply
  2. Handle the errors

It is Pythonic to try, fail, and handle. Avoid the temptation to check and then try. The latter leads to race conditions and inflexibility to change.

Consider this snippet:
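Something like this (a sketch; adding 1 to a value is my stand-in task):

```python
def increment(value):
    # Check first, then act: assumes only integers are addable
    if isinstance(value, int):
        return value + 1
    raise TypeError("can only increment integers")

print(increment(41))  # 42
```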

It works fine. But it assumes only integers are addable. What about floats?
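Extending the check (same sketch):

```python
def increment(value):
    # Now floats are allowed too... but the check keeps growing
    if isinstance(value, (int, float)):
        return value + 1
    raise TypeError("can only increment ints and floats")

print(increment(2.5))  # 3.5
```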

OK, now it handles floats, but what about long, complex, or some new numeric type? Rather than anticipate, handle the error appropriately.
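The try-and-handle version of the same sketch:

```python
def increment(value):
    # Just try it, and handle the failure if 1 can't be added
    try:
        return value + 1
    except TypeError:
        raise ValueError("%r cannot be incremented" % (value,))

print(increment(41))   # ints work
print(increment(2.5))  # so do floats, and anything else addable to 1
```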

Ahhh. Much better. Now it will work with literally anything that can be added to 1. That’s Pythonic.

Also, when possible keep things simple. I’ll give three examples of code to do the same thing.
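Say the task is collecting the even numbers from a list (a stand-in example of my own). First, the verbose version:

```python
numbers = [1, 2, 3, 4, 5, 6]

evens = []
for n in numbers:
    if n % 2 == 0:
        evens.append(n)

print(evens)  # [2, 4, 6]
```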

The first example is unnecessarily verbose.
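With filter and a lambda, collecting the even numbers (a stand-in task) looks like:

```python
numbers = [1, 2, 3, 4, 5, 6]

# On Python 2, filter() returns a list directly; on Python 3 wrap it in list()
evens = list(filter(lambda n: n % 2 == 0, numbers))

print(evens)  # [2, 4, 6]
```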

The second example, using the built-in function filter and a simple anonymous function (the lambda) is better and preferred by some. In this example in particular it is clear and concise.
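As a list comprehension, collecting the even numbers (a stand-in task) looks like:

```python
numbers = [1, 2, 3, 4, 5, 6]

evens = [n for n in numbers if n % 2 == 0]

print(evens)  # [2, 4, 6]
```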

The last, which is called a “list comprehension”, is usually the most clear in real life, and is generally considered the most Pythonic. Dictionary comprehensions, while dope, are still too new to support a lot of running Pythons out there. So ensure your users are on 2.7+ (ie. not RHEL6) if you want to use them.

Packing and unpacking lists and dicts

One oddity that you will run into are examples like this:
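For instance (frobnicate is a made-up name):

```python
def frobnicate(*args, **kwargs):
    print(args)    # a tuple of the positional arguments
    print(kwargs)  # a dict of the keyword arguments
    return args, kwargs

frobnicate(1, 2, 3, color="red", debug=True)
# (1, 2, 3)
# {'color': 'red', 'debug': True}
```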

*args becomes a tuple of the positional arguments, while **kwargs becomes a dictionary of the keyword arguments. The names themselves are not important, just that * collects positional values into a tuple and ** collects keyword arguments into a dict.

This also works in the inverse:
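For example (greet and its arguments are my own illustration):

```python
def greet(greeting, name):
    return "%s, %s!" % (greeting, name)

params = {"greeting": "Hello", "name": "Dave"}
print(greet(**params))  # same as greet(greeting="Hello", name="Dave")
```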

Here we turn a dictionary into keyword arguments. A great example of this is its use with format():
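Something like:

```python
info = {"name": "Virgil", "role": "chatbot"}

# The {name}-style references in format() are filled from the unpacked dict
print("{name} is our {role}".format(**info))  # Virgil is our chatbot
```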

We can use the keyword referencing abilities of format() combined with a predefined dict of arguments.

It’s also useful when wrapping another function that you do not own. You specify the arguments you need and use the asterisk arguments to collect anything else passed to you and then pass those on to the function you call.

This allows you to support unrecognized or future functionality without having to change your function for every release of the library you are using.
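A sketch, with library_call standing in for the third-party function (both names are made up):

```python
def library_call(path, mode="r", buffering=-1):
    """Stand-in for a function in a library you do not own."""
    return (path, mode, buffering)

def wrapped_call(path, *args, **kwargs):
    # Take the one argument we care about, collect everything else,
    # and pass it along to the library untouched
    print("calling library with %s" % path)
    return library_call(path, *args, **kwargs)

wrapped_call("/tmp/data", mode="rb")
```

If the library grows new keyword arguments in its next release, wrapped_call does not have to change at all.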

Multiple Return values

You can return multiple values. They will result in a tuple. You can also assign them without having to deal with the intervening tuple. This makes more sense in a code snippet.
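For example (min_max is my own illustration):

```python
def min_max(values):
    return min(values), max(values)  # packed into a tuple

pair = min_max([3, 1, 4, 1, 5])
print(pair)        # (1, 5)

low, high = min_max([3, 1, 4, 1, 5])  # unpacked straight into names
print(low, high)   # 1 5
```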

Scope oddities

This is perfectly valid:
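A sketch of the sort of thing that surprises people coming from C++ or Java:

```python
for i in range(3):
    pass
print(i)  # 2 -- the loop variable outlives the loop

if True:
    x = "defined inside the if"
print(x)  # blocks don't introduce a new scope; functions (and classes) do
```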


I cannot do this topic justice. Read this.


When in Rome follow PEP8. Except the 80 character line thing. Keep line length short when possible, but hitting 120 characters is totally reasonable in this day and age of 1080P monitors. Use an IDE like PyDev that checks this for you, or use flake8.

It’s OK to vary from the convention when you mean to. It’s more like a guideline. Just use the tool’s ability to ignore that exceptional piece of code. It makes everything better. It’s not OK to go all HumptyCase on a variable just because it’s what you are used to. Someone will have to maintain your code one day. Doing the Python thing means that they will be able to know the norms without having to learn you. This goes for any language.


When you are actually writing a script that is meant to be called directly (something that you shouldn’t be doing too often given setuptools) you should make the shebang line like:
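That is:

```
#!/usr/bin/env python
```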

This allows the user to use whatever python is first in their path vs. hardcoding a path.

2 vs. 3

I still use Python 2 because it’s better supported. Python 3 does not lag too far behind and is excellent. I will be so happy when RHEL6 dies and I can rely on Python 2.7+, though.

Useful tools to know in the Python universe

I’ll leave this up to you to Google them.


  • setuptools
  • wheel


  • tox
  • unittest
  • nose


  • Sphinx

Feeling like a traitor: from vim to PyCharm

I’m a vim user.  I love vim.  I love modal editing.  And I just used an IDE for a day and liked it.

PyCharm is magical.

It gets out of my way.  There’s no big clunky toolbar, there’s no 3 hours of clicking buttons to make it not stupid, there’s no elaborate setup.  Here are some things I love:

I use gitolite3 at work to manage some internal git repos.  I set up my .ssh/config to use the user gitolite3 for the host those repos are located on.  I put host:/path/to/repo.git and it just did the right thing.  It picked up my ssh keys, used normal git/ssh, and just worked.
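The relevant bit of ~/.ssh/config looks something like this (the hostname is made up):

```
Host githost.example.com
    User gitolite3
```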

It has reasonable themes out of the box.  I never had to stare at eye-death blue.

It just does git right.  I add a file and it makes sure I want to add it to the next commit.  When I pull/commit/push it asks reasonable questions.

It does an excellent job with pep8.

The refactoring makes me almost not miss [esc]:%s/….

It does normal IDE stuff excellently.

I almost don’t miss vim….. almost.

But it has convinced me to try it out for a while longer and see if I can untrain [esc]vjjjjjjj%s/foo/bar, some command[esc]!!sh[return], and other fun vi-isms.

We’ll see, but it beats the pants off of eclipse + fill in the plugin, IDLE, etc. so I feel like I need to give credit where credit is due.

Enterprise Metaphors: Water

When you read about the Devops movement, there is generally an assumption of greenfield.  Those that address change are usually addressing an organization with fewer than 100 IT workers, or ones bringing less than 10 years of history with them.  While devops (at least the idea behind it) is more than a fad, I think that the conversation focuses too heavily on implementation and charts, and not enough on strategy and purpose.  I will try to use some illustrative examples of what I feel are the mechanics behind devops and cloud-computing, and apply them to the Enterprise.

How did we get here?

So distributed computing…  Like any cool new technology, businesses flocked to Unix as a cheaper alternative to mainframes.  The adoption looked like most technological adoption.  It didn’t replace the mainframe for core business functionality (oftentimes, it still hasn’t), mostly due to fear of risk.  But projects sprang up.  Each project evaluated its needs, hired staffing, bought hardware, developed, and deployed.

The Sprawl

At this point there were dedicated developers, sysadmins, project personnel, hardware, and software, all of which were underutilized and non-portable.  And the staff could not easily be reallocated, since each project required learning and working with an entirely different stack: you probably had HP-UX, DB2, and C++ one place, then Solaris, Oracle, and Java for another.  And projects that matched tools didn’t match versions or consumption patterns.

From each project’s perspective, it made sense.  But as a whole it was a mess.


Organic Growth in Delhi, photo by Rhiannon / CC0


To address the issue, phase 2 begins.  Standardize all the things!  Timed with the move to commodity software/hardware, management addresses the resource sprawl by standardization.  Instead of teams organized by project or business unit, teams are reorganized by function.  The sysadmin team, the hardware team, the DBA team, etc.  The choices available for development are reduced to Linux, Java, and a single RDBMS.  Process is in place to ensure all projects follow the rules through oversight.

Chinese Apartment Buildings, photo by Jim Bowen / CC BY 2.0


Now costs have been reduced, but time to market has grown. The business doesn’t understand why it takes 1-2 years to get a new application into production.  More resources are dedicated to oversight than to construction.  The business is frustrated, development is frustrated, operations is frustrated.  Management is itchy to buy into something new and vendors smell blood.

Process as water

In the beginning of the story of distributed, a project is like a bucket of water.  We just dumped it on the ground and it ran generally downhill to the destination.  It took whatever path was easiest for it.  So depending on each project’s perspective and the conditions at the time, each took its own path.

Standardization meant that rather than pouring each bucket out and letting it run, we carried each bucket to its destination.  The buckets generally followed the same path (depending on who was carrying them), but it is energy intensive.  The water resents being constrained by the bucket carriers, and the bucket carriers resent the weight.  Many times, the next evolution of this is a bucket brigade.  The person who cares about infrastructure costs passes on to the person who cares about information security, and on to operations policy, etc.  And sometimes buckets get set down along the way.

Bucket Brigade, photo by Appalachia Rising / CC BY 2.0

The bucket brigade, while more effective than before, is expensive and ever expansive.  The number of people carrying the bucket always grows, and it always increases opportunities for projects to be slowed down.  Also, oversight always depends on the overseer.  Development gets frustrated by seemingly constantly changing requirements, the process managers are frustrated by development always trying to skirt the rules, and no one feels like their concerns are being addressed.

Automation, not standardization

The solution is automation instead of rules and enforcement.  Stop lugging water, and start building aqueducts.  This is what things like Devops, cloud computing, IaaS, and PaaS promote.  You go back to the first state and you ensure that the easiest path is the “right” path.  You ensure that if people conform to the desired parameters they decrease effort and time to market.

photo by Sloopng / CC0

Final Thoughts

Let People Succeed

There is an easy litmus test for this: if a job is just following a procedure, it needs to be done by automation.  This sounds harsh, but someone who only follows procedure has zero opportunity for success.  They only have the ability to fail.  People’s energy needs to be spent making decisions where they can provide value.

Stop Documenting

It sounds wrong, but try not to write down how something needs to be done.  Make something just do it.  For those things you can’t tie together, have authoritative enforced data sources.  Let the system be the documentation.  Then you ensure it is never out of date.  And if you thought [FILL IN THE REGULATORY CONCERN] auditors liked good documentation, they love live queryable systems.  Well, except the whole reducing billable hours part.

Nothing alleviates the work.

Adopting any solution to this does not take the work out of it.  It just makes the work constructive.  Devops-style configuration management is a lot of work, but you only have to do it once.  Standing up a PaaS, tying CI/CD into it, extending it to support your business, etc. is a lot of work, but then every project that can fit in the PaaS benefits.  No vendor is going to hand you a ready-made solution to your problem.  Remember, every time it has more than one way of doing something, that’s work.  And when a vendor says it can do it however you want, that means you’re building it yourself.

Raised beds on the cheap

So you want a raised garden for vegetables, but don’t have a boatload of cash? Use what America is built with: pressure treated 2×4’s.

But but but, chemicals!

When originally looking into building raised beds, I saw many articles requiring cedar and shying away from pressure treated pine because of arsenic. This has been factually incorrect for more than 10 years. CCA (Chromated Copper Arsenate) hasn’t been used to treat lumber since 2003 (see the EPA). And even then, the big issue isn’t actually leaching, but burning the wood and dealing with the ash. That said, use cedar if it’s reasonable. It smells great and looks awesome too, if you ask me. But at my lumberyard here, cedar 2x4x8’s are >$8 a piece. Pressure treated pine are ~$3 a pop.

What to buy

  • 9 2x4x8 Boards
  • 1 4x4x8 Board
  • Deck screws certified for pressure treated lumber

For the last one, I personally love these guys: FastenMaster FMGD003-75 GuardDog Exterior Wood Screw, Tan, 3-Inch, 75-Pack. They even come with a pozisquare bit in the package which has all the great features of a Phillips and a Robertson head… end result: let your drill bit fly and don’t worry about stripping it.



  1. Cut 3 x 2x4x8 boards in half
  2. Cut 4 x 1.5 foot sections from the 4x4x8


  • The 4 x 4x4x1.5 pieces are vertical posts
  • The 6 x 2x4x4 pieces are for the short sides (stacked 3 high)
  • The 6 x 2x4x8 pieces are for the long sides (stacked 3 high)


Assuming you are placing this somewhere that is currently grass.

  1. Drop your mower as low as it will possibly go and mow the area.
  2. Feel free to do stuff to physically (not chemically!) abuse whatever vegetation is left.
  3. Dig 4 holes at each corner for the posts
  4. Place the frame on the ground with the posts in your new holes
  5. Use a mattock or shovel or your hands if you have to, and break up the soil as much as possible to approximately 12″ below the surface.
  6. Do soily stuff
  7. Plant plants
  8. Mulch!
  9. Dance!

Soily stuff

This is worthy of a section of its own, because your soil is where all of your hard work should go. If you want good plants, focus on good soil; the rest will come.

There are a few ways to go with this.

Buy in bulk

If you have a local supplier who is worthy of your business, you can buy garden soil by the yard and haul it yourself or have it delivered. Just remember that 1 cubic yard covers 27 square feet, 1 foot deep, in soil. Make sure that the soil has been composted or “cooked”. You want the temperature to have been high enough to kill off residual seeds of weeds and such. You want the soil to have lots of awesome organic components to feed your plants and to provide that nice structural balance that promotes root growth and proper water retention. If they send you something that looks like topsoil, send it back immediately. I have heard horror stories of raised gardens full of clay. Try not to let anyone unload anything other than rich, near-black, spongy awesomeness in your driveway/yard.

Buy by the bag

It’s easy enough to get bags of organic garden soil at your local garden center or big box retailer. And this is one of those times organic really pays off. Rather than just adding fertilizer to soil + filler, your typical organic soil will have those sweet sweet organic components that will continue to pay dividends both chemically and in terms of consistency for years.

Amend your soil

This takes a lot more knowledge and work. You need to know your existing soil and know how to work it to get it the right consistency and achieve the correct chemical balance to make typical vegetables happy. This is more than a topic in itself.

After that

Start composting for next year

I can’t emphasize this enough. You will want it to amend your soil and for your expansion plans which will naturally result after you get hooked. You can start with just a pile or get a fancy composter. But get to work.


I got into setting up a rain barrel, and also setting up a drip-irrigation system. I’ve enjoyed playing with both. The latter especially seemed to help get bountiful vegetables and prevent disease by keeping leaves dry (in my humid climate).

Read science backed articles

I cannot emphasize this enough. Don’t use that spray you found on Pinterest to kill weeds. Yes, it will kill the weed, but it will also make the soil you hit with it barren. Look at your state university’s agricultural information; it’s often incredibly useful. Look for people citing sources.

When you can’t find science, go back to folk wisdom

It’s closer to science than internet wisdom. Talk to people at locally owned and operated nurseries and garden centers. Ask what works. Judge what they say by what you know. If they make crap up about something you know, then they’re probably not reputable.

Hubot Inspiration

At puppetconf I had the pleasure of attending Phil Zimmerman’s awesome Killer R10K Workflow session.

While R10K, and his voodoo were awesome, the use of Hubot as kind of the center spoke for the communications of the workflow had me feeling a little inspired.

I am unable, however, to use anything as-is in the workflow because we can’t use external services (github, hipchat, etc.).

We have set up a workgroup XMPP host (running Openfire). And since we both develop Ops infrastructure and run an R&D lab, we set up a couple of chatrooms for SysAdmin topics and Development topics.

So now, using Hubot and hubot-xmpp, we have our own friendly chatbot, Virgil. It’s a play on Dante’s Divine Comedy: Virgil being the guide through hell and purgatory. Also the historical poet Virgil himself, and the Southern use of the name. I think it captures what I call the “redneck scholar” personality well… the guy who is well read and can operate a tractor. Theory and practice in a single person. In a way, what the whole devops movement strives for.

Currently Virgil is tied into our git repositories with a post-receive hook. He has the typical hubot, and random quote functionality.

In particular I saw one I liked that was made to cheer people up who mentioned failure. But it just spit out a single quote. So I wanted a little variety. I would have just used the msg.random piece, but one thing that bugged me was the single string. So I made him pull a random array from an array so I could store quotes and attribution reasonably, and deal with them as distinct, but related pieces of data.

Nothing crazy. Just pick a random number between 0 and n (and make sure it’s an int) and grab that element from the outer array. This lets me manage quotes and attribution in a more flexible way and give Virgil some personality.
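The real script is CoffeeScript, but the idea sketched in Python (the quotes are sample data of my own) looks like:

```python
import random

# Each entry is [quote, attribution] -- related, but distinct pieces of data
QUOTES = [
    ["Fall seven times, stand up eight.", "Japanese proverb"],
    ["There is no failure except in no longer trying.", "Elbert Hubbard"],
]

def random_quote(quotes):
    # Pick a random number between 0 and n (and make sure it's an int),
    # then grab that element from the outer array
    i = int(random.random() * len(quotes))
    quote, attribution = quotes[i]
    return '"%s" -- %s' % (quote, attribution)

print(random_quote(QUOTES))
```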

I’m now finishing up putting him into the escalation chain on our monitoring system, which uses a convoluted email->procmail->python script->json-via-http->hubot pipeline. It sounds more complicated than it is; basically it’s a script feeding the hubot script. Procmail also buys me the ability to take all those annoying things that have to use email for notification and reduce them to just being chat notifications, where more of them belong.

The key being that I snarf the data as data, and use the bot script to present with personality.

Next up will be tying him into our API in front of Foreman and Puppet to magically provision machines for us. I can’t wait to ask him for 3 VMs with Tomcat.

Introducing GardenBuddy

I’ve put together a little project for the Raspberry Pi to monitor environmental conditions for my garden. For now I’m calling it GardenBuddy, and the code is available here:

Using a few sensors (light, soil temperature, moisture) and available weather data from NOAA, I can monitor my garden and look for trending data.

The software is all in python and includes the little daemon for stuffing the data into RRD files, and a couple CGI’s (kickin’ it old-skool) for viewing the graphs.

I’ve made it so the sensor and graph configuration is all done in an INI formatted config file, so you don’t have to necessarily know Python to use it.

It’s like performance monitoring, but for your tomatoes.

I dream of one day intelligently managing a watering system with it, but for now semi-pretty graphs will do.

Some TODO items I have are:

  • interfacing with more sensor types
  • making it prettier
  • Unrolling the requirement for a “real” webserver
  • A rainbarrel/soakerhose/valve management piece

Battery Replacement on the Nexus 4

Just wanted to note I found this article, which describes the battery replacement process well (the Youtube clip helps immensely). One erratum, however: you need a #00 Phillips, not a #0 as described, for removing the battery connection itself. My ebay-bought battery seems to be working great. Hopefully I can get a bit more life out of the thing before I buy my Nexus Eleventy-two.


I just have to share this great collection of bird photos that helps me identify birds in the backyard. It really does cover 99% of what I have ever seen in west Tennessee:

Birds of Tennessee by Bruce Cole


I’ve started on cloning ksb’s excellent xapply in python for two reasons:

  1. It’s an interesting exercise
  2. There are many times I don’t have msrc or want to bring msrc with me for a one-off usage, where a python script would be perfect

Currently it just requires Python 2.7+ (I really love argparse).

I currently support:

  • Parallel jobs!
  • Input from command arguments
  • Input from arbitrarily many files
  • Fancy dicer syntax (eg %[2,4])

So far it does most of what I need, but it is nowhere near feature parity yet. I was considering going with different command line arguments, but I decided to stay as close to the original as I can (although I cannot guarantee argparse will behave the same as ksb’s getopts behavior).

Feel free to contribute if you’re bored. Feel free to use if it helps. I’m releasing it under the standard 3-clause BSD License.