
Managing a Go Environment in Ubuntu


Many moons ago, I wrote about setting up a Go environment in Ubuntu. After writing that post, I dropped Go development for nearly a year. Today I run the Indy Golang meetup, and soon I’ll be starting a new work project where I’ll be recommending a Go-based tech stack. I’ve learned a thing or two about Go and managing its dependencies since I wrote that initial blog post, and I intend to give a short presentation to my meetup about my recent findings. Before I do, I thought I’d write a preliminary blog post detailing the tools I use to keep my Go environment sane.

Installing Go

The easiest way to install Go in Ubuntu is through aptitude. However, the default version in the Ubuntu repos gets stale fast. I found a tool similar to rvm for downloading and installing local versions of Go called gvm. For better or worse, gvm is installed through a bash script. Fortunately, it doesn’t require sudo:

$ bash < <(curl -s -S -L https://raw.githubusercontent.com/moovweb/gvm/master/binscripts/gvm-installer)

At this point I removed all my previous finagling with $GOROOT and $GOPATH from my dot files. You can use gvm listall to see all available versions of Go. As of the writing of this blog post, go1.4.1 is the latest release; however, go1.4 is the most recent release available with gvm. I’m not sure how often the list is updated, but it seems to be fairly regular. To install Go 1.4 and set it as the default:

$ gvm install go1.4 --default
$ which go
/home/lrp/.gvm/gos/go1.4/bin/go
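
A few other gvm commands worth knowing, assuming the same gvm install:

$ gvm listall      # every Go version available to install
$ gvm list         # versions currently installed locally
$ gvm use go1.4    # switch versions for the current shell only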

Managing Packages

This is where I come to a fork in the road: gvm, the tool we used to install the desired version of Go, has a concept of pkgset similar to rvm’s gemsets. However, I find typing the pkgset syntax every time I enter a project directory tiresome; I prefer something more automatic. As an additional pain, gvm does not provide a mechanism for installing dependencies from a list of known dependencies. I sought out other tools to address these pains and found gpm and gvp.

gpm is a tool used to manage Go packages. It reads a file called Godeps, which contains a list of packages with versions, and installs each package from its individual source. I’m currently infatuated with gpm as it addresses a lot of concerns I had when initially learning Go: shared local dependencies, unclear versioning, and installing dependencies from a fresh clone. Install gpm and gvp:

$ pushd /tmp
$ git clone https://github.com/pote/gvp.git && cd gvp
$ ./configure && sudo make install
$ cd /tmp
$ git clone https://github.com/pote/gpm.git && cd gpm
$ ./configure && sudo make install
$ popd

We can follow two paths here: using gvm pkgset or using a local .godeps directory to store our dependencies discretely. For these examples, I’ll create a directory called gotest with a single file in it:

hello.go
package main

import "github.com/go-martini/martini"

func main() {
    server := martini.Classic()
    server.Get("/", func() string {
        return "<h1>Hello, world!</h1>"
    })

    server.Run()
}

Method 1: gvm pkgset alongside gpm

Create and start using a new pkgset:

$ gvm pkgset create gotest
$ echo $GOPATH
/home/lrp/.gvm/pkgsets/go1.4/global
$ gvm pkgset use gotest
$ echo $GOPATH
/home/lrp/.gvm/pkgsets/go1.4/gotest:/home/lrp/.gvm/pkgsets/go1.4/global

What this means is that we are now using our gotest pkgset as the default, while the global pkgset will be used to dig up any missing packages. In order to install any dependencies, we need to create a Godeps file for gpm to consume. Our application from above has a dependency on go-martini, so let’s pin it to v1.0 in our Godeps file:

Godeps
github.com/go-martini/martini v1.0

Run go build first to verify that the dependencies aren’t installed, then run gpm install to pull the required packages into your specified pkgset. Run go build again and revel in your own brilliance.
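
The whole loop looks something like this (output elided):

$ go build       # fails while go-martini is missing
$ gpm install    # reads Godeps, fetches go-martini, checks out v1.0
$ go build       # compiles cleanly against the pkgset copy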

That was pretty great, right? The only issue is remembering to type gvm pkgset use gotest every time you restart your terminal or switch projects. Otherwise, gvm is practically a replacement for one of my favorite Ruby tools, rvm.

Method 2: gpm and gvp

gpm is intended to be similar to npm, a package management tool for NodeJS. The author suggests using a companion tool called gvp to set the GOPATH without any extra thought. If we start a fresh terminal in our example directory, we can use gvp to set up our GOPATH:

$ echo $GOPATH
/home/lrp/.gvm/pkgsets/go1.4/global
$ source gvp
$ echo $GOPATH
/home/lrp/Projects/2015/gotest/.godeps:/home/lrp/Projects/2015/gotest

Note the difference in our GOPATH here relative to gvm: we will be using our current directory as a source of code in addition to a local .godeps directory. Go ahead and add .godeps to your .gitignore file or equivalent. We can use the same Godeps file as before:

Godeps
github.com/go-martini/martini v1.0

We run gpm install to install the dependencies to our local .godeps directory.
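
Putting it all together, a fresh clone is ready to serve in a handful of commands. A minimal sketch, using my project path from above; substitute your own:

$ cd ~/Projects/2015/gotest
$ source gvp              # GOPATH now points at .godeps and the project
$ gpm install             # fetch everything listed in Godeps
$ go build && ./gotest    # martini listens on :3000 by default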

I prefer this method to using pkgsets. I’ve had better luck building projects with complicated structures, and it’s a lot easier for me to run source gvp than it is to remember the name of my pkgset. Both methods work pretty well and give me warm fuzzies about managing my dependencies. I’m certain that as Go continues to mature, more solutions will become available. I’ve also been researching using Docker with gpm alone, which requires very little tweaking to what I’ve already discussed here.

Think Like a Chef


The Gist

Celebrity chef Tom Colicchio’s Think Like a Chef is a recipe-book-slash-culinary-theory hybrid. The reader is taken through a series of basic cooking techniques applied to a plethora of ingredients, followed by a focus on several interesting ingredients used in a variety of recipes, climaxing with a series of three ingredients used to create many interesting meals.

My Opinion

This book is designed to be about more than just recipes: it takes techniques and ingredients and demonstrates unique and interesting ways to think about them. I’ve always liked to think that cooking and software engineering have a lot in common. Chefs and programmers have to juggle whole systems of information in their heads at once to solve the problem at hand. Chefs and programmers follow instructions until it no longer benefits them, at which point they make substitutions to create novel solutions. Chefs and programmers tend to dress a bit strangely compared to the average businessman. Not to mention we both have to wear hair nets.

I like to cook. I like to think about how I can create different dishes, how I can utilize aging pantry or freezer items, and how I can perfect spice combinations to build restaurant-quality meals. And I like to eat. So I thought I would like this book. Personally, I found the author to be distractingly arrogant and the book design to be irritatingly repetitive. Although not designed as a simple recipe book, I would have been better served using it as such. However, were I to use this as a recipe book, I would have a lot of issues finding many of the exotic ingredients called for in the middle of Indiana, to say nothing of how expensive all his favorite ingredients are.

The pieces of this book that I really enjoyed were the prose on using one’s intuition to substitute ingredients, spices, and cooking styles as appropriate; these seemed like the more intellectual parts of the book that I could actually relate to. As a software engineer, I’m often working on a problem that seems like it can be solved by following the standard recipe, but actually requires a level of clever substitution to build it into a new idea entirely. As a home cook, I make a substitution in nearly every recipe I find; sometimes it works, sometimes it doesn’t. I would have enjoyed this book much more thoroughly had the focus actually been on intuitive cooking instead of fanciful recipes.

Who Would Like This

Are you about to make a lavish dinner to impress house guests? You could probably find some pretty good (and descriptive) recipes in this book. However, I heard of this thing called the internet that has many more recipes and descriptions of cooking styles as well. If you like beautiful pictures of food, you may also eat this book up (though hopefully not literally). If you happen to be a fan of this celebrity clown, maybe you’d enjoy his cavalier writing style more than I did.

An Example Use Case for Docker


I spent a lot of time last week asking questions about Docker. What is Docker? How could Docker help me day-to-day? How easy is Docker to use? How does Docker like its eggs cooked? Isn’t Docker a brand of sneakers?

What is Docker?

Docker is a utility for maintaining system environments. Docker capitalizes on Linux containers, a method of operating system virtualization which isolates multiple process groups on a single host.

Through a series of commands, Docker pulls up a base system image and applies changes to create a custom image. Docker provides the means to access any of these step-level containers for further manipulation. Docker uses a unique layered system such that sibling layers can utilize the same base images, in contrast to a virtual machine which would require multiple copies of things like operating systems, shared libraries, and shared binaries. The image below is an excellent visualization of the difference between a virtual machine and a Docker container.

VM vs LXC

Why use Docker?

Docker’s primary job is to take in a series of commands and spit out a clean environment with those settings. This is especially useful in deployment. I can take the Docker ubuntu image, download and install all my dependencies, copy over my application code, and run my application given some environment variables.

All of that should sound somewhat familiar if you’ve ever used heroku. Heroku performs very similar tasks to get your application up and running: take a Linux image, download base tools for ruby/python/nodejs/whatever, install application-specific dependencies (through bundler, flask, npm, etc.), and run your application given some environment. Docker gives you the power of heroku at the development level. …Sort of.

If I can create a Docker container for my application to run in, I can set up a build server to use a container to run my tests. I can use that same container to build a clean staging environment. I can use the staging container to build an identically clean production environment. With that knowledge in hand, I know the exact state of the production environment every time I deploy and I can reproduce it locally.

Theoretically, I can even use Docker for setting up a development environment, although after a few days of attempting this I still think you’re better off running natively.

Of course, Docker keeps a big list of examples from big-name company use-cases if you’re interested.

Example Usage

Brass. Tacks. Let us get down to them, compadre.

You probably want to install Docker first. If you’re not using Linux, have fun installing boot2docker; the rest of us are going to get started without you.

I started by trying to bootstrap my environment for Ollert with Docker. Ollert uses ruby-2.2, QtWebkit (in test), and MongoDB. It uses bundler to install any required ruby gems. Not too complicated, but I’ve noticed it’s never easy to get a new developer’s environment quite right.

We start out with the official ruby:2.2.0 image from the Docker Hub:

$ docker run ruby:2.2.0 echo "B-b-b-b-brass t-t-t-t-tacks!"

OMG that step will take forever if you’ve never downloaded the base debian image. It downloads and sets up quite a few layers. If you’re interested in what it’s doing behind the scenes and you can read Dockerfiles, this file is what’s being executed. Anyway, when it’s done you should see a friendly reminder about what we’ve gotten down to. We use docker run to run an image (downloading it first if necessary); in this case, the ruby:2.2.0 image. Everything after the image name is the command to run. Now that we’ve downloaded some base images, you can check out your available images using docker images.
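
Since everything after the image name is a command, you can poke at the image with any one-off you like (each run creates yet another container, as we’ll see shortly):

$ docker run ruby:2.2.0 ruby -v    # prints the image's ruby version
$ docker images                    # ruby:2.2.0 now shows up locally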

Now I need to install my system-level dependencies:

$ docker run ruby:2.2.0 apt-get update

Note how this time the base image was already found in your local repository, resulting in a command that ran pretty quickly (depending on your internet speed; sorry, Comcast customers!). But what have we really done so far? We’ve created two separate containers: one with our initial echo command (useless) and one with all our updates. To see these containers, use docker ps -a. This will give you output similar to the following:

$ docker ps -a
CONTAINER ID        IMAGE               COMMAND                  CREATED             STATUS                      PORTS             NAMES
ad5ddd55f2c2        ruby:2.2.0        "apt-get update"         2 seconds ago       Exited (0) 2 seconds ago                        mad_curie
10cbaac4488c        ruby:2.2.0        "echo 'B-b-b-b-brass"    15 minutes ago      Exited (0) 15 minutes ago                       mad_perlman

These are the containers now available to us. We can create a new image from the first container using docker commit ad5 rubyapp, which will allow us to use it as a base for further containers. However, if we were to do this for every command we wanted to execute, we might be here for a while. We could go into bash on the base image and do all of our steps:

$ docker run -it ruby:2.2.0 /bin/bash
root@1be43510341e:/# apt-get update
...
root@1be43510341e:/# apt-get auto-remove
...
root@1be43510341e:/# apt-get install -y --force-yes libqtwebkit-dev mongodb
...

We could then use this image to run our app - however, this is also tedious and a bad solution. We want something granular that we can review and reproduce every time we build a base image. Fortunately, Docker provides us an easy way to do this using a DSL. Introducing the Dockerfile:

# base image
FROM ruby:2.2.0

# install system-level dependencies
RUN apt-get update && apt-get autoremove -y && apt-get install -y --force-yes libqtwebkit-dev mongodb

# install gems from /tmp such that bundling is CACHED
WORKDIR /tmp
ADD Gemfile Gemfile
ADD Gemfile.lock Gemfile.lock
ADD .env .env
RUN bundle install

# load application source
ADD . /usr/src/app
WORKDIR /usr/src/app

# port where application is served
EXPOSE 5000

The syntax is a little different, but all we’re doing is telling Docker our base image, issuing commands, and copying files. The ADD command allows us to copy files from our host system. In this case, I copy over . to /usr/src/app in the container. I also copy over my Gemfile separately to cache the bundle so the gems aren’t reinstalled on every build. I then expose the port I want my application to use. Build an image from this file as such:

$ docker build -t rubyapp .

This creates an image called rubyapp, along with an intermediate container for every instruction in the Dockerfile. Although your first build may take a moment, subsequent builds will be cached and should be significantly faster. Now, to run my application:

$ docker run -d --name rubyappinstance rubyapp foreman start -d /usr/src/app

I use foreman to start my application from the given directory. The -d flag tells Docker to run the container in detached mode. If I check my running containers with docker ps, I’ll see my application running. If I want to stop it, I just run docker stop rubyappinstance.
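
That cycle, plus docker logs for when foreman misbehaves:

$ docker ps                      # rubyappinstance appears while running
$ docker logs rubyappinstance    # foreman's output from inside the container
$ docker stop rubyappinstance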

I’m going to stop there for now. In order to get Ollert working properly, I also need to link a Mongo database and change some environment variables in my application, but those are relatively easy tasks.
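
For the curious, a rough sketch of the Mongo piece using Docker’s --link flag; MONGODB_URI is a stand-in for whatever variable your app actually reads its connection string from:

$ docker run -d --name rubyappdb mongo    # official mongo image
$ docker run -d --name rubyappinstance --link rubyappdb:mongo \
    -e MONGODB_URI=mongodb://mongo:27017/ollert \
    rubyapp foreman start -d /usr/src/app

Inside the app container, the link makes the hostname mongo resolve to the database container.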

Is it worth it?

The only conclusion is a definite maybe. Docker is definitely pretty cool. It may be able to help you deploy custom applications more easily; for Ollert, it feels like overkill. There is a lot of overhead in downloading core versions of different operating systems, and I already find myself itching to clean up all the leftover Docker images and containers on my machine that I used once and never again. After getting the Docker development out of the way (building and testing a Dockerfile), you may save yourself some time in the future if you have to change hosting services or CI environments. Try it out! It’s a pretty neat concept and definitely worth your attention in 2015.
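
One parting tip for that cleanup itch: the blunt approach below removes all stopped containers and any images nothing references (anything still in use just errors and is skipped):

$ docker rm $(docker ps -aq)        # remove stopped containers
$ docker rmi $(docker images -q)    # remove unreferenced images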

2014 Retrospective


Happy New Year, dear reader.

2014 was a fairly good year for me. I wrote a lot of code, read a lot of books, and published a lot of blog posts.

Things I got paid to do

2014 was a reaffirmation of career goals alongside the opportunities to work with several very different clients. In addition to working on cool stuff, I was promoted to Software Engineer 2. During the tail end of the year, I had the chance to work from home part of the week, which was a great experience. A sampling of my professional works this year:

  • I wrapped up my time writing Qt/C++ for a touch-screen automated-guidance system embedded in heavy machinery. I spent the first two months of the year playing Tech Lead for a small team of misfits leaving the project during an organizational restructure. I had the opportunity to work directly with a project architect to redesign features and guide my fellow developers through the implementation.
  • I worked on Health2Wealth, a ruby application created by some work friends to track Fitbit data and allow an administrator to award monetary rewards for reaching goals.
  • I pitched, designed, and implemented a product called Ollert, a Trello analysis tool written in ruby. Built during a 48-hour SEP Startup Weekend, I led a team of 6 to bring the project to fruition. SEP has fully embraced the product and I continue to work on improving it when not on project work. We currently get about 300 sessions a month, and I frequently get emails from users thanking me, reporting bugs, or submitting feature requests.
  • I became part of a team of developers working on a modern asset-tracking tool to replace an existing Microsoft Access tool. This was my introduction to C# programming with ASP.NET MVC. The project was short-lived for my team, but sparked my interests in RESTful APIs and Javascript.
  • I worked on a rewrite of a huge WebForms e-commerce site. We created a hybrid site running some pages in the WebForms engine and all of our new pages in fresh MVC. We worked with a high-fidelity prototype created by an outside company, which proved at times invaluable and at other times a thorn in our sides. We worked directly with members of the client on-site, who were able to provide us with great insights into the legacy system and helped us work out unclear requirements.
  • In December, I had some fun working on a solo proof-of-concept with a new client. This client is planning to do a rewrite of a large application in 2015 and wanted to do some investigation into rules engines, particularly those using Java. I worked through a few libraries and landed on JBoss Drools, an open-source rules engine with an integrated frontend called KIE Workbench. It’s been a while since I wrote any Java code, and I’d certainly never deployed any Java web applications. After two weeks of work, I was able to launch a web application using Spring MVC and integrate a KIE Workbench instance running on the same Azure virtual machine. The application is able to pick up rule changes from the Workbench automatically, without recompilation or redeployment.

Of my own volition

I had a lot of fun coding ideas this year, most of them lost to the ravages of time. A few items of note:

  • BYO Game of Life (http://byo-game-of-life.herokuapp.com)
    • My most recent release - a frontend web client allowing the user to give a URL pointing to a Game of Life backend. I created a Go implementation as part of my presentation for the inaugural Indy Golang meetup.
  • 3rd Day Organics
    • We finally finished version 1.0 of 3DO, a custom e-commerce website for a co-worker’s spouse’s cooperative food program. We built the whole application from scratch; in hindsight, we should have set something up using spree or another commerce framework.
  • Fortune Cookie API (http://fortunecookieapi.com)
    • A fairly silly API written using NodeJS to get data associated with fortune cookies: fortunes, lessons in Chinese, and lottery numbers. I implemented a simple application to access the data at http://demo.fortunecookieapi.com.
  • mongoose-simple-random (https://www.npmjs.com/package/mongoose-simple-random)
    • My foray into NodeJS packages - this is a plugin which adds findRandom and findOneRandom methods to any mongoose schema. Used extensively in the Fortune Cookie API.
  • Ollert (https://ollertapp.com)
    • I know I listed it under a paid project, but I was putting over 8 hours a week into Ollert between the time it was originally built and the time I was paid to work on it. It’s my big pet project, and I’m proud of the things that it and I have accomplished. Trello has noted our accomplishments, validating all the hard work and long hours I’d sunk into it.

Book It - Where’s my free pizza?

I think that I read 11 books this year. See my books section for a quick check of which books you should be reading. Some of my favorites:

This blog o' mine

In reality, I put quite a bit of work into this blog.

I published 24 blog posts in 2013; I published 37 in 2014 (including this one).

I had 1700 sessions and 2420 pageviews in 2013. I did almost that well in a single month at the end of 2014. I had 10,194 sessions and 12,997 pageviews in 2014 (as of 10:46AM EST 12/31/2014).

In February, I set up a custom domain at “larry-price.com”. Since then, I moved from Github Pages to OpenShift and added SSL.

I integrated Google AdSense this summer. Full disclosure: I’ve “earned” $3.78 in the past 6 months. From my perspective, these ads are simply an experiment to see how I could potentially monetize this blog. Another experiment was posting Amazon Associate links with my write-up posts; I’ve gotten 4 clicks and 0 buys.

In an attempt to keep people on the site “longer”, I added a Related Posts section above the Comments - it makes the blog take forever to generate and has had negligible results. It turns out users are more likely to click the related tags than the related posts. I intend to remove the Related Posts section soon (it may not be there when you read this).

Hello, 2015

As I enter the new year, I want to be able to look back next year at a set of naïve goals and wonder why I ever thought they were practical. So here they are:

  • Contribute meaningful code to several open source projects not created by me
  • Read at least 1 book per month
  • Attend more meetups
  • >3000 sessions/month on this blog (currently ~1400)
  • Become pro at using chopsticks
    • I’m already pretty good, but by the end of 2015 I will be eating pizza with chopsticks
  • More effectively conceal emotions when working with peers and clients
    • Use this energy and passion to put the situation in my control instead
  • Work remotely more and better
  • Obtain more technical leadership roles on projects
    • Not entirely in my control, I know, but I’m hoping that 2015 is a year where I’m able to demonstrate both my technical skills and proclivity to command

Goodbye, 2014

Thanks for reading. Have you thought about doing one of these yourself? You really should! There’s lots of help out there if you need it - you can even reach out to me, if you like.

As always, may your compile times be short and your error messages meaningful.

Deploying an Octopress Site to Openshift


I’ll detail how to take a new or existing GH Pages blog in Octopress and deploy it to OpenShift.

Why?

I recently moved my blog from Github Pages to OpenShift Online. I did this because I wanted to utilize SSL with my custom domain - a feature not currently available on Github Pages (it’s 12/14/2014 - let me know when this becomes possible!). OpenShift supports SSL with a free account, so I decided to make the switch. I wanted to use the existing architecture set up by Octopress that built my GH Pages blog. Hopefully you see a little lock icon in your address bar if you’re reading this from the site.

Step 0: Create an OpenShift account

Here’s the link: https://www.openshift.com/app/account/new. They’re serious about the first and second options being free. The difference is that signing up for the second option requires credit card information, which gives you the possibility of scaling your application. The second option is also what allows you to add certs for SSL.

Step 1: Create a Ruby 1.9.3 application

Octopress still uses ruby 1.9.3; for simplicity’s sake, so will we. Assuming you have ruby-1.9.3 and the rhc gem (gem install rhc) installed, create a new ruby-1.9 application on OpenShift:

$ rhc app create octopress ruby-1.9

replacing “octopress” with your desired application name. I’ve found this action takes some time, usually over a minute, while OpenShift allocates resources. When your application has been created, you should see an ssh:// URL for your git repository on OpenShift. Make sure you can find this again later.
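
If you do lose track of it, rhc will echo it back:

$ rhc app show octopress    # app details, including the Git URL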

If you’re using a custom domain, now is as good a time as any to set it up. OpenShift has its own docs to cover this topic.
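
The rhc side of it boils down to one command; substitute your own domain:

$ rhc alias add octopress www.larry-price.com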

Step 2: Merge in “Octoshift”

I’ve made some updates to the octopress repo that I haven’t requested be pulled into the master branch yet. I’ll update this section in the future if necessary.

If you’re new to Octopress, clone from the master branch:

$ git clone git@github.com:imathis/octopress.git

Either way, cd into your octopress directory and merge in my fork:

$ git remote add octoshift git@github.com:larryprice/octopress.git
$ git pull octoshift master

Step 3: Set up deployment

There is a rake task for setting up an OpenShift deployment called setup_openshift. This rake task clobbers the _deploy directory and reinitializes it using the URL for your OpenShift repository, which we committed to memory in Step 1.

$ rake setup_openshift["ssh://548e1873e0b8cddccf000094@octopress-username.rhcloud.com/~/git/octopress.git/"]
rm -rf _deploy
mkdir -p _deploy/public
cd _deploy
Initialized empty Git repository in /home/lrp/Projects/2014/octopress/_deploy/.git/
remote: Counting objects: 21, done.
remote: Compressing objects: 100% (14/14), done.
remote: Total 21 (delta 2), reused 21 (delta 2)
Unpacking objects: 100% (21/21), done.
From ssh://octopress-username.rhcloud.com/~/git/octopress
 * branch            master     -> FETCH_HEAD
 * [new branch]      master     -> origin/master
cd -
cp config.ru _deploy/
cd _deploy
[master 5204046] Octopress init
 3 files changed, 27 insertions(+), 295 deletions(-)
 create mode 100644 Gemfile
 rewrite config.ru (99%)
 create mode 100644 public/index.html
cd -

---
## Now you can deploy to ssh://548e1873e0b8cddccf000094@octopress-username.rhcloud.com/~/git/octopress.git/ with `rake deploy` or `rake openshift` ##

Step 3.5: Set up forcing SSL

Do you already have SSL for your domain? Bully for you! I’ve included an option to setup_openshift to force traffic to use https in a production environment. Just send a second parameter to rake and it’ll do the rest.

$ rake setup_openshift["ssh://548e1873e0b8cddccf000094@octopress-username.rhcloud.com/~/git/octopress.git/",true]

Note that you will still need to change your URL in _config.yml if you are using a custom domain.
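
For example, with my domain standing in for yours (or just edit _config.yml by hand):

$ sed -i 's|^url:.*|url: https://larry-price.com|' _config.yml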

Step 4: Deploy

Now we can just run the openshift or deploy rake task to deploy our app to OpenShift. Either task verifies we have the latest version of our OpenShift master branch, installs missing gems, copies over the public folder, and pushes the application to OpenShift.

$ rake deploy
## Deploying branch to OpenShift
## Pulling any updates from OpenShift
cd _deploy
From ssh://octopress-username.rhcloud.com/~/git/octopress
 * branch            master     -> FETCH_HEAD
Already up-to-date.
Fetching gem metadata from https://rubygems.org/...........
Resolving dependencies...
Using rack (1.5.2)
Using rack-protection (1.5.3)
Using tilt (1.4.1)
Using sinatra (1.4.5)
Using bundler (1.3.5)
Your bundle is complete!
Use `bundle show [gemname]` to see where a bundled gem is installed.
cd -
cd _deploy

## Committing: Site updated at 2014-12-14 23:13:54 UTC
[master b17225e] Site updated at 2014-12-14 23:13:54 UTC
 1 file changed, 17 insertions(+)
 create mode 100644 Gemfile.lock

## Pushing generated _deploy website
Counting objects: 9, done.
Delta compression using up to 8 threads.
Compressing objects: 100% (7/7), done.
Writing objects: 100% (9/9), 1.29 KiB | 0 bytes/s, done.
Total 9 (delta 1), reused 0 (delta 0)
remote: Stopping Ruby cartridge
remote: Waiting for stop to finish
remote: Building git ref 'master', commit b17225e
remote: Building Ruby cartridge
remote: bundle install --deployment --path ./app-root/repo/vendor/bundle
remote: Fetching gem metadata from https://rubygems.org/..........
remote: Installing rack (1.5.2)
remote: Installing rack-protection (1.5.3)
remote: Installing tilt (1.4.1)
remote: Installing sinatra (1.4.5)
remote: Using bundler (1.3.5)
remote: Your bundle is complete!
remote: It was installed into ./vendor/bundle
remote: Preparing build for deployment
remote: Deployment id is b4ef1149
remote: Activating deployment
remote: Compilation of assets is disabled or assets not detected.
remote: Starting Ruby cartridge
remote: -------------------------
remote: Git Post-Receive Result: success
remote: Activation status: success
remote: Deployment completed with status: success
To ssh://548e1873e0b8cddccf000094@octopress-username.rhcloud.com/~/git/octopress.git/
   a5071bb..b17225e  master -> master

## OpenShift deploy complete
cd -

Fin

That’s that. Treat it as you would any other Octopress install. I’ll see if the owners of the octopress repository would be interested in pulling in my customizations and update this blog post as necessary.

As for my switch from GH Pages to OpenShift: I’ve found OpenShift to be just as fast as the GH Pages static server, especially when combined with Cloudflare’s CDN and optimization systems. No regrets so far. Drop me a line in the comments if you have a beef with OpenShift or know of any equivalent alternatives.