Thursday 28 May 2015

So you want to be DevOps?

The DevOps juggernaut is well on its way, careering through an enterprise near you, ripping Configuration Engineers out of the ground like a tsunami and demanding they automate themselves out of existence. Developers are no longer headphones-on code slingers but polyglot superstars who walk every last line of BDD/TDD code from the IDE, through a Vagrant or Docker container, to a scripted live cloud deployment - and when they're not hacking on open source projects at night, they're checking the monitoring dashboards and Splunk feeds they've built to catch every last twitch of the live system.

Except it's not like that. At least, that's not my experience, nor that of the people I've spoken to who work in enterprises.

DevOps has created a ripple, and quite rightly so - we developers were sitting in silos, cranking out code, blissfully unaware of the carnage it could cause downstream.

But what is this big bang DevOps movement? We've all read The Phoenix Project and follow #devops, but what are we actually trying to achieve?

We want to be DevOps

Damn you, Spotify and Netflix - you guys are fantastic, but you've put DevOps on the radar of the people who wear suits, and that is causing devastation downstream in the engine room.

Yes we like what we've read and heard about DevOps but being told to transform into a DevOps organisation often seems like an impossible task.
You can't always get what you want
But if you try sometimes you might find
You get what you need - Richards/Jagger
We want DevOps but what do we need?

Well, let's start with this one:

We need to deliver features faster

It's not uncommon for enterprises to release software quarterly. We want to release daily, if not hourly - or, even better, to be in control of our releases and deploy to live any time we choose.


What's the problem?

There are many problems, mostly down to the scale of the changes required at enterprise level.

Delivering faster takes discipline and cultural buy-in, and enterprises have a lot of stakeholders. Getting everyone to buy into the dream is hard, and it only takes one or two senior naysayers to cause a lot of pain.

The enterprise may be based in the backend of nowhere, where finding enlightened people is not easy. Many enterprises chug along with sub-par development teams churning out buggy software that doesn't do exactly what the business or the customer wants - but this rubbish works well enough, and is profitable enough, to warrant not rocking the boat.

How can we fix the problem?
Success is the sum of small efforts - repeated day in and day out - Robert Collier
Faster delivery is the end goal, and it comes as the sum of a number of small factors. We can create a ruler to measure where we are now against where we need to be in order to achieve our goal of delivering features faster. Here are some ideas for delivering software faster:

Level 1 - Basic building blocks


  • Basic Agile Methodology followed
  • Coding standards and peer review in place
  • Source code is in version control
  • Continuous Integration (CI) in place
  • Unit tests run on the CI server
  • Static code metrics in place (Code coverage, cyclomatic complexity, etc)
  • Acceptance test behaviour written in common language between business and development teams (BDD)
  • Test Driven Development (TDD) practiced
  • Code pairing is common practice 
  • Code reviews in place
  • Architecture is modular
  • Time allowances for teams to work on dealing with technical debt
  • Stable teams
  • Test environments that mimic live


Level 2 - Intermediate progress


  • Development Teams provide out of hours support for the features they own
  • Software built with components
  • Scripted one click test environment build
  • Living documentation built from acceptance tests
  • Database is in version control
  • Integration tests run on CI server
  • Monitoring awareness by all teams
  • Configuration in version control
  • Configuration as code in place for deployments
  • Developers and testers integrated
  • Culture of learning and self improvement, proper staff training available



Level 3 - Advanced progress

  • Acceptance tests run on CI Server
  • Automatic Performance tests
  • Automatic database deployment
  • Live systems monitoring accessible to all
  • Infrastructure as code
  • Software can be deployed to live with no downtime
  • Zero touch deployment 
  • Development team integrated with operations team



Of course, different enterprises operate in different ways, so some of the above may not apply - but everyone in the team has a very important role to play in delivering software faster.

Tuesday 17 June 2014

A Dummy REST API with Go Lang

Sometimes I just want to try out a few ideas in an HTML/JavaScript client without having to scaffold a server to provide a bunch of RESTful APIs, so I've built myself a dummy REST API server to do just that.

Here's how I did it!

My requirements were:
  • Serve up the contents of a file based on a route - for example, /Customers would serve a file from the /Customers directory
  • Allow Cross-Origin Resource Sharing (CORS) and/or JSONP
  • Allow me to change the JSON being served without having to restart anything

I decided to write the server in Go since it's fun to scaffold up HTTP web services with, and the resulting utility will be cross-platform, so I can tinker away on my Mac with few issues. I've used the Martini framework to get up and running quickly.

So the file directory will contain the files that I want to serve up as per:




The code to scan the file directory for the files is straightforward:

The code to serve up the file based on the matching route:

So with the GET file populated with some appropriate JSON:



So now when we run the servicestub application:



And if we choose a specific item:




Now to write some JavaScript!

Code here:

Monday 1 July 2013

Setting up a Go Development Environment on a Samsung Chromebook

Since I've become an owner of a Samsung Chromebook, setting up a comfortable offline dev environment has been an ongoing goal.

I've been tinkering with various configurations for a while and have come up with this as a starter for getting a Go development environment up and running.

Part 1 - Installing the Go compiler

From start to finish this step should take about 40 minutes.

1. Boot the Samsung Chromebook into Developer Mode

The quick way to do this is to hold down escape + refresh + the power button.

Then press Ctrl + D at the next page, and over the course of the next 10-15 minutes the Chromebook will wipe itself and install a developer-mode-enabled version. This brings a few things to the table, like a command-line shell.


2. Install Ubuntu with the help of Crouton 

I've tried dual-booting Ubuntu on my Samsung Chromebook with ChrUbuntu, which is fairly awesome, but I had serious issues with the trackpad, sound and general performance. I'm sure these issues will be ironed out over time. That being said, the wireless connectivity was much better on the Ubuntu boot than in Chrome OS - I didn't have to do the turn-it-off-and-on-again song and dance with my router to get the Chromebook to connect, which I frequently have to do with the Chrome OS boot.

Then I discovered Crouton and quickly came to the conclusion that it is amazing. Ubuntu runs really smoothly, the hardware works just as it does in Chrome OS and, what's more, I can switch between my Linux distro and Chrome OS at the touch of a button. Well, four buttons (Ctrl + Alt + Shift + Forward/Backward), but you get my point.

Installing Crouton is a piece of cake:

(i) Download the latest release from here: http://goo.gl/fd3zc
(ii) Open a shell (Ctrl + Alt + T), then type "shell" at the command prompt
(iii) Type: sudo sh -e ~/Downloads/crouton -t unity

I've chosen Unity over Xfce, which is used on the Crouton GitHub page - no real reason here other than I preferred the look and feel.

Once it's finished installing it can be fired up with:

sudo enter-chroot startunity


3. Go

Go is also very easy to install with apt-get and works beautifully:

sudo apt-get install golang


Now check that Go is installed by typing: go version



Great, the Go compiler is in place.

Part 2 - Installing LiteIDE

This step takes about 20 minutes to complete.

Now compiling from the command line is one thing but I'm from the camp of developers that likes thumping out code in an IDE.

There are a couple of IDEs for Go that could be used - for example, Eclipse with GoClipse.

However I've chosen to go with LiteIDE for the time being as I like its simplicity.

There are a number of tools that need installing first that are missing from the Crouton Unity install:

1. make

sudo apt-get install make

2. g++

sudo apt-get install build-essential g++

3. GDB

sudo apt-get install gdb

4. QT4

sudo apt-get install libqt4-*

5. Git

sudo apt-get install git

6. LiteIDE

The only difference I had here from the instructions on the LiteIDE project page was with the QTDIR path which differs from the default path used by apt-get.

One further tip here - don't use sudo for the next few lines or you'll end up making work for yourself.

The build takes about 10-15 minutes. Once finished, we have a working LiteIDE in liteide/bin

Opening the LiteIDE application and confirming that the traditional "hello world" code compiles and runs:


Excellent!

Friday 14 June 2013

Mocking OS X with Mono and Moq

A quick way to test units of code with dependencies on other objects is to mock them up. This is not a new concept, and the internet is awash with decent mocking frameworks and documentation on how to use them, but I thought I'd jot down a quick example anyway of how to go about mocking with Mono on OS X.

I've built a simple class that has a dependency manually injected into its constructor.

Here is the interface for the dependency:


Implement the interface:



Create a class that depends on the IRepository:
Now when we invoke the above class:

All works as intended - the output is "Test", which is what our implementation of IRepository returns.

But when testing we want to isolate the MyWorkFactory class without having to rely on the working implementation of the IRepository interface.

There are a few ways of going about this: we could create a new implementation of IRepository with known outcomes for testing, or we could mock the dependency up.

I'm using Moq in this example as it's lightweight, works well on mono and is easy to use.

So let's mock/moq up the repository, inject it into the MyWorkFactory class and see what happens:

So, as can be seen, the MyWorkFactory object now uses our mocked repository dependency and outputs "My Mock". We can test the class in isolation and configure the dependency to test all avenues.
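The C# listings above were embedded as images that haven't survived. For flavour, here is the same isolate-the-dependency pattern transliterated into Go, with a hand-rolled fake standing in for Moq - all names are invented for illustration:

```go
package main

import "fmt"

// Repository is the dependency contract - the analogue of the post's
// C# IRepository interface.
type Repository interface {
	GetText() string
}

// WorkFactory depends on a Repository, injected via its constructor,
// mirroring the post's MyWorkFactory.
type WorkFactory struct {
	repo Repository
}

func NewWorkFactory(repo Repository) *WorkFactory {
	return &WorkFactory{repo: repo}
}

func (f *WorkFactory) DoWork() string {
	return f.repo.GetText()
}

// realRepository is the production implementation, returning "Test"
// as in the post.
type realRepository struct{}

func (realRepository) GetText() string { return "Test" }

// mockRepository stands in for the real thing under test - the
// hand-rolled equivalent of the Moq setup.
type mockRepository struct{ text string }

func (m mockRepository) GetText() string { return m.text }

func main() {
	fmt.Println(NewWorkFactory(realRepository{}).DoWork())                // Test
	fmt.Println(NewWorkFactory(mockRepository{text: "My Mock"}).DoWork()) // My Mock
}
```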

Monday 1 April 2013

Samsung Chromebook

I unboxed a Samsung Chromebook this weekend and was looking forward to giving it a whirl, but was immediately deflated when it would not connect to my BT Home Hub. All manner of configuration changes later, I was still none the wiser.

The good news is that there is an active support forum with quick replies from the Chrome ninjas. I was advised to enable the "Experimental Static IP Configuration" via the chrome://flags configuration page. One reboot later and what do you know? I was connected.

So now, 48 hours later, I'm very pleased, as a lot of boxes have been ticked:


  • It's fast. Really fast. Not just in terms of the boot up but the browsing is a lot more responsive than I am accustomed to.
  • The hardware is really nice. It's compact and light. The keyboard has a great feel to it and is great for touch typing.
  • The heat dissipation is excellent - this is certainly no leg warmer like the MacBook Pro.
  • There are plenty of excellent apps on the chrome store to suit my requirements - and many offline compatible ones too.
  • The battery life is very good indeed - I've been using it for two hours and it's reporting over six hours left.
  • I can remote desktop onto my desktop PC and Mac.


Will it replace the MacBook Pro?

Maybe but I have a few issues to tackle first.

I'm missing a good text editor and the ability to develop .Net code. OK, Google provide a Chrome Remote Desktop application that I can use to remote onto my Windows 8 desktop PC - it's very good, but not quite as responsive as the Microsoft RDP client on the Mac. It may be the case that I dual boot the machine with Ubuntu and Mono to tick this box.

I've also recently taken an interest in programming in Python. A decent alternative on the Chromebook, I've discovered, is to spin up an Amazon EC2 micro instance and SSH onto it via the built-in Crosh (Ctrl + Alt + T) terminal.

I'll probably blog about programming on the Chromebook once I've researched the options properly.

Oh yes, I'm going to have to find a different game to play too...

Sunday 10 February 2013

OS X - Super Simple Database Application with C# in 5 steps

There have been a couple of recent developments in the Mono world that have made creating applications on OS X even easier.

One useful tool that has recently appeared on the scene is NuGet. Visual Studio developers have had access to NuGet for years, and now it is finally available to Mono users. It's made very easy to use via Matt Ward's MonoDevelop NuGet add-in.

This blog post will walk through using Nuget on MonoDevelop on OS X to create a simple database application.

Install the NuGet add-in into MonoDevelop - there are instructions here. I added a new repository via the Add-in Repository Manager that points to:

http://mrward.github.com/monodevelop-nuget-addin-repository/3.0.5/main.mrep

1) Create a new Console Application


File -> New -> Solution -> Select C# and then Console Project

2) Create the model


The model is a simple POCO that will map directly to the database:


Nothing complicated here.


3) Use Nuget to add ServiceStack.Net OrmLite

There are many object-relational mappers out there for C# - for example, Mono added Microsoft's Entity Framework last year, shortly after it became open source. I'll blog about using Entity Framework in the future, but for this post I'm going to use an ORM called ServiceStack.Net OrmLite to access a SQLite database.


Go to Project -> Manage NuGet Packages

Add the following package:

  • ServiceStack.OrmLite.Sqlite.Mono

Use the search box to locate the package and hit the Add button to add the package to your project:




4) Add a couple of additional references.


In order to use LINQ to query the database, a few additional references are required. Add these by right-clicking on the References folder and selecting Edit References...

  • System.Core
  • System.Data
  • System.Data.Linq


5) Use ORMLite to create the database, populate it with data and query it


When run:

The code for this blog post is available for download here.

Saturday 9 February 2013

What's your crash plan?

Just over a week ago my trusty hand built Windows machine horrified me by presenting me with this message:


"Back up your files immediately to prevent information loss"

Oh dear - I hadn't backed up the files on my PC for a few months, and even then it was just to an external hard drive. The Mac was even worse - my Apple Time Machine had been gathering dust in a box somewhere since I moved house. This has bugged me for a while, so now it's time for action!

I started exploring options for backing up the data from all the machines used by the household. After totalling up the data I currently wanted to back up (around 140 GB), it became apparent that the free cloud storage offered by the usual software suspects just wouldn't do. For starters, these services are not really aimed at backing up the contents of your household computers; their focus is on sharing documents.

I then remembered a Hanselman blog post from last year where he documented his backup strategy. He uses a service called CrashPlan, so I thought I'd give it a try.

The software supplied by CrashPlan installed perfectly, was super simple to set up and use on all the household machines running either OS X or Windows, and has just worked.

CrashPlan also provide a free application for the iPhone and Android phones used in the house - it shows me the status of all backups and also lets me access individual backed-up files whenever I want.

Now, I don't have a particularly fast upload speed on my line, but one week later I am fully backed up.

The hard drive still hasn't crashed, but I now know that if it does, all is not lost.