Category Archives: Cloud Computing

Architecture Qualifications

I was recently asked about Software Architecture qualifications by a reader of the blog, and I’m keen to share my advice a bit more widely.

They are an experienced developer looking to step away from being an individual contributor and move towards influencing design and system implementation. They wanted to know my thoughts on TOGAF, and how useful the qualification had been in my own progression.

My advice assumes that you’ve already decided that a certification or formal course of study is the right next learning step for you. If you are a proponent of the 70/20/10 model, then this is very much about what you should do with your 10% of formal learning time. So, without further ado, let’s get into the nitty-gritty of the possible options.

TOGAF was an interesting course of study, but I feel it has a very narrow range of applications, relevant only to a small number of extremely large organisations. Its main focus is the creation and maintenance of a large architecture practice within an enterprise, which is well away from the day-to-day work of designing and developing systems.

If you are looking at a progression path from individual contributor to technical leader, then I’d strongly recommend your favourite flavour of cloud certification. I use AWS at the moment, and its Solutions Architect track is really good. Microsoft offers similar paths for Azure, as does Google for their cloud.

ITIL is possibly a useful direction, but it tends more towards hardware and operational processes, so it might be less useful if you are aiming to design and create new systems, as opposed to running existing systems stably and efficiently.

If you are thinking of modern companies, strong agile approaches and staying close to the day-to-day implementation of your designs, then the cloud route is my number one suggestion, and there are a lot of great supporting courses out there to aid your studies!

AWS Cloud Practitioner

In a brief break from focusing on Leadership books, I’ve been brushing up my technical skills and reviewing some training material from AWS.

Amazon’s cloud offerings are many and varied, and they can be daunting for anyone unfamiliar with the basic concepts. The AWS Cloud Practitioner certification provides a grounding in these core concepts, and is suitable for anyone who has to interact with AWS in a professional capacity.

The AWS-provided training consists of around 7 hours of content, covering the basic principles of the cloud, outlining some core AWS services, and then moving on to security, design and pricing. It’s broken down into short videos with knowledge checks following each section. It’s easy to consume and easy to understand.

The certification is a single multiple-choice exam, consisting of 60 questions with 90 minutes to complete them. I also took the practice exam, which was 25 questions long, but I’d advise doing this a few days before your scheduled exam as the results are not always ready immediately.

Once you’ve passed the exam, you get access to a digital badge that you can share to display your credentials.

As is often the case with these kinds of certifications, the value is in the initial training, which I would recommend for anyone who wants to learn the difference between EC2 and S3, and why either matters. The exam and certification are an additional extra, nice for the validation but not fundamentally required.

This is a stepping stone for other more involved certifications, but I’d say it’s not required. An experienced developer could skip this and move straight to the associate level exams without missing much, whereas a product owner, scrum master or other non-technical person may really find benefit at the practitioner level.

Map Reduce

The idea of Map Reduce is very simple: it is a method for splitting a complicated problem into smaller work packets that can be solved across a distributed system.

There are two main parts to Map Reduce, the Mapping function and the Reducing function.

The master node splits your dataset into manageable chunks and hands each chunk to a worker node. The Mapping function, run on the workers, transforms a chunk into a set of key-value pairs. The Reducing function, also run on the workers, takes all of the mapped values for a given key, processes them, and returns the result to the master node. The master then collates all of the reduced values and returns the final results.

There are some further complexities to Map Reduce implementations, namely how the data is actually sent to a worker, how the results are returned and how the scheduling is managed. This is basically what a system like Hadoop will manage for you, so you can concentrate on the details of your Mapping and Reducing.

The canonical example is producing a count of the words in a document. The input to your mapping function is a string containing a chunk of the document’s text. The mapper splits this into words and emits a key-value pair for each occurrence. The reduce function then simply counts the values it receives for each word, and returns the results.


"A simple string with a repeated word"

Map and Reduce functions


function map(string mapInput)
{
  // Emit a (word, 1) pair for every word in the chunk.
  foreach (var word in mapInput.ToLower().Split(" "))
  {
    emit(word, 1);
  }
}

function reduce(string word, int[] values)
{
  // All of the values for one word arrive together, so the
  // count of occurrences is simply how many values there are.
  emit(word, values.Length);
}

This is rough pseudo-code showing what you’d need to implement; it will not work exactly as written, and here emit stands in for however your framework collects results. You will need to customise it for your actual implementation.

We’d expect this to produce the following results:

a 2
repeated 1
simple 1
string 1
with 1
word 1

From this simple example, you should be able to see how we can expand to cope with much more complicated and interesting problems.
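
If you’d like to see the whole flow working end to end, here is a minimal, single-process C# sketch that simulates the map, shuffle and reduce phases in memory. The class and method names are my own for illustration, and LINQ’s GroupBy stands in for the shuffle that a framework like Hadoop would perform between nodes.

using System;
using System.Collections.Generic;
using System.Linq;

class WordCount
{
    // Map: turn one chunk of text into (word, 1) pairs.
    static IEnumerable<KeyValuePair<string, int>> Map(string chunk)
    {
        return chunk.ToLower()
                    .Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries)
                    .Select(word => new KeyValuePair<string, int>(word, 1));
    }

    // Reduce: collapse all of the values for one word into a count.
    static KeyValuePair<string, int> Reduce(string word, IEnumerable<int> values)
    {
        return new KeyValuePair<string, int>(word, values.Sum());
    }

    static void Main()
    {
        var input = "A simple string with a repeated word";

        var results = Map(input)                       // map phase
            .GroupBy(pair => pair.Key)                 // "shuffle": group pairs by word
            .Select(g => Reduce(g.Key, g.Select(p => p.Value)))
            .OrderBy(pair => pair.Key);                // reduce phase, sorted for display

        foreach (var pair in results)
        {
            Console.WriteLine(pair.Key + " " + pair.Value);
        }
    }
}

Running this against the input string above prints exactly the results listed, which makes it a handy sandbox for trying out different map and reduce logic before worrying about distribution.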

Azure – Roles

My simple Azure application, Magic Deck Statistics, makes use of two of the different types of Azure roles: web roles and worker roles.

Very simply, web roles are used to run web applications, via IIS. Worker roles do not have access to IIS (by default), and are used to run processes that don’t have user interaction.

In my application the worker role runs a daily job to download data from various sources and process it into a SQL database. The web roles deal with the presentation of this data, handling requests from visitors and querying the processed data.
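
As a rough illustration, here is a minimal sketch of what that worker role’s entry point could look like. RoleEntryPoint comes from the Azure ServiceRuntime library; the DownloadAndProcessData method is a hypothetical stand-in for my import job.

using System;
using System.Diagnostics;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    public override void Run()
    {
        // Loop forever, running the import once per day.
        while (true)
        {
            Trace.TraceInformation("Starting daily import");
            DownloadAndProcessData();
            Thread.Sleep(TimeSpan.FromHours(24));
        }
    }

    private void DownloadAndProcessData()
    {
        // Hypothetical: fetch data from each source and
        // write the processed results to the SQL database.
    }
}

A real implementation would want to calculate the sleep interval against a fixed schedule rather than drifting by a full run’s duration each day, but the overall shape is the same.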

The advantage of web and worker roles is that you don’t have to worry about the underpinnings of the setup. The server details are abstracted away and managed for you, and you can very quickly scale up by provisioning additional web and worker roles through the management console.
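
Scaling down to the details, each role’s instance count lives in the service configuration file deployed alongside the application, and updating that count is all it takes to add or remove instances. A sketch of the relevant part of a ServiceConfiguration.cscfg is below; the service and role names are hypothetical placeholders.

<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="MagicDeckStatistics"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole">
    <Instances count="2" />  <!-- two web role instances serving the site -->
  </Role>
  <Role name="WorkerRole">
    <Instances count="1" />  <!-- one worker for the daily import job -->
  </Role>
</ServiceConfiguration>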

If you need to deal with more complex software scenarios you can make use of the more general VM options, which allow you full configuration control over the VM. The cost here is that you must then maintain the entire machine, dealing with any patching and updating yourself.

Azure – An Exploration

Microsoft’s cloud offering is known as Windows Azure, and it is rapidly maturing. A couple of years ago it was clunky and hard to use; now it integrates smoothly with Visual Studio, making it quick and easy to create and deploy scalable cloud applications.

There’s a free trial for new users, and those of us lucky enough to have MSDN subscriptions get a certain amount of compute and data space for free. You can run a number of websites at a free pricing tier, but for anything complex or customised, you will have to pay.

I’ve created a simple application, Magic Deck Statistics Viewer, to get a fuller understanding of Azure, and how it all fits together. My application collects information about decks used in the popular card game Magic: the Gathering, and allows users to generate charts based on the collected data.

It uses a single worker role to gather data once a day, and two web roles to manage the actual site. It’s developed in ASP.NET MVC4, using Visual Studio 2012. I make use of the Extra Small compute instances. These are very cheap to run (you can manage 8-9 on an MSDN subscription), and allow practice with the features of Azure without running up a large bill.

I’ll be covering the details of my Azure Exploration in further posts, and hopefully it will prove to be useful and enlightening to those just starting to investigate the cloud.