Monday, 27 June 2016

Yet Another Brexit Blog Post

Brexit is happening. There are some straws to clutch at, but they are unlikely to stop this. I don't intend to turn this blog into a political commentary or a place to bemoan what might have been. There are lots of people who do those things much better than I can. As ever, my blogging contains significant elements of collecting my own thoughts. I'll try to edit the worst of the ramblings down...

I have been alternately sad, angry and despairing for a few days; and binged on news, Twitter and political discussion at work. Now it is time to try and start collecting my thoughts on how this affects me and those around me. I'm not going to go into my personal life too much - suffice to say that I'm worried for a future that seems certain to lack many of the advantages I benefitted from; songs from the 1980s have renewed relevance; and I have a shortlist of places to emigrate to if it gets really bad.

This decision affects all of life in this country and so I'll set out how I think this decision, and the wider social implications, affect computing as an industry and profession. Some of this may well be true for other sectors, but I lack the expertise to comment. The Leave campaign is urging us Remainers to be positive and get stuck into working out how to make this work. There's a bitchy side of me that says that this is because they lack the vision to do so themselves. However, like inspiration, a positive outcome is more likely to visit those that are busy.

The short term 

There were quite a lot of good things about being in the EU. There may be some good things about being out. There's very little to recommend the process of leaving. This will be a hard time all round, arising from several factors:
  • The economy has had a nasty shock and we have at least three years of uncertainty ahead of us. (3 = some time to put government back together, since everyone has just resigned, and activate Article 50; + 2 years of negotiation; + some time to have any idea how the deal reached plays out in reality.)
  • Any benefits (up to £350 million / week if the Leave campaign is to be believed, or more likely £175 million / week net) from paying less into the EU won't trickle through yet. The benefits we currently get from the EU, e.g. scientific grants, will start to dry up as new investment and partnerships are refused or seen as too risky. With a stretched economy (the pound continues to take a hammering even if the FTSE is stabilising) there will be even deeper cuts to infrastructure and services.

[Image: Vote Leave poster: "We send the EU £50 million every day"]

  • Sectors such as banking will find companies looking to EU states as a larger market to be present in. Sectors closely tied to public spending, such as health and education, will suffer a lack of investment and uncertainty in their future plans.
  • Start-ups will find funding hard to come by as hard times hit investors' pockets and uncertainty makes selling the financial side of a vision harder. There are already reports of funding being lost.
How to approach this challenge?
There will still be business in this country. As a company, if you already have funding then diversifying your markets and workplaces and seeking ways of being distinctive and adventurous without excess investment seems like the way to go. New ventures will find funding harder to come by. The "you need less than you think" ethos of REWORK will be one to follow.
As individuals, if ever there was a time to up your game and make what you're doing work then this is it. Over time many may need to consider what mix of skills, time commitment, money and location they can be flexible on. But, as a group, we shouldn't devalue what we have: the UK has a strong tech economy and we are still competing in a global market.

Until negotiations are complete, recruitment and retention of staff from beyond our borders will be harder, as uncertainty surrounding the economy, work visas and residency will be off-putting even if the barriers are not yet different. Ahead of any negotiations, Boris has written in the Telegraph that the rights of workers from the EU will be protected and that there will be points-based immigration. However, until the Article 50 process is well underway, certainty on this seems impossible. Certainly, friends from the EU that I have spoken to are at least working on a plan B.

From the news, it seems that the more xenophobic aspects of the Leave campaign have already given an impression of legitimacy to the nation's racists. I hope this is quashed immediately, but as the false prospectus of greater investment in services and deprived areas is found out, countering it will be an ongoing task. This will compound the problems and affect short-term recruitment, including of students, as well.
[Image: Controversial UKIP poster]
The great talent from around the world and across society that we have needs to feel valued. The occasionally macho, sexist, white-male environment of tech needs to be consciously welcoming to achieve that. Compared to those who voted Leave in desperation at being excluded from the country's prosperity, we're mostly pretty well off. Brexit has been described as divisive, and I'm quite happy to continue to disagree with many elements of the Leave campaign.

The medium term

With a "short term" of three years I've just outed myself as an ex-academic. Despite working at a start-up, given the squeeze I think universities will feel in the coming years, I believe I've made the less stressful choice.
As the shape of the negotiated exit becomes clearer, planning can become more confident. It seems likely that some links to the EU will be maintained. Other countries on the physical continent are outside the EU, but none has left before. At present the EU doesn't seem in a mood to be charitable. This will extend the differences illustrated below.
[Image: Four maps illustrating fault lines in the EU]
Over two to five years the mismatch between the promised removal of EU bureaucracy, costs and immigration and reality will become clear. The country - in particular the disadvantaged rural and ex-industrial communities - will have suffered several years of austerity, uncertainty, likely Scottish independence and a possible resumption of tension in Ireland as many of the assumptions of the Good Friday agreement vanish. The potential for unrest is there.

There are some implications from the short term which bear extrapolation.
Being lean and agile has been a thing for a while. Reacting to the unexpected and being quick to diversify and adapt are clearly necessary now. Although I'm advocating lightweight start-ups, this may need some business skills to achieve, including:

  • Bases in multiple countries to work in local economies, to avoid currency and punitive tax exposure. Our country seems likely to become more insular, but software is impossible to constrain within borders. We have to buck the trend.
  • Going to meet customers and the tech community rather than expecting them to come to a UK that has shrunk. Our experience of being in something bigger will have to be brought to bear here, retaining that outlook. 
  • Use of cloud services to enable growth and shrinkage is already standard. Their use to avoid infrastructure outlay will become more desirable to smaller companies. Enabling the location of data and computation in non-UK jurisdictions, to make meeting local data-protection requirements easier, will be a new advantage to consider. This adaptability isn't magic that occurs only in the cloud provider's API. Learning past lessons of scalability matters more than ever.
  • Planning for internationalisation, in an environment where multi-lingual staff become harder to recruit.
  • The economics of buying hardware from abroad will have changed. Assumptions about the lifecycle of devices may also change if our economy slows down and the pound remains weak.

The "keep it small" approach is a different sort of business: one run because the members are interested in running it as an ongoing, profitable and enjoyable affair. The "unicorn" companies and the big IPO will be a harder dream to achieve from the UK, so the sector will morph in its outlook. However, as certainty returns (even if it is a less rosy certainty) planning and confidence can start to return.

The talent issue will really start to take hold as border arrangements change. Recruiting skilled workers is likely to be easier than recruiting minimum-wage farm workers. Even so, recruiting from the EU will involve more complexity than it did last week. Anyone who has recruited from beyond the EU will have some idea of the ever-changing requirements and processes this could well imply. An overseas corporate presence, or even virtual Estonian residency, will be an advantage here in avoiding dealing with the UK border agency. Employment by an interesting company will remain an attraction. Given my predictions of racism and civil unrest, residency in the UK may be less attractive. The remote-working approach of distributed offices is already effective. In smaller companies, the very small offices and home working described, for instance, in The Year Without Pants will fit well into the slim and quickly self-sufficient approach.

The long term

We're human. We'll find a way to make this work. How, I'm not going to claim to know. So I'll finish with some top-of-my-head predictions and suggestions:

  • If there isn't some degree of talent-drain abroad I'll be very surprised. 
  • Attracting British talent to computing will be necessary. We need to show inclusivity to as wide a community as possible. We need to enable aspiration and creative thinking. Having started my working life during a recession, that feels like a significant challenge.
  • It is also in the long term that the hit on local science and technology education at university level will be felt as a reduction in students willing to take on degree level debt and a shrinking of the university sector occurs. Online courses taken as continuing professional development will become a greater part of the professional's CV. 
  • Local communities exchanging ideas and attracting talent to cities will continue to be important (Brighton Java is my pet example, but these things exist in many places). Forward looking companies will continue to sponsor such meetings to foster a local community of talent.
  • I mentioned above that we'll need the experience of having been European in maintaining an outward looking business environment and inclusive work environment. That tradition must be consciously passed on and grown.
  • The cost of power may change many times over this period: due to exchange-rate variation, Scottish independence, a changing balance of power with the nations we import energy from, and both the prevention and the effects of global warming. Technology which is power-aware, adaptable and enables power-use reduction should already be a growing sector, and should certainly grow faster.

Friday, 15 April 2016

Internet of Things and The Cloud

The Internet of Things (IoT) got some bad press as a concept a couple of weeks back, as a result of the plug being pulled on Revolv's Nest hub, e.g. Wired, BBC, The Guardian. And indeed, when I buy a "thing" I expect it to just work - and for some time. Thermostats, light switches, smoke alarms and kitchen appliances are all things that I expect to leave working when I move house in some years; lightbulbs less so, but I don't expect all of them to break at once because the light fitting or the wires have become obsolete. And that is the issue here: the controllers still work just fine - but the cloud service that they require to work is being switched off.

The added cost of being an early adopter probably comes with an expectation that the product may date faster than if I wait, or require more frequent fixes and upgrades - as is the case if you compare my FitBit to my old analogue watch. In any case, this story raises a wider question of what happens to IoT tech as the market evolves. As someone that's worked in "pervasive computing" and IoT for some time this is naturally of interest to me.

There are a number of potential solutions, and their accompanying issues, that have come past in news articles and on Twitter, including:

  • Use products with standards, assuming that the idea is mature enough to have a standard. A feature of innovation is that at least part of the idea is often ahead of the standards curve.
  • Provide a refund and ease people onto a new system, assuming they are willing to invest in that on the back of their previous investment being a dud.
  • Sell or open source the cloud software so others can keep it going to service the device. However, in most cases this software may contain some significant IP that the company probably wants to sell or reuse in a pivot of their idea. If open source was the right solution it would probably have been part of the picture well in advance of pulling the product.

So, what to do?

My suggestion is that the IoT is a good example of a system where "cloud" services supporting the devices would benefit from a different approach: 
It isn't the cloud, it's a cloud.
By which I mean: don't sell me a device that connects to some centralised cloud service - unless you're really small-scale and alpha-testing. Sell me something that brings up its own cloud service, maybe with a provider I choose, and that I get billed for. It can let my phone know where to look so they can talk - it will just be a URL, and the physical thing can bootstrap disseminating it.
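A minimal sketch of that bootstrap step, assuming a hypothetical discovery document (the field names and the `make_discovery_doc` / `resolve_service_url` helpers are my invention, not any real product's API):

```python
import json

def make_discovery_doc(device_id, service_url):
    """What the physical thing could serve on the local network:
    a tiny document whose only essential content is the URL of the
    per-customer cloud service (my cloud, not the vendor's)."""
    return json.dumps({
        "device": device_id,
        "api_version": 1,
        "service_url": service_url,
    })

def resolve_service_url(discovery_doc):
    """What the phone app does: read where to talk to."""
    return json.loads(discovery_doc)["service_url"]
```

The phone then talks to that URL directly, so the pairing keeps working even if the vendor's own servers go away.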

Maybe this is a systems architect's solution, and certainly feels obvious if you cast this as a distributed systems problem, but I can see several advantages:
  • If the vendor switches their systems off mine can keep going. If there's some advantage to them aggregating data from many customers that can happen by forwarding on from my cloud to theirs - and fail cleanly if the need changes. The crucial step being to eliminate centralised nodes in the control loop.
  • The home systems can point at a repo with appropriate security to manage centralised updates. There's no particular need to give me a login to the cloud service, so the secret sauce isn't much more exposed than it would be when centralised - especially given that one end of it is hardware in my possession. 
  • By putting my data on a system which I control I can be given greater and more plausible control over my privacy. My detailed data is visible to fewer people and a hacker has to gain access to many systems to gather large amounts of data.
  • By distributing the system over many little services scalability is easy and system failures affect fewer users.
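As a sketch of the first advantage, forwarding to the vendor's aggregate can be strictly best-effort, keeping their servers out of the control loop (the function and parameter names here are hypothetical):

```python
def record_reading(reading, store_locally, forward_to_vendor=None):
    """Store a reading in my own cloud service; forwarding a copy to the
    vendor's aggregate is optional and its failure never propagates."""
    store_locally(reading)  # the step the control loop depends on
    if forward_to_vendor is not None:
        try:
            forward_to_vendor(reading)
        except Exception:
            pass  # vendor switched off or changed: fail cleanly, carry on
```

If the vendor retires their service, the only change is that the forwarding call starts failing silently; my own system carries on.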

Wednesday, 20 January 2016

make-ing docker

I mentioned some time ago that I've been exploring Docker. I then changed my main job and the blog went rather quiet. I have a contract that gives me some time and IP to myself, and this is finally turning into a new project. Between personal projects and work I've spent a whole lot more time with Docker. So, a blog post, about building Docker images, that might be useful to someone...

A bit of background:
First, I'm a fan of make. I know, it shows my age. But it's really quite good at handling build tasks with a minimum of tricky requirements on the build system. It can also be turned to organising installations and running jobs, so keeping related issues in one version-controllable place. It is also in a format that's quite user-friendly, even when ssh-ed into a server and fixing stuff with vi. So, while I've also used Ant, Ansible and Maven to do some of these things, it remains a reliable standby.

One of the things I use Docker for is setting up groups of images which are related to each other: django; django with a different config; django with that config, but set up for running tests rather than running the server - and so on. Which leads to dependencies.

Most of the time when I use Docker, it's in a local environment, and I just build and run images on one machine - without using Docker Hub. I like having the tool chain version-controlled, and this approach fits.

Make is good at handling dependencies, but the relationship chains in Docker image definitions don't expose themselves in file names (unless you're very organised with naming schemes). The quick first approach is to state them with explicit pointers in the makefile as well as in the Dockerfile. But eventually that level of duplication will irk. So, with a spare couple of hours to polish my build scripts, I refactored the duplication out of the Makefile.

The key parts of the makefile are below - stripped of my builds to show the principle:

DIRS := $(shell find . -mindepth 1 -maxdepth 1 -type d)
DOCKERFILES := $(addsuffix /Dockerfile,$(DIRS))
IMAGES := $(subst /,,$(subst ./,,$(dir $(DOCKERFILES))))
FLAG_FILES := $(addprefix ., $(addsuffix .docker, $(IMAGES)))
PWD := $(shell pwd)

# Docker images can depend on each other.
# A changed base image ought to trigger a rebuild of its children.
define image_dep_search
@echo "checking dependencies of $1"
@for d in $(IMAGES); do \
 from=`grep FROM $$d/Dockerfile | cut -d ' ' -f 2`; \
 if [ $1 = $$from ]; then \
  echo "dependent image $$d"; \
  touch $$d; \
  make .$$d.docker; \
 fi; \
done
endef

all: images 

# Consider all docker image directories for building
images: $(FLAG_FILES)
 @echo "Done making images."

# Build images where the directory or contents have changed since flag last set
.%.docker: % %/* 
 $(eval IMAGE = $(subst .,,$(basename $@)))
 $(eval BASE = $(word 2,$(shell grep FROM $(addsuffix /Dockerfile,$(IMAGE)))))
 $(eval HAS_DEP = $(filter $(BASE),$(IMAGES)))
 @echo "building $(IMAGE)"
 @cd $(IMAGE) && docker build -t $(IMAGE) .
 @touch $@
 $(call image_dep_search,$(IMAGE))

# Utility make targets for creating containers from images 
.PHONY: run_java_bash
run_java_bash: .java_base.docker java_bash_container

.PHONY: java_bash_container
java_bash_container:
 docker run --rm -v=$(PWD)/..:/project -it --name java_bash java_base bash

.PHONY: clean
clean:
 @rm -f $(FLAG_FILES)

In order, this contains:
  1. Some definitions, which find directories that contain Dockerfiles. Files called ".<imgname>.docker" will be created to mark the latest build.
  2. A definition to use later, that finds the FROM line in each Dockerfile; extracts its argument; sees whether it is one of our images; and, if so, touches the directory to flag the need for a build and calls make on that image's flag file.
  3. The standard make stuff, to run a build on each image directory which has changes. Once built any dependent images are found and built using the routine defined above.
  4. A phony target to run the container, to illustrate the point. It double checks the image, in case we're forgetful about running "make all" first. This provides the project root (the parent of the docker directory) as a mounted volume - which may or may not be a good thing, depending on your use case.
  5. A clean target, that gets rid of the build flag files. 

There's a couple of assumptions here:
  1. A directory of docker images within the project. I usually call it "docker". This is the set of docker images that will be considered for dependencies. The rest are just assumed to exist. The makefile is in this docker directory.
  2. The FROM and the image name in the Dockerfile are separated by a space.
  3. Images are flat directories. I think that structure is probably better put elsewhere, and built in an archive format than copying lots of files one by one in the Docker build process - so this hasn't been an issue for me.
  4. If one image depends on having another already built then another makefile rule is needed to force the correct order. I'll update this when I've automated that as well. Simple "list of targets" rules may well be needed to build specific sub-sets anyway.
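For assumption 4, the hand-written ordering rule is just one extra prerequisite between the build-flag files. For example (image names hypothetical), if a django_test image needs a django_base image to exist before it can build:

```make
# Hypothetical: ensure django_base is built before django_test is attempted.
.django_test.docker: .django_base.docker
```

The rule only orders the two flag-file targets; make's normal dependency handling does the rest.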
And that's it: build the images the same way as I build the code, with a minimum of drag on my effort. Easy to call from Jenkins, easy to call from the command line.

Addendum: With a bit of tidying up, and some example Dockerfiles, this is now on GitHub at