Wed, Nov 9, 2016
Internet of Things (IoT) devices have come a long way over the last couple of decades. In 1996 I remember being in complete awe at turning on a light from my Pentium II-based desktop computer using X10, and being equally frustrated when I switched on the microwave and the same light turned off. It marvelled and disappointed in equal parts, and to top it off it wasn’t particularly cheap. Fast-forward a couple of decades and IoT is the new buzzword. Previously known as Ubiquitous Computing, Pervasive Computing, and a whole host of other terms, what it really means is that our ‘things’, our devices, and our environments are all getting smarter. In part this is because technology is, as promised by Mark Weiser and others, becoming more pervasive thanks to the low costs associated with microprocessors, especially from ARM. ARM has designed the technology shipped in a staggering 86 billion chips over the last 25 years, an amazing achievement that highlights how yesterday’s ‘dumb’ device has given birth to today’s technology-emblazoned offspring. But what does that mean today? Well, there is a growing concern in the industry that, when you peel back the marketing onion, what this really means is an interconnected mess of devices with little-to-no security and a potential nightmare for device management, control, and software updates.
Canonical have observed this problem for a long time. From being part of the early effort to bring a solid Linux distribution to ARM devices, to helping found Linaro to work on essential Linux/ARM projects, and most recently building Ubuntu Core for IoT devices (and beyond), these efforts to shore up defenses and bring about step-wise improvements for Linux devices really have improved the whole IoT offering today. Many companies have contributed so far and this is certainly not a one-man show; IoT, and Linux on IoT, is such a massive endeavour that the whole industry has to come together to agree on ways of working, software delivery mechanisms, device updatability, security mechanisms, and device management. The newly formed LITE group in Linaro, comprised of Canonical, ARM, Huawei, NXP, RDA, Red Hat, Spreadtrum, STMicroelectronics, Texas Instruments, and ZTE, has the lofty goal of fostering security and interoperability in this fragmented world, and personally I welcome initiatives such as this as long as together they drive the industry forward.
Last Thursday Canonical did just that with the release of Ubuntu Core 16. For those not familiar with the project, Ubuntu Core is an entirely new rendition of Ubuntu, stripped back and redesigned from the ground up with security and IoT at the forefront right from the very start. It uses a revolutionary new package format, snaps, to deliver self-contained software into a constrained environment, and builds upon great Linux projects such as systemd, seccomp, cgroups, Linux kernel enhancements, squashfs, and others to form a secure and extensible platform for the IoT. Ubuntu Core 16 is the latest in a line of releases that are already proven in the real world. Dell and others have been shipping gateway devices with previous versions of this OS for some time, and many more are using it for use cases such as digital signage, robotics, and more. The technology is mature and the developer story (creating and running software on your desktop that is deployable on your Ubuntu Core IoT device) is compelling. In short, the team has done tremendously well to move the world of IoT forward and together, with innovative device makers, can truly deliver on the promises that the IoT has made for some time.
Tue, Sep 13, 2016
If, like me, you have upgraded to the macOS Sierra beta you may find that Git is broken from the command line. The error I was seeing was:
xcrun: error: invalid active developer path (/Library/Developer/CommandLineTools), missing xcrun at: /Library/Developer/CommandLineTools/usr/bin/xcrun
It turns out that the developer tools are borked by the upgrade and need reinstalling. To do that you need to run:
xcode-select --install
After a short download and installation everything should be back to normal.
Fri, Jul 1, 2016
A bit belated but here is a short report of two events I ran over the last month or so: the Keynsham 10 Miler and the Cheddar Gorge Half Marathon.
Keynsham 10 Miler (10 miles, 16.1 km)
This event was held on Sunday 22nd May in, you guessed it, Keynsham. The weather was fine although it had been raining the previous days, which meant the largely undulating and off-road course was muddy. Billed as a “multi-terrain” event, the run took us down steep and slippery hills (I clearly wore the wrong footwear), over decrepit wooden stiles, through long grass, up grueling inclines, and over foot-destroying loose stones; and then you do it all again (it’s a two-lap course). It was great. I must admit that due to a shoulder injury I picked up running trails in Salt Lake City my fitness has completely plummeted, so this challenging off-road 10 miler took more out of me than I expected. Towards the 8 mile mark I was experiencing hamstring cramps and by the end my pace had slowed to a crawl. Despite all this I posted a respectable time and I will definitely be back again next year.
Cheddar Gorge Half Marathon (13.1 miles, 21.1km)
This was the second time I’d run in Cheddar this year, the first being the 10k event, and again the event was run to perfection. Tom and the team do an admirable job of ensuring the runners are registered quickly, drop their bags off easily, and are on the start line vaguely knowing where they are going. But the main actor in this event is the stunning scenery. Up on top of Cheddar Gorge you get a great view of the surroundings and a glimpse of the pain to come. I started my Garmin on the walk to the start line this time and my watch read 528ft climbed, more than you would see in most entire marathons, and this was before the run itself!
Again the run was off-road, challenging, and “undulating”, but this time it was also cold, wet, and extremely muddy. Coupled with the lack of training this was probably my most challenging run this year. I took it easy on the first lap but with a 500ft climb in the middle, everything after that, whether it be flat, decline, or incline, gave my legs a bashing. A disappointing time (I was aiming for sub 2hrs and missed it) but a great day so overall a success. I’ll be back again in a few weeks to complete the Cheddar Gorge challenge and look forward to running many more Relish Running events next year.
Fri, Jul 1, 2016
This week’s snippet is all about improving the snap developer experience.
Since the release of snapd 2.0.8 we have added one of the most useful tools for snap developers: snap try. This effectively mounts a folder containing an unpackaged snap at /snap/snapname as a writable directory, allowing quick iteration during the packaging process. No longer do you have to create a read-only squashfs snap and install it just to try out your latest changes, which speeds up the workflow tremendously. The process I use from the package directory is:
% snapcraft prime
% snap try prime
... test/hack ....
% snap remove <snap-name>
When you are happy with your testing you can create the real snap file with:
% snapcraft
Thu, Jun 16, 2016
Yesterday we announced the new home for everything snaps and Snapcraft, snapcraft.io, and at the same time made available the cross-distribution work that really does mean snaps can run on virtually any Linux distribution. Today we have enabled support for Debian, Arch, Gentoo, Fedora, and of course Ubuntu and all its flavours, and enabling more, including openSUSE, CentOS, Elementary, Mint, OpenWrt, and others, is in the works.
The announcement was met with a mostly positive response which, given that Linux packaging has been a problem for so many people for so long, is hardly surprising. This particular problem has prompted the community to create a few different initiatives, such as AppImage and OrbitalApps, each tackling it in different ways and each with its own merits and limitations. In my opinion (slightly biased, of course, but based on technical fact), snaps are the best solution for a complete cross-platform universal package format, whether that be for the Linux desktop, IoT and mobile devices, or the server. Snaps are surprisingly easy to use, encompass industry-leading security with AppArmor, seccomp filters, and more, all with the very familiar and popular Ubuntu development workflow. A special mention has to go to the tooling as well; Snapcraft is already pretty awesome, yet week after week new features are added to improve it further, and snapd, the tool that runs snaps in a confined environment, is seeing so much innovative, open development that these two are great examples of projects to get your teeth into if you are a developer, tester, or just want to dabble in an open source project.
There has been some doubt that snaps really could run on multiple distributions and, as part of the team who tested this extensively, I can definitely confirm this is the case. I played around with several native installations and VMs over the course of last week and the results were super positive. As the saying goes, a picture is worth a thousand words so I took a couple of screenshots. Enjoy!
Wed, Jun 1, 2016
There is a lot of buzz around snaps, the new packaging format created by Canonical to enable secure, transactional, and robust application updates, and rightly so. This new method of distributing applications is revolutionizing not only software on IoT devices, but on the desktop, server, and beyond. The software that actually runs snap applications is called snapd. Hosted on GitHub, snapd is written in Go and is actively developed by a core set of developers, but, like most projects at Canonical, we actively encourage as much community participation as possible. One of the core developers, Zygmunt, posted a great outline on how to make your first contribution to snapd and, taking up the challenge, I did exactly that.
Zygmunt’s instructions are pretty clear but I thought I would look at this from a new developer’s perspective, using a clean Ubuntu 16.04 install and a new GitHub account. What follows is a guide to setting everything up ready for your first contribution. As a side note, I have various machines I work on regularly, mainly my Lenovo laptop running Ubuntu, but also my MacBook Pro and an HP Windows all-in-one. This lets me look at the various platforms available to developers, all of which are valid options for software development, especially in the IoT world.
Installing Ubuntu 16.04
Installing Ubuntu is very straight-forward and, depending on the way you intend to run it, the instructions vary ever-so-slightly. There is a comprehensive guide to installing Ubuntu as the only OS available on the Ubuntu site which will get you up and running in no time. Remember to download Ubuntu 16.04. On the Mac and Windows platforms you will need to install Ubuntu as a virtual machine (VM) but again, this is straight-forward. I chose VirtualBox as it is free, feature-rich, and runs Ubuntu well; download it from the website. The Ubuntu community is great when it comes to sharing information and there are already many tutorials on how to get an Ubuntu VM working. Although a few years old now, the answer on AskUbuntu is still valid today and is worth reading after you have installed VirtualBox. I highly recommend you also install the Guest Additions ISO to enable functionality such as drag and drop between host and guest, and window resizing.
There are a few extra packages you will need out-of-the-box for snapd development so once logged in to your shiny new Ubuntu install you will need to check for, and install, updates:
sudo apt-get update
sudo apt-get upgrade
In addition to the updates you will need to install golang, and it is useful to install snapcraft-examples for snapping applications later:
sudo apt-get install git bzr golang snapcraft snapcraft-examples
So now you have Ubuntu on your development machine, what next?
As stated previously, all snapd development occurs on GitHub. To be able to contribute to the project you will need a GitHub account, so go ahead and create one. Fortunately GitHub’s documentation is easy to follow: sign up for a new account (a free account is just fine) and follow that up by setting up Git. We are going to be using SSH for cloning so generating SSH keys is also needed, but if this is a little too scary you can use the HTTPS method instead; just be aware that instructions further down, and in Zygmunt’s post, will need to be modified slightly, but I’ll point that out. For reference there is a good Git cheat sheet available from GitHub that is worth looking at.
Now we have a working Ubuntu install, Git and GitHub all set up, it is time to get developing.
Fortunately Zygmunt has this bit covered. He has a great tutorial on how to fork the snapd project on GitHub (just go to the project and click the Fork button in the top-right corner) and set up the Go environment variables ready for development. I highly recommend using the same directory structure as Zygmunt (mine is snapd) as this allows you to use his very useful devtools repository for helper functions, and make sure you run the get-deps.sh script in the snapd directory to ensure you have everything set up correctly. Make sure to also clone the devtools repository and have it ready somewhere in your development directory, outside of the snapd directory:
git clone git@github.com:zyga/devtools
The development workflow is particularly nice. You hack on the code, run the tests, refresh the snapd code with your changes, and test them manually. The refresh-bits script from devtools starts up a separate instance of snapd with your changes and allows you to install snaps and test your code without affecting the host system. On my system this looks something like this (expanded a little for clarity):
** hack, hack, hack **
** iterate until the tests are happy **
./refresh-bits snap snapd setup run-snapd restore
** open second terminal and test changes **
Well, that bit is easy: talk to the developers on IRC, sign up to the mailing list, go find a bug to fix, and most importantly, get involved!
Wed, May 18, 2016
Snapcraft is described as a “delightful packaging tool”, enabling a developer to package their app from a single tree by aggregating the pieces from multiple places if necessary. It supports multiple build technologies and produces a .snap package with all its dependencies for Ubuntu Core and now the Ubuntu Classic Desktop (using snapd). It is the future of packaging for Linux-based systems. I encourage the reader to read the documentation on GitHub to get a flavour of what Snapcraft is and to learn more about the key concepts, setting up your system to produce snaps, and a nice first snap walkthrough example. For this post I am going to introduce a couple of concepts that served me well when snapping the Electron-based application, Simplenote.
Simplenote is a cross-platform note-taking application that uses Simperium to sync notes across Android, iOS, Mac, and of course Ubuntu (and other Linux systems). It has support for instant search, tags, note sharing, backups, and best of all it is free. This makes Simplenote a great choice if, like me, you use several systems on a daily basis.
Snapping Simplenote is relatively straight-forward but as I walk through the process below there are a few concepts that could help others snap applications. The rest of this post assumes you are using an Ubuntu 16.04-based system (VM, PC) and have installed snapcraft as per the instructions on the GitHub page.
First we need to get the latest version of Simplenote from GitHub; at the time of writing this was 1.0.1.
tar xvzf Simplenote-linux-x64-1.0.1.tar.gz
Then we need to create our initial snapcraft.yaml file in the Simplenote directory to tell Snapcraft how to package the application. snapcraft init creates a barebones file ready for editing. For Simplenote you need the following (don’t worry about the contents just yet, I will point out the important bits soon):
name: simplenote
version: 1.0.1
summary: The simplest way to keep notes.
description: The simplest way to keep notes. Light, clean, and free.
plugs: [unity7, opengl, network]
The first 4 lines of the file are there to describe details about the package itself including the name, version number, and a plain text summary and description. After that we get on to the sections that describe how the application is executed and what features it needs from the underlying system.
command is the command to run on execution, and plugs tells the system that this application wants to use the unity7, opengl, and network interfaces - all mandatory for Simplenote. You can read more about interfaces in Zygmunt’s series of interface articles.
As a side note, the command entry is a little different for this application as Simplenote (and other Electron-based apps) need a few environment variables set up before the actual binary can be called. This is accomplished by creating a wrapper script that does this set up. The contents of the wrapper file can be seen later in this post.
Back to the snapcraft.yaml file. The parts section describes what Snapcraft needs to do when creating the .snap package. In this case we rely on the copy plugin which enables, you guessed it, copying of files from the host system into the snap package during packaging. The actual files to copy are listed in the files section. This copying is done to ensure that Simplenote has all the libraries and binary blobs necessary to execute once mounted within its snap-based confined area, running on a Snappy system. The format is:
One entry that sticks out a little is the resources line: the use of the globbing wildcard ‘*’ ensures that the whole resources directory is copied across.
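As a toy illustration of the glob semantics (using plain cp rather than Snapcraft itself, with made-up file names), the trailing wildcard copies everything beneath the directory:

```shell
# Create a throwaway resources directory with a couple of files,
# then copy its entire contents using a glob, much as the
# 'resources/*' entry does during packaging.
mkdir -p resources stage
touch resources/app.asar resources/icon.png
cp -r resources/* stage/
ls stage
```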
One section we skipped over is stage-packages. Snapcraft is a very clever tool and we benefit from its knowledge of Debian-based packages by stating in this section that we want to install gnome-themes-standard from the Ubuntu archive into the snap; again, this is a required package for Simplenote.
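Pulling these pieces together, the snapcraft.yaml ends up looking roughly like the sketch below. The exact file list is abridged and the part name and destination paths are my own assumptions, so treat it as a template rather than the definitive file:

```yaml
name: simplenote
version: 1.0.1
summary: The simplest way to keep notes.
description: The simplest way to keep notes. Light, clean, and free.

apps:
  simplenote:
    command: wrapper
    plugs: [unity7, opengl, network]

parts:
  simplenote:
    plugin: copy
    stage-packages:
      - gnome-themes-standard
    files:
      Simplenote: Simplenote
      wrapper: wrapper
      libnode.so: libnode.so
      libffmpeg.so: usr/lib/x86_64-linux-gnu/libffmpeg.so
      resources/*: resources/
```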
Another side note: to understand what libraries are required by a binary like Simplenote we can use the ldd command:
ldd ./Simplenote
This produces output that can be studied to understand where the binary looks for its dependencies. Simplenote comes with a couple of libraries embedded in the package, namely libnode.so and libffmpeg.so, and ldd shows these on my system as:
libnode.so => /home/jamie/snapping/simplenote/Simplenote-linux-x64/./libnode.so (0x00007fa6caa85000)
libffmpeg.so => /home/jamie/snapping/simplenote/Simplenote-linux-x64/./libffmpeg.so (0x00007fa6c5801000)
Notice that the entry after ‘=>’ points to local .so files in the Simplenote-linux-x64 directory; this means we need to copy these over into the snap, hence the copy entries in the snapcraft.yaml file. All other libraries are present on the host system and will be used automatically by Snapcraft.
The contents of the wrapper file discussed above are:
exec "$SNAP/Simplenote" "$@"
There is nothing too exciting about this file: we set up the fonts path to be inside the snap directory itself (remember, snaps are confined) as well as the share folder. We also set LD_LIBRARY_PATH to ensure that the snap looks for its libraries in the right location, namely $SNAP/usr/lib/x86_64-linux-gnu/. Again, looking at the snapcraft.yaml file you can see that we copy libffmpeg.so here to ensure Simplenote does not complain about missing libs. The last line executes the Simplenote binary.
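Based on that description, a plausible wrapper looks like the sketch below; the variable names chosen for the fonts and share paths are my assumptions (only the final exec line is from the original file):

```sh
#!/bin/sh
# Point fontconfig and shared-data lookups inside the confined snap
# (assumed variable names; adjust to whatever your app actually reads).
export FONTCONFIG_PATH="$SNAP/etc/fonts"
export XDG_DATA_DIRS="$SNAP/usr/share:$XDG_DATA_DIRS"
# Make the bundled libraries (e.g. libffmpeg.so) resolvable at load time.
export LD_LIBRARY_PATH="$SNAP/usr/lib/x86_64-linux-gnu:$LD_LIBRARY_PATH"
# Finally run the real binary, passing through any arguments.
exec "$SNAP/Simplenote" "$@"
```

$SNAP is set by snapd at runtime, so this script only makes sense when run inside the snap environment.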
The wrapper file needs to be executable so we do this using the chmod command:
chmod +x wrapper
Building and Installing
All that is left to do is build and install the snap:
snapcraft
sudo snap install simplenote_1.0.1_amd64.snap
This process will pull down any dependencies and create the squashfs-based .snap file. After this we should have a simplenote_1.0.1_amd64.snap file in the local directory ready to be installed with the snap install command. One caveat at the moment is that Simplenote will expect to use dbus, and with AppArmor confinement this is not possible with the application we just built. It is possible to get around this but I will leave that as an exercise for a later post.
If you try to run the application using simplenote at this point, you will see a policy violation error.
What we can do, which introduces a new concept nicely, is use --devmode. devmode allows the snap to break out of its confinement during development to quickly get up and running. From there you can look at what policy violations would potentially occur and adjust your application accordingly. When you are happy that your application is working in a confined environment you can simply install without devmode to test. To install Simplenote with devmode you can use:
sudo snap install --devmode simplenote_1.0.1_amd64.snap
Running simplenote now brings up the application.
It is not perfect, but that wasn’t the aim of this post. Instead we looked at packaging an application, copying files inside the snap using copy, and using a wrapper file to set up a snap environment. To conclude, let’s introduce another little snippet of information to help you debug your snapping process.
Debugging snaps using busybox
Snapping an application is usually pretty simple, but when you get stuck and just need to look inside the snap itself to see what is going on, there is a simple trick to allow you to do this. Adding busybox as an application to your snap gives you a shell environment right there in the snap. This allows you to poke and prod at directories, see if files you thought you copied over are present, and generally debug (you can also add gdb and other tools in the same way if necessary). To add busybox we would modify the snapcraft.yaml file above as follows:
summary: The simplest way to keep notes.
description: The simplest way to keep notes. Light, clean, and free.
plugs: [unity7, opengl, network]
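The busybox-related additions would look roughly like this (a sketch; the copy-based part fetching a busybox binary into the snap is my assumption about how the original was wired up):

```yaml
apps:
  simplenote:
    command: wrapper
    plugs: [unity7, opengl, network]
  busybox:
    command: busybox sh
    plugs: [unity7, opengl, network]

parts:
  busybox:
    plugin: copy
    files:
      busybox: busybox
```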
Notice the extra busybox-related lines. To access busybox once the snap is installed you can use:
simplenote.busybox sh
Important directories are:
$SNAP - the snap’s home directory, i.e. /snap/simplenote/id
$SNAP_DATA - the snap’s data directory in /var, i.e. /var/snap/simplenote/id
$SNAP_USER_DATA - the snap’s user data directory, i.e. /home/jamie/snap/simplenote/id
All of the above directories are on a Classic 16.04 Ubuntu system.
All code can be found at:
The next steps for this application are to sort out the fonts and menu spacing, add a .desktop file for dash discovery, get around the AppArmor violation, and upload to the store, but all of this is for another post.
Sun, Apr 24, 2016
Another week, another race; this time the Pensford 10k event, but let’s take a little step back first.
At the beginning of my ‘racing’ (very loose term for official events) calendar selection I had the aim of adding at least one, preferably two events per month to ensure that the pressure to line up against others kept me honest and provided the motivation to get my backside off the couch and training. When the event is close by and offers an “undulating, and seldom flat course” with a nice 50m and 100m steep climb in the middle, it is worth a try, and try I did.
Race day I lined up with nearly 200 other people, but amongst the normal pre-race banter I was hearing whispers from the more experienced runners: ‘it only gets going around the 5k mark, watch out’, and ‘the hill in the middle is a killer’. With this in mind, I moved myself from the 40min+ starting group to the 50min+ group all whilst I was wondering what I had gotten myself in to.
I set off at a slow but steady pace, trying to conserve energy for ‘the hills’ although at this point I was overtaking people in the faster group which had me a little worried. Despite a few hills the first 5k was fine, up, down, up, down, and up again but rather doable. Then the 5k marker appeared and right on cue, the hill that was the focus of many a conversation at the start of the race.
Now, a 100m climb sounds rather small, and on paper it is, but when you are trying for a good race time, with tired legs, a dislocated shoulder (did I mention that?) and enjoying the best the South West has to offer you in terms of windy, cold weather, the start of this tiny little hill was not welcome. I’d set my Garmin watch to alert me when I was slower than an 8:30 pace thinking me and the beep were never going to meet but unfortunately the slow beep was my unwelcome companion the whole time I climbed this section.
Despite a slow time (51:27) I really did enjoy it. I am confident I could knock a large chunk of time off that next time, in good health, and with next year being the 30th anniversary of the race, I think I will be back to prove that.
Next on the agenda, the Keynsham 10 miler but I am sorely tempted to add the Bath Ultra Marathon in September to the list, after all, I’m still looking for races.
Mon, Apr 11, 2016
This week I ran the 2016 inaugural Cheddar Gorge Challenge event, the 10k race. Billed as a ‘lumpy course’, this series of runs offers “more climbing just getting to the start than you will in most other events”. With the affectionately named Hell Steps towards the end this is a tough run but, more importantly, it is a fun event. Cheddar is beautiful, steeped in history, and picturesque from the bottom of the gorge let alone from running up and down it, so the prospect of completing 3 races (10k, half marathon, and marathon) in and around the area was too enticing to pass up.
The terrain, according to the website, is “steep in places and very steep everywhere else”, which sets the scene, but to be honest I arrived at the event fully trained yet expecting the worst. In reality there were hard sections, easy sections, a little doubt that I could actually complete the event around the 5k mark, and a ‘this is great’ moment around 7k. The distance is tiny compared to the training I have done but the combination of the race-day ‘too fast start pace’ and the lumpiness hit my tired legs hard. There was never a time that I truly wanted to give up but there were times when I thought about a brisk walk rather than running. Overall I am happy to say that even up the Hell Steps I broke into a run and completed the course in a not impressive, but rewarding, 58:28. I quietly wanted a sub-1hr finish but given the course I was a little sceptical about that; even so, though the wind was against us most of the race (how does it constantly blow directly at you regardless of your orientation?), I hit my goal.
So why would I put my body through this, I hear you ask. Well, this year I am doing something out of my comfort zone, something a little crazy (for me), and something that I hope will make a difference to others. This year I am running, a lot, and for a good reason. This year I am running to raise money for Macmillan Cancer Support. To read about my story please click the link, which is also where you can donate to this very worthy cause. I am putting myself through several challenges, including a marathon in the Himalayas of Nepal and a 45 mile ultra marathon, because there are people out there who just can’t, so let’s together make their lives a little easier during a really hard time.
So if you can, please donate.
Wed, Apr 6, 2016
A few weeks ago I joined Canonical and the eagle-eyed will realise this is actually the second time. Previously at Canonical I spent my days with the Mobile Team, realising the goal of a good Linux on ARM experience, which eventually culminated in the foundation of Linaro. This time I am equally excited about another formative stage of technology: IoT and the possibilities of interoperable and extensible devices running a standard Linux operating system.
I personally will be working directly on Snappy Core, the technology used to provide a stable, secure, transactional, and featureful platform for the internet of things (IoT) and beyond. I believe in the future of IoT, big data, and a world full of Mark Weiser’s vision of Ubiquitous Computing and I am excited to be part of it.
Sun, Feb 21, 2016
Nepal is such a wonderful place. Steeped in history and culture, with some of the most breath-taking sights to be seen, Nepal is the home of Buddhism, the Himalayas, and of course the mighty Mt Everest. Despite all this cultural wealth, the country and its people face a lot of challenges. Economically Nepal is considered a third-world country, with many people in utter poverty. To compound this hardship, Nepal has also experienced terrible earthquakes that have left many dead and even more without basic needs such as accommodation, access to food and clean water, and education. Last year’s earthquakes were devastating and the tremors continue to this day. This had me thinking: wouldn’t it be great if I could do something, no matter how small, to help out in some way?
Through my love of running I got to learn about the Impact series of marathons, and right there as the inaugural run was Nepal. Impact Marathons has lofty goals; in fact they are aiming to contribute to the 17 UN Global Goals through running: “Our runners will truly see the impact of their time, money and resources by visiting and meeting all of the projects they are supporting”. As part of the week-long trip we will be helping repair a school that was badly affected by the earthquake as well as undertaking other community projects. The actual marathon will be at the end of the week and promises to combine stunning views, up to 2300m of altitude, and 26.2 miles of running for a great cause. I am really looking forward to it.
This year I am hoping to complete several new challenges and up my running to ultra distances. At the moment, Nepal is my final planned race of the year but between now and then there is a lot of training to do.
Tue, Jan 19, 2016
I have a confession to make. While I have publicly supported the parkrun initiative for some time and I wholeheartedly believe it is a great idea, I have never actually run one myself. There have been many excuses, from family conflicts at the weekends to blaming the weather, but this week I decided to ignore all of that and partake in the spectacle. I chose the closest event, which for me was Southwick Country Park in Trowbridge, and arrived before the customary 9am start time. Nearly 300 people turned up to run the wet and muddy course and, despite not knowing anything about the logistics of parkrun (you pick up a finishing token at the end and have that scanned along with your personal parkrun barcode), I mingled into the crowd ready to run.
The course itself was great: 3 laps around a semi-gravelled path with some sections completely covered by 6 inches of rain water. On the first lap I tried to avoid the water but this only pushed you onto the slippery surrounding mud, so for laps 2 and 3 it became obvious that the best option was to just get your feet wet.
I did not manage any personal bests but that was not the point; the event itself was great fun and the volunteers were excellent. This may have been my first parkrun but it is definitely not my last, I will be there again this Saturday, 9am, ready to do it all again.
Wed, Dec 30, 2015
I’ve been using the Fujitsu ScanSnap 1300i on Mac OS X for some time now in the pursuit of a paperless life but now that I am using Ubuntu more and more it was apparent that I was going to have to get this little scanner working on Linux. Searching the internet threw up a few interesting articles but nothing worked 100%. In the end the steps I used under Ubuntu 15.10 were:
Install sane and gscan2pdf:
$ sudo apt-get install sane gscan2pdf
Download the scanner firmware from http://www.openfusion.net/public/files/1300i_0D12.nal and copy it to the relevant directory:
$ wget http://www.openfusion.net/public/files/1300i_0D12.nal && sudo mkdir /usr/share/sane/epjitsu && sudo cp 1300i_0D12.nal /usr/share/sane/epjitsu
Open the scanner lid first, then initialise the scanner with:
$ sudo scanimage -L
Run the gscan2pdf application:
$ gscan2pdf
You can tweak some of the scanner options by clicking the scan button and playing around with the pop-up box tabs. For me the most important options were selecting duplex, as I regularly scan double-sided documents, and selecting colour.
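For the command-line inclined, the same duplex/colour choices can also be passed straight to scanimage. The option names below are assumptions (they vary by SANE backend — check `scanimage --help` with the scanner attached), so this sketch only prints the command rather than executing it:

```shell
# Build the scanimage invocation for a 300dpi colour duplex scan.
# Option names are backend-dependent assumptions — verify with `scanimage --help`.
SCAN_CMD='scanimage --source "ADF Duplex" --mode Color --resolution 300 --batch=page-%03d.pnm'
echo "$SCAN_CMD"   # dry run: print rather than execute
```

The resulting .pnm pages can then be stitched into a PDF with gscan2pdf or a similar tool.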
Mon, Nov 9, 2015
Although most of the documentation out there today shows you how to run Ubuntu Snappy Core on an Ubuntu desktop, it is also pretty simple to do this on Mac OS X. In short:
Download the Ubuntu Snappy Core image from:
You will need the amd64 version of Snappy.
Unarchive the file:
Then convert the image into something that VirtualBox can run:
$ qemu-img convert -f raw -O vmdk ubuntu-15.04-snappy-amd64-generic.img ubuntu-15.04-snappy-amd64-generic.vmdk
At this point you want to create a new VM with VirtualBox. Make sure you create a Linux Ubuntu image, but when you get to the Hard drive section select “Use an existing virtual hard drive file”. Navigate to your .vmdk image and click Create. Now, when you start the VM you should be greeted (pretty quickly) with the login prompt from Snappy.
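The GUI steps above can also be scripted with VBoxManage. The sketch below only prints the commands (the VM name, memory size, and disk filename are illustrative placeholders — adjust them to match the .vmdk you produced) so you can review before running anything:

```shell
# Print (not run) the VBoxManage equivalent of the GUI steps.
# VM name, memory size, and disk filename are illustrative placeholders.
VM=snappy-core
DISK=ubuntu-15.04-snappy-amd64-generic.vmdk
cat <<EOF
VBoxManage createvm --name $VM --ostype Ubuntu_64 --register
VBoxManage modifyvm $VM --memory 512
VBoxManage storagectl $VM --name SATA --add sata
VBoxManage storageattach $VM --storagectl SATA --port 0 --device 0 --type hdd --medium $DISK
VBoxManage startvm $VM
EOF
```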
Fri, Nov 6, 2015
This is a short guide to installing Ubuntu Snappy Core on a Raspberry Pi 2 using a Mac. It is pretty straightforward but there are a couple of areas where you can get caught out.
First, download the Ubuntu Snappy image from:
As of writing the latest release was:
Insert your SD card if you haven’t done so already and use diskutil to find it.
$ diskutil list
Make sure you are confident that you know exactly which disk is your SD card before proceeding. The relevant part of my output was:
/dev/disk4 (external, physical):
#: TYPE NAME SIZE IDENTIFIER
0: FDisk_partition_scheme 31.9 GB disk4
1: Windows_FAT_32 Untitled 31.9 GB disk4s1
Unmount the disk with:
$ diskutil unmountDisk /dev/disk4
Then proceed to write your Ubuntu image to the card with:
$ unxz -c ubuntu-15.04-snappy-armhf-rpi2.img.xz | sudo dd of=/dev/rdisk4 bs=32m && sync
Notice the use of ‘r’ in front of the /dev/disk4 file: /dev/rdisk4 is the raw, unbuffered device, which makes the write considerably faster.
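If you would like to see the pipeline work before pointing it at a real disk, here is a harmless rehearsal against a scratch file (the file names are made up for the demonstration — the shape of the command is the same, only the output path differs):

```shell
# Rehearse the decompress-and-write pipeline against a scratch file
# instead of the SD card — same shape, zero risk.
set -e
tmp=$(mktemp -d)
printf 'pretend this is an image\n' > "$tmp/snappy.img"
xz -k "$tmp/snappy.img"                         # produces snappy.img.xz
unxz -c "$tmp/snappy.img.xz" | dd of="$tmp/card.img" bs=4096 2>/dev/null
cmp -s "$tmp/snappy.img" "$tmp/card.img" && echo "image written intact"
```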
That’s all there is to it. Pop the SD card into your Raspberry Pi 2 and start using Ubuntu Snappy Core.
Wed, Aug 19, 2015
I confess, I’m a bit of a gadget hound. I own four different smart watches all with different OSs:
- Pebble with PebbleOS
- Samsung Gear with Tizen
- Motorola 360 with Android Wear
- and now the Apple Watch with watchOS
When I first got the Pebble (Kickstarter model) I was instantly impressed. It was a device that lasted days, gave me notifications at a glance, and allowed me to keep my phone in my pocket unless it was really needed. The trouble is that I got bored pretty quickly with its lack of functionality, and the Samsung Gear looked enticing. With the ability to make calls, as well as everything the Pebble did and more, it seemed like a no-brainer. It originally came with a version of Android so buggy it made the device pretty much unusable. Much later this was ‘upgraded’ to Tizen which, to its credit, was better, but still limited. Enter Android Wear. In a blaze of publicity at Google I/O 2014 this wearable OS seemed perfect, so I went and purchased the Motorola 360, arguably the best looking device on the market. Unfortunately, this too was crippled: no sound (so no beeps, no notification noises), no ability to make and receive calls, no real way to get back to notifications once they were dismissed, and no compelling stock applications. This watch just felt like a device that vibrated every time something happened and was ignored at all other times. Android Wear just wasn’t compelling enough, so I always gravitated back to Garmin watches (Forerunner 620, Forerunner 920XT). Now there is the Apple Watch.
I’m still in the honeymoon stages with it at the moment but I have been wearing it exclusively for the past three weeks. It has proven to be a useful aid: the fitness app is a bit poor, and I’ll revisit that point in another blog post, but overall the experience of using it has been pleasant. I can read messages and email, make and receive calls, the calendar app is super useful, and I’ve found that I use it extensively for reminders. All in all it has been a success so far, but it is not without its problems. watchOS 2.0 promises to improve the device further and I am certainly looking forward to it, but for now the Apple Watch is the best smartwatch in an immature market.
Mon, Aug 10, 2015
As a scholar of software engineering with a particular interest in the field of ubiquitous computing and artificial intelligence, the recent series by AMC, “Humans”, really did pique my interest. Is it based on sensationalism, or is it something that could be considered grounded in reality? Well, I believe it is a drama that reflects more of the latter than the former. I really like the concept so far, and it raises questions that only academia had explored in detail before the movie studios fell in love with them; concepts such as artificial understanding, consciousness, love, and the projection of human traits onto non-human subjects (anthropomorphism).
Sure, the movie industry has toyed with a multitude of these concepts, with many dollars flowing in at the box office, but what Humans does is ground them in such mundane, run-of-the-mill reality that the boundaries blur, and it is this grey line that the whole programme plays with. What defines humanity, and what distinguishes it from an imposter? What is the point in humans learning when machines can do it much better? Perhaps more importantly, do the majority of people, the Joe Bloggs of the world, care about the gap between what is possible with strong AI and what is human; a question that I believe will be at the forefront of minds for the next 50 years.
In short, I really do like Humans so far.
Fri, Mar 27, 2015
Creating a bootable image for installing a Linux OS is pretty straightforward, but when you are doing this on the Mac there is a specific way it needs to be done. I always use USB drives for this purpose, so what follows are the steps needed to create a bootable USB stick from a Linux .iso image.
I presume you have already downloaded your favourite Linux distribution in .iso format; below I’m using Debian Jessie.
First, convert the .iso image into a .img image (note that hdiutil appends a .dmg extension to the output file, hence the .img.dmg name used later):
$ hdiutil convert -format UDRW -o debian-jessie-DI-rc1-amd64-netinst.img debian-jessie-DI-rc1-amd64-netinst.iso
You then need to find your USB drive.
$ diskutil list
Look for your USB device. I’ll use /dev/disk7 for this example. First make sure it is unmounted.
$ diskutil unmountDisk /dev/disk7
Then copy the image to the USB stick. CAUTION: this will overwrite anything that is already on the drive.
$ sudo dd if=debian-jessie-DI-rc1-amd64-netinst.img.dmg of=/dev/disk7
Safely eject the USB disk before using it for booting on your target device.
$ diskutil eject /dev/disk7
And there you have it, a bootable, Linux install USB drive.
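The steps above can be gathered into a single sketch function. The function name and arguments are mine, not from the original commands — double-check the disk number with `diskutil list` before calling it, since the dd step erases the target:

```shell
# Sketch: the whole ISO-to-USB procedure as one function (macOS only).
# Defined here but not executed; call it with your own ISO and disk.
make_boot_usb() {
  iso=$1; disk=$2
  img=${iso%.iso}.img
  hdiutil convert -format UDRW -o "$img" "$iso"   # hdiutil appends .dmg
  diskutil unmountDisk "$disk"
  sudo dd if="${img}.dmg" of="$disk"              # CAUTION: erases $disk
  diskutil eject "$disk"
}
# Example: make_boot_usb debian-jessie-DI-rc1-amd64-netinst.iso /dev/disk7
```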
Sun, Mar 15, 2015
Continuing on from my post about TrustZone, it seems that there is a lot of interest in hardware-backed security for Android and what you can do with it. One of the most interesting things that a hardware-isolated area can do for devices, whether that be a dedicated co-processor or technology such as TrustZone, is to provide a trusted environment dedicated to protecting your most valuable assets and the operations that are performed on them. Installing something like a micro operating system in this divide can give you a lot of features that the main OS just cannot gain access to, and this is the thrust of standards bodies such as GlobalPlatform. This micro OS, or to use the popular parlance, a Trusted Execution Environment (TEE), is becoming more important in a world of one-click / swipe / wave-a-device payments and device authorisation, and over the coming years it will see a surge in popularity, not only from independent vendors but from the large OS vendors too. But let’s take a step back.
The concept of a Trusted Execution Environment is to provide a secure area of the main processor, memory, and peripherals that can be used to perform privileged operations. First defined by the Open Mobile Terminal Platform (OMTP) forum in their Advanced Trusted Environment: OMTP TR1 standard and later adopted by GlobalPlatform in their standardisation effort, the TEE has become a bridge between pure software security mechanisms and hardware-only solutions. The TEE uses the isolation that technologies such as TrustZone provide to execute in the processor’s Secure World mode.
The TEE can be a fully-functional operating system offering software developers the opportunity to create Trusted Applications: applications that reside in the Secure World and perform security-critical functions outside of the control of the main operating system running in the Normal World. An example of such a Trusted Application can be a Trusted User Interface (TUI) - a display that is presented to the user completely protected by the Secure World and inaccessible to the main operating system such as Android. The interface could display sensitive information such as passwords and be confident that attacks such as screen scraping or video buffer capture would not reveal anything.
It is clear that the popularity of TEEs is increasing. Based on one commercial TEE vendor’s press releases, the adoption rate of the Trustonic TEE is reported to be over 100m devices every 6 months (source: http://www.trustonic.com - figures from February 2014 to July 2014), although widespread utilisation by third-party developers has yet to materialise. Ekberg et al. attribute this to a lack of access to the TEE, stating that “Despite TEE’s large-scale deployment, there’s been no widely available means for application developers to benefit from its functionality as mobile device manufacturers have restricted TEE access to their internal use cases.”, but also admit that standardisation could potentially solve this issue. Recent announcements by companies such as Linaro point to a more open access model, but we are yet to see commercial devices with OP-TEE technology.
In short, TEEs are here to stay and I expect that the likes of Apple and Android will open up access to this trusted area for more developers to enhance the security of their applications in the near future.
Thu, Feb 26, 2015
I have recently been reading the book Talk Like TED by Carmine Gallo, which promises to bestow the virtues of great public speaking upon all who read it. Early on in the book there is a rather salient point that got me thinking, a point that starts with a simple question: “What are you passionate about?”. Now there are quite a few things I am passionate about, but in the context of Software Engineering, my chosen career path, there is one thing that underpins all the great projects I have really enjoyed working on over the years. What is it? Data.
I am passionate about data, specifically the conclusions you can draw from it. This is not to say the actual gathering of data, although that can be quite interesting in itself: constructing tools and processes as you squirrel away the nuts of information that together paint a picture no individual data point can reveal alone. I am more passionate about the ‘Where’s Wally’ dance: the finding of that little something you’ve been looking for in a sea of noise, the epiphany, the moment, the unveiling. The answer to the puzzle is something you intrinsically know is just outside your grasp, and with the data, that collection of measurements and information, the answer will magically appear. The puzzle is made up of a thousand pieces, and only by putting them all together does it become clear. That is what I’m passionate about. I guess my career has always followed that route of problem solving.
Software Engineering is a great field to be in if you enjoy problem solving: you get to create a solution from parts constructed with only your imagination, a programming language, and your favourite text editor. In my experience, the first solution you produce is often not quite what you were looking for, and the itch remains. You continue to iterate, introduce bugs, fix bugs, thinking of new and novel ways to answer your initial questions, until finally you have something that not only works, it satisfies that itch. When you employ this process to scratch a larger itch, a higher-level, more abstract problem that requires the gathering and analysis of data, I find there is satisfaction from the initial problem solving during development, plus the benefit of discovering that pattern or snippet of information that maybe you only suspected was there before but now is proven with the data. Maybe this explains why I have an affinity with Pervasive Computing and its latest incarnation as a buzzword, the Internet of Things (IoT). The topic of data inference, that is what I really enjoy.
I’ve gathered much data over the years: email archives and usage data, energy monitoring and the subsequent discovery of inefficient appliances, health data with Fitbit and Garmin, lifestyle monitoring with Slogger; it can all be combined to do wonderful things. But there is a tendency to gather data just for the sake of it, and I have certainly been guilty of that, so I am starting to take a step back and trust the data more - to make informed decisions based upon it - so let’s see how that goes this year. Big data is definitely here, but the more important question everyone should be asking is “What do we do with all that data and how can it benefit humanity?”.
Sun, Feb 22, 2015
Recently I was asked to provide a quick, high-level introduction to TrustZone and how it could potentially improve the security of Android platforms. Any response to this is tricky: TrustZone is just a mechanism built in to a platform that, if unused, can do very little for device security, but when utilised to its fullest it can create a totally separate environment dedicated to protecting your most important secrets. But first a bit of background.
According to Bloomberg, ARM’s chip designs are found in 99% of the world’s smartphones and tablets; 2013 alone saw ARM’s partners ship over 10 billion chips (source: ARM Strategic Report 2013). Popular devices such as the Apple iPhone and iPad, Amazon’s Kindle, and Samsung’s flagship Galaxy series all use a Central Processing Unit (CPU) based on an ARM design. In 2004 ARM released its design for a hardware-enforced parallel execution environment for the PB1176 and ARMv7 architectures, a design that was adopted into all later application processor designs.
TrustZone itself is an implementation of device-level security utilising extensions to the CPU and the Advanced Microcontroller Bus Architecture (AMBA), or memory bus. By connecting all these components together in a homogeneous architecture it is possible to construct two distinct ‘worlds’: a “Secure World” and a “Non-Secure World” (or “Normal World”). The two modes are orthogonal to each other, with the Secure World enjoying full access to all memory regions and privileged CPU areas whereas the Normal World can be restricted. This arrangement is configured during the boot process. The interface between the two worlds is governed by a special Secure Monitor Mode, accessible via an interrupt instigated with the Secure Monitor Call (SMC) instruction. Identifying which world the processor is currently executing in is possible through an extra ‘flag’ known as the NS, or Non-Secure, bit. All components that wish to use the functionality provided by TrustZone must be aware of this flag.
With TrustZone it is possible to isolate an area of the CPU, memory, and peripherals for use by a trusted software component called a Trusted Execution Environment (TEE) or other such privileged software. For example, Android’s implementation of the core cryptographic keystore functionality, KeyChain, can use hardware components such as TrustZone, the SIM card, or a Trusted Platform Module (TPM) to enhance overall security. By using TrustZone a device can provide secure software functionality, backed up by the hardware it is running on.
It is clear that with more widespread use TrustZone could benefit an increasingly mobile society who expect to do the most secure of operations with their devices.
Wed, Feb 18, 2015
It’s been a while; in fact it has been around a year since I updated this site (to be fair I did write a few posts on another blog during that period … excuses, excuses), which I attribute partly to an increasingly busy schedule but more to a lack of enthusiasm. So, in an attempt to get back into this blogging lark I thought it would be a good opportunity to redesign the site with Hugo, a static, but more importantly Markdown-based, web engine, and put up a few articles on something dear to my heart, Software Engineering. So expect more development-related posts interspersed with running, triathlon, travel, and other randomness as I attempt to do this on a semi-regular basis.
Oh, and if you are looking for any of my past entries from 2007 onwards, they will be back up shortly as I figure out how to convert WordPress content to Hugo while still keeping some form of resemblance to the original posts.