Getting a recent version of Mono running on BeagleBone Black

by danny 10. February 2015 22:11

So I’m currently working on a project to prototype replacing an expensive industrial PLC board with a much cheaper and, at the same time, more capable system based on the BeagleBone Black.  Once upon a time I did a lot of work on Linux, but I’ve been a Windows (and especially .Net) guy for quite a while.  So the first thing I wanted to try was to get Mono working so that I can write as much of the logic for this system as possible in a more familiar and productive environment.  Here are a few notes to make sure I remember what was involved.

Step 1: Update the BeagleBone to a very recent Debian image.

As of this writing, the latest official image for the BeagleBone is from May 2014, which is, like, ancient in Internet/Linux time.  The good news, though, is that according to threads in the Debian section of the BeagleBone forums, the current test release from 2/1/15 on the BeagleBone Black Debian page is very nearly the final build for the new stable image.  It’s rc3 or something along those lines.  So I replaced the OS on my BeagleBone with a copy of the console version of the new image.

NOTE: I picked the console version because I don’t intend to use the board as a PC with direct video, keyboard and mouse but rather as a controller for an embedded system which I will configure through ssh, so I thought I’d start with a thinner/lighter image and add the packages I need.  You could choose the LXDE version, which has more packages preinstalled, and I’m sure that would work just as well.

  • Download the image to my Windows box from the BeagleBone Black Debian page linked above.
  • Uncompress the image using 7-Zip (which conveniently understands this unix-centric compression).
  • Write the image to a micro SD card using Win32DiskImager.  Happily, my laptop has an SD-card writer built in, so I just had to pick up a micro SD card with an adapter that allows it to work in my full-size SD drive; an 8GB card was only like $10 at my local drugstore when I went out to grab one the other night.
  • Power down the BeagleBone nicely, either with ‘shutdown -h 0’ or by pressing the power button once and waiting.  Apparently this matters on the BeagleBone, which seems to have real problems if you just pull the power: not only because the Linux filesystem may be left in an unhappy state, but also because pulling the power doesn’t turn off all the components in a clean order.
  • Pop in the micro SD card and hold down the button on top of the board right over the micro SD card slot while powering the board back on.
  • Wait until the four status LEDs stop flashing and instead are all lit steady.  Then power off the board, pop out the micro SD card and boot it back up.
  • Log in and verify the new version is installed with ‘uname -a’:

Linux beaglebone 3.8.13-bone70 #1 SMP Fri Jan 23 02:15:42 UTC 2015 armv7l GNU/Linux
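For what it’s worth, if you’re on a Linux box instead of Windows, the uncompress-and-write steps above collapse to a couple of commands (the image filename below is a stand-in, and the device name is just an example: triple-check it with lsblk before writing, since dd will happily overwrite the wrong disk):

```shell
# Stand-in names: substitute the actual downloaded image and your SD card device.
IMG=bone-debian-console.img        # hypothetical filename
DEV=/dev/sdX                       # example device; confirm with lsblk first!

# The images ship xz-compressed; decompress, keeping the original archive
xz --decompress --keep "${IMG}.xz"

# Write the raw image to the card, then flush buffers before removing it
sudo dd if="$IMG" of="$DEV" bs=4M
sync
```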

Step 2: Add the official Mono packages from Xamarin

This is actually the step that drove me crazy for quite some time.  I kept finding websites where someone mentioned how they got Mono working on their BeagleBone by using the software floating point system (which is reportedly significantly slower and may require special patches, etc.), by going to some other variant of Linux, or by building Mono themselves (which takes a lot of storage space and a LONG time to build on the board directly, or might be done faster through a complex cross-compiling setup).  All of those options either weren’t working for me or just seemed more complex than should really be required.  So finally I decided to just go to the official Mono Project site and see what I could find out.

On the official website they have installation instructions for Linux with a special section for Debian, Ubuntu and derivatives.  You can follow their instructions; they worked perfectly for me.  The high-level overview is that you first run an apt-key command to add a signing key for the Mono Project, then you add the Mono Project repository to the list of apt-get sources, and then ‘apt-get install mono-complete’ just gets all the stuff.
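For my own future reference, those three steps boil down to something like the following (the key ID and the ‘wheezy’ repo line are the ones the Mono site published around early 2015; check the current instructions rather than trusting these verbatim):

```shell
# 1. Add the Mono Project signing key (key ID as published in early 2015)
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 \
     --recv-keys 3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF

# 2. Register the Mono repository as an apt package source
echo "deb http://download.mono-project.com/repo/debian wheezy main" | \
     sudo tee /etc/apt/sources.list.d/mono-xamarin.list

# 3. Refresh the package lists and install everything
sudo apt-get update
sudo apt-get install mono-complete
```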

Result: WORKS!

So I haven’t yet had a chance to test in detail, but so far I was able to compile a quick hello, world and run it with no problems.
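The smoke test itself was nothing fancy; something along these lines (file and class names are mine, and the heredoc is only there to keep it to one step):

```shell
# Write a minimal C# program to disk
cat > hello.cs <<'EOF'
class Hello
{
    static void Main()
    {
        System.Console.WriteLine("hello, world");
    }
}
EOF

# Compile with the Mono C# compiler and run the result under the Mono runtime
# (guarded so the snippet is a no-op on machines without Mono installed)
if command -v mcs >/dev/null 2>&1; then
    mcs hello.cs      # produces hello.exe
    mono hello.exe    # prints: hello, world
fi
```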

Look out world, soon my c# code will be able to do more than move bits around my PC—it will start controlling things in the physical world.  Bwahahaha!  Bwahahaha!



Super-productive .Net Tools: NCrunch and Team Foundation SERVICE

by danny 7. May 2013 15:27

This post is just a quick note to point folks to two things that I have found make me much more productive building .Net apps.  There are lots of things that could be named in this camp including ReSharper and PowerShell and what have you.   (Scott Hanselman, by the way, maintains a whole monster list.)  There are two, though, that have had a surprisingly profound impact on me and my team.


If you are serious about building unit tests for your code (whether you religiously write them test-first or not), a phenomenon you will encounter pretty quickly on any decent-sized project is that you have a LOT of tests which you want to run regularly.

In addition, I often find that I have a mix of true unit tests and others which are really integration tests.  The biggest difference between these two from a practical standpoint of interacting with them is that my unit tests are very safe (no side effects) and automatic (no pre-run setup needed) while my integration tests often need me to fire up multiple programs and do a little setup before running them. 

Both kinds of tests are super useful, and when I’m working in a particular area it’s usually no big deal to run the relevant tests.  ReSharper makes that easy: just click on the icon in the left margin of the test code.  Running tests in other parts of the code base, however, takes more effort, mostly because it takes more time.  You have to build everything, and then you either have to think carefully about which tests need to run or run everything and wait for all the tests to complete.  The result of this extra friction is that it often doesn’t get done.  Of course my CI build (which I’ll talk more about below) will catch errors after I check in, but every moment bugs live in my code increases the cost/pain associated with fixing them.  Not only does the cost go up if I check in and then have to come back to the bug later (after someone else may have had to deal with it as well), but it also goes up even when I just context-switch away from the code I just wrote which caused the bug.  Ideally I would see test failures caused by code changes immediately upon making those changes.

That’s what NCrunch does.  It’s amazing.  Basically it constantly monitors the changes you make in Visual Studio and intelligently builds and runs tests as needed based on the code you change—even before you save.  It then quietly updates an icon in the lower right of the screen in the background indicating whether there are problems, and maintains a window listing projects that don’t build or tests that aren’t passing.  It also shows a little dot in the left margin of each line indicating whether the line is covered by unit tests and, if so, whether those tests are passing.  So you immediately get feedback which helps you see and fix any failing tests right away.  Awesome!

Visual Studio Team Service

The other tool which has had a major impact on my team lately is TFS in the cloud.  Especially for small teams like mine, where the service is free, this has been huge.  With a few clicks I had set up a source control server which is accessible from anywhere on the Internet, includes support for bugs/tasks/planning integrated into VS2012, automatically sends email notifications to team members on checkins and bug changes, and even does continuous integration builds and runs tests whenever a checkin is made.

In addition, the new support for code reviews in VS2012 works beautifully with this system.  My team spent nearly a year using Mercurial with Bitbucket, which wasn’t bad, but the switch to TFS in the cloud was transformative.  Before the switch, code reviews had so much friction that they just didn’t get done regularly; now every checkin gets reviewed.  Previously we had unit tests but no CI server configured to guarantee they were run on each checkin.  And we just hadn’t been able to find a task/bug tracking system that worked for us; now we have a first-class system integrated with VS, and we’re using it.

Both of these tools are awesome.  I suggest you go give them a try right now.



Bootable USBs Good!

by danny 29. April 2013 14:52

I’ve been meaning to start posting random helpful facts to my blog for a while both as a repository for myself and in the off chance that someone else will find them and get some value.  So here’s the first one.

For various reasons lately I’ve found myself needing to boot from a USB drive in order to do a clean install, run a RAM checker (I was having some unhappy hardware issues which turned out to be an SSD firmware issue) or do a firmware update on something.  Being able to quickly and easily create a bootable USB for anything where I have an ISO is super handy: much faster than burning a CD, and I always have my USB swiss-army knife in my pocket.

In the past doing this could be a pain because of the need to track down one or more utilities and go through a bunch of manual steps.  It seems, though, that the Linux community has nailed this (probably a while ago and I just didn’t know).  The answer is the Universal USB Installer.

You can either use it to download and create a bootable USB for various Linux distributions, or to create a bootable USB from any ISO you have (like the one used to update the firmware on my Crucial M4 SSD).



the next adventure

by danny 2. March 2013 11:39

Just a short note to say that I've moved on to my next adventure--working from home full-time.  The year I spent at INRIX was great, but because of family issues I've had to make another job change so that I can be at home full-time.  Happily, my brother's start-up business has progressed to the point where we're ready to dive in and try to make it go with me on-board full-time, and that means I can work from home.

So I now seem to have come full-circle from running a very small company years ago in Idaho, to the behemoth that is Microsoft, then down to a mid-sized company at INRIX and now back to a tiny start-up.

I make no promises about frequency of posts to this blog, but it seemed important to at least note the major transition, and maybe I will have time to make this a forum for posting lessons learned in my new role...

- Danny



Goodbye microsoft, hello world.

by danny 31. January 2012 22:49

Today I sent a resignation email to my manager at Microsoft, thereby setting in motion the process to leave the company where I have worked for very nearly 15 years.  It has been a great experience, and I have learned a ton, but I have to say that I can't wait for my next adventure.

The last couple of years have been pretty rough for me at work: occasional days where I felt productive and was having fun mixed in with a whole lot of days where I didn't much want to be at work, instead of the other way around.  I really thought I would stay at Microsoft until I retired.  But late last year, when God finally got it through my head that He wanted something else for me, I began to get the picture: it took a really rough year on the data team for Him to finally pry me away from there (working on the EF was great, but by the time I left, it was really time for me to go), and then it took another really rough year on an entirely different team for me to finally see that Microsoft was no longer the place for me.  Praise God for His patience with me!

What am I doing next, you ask?

Well, first I'm spending about a month working with my brother and nephew to help them get a new software company off the ground.  He has come up with a great idea for a niche product related to small business accounting that not only is positioned well for current success but also presents lots of opportunities to grow and diversify over time.  We have a short window, but we're going to dive in and try to get the core capabilities together and set the architectural direction.

Then, on the last day of February I will start a new job as a dev lead at Inrix.

Happily, both Inrix and my brother's company will be using some of my favorite development tools and languages: Visual Studio, .NET and C#.  I have no idea what my time will be like, but I hope to share here some of my journey and the technical things I discover along the way.

- Danny

