Posted 3 months ago
Meeting Minutes
IRC Log of the meeting.


20150106 Meeting Agenda

Release Metrics and Incoming Bugs
Release metrics and incoming bug data can be reviewed at the following link:

Status: Vivid Development Kernel
Both the master and master-next branches of our Vivid kernel have been
rebased to the v3.18.1 upstream stable kernel. We have also uploaded
our first 3.18 based kernel to the archive (3.18.0-8.9). Please test and let us
know your results. We are also starting to track the v3.19 kernel on
our unstable branch.
Important upcoming dates:
Fri Jan 9 – 14.04.2 Kernel Freeze (~3 days away)
Thurs Jan 22 – Vivid Alpha 2 (~2 weeks away)
Thurs Feb 5 – 14.04.2 Point Release (~4 weeks away)
Thurs Feb 26 – Beta 1 Freeze (~7 weeks away)

Status: CVEs
The current CVE status can be reviewed at the following link:

Status: Stable, Security, and Bugfix Kernel Updates – Utopic/Trusty/Precise/Lucid
Status for the main kernels, as of today:

Lucid – Verification & Testing

Precise – Verification & Testing

Trusty – Verification & Testing

Utopic – Verification & Testing

Details of currently open tracking bugs:

For SRUs, the SRU report is a good source of information:


cycle: 12-Dec through 10-Jan
12-Dec Last day for kernel commits for this cycle
14-Dec – 20-Dec Kernel prep week.
21-Dec – 10-Jan Bug verification; Regression testing; Release

Open Discussion or Questions? Raise your hand to be recognized
No open discussion.
Posted 3 months ago
Whenever a user downloads Ubuntu from our website, they are asked if they would like to make a donation, and if so how they want their money used. When the “Community” option is chosen, that money is made available to members of our community to use in ways that they feel will benefit Ubuntu.

You can read the report here.

We pretty consistently spend less than we get in each quarter, which means we have money sitting around that could be used by the community. If you want to travel to an event, would like us to sponsor an event, need hardware for development or testing, or anything else that you feel will make Ubuntu the project and the community better, please go and fill out the request form.

Published on behalf of Michael Hall from the Community Team
Posted 3 months ago
Ubuntu Make 0.4 has just been released and brings Go support and a new game category!

To hack using Go under Ubuntu, just open a terminal and type:

umake go

and here we "go"! This will let developers always install the latest Google Go (golang) version and set up the needed environment variables for you.

We have also started thinking about game developers. Putting our code where our mouth is, we are pleased to inaugurate a new "games" section, bringing Stencyl, an amazingly quick and easy way to make games for multiple platforms!

umake games stencyl

and you will be ready to get creative and build the next top-selling game!

Ubuntu Make 0.4 is already available in Vivid, and through its PPA for the Ubuntu 14.04 LTS and 14.10 releases.

If you have any ideas (like a favorite IDE for Go!) or other game platforms to suggest, our issue tracker is open!

In other news, the name migration is now over: the GitHub repository has been moved under the ubuntu namespace and is now available here, waiting eagerly for your contributions!
Posted 3 months ago
Whenever a user downloads Ubuntu from our website, they are asked if they would like to make a donation, and if so how they want their money used. When the “Community” option is chosen, that money is made available to members of our community to use in ways that they feel will benefit Ubuntu.

I’m a little late getting this report published, but it’s finally done. You can read the report here:

We pretty consistently spend less than we get in each quarter, which means we have money sitting around that could be used by the community. If you want to travel to an event, would like us to sponsor an event, need hardware for development or testing, or anything else that you feel will make Ubuntu the project and the community better, please go and fill out the request form.

Posted 3 months ago
Security research has an interesting culture, and it's not entirely unlike devops. Security researchers are typically kept as a separate entity within the organization and left to do their work supporting the other areas of the business. Pen testing reports are published and work items are assigned - but what if we could take that a step further and make infosec a real part of the process?

I've been thinking about this subject quite a bit recently, and it has evolved from the original concept of leveraging devops toolchains in infosec research. Pen testers typically have a set routine of vulnerabilities to check, and this tedious process has been automated by a few toolkits like Metasploit. The original idea was to take toolkits like Metasploit and develop a suite of routines, using the Juju relationship system, to test common vulnerabilities, such as:

Framework CVEs
Known Rails Injection tactics
Session Hijacking leveraging peering hosts
SQL Injections of User Input Fields leveraging kits like splinter or selenium

to name a few examples.

This has great implications for time reduction and the reusability of existing code and practices, and it can become part of the deployment pipeline for testing known security vulnerabilities - freeing up pen testers to resume research on new and interesting ways to defeat security practices.

With technologies like Docker reducing the overall attack surface, and the rise of containerized computing, this leads me to the next step in the evolution of this thought process.

The Pitch

What if we made full container pentesting part of the modern deployment pipeline?

The proposed Workflow:

Developer commits Code to Repository
CI pulls code and runs unit tests
CI builds a containerized deployment artifact
Integration tests are run against the container(s) in a testing environment
Automated Pentests are then run against the container artifacts

Relationships establish the constraints of the test

If HTTP interface/relation is joined - kicks off a suite of browser automation attacks

If all test suites pass, container is then delivered to target environment
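A minimal sketch of that gating logic follows. The stage names and the `run_pipeline` helper are invented for illustration; a real setup would shell out to the CI system, the integration harness, and the pentest toolkit at each step.

```python
# Hypothetical sketch of the gated pipeline described above.
# Each stage is a callable returning True on success; a container
# artifact only reaches the target environment if every gate passes,
# including the automated pentest stage.

def run_pipeline(stages):
    """Run stages in order; stop at the first failure."""
    for name, stage in stages:
        if not stage():
            print(f"pipeline blocked at: {name}")
            return False
    print("all gates passed; delivering container to target environment")
    return True

# Stand-in stages -- each lambda is a placeholder for a real tool.
stages = [
    ("unit tests", lambda: True),
    ("build container artifact", lambda: True),
    ("integration tests", lambda: True),
    ("automated pentests", lambda: True),
]

run_pipeline(stages)
```

The key property is that the pentest stage sits on the same footing as unit and integration tests: a failed attack suite blocks delivery just like a failed assertion.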

The use of containers ensures the testing surface reflects an accurate representation of the production environment. After all, most container hosts are extremely lightweight installations, to support maximum-density deployment of applications. The fact that container contexts in the Docker ethos are immutable also allows us to tightly control the scope, versions, and attack surface under load. The artifacts from a failed pen test would also allow post-mortem metrics to be collected against the test. If this were all built with a Juju bundle, a developer or infosec specialist could then pull the bundle, redeploy on their laptop, and get a full 360-degree view into the failure in order to make the required adjustment.

Imagine the scenario that you are a Rails shop, running a long-standing Rails 3 application. A developer adds a gem that introduces a large-scale SQL injection bug with the possibility of a user dropping all tables in your database. These are the kinds of situations your organization wants to catch in testing rather than in the wild. Automated pen-testing that crawls a website, inspects every input, attempts to force-submit SQL injections, and can then validate that database entries were altered in unintended ways would be invaluable.
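As a toy illustration of the core check: the payload list, the lookup functions, and the in-memory database below are all invented for this sketch; a real harness would drive the live application through splinter or selenium rather than calling query functions directly.

```python
import sqlite3

# Classic injection payloads a crawler might force-submit into inputs.
PAYLOADS = ["' OR '1'='1", "'; DROP TABLE users; --"]

def vulnerable_lookup(conn, name):
    # Unsafe: user input is interpolated straight into the SQL string.
    return conn.execute(
        "SELECT id FROM users WHERE name = '%s'" % name).fetchall()

def safe_lookup(conn, name):
    # Safe: parameterized query, input never becomes SQL syntax.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (name,)).fetchall()

def is_injectable(lookup, conn):
    """Flag the endpoint if a payload returns rows it should not."""
    for payload in PAYLOADS:
        try:
            if lookup(conn, payload):
                return True
        except Exception:
            # A SQL error triggered by a payload is also a red flag.
            return True
    return False

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

print("vulnerable endpoint flagged:", is_injectable(vulnerable_lookup, conn))
print("safe endpoint flagged:", is_injectable(safe_lookup, conn))
```

The `' OR '1'='1` payload makes the interpolated query match every row, so the vulnerable endpoint is flagged while the parameterized one is not - exactly the signal an automated stage in the pipeline would gate on.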

While this won't prevent your users from choosing poorly formed passwords - often fewer than 8 characters and/or single-word variants in plain English - it can prevent the next wave of catastrophe from striking your app.

By leveraging technologies like Docker to warehouse, build, and ship your app, then placing Juju orchestration on top of this stack, you can automate away several layers of pain with charms - "servers attacking servers," so to speak. Making this process automated ensures repeatability and sustainability of the work your infosec team does on a routine basis, and makes the process more approachable, digestible, and understandable to everyone involved, from the infosec team to the developers to ops.

But this is only the beginning; there's certainly more to this that I haven't touched. I'm very interested to hear your thoughts about this, and what you feel are the strengths and weaknesses of this pattern.
Posted 3 months ago
Happy New Year everyone!

I was on holiday when this happened, but I figured it was worth a repost. As announced by James Page, OpenStack 2014.2.1 is now available for Ubuntu 14.04 (and 14.10).

Instructions are at the link; you might also want to check the upstream release notes.
Posted 3 months ago
My monthly report covers a large part of what I have been doing in the free software world. I write it for my donors (thanks to them!) but also for the wider Debian community, because it can give ideas to newcomers and it's one of the best ways to find volunteers to work with me on projects that matter to me.

Debian LTS
This month I have been paid to work 20 hours on Debian LTS. I did the following tasks:

CVE triage: I pushed 47 commits to the security tracker this month. Due to this, I submitted two wishlist bugs against the security tracker: #772927 and #772961.
I released DLA-106-1 which had been prepared by Osamu Aoki.
I released DLA-111-1 fixing one CVE on cpio.
I released DLA-113-1 and DLA-114-1 on bsd-mailx/heirloom-mailx, fixing one CVE for the former and two CVEs for the latter.
I released DLA-120-1 on xorg-server. This update alone took more than 6 hours to backport all the patches, fixing a massive set of 12 CVEs.

Not in the paid hours, but still related to Debian LTS, I kindly asked Linux Weekly News to cover Debian LTS in their security page and this is now live. You will see DLA on the usual security page and there’s also a dedicated page tracking this:

I modified the LTS wiki page to have a dedicated Funding sub-page. This avoids having a direct link to Freexian's offer on the main LTS page (which surprised a few people), gives some more background information, and makes it possible for other people and companies to get listed in the same way (since there's no exclusive relationship between Debian and Freexian here!).

I also answered some questions from Nguyen Cong (a new LTS contributor, employed by Toshiba with explicit permission to contribute to LTS during work hours! \o/) on IRC (again) and on the mailing list. It's great to see the LTS project expanding beyond current members of the Debian project.

Distro Tracker
I want to give some more priority to Distro Tracker again, at least to complete the transition from the old PTS to this new service… last month was a bit better than November, but not by much.

I reviewed a patch in #771604 (about displaying long descriptions), merged another patch in #757443 (fixing bad markup which rendered the page unusable with Konqueror), and fixed #760382, where packages that had gone through NEW would never lose their "version in NEW".

Kali related contributions
I’m not covering my Kali work here but only some things which got contributed upstream (or to Debian).

First I ensured that we could build the Kali ISO with live-build 4.x in Jessie. This resulted in multiple patches merged into the Debian live project (1 2 3 4). I also submitted a patch for a regression in the handling of conditionals in package lists; it got dropped and the issue was fixed differently instead. I also filed #772651 to report a problem in how live-build decided on the variant of the live-config package to install.

Kali has forked the sysvinit package to be able to disable services by default, and I was investigating how to port this feature to the new systemd world. It turns out systemd has such a feature natively: it's called preset files. Unfortunately it's not usable in Debian, because Debian does not call systemctl preset during package installation. I filed bug #772555 to get this fixed (in Stretch; it's too late for Jessie :-().
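For reference, a preset file is just an ordered list of enable/disable rules, where the first matching line wins. A hypothetical vendor preset that keeps everything off except an explicit whitelist might look like this (the filename and the whitelisted unit are illustrative, not Kali's actual policy):

```ini
# /usr/lib/systemd/system-preset/99-default-off.preset (hypothetical)
# First match wins: ssh stays enabled, every other unit is disabled.
enable ssh.service
disable *
```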

I'm using Salt to automate some administration tasks in Kali, at home and at work. I recently discovered that the project tries to collect "Salt Formulas": ready-to-use instructions for as many services as possible.

I started using this for some simple services and quickly felt the need to extend "salt-formula", the set of states used to configure Salt with Salt. I submitted 5 pull requests (#73 and #74 to configure Salt in standalone mode, #75 to enable the upstream package repositories, #76 to automatically download and enable the desired Salt formulas, and #77 for some bug fixes) and they were all merged in less than 24 hours (that's the kind of thing that motivates you to contribute again in the future!).

I also submitted a bug fix for samba-formula and a bug report in salt itself (#19180).

BTW, I have some Salt states to set up schroot and sbuild. I will try to package those as proper Salt formulas in the future…
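For the curious, a formula is mostly a tree of state files. A rough sketch of what a minimal sbuild state might look like (the state IDs and file layout here are invented for illustration, not taken from an existing formula):

```yaml
# sbuild/init.sls -- hypothetical minimal state for an sbuild formula
sbuild-packages:
  pkg.installed:
    - pkgs:
      - sbuild
      - schroot

/etc/schroot/schroot.conf:
  file.managed:
    - source: salt://sbuild/files/schroot.conf
    - require:
      - pkg: sbuild-packages
```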

Misc stuff
Mailing list governance. In Debian, we often complain about meta-discussions on mailing lists (i.e. discussions about how we discuss together), and at the same time we need to have that kind of discussion from time to time. So I suggested hosting those discussions in a new mailing list; to get this new list set up, our rules require other people interested in having it. The idea had some support when we discussed it on debian-private, so I relaunched it on debian-project while filing the official request in the BTS: #772645. Unfortunately, I only got one second. So if you're interested in pursuing this idea, speak up now…

Sponsorship. I sponsored another Galette plugin this month: galette-plugin-fullcard. Thanks to François-Régis Vuillemin for his work.

Publican. Following one of my bug report against Publican and with the help of the upstream author, we identified the problem and I submitted a patch.

See you next month for a new summary of my activities.
Posted 3 months ago
This is the first edition of this book to be published by Pragmatic Bookshelf, which I believe is an excellent fit as a company for the book's content. The second edition was published back in 2000 by a publisher who specializes these days in a different sort of content. Plus, I love The Pragmatic Programmers series by Pragmatic Bookshelf, and the history contained here belongs in this series. A good move for both the authors and the publisher.

Fire in the Valley, Third Edition is subtitled The Birth and Death of the Personal Computer, and for good reason. The book is a history of the types of computers that people bring into their homes, starting at the very beginning, when this was just a dream for a few stalwart hobbyists willing to build their own computers. It continues through the usual suspects like MITS and Apple, all the way to the present day, when computing power has been grafted into so many different devices that the meaningfulness of having “my own computer” isn't quite the same as it once was.

The book covers not only historic events and figures, but also issues and philosophies that had an impact on the birth, growth, life, and death of many companies along the way. It also includes a ton of first-hand accounts from key players that make the story rich, interesting, and fun to read.

While this is being sold quite rightly as a history book, perhaps it should receive more fanfare as a chronology of a revolution, of a sweeping cultural shift. I lived through much of the era described in the book (I bought my first computer in 1981) and can easily remember a time when there were only three or four people in my school who had a computer at home, when there was no computer lab, or when the first computer labs were created and filled with Commodore PET computers that had no software other than an operating system, so there was nothing for students to do with or on them. Society is indeed different, and this book describes integral and foundational reasons why and how that change occurred. If this sounds interesting to you, this book is easily the best one I have encountered on the topic. That was true of the previous edition, and is even more true today with the third edition. Pick it up!

Disclosure: I was given my copy of this book by the publisher as a review copy. See also: Are All Book Reviews Positive?
Posted 3 months ago
Inspired by the post by Riccardo Padovani about the awesome year that Ubuntu Italy had, I welcome you to a similar one for Ubuntu California for events I participated in.

The year kicked off with our annual support of the Southern California Linux Expo with SCaLE12x. The long weekend began with an Ubucon on Friday, followed by a team booth on Saturday and Sunday in the expo hall. There were a lot of great presentations at Ubucon and a streamlined look to the Ubuntu booth, with a great fleet of volunteers. I wrote about the Ubuntu-specific bits of SCaLE12x here. Unfortunately I have a scheduling conflict this time, but you can look for the team again at SCaLE this February, with an Ubucon and an Ubuntu booth in the main expo hall.

Ubuntu booth at SCaLE12x
In April, Ubuntu 14.04 LTS was released with much fanfare, and in San Francisco we hosted a release party at a local company called AdRoll, which uses Ubuntu in its day-to-day operations. Attendees were treated to demos of a variety of Ubuntu flavors, a couple of Nexus 7s running Ubuntu, book giveaways, a short presentation about the features of 14.04, and a pile of pizza and cookies, courtesy of Ubuntu Community Donations Funding.

Ubuntu release party in San Francisco
More details and photos from that party here.

In May, carrying the Ubuntu California mantle, I did a pair of presentations about 14.04 for a couple of local groups (basic slides here). The first was a bit of a drive down to Felton, California, where I was greeted at the firehouse by the always-welcoming FeltonLUG members. In addition to my presentation, I brought along several laptops running Ubuntu, Xubuntu, and Lubuntu, plus a Nexus 7 tablet running Ubuntu, for attendees to check out.

Ubuntu at FeltonLUG
Back up in San Francisco, I presented at the Bay Area Linux Users Group and once again had the opportunity to show off my now well-traveled bag of 14.04 laptops and tablet.

Ubuntu at BALUG
As the year continued, my travel schedule picked up and I mostly worked on hosting regular Ubuntu Hours in San Francisco.

Some featuring a Unicorn…

And an Ubuntu Hour in December finally featuring a Vervet!

December 31st marked my last day as a member of the Ubuntu California leadership trio. I took on this role back in 2010, and in that time have seen a lot of maturity come out of our team and events, from the commitment of team members to hosting regular events, to the refinement of our booths each year at the Southern California Linux Expo. I'm excited to see 2015 kick off with the election of an entirely new leadership trio, announced on January 1st: Nathan Haines (nhaines), Melissa Draper (elky), and Brendan Perrine (ianorlin). Congratulations! I know you'll all do a wonderful job. In spite of clearing out to make room for the new leadership team, I'll still be active in the LoCo, with regular Ubuntu Hours in San Francisco and an Ubuntu Global Jam event coming up on February 8th - details here.
Posted 3 months ago
2014 held a lot of interesting things for me, both personally and professionally. I had a long and serious internal monologue with myself, deciding if I really wanted to write this post. I think that overall it's important to recap what you've experienced over the past year, so you have a clear-cut vision of where you've come from - and thus where you might be going.

The Retrospective
Almost a year at Canonical
January 20th, 2015 will mark one full year at Canonical as a full-time employee. I have to say that I've come a long way from the headstrong solo developer who was hired onto the Juju Solutions Team. I had a great grasp of what I thought the project needed back then - or so I thought. Everything under the sun needed a charm, and as a Rubyist that meant I had all the candy in the universe to keep me busy... right? I was sorely mistaken: just because a charm exists doesn't necessarily mean it's awesome, nor does it mean that a single person should be writing hundreds of charms for public consumption.

The Great Charm Audit of 2014, performed primarily by Matt Bruzek, taught me quite a few lessons in charming, and which key aspects will help us succeed in achieving our mission-critical goal.

Charms Need Automated Tests
Charms Need full documentation to be useful to anyone
Charms are honest and legitimate software projects - not toys.
Quality Software takes time
None of this is hard per se, just time-consuming.

In addition to the tenets of charms and leveraging Juju, I've learned quite a bit about being an upstanding member of the community and what it means to foster one. The Juju community at large is still relatively small in terms of population, so it's even more important to ensure we are treating its members like the first-class citizens they are.

The Juju Charmer team has grown from a small three-person operation at the beginning of 2014 to 7 members swooping in and out of the Review Queue like superheroes right out of a comic book. It's very important to remember to be appreciative of these folks. We have gained our first community charmer among our ranks, who is donating his time to ensure that incoming charms receive a prompt and courteous first-round review. It's been an absolute joy to foster his growth and development from community member right into ~charmer status.

Code Reviews
I remember back in early 2012 I was very concerned with documentation, as I felt my code wasn't self-documenting enough. Since that time I've undergone a few more years of professional development - and written several thousand more lines of code - and have come to the conclusion that the thousands of lines of code I have written scare me. Every line of code I wrote up to the early portion of 2014 was only used internally at a marketing agency, or on client websites that were tested by the hands of two overworked individuals before landing in production.

Flash forward to 2015: before anything gates from a namespace branch into any form of recommended or production listing, it's got a solid two or three people's hands on it, commenting about readability, comments, docstrings, and "why did you loop twice when you could have mapped with a dictionary?", to name a few things. It's been an amazing period of growth for my craft, as I continue to consider new and developing technologies to leverage in my professional life. I am thankful for every last person who has put up with the late nights, crazy code quality, and rushed reviews so we could make a deadline for our clients. You ladies and gentlemen deserve a medal, and time off, for your dedicated efforts to making software a better place for everyone to dwell.

An aside on hobbies and music
As my posts have also alluded to, my hobbies have grown to include more creative passions such as screencasting and live mixing under the moniker of DJ Genesis. While these hobbies may be just that - hobbies - they are an amazing avenue for creativity. I find that the more I pursue these outlets, the more my passion to code rises. It's no secret that I spend entirely too much time in front of my computer, but now, with activities that are less logic-oriented and more creative (meaning I'm putting together a puzzle without having to make the pieces myself), I find that this stimulates my mind to new heights.

The beautiful part about this story is that my day job has lent a hand in enabling this passion. Running my own SHOUTcast server on a super-cheap Digital Ocean droplet empowers me to broadcast to several like-minded individuals. It's even getting to where there is collaboration afoot on this front. Global collaboration has taken a front seat in my heart, a byproduct of working in the open on open source projects. The first take at a joint mix has begun with a new friend from the UK who discovered me mixing live and wanted to remix some of the bits I was working with.

Nu:Bounce - Episode #9 by DJ Genesis on Mixcloud

Going International
I took my first steps out of the country in 2014, to Europe for a planning sprint in Brussels, Belgium. I have three one-word sentences.


I can't begin to explain how mind-blowingly awesome this was. Being surrounded by buildings older than my country was an incredibly humbling experience. There was such a rich culture of small businesses in the area where I was staying. I think my favorite part of the entire trip was the small ramen shop I discovered down the street; I enjoyed it so much that I stopped in twice for ramen while I had the opportunity. There was tons of planning to be had at the sprint, as it was a work-by-day, play-by-evening scenario. I'm jonesing to go back - and it looks like I've received a wish-granted card: I'll be speaking at FOSS CON and running workshops a few towns over at the end of January and early February.

What's in store for the future
New Workloads
I'm moving teams, from Big Data over to the New Workloads team, starting with Kubernetes. We're going to be building some awesome stuff with containers as a first-class citizen. The coverage I've given to Flannel and container networking will come in handy. I'm looking forward to touching the newer bleeding-edge tech in the dev/ops sphere.

Juju is Orchestration Glue
Thus far I've been posting mostly scenarios where Juju is a faux PaaS orchestrator - and that's a misnomer as to the true purpose of Juju. While it has the capacity to work at the scale of providing apps as a service, a key strength is its ability to jack in as an orchestrator of services. As a quick rundown, more for my own edification, of the primary goals to keep in mind:

A community of expertise
Encapsulation of best practices for how a service should and will talk to other services
Reference deployments

Every charm is a grown-up software project, with its own community, documentation, and practices.

I don't really know where 2015 is going to take me. But if it's anything like 2014, I've got my work cut out for me. I look forward to even more growth, experiences, and honing my craft of geekery.

Cheers, and bring on the challenges!