Monday, 7 April 2008

Can Software Engineers save lives?

Keith has a post about getting on the DVCS bandwagon. What really interested me in the post is the part about his discussions with a colleague about the choice of tools.

"It's kind of a shame that people like Ed and myself can semi-legitimately get involved in a conversation about which DVCS and why. Summed over all the people like Ed and myself that's a lot of mental energy being poured down a black hole, across the industry as a whole. Especially since it seems as if there is almost nothing to choose between these tools."

It reminded me of this quote from Philip Greenspun.

“Another issue is a perennial side-show in the world-wide computer programming circus: the spectacle of nerds arguing over programming tools. The data model can’t represent the information that the users need, the application doesn’t do what the users need it to do, and instead of writing code, the “engineers” are arguing about Java versus Lisp versus Perl versus Tcl. If you want to know why computer programmers get paid less than medical doctors, consider the situation of two trauma surgeons arriving at an accident scene. The patient is bleeding profusely. If surgeons were like programmers, they’d leave the patient to bleed out in order to have a really satisfying argument over the merits of two different kinds of tourniquet."

I can't find the original source of the quote, so I grabbed it from this post. My guess is that the original quote dates from the late 20th century.

It's good to see how slowly technical culture evolves.

Saturday, 5 April 2008

List Comprehensions Lazily

Ed has a nice post about list comprehensions in Erlang, and mentions that Python has them as well.

If you are interested in using them in Python, but actually only need the resulting items one at a time or on demand, then try generator expressions.

They have the same syntax, except they are enclosed in parentheses rather than square brackets.

>>> some_list = [i*i for i in range(12)]
>>> some_list
[0, 1, 4, 9, 16, 25, 36, 49, 64, 81, 100, 121]

>>> some_generator = (i*i for i in range(12))
>>> for i in some_generator:
...     print i,
...
0 1 4 9 16 25 36 49 64 81 100 121

The list comprehension creates and returns a list (as you would expect), whereas the generator expression creates a generator object that you can call next() on. Each call returns the next value defined in the expression that "generates" the values, i.e. in this dumb example the squares of range(12).
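To make the laziness concrete, here is a small sketch (written in modern Python 3 syntax, unlike the Python 2 session above) showing values being produced one at a time with next():

```python
# List comprehension: the whole list is built eagerly, up front.
squares_list = [i * i for i in range(12)]

# Generator expression: nothing is computed until a value is requested.
squares_gen = (i * i for i in range(12))

print(next(squares_gen))  # 0 -- first value, computed on demand
print(next(squares_gen))  # 1 -- second value

# The remaining values can still be consumed lazily, e.g. by sum():
print(sum(squares_gen))   # 505 -- the squares from 2*2 up to 11*11
```

Once a generator is exhausted it raises StopIteration, so it is a one-shot object, unlike the list which can be iterated repeatedly.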

I've heard from the horse's mouth that if you get to larger amounts of data (no idea what would be large enough to cause concern here) generator expressions will be more efficient, since they avoid building the whole list in memory.

Friday, 4 April 2008

Agile and Orthodoxy

I work in a team led by someone who can safely be termed a veteran Agilist, with colleagues who have all swallowed the red pill.

I recently wrote a blog post about Kanban, which has left some of my colleagues somewhere between troubled and sceptical.

Before I go any further, please note I am not an expert in any topic, let alone Agile or Kanban. What follows reflects my limited understanding of the issues surrounding the two subjects. I should also add that Keith, as far as I can tell, is not amongst the sceptics. He was also at the talk and was not troubled by the subject matter; in fact I think it's fair to say he was enthused.

I don’t know what all of the reasons for disquiet are, but certainly amongst them is the feeling that, as the Kanban approach does not at least presume (if not enforce) what are accepted as the pillars of the faith, it must be some sort of heresy.

Without two week iterations and test first development as prerequisites, how could Kanban be compatible with Agile?

Well, I have re-read the material and I don’t see anything in the Kanban approach as illustrated by David Anderson at QCon that would be in conflict with any of the principles enshrined in the manifesto.

At this point Keith will tell me that the manifesto is just a collection of meaningless platitudes. On some levels, that’s true enough. Who doesn’t want to be more of everything nice and sweet and smashing? But why expect more? It is after all a manifesto.

You can try to live up to some or all of the manifesto if you like. But claiming to adhere to the principles means nothing if you take no action in support of the aims. Actions taken in conflict with the principles must surely discredit your claimed adherence.

On the other hand if you claim to believe in some set of principles, take actions that support them, and do not take actions in conflict with them, then surely you have some right to say you are an adherent.

I should say at this point that what any manifesto says is probably of no interest to teams who are happy in their jobs and producing good work that delights their customers. But let's pretend for a moment that it is important to them.

Despite it seeming too obvious to mention, I think it's worth remembering that the things we might like to aspire to in Agile are not laws of nature. They are principles. The simple fact that we have XP, FDD, Crystal, DSDM, Scrum and others should be evidence enough for that.

Before you cry foul and say there is something common to all of the approaches that forms the core pillars of the faith, and that amongst the common features are, for example, short iterations and test first development, let me respectfully direct you back to the manifesto.

So why don't I get back to my vague rambling point? Given that we are mostly normal human beings doing mostly abnormal stuff, we still have the problem of working out what practical things we can do in our daily work to help us achieve our goals while not compromising our principles (too much).

Clearly the best approach would be for us to be liberated from wage slavery but that’s another issue.

However, to be more realistic, putting the principles into action is where the specific day-to-day activities of Agile come in. This is where a team might decide to do XP for example, with a Scrum wrapper perhaps?

Nothing in the manifesto requires such explicit practices as n-week iterations or test first development. XP or FDD have, on many occasions, been found to greatly improve the chances of reaching the goals of a project while being true to the principles of Agile. Test first development is a pre-requisite in XP, but that is not to say that it is a pre-requisite for meeting the higher aims of Agile.

That’s all great, but what if someone finds a practical approach that also helps in meeting customer needs and values, but at the same time does not care whether you do formal design or test last? You or I might think that sounds strange, but surely regularly delivered, working software in the customer’s hands, meeting their needs and values, trumps all.

For me the basic principle that I believe has most value is favouring individuals over processes. If we follow this principle then what I think should flow naturally from this are practices that people are happier with.

To insist on two week iterations and test first above all else would be favouring a process over individuals.

If being Agile were as simple as adopting a recipe of processes, it would not be so hard to adopt. But then it would be as meaningful as purely public displays of piety as a means to get into heaven.

So getting back to a really good talk I listened to at QCon. The case study was an example where a demoralised, staggeringly under-performing team was turned around very quickly. There was no test first, no iterations … no kidding eh?

The team focussed on what was blocking progress and the elimination of wasteful practices in what they were trying to achieve. They tackled those problems in a way that empowered individuals in their daily work. They took an approach that imposed very little in the way of process. They freed people from a lot of the wasteful drudgery in their jobs but did not dictate how they should carry out their allotted tasks.

All of the above could have been achieved in many ways. What I think is exciting about Kanban is that it focusses very tightly on controlling a variable that is very easy to understand and can be controlled quite easily.

That variable is:

The amount of work in progress

I reckon it’s always best to try and effect changes on things you can control rather than things you can’t.

Dramatically changing how a team works is a great deal harder to do and I would argue ultimately a futile undertaking.

Wednesday, 2 April 2008

Should you do DSLs alone when no-one is watching?

This post is sponsored by InfoQ and in particular this article.

The article gives a summary of a blog post from Dave Thomas, who may well be a very pragmatic guy.

The main thing of interest to me in the post is something related to why I think that the current hype about DSLs is becoming a bit onanistic and the BDD frameworks are at best a distraction.

If I understand correctly, the crux of the problem is as follows: if we are touting these DSLs at domain experts in the belief that the domain experts who will be comfortable with them are business users, then we have a problem.

In the context of Applescript Dave makes the following observation:

Given ...

"... for years, I've been trying to get into AppleScript. I keep trying, and I keep failing. Because the language is deceptive. They try to make it English-like. But it isn't English. It's a programming language. And it has rules and a syntax that are very unEnglish like. There's a major cognitive dissonance—I have to take ideas expressed in a natural language (the problem), then map them into an artificial language ..."

When ...

"When you're writing logic like this, with exception handling, command sequencing, and (in more advanced examples) conditionals and loops, then what you're doing is programming. The domain is the world of code."

Then ...

"... who are the domain experts? That's a trickier question to answer. In an ideal world, it would be the business users. But, the reality is that if the business users had the time, patience, an inclination to write things at this level, they wouldn't need programmers. Don't kid yourselves—writing these specs is programming, and the domain experts are programmers."

I guess this is the real problem I have with all of this. Surely a specification (or specification language), however captured, has to have the characteristics of a ubiquitous language. A programming language does not need to have this characteristic.

I'm thankful to Keith for pointing out the Cobol Fallacy to me as I think this has some bearing on this sorry tale of trying to create a natural language like DSL.

begin"to say that i believe"'bollix')

Monday, 17 March 2008

QCon best talk award goes to Kanban

Well actually my award for best QCon London talk goes to "A Kanban System for Software Engineering".

Kanban is probably best known as a lean, JIT production scheduling method. The approach is basically “pull” based as opposed to “push”. A “push” method requires accurate forecasting of demand in order to plan levels of effort.

Kanban can be more responsive to changing needs (changes in customer demand) by having later stages in a production pipeline signal that items of work are needed. Items of work are pulled into and along the production pipeline. In Kanban, tokens (for example cards) can be used to signal that a later stage in a pipeline is ready for more work.

In the context of software development projects, Kanban promises "iteration-less" development, no more estimation, and increased productivity and morale.

Like a lot of agile things, to get started all you need is a whiteboard and post-its. However the speaker, David Anderson, was anxious to dispel the myth that Kanban was anything to do with whiteboards and post-its. These are enabling "technologies" and they help create the foundation for transparency. However the real issue is:

“How much work is currently in progress?”

I’ll come back to this point, but essentially “work in progress” is what Kanban is all about.

So let’s have a look at the claims that the speaker made.

Development without iterations? Otherwise known as "Is he mad?"

In a nutshell, everything is just one big iteration. Releases may be made every two or whatever weeks, but work comes from one single work queue. There is a concept of lead time, which is how long something should take to get from start of pipeline to production, but this is not the same as an iteration, and it does not have to synchronize with a release schedule.

Items are taken from a work queue and enter a process, a pipeline, or whatever you want to call your sequence of stages. The stages that make up the sequence are those “things that need to be done” with the item of work. Those "things to be done" can be analysis, design, development and so on, right up to release.

The decisions about what items enter work queue and their priority are taken by the customers or product owners.

The approach I’ve just mentioned (analysis, design, development) sounds a bit waterfall. In fact the Kanban approach can happily accommodate the waterfall approach to development. There is no need to mandate any particular life cycle or development methodology for a project.

As mentioned before a key ingredient in Kanban is the notion of “work in progress”. Work in progress is a central variable that the Kanban system is focussed on controlling. The idea is to set specific limits for work in progress in any stage of the pipeline.

In the talk, David gave an example of a team with three developers, in which only three items could be in the development portion of the pipeline at any one time.

For each stage, the number of items of work in progress can be different and adjusted according to observed throughput. The system is explicitly setup to allow such adjustments by the members of the team.
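As a sketch of how such a limit behaves (hypothetical Python, not anything David presented; the class and method names are invented), a stage can simply refuse to pull new work once its WIP limit is reached:

```python
# A toy model of one pipeline stage with a work-in-progress limit.
class Stage:
    def __init__(self, name, wip_limit):
        self.name = name
        self.wip_limit = wip_limit
        self.items = []

    def can_pull(self):
        # A stage may only pull work while it is under its WIP limit.
        return len(self.items) < self.wip_limit

    def pull(self, item):
        if not self.can_pull():
            raise RuntimeError(
                "%s is at its WIP limit (%d)" % (self.name, self.wip_limit))
        self.items.append(item)

# Mirroring the example in the talk: three developers, limit of three.
dev = Stage("development", wip_limit=3)
for card in ["card-1", "card-2", "card-3"]:
    dev.pull(card)

print(dev.can_pull())  # False -- a fourth item cannot enter until one leaves
```

The point is that the limit, not a schedule, is what regulates flow: upstream stages are throttled automatically because a full stage cannot accept more work.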

No more estimation? Otherwise known as "Is he mad?"

I admit this one, while very attractive (since I can't estimate ... never have been able to, and don't think I ever will), is a bit difficult for me to comprehend.

Let’s say you can only have three items of work in progress and an agreed lead time. Without estimates, how can I tell, if I grab the next item off the queue, that it will fit into the lead time? David did mention that items can be bounced back if considered too big, so I suppose there is a bit of sneaky just-in-time estimation involved somewhere.

At base, I think, in the example provided, it was not so much "no more estimation" that was the issue. The point is that the system focuses on the elimination of waste. Therefore, where estimation was considered wasteful, it was eliminated. So I may be misrepresenting Kanban by saying that there is no more estimation. This may be particular to the example given. I'll update you when I am enlightened enough to understand more.

Increased morale and productivity? Otherwise known as "I hope he is not mad"

With regard to morale and productivity, I think the latter stems mainly from the elimination of waste (download the slides; in the project study cited in the talk it emerged that 33-40% of developer capacity was spent on estimation). The morale boost of witnessing rapid improvement of throughput is something I can easily understand.

In addition I think the approach encourages transparency. This is a bit of an obvious claim given the way in which project details are held. Having a big whiteboard with all the tokens representing items of work, visible to all in the room promotes simple visibility of the state of the project. Physical proximity helps of course. Although it is obvious it should never be underestimated or taken for granted in what is basically a social activity.

This post barely scratches the surface of the talk, so I recommend that you try to get hold of the talk online (he has given it elsewhere so there may be video somewhere), catch David speaking at some other conference, or even just browse the slides.

Monday, 11 February 2008

What does the B in BDD mean?

Hmmm ... the B in BDD. I know what it's supposed to stand for. So what does it smell like? A bit like a bad Balti. You know the one. It smells good when you're all beered up and you need a good dose of ghee to line your internals, but in the morning you'll smell like stale spice and hate yourself.

I've heard of TDD being described as "poor man's formal methods". Well BDD as it’s currently being hawked is "poor man's formal methods sold by a snake oil salesman".

As a general rule of thumb, I reckon you can judge the substance of an idea by the use cases put forward as exemplars of the idea.

Is this:

stack."should be empty"

demonstrably better on any level in any universe than:

assert stack.isEmpty()

Actually this is the best explanation I've seen of why all this BDD (frameworks at least) stuff does not seem amazingly compelling to me.

As a branding exercise BDD may well be a good suggestion. It might be a good idea to drop the "Test" from TDD, since the word "Test" encumbers us with the idea that it is something you do after everything else is done. It’s the implementations that seem a bit stilted to me.

You could use any xUnit framework, for all and any of its shortcomings, coupled with well thought out tests, to specify the behaviour of the software under test.
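As a sketch of that point (a hypothetical Python unittest example; the toy Stack class and test names are invented), behaviour-style naming needs nothing more than an xUnit framework:

```python
import unittest

class Stack(object):
    """A trivial stack, just enough to specify behaviour against."""
    def __init__(self):
        self._items = []
    def push(self, item):
        self._items.append(item)
    def pop(self):
        return self._items.pop()
    def is_empty(self):
        return not self._items

class NewStackBehaviour(unittest.TestCase):
    # Behaviour-style test names read as specification sentences,
    # no special BDD framework required.
    def test_should_be_empty(self):
        self.assertTrue(Stack().is_empty())

    def test_should_not_be_empty_after_a_push(self):
        stack = Stack()
        stack.push(42)
        self.assertFalse(stack.is_empty())

# Run with: python -m unittest <this_module>
```

The "specification" here lives entirely in the test names and assertions; whether that reads better than stack."should be empty" is exactly the question.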

It is true xUnit tests do not communicate nicely with the business end of a software project, but are these BDD frameworks aimed at providing a fancy way to expose the intent of tests in human readable form? Perhaps that’s an unfair question, since clearly the intent is really about moving from business requirements towards working code.

So would using a BDD framework be a good approach to driving development? For the sake of argument let's imagine we were able to get “Behaviour” style user stories from “the business” in the format:

Given some context …
When some condition and/or action happens …
Then this other thing should happen or be true …

It looks to me that if we could get something like the above, then our problems would be over. We'd have some concrete unambiguous requirements! To get to this point we would already have had to define a constrained format for the requirements. Then we would still need to implement a translation layer to map those requirements to a verification language, with or without a framework. That language would then be able to assert that the system meets the requirements.
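To illustrate what that translation layer might look like (an entirely hypothetical sketch; the step phrases, registry, and helper names are all invented for illustration), constrained Given/When/Then lines can be mapped onto verification functions with a small pattern registry:

```python
import re

# Registry mapping compiled step patterns to verification functions.
steps = {}

def step(pattern):
    """Register a function against a Given/When/Then phrase."""
    def decorator(fn):
        steps[re.compile(pattern)] = fn
        return fn
    return decorator

@step(r"Given an empty basket")
def given_empty_basket(ctx):
    ctx["basket"] = []

@step(r"When I add (\d+) items")
def when_add(ctx, n):
    ctx["basket"].extend(range(int(n)))

@step(r"Then the basket holds (\d+) items")
def then_holds(ctx, n):
    assert len(ctx["basket"]) == int(n)

def run(scenario):
    """Translate each line of a scenario into a registered step call."""
    ctx = {}
    for line in scenario.strip().splitlines():
        line = line.strip()
        for pattern, fn in steps.items():
            m = pattern.match(line)
            if m:
                fn(ctx, *m.groups())
                break
        else:
            raise ValueError("no step matches: %r" % line)
    return ctx

ctx = run("""
    Given an empty basket
    When I add 3 items
    Then the basket holds 3 items
""")
print(len(ctx["basket"]))  # 3
```

Note that every phrase still has to be defined in code first; the scenario text only looks like free English, which is precisely the constrained-format point made above.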

If the BDD frameworks are just another mechanism for writing the verification language, fair enough, but they do not allow us to move up to a higher level of abstraction. They do not provide us with the means to write this:

Given some context …
When some condition and/or action happens …
Then this other thing should happen or be true …

and then just automatically get the computer to do the right thing. Therefore I can’t see the compelling argument for the adoption of any of the current rash (rSpec, jBehave, Instinct, easyb ... and so on and on) of BDD frameworks.

I have no deep love for Fit, having used it a bit too much recently, but I can say this. It enables us to say to our clients something not a million miles away from this:

Given a trade that looks like this …
When you amend it as follows …
Then the resulting trade will be like this …

Job done.

We can do this in some quite flexible ways and one thing I do know for sure is that our clients have never shown any interest in the behaviour of the Stack object which I know must be lurking somewhere in the code.

Saturday, 2 February 2008

Our chimps are smarter than theirs

Often heard in the consulting world is a discussion of the stunted abilities of this or that company's developers, especially when compared to the speaker's abilities. If you've been on the receiving end of this type of rant before, and found it an unedifying experience, then you may find this a fun read.