Improving Wetware

Because technology is never the issue

Three choices: Mitigation, Adaptation and Suffering

Posted by Pete McBreen 14 Mar 2010 at 14:52

Interesting ideas in a PDF presentation on climate change. It is amazing what a difference we have created in moving from 285ppm of CO2 in the atmosphere to the present 390ppm. One disturbing idea is that the temperature effects have so far been buffered by warming the ocean and melting lots of ice in glaciers. Since a lot of places rely on glaciers to feed rivers during the summer, we need to build a whole lot more reservoir storage capacity before the glaciers melt entirely, or many places are going to experience extremely severe water shortages in the summer months.

“We basically have three choices: mitigation, adaptation and suffering. We’re going to do some of each. The question is what the mix is going to be. The more mitigation we do, the less adaptation will be required and the less suffering there will be.” – John Holdren Source - page 68

Other sources for the science background to this: http://realclimate.org and http://climateprogress.org/.

Just read Debunking the Science that Makes Life Dismal

Posted by Pete McBreen 12 Mar 2010 at 18:58

Economics for the Rest of Us is a very interesting comparison of classical vs. neo-classical economics, with the central tenet being that economics as taught and promoted does real harm:

Arguably, the damage from the teaching of economist’s theory of wages is far greater than the damage from the teaching of creationism. Yet the theory of wages is part of economics education in any and all schools, and it continues without any notice or opposition. The reason is, of course, not hard to understand. While everyone is hurt when we teach religion and pretend it’s science, not everyone is hurt when we teach economics. What workers lose, executives and capitalists gain; and it is the latter who study economics, hire economists, and endow schools.

Lots of lessons in the book for the current economic meltdown, not least that the failure of governments to ensure equality and equitable distribution of wealth has made, and will continue to make, society a lot worse off even if the “economy” looks to be healthy.

The most interesting claim is that unemployment results when spending on consumption and investment goods declines. Investment is needed to absorb the surplus that is created, and without this investment in real goods, productivity gains result in lost jobs. In addition, once consumer confidence drops, people stop buying and the downward spiral starts. But contrary to currently popularized ideas, it is not consumers who can spend us out of the recession. Consumers are rationally saving money in case things get worse. It is the investors who have to show confidence by investing in new productive capacity, which will generate the jobs that let consumers feel confident again.

The executives and capitalists who have so far managed to retain too large a share of the overall pie are now hoarding cash, not investing in productive capacity, and as a result are deepening the depression. After all, is capital not supposed to be the patient partner in the enterprise? Why should anyone expect a family with a large mortgage to spend money when billionaires and large enterprises have cash stored away in banks, looking for lucrative investment opportunities but only bothering to invest when they have a near certainty of return?

Seeking Simpler Explanations

Posted by Pete McBreen 02 Mar 2010 at 20:25

Yes, there is a fancy name for simpler explanations - Occam’s Razor - or Ockham if you prefer the old spelling, but I prefer to use plain English.

A common problem with beginners to software development is that when an error occurs, they manage to convince themselves that they have found an error in the compiler or computer. Yes, sometimes this actually happens, compilers do have errors, but a much simpler, and more likely, explanation is that they have made a normal beginner mistake in their code. Of the two, it makes more sense to investigate the simpler cause first, and only if no problems can be found is it worthwhile investigating alternate explanations. Most of the time the simple explanation is the one to go with. If a program suddenly starts failing, and there have been some recent edits to the code, then the simplest explanation is that the error was surfaced by the changes, so that is the best place to start looking.

Climate science as a good example of simpler explanations

One explanation of the problem of climate change and rising CO2 levels is that there has been a conspiracy of scientists to raise the specter of anthropogenic global warming so that they get fame and fortune.

A simpler explanation is that the conspiracy is on the other side: that some large corporations with vested interests are leading a campaign to convince the general public that there is nothing strange going on with the climate.

One way of testing which of these is a more likely explanation is to look at past behavior. Can we find any evidence of scientists acting in concert to deceive anyone? No, sorry, nothing there. Sure there have been cases where an individual scientist or group of scientists have been enthusiastic about an idea that turned out to be incorrect, but these cases have never lasted for long and even early on there was the normal skepticism of scientists asking questions.

Looking at the other side, can we find any evidence of corporations acting in concert to deceive people? Yes, several times, often with significant deleterious effects on people and the environment. Car and oil companies managed to keep lead in petrol for a long time after low levels of lead exposure were known to harm humans. Lead was only removed when catalytic converters were required to reduce smog and the lead was poisoning the catalytic converters.

Another example: early on, the car companies bought up and dismantled many of the electric trolley companies, thus forcing people to buy cars in order to get around in cities. Very few cities have effective light rail transit these days, even though in the 1930’s most major cities had these electric trolley lines. San Francisco is one of the few cities that still has remnants of the old system running.

Another example is the tobacco industry, which managed to spread enough doubt about the effects of smoking that for over forty years there was insufficient effort put into preventing people from becoming addicted to the nicotine in cigarettes. The end result was a massive lawsuit and damages awarded against the industry, but even now, the public attitude is such that the tobacco companies can still sell very addictive substances and keep on addicting new generations of customers (aka addicts).

With these examples, the simplest explanation of the public debate over global warming is that there is a conspiracy among the major corporations who have a vested interest in the Coal and Oil sectors of industry to spread doubt and uncertainty. Every year the doubt persists, the corporations generate billions in profit. Following the money is always a simpler explanation.

The Onion has written a software manifesto...

Posted by Pete McBreen 28 Feb 2010 at 18:07

I think that the Rugged Software Manifesto has to be a parody.

I am rugged… and more importantly, my code is rugged.

OK, some of the statements are reasonable,

I recognize that software has become a foundation of our modern world.

but overall the whole thing is so over the top that it has to be a parody.

I am rugged, not because it is easy, but because it is necessary… and I am up for the challenge.

How Can We Detect Slow Changes?

Posted by Pete McBreen 07 Feb 2010 at 17:26

Sometimes it seems that while we were not looking, things changed.

Not too many years ago -

  • Hardware was the largest part of any software project budget. Now, unless you are working at a massive scale, the cost of the computing hardware is a rounding error on the bottom line.
  • Scripting languages were too slow for use on real projects, but the web has well and truly demonstrated that this is false.
  • Javascript was only used for annoying effects on web pages, but now AJAX and Web 2.0 have brought drag and drop functionality to browser applications (admittedly not everyone is using these capabilities, but they exist).

Not too sure how this is happening, but it seems that when we first learn about something, those ideas stick and it is hard to change what we know to match the current reality. When I started commercial software development, it was common to build systems on a PDP-11 with under 512KB of RAM. These days a laptop comes with at least 2GB of RAM, an increase in main memory by a factor of 4,000, but sometimes I still catch myself trying to save a few bytes when designing some aspect of a system.

The open question for now is how to detect this type of slow change (even if the pace of technological change is not all that slow compared to other changes). This is an important question because many societies and groups have been hit by surprises that in hindsight are obvious, and the consequences were catastrophic:

  • When cutting down trees in an area, when does the population realize that there is a serious problem with deforestation?
  • When does a drought become a climate shift that means the area is no longer amenable to the current mode of agriculture?
  • When does the exploitation of fish in a fishery result in the collapse of the stocks in that fishery?

On the technology side, when do desktop application developers get overtaken by web applications running in a browser? Functionality-wise, we can deliver nearly equivalent functionality over the web provided we have the bandwidth, so maybe it is time to recreate departmental applications as web applications?

Chip and Pin Credit Card Vulnerabilities

Posted by Pete McBreen 06 Feb 2010 at 10:14

This is old news to Europeans, but Canada has just started to move to this technology, and it looks like the same systems that are deployed in Europe. With that in mind, here are a few links to known problems with the European model.

Chip and Spin is a site that looks at the overall context of the Chip and PIN model, but most interesting of all is that the University of Cambridge, of all places, is investigating Banking security.

The main issue is that with a credit card containing a chip and the customer providing the PIN, it is going to be a lot harder for the account holder to prove that a transaction is fraudulent. But as the study shows, cloning a card containing a chip is not that hard, and obtaining the PIN is not much harder (even before we get into the social engineering possibilities).

Money quote from the Banking security study:

We demonstrate how fraudsters could collect card details and PINs, despite the victims taking all due care to protect their information. This means that customers should not automatically be considered liable for fraud, simply because the PIN was used. Even though a customer’s PIN might have been compromised, this is not conclusive evidence that he or she has been negligent.

Update from the same source - How Not to Design Authentication talks about the problems of using credit cards for online transactions (card not present transactions).

Yet another update from the same team: Chip and PIN is broken

The flaw is that when you put a card into a terminal, a negotiation takes place about how the cardholder should be authenticated: using a PIN, using a signature or not at all. This particular subprotocol is not authenticated, so you can trick the card into thinking it’s doing a chip-and-signature transaction while the terminal thinks it’s chip-and-PIN. The upshot is that you can buy stuff using a stolen card and a PIN of 0000 (or anything you want). We did so, on camera, using various journalists’ cards. The transactions went through fine and the receipts say “Verified by PIN”.
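
To make the shape of that flaw a little more concrete, here is a purely illustrative Python sketch - the class and message names are invented for the example and bear no relation to the real EMV commands or APIs - of why an unauthenticated verification step lets a man-in-the-middle claim that the PIN check succeeded:

    # Purely illustrative sketch of the unauthenticated-negotiation problem
    # described above. Nothing here is real EMV; the messages are made up.

    class StolenCard:
        """Stand-in for the card: in this attack it is never asked for a PIN."""
        def respond(self, message):
            return "ok"                     # the card carries on with the transaction

    class Wedge:
        """Hypothetical man-in-the-middle sitting between terminal and card."""
        def __init__(self, card):
            self.card = card

        def forward(self, message):
            if message == "verify_pin":
                # Nothing in this step is authenticated, so the wedge can
                # simply claim success without ever consulting the card.
                return "pin_ok"
            return self.card.respond(message)

    wedge = Wedge(StolenCard())
    for message in ["select_application", "verify_pin", "authorise_payment"]:
        print("%-20s -> %s" % (message, wedge.forward(message)))
    # The terminal side sees "pin_ok" and records "Verified by PIN", even
    # though the attacker typed any PIN at all and the card never checked one.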

Now using Tynt Insight

Posted by Pete McBreen 20 Jan 2010 at 20:02

Since I was on the team that developed it, I thought it was about time to install Tynt Insight on this blog, so I can now see what gets copied, and the links will be a bit different when you copy from the site.

Based on this trend we will probably reach 400ppm in April or May 2015.

Read more: http://www.improvingwetware.com/#ixzz0dDTBA0Gp

Under Creative Commons License: Attribution Share Alike

If Tynt Insight is working correctly, clicking on that link will take you to the CO2 blog post and highlight what was copied in that posting.

This link http://www.improvingwetware.com/2010/01/09/why-this-site-has-the-co2-badge#ixzz0dDTvtIq1 goes to the article’s permanent page and should always work, even after more blog posts have pushed the CO2 article off the home page.

Good process vs. Bad process

Posted by Pete McBreen 17 Jan 2010 at 11:08

Interesting slide set on Slideshare about the Netflix company culture. The process slide is number 61 - not quite figured out how to link directly to that slide - and the following slides…

Lesson: You don’t need detailed policies for everything

A tale of woe related to optimization

Posted by Pete McBreen 17 Jan 2010 at 08:52

In Optimised to fail the authors start with a great quote…

The late Roger Needham once remarked that ‘optimisation is the process of taking something that works and replacing it with something that almost works, but is cheaper’. [emphasis added]

Although the technical details of the protocol are not public, the authors seem to have managed to replicate what happens, but the key part of their paper is the vulnerabilities that they reveal. These vulnerabilities, coupled with the transfer of liability for fraudulent transactions from the banks to the customers, mean that this protocol and the associated hardware and banking cards should be withdrawn from use.

Browser standards and slow progress

Posted by Pete McBreen 14 Jan 2010 at 11:13

Justin Etheredge has an interesting rant about browsers and their compatibility with standards. The paragraph below should have rounded corners from CSS, but as he says…

And how about this? If you’re looking at this in Safari, Opera, Firefox, or Chrome, then you are seeing nice rounded corners which are created using only CSS. Guess what you’ll see in IE8… nothing. A square box.

Looks like jQuery might be the way to go rather than trying to deal with these browser issues.

An interesting Python project

Posted by Pete McBreen 13 Jan 2010 at 22:12

After all the fun and games in the press over the climate models, some developers decided to rewrite the climate models in Python. So far their graphs seem to track the original Fortran code pretty well, but these are early days for this implementation of the code.

Looks like I’m going to have to update my Python installation as it is too old to run their code… I’m back at 2.5.1 and the code needs 2.5.4.
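
For anyone else in the same position, here is a quick way to check which interpreter is installed; the only assumption in the snippet is the 2.5.4 minimum mentioned above.

    # Print the installed Python version and compare it against the minimum
    # the climate model code is said to need (2.5.4, as mentioned above).
    import sys

    required = (2, 5, 4)
    print("Running Python %d.%d.%d" % sys.version_info[:3])
    if sys.version_info[:3] < required:
        print("Too old; the code needs at least %d.%d.%d" % required)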

Just because Zed is so awesome

Posted by Pete McBreen 13 Jan 2010 at 18:48

One of Zed’s earlier rants about why Programmers Need To Learn Statistics.

Finally, you should check out the R Project for the programming language used in this article. It is a great language for this, with some of the best plotting abilities in the world. Learning to use R will help you also learn statistics better.

Why this site has the CO2 badge

Posted by Pete McBreen 09 Jan 2010 at 18:39

Since the trends on global CO2 levels are not good, I decided that it would be useful to watch how they are changing. The historical trend has been that on average we are increasing CO2 levels by approx. 1.9ppm/year. Based on this trend we will probably reach 400ppm in April or May 2015.

But we will see fluctuations up and down over the course of the year

This is a feature of the way the climate relates to the overall earth systems: the CO2 level drops as vegetation grows in the northern hemisphere summer, and then rises during the northern hemisphere winter, peaking in the spring before starting to fall off again in June. On an annual basis this fluctuation is around 6ppm, but year on year we are averaging nearly 2ppm higher - though this varies with the economy and the weather in any year, and hot years tend to be associated with a higher rise.

Below is sample data extracted from CO2Now.org which is also the source of the badge.

Year   Jan     Feb     Mar     Apr     May     Jun     Jul     Aug     Sep     Oct     Nov     Dec     Average
1959   315.62  316.38  316.71  317.72  318.29  318.16  316.55  314.80  313.84  313.26  314.80  315.59  315.98
1960   316.43  316.97  317.58  319.02  320.02  319.59  318.18  315.91  314.16  313.83  315.00  316.19  316.91
2008   385.42  385.72  385.96  387.18  388.50  387.88  386.38  384.15  383.07  382.98  384.11  385.54  385.57
2009   386.92  387.41  388.77  389.46  390.18  389.43  387.74  385.91  384.77  384.38  385.99  387.27  387.35
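
As a rough sanity check on the April or May 2015 estimate above, here is a minimal Python sketch that projects the seasonal May peak forward using the May 2009 value from the table and the approximate 1.9ppm/year trend; it is a back-of-the-envelope illustration only, the real data and projections live at CO2Now.org.

    # Project the seasonal May peak forward at ~1.9 ppm/year until it
    # first crosses 400ppm. Values taken from the table above.
    may_peak_2009 = 390.18        # May 2009 monthly mean from the table
    trend_ppm_per_year = 1.9      # approximate year-on-year rise quoted above

    year, level = 2009, may_peak_2009
    while level < 400.0:
        year += 1
        level += trend_ppm_per_year

    print("May peak first reaches 400ppm around %d (about %.1f ppm)" % (year, level))
    # prints: May peak first reaches 400ppm around 2015 (about 401.6 ppm)

That lines up with the estimate above, give or take the year-to-year wobble from the economy and the weather.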

Overall this is a large scale experiment

How much CO2 can humans add to the atmosphere without adversely affecting the climate systems that we depend on?

A defense of the GPL

Posted by Pete McBreen 08 Jan 2010 at 22:21

A historical look at what makes the GPL useful. Best quote

All you’re doing by whining about how the GPL makes it impossible to make money off of someone else’s work is to convince me that you’re…

Yes, it is a rant, but understandable in view of the rants and opinions raging about the GPL due to Oracle’s impending purchase of MySQL. For other views there is Groklaw explaining The GPL Barter Cycle, Stallman on selling exceptions to the GPL - a follow-up to the letter to the EU Commission, GPL Works No Matter Who Owns the Copyrights, and Groklaw’s Reasons I Believe the Community Should Support the Oracle-Sun Deal. In the end Groklaw comes out against the plan to make money from Open Source code by getting the EU Commission to force it to go proprietary.

My personal take on the MySQL deal is that the time to have the concerns was when it was first sold to Sun, not afterwards by trying to revise the deal that Sun made when it first acquired MySQL.

For more background on Software, GPL and Patents there is always Groklaw’s GPL Resources and the amazingly detailed An Explanation of Computation Theory for Lawyers, and for the historically minded, the ongoing SCO GPL case.

Still Questioning Extreme Programming

Posted by Pete McBreen 03 Jan 2010 at 14:37

After reading Mark Wilden’s tale of Why he doesn’t like Pair Programming I have spent some time reconsidering my Questions about Extreme Programming.

In the book I let Pair Programming off lightly, not fully addressing the dark side of pair programming that Mark describes. Yes, chapter 9 is called Working at this intensity is hard, but I did not point out that after a full day of pair programming most developers are not in a fit state to do more work. XP requires a 40-hour work week so that the developers can recover from the pair programming.

Other problems I have noticed with Pair Programming

  • Exploration of ideas is not encouraged; pairing makes a developer focus on writing the code, so unless there is time in the day for solo exploration the team gains only a superficial understanding of the code.
  • Developers can come to rely too much on the unit tests, assuming that if the tests pass then the code is OK. (This follows on from the lack of exploration.)
  • Corner cases and edge cases are not investigated in detail, especially if they are hard to write tests for.
  • Code that requires detailed thinking about the design is hard to write when pairing unless one partner completely dominates the session. With the usual tradeoff between partners, it is hard to build technically complex designs unless they have already been worked out in a solo session.
  • Personal styles matter when pairing, and not all pairings are as productive as others.
  • Pairs with different typing skills and proficiencies often result in the better typist doing all of the coding with the other partner being purely passive.

Having said all of that, pairing is still a good way to show someone else around a codebase, and pairing is definitely useful when debugging code - the second pair of eyes helps.

Questions about Extreme Programming that are still open

Overall I still see XP as a fairly niche methodology, as there are few projects that it is really suited for. The constraints of XP are fairly rigid, and although many projects have tried to replace the onsite customer with systems analysts, the decision cycle is definitely longer when the analyst is in the loop.

The key problem that XP projects face is that there is no real compelling case for using XP. Yes, some developers like it, but more for the unit testing than for any other part, and the testing aspects can be replicated in any other approach to software development. I still think that the most useful parts of XP can be applied in other contexts.

Overall, although my questioning of XP was not well received at the time, I think it has stood the test of time well in that eight years on, XP is approaching the status of a footnote in software development history. Yes, it helped to motivate the Agile Alliance, but these days I see more projects trying Scrum than XP, and while some of the XP practices are here to stay, it is hard to point to any software that was developed using XP and state that it will stand the test of time.

Yes, many of the tools that supported XP practices will stand the test of time, but most (all?) were not developed as part of an XP project; instead they were solo projects undertaken to assist an XP team in one way or another.

In summary, although Questioning Extreme Programming is now outdated in that it refers to an earlier incarnation of XP, I still stand behind the claim that while it was an interesting approach to software development, it is not applicable to many projects, and the current versions of XP have similar problems. The ultra-lightweight approach to software development that tries to put developers first does not work all that well in a corporate or entrepreneurial setting.

Make Lists. Not Too Much. Mostly Do.

Posted by Pete McBreen 12 Dec 2009 at 09:59

Michael Pollan’s Eaters Manifesto - Eat food. Not too much. Mostly plants. has spawned another that looks at time management - Make lists. Not too much. Mostly do..

time management is another totally overdone subject, wouldn’t it be great to have a similar credo to simplify all this hackneyed advice on to do lists, productivity, time management systems, and the like? Then, sent from the productivity heavens, it came to me:

Make lists. Not too much. Mostly do.

Nice parallels to software development here: planning is useful, but the key thing is actually creating the software.

An amusing speculation about the Waterfall

Posted by Pete McBreen 10 Dec 2009 at 20:36

Tarmo Toikkanen has an interesting speculation about Why people still believe in the Waterfall model, putting the blame on the Royce paper that was trying to say that waterfall was not the way to do software development.

OK, so why do people still advocate the waterfall? If you look at the scientific articles on software engineering that discuss the waterfall, they all cite Royce’s article. In other words, they’re saying something like “The waterfall is a proven method (Royce, 1970).” So they base their claims on an article that actually says the opposite: that the model does not work.

Tarmo was not the first to run across this idea, but his interpretation of the problem is different.

Don’t draw figures or diagrams of wrong models, because people will remember them. And in the worst case that could cause hundreds of thousands of failed projects and billions of euros and dollars wasted for nothing.

Other people have written about The Waterfall Accident and Waterfall Model; not what was intended.

Time to move beyond Agile?

Posted by Pete McBreen 29 Nov 2009 at 16:35

Just found a very interesting post from Luke Halliwell on The Agile Disease that looks at the fit between Scrum and the game industry, but many of his points are relevant to other development domains.

[Agile] was designed by consultants, aimed squarely at the world of in-house enterprise IT: a world where mediocre developers work for large corporations that don’t understand software development, but can afford to buy in expensive consultants to “save” their runaway projects.

Having daily stand-up meetings is ludicrous; it exists simply to protect against the dysfunction of team members that never talk to one another. … In anything resembling a normal, common-sense team, people will surely raise blockages with their manager as soon as they occur and not need to wait for a daily meeting!

After that rant Luke went on to describe what worked in his field, and even had a post after attending a Scrum Master course.

Amusing quote from an economist

Posted by Pete McBreen 25 Nov 2009 at 19:11

Talking about why things are a lot more complicated than we might otherwise think.

Books are thick cos things are hard – Richard Denniss

A comment towards the end of his talk: ENVS1001 - Resources, Environment and Society - 2009 audio podcast, Week 05 Panel B: Can Economics Save the World? Richard Denniss (Fenner School of Environment and Society, Australian National University on iTunesU)

Software development as a scientific activity

Posted by Pete McBreen 22 Nov 2009 at 22:21

Given that I do not agree with the characterization of software development as software engineering, it was somewhat of a surprise to find that there are a lot of parallels between software development and science.

Debugging and testing are probably the most scientific activities, in that developers have to make guesses about what is happening and then devise experiments to prove those guesses wrong. Developers also have to make predictions about how the systems they are developing will behave and then defend those predictions against speculations made by uninformed observers, and occasionally defend against misinformation conveyed by financially interested parties.

One conclusion I could make from this is that the politics of software development are very similar to the politics of science. Practitioners try to pretend that there is no politics involved as it is all perfectly rational and understandable, but because people are involved it is all about politics. As soon as we start making predictions, then how we interpret those possible futures has a big effect on the actions we might take. This then becomes the realm of politics and it is that part that many software developers forget about (and many scientists as well) - Politics Matters.

We probably could have saved ourselves, but we were too damned lazy to try very hard … and too damned cheap. – Kurt Vonnegut

To celebrate the politics of science I’ve included a graphic reminder in the sidebar that small changes over a long period can result in a very interesting future.