A political piece from last week’s election - We Need a Programmer for President.
It has an interesting take on the need to put more emphasis on teaching programming in schools at an early age, rather than the more typical Computer Literacy, which focuses on how to use the standard suite of applications.
Just noticed that CNC machines are getting to be cheap as well. A sample guide to CNC machines looks at how they can be used in conjunction with moulding techniques to fabricate moulds for plastic parts as well as produce metal parts.
These CNC machines are not quite as cheap as the 3D printers, but they are in the ballpark - plus, if you create the moulds correctly, they can be used to scale up small-scale manufacturing of plastic parts much better than a 3D printer could.
Looks like we are starting to live in what could be called Interesting Times.
Although Moore’s Law still seems to be holding out a bit longer, individual CPU cores are not much faster than they used to be. We have been stuck near 3GHz for nearly 10 years now, and a common sight on servers and laptops is a process consuming 100% of one core while the machine as a whole runs at only 25% or 12.5% loading (depending on whether it is a 4 or 8 core machine). To get processes to run faster we are going to have to learn how to program with multicore CPUs in mind.
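As a minimal sketch of what programming with multicore CPUs in mind looks like, here is how Python's standard library can spread a CPU-bound job across worker processes instead of pinning a single core (the prime-counting task is my own stand-in for any CPU-heavy work):

```python
# Sketch: spread a CPU-bound task across all cores with the standard
# library, instead of pinning one core at 100% while the rest sit idle.
from concurrent.futures import ProcessPoolExecutor


def count_primes(limit):
    """Deliberately CPU-heavy work: count primes below `limit` by trial division."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count


def count_primes_parallel(limits):
    # One worker process per core by default; each chunk runs on its own
    # core, so an 8-core machine can approach full load rather than 12.5%.
    with ProcessPoolExecutor() as pool:
        return sum(pool.map(count_primes, limits))


if __name__ == "__main__":
    # Four independent chunks of work run concurrently.
    print(count_primes_parallel([10_000, 10_000, 10_000, 10_000]))
```

The catch, of course, is that the work has to be divisible into independent chunks - which is exactly the shift in thinking that multicore programming demands.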
Peak Oil seems to have occurred in the 2004-2007 timeframe, so the days of cheap fuel are behind us. In Canada fuel is still relatively cheap, but we have not seen $1/L for a while. How society handles the transition to $2/L is going to be interesting, especially combined with the wild price fluctuations that many Peak Oil analysts are predicting. It is amusing, however, to watch local dealers holding massive truck sales at the end of each year to get rid of their excess inventory of gas-guzzling vehicles.
As we track towards 400ppm CO2, the thought that Global Warming might be nice in a country with cold winters is turning out to be mistaken. A better term would have been Anthropogenic Climate Change; the resulting shift to more extreme weather, with a tendency to more arid conditions on the western edge of the prairies, is beginning to make things interesting.
The convergence of computers, open source and manufacturing will be having ramifications soon. The Maker Faire phenomenon of 3D printers and low-cost CNC machines has been very instructive and may become disruptive when the costs of these technologies fall further. Already a 3D printer can be obtained for $1,000 with a resolution that rivals commercial machines costing 30X more. A good bet is that this will have a bigger impact than the arrival of low-cost microcomputers that led to the PC era and subsequently our current Internet era.
Looking to chemistry this time, here are six proposed Rules of Reproducibility.
- Were studies blinded?
- Were all results shown?
- Were experiments repeated?
- Were positive and negative controls shown?
- Were reagents validated?
- Were the statistical tests appropriate?
Many science papers are unfortunately weak when it comes to these rules, and in many fields rule 2 is a real problem - only the positive results are shown; the rest are hidden away and never seen.
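A toy simulation (my own illustration, not from the article) shows why hiding negative results is so damaging: even when every experiment tests a non-existent effect, roughly 5% will come out "significant" by chance. Publish only those, and the literature looks like solid evidence for an effect that is not there.

```python
# Simulate many studies of a non-existent effect and count how many
# would clear a ~p < 0.05 significance bar purely by chance.
import random

random.seed(42)


def run_experiment(n=30):
    """One study of a null effect: two groups drawn from the same
    distribution, compared by the difference of their means."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    diff = abs(sum(a) / n - sum(b) / n)
    # Rough two-sided 5% threshold for the mean difference of two
    # unit-variance groups: 1.96 standard errors, SE = sqrt(2/n).
    return diff > 1.96 * (2 / n) ** 0.5

results = [run_experiment() for _ in range(1000)]
published = sum(results)  # only the "positive" results get written up
print(f"{published} of 1000 null experiments came out 'significant'")
```

Around 50 of the 1000 null experiments clear the bar; if only those 50 get published, the field sees a stack of confirmations and none of the 950 failures.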
Found another interesting parallel between software development and running. The field of running and exercise is full of claims about special ideas that will drastically improve athletes’ performance. The Science of Sport site has a blog post on How to spot bad science and fads - determining whether an idea is worthwhile:
At a recent track meet I was having a conversation with a friend in college, who made the astute observation that if the coaches inserted random scientific terms to explain things, even if they were totally wrong, the runners seemed to buy into it more enthusiastically. That’s a very common reaction, we all do it. We associate science and complexity with being smart or correct. As I’ve said before…people trying to fool you go from simple to complex…good coaches translate complex things into simple understandable ideas.
In another post the same site talks about the value of research, theory and practice
… I often rely on what one of my professors, Jason Winchester, called the three stool leg test. You have research, theory, and practice. If you have all three, it’s almost certainly a good idea to implement it. If you have 2 of 3, it’s fairly likely that it works and it depends on the strength of the 2. If you’ve only got 1 of 3 going for it, it probably doesn’t work. The beauty of using the 3 stool leg test is it blends science and practice, and complements it with theory which in itself is a blend of science and practice.
Jim Bird has taken a look at how much technical debt is costing you. Nice to see that he ignores the dollar estimates per line of code that some authors use and instead rates each kind of debt on a simple $$$-down-to-$ scale.
$$$ Making a fundamental mistake in architecture or the platform technology – you don’t find out until too late, until you have real customers using the system, that a key piece of technology like the database or messaging fabric doesn’t scale or isn’t reliable, or …
$ Missing or poor error handling and exception handling. It will constantly bite you …
Recently Jim Bird had to point out that Source Code is an Asset, Not a Liability. Unfortunately it means that there are people in the software development community that are not aware of the literature - specifically Howard Baetjer Jr.’s Software as Capital.
Some interesting lessons for Software Development can be obtained from outside our field. I was reminded of this while reading a running blog that looked at what lessons could be gained from outside of the field of running coaching…
Rules of Everything
- When something is new or gains popularity, it is overemphasized until it eventually falls into its rightful place. How long that process takes varies greatly.
- Research is only as good as the measurements being used.
- We overemphasize the importance of what we can measure and what we already know, ignoring that which we cannot measure and know little about.
- We think in absolutes and either/ors instead of the spectrum that is really present.
Point 1 helps explain a lot of the original hype/hope surrounding the agile approaches to software development.
Lessons from outside the running world
We go through a cycle of forgetting and remembering what’s been done before us. You see this in the reintroduction or re-emphasis of certain training methods in the coaching world. That’s why it is incredibly important to know your history. And if you can, know your history from a primary source where you attempt to look at it through their eyes during that time period. For example, going back and reading Lydiard’s original work gives a greater appreciation of what he was trying to do than reading someone’s summary now, 50 years later. We lose a little bit of the original message.
Sometimes there is useful information available from looking back at what worked in the past. Although many in the software field seem to try to forget the past, the pioneers in the field learned a lot, some of which is still applicable to our present circumstances.
Don’t normally link to Dave Winer, but his The bosses do everything better is priceless…
When he looked at the code he must have been shocked to find something complex and intricate. Why isn’t the source code as simple as the software? Hah. When you figure that out let me know.
All too often in software development I hear the comment that there must be a “simpler/easier way.”
Unfortunately, although simple solutions are sometimes workable, in most cases the simplest solution is not. Or rather, the simple solution would work in some circumstances, but not for the current project because of some fairly obvious deficiencies.