software

The Scientific Method

Back in the 1990s I was fortunate enough to work for a very smart, energetic man. In a way, working for him — or at least in the position he gave me — helped change the trajectory of my career into what I wanted it to be.

Skipping 99% of that story ... one thing he did exceptionally well was troubleshoot problems, and troubleshoot them very fast. I didn’t know it at the time, but he was using something called The Scientific Method. After observing him for a while, I saw him repeat these steps so precisely that I thought he must have them tattooed on the inside of his eyelids:

  1. Observe some feature, in our case, a bug
  2. Hypothesize a model consistent with the observations
  3. Predict future events the hypotheses should yield
  4. Verify the predictions by making further observations
  5. Validate by repeating
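
To make those steps a little more concrete, here is a minimal sketch of how I think of them when debugging with tests. The example is mine, not his: the sumTo function and its off-by-one bug are hypothetical, but the hypothesis becomes a reproducing test, the prediction is the expected output, and re-running the assertions verifies and validates it.

    // A hypothetical illustration of the five steps applied to an off-by-one bug.
    object ScientificMethodDebugging extends App {

      // 1. Observe: sumTo(3) returns 3, but 1 + 2 + 3 should be 6.
      def sumTo(n: Int): Int = (1 until n).sum      // bug: 'until' excludes n

      // 2. Hypothesize: the range excludes its upper bound (an off-by-one error).
      // 3. Predict: if that's true, an inclusive range should return 6 for n = 3
      //    and 15 for n = 5.
      def sumToFixed(n: Int): Int = (1 to n).sum    // 'to' includes n

      // 4. Verify the predictions with further observations.
      assert(sumToFixed(3) == 6)
      assert(sumToFixed(5) == 15)

      // 5. Validate by repeating against more cases.
      for (n <- 1 to 100) assert(sumToFixed(n) == n * (n + 1) / 2)

      println("all predictions verified")
    }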

A software self-healing pattern (part 1)

Over the last 18 months I've been working with a 24x7 manufacturing group, and no matter what I say, they always have the same two requests/demands:

  1. The software system must not fail, and
  2. If it does fail for some reason, it needs to be able to recover properly from the failure.

Simply put, (a) the machines must keep moving, and (b) nobody wants the phone call in the middle of the night when the machines stop moving.
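
The rest of that series describes the actual pattern, but as a rough sketch of the shape those two demands push you toward, here's a minimal self-healing loop of my own (the readSensor task is hypothetical): catch the failure, log it, back off, and restart the work instead of letting the process die and the machines stop.

    import scala.util.{Failure, Success, Try}

    // A minimal self-healing sketch of my own, not the pattern from this series:
    // keep a critical task running by catching failures, backing off, and
    // restarting, rather than letting one exception stop the machines.
    object SelfHealingLoop extends App {

      // Hypothetical unit of work that occasionally fails.
      def readSensor(): Double =
        if (scala.util.Random.nextInt(4) == 0) throw new RuntimeException("sensor timeout")
        else scala.util.Random.nextDouble() * 100

      def superviseForever(maxBackoffMs: Long = 8000L): Unit = {
        var backoffMs = 250L
        while (true) {
          Try(readSensor()) match {
            case Success(reading) =>
              println(f"reading: $reading%.1f")
              backoffMs = 250L                    // healthy again, reset the backoff
            case Failure(e) =>
              println(s"recovering from: ${e.getMessage}")
              Thread.sleep(backoffMs)             // wait, then restart the work
              backoffMs = math.min(backoffMs * 2, maxBackoffMs)
          }
          Thread.sleep(1000)                      // normal polling interval
        }
      }

      superviseForever()
    }

In a real system the restart logic usually lives in a supervisor (a separate thread, process, or actor) rather than inline, but the shape is the same: detect the failure, recover to a known state, and keep going.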

If you build the wrong application, no cool new technology will save it

Paraphrasing someone tonight: “I worked on cool projects X, Y, Z with cool new technologies A, B, and C. They all failed. Nobody used them. The only app customers still use was written in lowly old PHP. And the customers love it.”

I took that as: if you build the wrong application, no cool new tech will save it.

Measuring Scrum team productivity/speed with Function Point Analysis

I bought my first copy of Agile Software Development with Scrum, by Schwaber and Beedle, back around 2002, I think. I was just thumbing through it last night when I saw that they use Function Points as a metric to demonstrate the velocity that agile software teams achieve, and more specifically, they use them to show that some teams develop software much faster using Scrum.

I didn’t know about Function Point Analysis back in 2002 — I didn’t become a Certified Function Point Specialist until about two years later — so I probably just skimmed over that line then, but when I saw it last night I thought it was cool that they used function points as a metric for software team development speed.
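
If you haven't seen Function Point Analysis before, here's a rough sketch of how it can be turned into a velocity number. The weights below are the standard IFPUG values for average-complexity components, but the component counts and the sprint length are numbers I made up for illustration.

    // Illustration only: unadjusted function points using the IFPUG
    // average-complexity weights, then velocity as FP delivered per sprint.
    // The component counts below are invented.
    object FunctionPointVelocity extends App {

      // IFPUG average-complexity weight per component type
      val weights = Map(
        "external inputs"          -> 4,
        "external outputs"         -> 5,
        "external inquiries"       -> 4,
        "internal logical files"   -> 10,
        "external interface files" -> 7
      )

      // Hypothetical counts for the functionality delivered in one sprint
      val delivered = Map(
        "external inputs"          -> 6,
        "external outputs"         -> 4,
        "external inquiries"       -> 3,
        "internal logical files"   -> 2,
        "external interface files" -> 1
      )

      val unadjustedFp = delivered.map { case (name, count) => count * weights(name) }.sum
      val sprints = 1
      println(s"unadjusted function points delivered: $unadjustedFp")
      println(s"velocity: ${unadjustedFp.toDouble / sprints} FP per sprint")
    }

Because function points count delivered functionality rather than lines of code, the same metric can be used to compare teams working in different languages and technologies, which is what makes it useful as a cross-team measure of speed.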

The 90/90 Rule of software development

The 90/90 Rule: “The first 90% of the code accounts for the first 90% of the development time. The remaining 10% of the code accounts for the other 90% of the development time.”

~ Tom Cargill

The place to fight design wars is in new markets

“When Yahoo bought Viaweb, they asked me what I wanted to do. I had never liked the business side very much, and said that I just wanted to hack. When I got to Yahoo, I found that what hacking meant to them was implementing software, not designing it. Programmers were seen as technicians who translated the visions (if that is the word) of product managers into code.

This seems to be the default plan in big companies. They do it because it decreases the standard deviation of the outcome. Only a small percentage of hackers can actually design software, and it’s hard for the people running a company to pick these out. So instead of entrusting the future of the software to one brilliant hacker, most companies set things up so that it is designed by committee, and the hackers merely implement the design.

If you want to make money at some point, remember this, because this is one of the reasons startups win. Big companies want to decrease the standard deviation of design outcomes because they want to avoid disasters. But when you damp oscillations, you lose the high points as well as the low. This is not a problem for big companies, because they don’t win by making great products. Big companies win by sucking less than other big companies.

So if you can figure out a way to get in a design war with a company big enough that its software is designed by product managers, they’ll never be able to keep up with you ... The place to fight design wars is in new markets, where no one has yet managed to establish any fortifications. That's where you can win big by taking the bold approach to design, and having the same people both design and implement the product.”

~ I hope to write more about this at some point, but for now this is a long quote from a Paul Graham blog post titled Hackers and Painters.

Software development process standard operating procedures

Quite a long time ago I was working on a large software development project, and I wasn’t happy with either the quality or the velocity of our programming effort. So one night I sat down and tried to work out an activity diagram to show what our software development process needed to be to improve both speed and quality. It turns out that a lot of this is just common sense, but for one reason or another team members would try to circumvent the process, which always led to more pain for everyone involved.

Kent Beck’s Four Rules of Software Design (also known as “Simple Design”)

For the first time in many years, I just came across Kent Beck’s Four Rules of Software Design:

  1. Passes the tests
  2. Reveals intention (should be easy to understand)
  3. No duplication (DRY)
  4. Fewest elements (remove anything that doesn’t serve the three previous rules)

There are wording variations on those rules, but I got those specific words from this Martin Fowler post. As he notes, “The rules are in priority order, so ‘passes the tests’ takes priority over ‘reveals intention.’”

For more information on Kent Beck’s Four Rules of Software Design, see that link, or this link to the original rules on c2.com.
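
As a small illustration of my own (not an example from Beck or Fowler), here's what rules 2 through 4 can look like in practice: the first version below hides its intent behind magic numbers and duplicates the discount arithmetic; the second passes the same tests, reveals its intention through naming, removes the duplication, and adds nothing beyond what those rules require.

    // My own small illustration of the four rules, not from Beck or Fowler.
    object SimpleDesignExample extends App {

      // Before: intent hidden in magic numbers, discount arithmetic duplicated.
      def priceBefore(amount: Double, isMember: Boolean): Double =
        if (isMember) amount - amount * 0.1 else amount - amount * 0.0

      // After: same behavior, intention revealed, duplication removed,
      // and no elements beyond what the first three rules need.
      val MemberDiscountRate = 0.1

      def discountFor(isMember: Boolean): Double =
        if (isMember) MemberDiscountRate else 0.0

      def price(amount: Double, isMember: Boolean): Double =
        amount * (1 - discountFor(isMember))

      // Rule 1: passes the tests. Both versions agree on these cases.
      def close(a: Double, b: Double): Boolean = math.abs(a - b) < 1e-9
      assert(close(price(100.0, isMember = true),  priceBefore(100.0, isMember = true)))
      assert(close(price(100.0, isMember = false), priceBefore(100.0, isMember = false)))
      println("tests pass")
    }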