WCRE Trip Report (November 11, 2005)

It is fun hanging out with people who toss around "3 million lines of code" like it is nothing special. For the crowd at WCRE, such large programs are the norm. These folks work with big programs, spewed out by bright-eyed coders of the distant past, that have now grown too useful to be thrown away. Dealing with these heaping piles of often-mediocre code has given the attendees a gritty, realistic outlook. And when they actually succeed with these programs, say by effectively reclustering the modules or translating the entire code base to a newer language, they deserve to boast a little about the accomplishment.

WCRE has the warm, pleasant feel of small conferences. There are few enough people present that you get to talk with all of them, and there are few enough papers accepted that you can attend all of the presentations. The conference itself was run admirably. Everything was close and convenient, there was free wireless Internet everywhere, the food was great, and the coffee and tea never stopped flowing.

This year the conference was co-located with WICSA, an architecture conference. The integration was quite smooth--attendees of either conference could attend the vast majority of the other one as well. The WICSA crowd also made me jealous with their nice wiki.

All conferences should have such a wiki nowadays, so WICSA is ahead of the game. A wiki is a great place for all kinds of things attendees might want to share with each other. Just a few examples are:

  • continuations of discussions started after the talks or over drinks
  • links to web sites that came up in conversation
  • practical local information, like good places to eat

As an example of the last item, for the first few days I kept hearing non-locals asking where to get prepaid phone cards. Apparently the in-hotel convenience store had run out. It would have been great if folks could have posted the locations they found on a wiki so that other attendees would know where to look.

If WCRE follows WICSA's example and sets up a wiki next year, I suggest they not follow its lead in password management. It took me a couple of days to be able to log into the WICSA wiki. I did not previously have an account, and I had to track down someone offline who had the authority to create one for me. Just make it a wiki, already, and leave it open to editing. Either that, or have a single, shared password that is posted prominently around the conference. The security of restricting access to attendees only was more hassle than it was worth. Besides, it is an open conference, is it not? "Outsiders" participating on the wiki should only enrich the conversation.

On the technical material, I enjoyed the paper by Kuhn, et al., on performing cluster analysis using the text of identifiers and comments. Human language is content, too, but most reverse engineering tools ignore it because it has no formal meaning to a computer. Cluster analysis does not actually need to know the meaning of the terms, though: the analyzer can simply look for patterns in word usage.
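
To make the idea concrete, here is a minimal sketch of that style of analysis in Python. It is my own toy illustration, not the technique from the paper: it splits identifiers and comments into words, builds a word-frequency vector per file, and greedily groups files whose vectors look similar. The file names, contents, and similarity threshold are all made up.

    # Toy sketch: cluster source files by the words in their identifiers
    # and comments, ignoring program semantics entirely. Not the method
    # from the Kuhn et al. paper; everything here is illustrative.
    import math
    import re
    from collections import Counter

    def words(source_text):
        """Split identifiers and comments into lowercase words."""
        # break camelCase and snake_case into separate tokens
        tokens = re.findall(r"[A-Z]?[a-z]+|[A-Z]+(?![a-z])", source_text)
        return Counter(t.lower() for t in tokens)

    def cosine(a, b):
        """Cosine similarity between two word-frequency vectors."""
        shared = set(a) & set(b)
        dot = sum(a[w] * b[w] for w in shared)
        norm = math.sqrt(sum(v * v for v in a.values())) * \
               math.sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    def cluster(files, threshold=0.4):
        """Greedily group files whose word usage is similar enough."""
        vectors = {name: words(text) for name, text in files.items()}
        clusters = []
        for name, vec in vectors.items():
            for group in clusters:
                if cosine(vec, vectors[group[0]]) >= threshold:
                    group.append(name)
                    break
            else:
                clusters.append([name])
        return clusters

    if __name__ == "__main__":
        sources = {
            "invoice.c": "compute_invoice_total apply_tax_rate /* billing */",
            "billing.c": "send_invoice billing_period tax_exempt",
            "parser.c":  "parse_token next_token syntax_error",
        }
        # groups invoice.c with billing.c, leaves parser.c on its own
        print(cluster(sources))

On this made-up input, invoice.c and billing.c end up together because they share billing vocabulary, while parser.c stands alone. The real paper's analysis is considerably more sophisticated, but the underlying intuition is the same.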

Technical debt is a great term that James Highsmith tossed out during his Stevens Award Lecture. The idea is that before you can add new features or otherwise improve a program, you have to spend time coping with the shortcuts the programmers have taken in the past. These are the things I flag with "XXX" in my own code, and indeed, there tend to be embarrassing numbers of them floating around whenever I rush to make a demo.
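
For what it is worth, you can get a crude lower bound on this kind of debt just by counting the markers. The sketch below is my own illustration, assuming the shortcuts are flagged with "XXX" the way I flag mine; the marker string and file extensions are arbitrary.

    # Rough sketch: count "XXX" shortcut markers per file as a crude,
    # made-up proxy for technical debt. The marker convention and the
    # file extensions are assumptions, not anything from the lecture.
    import os

    def debt_markers(root, marker="XXX", extensions=(".c", ".h", ".py")):
        counts = {}
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                if name.endswith(extensions):
                    path = os.path.join(dirpath, name)
                    with open(path, errors="ignore") as f:
                        n = sum(line.count(marker) for line in f)
                    if n:
                        counts[path] = n
        return counts

    if __name__ == "__main__":
        # print the worst offenders first
        for path, n in sorted(debt_markers(".").items(), key=lambda kv: -kv[1]):
            print(f"{n:4d}  {path}")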

The concept of technical debt is fascinating to think about and leaves a lot of open questions. How do you distinguish technical debt from simply programming something in a simpler way? Once you figure out the distinction, is there any way to measure how much of it you have, and if so, how would you actually go about measuring it?

Mary Shaw received the other Stevens Award this year. She gave a fun talk about her long career of paying close attention to uncomfortable disconnects between theory and practice. She included many examples from her thinking over the years, among them hard-line pure functional programming and the (non-)use of proofs and formal specification to achieve correct software.

Finally, a couple of professional reengineers showed us what they do. Sneed talked about using metrics on an existing system to estimate the cost of reengineering it. He gets much more accurate estimates than are possible for a new system, because the existing code gives a reliable way to gauge the complexity of what is there and what will need changing. Akers talked about using automatic rewriting tools to reengineer. Full automation is a dead end (if the rewrite is fully automatic, then have you really rewritten it?). Instead, his approach is to have a team of engineers spend a few months developing rewrite rules that are customized to the particular program and the particular reengineering task.
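
To give a flavor of what a program-specific rewrite rule might look like (my own toy example, nothing like the industrial-strength tooling Akers described), here is a single regex-based rule that rewrites calls to a hypothetical deprecated allocator into calls to a checked replacement. Both function names are invented for illustration.

    # Toy source-to-source rewrite rule. The function names are
    # hypothetical; the rule rewrites alloc_buffer(N) calls into
    # alloc_buffer_checked(N, __FILE__, __LINE__) calls.
    import re

    RULE = re.compile(r"\balloc_buffer\(\s*([^)]*)\)")

    def rewrite(source):
        """Apply the single rewrite rule to a chunk of source text."""
        return RULE.sub(r"alloc_buffer_checked(\1, __FILE__, __LINE__)", source)

    if __name__ == "__main__":
        before = "buf = alloc_buffer(len + 1);"
        print(rewrite(before))
        # prints: buf = alloc_buffer_checked(len + 1, __FILE__, __LINE__);

A real reengineering effort would work on a proper parse tree and carry hundreds of such rules, but the shape is the same: match a pattern in the old code, emit the corresponding new code.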

Overall, it was a pleasant and invigorating conference.