October 2004 Archives

Writing an ISO Image Under Windows


Another task that turns out to be not too hard. I couldn't find any native support for writing ISO images under Windows XP, but there is a version of cdrecord for Windows. It requires Cygwin to be installed, but I always have that on any Windows desktop I use.

The -scanbus shows what devices are available:

$ cdrecord.exe -scanbus
Cdrecord-Clone 2.01 (i686-pc-cygwin) Copyright (C) 1995-2004 Jörg Schilling
Using libscg version 'schily-0.8'.
        0,0,0     0) 'ST340014' 'A               ' '3.10' Disk
        0,1,0     1) *
        0,2,0     2) *
        0,3,0     3) *
        0,4,0     4) *
        0,5,0     5) *
        0,6,0     6) *
        0,7,0     7) HOST ADAPTOR
        1,0,0   100) 'SAMSUNG ' 'CD-R/RW SW-248F ' 'R602' Removable CD-ROM
        1,1,0   101) *
        1,2,0   102) *
        1,3,0   103) *
        1,4,0   104) *
        1,5,0   105) *
        1,6,0   106) *
        1,7,0   107) HOST ADAPTOR

Since 'CD-R/RW SW-248F' is pretty obviously the CD drive, I know the unit to use is 1,0,0. Writing the image to disc is as simple as:

$ cdrecord.exe -dev=1,0,0 cd-image.iso
./cdrecord: No write mode specified.
./cdrecord: Asuming -tao mode.
./cdrecord: Future versions of cdrecord may have different drive dependent defaults.
./cdrecord: Continuing in 5 seconds...
Cdrecord-Clone 2.01 (i686-pc-cygwin) Copyright (C) 1995-2004 Jörg Schilling
scsidev: '1,0,0'
scsibus: 1 target: 0 lun: 0
Using libscg version 'schily-0.8'.
Device type    : Removable CD-ROM
Version        : 0
Response Format: 1
Vendor_info    : 'SAMSUNG '
[... etc ... ]

An RPM Database Fix


I had a remarkably trouble-free time upgrading from Red Hat 6.2 (don't ask; well, if you must know, it's a development server that wasn't broken, so it never got fixed) to Fedora Core 2. After the upgrade, Oracle and ColdFusion still worked. There was one niggle, though.

# rpm -qa
error: cannot open Packages database in /var/lib/rpm
no packages

In fact, any attempt to use rpm resulted in the same message. The strange thing was that the underlying database itself looked fine:

# /usr/lib/rpm/rpmdb_verify /var/lib/rpm/Packages
# echo $?
0

After some flailing around, I decided to see what was really happening under the hood. This is a habit I picked up from SnoopDOS on the Amiga; on Linux, the equivalent command is strace.

# strace rpm -qa 2>&1 | less
execve("/bin/rpm", ["rpm", "-qa"], [/* 24 vars */]) = 0
[... lots of splurging gobbledygook here ... ]
[... not having a lot to go on, I searched for instances of "rpm" ... ]
[... many lines later ...]
open("/etc/rpm/macros.db1.rpmorig", O_RDONLY|O_LARGEFILE) = 3
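In hindsight, strace can cut the gobbledygook down at the source: its -e trace= option restricts the trace to named system calls. A sketch of the narrower version of the same hunt (the grep pattern is just my guess at what's interesting):

```shell
# Trace only open() calls from the failing command, then pick out
# rpm-related paths instead of paging through everything.
strace -e trace=open rpm -qa 2>&1 | grep 'rpm'
```

That would have surfaced the macros.db1.rpmorig line in a handful of output lines rather than many screens.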

Ok, I don't know all that much about rpm but it should surely not be reading files with ".rpmorig" tagged onto the name. Feeling brave, I decided to try a fairly radical fix.

# cd /etc/rpm
# mkdir STUFF
# mv macros* STUFF/
# rpm -qa

... and it worked. It looks like the one line in macros.db1.rpmorig was the culprit:

%_dbapi         1

It seems that this was causing rpm to try accessing the database using an obsolete API. Oh well, now it's noted for future reference.
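A less drastic first step than moving the files wholesale would have been to find exactly which macro file sets _dbapi. This is a sketch using the usual rpm macro locations, which may differ on your system:

```shell
# Show every macro file that overrides the database API, with filenames.
grep -H '_dbapi' /etc/rpm/macros* /usr/lib/rpm/macros 2>/dev/null
```

Then only the offending file needs to be moved aside or edited.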

Trying Out Subversion


... that's the version control system, not anything political.

My first experience of file versioning was in college, where the main VMS system had it as a feature of the operating system. Older versions of a file would be preserved and could be retrieved fairly easily. Unfortunately, disk space was at a premium, so we would run the purge command fairly regularly.

Next came RCS, the granddaddy of open source version control systems. There were two commands: ci and co (check in and check out, respectively). It was file based and it was simple.

For a long time we used SCCS at work, because it was bundled with the commercial Unices of the day, Solaris in particular. sccs edit and sccs delget were the commands to use. Otherwise, it was essentially similar to RCS.

CVS was the next step up. Sure, you had to compile it yourself, but it was worth it. Two features were killers:

  • access to remote repositories
  • tags and branches

It helped that there was plentiful documentation, including the CVS book, and support in the popular IDEs, including Eclipse.

CVS has those killer features, but it is showing its age. The support for tags and branches is sometimes a kludge and sometimes worse. It's too easy to make a mistake when you try to merge code between branches. When you delete or rename a file, it's as if you are creating a new file from scratch. This is an important difference: Subversion versions not only file contents and file existence, but also directories, copies, and renames.

Subversion has been controversial. The somewhat notorious 2003 e-mail on its failings by Tom Lord gives one pause. Greg Hudson's reply helps to put it in context: if you are happy with Subversion's idea of snapshots of trees of files, it's a reasonable replacement for CVS.

More importantly, there's a Subversion book, the software has reached (and passed) version 1.0, and there's support in Eclipse.

I'm going to try it out for one of my projects here and see how it goes.

I Didn't Attend No Fluff Just Stuff 2004


I generally agree with this review of the NFJS conference, based on my experience in 2003. The speakers are good and the material is solid.

In 2004, I didn't see enough new material to make it seem worthwhile to go again. For example, I attended talks on Naked Objects and Mock Objects last year.

There are some other considerations.

  • My backlog of books I have bought but not yet read is getting too big.
  • This time I would have to travel further, from the city rather than a nearby suburb, leaving an invidious choice between staying at a hotel and dealing with rush-hour traffic.
  • Need to save money for the honeymoon in November :-)

We'll see, maybe I'll make it next year. Of course, NFJS is much cheaper than going to one of the mega-conferences on the coasts and probably better value because of the smaller number of attendees.

Software Disasters: the Human Factor


Canada's National Post has an article that won't surprise anyone who has worked on large projects: "people" challenges are often more difficult than technical ones.

It's just surprising that such basic mistakes are still being made:

  • inadequate support from executives
  • poor understanding of how the project fits in the business
  • large requirements document up front that the developers are expected to interpret without further communication
  • inadequate training
  • no attempt to sell (never mind tailor) the system to the people who will use it

Those are basic, basic mistakes. I don't think you even need experience to understand why they are mistakes. The conclusion of the article is on target: avoiding people problems is a core job of management.

"It becomes a major role of [management] to kind of herd the cats in and make them all line up in a reasonable way," said Barry Wilderman, an analyst at the Meta Group. "That's why this stuff is so hard."

Uhm, yeah. Understanding people and organizations can be hard and getting them to work together smoothly is even harder.

However, I don't believe developers can afford to isolate themselves in a technical cocoon, so one claim doesn't cut it with me: "Developers are least qualified to validate a business requirement. They're either nerds and don't get it, or they're people in another culture altogether. Listen, no outsider gets it." I'm not saying that every developer needs to be schmoozing with users, but every developer needs to be aware. The relationship between the developers and the people who will use the software matters. If there are problems in that relationship, that's a huge risk the developers need to address.

The Photographer Comes Through


For anyone who doesn't want to click through all the links from photos.evidencetech.com to get there, here's a direct link. I've only had a chance to take a quick look, but I like what I see.