Mar 03 2013

This isn’t the first time I’ve commented about user interfaces.  This weekend has been particularly wet, and while I did have plans to get out of the house for a few hours, I’ve spent the weekend indoors.

This meant time to go tinker with a few projects, amongst those being my desktop configuration.  In particular, key bindings and panel placement.

My desktop is largely based on what I was running when I was at university.  At this time, I was mostly limping around with aging Pentium II class laptops with display resolutions of 800×600 pixels.  No space for big spacious panels here.  I was running KDE at the time.

I frequently had a need to move windows around without the use of an external mouse.  The little writing boards that the university provides in its lecture theatres are barely big enough to accommodate a writing pad, let alone a laptop!  The laptops I had generally featured the small track-point type cursor, which, while usable, wasn’t great.

That, and small screen real-estate dictated a desktop which was space efficient and mostly keyboard driven.

Now, I could have gone the route of tiling window managers.  I recall trying out Ratpoison at one point, decided it wasn’t for me, and let it go.  I do like a little eye candy!  KDE 3.5 at the time provided a good balance, and wasn’t too bad on these machines.  Crucially, I was able to set up a couple of small panels around the sides of the screen, and set up key bindings to perform most operations.  Some of the common operations I needed were:

Key binding                  Action
Logo + Shift + M             Move a window
Logo + Shift + R             Resize a window
Logo + Shift + S             Shade (roll up) a window
Logo + Shift + X             Maximise a window
Logo + Shift + C             Close a window
Logo + Shift + I             Iconify (minimise) a window
Logo + Menu (now broken)     Bring up launcher menu

You’ll note here I use the term Logo to refer to the additional modifier on some keyboards. Apple users will know this as the Command key. Microsoft users will of course call this the Windows key. On my Yeeloong, the key looks like a house in a white circle. FVWM calls it Mod4. I call it the Logo key because it usually has some sort of OS-vendor-specific logo associated with it.

I use this, rather than Control and Alternate, since most applications bind their operations to those keys, and ignore the Logo key. By requiring the Logo key to be used in command sequences, it leaves Control and Alternate free for applications. Control and Alternate, of course, still get used in combination with Logo to extend the command set. The key bindings in bold get used the most.
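For the curious, those bindings translate almost directly into FVWM’s configuration syntax. This is a sketch rather than my exact config: FVWM calls the Logo key Mod4 (modifier “4”), context “A” means the binding applies anywhere, and “RootMenu” is a placeholder menu name.

```
# Window operations on Logo (Mod4) + Shift; "A" = any context, "4S" = Mod4+Shift
Key M A 4S Move
Key R A 4S Resize
Key S A 4S WindowShade
Key X A 4S Maximize
Key C A 4S Close
Key I A 4S Iconify
# Launcher menu on Logo + Menu ("RootMenu" is a placeholder menu name)
Key Menu A 4 Menu RootMenu
```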

One keystroke did change; originally I had Logo + Menu as the key sequence to bring up the launcher menu. I found KDE 4 broke this: I could no longer assign the Menu key (the one that brings up context menus in Windows) to this action. I haven’t really settled on a suitable substitute, and of course, my current Apple MacBook does not have this key, so I use Logo+Launch instead (the key that shows the Dashboard in Mac OS X). A little more of a stretch, but it works.

The layout I settled on was to have a menu and task bar up the top of the screen, then status notification and pager down the right-hand side.  Each panel occupied no more than 16 pixels.  I made extensive use of virtual desktops, configuring 12 virtual desktops and assigning these to each of the function keys. So Logo + F5 meant go to desktop 5; Logo + Shift + F5 meant move the window to desktop 5.
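In FVWM, where I later rebuilt this arrangement, each such binding is a one-liner. A sketch for desktop 5 (bearing in mind FVWM numbers desks from zero):

```
# Logo+F5 jumps to "desktop 5" (desk 4 internally);
# Logo+Shift+F5 sends the focused window there instead
Key F5 A 4  GotoDesk   0 4
Key F5 A 4S MoveToDesk 0 4
```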

Thus I could juggle the laptop in my hands, launching applications, moving them between desktops, figuring out where I needed to be next and getting things ready on my way between lectures and tutorials.

Over time, KDE became unsuitable as a full blown desktop due to its memory footprint. I found myself looking around and for a while, I was running FVWM. After a bit of research I figured out how to set up the bindings in much the same way. I managed to get FVWM’s BarButtons to emulate the side panel, but have the panel just lurk in the background and be recalled when necessary. And of course, the root menu and icons shown on the desktop mostly removed the need for a task bar.

This worked well, until I got the MacBook. The MacBook’s keyboard does have function keys, but requires you to hold the Fn key to access them. So the “Move to desktop 3” operation which I do so commonly, became Logo+Shift+Fn+F3. A real finger-twister. I’ve just put up with it until now.

What’s replaced it? Well I’ve done some re-working. Icons on the desktop for iconified windows are all well and good, but you’ve got to be able to get at them. That, and getting at the menu with the mouse in one hand proved to be a hassle, so for those tasks, the task-bar is back with its launcher button.

The BarButtons has been given a title bar, and the EWMH working area has been set so that applications do not cover the task bar, or the BarButtons title bar, allowing those to be accessed with the mouse alone. The BarButtons can be raised or lowered with a new key sequence, Logo+Z.
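That new sequence is a single conditional binding; a sketch, assuming the FvwmButtons instance is actually named BarButtons:

```
# Logo+Z toggles the BarButtons panel above/below other windows
Key Z A 4 Next ("BarButtons") RaiseLower
```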

The application switching was always a chore. This is probably the one exception to the Logo-for-window-manager-commands rule; I used the very prevalent Alternate+Tab to switch applications. FVWM does this out of the box, unsurprisingly. With a lot of applications open though, I found going Alt+Tab, Tab, Tab, Tab just a little annoying.

Now, Logo+A brings up the window list, which stays there until a choice is made. The windows are numbered with a digit, and one can use arrow keys to select. Much better.
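This is just FVWM’s built-in WindowList bound to a key; a minimal sketch (NoGeometry keeps the entries short, and the menu stays posted until you pick a window or dismiss it):

```
# Logo+A posts the window list as a menu with per-entry hotkeys
Key A A 4 WindowList NoGeometry
```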

As for the virtual desktops: FVWM has desktops and pages. What I was using as “desktops” before were just pages of a single desktop. I decided that once again we’d do proper virtual desktops, but only 4 of them, with 4 pages each. That’s 16 “spaces” in total. But how to switch between them? With new key bindings:

Key binding                  Action
Logo + 1                     Jump to Desktop 1
Logo + 2                     Jump to Desktop 2
Logo + 3                     Jump to Desktop 3
Logo + 4                     Jump to Desktop 4
Logo + Q                     Jump to Page 1
Logo + W                     Jump to Page 2
Logo + E                     Jump to Page 3
Logo + R                     Jump to Page 4
Logo + T                     Jump to last page
Logo + Backtick              Jump to last desktop
Logo + Tab                   Jump to last desktop and page
Logo + Shift + 1             Move window to Desktop 1
Logo + Shift + 2             Move window to Desktop 2
Logo + Shift + 3             Move window to Desktop 3
Logo + Shift + 4             Move window to Desktop 4
Logo + Shift + Q             Move window to Page 1
Logo + Shift + W             Move window to Page 2
Logo + Shift + E             Move window to Page 3
Logo + Shift + R             Move window to Page 4
Logo + Shift + T             Move window to last page
Logo + Shift + Backtick      Move window to last desktop
Logo + Shift + Tab           Move window to last desktop and page
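Expressed in FVWM configuration, the scheme above boils down to GotoDesk/GotoPage and their Move counterparts. A sketch, assuming DeskTopSize 4x1 (so the pages sit in one row) and remembering that FVWM numbers desks and pages from zero:

```
DeskTopSize 4x1            # four pages per desk, side by side

# Jump to Desktops 1-4 (desks 0-3 internally)
Key 1 A 4 GotoDesk 0 0
Key 2 A 4 GotoDesk 0 1
Key 3 A 4 GotoDesk 0 2
Key 4 A 4 GotoDesk 0 3

# Jump to Pages 1-4 on the current desk, or back to the last page
Key Q A 4 GotoPage 0 0
Key W A 4 GotoPage 1 0
Key E A 4 GotoPage 2 0
Key R A 4 GotoPage 3 0
Key T A 4 GotoPage prev

# Add Shift for the Move variants, e.g.:
Key 1 A 4S MoveToDesk 0 0
Key Q A 4S MoveToPage 0 0
```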

Then there’s the key binding Logo + Escape, which brings up a Jump To menu, where I can just press one of the numeric keys 1-4, which pops up a menu allowing me to select the page 1-4. So to jump to Desktop 4 Page 2, I just hit Logo+Escape, 4, 2. I’m still working on the move bit; while FVWM’s GotoDeskAndPage is documented and works, I’m having trouble with MoveToDeskAndPage, which seems to be undocumented.
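Until I work that out, one workaround is to chain the two documented move commands inside a user-defined function; the function name and parameter layout here are my own invention:

```
# MoveToDeskPage <desk> <pagex> <pagey>: poor man's MoveToDeskAndPage
DestroyFunc MoveToDeskPage
AddToFunc   MoveToDeskPage
+ I MoveToDesk 0 $0        # absolute move to desk $0
+ I MoveToPage $1 $2       # then to page ($1, $2) on that desk
```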

We shall see on Monday how the new arrangement goes.

Accessing applications I think will be the next point to work on. I’m thinking about how to code a suitable launcher that can be summoned by FVWM. Common launchers such as the Windows Start menu, and its replacement, the Start Screen in Windows 8, have elements that are good. The Program Manager was limited by its MDI interface, but the program groups meant there was a hierarchy that the Windows 8 Start Screen, and indeed other contemporary launchers, lack.

The FVWM launcher isn’t bad — but it makes it hard to re-arrange items; there the old Program Manager really does shine. Want a new program group? Choose File, New, select Program Group, click OK, done. Want to add an icon to that group? Similar process. Adding and managing applications really does need to be that simple, so the end user can put things where they want them.

That, and keyboard accessibility is a must. FVWM has a little bug which is evident in this screenshot…

FVWM 2.6.5 as I have it now.


If you look at the menu, you’ll notice two things:

  1. There are multiple menu items with the same access key
  2. Where a menu item contains an ampersand, the space following is treated as the access key.

That’s a minor bug with FVWM’s AutomaticHotKeys. Fixable for sure. That particular menu is generated by a Perl script distributed with FVWM; I have a patch that tweaks it to put ampersands in appropriate places to give all the items access key assignments, but I think there are better ways.

In the meantime, for those who are interested, my FVWM configuration is here. This will expand into the .fvwm directory.

Feb 03 2013

Warning: this is a long post written over some hours. It is a brain dump of my thoughts regarding user interfaces and IT in general.

Technology is a funny beast. Largely because people so often get confused about what “technology” really is. Wind back the clock a few hundred millennia, and the concept of stone tools was all the rage. Then came metallurgy, mechanisation and industrialisation; each successive era brought a new wave of “technology”.

Today though, apparently it’s only these electronic gadgets that need apply. Well, perhaps a bit unfair, but the way some behave, you could be forgiven for thinking this.

What is amusing though, is when some innovation gets dreamt up, becomes widespread (or perhaps not), then gets forgotten about, and re-discovered. Nowhere have I noticed this more than in the field of user interfaces.

My introduction to computing

Now I’ll admit I’m hardly an old hand in the computing world. Not by a long shot. My days of computing go back no earlier than about the late 80’s. My father, working for Telecom Australia (as they were then known), brought home a laptop computer.

A “luggable” by today’s standards, it had no battery and required 240V AC, with a smallish monochrome plasma display and CGA graphics. The machine sported an Intel 80286 with an 80287 maths co-processor, maybe 2MB RAM tops, maybe a 10MB HDD and a 3.5″ floppy drive. The machine was about 4 inches high when folded up.

It of course, ran the DOS operating system. Not sure what version, maybe MS-DOS 5. My computing life began with simple games: you launched them by booting the machine up, sticking a floppy disk (a device you now only ever see in pictorial form next to the word “Save”) into the drive, firing up X-Tree Gold, and hunting down the actual .exe file to launch stuff.

Later on I think we did end up setting up QuikMenu but for a while, that’s how it was. I seem to recall at one point my father bringing home something of a true laptop, something that had a monochrome LCD screen and an internal battery. A 386 of some sort, but too little RAM to run those shiny panes of glass from Redmond.


I didn’t get to see this magical “Windows” until about 1992 or ’93 or so, when my father brought home a brand-new desktop. An Intel 486DX running at 33MHz, 8MB RAM, something like a 150MB HDD, and a new luxury, a colour VGA monitor. It also had Windows 3.1.

So, as many may have gathered, I’ve barely known computers without a command line. Through my primary school years I moved from just knowing the basics of DOS, to knowing how to maintain the old CONFIG.SYS and AUTOEXEC.BAT boot scripts, dealing with WIN.INI, and fiddling around with the PIF editor to get cantankerous DOS applications working.

Eventually I graduated to QBasic and learning to write software. Initially with only the commands PRINT and PLAY, baby steps that just spewed rubbish on the screen and made lots of noise with the PC speaker, but it was a start. I eventually learned how to make it do useful things, and even dabbled with other variants like CA Realizer BASIC.

My IT understanding entirely revolved around DOS however and the IBM PC clone. I did from time to time get to look at Apple’s offerings, at school there was the odd Apple computer, I think one Macintosh, and a few Apple IIs. With the exception of old-world MacOS, I had not experienced a desktop computer OS lacking a command line.

About this sort of time frame, a second computer appeared. This new one was an AMD Am486DX4 100MHz with I think 16MB or 32MB RAM, can’t recall exactly (it was 64MB and an Am5x86 133MHz by the time the box officially retired). It ran a similar, but different OS: Windows NT Workstation 3.1.

At this point we had a decent little network set up with the two machines connected via RG58 BNC-terminated coax. My box moved to Windows for Workgroups 3.11, and we soon had network file sharing and basic messaging (Chat and WinPopup).

Windows 95

Mid 1996, and I graduated to a new computer. This one was a Pentium 133MHz, 16MB RAM, 1GB HDD, and it ran the latest consumer OS of the day, Windows 95. Well, throw out everything I knew about the Program Manager. It took me a good month or more to figure out how to make icons on the desktop to launch DOS applications without resorting to the Start → Run → Browse dance.

After much rummaging through the Help, looking at various tutorials, I stumbled across it quite by accident — the right-click on the desktop, and noticing a menu item called “New”, with a curious item called “Shortcut”.

I found out some time later that yes, Windows 95 did in fact have a Program Manager, although the way Windows 95 renders minimised MDI windows meant it didn’t have the old feel of the earlier user interface. I also later found out how to actually get at that Start menu, and re-arrange it to my liking.

My father’s box had seen a few changes too. His box moved from NT 3.1, to 3.5, to 3.51 and eventually 4.0, before the box got set aside and replaced by a dual Pentium PRO 200MHz with 64MB RAM.


It wasn’t until my father was going to university for a post-grad IT degree that I got introduced to Linux. In particular, Red Hat Linux 4.0. Now if people think Ubuntu is hard to use, my goodness, you’re in for a shock.

This got tossed onto the old 486DX4/100MHz box, where I first came to experiment.

Want a GUI? Well, after configuring XFree86, type ‘startx’ at the prompt. I toiled with a few distributions; we had these compilation sets which came with Red Hat, Slackware and Debian (potato I think). The first thing I noticed was the desktop: it sorta looked like Windows 95.

The window borders were different, but I instantly recognised the “Start” button. It was FVWM2 with the FVWMTaskBar module. My immediate reaction was “Hey, they copied that!”, but then it was pointed out to me that this desktop environment predated the early “Chicago” releases by at least a year.

The machines at the uni were slightly different again, these ones did have more Win95-ish borders on them. FVWM95.

What attracted me to this OS initially was the games. Not the modern first person shooters, but games like Xbilliard, Xpool, hextris, games that you just didn’t see on DOS. Even then, I discovered there were sometimes ports of the old favourites like DOOM.

The OS dance

The years that followed were, for me, an oscillation between Windows 3.1/PC DOS 7, Windows 95, Slackware Linux, Red Hat Linux, and a little later on Mandrake Linux, Caldera OpenLinux, SuSE Linux, SCO OpenServer 5, and even OS/2.

Our choice was mainly versions of SuSE or Red Hat, as the computer retailer near us sold boxes of them. At the time our Internet connection was via a beloved 28.8kbps dial-up modem link with a charge of about $2/hr. So downloading distributions was simply out of the question.

During this time I became a lot more proficient with Linux, in particular when I used Slackware. I experimented with many different window managers including: twm, ctwm, fvwm, fvwm2, fvwm95, mwm, olvwm, pmwm (SCO’s WM), KDE 1.0 (as distributed with SuSE 5.3), Gnome + Enlightenment (as distributed with Red Hat 6.0), qvwm, WindowMaker, AfterStep.

I got familiar with the long-winded xconfigurator tool, and even got good at making an educated guess at modelines when I couldn’t find the specs in the monitor documentation. In the early days it was also not just necessary to know what video card you had, but also what precise RAMDAC chip it had!

Over time I settled on KDE as the desktop of choice under Linux. KDE 1.0 had a lot of flexibility and ease of use that many of its contemporaries lacked. Gnome+Enlightenment looked alright at first, but then the inability to change how the desktop looked without making your own themes bothered me; the point-and-click of KDE’s control panel just suited me, as it was what I was used to in Windows 3.1 and 95. Not having to fuss around with the .fvwm2rc (or equivalent) was a nice change too. Even adding menu items to the K menu was easy.

One thing I had grown used to on Linux was how applications install themselves in the menu in a logical manner. Games got stashed under Games, utilities under Utilities, internet stuff under Internet, office productivity tools under Office. Every Linux install I had, the menu was neatly organised. Even the out-of-the-box FVWM configuration had some logical structure to it.

As a result, whenever I did use Windows on my desktop, a good amount of time was spent re-arranging the Start menu to make the menu more logical. Many a time I’d open the Start menu on someone else’s computer, and it’d just spew its guts out right across the screen, because every application thinks itself deserving of a top-level place below “Programs”.

This was a hang-over from the days of Windows 3.1. The MDI-style interface that was Program Manager couldn’t manage anything other than program groups as the top level, with program items below that. Add to this a misguided belief that their product was more important than anyone else’s, and application vendors got used to this and just repeated the status quo when Windows 95/NT4 turned up.

This was made worse if someone installed Internet Explorer 4.0. It invaded like a cancer. Okay now your screenful of Start menu didn’t spew out across the screen, it just crammed itself into a single column with tiny little arrows on the top and bottom to scroll past the program groups one by one.

Windows 95 Rev C even came with IE4; however, there was one trick. If you left the Windows 95 CD in the drive on the first boot, it’d pop up a box telling you that the install was not done; if you clicked that away, IE4 Setup would do its damage. Eject the CD first, and you were left with pristine Windows 95. Then when IE5 came around, it could safely be installed without it infecting everything.

Windows 2000

I never got hold of Windows 98 on my desktop, but at some point towards the very end of last century, I got my hands on a copy of Windows 2000 Release Candidate 2. My desktop was still a Pentium 133MHz, although it had 64MB RAM now and a few more GB of disk space.

I loaded it on, and it surprised me just how quick the machine seemed to run. It felt faster than Windows 95. That said, it wasn’t all smooth sailing. Where was Network Neighbourhood? That was a nice feature of Windows 95. Ohh no, we have this thing called “My Network Places” now. I figured out how to kludge my own with the quick-launch, but it wasn’t as nice since applications’ file dialogues knew nothing of it.

The other shift was that Internet Explorer still lurked below the surface, and unlike Windows 95, there was no getting rid of it. My time using Linux had made me a Netscape user, and so for me it was unnecessary bloat. Windows 2000 did similar Start Menu tricks, including “hiding” applications that it thought I didn’t use very often.

If it’s one thing that irritates me, it’s a computer hiding something from me arbitrarily.

Despite this, it didn’t take me as long to adapt as the move from Windows 3.1 to 95 had, as the UI was still much the same. A few tweaks here and there.

In late 2001, my dinky old Pentium box got replaced. In fact, we replaced both our desktops. Two new dual Pentium III 1GHz boxes, 512MB RAM. My father’s was the first, with an nVidia Riva TNT2 32MB video card and a CD-ROM drive. Mine came in December as a Christmas/18th birthday present, with an ATI Radeon 7000 64MB video card and a DVD drive.

I was to run the old version of Windows NT 4.0 we had. Fun ensued with Windows NT not knowing anything about >8GB HDDs, but Service Pack 6a sorted that out, and the machine ran. I got Linux on there as well (SuSE initially) and apart from the need to distribute ~20GB of data between many FAT16 partitions of about 2GB each (Linux at this time couldn’t write NTFS), it worked. I had drive letters A through to O all occupied.

We celebrated by watching a DVD for the first time (it was our first DVD player in the house).

NT 4 wasn’t too bad to use, it was more like Windows 95, and I quickly settled into it. That said, its tenure was short lived. The moment anything went wrong with the installation, I found I was right back to square one, as the Emergency Repair Disk did not recognise the 40GB HDD. I rustled up that old copy of Windows 2000 RC2 and found it worked okay, but it wouldn’t accept the Windows 2000 drivers for the video card. So I nicked my father’s copy of Windows 2000 and ran that for a little while.

Windows XP was newly released, and so I did enquire about a student-license upgrade, but being a high-school student, Microsoft’s resellers would have none of that. Eventually we bought a new “Linux box” with an OEM copy of Windows 2000 with SP2, and I used that. All legal again, and everything worked.

At this point, I was dual-booting Linux and Windows 2000. Just before the move to ADSL, I had about 800 hours to use up (our dial-up account was one that accumulated unused hours) and so I went on a big download-spree. Slackware 8.0 was one of the downloaded ISOs, and so out went SuSE (which I was running at the time) and in went Slackware.

Suddenly I felt right at home. Things had changed a little, and I even had KDE there, but I felt more in control of my computer than I had in a long while. In addition to using an OS that just lets you do your thing, I had also returned from my OS travels, having gained an understanding of how this stuff works.

I came to realise that point-and-click UIs are fine when they work, hell when they don’t. When they work, any dummy can use them. When they don’t, they cry for the non-dummies to come and sort them out. Sometimes we can, sometimes it’s just reload, wash-rinse-repeat.

Never was this brought home more than when we got hold of a copy of Red Hat 8.0. I tried it for a while, but was immediately confronted by the inability to play the MP3s that I had acquired (mostly via the sneakernet). Ogg/Vorbis was in its infancy, and I noticed that at the time there didn’t seem to be any song metadata such as what ID3 tags provided, or at least XMMS didn’t show it.

A bit of time back on Slackware had taught me how to download sources, read the INSTALL file and compile things myself. So I just did what I always did. Over time I ran afoul of the Red Hat Package Manager, and found myself going around in circles with rpm, solving dependency hell.

On top of this, there was now the need to man-handle the configuration tools that expected things the way the distribution packagers intended them.

Urgh, back to Slackware I go. I downloaded Slackware 9.0 and stayed with that a while. Eventually I really did go my own way with Linux From Scratch, which was good, but a chore.

These days I use Gentoo, and while I do have my fights with Portage (ohh slot-conflict, how I love you!!!), it does usually let me do what I want.

A time for experimentation and learning

During this time I was mostly a pure KDE user. KDE 2.0, then 3.0. I was learning all sorts of tricks reading HOWTO guides on the Linux Documentation Project. I knew absolutely no one around me that used Linux, in fact on Linux matters, I was the local “expert”. Where my peers (in 2002) might have seen it once or twice, I had been using it since 1996.

I had acquired some more computers by this time, and I was experimenting: setting up dial-up routers with proxy servers (Squid, then IP Masquerade), turning my old 386 into a dumb terminal with XDMCP, and getting interoperation with the SCO OpenServer box (our old 486DX4/100MHz).

The ability for the Windows boxes to play along steadily improved over this time, from ethernet frames that passed like ships in the night (circa 1996; NetBEUI and IPX/SPX on the Windows 3.1 box, TCP/IP on Linux) through to begrudging communications over TCP/IP with newer releases of Windows.

Andrew Tridgell’s SAMBA package made its debut in my experimentation, and suddenly Windows actually started to talk sensible things to the Linux boxes and vice versa.

Over time the ability for Linux machines and Windows boxes to interoperate has improved with each year improving on the next layer in the OSI model. I recall some time in 1998 getting hold of an office suite called ApplixWare, but in general when I wanted word processing I turned to Netscape Composer and Xpaint as my nearest equivalent.

It wasn’t until 2000 or so that I got hold of StarOffice, and finally had an office suite that could work on Windows, Linux and OS/2 that was comparable to what I was using at school (Microsoft Office 97).

In 2002 I acquired an old Pentium 120MHz laptop, and promptly loaded that with Slackware 8 and OpenOffice 1.0. KDE 3.0 chugged with 48MB RAM, but one thing the machine did well was suspend and resume. A little while later we discovered eBay and upgraded to a second-hand Pentium II 266MHz, a machine that served me well into the following year.

For high-school work, this machine was fine. OpenOffice served the task well, and I was quite proficient at using Linux and KDE. I even was a trend-setter… listening to MP3s on the 15GB HDD a good year before the invention of the iPod.

Up to this point, it is worth mentioning that in the Microsoft world, the UI hadn’t changed all that much in the time between Windows 95 and 2000/ME. Network Neighbourhood was probably the thing I noticed the most. At this time I was unusual amongst my peers in that I had more than one computer at home, and they all talked to each other. Hence Windows 95/98 through to 2000/ME didn’t create such an uproar.

What people DID notice was how poorly Windows ME (and the first release of 98) performed “under the hood”. More so for the latter than the former.

Windows XP

Of course I did mention trying to get hold of Windows XP earlier. It wasn’t until later in 2002, when the school was replacing a lab of old AMD K6 machines with brand new boxes (and their old Windows NT 4 server with a Novell one), that I got to see Windows XP up close.

The boxes were to run Windows 2000 actually, but IBM had just sent us the boxes preloaded with XP, and we were to set up Zenworks to image them with Windows 2000. This was during my school holidays and I was assisting in the transition. So I fired up a box and had a poke around.

Up came the initial set up wizard, with the music playing, animations, and a little question mark character which at first did its silly little dance telling you how it was there to help you. Okay, if I had never used Windows before, I’d be probably thankful this was there, but this just felt like a re-hash of that sodding paperclip. At least it did eventually shut up and sit in the corner where I could ignore it. That wasn’t the end of it though.

Set up the user account, logged in, and bam, another bubble telling me to take the tour. In fact, to this day that bubble is one of the most annoying things about Windows XP because 11 years on, on a new install it insists on bugging you for the next 3 log-ins as if you’ve never used the OS before!

The machines had reasonable 17″ CRT monitors, the first thing I noticed was just how much extra space was taken up with the nice rounded corners and the absence of the My Computer and My Network Places icons on the desktop. No, these were in the Start menu now. Where are all the applications? Hidden under All Programs of course.

Hidden and not even sorted in any logical order, so if you’ve got a lot of programs it will take you a while to find the one you want, and even longer to find it’s not actually installed.

I took a squiz at the control panel. Now the control panel hadn’t changed all that much from Windows 3.1 days. It was still basically a window full of icons in Windows 2000/ME, albeit since Windows 95, it now used Explorer to render itself rather than a dedicated application.

Do we follow the tradition so that old hands can successfully guide the novices? No, we’ll throw everyone in the dark with this Category View nonsense! Do the categories actually help the novices? Well for some tasks, maybe, but for anything advanced, most definitely not!

Look and feel? Well if you want to go back to the way Windows used to look, select Classic theme, and knock yourself out. Otherwise, you’ve got the choice of 3 different styles, all hard-coded. Of course somewhere you can get additional themes. Never did figure out where, but it’s Gnome 1.0-style visual inflexibility all over again unless you’re able to hack the theme files yourself.

No thank-you, I’ll stick with the less pixel-wasting Classic theme if you don’t mind!

Meanwhile in Open Source land

As we know, there was a long break between releases after Windows XP, and over the coming years we heard much hype about what was to become Vista. For years to come though, I’d be seeing Windows XP everywhere.

My university work horses (I had a few) all exclusively ran Linux however. If I needed Windows there was a plethora of boxes at uni to use, and most of the machines I had were incapable of running anything newer than Windows 2000.

I now was more proficient in front of a Linux machine than any version of Windows. During this time I was using KDE most of the time. Gnome 2.0 was released, I gave it a try, but, it didn’t really grab me. One day I recall accidentally breaking the KDE installation on my laptop. Needing a desktop, I just reached for whatever I had, and found XFCE3.

I ran XFCE for about a month or two. I don’t recall exactly what brought me back to KDE; perhaps the idea of a dock for launching applications didn’t grab me. AfterStep, after all, did something similar.

In 2003, one eBay purchase landed us with a Cobalt Qube2 clone, a Gateway Microserver. Experimenting with it, I managed to brick the (ancient) OS, and turned the thing into a lightish door stop. I had become accustomed to commands like `uname` which could tell me the CPU amongst other things.

I was used to seeing i386, i486, i586 and i686, but this thing came back with ‘mips’. What’s this? I did some research and found that there was an entire port. I also found some notes on bootstrapping a SGI Indy. Well this thing isn’t an Indy, but maybe the instructions have something going for them. I toiled, but didn’t get far…

Figuring it might be an idea to actually try these instructions on an Indy, we hit up eBay again, and after a few bids, we were the proud owners of a used SGI Indy R4600 133MHz with 256MB RAM running IRIX 6.5. I toiled a bit with IRIX, the 4DWM seemed okay to use, but certain parts of the OS were broken. Sound never worked, there was a port of Doom, but it’d run for about 10 seconds then die.

We managed to get some of the disc set for IRIX, but never did manage to get the Foundation discs needed to install. My research however, led me onto the Debian/MIPS port. I followed the instructions, installed it, and hey presto, Linux on a SGI box, and almost everything worked. VINO (the video capture interface) was amongst those things that didn’t at the time, but never mind. Sound was one of the things that did, and my goodness, does it sound good for a machine of that vintage!

Needless to say, the IRIX install was history. I still have copies of IRIX 6.5.30 stashed in my archives, lacking the foundation discs. The IRIX install didn’t last very long, so I can’t really give much of a critique of the UI. I didn’t have removable media so didn’t get to try the automounting feature. The shut down procedure was a nice touch, just tap the OFF button, the computer does the rest. The interface otherwise looked a bit like MWM. The machine however was infinitely more useful to me running Linux than it ever was under IRIX.

As I toiled with Debian/MIPS on the Indy, I discovered there was a port of this for the Qube2. Some downloads later and suddenly the useless doorstop was a useful server again.

Debian was a new experience for me; I quite liked APT. The version I installed was evidently the unstable release, so it had modern software. Liking this, I tried it on one of the other machines, and was met with Debian Stab^Hle. Urgh… at the time I didn’t know enough about the releases, and on my own desktop I was already running Linux From Scratch.

I was considering my own distribution that would target the Indy, amongst other systems. I was already formulating ideas, and at one point I had a mishmash of about three different distributions on my laptop.

Eventually I discovered Gentoo, along with its MIPS port. Okay, not as much freedom as LFS, but very close to it. In fact, it gives you the same freedom if you can be arsed to write your own portage tree. One by one the machines got moved over, and that’s what I’ve used ever since.

The primary desktop environment was KDE for most of them. Build times for this varied, but for most of my machines it was an overnight build, especially on the Indy. Once installed though, it worked quite well. It took a little longer to start on the older machines, but was still mostly workable.

Up to this point, I had my Linux desktop set up just the way I liked it. Over the years the placement of desktop widgets and panels has moved around as I borrowed ideas I had seen elsewhere. KDE was good in that it was flexible enough for me to fundamentally change many aspects of the interface.

My keybindings were set up so I could perform most window operations without the need for a mouse (useful when juggling the laptop in one’s hands whilst figuring out where the next lecture was), and the notification icons and virtual desktop pager were placed to the side for easy access. The launcher and task bar moved around.

Initially down the bottom, the task bar eventually wound up at the top of the screen, much like OS/2 Warp 4, as that’s also where the menu bar appears for applications: up the top of the window. Thus, minimal mouse movement. Even today, the Windows 7 desktop at work has the task bar up the top.

One thing that frustrated me with Windows at the time was the complete inability to change many aspects of the UI. Yes, you could move the task bar around and add panels to it, but if you wanted to use some other keystroke to close a window? Tough. ALT-F4 is it. Want to bring up the menu items? Hit the Logo key, or failing that, CTRL-ESC. Want to maximise a window? Either hit ALT-Space then use the arrows to hit Maximise, or start reaching for the rodent. A far cry from just pressing Logo-Shift-C or Logo-Shift-X.

Ohh, and virtual desktops? Well, people have implemented crude hacks to achieve something like it. In general, anything I’ve used has felt like a bolt-on addition rather than a seamless integration.

I recall commenting about this, and someone pointing out this funny thing called “standardisation”. Yet I seem to recall the P in PC standing for personal; i.e. this is my personal computer, no one else uses it, thus it should work the way I choose it to. Not the way some graphic designer in Redmond or Cupertino thinks it should!

The moment you talk about standardisation or pining for things like Group Policy objects, you’ve crossed the boundary that separates a personal computer from a workstation.

Windows Vista

Eventually, after much fanfare, Microsoft did cough up a new OS. And “cough up” would be about the right description for it. It was behind schedule, and as a result saw many cutbacks. The fanciful new WinFS? Gone. Palladium? Well, it’s probably a good thing that went, although I think I hear its echoes in Secure Boot.

What was delivered is widely considered today to have been a disaster. And it came at just the wrong time: just as the market for low-end “netbook” computers exploded, exactly the sort of machine that Windows Vista ran worst on.

Back in the day, Microsoft recommended 8MB RAM for Windows 95 (and I can assure you it will even run in 4MB), but no one in their right mind would tolerate the constant rattle from the paging disk. The same could be said for Windows NT’s requirement of 12MB RAM and a 486. Consumers soon learned that “Windows Vista Basic ready” was a warning label: steer clear, or insist on the machine coming with Windows XP.

A new security feature, UAC, ended up making more of a nuisance of itself than anything, provoking the knee-jerk reaction of shooting the messenger.

The new Aero interface wastes even more screen pixels than the “Luna” interface of Windows XP, and GPU cycles to boot. The only good thing about it was that the GPU did all the hard work of putting windows on the screen. It looked pretty when the CPU wasn’t overloaded, but otherwise the CPU had trouble keeping up and the whole effect was lost. Exactly what productivity gain one achieves by having a window do three somersaults before landing in the task bar on minimise is lost on me.

Windows Vista was the last release that could do the old Windows 95 style Start menu. The new Vista one was even more painful than the one in XP. The All Programs sub-menu opened out much like in previous editions (complete with the annoying “compress myself into a single column” behaviour); in Vista, this menu was now trapped inside a small scrolling window.

Most of Vista’s problems were below the surface. Admittedly, Service Pack 1 fixed a lot of them, but by then it was already too late. No one wanted to know.

Even with the service packs, it still didn’t perform up to par on the netbooks that were common for the period. The original netbook, for what it’s worth, was never intended to run any version of Windows; the entire concept came out of the One Laptop Per Child project, which was always going to be Linux based.

Asus’s EeePC was one of the early candidates for the OLPC project. When another design got selected, Asus simply beefed up the spec, loaded on Xandros and pushed it out the door. Later models came with Windows XP, and soon other manufacturers pitched in. This was a form factor with specs that ran Windows XP well; unfortunately, Vista’s Aero interface was too much for the integrated graphics typically installed, and the memory requirements had the disk drive rattling constantly, sapping the machine of valuable kilojoules when running from the battery.

As to my run-in with Vista? For my birthday one year I was given a new laptop. This machine came pre-loaded with it, and of course the very first thing I did was spend a good few hours making the recovery discs, then uninstalling every imaginable piece of crap that manufacturers insist on pre-bloating their machines with.

For what I did, I actually needed Linux to run. The applications I use and depend on for university work, whilst compatible with Windows, run as second-class citizens there due to their Unix heritage. Packages like The Gimp, gEDA, LaTeX and git, to name a few, never quite run as effortlessly on Windows. The Gimp had a noticeable lag when using my Wacom tablet, something that put my handwriting way off.

Linux ran on it, but with no support for the video card, GUI related tasks were quite choppy. In the end, it proved to be little use to me. My father at the time was struggling along with his now aging laptop using applications and hardware that did not support Windows Vista. I found a way to exorcise Windows Vista from the machine, putting Windows XP in its place.

The bloat becomes infectious

What was not lost on me was that each new iteration of full desktops like KDE brought in more dependencies. During my latter years at university, I bought myself a little netbook. I was doing work for Gentoo/MIPS with their port of Linux, and thus a small machine that would run what I needed for university, and could serve as a test machine during my long trips between The Gap and Laidley (where I was doing work experience for Eze Corp), would go down nicely. So I fired off an email and a telegraphic money transfer to Lemote in China, and on the doorstep turned up a Yeeloong netbook.

I dual booted Debian and Gentoo on this machine, in fact I still do. Just prior to buying this machine, I was limping along with an old Pentium II 300MHz laptop. I did have a Pentium 4M laptop, but a combination of clumsiness and age slowly caused the machine’s demise. Eventually it failed completely, and so I just had to make do with the PII which had been an earlier workhorse.

One thing was clear: KDE 3.0 was fine on this laptop. Even 3.5 was okay. But when you’ve only got 300MHz of CPU and 160MB RAM, the modern KDE releases were just a bit too much. Parts of KDE were okay, but the main desktop chugged along. Looking around, I needed a workable desktop, so I installed FVWM. The lack of a system tray annoyed me, so in went stalonetray. Then came maintaining the menu; well, modern FVWM comes with a Perl script that automates this, so in that went too.

Finally, a few visual touches, a desktop background loader and some keybinding experiments, and I pretty much had what KDE gave me, in a desktop that started in a fraction of the time and built much faster. When the Yeeloong turned up and I got Gentoo onto it, this FVWM configuration was the first thing to be installed, and so I had a sensible desktop there too.
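The setup described above amounts to surprisingly little configuration. The fragment below is a rough sketch of the relevant parts of an FVWM config, not my exact setup; launching stalonetray from StartFunction and piping the bundled fvwm-menu-desktop Perl script into a menu are standard FVWM idioms, but treat the details as illustrative:

```
# Sketch of the relevant bits of ~/.fvwm/config -- illustrative only.

# Start the stand-alone system tray along with the desktop
AddToFunc StartFunction
+ I Exec exec stalonetray

# Build the applications menu with FVWM's bundled Perl generator
DestroyMenu MenuRoot
AddToMenu MenuRoot "Applications" Title
PipeRead 'fvwm-menu-desktop'

# KDE-style bindings: Logo (Mod4) + Shift + letter, usable anywhere (A)
Key M A 4S Move
Key R A 4S Resize
Key S A 4S WindowShade
Key X A 4S Maximize
Key C A 4S Close
Key I A 4S Iconify
```

The Key lines reproduce the bindings from the table above: the context field A means the binding works anywhere, and the modifier field 4S means Mod4 (Logo) plus Shift.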

Eventually I did get KDE 4 working on the Yeeloong, sort of. It was glitchy on MIPS: KDE 3.5 used to work without issue, but 4.0 never ran quite right. I found myself using FVWM with just the bits of KDE that worked.

As time went on, university finished, and the part-time industrial experience became full-time work. My work at the time revolved around devices that needed Windows and a parallel port to program them. We had another spare P4 laptop, so I grabbed that, tweaked Windows XP on there to my liking, and got to work. The P4 “lived” at Laidley and was my workstation of sorts; the Yeeloong came with me to and from there. Eventually that work finished, and through those connections I came to another company (Jacques Electronics). In the new position, it was Linux development on ARM.

The Windows installation wasn’t so useful any more. So in went a copy of the GParted LiveCD, which told Windows to shove, followed by a Gentoo disc and a Stage 3 tarball. For a while my desktop was just the Linux command line, then I got X and FVWM going, and finally, as I worked, KDE.

I was able to configure KDE once again, and on i686 hardware it ran as it should. It felt like home, so it stayed. Over time the position at Jacques finished up, and I came to work at VRT, where I am to this day. The P4 machine stayed at the workplace, with the netbook being my personal machine away from work.

It’s worth pointing out that at this point, although Windows 7 had been around for some time, I was yet to actually come to use it first hand.

My first Apple

My initial time at VRT was spent working on a Python-based application to ferry metering data from various energy meters to various proprietary systems. The end result was a package called Metermaster that slotted in alongside MacroView, and it forms one of the core components in VRT’s Wages Hub system. While MacroView can run on Windows, and does for some (even on Cygwin), VRT mainly deploys it on Ubuntu Linux. So my first project was all Linux based.

During this time, my new work colleagues were assessing my skills and looking at what I could work on next. One of the discussions revolved around working on some of their 3D modelling work using Unity3D. Unity3D at the time ran on just two desktop OSes: Windows and MacOS X.

My aging P4 laptop had an nVidia GeForce 420Go video device with 32MB memory. In short, if I hit that thing with a modern 3D games engine, it’d most likely crap itself. So I was up for a newer laptop. That got me thinking: did I want to face Windows again, or did I want to try something new?

MacOS was something I had only had fleeting contact with, and MacOS X I had never actually used myself. I knew a bit about it, such as its basis in the FreeBSD userland and the Mach microkernel. I saw a 2008-model MacBook with a 256MB video device inbuilt going cheap, so I figured I’d take the plunge.

My first impressions of MacOS X 10.5 were okay. I had a few technical glitches at first; namely, MacOS X would sometimes hang during boot, showing nothing more than an Apple logo and a swirling icon. Some updates refused to download: they’d just hang and the time estimate would blow out. Worst of all, a download wouldn’t resume, it’d just start again from the beginning.

In the end I wandered down to the NextByte store in the city and bought a copy of MacOS X 10.6. I bought the disc on April 1st, 2011, and it’s the one and only disc the DVD drive in the MacBook won’t accept. The day I bought it, I was waiting at the bus stop and figured I’d have a look and see what docs there were. I put the disc in, heard a few noises, and it spat the disc out again. So I put it back in, and out it came. Figuring this was a defective disc, I put the disc back in and marched back down to the shop, receipt in one hand, cantankerous laptop in the other. So much for Apple kit “just working”.

Then the laptop decided it liked the pussy cat disc so much it wouldn’t give it back! Cue about 10 minutes in the service bay getting the disc to eject. Finally the machine relented and spat the disc out. That night it tried the same tricks, so I grabbed an external DVD drive and did the install that way. Apart from this, OSX 10.6 has given me no problems in that regard.

As for the interface? I noticed a few features that I appreciated from KDE, such as the ability to modify some of the standard shortcuts, although not all of them. Virtual desktops are called Spaces in MacOS X, but it’s essentially the same deal.

My first problem was identifying what the symbols in the key shortcuts meant. Command and Shift were simple enough, but the symbol used to denote Option was not intuitive, and I can see some people getting confused by the one for Control. That said, once I found where the Terminal lived, I was right at home.

File browsing? Much like what I’m used to elsewhere. Stick a disc in, and it appears on the desktop. But then to eject? The keyboard eject button didn’t seem to work. Then I remembered a sarcastic comment one of my uncles made about using a Macintosh, requiring you to “throw your expensive software in the bin to eject”. So click the CD icon, drag to the rubbish bin icon, voilà, out it comes.

Apple’s applications have always put the menu bar of the application right up the top of the screen. I found this somewhat awkward when working with multiple applications, since you find yourself clicking on (or Command-Tabbing over to) one window and accessing the menu there, then clicking (or Command-Tabbing) over to the other and accessing its menu up the top of the screen.

Switching applications with Command-Tab works by swapping between completely separate applications. Okay if you’re working with two completely separate applications; not so great if you’re working on many instances of the same application. Exposé works, and probably works quite well if the windows are visually distinct when zoomed out, but if they look similar, one is reminded of Forrest Gump: “Life’s like a box of chocolates, you never know what you’re gonna get!”

The situation is better if you hit Command-Tab and then press the down arrow, which gives you an Exposé of just the windows belonging to that application. Still, it’s a far cry from hitting Alt-Tab in FVWM to bring up the Window List and just cycling through. Switching between MacVim instances was a pain.

As for the fancy animations: Exposé looks good, but when the CPU’s busy (and I do give it a hiding), the animation just creeps along at a snail’s pace. I’ll tolerate it if it’s over and done with within a second, but when it takes 10 seconds to slowly zoom out, I’m left sitting there going “Just get ON with it!” I’d be fine if it skipped the animation entirely and switched from normal view to Exposé in a single frame. Unfortunately, there’s nowhere to turn this off that I’ve found.

The dock works to an extent. It suffers a bit if you have a lot of applications running all at once; there’s only so much screen real-estate. A nice feature, though, is the way it auto-hides and zooms.

When the mouse cursor is away from the dock, it drops off the edge of the screen. Since the user configures this and sets up which edge it clings to, this is a reasonable option. As the mouse is brought near the space where the dock resides, it slowly pops out to meet the cursor; not straight away, but progressively as the cursor gets closer.

When fully extended, the icons nearest the cursor enlarge, allowing the rest to remain visible without occupying too much screen real-estate. The user is free to move the cursor amongst them, the ones closest zooming in and the ones furthest away zooming out. Moving the cursor away causes the dock to slip away again.

Linux on the MacBook

And it had to happen: eventually Linux did wind up on there. Again KDE initially, but again I found that KDE was just getting too bloated for my liking. It took about 6 months of running KDE before I started looking at other options.

FVWM was of course where I turned first; in fact, it was what I used while KDE finished compiling. I came to the realisation that I was mostly just using windows full-screen, so I thought: what about a tiling window manager?

Looking at a couple, I settled on Awesome. At first I tried it for a bit, didn’t like it, reverted straight back to FVWM. But then I gave it a second try.

Awesome works okay; it’s perhaps not the most attractive to look at, but it’s functional, and at the end of the day looks aren’t what matter. Awesome was promising in that it uses Lua for its configuration, and it had a lot of the modern window manager features for interacting with today’s X11 applications. I did some reading of the handbook, did some tweaking of the configuration file, and soon had a workable desktop.

The default keybindings were actually a lot like what I already used, so that was a plus. In fact, it worked pretty well. Where it let me down was in window placement; in particular, floating windows and dividing the screen.

Awesome, of course, works with a number of canned window layouts. It can make a window full-screen (hiding the Awesome bar) or near full-screen, or show two windows above/below each other or side by side. Windows are given numerical tags which cause those windows to appear whenever a particular tag is selected, much like virtual desktops, only multiple tags can be active on a screen.

What irritated me most was trying to find a layout scheme that worked for me. I couldn’t seem to re-arrange the windows in the layout, and so if Awesome decided to plonk a window in a spot, I was stuck with it there. Or I could try cycling through the layouts to see if one of the others was better. I spent much energy arguing with it.

Floating windows were another hassle. Okay, modal dialogues need to float, but there was no easy way to manually override the floating status of a window. The Gimp was one prime example: okay, you can tell Awesome not to float its windows, but it still took some jiggery-pokery to get each window to sit where you wanted it. And not all applications give you this luxury.

Right now I’m back with the old faithful, FVWM.


FVWM, as I have it set up on Gentoo

Windows 7

When one of my predecessors at VRT left to work for a financial firm down in Sydney, I wound up inheriting his old projects, and the laptop he used to develop them on. The machine dual-boots Ubuntu (with KDE) and Windows 7, and seeing as I already have the MacBook set up as I want it, I use that as my main workstation and leave the other booted into Windows 7 for those Windows-based tasks.

Windows 7 is much like Windows Vista in the UI. Behind the scenes, it runs a lot better; people aren’t kidding when they say Windows 7 is “Vista done right”. However, recall I mentioned Windows Vista being the last release able to do the classic Start menu? Maybe I’m dense, but I’m yet to spot the option in Windows 7. It isn’t where they put it in Windows XP or Vista.

So I’m stuck with a Start menu that crams itself into a small bundle in one corner of the screen. Aero has been turned off in favour of a plain “classic” desktop. I have the task bar up the top of the screen.

One new feature of Windows 7 is that the buttons of running applications by default show only the icon of the application. Clicking that reveals tiny wee screenshots with wee tiny title text. More than once I’ve been using a colleague’s computer where he’s had four spreadsheets open; I’ll click the icon to switch to one of them, and neither of us can figure out which one we want.

Thankfully you can tell it to not group the icons, showing a full title next to the icon on the task bar, but it’s an annoying default.

Being able to hit Logo-Left or Logo-Right to tile a window on the screen is nice, but I find more often than not I wind up hitting that when I mean to hit one of the other meta keys, and thus I have to reach for the rodent and maximise the window again. This is more to do with the key layout of the laptop than Windows 7, but it’s Windows 7’s behaviour and the inability to configure it that exacerbates the problem.

The new Start menu, I’d wager, is why Microsoft saw so many people pinning applications to the task bar. It’s not just for quick access; in some cases it’s the only bleeding hope they’d ever have of finding their application again! Sure, you can type the name of the application, but circumstance doesn’t always favour that option. Great if you know exactly what the program is called; not so great if it’s someone else’s computer and you need to know if something is even there.

Thankfully most of the effects can be turned off, and so I’m left with a mostly Spartan desktop that just gets the job done. I liken using Windows to a business trip abroad: you’re not there for pleasure, and there’s nothing quite like home sweet home.

Windows 8

Now I get to the latest instalment of desktop operating systems. I have yet to actually use it myself, but looking at a few screenshots, a few thoughts:

  • “Modern”: apart from being a silly name for a UI paradigm (what do you call it when it isn’t so “modern” any more?), it looks like it could work really well on the small screen. However, it relies quite heavily on gestures and keystrokes to navigate. All very well if you set these up to suit how you yourself operate, but not so great when forced upon you.
  • Different situations will call for different interface methods. Sometimes it is convenient to reach out and touch the screen, other times it’ll be easier to grab the rodent, other times it’ll be better to use the keyboard. Someone should be able to achieve most tasks (within reason) with any of the above, and seamlessly swap between these input methods as need arises.
  • “Charms” and “magic corners” make the desktop sound like it belongs on the set of a Harry Potter film
  • Hidden menus that, by default, jump out without warning when you hit the relevant corner or edge of the screen will likely startle and confuse
  • A single flat hierarchy of icons^Wtiles for all one’s applications? Are we back to MS-DOS Executive again?
  • “Press the logo key to access the Start screen”, and so what happens if the keyboard isn’t in convenient reach but the mouse is?
  • In a world where laptops are out-numbering desktops and monitors are getting wider faster than they’re getting taller, are extra-high ribbons really superior to drop-down menus for anyone other than touch users?

Apparently there’s a tutorial shown when you start up Windows 8 for the first time. Comments have been made about how people were completely lost working with the UI until they saw this tutorial. That should be a clue at least. Keystrokes are really just a shortened form of command line. Even the Windows 7 Start menu, with its search bar, is looking more like a stylised command line (albeit one with minimal capability).

Are we really back to typing commands into pseudo command line interfaces?

The Command line: what is old is new again


I recall the ad campaigns for Windows 7: on billboards, some attractive woman posing with the caption “I’m a PC and Windows 7 was my idea”.

Mmm mmm, so whose idea was Windows 8 then? There are no rounded rectangles to be seen, so clearly not Apple’s. I guess how well it goes remains to be seen.

It apparently has some good improvements behind the scenes, but anecdotal evidence at the workplace suggests that the ability to co-operate with a Samba 3.5-based Windows Domain is not among them. One colleague recently bought herself a new ultrabook running Windows 8.

I’m guessing sooner or later I’ll be asked to assist with setting up the Cisco VPN client and links to file shares. Another colleague, despite getting the machine to successfully connect to the office WiFi, couldn’t manage to bring up a login prompt to connect to the file server; the machine instead just assumed the local username and password matched the credentials to be used on the remote server. I will have to wait and see.

Where to now?

Well I guess I’m going to stick with FVWM a bit longer, or maybe pull my finger out and go my own way. I think Linus has a point when he describes KDE as a bit “cartoony”. Animations make something easy to sell, but at the end of the day, it actually has to do the work. Some effects can add value to day-to-day tasks, but most of what I’ve seen over the years doesn’t seem to add much at all.

User interfaces are not one-size-fits-all. Never have been. Touch screen interfaces have to deal with problems like fat fingers, and so there’s a balancing act between how big to make controls and how much to fit on a screen. Keyboard interfaces require a decent area for a keypad, and in the case of standard computer keyboards, ideally, two hands free. Mice work for selecting individual objects, object groups and basic gestures, but make a poor choice for entering large amounts of data into a field.

For some, physical disability can make some interfaces a complete no-go. I’m not sure how I’d go trying to use a mouse or touch screen if I lost my eyesight, for example. I have no idea how someone without arms would go with a tablet: if you think fat fingers are a problem, think about toes! I’d imagine the screens on those devices would often be too small to read when using such a device with your feet, unless you happen to have very short legs.

Even for those who have full physical ability, there are times when one input method will be more appropriate at a given time than another. Forcing one upon a user is just not on.

Hiding information from a user has to be carefully considered. One of my pet peeves is not being able to see some feature on screen because it is hidden from view. It is one thing if you yourself set up the computer to hide something, but quite another when it happens by default. Having a small screen area that activates and reveals a panel is fine, if the area is big enough and there is some warning that the panel is about to fly out.

As for organising applications? I’ve never liked the way everything just gets piled into the “Programs” directory of the Start Menu in Windows. It is just an utter mess. MacOS X isn’t much better.

The way things are in Linux might take someone a little discovery to find where an application has been put, but once discovered, it’s then just a memory exercise to get at it, or shortcuts can be created. Much better than hunting through a screen-full of unsorted applications.

Maybe Microsoft can improve on this with their Windows Store, if they can tempt a few application makers from the lucrative iOS and Android markets.

One thing is clear, the computer is a tool, and as such, must be able to be adapted for how the user needs to use that tool at any particular time for it to maintain maximum utility.

Sep 302012

Just thought I’d post this up here for “backup” purposes… lately I’ve been doing a lot of AVR programming, the first step of course was to procure a programmer for the devices.

The following is a schematic for the programmer I have built myself. It can be built out of scrap bits, none of the components are critical in value.

It gets its name as it is essentially identical to the “DASA” serial cables, only all the signals are inverted.  The inverting buffers serve to provide voltage level conversion along with crude tri-state functionality when the AVR device is not being programmed.

Inverted-"DASA" serial programming cable for AVR


The design is released under the TAPR Open Hardware License.
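A cable like this can be driven by avrdude’s serial bit-bang (serbb) driver. The stanza below is my sketch of how one might describe it in avrdude.conf: the pin numbers follow the stock “dasa” definition, with each signal marked inverted via the ~ prefix. I haven’t verified this exact definition (the id “dasa-inv” is made up), so treat it as a starting point rather than gospel:

```
# Hypothetical avrdude.conf entry for an inverted-DASA cable.
# Pin numbers follow the stock "dasa" definition (reset=rts, sck=dtr,
# mosi=txd, miso=cts); the ~ prefix marks each signal as inverted.
programmer
  id    = "dasa-inv";
  desc  = "inverted serial port banging, reset=!rts sck=!dtr mosi=!txd miso=!cts";
  type  = serbb;
  reset = ~7;
  sck   = ~4;
  mosi  = ~3;
  miso  = ~8;
;
```

It would then be invoked with something along the lines of avrdude -c dasa-inv -P /dev/ttyS0 -p <part>.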

Apr 212012

Some may recall my old set up I used to record the AWNOI net. A bit fiddly, but it worked, and worked well. However the machine I used was short-lived. Basically, I wanted to stream in both directions between a sound device connected to a HF radio transceiver, and a USB wireless headset, with a feed being recorded to disk.

The problem with newer sound devices is the rather limited sync range possible with the modern audio CODECs. Many will not do sample rates that aren’t a multiple of 44.1kHz or 48kHz, and I have a headset that won’t record at any sample rate other than 16kHz. ALSA’s plug: plugin doesn’t play nice with JACK, so I shelved the whole project for later.

Well, tonight I did some tinkering with gstreamer to see if it could do the routing I needed. Certainly all the building blocks were there, I just had to get the pipeline right. A bit of jiggling of parameters, and I managed to get audio going in both directions, and to a RIFF wave file to boot. I’ve put it in a shell script for readability/maintainability:

#!/bin/sh
# GStreamer bi-directional full-duplex audio routing script
# Stuart Longland VK4MSL
#
# NOTE: the variable definitions below are example values only (the
# originals were lost in transcription); adjust the ALSA devices,
# buffer/latency times and caps strings to suit your hardware.

# ALSA devices: USB headset and the sound card wired to the radio
HEADSET_DEV=plughw:Headset
SNDCARD_DEV=hw:0,0

# Buffer and latency times, in microseconds
HEADSET_CAPT_BUFTIME=200000
HEADSET_CAPT_LATTIME=50000
HEADSET_PLAY_BUFTIME=200000
HEADSET_PLAY_LATTIME=50000
SNDCARD_CAPT_BUFTIME=200000
SNDCARD_CAPT_LATTIME=50000
SNDCARD_PLAY_BUFTIME=200000
SNDCARD_PLAY_LATTIME=50000

# gstreamer-0.10 caps; the headset will only do 16kHz
HEADSET_CAPT_FMT='audio/x-raw-int,rate=16000,channels=1,width=16,depth=16,signed=true'
HEADSET_PLAY_FMT='audio/x-raw-int,rate=16000,channels=1'
SNDCARD_CAPT_FMT='audio/x-raw-int,rate=48000,channels=1,width=16,depth=16,signed=true'
SNDCARD_PLAY_FMT='audio/x-raw-int,rate=48000,channels=1'
STREAM_FMT='audio/x-raw-int,rate=48000,channels=1,width=16,depth=16,signed=true'
OUTPUT_FMT='audio/x-raw-int,rate=48000,channels=2,width=16,depth=16,signed=true'

# Recording destination
OUTPUT=net-recording.wav
exec    gst-launch-0.10 \
    alsasrc device=${HEADSET_DEV} \
            name=headset-capt \
            slave-method=resample \
            buffer-time=${HEADSET_CAPT_BUFTIME} \
            latency-time=${HEADSET_CAPT_LATTIME} \
        ! ${HEADSET_CAPT_FMT} \
        ! audioresample \
        ! audioconvert \
        ! ${STREAM_FMT} \
        ! queue \
        ! tee name=headset \
        ! audioresample \
        ! audioconvert \
        ! ${SNDCARD_PLAY_FMT} \
        ! alsasink device=${SNDCARD_DEV} \
            name=sndcard-play \
            buffer-time=${SNDCARD_PLAY_BUFTIME} \
            latency-time=${SNDCARD_PLAY_LATTIME} \
    alsasrc device=${SNDCARD_DEV} \
            name=sndcard-capt \
            slave-method=resample \
            buffer-time=${SNDCARD_CAPT_BUFTIME} \
            latency-time=${SNDCARD_CAPT_LATTIME} \
        ! ${SNDCARD_CAPT_FMT} \
        ! audioresample \
        ! audioconvert \
        ! ${STREAM_FMT} \
        ! queue \
        ! tee name=soundcard \
        ! audioresample \
        ! audioconvert \
        ! ${HEADSET_PLAY_FMT} \
        ! alsasink device=${HEADSET_DEV} \
            name=headset-play \
            buffer-time=${HEADSET_PLAY_BUFTIME} \
            latency-time=${HEADSET_PLAY_LATTIME} \
    interleave name=recorder-in \
        ! audioconvert \
        ! audioresample \
        ! ${OUTPUT_FMT} \
        ! wavenc \
        ! filesink location="${OUTPUT}" \
    headset. \
        ! queue \
        ! recorder-in.sink1 \
    soundcard. \
        ! queue \
        ! recorder-in.sink0

Now the fun begins ironing out the kinks in my data cable for the FT-897. At present, it works for receive, and seems to work for transmit. I use VOX on the radio itself and keep the headset’s microphone on mute when I don’t want to transmit.

I was getting a bit of distorted audio coming back through the headset when I transmitted, almost certainly RF pick-up in the cable and the line-in circuitry of the computer’s sound card. I’ll have to see if I can filter it out, but the real test will be seeing whether such distortion is present on the outgoing signal — or rather, whether it’s audible enough to be a problem.

Apr 01 2012

Well, it’s been a while since I touched this tablet.  I basically chucked it in a corner in disgust after it shat itself rather unceremoniously on the trip before we even got to the NSW/Victorian border.  By “shat” itself, I mean corrupting files on the internal microSD card, intermittent device resets, display flickers, all the hallmarks of a dry joint.

The seller on eBay that sold me the device has been completely unresponsive about the problems, so it looks like I’ve kissed about $250 goodbye.  Ahh well, such is life.  They are still being sold on eBay, but buyer beware: they are cheap, and it’s pot luck whether yours is cheerful, or nasty like mine.  If you want something reliable, look elsewhere.

Having made this mistake, well, I’m looking to make lemonade from the lemon.  The first step was to figure out what on earth I had.  So out with the screwdriver.

You’ll notice on the top and bottom of the unit, there are four small plugs concealing screws.  These hold the LCD screen assembly in place.  Undo these, then you need to carefully work your way around and release the clips that hold the LCD screen assembly.  Do not try to detach the LCD touch panel from the LCD!  I initially couldn’t get it to budge, so I tried doing exactly this in the hunt for possible hidden screws (there were none).  This was the end result:

Shattered Flytouch III touch panel

Why one should not try to detach the touch panel.

Never mind I say… the unit was just about destined for the bin as it was.  External USB HID devices work for what I’m after, but it’ll mean any touch-related fun is out unless I can pick up a replacement 4-wire panel.  Element14 and RS have them at >$80, to which I say, bugger it, I’ll do without.

Having pulled the unit apart (the main PCB is held to the back shell by a few screws), one thing is immediately apparent.  The whole device is based on what looks to be a fairly generic System-on-Module built around the Vimicro VC0882BCXA System-on-Chip and the Vimicro VC7822EL companion chip.

Flytouch III PCB

Top left is a Wifi module based on the Realtek RTL8111, and to the right, the GPS module (which hooks to one of the serial ports from what I recall).  Down the bottom of the image are the USB ports.  Near the HDMI socket is a Silicon Image SiI9022ACNU HDMI transmitter.

The system on module looks interesting, and I’m curious to find out more about it, as for hobby projects, the pins are not too small to deal with using a soldering iron.  The OS and boot loader exist on the microSD card.  I tried putting a 16GB card in, but evidently I wasn’t getting the partition table right as it wouldn’t boot.  I haven’t tried hooking up a serial port as yet, so it’s hard to know what is wrong.  Some research indicates that ttyS0 lurks on this board just near the aforementioned microSD socket:

The system on module within the Flytouch III

I haven’t spotted the bad joints that were giving me grief. In fact, having gotten it out of the case, I find the top USB port (flakey from day one) seems to be behaving, and I’ve had no issues with it running with the case apart.  Otherwise I’d be running a soldering iron over a few joints just to make sure everything was right.

Next step?  Well for now, I’ll put it back together (minus touchscreen) and put it aside.  I’ll have a look at tacking a connector onto those serial pins, with a level shifter so I can interact with the serial console.

Having gotten bootloader access, I should be able to debug the SD card cloning issue, then I can have a close gander at what the current u-boot and kernel are doing to tickle the hardware.  End game?  Well, Android isn’t much use without a touch screen, so I’ll be probably hacking together a Gentoo-based environment with some amateur radio related software.  We shall see.
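
On the SD-card cloning front, one low-tech check before chasing the bootloader over serial is whether the clone’s partition table actually matches the original. Assuming the card is MBR-partitioned (which I haven’t verified for this board), a quick sketch to dump the four primary entries from an image or block device; `read_mbr_partitions` is a hypothetical helper of my own:

```python
import struct

def read_mbr_partitions(image_path):
    """Return the non-empty primary partition entries from the MBR of a
    disk image (or block device).  Each 16-byte entry holds a boot flag,
    a type byte, and little-endian LBA start/length fields."""
    with open(image_path, "rb") as f:
        sector = f.read(512)
    if sector[510:512] != b"\x55\xaa":
        raise ValueError("no MBR boot signature at offset 510")
    parts = []
    for i in range(4):
        boot, ptype, lba_start, sectors = struct.unpack_from(
            "<B3xB3xII", sector, 446 + 16 * i)
        if ptype != 0:  # type 0x00 marks an unused slot
            parts.append({"index": i, "bootable": boot == 0x80,
                          "type": ptype, "lba_start": lba_start,
                          "sectors": sectors})
    return parts
```

Comparing the output for the original card and the clone should show quickly whether the copy put the first partition at the offset the boot ROM expects.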

Mar 24 2012

Prior to my road trip to LCA 2012 Ballarat, I bought a new toy, namely a Yaesu VX-8DR handheld.

At that point it turned up only just before I was due to leave, so I wasn’t able to get the accessories I wanted. I cobbled together my own 12V charger lead by snipping the original power supply and soldering on a cigarette lighter socket, but otherwise I used the handheld in its out-of-the-box configuration.

Having gotten back, I have purchased the FGPS-2 GPS module, CT-136 GPS adaptor and the BU-1 Bluetooth module.

Transceiver performance

The set works quite well. The antenna is pretty deaf and useless on 6m, maybe I can get a better after-market tri-bander whip, but on 2m and 70cm it works reasonably well. I’ve heard APRS traffic over distances of 100km, and even been heard on APRS by a digipeater some 90km away.

Audio quality is good, both transmit and receive. Plug in a pair of stereo headphones, and the wideband FM receiver sounds excellent; in stereo to boot.

Probably my biggest nit is that you can’t simultaneously charge and externally power the set. To charge, you must either detach the battery and drop it into a separate charger cradle (an optional extra), or turn off the set.

GPS Performance

When I purchased the VX-8DR, it was a real toss-up between it and the VX-8GR. The reason I went with the VX-8DR was that it had 6m and Bluetooth. Having gotten the GPS, I’ve run into the problem a lot of owners have reported: the GPS module is deaf as a post.

The VX-8GR doesn’t improve on this either. The good news, however, is that because my module is external, I can (1) mount it in a better spot, or (2) replace it with a better compatible module.  For VX-8GR owners this is the end of the road; they can do nothing but moan to Yaesu.  I at least have options.

The module is mounted vertically inside the FGPS-2 casing. Usually with GPS modules such as these, they embed a small patch antenna, whose radiation pattern is perpendicular to the plane of the antenna surface. Being vertical, this means when you hold the radio vertically (as you normally would), GPS reception is poor because the radiation pattern is directly in front of the radio.

The GPS seems to perform a lot better if the radio is held with the screen facing upwards towards the sky. It’ll even work inside my house if I do this. It seems this is a screw-up on par with the iPhone 4.  The other alternative is to replace the module; the FGPS-2 apparently uses 9600 baud serial with NMEA format strings.  However, it seems the parser in the VX-8 is rather crude.  I have a module that does NMEA at 4800 baud, so I’ll either need to coax it up to 9600, or use a microcontroller to buffer and convert rates, and perhaps tweak the sentence format to make up for the VX-8’s shortcomings.
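
Whatever ends up doing the 4800-to-9600 conversion, any rewriting of sentence fields means the checksum has to be regenerated: an NMEA 0183 frame ends in an asterisk followed by the XOR of every character between the “$” and the “*”, written as two hex digits. A minimal sketch (the function names are mine):

```python
def nmea_checksum(body):
    """XOR of every character between the leading '$' and the '*',
    rendered as two upper-case hex digits."""
    csum = 0
    for ch in body:
        csum ^= ord(ch)
    return "%02X" % csum

def reframe(body):
    """Wrap a sentence body in a full NMEA frame with a fresh checksum,
    e.g. after editing fields to suit a fussy parser."""
    return "$%s*%s\r\n" % (body, nmea_checksum(body))
```

A microcontroller doing the rate conversion would carry the same logic, just in C on the fly as bytes arrive.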

My hunch: if I make an alternative bracket to the CT-136 adaptor, I can nail this, and another problem — the inability to plug in both the GPS and a headset. I have the CT-M11 cable, and thus I plan to make a bracket to connect the FGPS-2 to the end of this cable, allowing me to also plug in a wired headset.


I bought the Bluetooth module as an insurance policy, to give me another means of interfacing a headset. Then began the fun of getting it to work with my headsets. I have a few: a Bullant earmuff headset, a lightweight mono Digitech headset, and a “MyTalker” headset.

The first was a set I bought some years ago, back when BlueZ was far less stable than it is today, and long before I was into amateur radio or possessed a Bluetooth-capable phone. I tried pairing using a USB Bluetooth dongle, but had little luck, so they got put to one side. Also, despite advertising the ability to stream music, it only supports the HFP and HSP profiles, so you get to listen to your tunes in 8kHz 8-bit mono. They are sold at some hardware stores, such as Mitre 10 The Gap (where I bought my set).

The handheld did pair with this set, but I couldn’t get PTT to work, and the headset itself also had a few faults; namely it was always noisy, and the broadcast receiver stopped working, so I’ve taken them apart for now to see if I can fix these issues. I can key the radio up using the radio’s PTT, but then both internal and headset microphones go live.

The second set is sold by Jaycar, catalog number AA2080. This would be my preferred set to use with the radio as it can pair with two devices simultaneously. It supports the same profiles as the earmuffs, but it’s at least more lightweight.

The BU-1 takes one look at this set, and turns its nose up at it, with the VX-8 giving up and displaying “PAIRING ERROR”.

I also bought the MyTalker set from Jaycar, catalog number XC4894. This set is much like the earmuffs. It embeds its own microphone, but the unit itself provides a 3.5mm socket for you to plug in your own headphones, or use the supplied earphones (which are awful and uncomfortable; don’t use them). At the other end of the unit is a lead terminated with a 3.5mm plug for connecting to a music player. I’ve modded this set to accept an external microphone, switchable between a transceiver and the Bluetooth set, allowing a headset connected to a radio to also connect to a phone. I’m still working on this bit.

The VX-8 treats this set with much the same contempt as the mono headset before.

Today, I popped in and bought a more expensive set; this time I looked for A2DP functionality. Jaycar have one, catalog number AA2082. Like the AA2080 it can talk to two devices; unlike the AA2080 it supports AVRCP and A2DP. Also, not advertised, is that it can function as an analogue headset: supplied in the box is a dual 3.5mm to mini-USB cable that plugs into the headset and allows you to use it with a non-Bluetooth-capable device.

I plugged it into the bicycle’s battery to charge on the way home. When I got home, I read the instructions (which are in awful Chinglish). Basically, the English translation of the pairing instructions go like this:

  1. Hold the MFB button (the centre one on the right ear-cup) in for several seconds. You will hear the voice prompts “Hello”, followed by “Enter Pin Code 0000 on phone”.
  2. When you hear the latter prompt, tell your device to start looking for the headset
  3. When it finds a device called “AA2082”, select it, and enter 0000 as the pin code

So, the steps I followed:

  1. Turn on the VX-8
  2. Hold in the MENU key to bring up the Set menu, then select BLUETOOTH PCODE
  3. Enter 0000 on the keypad.
  4. Hold in the MFB button on the headset until you hear the “Enter Pin Code” prompt
  5. Hit V/M on the VX-8
  6. After a few brief moments, you should see “PAIRING COMPLETE”, press PTT to confirm.

Having got this working, I notice a few things:

  • Stereo (A2DP) sounds a little weird, perfectly clear, but the compression is apparent. I’ll experiment with the laptop later to see if it’s the headset or the radio.
  • Mono works well, pressing MFB toggles PTT on the VX-8. VOX doesn’t seem to work, but no great loss as I find VOX to be a disaster when outdoors.
  • In mono mode, a buzzing is apparent on the received audio. This isn’t audible on transmitted audio, nor did I notice this on received audio when I tried using the headset with my mobile phone.
  • Range seems to be quite restricted, possibly because the module’s mounting spot inside the radio doesn’t give it the reception it needs. A2DP suffers more from this than HFP, with frequent drop-outs. Again, I’ll need to do some experimentation with the laptop, and perhaps try the radio without the battery installed to see if that helps performance.

I’m tossing up whether to get one of these motorcycle Bluetooth headsets.  I ride the bicycle quite a lot, and at the moment I use headsets embedded in the helmet, home-built from old computer headsets.  The longevity of the microphone seems to be the biggest problem.  I am also on the look-out for an earmuff headset for things like the Imbil car rally, ideally one that can do A2DP.  The Bullant ones I know can’t do this.  I see some earmuffs in the $400+ price bracket that offer Bluetooth, but I have no idea if that includes A2DP, and frankly, I shudder at that price.

The motorcycle ones are designed to fit a wide range of helmets, and they look as if they’ll fit a set of cheap regular earmuffs quite well.  They typically sell for about $200, support A2DP, multiple devices, and intercom.  Add in $30 for a set of earmuffs, and it makes this a much more attractive option.

More experimentation will be needed I think, but this is looking promising.  I’ll probably post up more details as I come across them.

It’d be nice if Yaesu had been a bit more up-front about what the BU-1 supports: the AA2080 supports both HFP and HSP, yet the BU-1 won’t touch it; the Bullant set supports the same profiles, yet the BU-1 works fine with it.  The reasoning is not clear, but the BU-1 does seem to reliably talk to A2DP-capable headsets, so maybe that is a starting point for others.

Likewise with the CT-136, I’ll see if I can fabricate a bracket using the CT-M11 cable, and see where that gets me.

Jul 17 2011

The problem

For some time now, we’ve been putting up with interference from a few stations, who for now will remain nameless.  Foul language, deliberate interference, the list goes on…

Allegedly some of these people have been doing it for longer than I’ve been alive.

It is as if these people believe we are not entitled to use a small patch of radio spectrum to engage in a little friendly chat.

Some have even gone as far as vowing to do “everything they can” to “ruin” amateur radio.

This means war.

Well, we could complain to the ACMA… apparently some have done this already… many times.  If they haven’t acted after 20 years’ worth of complaints, I don’t think they’ll ever act.  Not without a very substantial amount of evidence.

There is nothing, however, that stops us getting on the band and having a chat, except one thing: someone parking on the frequency we choose and interfering with our communications.  Yes, we could QSY, but experience has shown the culprits just chase us up and down the band.

One big group on one frequency is vulnerable to attack.  Numerous smaller groups scattered across the band are far more resilient: the culprits cannot be on all frequencies at the same time.  More to the point, more ears open and listening means more data points … bonus points if those “ears” are directional.

My proposal

What we need to do is stir up some activity on the 80m band.  The 80m amateur band is a wonderful local chit-chat band.  It has almost guaranteed propagation for distances over 1000km on any given evening.  It is open to all license classes.  (Well, if you ignore the DX window.)  I’m proposing a contest with a difference.

In most contests, you make contact with a station, exchange numbers, then it’s ta-ta (or “73”) and you go your separate ways.  Not terribly exciting listening.

I’m proposing a social ragchew contest.  I want to encourage as many people, on as many groups, as possible.  The more people, the better.  Talk about anything you like.  QRP and QRO stations welcome.  Mobile and portable stations, also welcome.  Newcomers, especially welcome.  Make it a large group, or a small group, doesn’t matter.  It doesn’t have to be a formal net, just so long as there’s at least three people.

How will it be scored?

This is something I’m still thinking about… but I’m thinking something along these lines… I would love your input.

For every hour, or part thereof, each member of a group chatting on the same frequency, will get one point for each member of that group.

So if 3 of you talk for 2¼ hours, that counts as 3 hours, and each of you scores 3×3=9 points.


  • Triple points for every station who has held their license:
    • Less than 12 months
    • Greater than 50 years
  • Double points for every:
    • Station that is “mobile” (i.e. moving between localities) or  “portable” (i.e. set up temporarily at some location for less than one week)
    • QRP station (running 5 watts or less)
    • DX contact (overseas)

I’m thinking these should be multiplied together, so if your group of VKs happens to score someone joining from Europe (for example) who only just got their license a month ago and is running QRP whilst mobile, that’s 3×2×2×2 = 24 points added to each group member for every hour or part thereof that they participate in your group.
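
To make the arithmetic concrete, here is a rough sketch of the scoring as I currently imagine it. The function names and the rule interpretations (part-hours rounded up to whole hours; the bonus factors compounding so that the European example above comes to 3×2×2×2 = 24 per hour) are my own assumptions, since this is still a request for comment:

```python
import math

def station_multiplier(years_licensed, mobile_or_portable=False,
                       qrp=False, dx=False):
    """Per-station multiplier: triple for very new or very old licences,
    doubled again for each of mobile/portable, QRP and DX."""
    m = 3 if (years_licensed < 1 or years_licensed > 50) else 1
    for bonus in (mobile_or_portable, qrp, dx):
        if bonus:
            m *= 2
    return m

def points_per_member(multipliers, hours_on_air):
    """Each member scores, per hour or part thereof, the sum of every
    group member's multiplier (ordinary stations count 1 each)."""
    return math.ceil(hours_on_air) * sum(multipliers)
```

So three ordinary stations chatting for 2¼ hours score 9 points each, and a freshly licensed, mobile, QRP DX station adds 24 per hour to everyone in the group.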

What about interference?

More than likely, this will stir up the trolls that seek to ruin our experience.  Part of the aim of this is that a lot of people will be listening.  The following is something anyone can do, even the shortwave listeners.

  • Log the following:
    • the time in UTC
    • your location (latitude/longitude or Maidenhead Locator)
    • the signal strength
    • the nature of interference
  • If you can, record the interference
  • If you have a directional antenna, point that in the direction where the signal is strongest.  Use that to measure the signal strength, and log the bearing, along with the antenna type.
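
For listeners who know their latitude and longitude but not their grid square, the Maidenhead locator is easy to compute: fields divide the world into 20°×10° blocks, squares into 2°×1°, and subsquares into 5′×2.5′. A quick sketch (function name mine):

```python
def maidenhead(lat, lon):
    """Six-character Maidenhead locator from latitude and longitude in
    decimal degrees (south and west negative)."""
    lat += 90.0       # shift so all values are positive
    lon += 180.0
    # Fields: 20 degrees of longitude / 10 degrees of latitude each.
    field = chr(ord("A") + int(lon // 20)) + chr(ord("A") + int(lat // 10))
    # Squares: 2 degrees / 1 degree.
    square = str(int((lon % 20) // 2)) + str(int(lat % 10))
    # Subsquares: 5 minutes / 2.5 minutes (24 per square).
    subsq = (chr(ord("a") + int((lon % 2) * 12)) +
             chr(ord("a") + int((lat % 1) * 24)))
    return field + square + subsq
```

For example, W1AW’s published coordinates come out as FN31pr, which matches the locator the ARRL quotes for the station.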

With enough evidence, we can flush out these serial pests once and for all.

When will it be held?

This is open to discussion… I’m thinking Friday or Saturday night.  I’m thinking it should start some time in the evening when the band opens up, maybe after 7:00PM.

The contest should remain open until the last group participating in the contest goes clear… if a group manages to successfully run to dawn the next day, good on them, maybe there should be bonus points for their efforts. 🙂

Let me know what your thoughts are… this is, as I say, a request for comment.  Feel free to get in touch with me directly or leave a comment here.

May 20 2011

During the International Rally of Queensland, it was interesting to observe how people made use of the radios provided for the event. In fact, watching people’s behaviour made it clear to me that none of them had any training in how to use one of these devices. And they all struggled, mostly as a result of each other’s bad habits.

This isn’t an isolated case… my mother, who works at the Brisbane International airport, often complains about the radio etiquette of her colleagues. A lot of people have a radio thrust into their hands and haven’t a clue how to use it. In trying to figure it out, they often pick up the same bad habits.

I myself have figured a lot of this out by making mistakes, and by observing others. A lot of this is also applicable to using regular telephones … I found the tip of standing still when talking helpful when I needed to call emergency services on my mobile phone; at the particular spot where I was at the time, the phone would drop out if I moved more than 6 inches in any direction. Learning not to talk too close to, or too loudly into, a microphone also helps.

The following is a little chart I came up with. No, the stick figures are not XKCD grade, they’re not meant to be. Click on the image below for a copy as a PDF, or get the SVG source here.  File is provided in the public domain, but attribution would be appreciated.  If you use radios in your workplace, and observe this kind of behaviour in your colleagues, you might like to print this out and stick it on a wall somewhere.


Apr 29 2011

Well, the antenna I tuned up in my last post: while it doesn’t work that great on 80m, it did get a contact into Victoria this evening on the AWNOI net.  Terry VK2TEZ near Coffs Harbour gave me a 4-3 signal report, so there’s still lots of room for improvement… part of that was due to static crashes from storms in NSW, but I think with better tuning we should be able to get to a workable antenna.  At the moment the autotransformer I use has ~95 turns, with output taps at 0, 25, 50 and 75 turns.  I think one somewhere between 0 and 25, and/or some extra turns, might help… so I might wind a new one and see where that gets us.
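
For picking the new tap, the usual first approximation is that an ideal autotransformer transforms impedance by the square of the turns ratio (ignoring winding losses, leakage inductance and the antenna’s reactance, which on a small 80m antenna are anything but negligible). A rough sketch of what the existing taps give with 50 Ω across the full winding:

```python
def tap_impedance(z_full, n_tap, n_total):
    """Impedance seen at a tap, with z_full (ohms) across the whole
    winding of an ideal autotransformer.  Impedance scales with the
    square of the turns ratio; losses and leakage are ignored."""
    return z_full * (n_tap / n_total) ** 2

# With 50 ohms across all ~95 turns, the existing taps work out to
# roughly 3.5 ohms (25 turns), 13.9 ohms (50) and 31.2 ohms (75),
# which suggests why a tap between 0 and 25 may suit a very low
# feedpoint impedance.
```

Under that (very idealised) model, the big gap sits between the 0 and 25 turn taps, matching the hunch above.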

The headlight still continues to give me grief.  An interesting discovery though this evening.  Since the battery is no good, I’ve permanently mounted it to the bicycle frame.  This was achieved by removing the plastic bracket which is used to mount the headlight on the handlebars or on the helmet mount (using a rubber O-ring), and replacing this with a bracket bent out of a short piece of aluminium.  It fastens to the bicycle frame at the front right above the front wheel, using a bolt hole normally used for mounting rim brakes (my bike has disc brakes).

The upshot is that the headlight’s casing has a pretty good electrical connection to the bicycle frame.  Turns out this is a big no-no with these lights.  Kiss HF goodbye if you do… you’ll get crap everywhere from 400kHz right up into the VHF.  I’ll have to do some further investigation, but I found that insulating the case from the frame helped with the 400kHz and HF emissions.  I think something parasitic is causing the 2m grief, as this continues (that, or it’s less critical on the case being earthed).

For a while I thought it might’ve been something lurking around 455kHz… the standard IF frequency of most superheterodyne receivers, but alas, I can’t see anything there.  Otherwise it’d explain why it appears to be everywhere.  I definitely suspect it’s not supposed to be oscillating there though, so I think parasitic oscillations are the cause.  I’m slowly researching my own power supply for the LED in this headlamp, so its days are numbered.

The insulation was achieved by breaking a cheap plastic picnic knife, drilling a couple of mounting holes, and mounting the headlight on that.  That quelled the HF interference quite a bit, and I was able to listen to the HF bands on my way into Brisbane.  At least it was nice to listen to something other than that sodding wedding in the UK.  (C’mon fellas, yes, great and all but can’t we just confine it to one station?)

I was concerned about the longevity of this arrangement however.  And as it turned out, I was right to be concerned.  It broke as I approached the Normanby Fiveways.  I went over a bump, heard a crack, and noticed the headlight dangling by the power lead.  I pulled over, threw it in the basket and grabbed the backup headlight.  At least there was one on the helmet, a 1W LED, so I still complied with local laws for night riding.  I didn’t have a mounting for the backup light, I just pointed it forward sitting in the bottom of the front basket, with it on flash as a warning to drivers.

Once at the destination, I reverted the headlight back to being directly mounted on the bicycle frame.  Interference was intermittent, but when it was acting up, it did wipe out 80m with S6 noise.  Not good when most stations are barely making S6 as it is.  I wound up turning off the main headlamp as for the most part I could see where I was going, and I knew the route.  As I got out of town this was less of an issue due to the lack of traffic, and of course I was on bicycle paths or the footpath for 90% of it.  That at least allowed me to hear what was going on with the net.

The other flaw was that the helmet’s speaker connections were acting up… I wound up unplugging the earpiece side of the headset adaptor and using the radio’s internal speaker.  Thankfully I could still use the helmet’s microphone and the rest of the wiring harness, just not the speakers in the helmet.  I noticed this as I pulled out of my street; in fact I was aware there was a problem earlier, but now I know where the problem is.  I’ll get onto it tomorrow.  And I’ll look at a better way to mount this headlamp in an insulated fashion, as an interim measure until I replace the power supply.

Mar 05 2011

This is a question raised on an earlier post of mine.

It’s an interesting comparison between radios and mobile phones.  And some are of the belief that all you do with a radio, is talk on it, or that mobile phones can completely replace radios.  Rather than respond there, I’ve decided there’s enough content there for a completely separate post.  I have highlighted my main arguments here for those who just want to quickly skim through.

Indeed, mobile phones do exist, and they are very handy things.  They do generally come with some sort of hands-free capability.  This is true of my Nokia 3310 … the connectors are available from Jaycar, and the headset schematic is trivial.  This is not true of all mobile phones unfortunately.  Much the same is true of my radios: the FT-897D takes a standard RJ45 connector for the microphone; the FT-290R II takes a more obscure 8-pin “Foster” connector, but even those can be sourced if you look around.

RFI is a worse problem for mobile phones however, GSM seems to have a happy knack of being able to inject itself into almost anything unless you’re careful with your circuit design.

It’s worth considering what the primary point of the exercise is however, and how radio and mobile phones differ.

Mobile phones are great if you want to call someone specific. They are highly optimised for one-to-one conversations.  In fact, it’s highly expensive to do anything else.  Conference calls are a rare thing and you pay through the nose for the privilege.  Mobile phone charges are high enough already — I would not like to be paying for the cost of a one hour conference call twice daily on my way from/to work.

To contrast the fees, it costs me $20/month for a mobile phone service through Telstra (excluding calls).  I rarely see a phone bill above $30, but I’d probably see that climb to triple digits if I used it in the manner I use my radio.  The radio license costs me $65/year, regardless of whether I leave my station packed-up and inoperable, or whether I’m using it all 31557600 seconds of the year.

When I was riding frequently however, I regularly participated in discussions on my commutes.  It does make the ride more enjoyable when you can have a friendly chat on the way in.  The beauty of radio though is that you don’t all have to be close enough to hear each other unaided.

Radios are well suited to group discussions, since radio is an inherently shared medium. At most, a repeater site to relay traffic between stations is all that is necessary.  I’ve also had quite successful simplex contacts on the 2m band over 50km, and overseas on the 40m band.  Mobile phones only achieve coverage over a few kilometres line-of-sight; coverage is extended by cellular towers, which perform a similar function to repeaters.

If you’re in a discussion on the radio, good operating practice states that you leave a gap between transmissions so that other stations may break in if needed.  The breaking station may be someone wanting to get in touch with one of the other operators on frequency, may be an interested party, or could even be a person in distress.

It’s relatively simple for someone to jump in on a conversation.  Mobile phones however, prohibit this unless, once again, you pay severely for the privilege.  How often have you been in a situation where you’ve been trying to chase a caller off the phone so that the line is free for that important call you’ve been waiting for?  Not such a problem with radio.

Mobile phones give you a certain degree of privacy in communications.  Encryption standards vary between mobile phone standards, but all of them (except AMPS, which is now extinct) provide some means of privacy.  Radios generally don’t unless you pay through the nose for a set and a suitable license.  Encryption is also forbidden on amateur bands.

Both allow a certain amount of experimentation.  If you have a mobile phone that provides an antenna socket, it is theoretically possible to construct your own antennas.  You are not however able to alter the transmission mode or frequency of operation, nor are you able to construct your own mobile phone (homebrewing) without significant expense, as the device you construct must be tested and approved by local authorities before you may connect it to a network.  (In Australia, the body responsible is the ACMA, and the approval you need comes in the form of a “regulatory compliance mark”, formerly “A-tick”.)

You can however readily experiment with software running on top of modern smartphones, if your phone is that new.  (Mine isn’t.)  Or, if you have a 3G-or-newer capable phone (again, mine isn’t), you can hook a small computer up and use standard VoIP software.

Radios on the other hand, if your license permits it (mine does), can be completely constructed from scratch.  You choose the frequency and mode, there are boundaries where you cannot go, but there’s still a hell of a lot of freedom that mobile phones do not provide.  All amateur transceivers have socketed antennas, allowing experimentation with other antenna types.  Multi-band sets permit experimentation with different frequency bands, all of which differ in their properties.  Transmission modes include pretty much all analogue modes, and in most license classes, many forms of digital communication.

Mobile phones are typically fairly easy to use (there are, however, people that never seem to get it), while radios almost always require a certain level of training.  Amateur radio requires you to sit two or three separate exams (usually two written exams covering theory and regulations, and a practical test).

Some might ask why I use such an old mobile phone.  Well, you’ll notice the FT-290R II isn’t a spring chicken either.  I use stuff because it does the job.  The old Nokia 3310 has been solid and reliable.  There’s minimal “fluff” to cause problems.  Someone dials my number, it rings.  I dial a number, it calls that person.  Text messages, easy.  My needs don’t require anything more sophisticated.  Don’t unnecessarily complicate, I say.  When I’m out and about, this means I’m contactable two ways … primarily by radio, but if the phone rings, I can pull over and plug the phone in instead to take the call.

In my situation on a bicycle, it is also paramount that I do not have my hands tied up manipulating radio/phone controls. My solution was to wire up a small keypad which provides push-to-talk and four directional buttons.  On the mobile phone, the PTT becomes my answer button, and I can dial a person by momentarily pressing the button, waiting for the prompt, and announcing the “voice tag” of the person in the phone book.  The phone then rings that person automatically.

On the radio, I mainly use memory channels, so I’m moving up and down the memory channels.  Usually I just switch to a given frequency, and stay there.  When I want to talk, I press the button down — or, more recently I added a switch which is equivalent to “holding the button”.  So I just flick the switch to go to transmit, and flick it back again.  In the meantime, I’m able to use my hands for operating the bicycle.

Contrast this with trying to juggle a netbook computer running a VoIP package such as Skype.  It’d be a nightmare; those user interfaces are not designed for mobile operation. They’re simply not appropriate.  SIP-based VoIP is better in some ways, as you can code your own application, but even then you’re at the mercy of the mobile phone carrier’s network.  VoIP is very sensitive to NAT and dynamic IP addresses, and I think operating mobile in this manner would be a bit much to expect.  Skype also cannot handle a group as large as radio can.  (SIP can handle over 200 participants in a conference, limited by server bandwidth.  On the radio, I’ve regularly participated in nets with more than 10 people on air at a time.  Skype is limited to 5 IIRC, or maybe you pay for more.)

Amateur radio is largely infrastructure independent. On the bicycle I can get around obstacles that would be impassable in a car.  With high capacity batteries, and a reasonable power set on a high mountain top, I can achieve significant simplex range, thus allowing me to relay traffic over great distances, without any requirement for intermediate infrastructure.

“Ohh, I’ll just use the phone for that” you say.  Yeah, right.  Try that in the Lockyer Valley just now.  Many of the mobile phone towers went for a swim, as did the exchanges.  Areas around Grantham are without any forms of mobile or land-line based telephony.  And of course, no Internet.  The same situation was the case for people caught up in the Black Saturday bushfires down in Victoria.  I’d imagine communications are under very heavy strain in Christchurch at the moment.

Mark Pesce made a very valid point in his LCA2011 keynote: communications can also be disrupted for political reasons, as has happened in Egypt and Libya.  What do you do then?  Radio’s not perfect, but it sure beats being left without a means to let people know you’re okay.  With mobile phones, you depend on others to bring infrastructure online before you can make a call from one phone to another.  (Unless you experiment with something like the Serval Batphone, which has its limitations.)

So one does not completely replace the other; they are complementary.  The theory requirement keeps a lot of people away from amateur radio; however, I’m happy to report I’ve never received a telemarketing call on the radio. 🙂  More to the point, there is more to amateur radio than just talking to people, just like there’s more to the police force than just arresting people.

As for me, radio has fascinated me for a long time.  I first became interested in it from a very young age, but I particularly got into it after studying how it worked at university.  This is what led me on to amateur radio.  So for me, it’s as much technical as it is social.  I enjoy meeting up and talking with people, but I also enjoy the experimental aspect of it.

At the moment, a large amount of my energy is going into bicycle mobile operation, particularly with regard to HF communications.  This does necessitate big antennas.  Antenna installations are always a trade-off between physical size, efficiency and bandwidth, and it can be a real challenge to get things working, but it’s rewarding when it pays off.

Some would argue: “Why bother? Just use a mobile phone.”  That’s like asking a car enthusiast, “Why muck around under the bonnet when you can take your car to the garage down the road?”  Or the avid gardener, “Why bother growing your own veges when there’s a greengrocer in the shopping centre?”  Yes, such people do exist.

I would also like to point out that the commercial world has gained a lot from home experimenters.  You use a NAT router for your home Internet connection?  What OS does it run?  Many run Linux.  Did we get Linux from a big commercial organisation originally?  No, it came from an avid homebrewer of operating system kernels, and was never intended to be “big and professional like gnu”.  Did we get Single Sideband from the commercial world?  No, it was an invention inspired by Amateur Radio.  Likewise with a lot of the high-frequency design techniques in mobile phones today.  Heck, in the future we’ll probably be adding Codec2 to that list.

The world needs amateurs of all persuasions.  For this reason, declaring something “obsolete” just because the things you do with it can also be done with a more contemporary technology is a short-sighted way of viewing things.  The amateur world benefits from the professional world, and vice versa.  It’s often the case that someone who works in a particular industry for a living goes home and hacks on various projects related to that industry for fun in their spare time.

So, “why not just use a mobile phone?”  Because I find radio fun, I enjoy it, and I hope that some day, what I learn can be shared and applied in a professional setting to improve technology as a whole.  After all, isn’t having fun what the world is all about?