Aug 23 2015

Something got me thinking tonight.  We were out visiting a friend of ours and it was decided we’d go out for dinner.  Nothing unusual there, and there were a few places we could have gone for a decent meal.

As it happens, we went to a bowls club for dinner.  I won’t mention which one.

Now, I’ll admit that I do have a bit of a rebel streak in me.  Let’s face it, if nobody challenged the status quo we’d still be in the trees; instead, someone decided they liked the caves better, and modern man developed from there.

In my case, I’m not one to make a scene, but the more uptight the venue, the more uncomfortable I am being there.  If a place feels it necessary to employ a bouncer, or to put a big plaque out front listing rules beyond what ought to be common sense, the alarm bells start ringing in my head.

Some rules are necessary, and most of these are covered by the laws that maintain order on our streets.  In a club or restaurant, okay, you want to put some limits: someone turning up near-starkers is definitely not on.  Nobody would appreciate someone covered in grease or other muck leaving a trail throughout the place everywhere they go, nor should others be subjected to a T-shirt with text or imagery that is in any way “offensive” to the average person.

(I’ll ignore the quagmire of what people might consider offensive.  I’m sure someone would take exception to me wearing largely blank clothing.  I, for one, abhor branding or slogans on my clothing.)

Now, something that obstructs your ability to identify the person in question, such as a full-face balaclava, burqa or full-face helmet: there are quite reasonable grounds for a ban there.

As for me, I never used to wear anything on my head until late in high school, when I noticed how much less I was distracted by overhead lighting with something on.  I’m now so used to it that I consider myself partially undressed if I’m not wearing something; it just doesn’t feel right.  I don’t do it to obscure my identity; if anything, it makes me easier to identify.  (Coolie hats aren’t common in Brisbane, nor are spitfire or gatsby caps.)

It’s worth pointing out that the receptionist at this club not only had us sign in with full name and address, but also checked ID on entry.  So misbehaviour would be a pointless exercise: they already had our details, and CCTV would have shown us walking through the door.

The bit that got me with this club was that, in amongst the lengthy list of things they didn’t permit, they listed “mens headwear”.  It seemed a sexist policy to me.  Apparently women’s headwear was fine, and indeed, I did see some teens wearing baseball caps as I left; no one seemed to challenge them.

In “western society”, many moons ago, it was considered “rude” for a man to wear a hat indoors.  I do not know what the rationale behind that was.  Women were exempt then from the rule, as their headwear was generally more elaborate and required greater preparation and care to put on and take off.

I have no idea whether a man would be exempt if his headgear was as difficult to remove in that time.  I certainly consider it a nuisance having to carry something that could otherwise just sit on my head and generally stay out of my way.

Today, what people of either sex wear on their heads, if they wear anything at all, is mostly unisex in nature and generally not complicated to put on or remove.  So the reasoning behind the exemption would appear to be largely moot now.

Then there’s the gender equality movement to consider.  Women fought for years to have the same rights as men.  Today some inequality remains, but the general consensus seems to be that things have improved in that regard.

This said, if doing something is not acceptable for men, I don’t see how being female makes it better or worse.

Perhaps then, in the interests of equal rights, we should reconsider some of our old customs and their exemptions in the context of modern life.

Feb 28 2015

Well, it’s been about 7½ years since I bought my first bike and started riding, and really, about 5 years since I started riding seriously as a means of transportation.

In late 2011 my father and I went halves in a pair of GPS/CB radio units.  It was a two-for-one deal, so we bought the two units at about $400 each; normally they’d be about $700 individually.  That’s where I started logging the distance I covered: I just used the in-built odometer on the GPS, resetting it whenever the bike went in for service.

When I got the mountain bike, I realised I needed to track the distance covered by each bike to ensure they all went in for their 1000km service on time.  So, being a programmer by trade, I coded up a crude CGI/Perl script that used a SQLite back-end to log the odometer readings.  It was a simple HTML form where I could enter the distance at regular intervals.  Crucially, it worked with the “feature phone” I used at the time.
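Something along these lines is all it really takes.  This is a rough sketch rather than my actual script; the table layout, field names and file path here are invented for illustration.

    #!/usr/bin/perl
    # Sketch of an odometer-logging CGI (illustrative names throughout).
    use strict;
    use warnings;
    use CGI;
    use DBI;

    my $q   = CGI->new;
    my $dbh = DBI->connect('dbi:SQLite:dbname=/var/lib/bikelog/odometer.db',
                           '', '', { RaiseError => 1 });

    if (defined $q->param('odometer')) {
        # Store the raw reading with a timestamp; the differences are
        # worked out later by a view.
        $dbh->do('INSERT INTO readings (bike, logged_at, odometer)
                  VALUES (?, ?, ?)',
                 undef, scalar $q->param('bike'), time(),
                 scalar $q->param('odometer'));
    }

    # A plain HTML form simple enough for a feature phone's browser.
    print $q->header('text/html'),
          '<form method="post">',
          'Bike: <input name="bike"> ',
          'Odometer (km): <input name="odometer"> ',
          '<input type="submit" value="Log"></form>';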

The SQL views (there’s no such thing as stored procedures in standard SQLite3) took care of actually calculating the differentials, and I used those to track my progress.  So far so good.  I’ve had this in place since mid-2012, and I’ve since brought in some of my data from early 2012, so I’m now starting to see some trends.
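The view itself is nothing fancy either.  Something like the following captures the idea; again this is a sketch with invented names, the first reading for each bike simply contributes zero, and the odometer reset at service time would need a little extra handling.

    #!/usr/bin/perl
    # One-off setup sketch: derive per-interval distances from raw readings.
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:SQLite:dbname=/var/lib/bikelog/odometer.db',
                           '', '', { RaiseError => 1 });

    $dbh->do(q{
        CREATE VIEW IF NOT EXISTS distances AS
        SELECT r.bike,
               r.logged_at,
               r.odometer - COALESCE(
                   (SELECT p.odometer
                      FROM readings p
                     WHERE p.bike = r.bike
                       AND p.logged_at < r.logged_at
                     ORDER BY p.logged_at DESC
                     LIMIT 1),
                   r.odometer) AS distance  -- first reading counts as 0
          FROM readings r
    });

    # Yearly totals, much like the table below:
    my $rows = $dbh->selectall_arrayref(
        q{SELECT strftime('%Y', logged_at, 'unixepoch') AS year,
                 SUM(distance)
            FROM distances GROUP BY year ORDER BY year});
    printf "%s  %.2f\n", @$_ for @$rows;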

Distances by year

Year Distance (km)
2012 5594.9
2013 4837.78
2014 4593.42

Am I getting lazy? Well, hard to say there. I go out less on the weekends and have also optimised my routes to reduce distances somewhat.  Some of this is weather-dependent: in the heat one does not feel like going outdoors.

Distance by month-of-year

Month Distance (km)
01 282.59
02 406.20
03 409.10
04 377.42
05 511.29
06 493.36
07 330.01
08 532.05
09 494.21
10 470.14
11 370.27
12 394.13

I’m not sure why there’s a lull in activity around July, but the most active months seem to be May and August.  The lull in January can be somewhat attributed to the end of the Christmas break.  I guess if anything, I should aim to be more active in July when the weather is the coolest.

Guess I’ll be keeping an eye on what happens over time with these stats and see if I can get them up a bit.

The following graph will continuously update as I pump data in. We’ll see what happens.

[Graph: distance by month-of-year]

Jul 19 2014

My only mode of transport these days is a bicycle.  I might get lifts from other people on occasion, but normally I ride everywhere.

It’s a great way to get around: a good form of exercise, cheap, and whilst I won’t be breaking any speed records, it’s not overly time consuming.  I spend more time waiting for buses and trains than I do getting places on the bike.  The downside is what to wear whilst cycling.  Car drivers have a hard enough time seeing a cyclist as it is, so I feel safer if I’m at the very least wearing something light-coloured, ideally day/night high-visibility gear compliant with AS/NZS 4602:1999.  I’ve been cycling as my main mode of transport for nearly 5 years now, and over this time I’ve tried a number of things for clothing.

Regular clothing

“Normal” clothing was naturally what I started out with.  What I find is that it quickly wears out, particularly trousers, when subjected to this sort of treatment.  The cycling movement puts a lot of stress on the crutch, and thus I find they give out within a year or two.

Cycling is also very physical, so one will sweat a lot.  So at the very least you’ll want a shirt to wear cycling, and another to change into when you get to your destination.  The high-visibility polo shirts work well for this: they’re cheap and lightweight, and keep the sun off well without being too hot.

Work clothing

By this I mean industrial work clothing.  After finding that my trousers were wearing out at an alarming rate, I decided I’d go for more industrial type clothing.

I hate wearing belts, so I looked around and bought some overalls.  My preference is for ones that have a front zip, which are a bloody pain in the arse to find in this country!  The likes of King Gee, Bisley, Worksense and many others tend to make those sorts for markets like NZ, but over here they tend to sell only stud-fastening ones, which I find more time consuming to do up.  With a zip you’re done in about 2 seconds; with studs you’ll be clipping them together for about 10.  But I digress…

The ones I found were medium-weight ones, 290gsm or something like that.  In the winter, they’re okay, but once the fabric gets soaked with sweat one’s body temperature then becomes rather uneven.  In summer they’re often too hot to consider.

Lighter-weight ones might fare better in the sweat stakes, not sure about durability.  Given the high cost ($70~$120 a pair) I’ll just have to keep looking.

Ones made out of the same material as the high-visibility polo shirts could work well; no idea where to find them, though, if they exist at all.

Seeking the all-weather cycling suit

Some at this point would be screaming at me “why not lycra”?  Well, I’ve never been a fan of lycra and have no intention of becoming a MAMIL.

One evening coming home a few weeks ago, we had some very windy weather.  It’s mid-winter right now, and this wind was going right through me.  My clothes were wet with sweat, which, with the wind, made the cold weather that much worse.

This got me thinking: what have I got, or what can I get, that will block the wind without making me sweat ridiculous amounts?  It’s presently winter, so now’s a good time to try some experiments and see how the candidates fare as the weather shifts towards the more humid summer.  If I’m still wearing the same clothing in July 2015, I’ll be onto something.

Breathalon spray coveralls

I had some Breathalon coveralls lying around; I had previously worn these in wet weather and found they were not bad.

I bought this pair for about $15 off eBay, but they’re rare as hen’s teeth.  One company sells them for about the AU$150 mark, so they’re not the cheapest.  Amongst my gripes: they’re not the most comfortable fit, and they have a one-way zip, which is an annoyance when nature calls.  Apart from that though, they’re a bright yellow, and they’re breathable.

The other gripe I have is no pockets: I tried cutting access slits in this particular pair to gain access to the pockets in my trousers.  This proved to be unwise: they now leak in wet weather, so I’ll have to look at sealing those slits somehow.

I tried them one week: I found I sweat less than I did wearing other clothing. With just a lycra stinger suit underneath, I got to work mostly dry and comfortable. This was in dry weather. Summer humidity might be another matter, but in bright sunny winter weather, they were fine. However, they’re very hard to get hold of, and are still quite expensive.

That said, they’re probably 60% of the way there.

Disposable clothing

With the above experiment being largely successful, I considered what else would make the grade. The Breathalon coveralls were okay, but they lacked some features. Could I find some material and make my own?

Will Rietveld provided the inspiration for a cheap alternative: Tyvek coveralls.  These are about AU$10 a pair, are generally white in colour (okay, not strictly daytime high-vis, but at least not black like motorcycle rainsuits), very lightweight, and apparently not much different to the old Gore-Tex for breathability.

Before doing this, I did some research.  I had seen these before but had dismissed the idea, thinking: they’re disposable, surely they won’t last!  Looking around, I found Barefoot Jake’s article which gave them the thumbs up, and Ken K’s forum post giving them the thumbs down.  In the forum post, the reported failure was in the seams.  The other two articles mention taping the seams to prevent this problem.

For the cost I thought it worth giving a go.  There are a few different fabrics used in this sort of clothing, Tyvek being just one.  They’re usually described in terms of protection classes.

Class 6 coveralls tend to be very flimsy, made from single-layered polypropylene, and are by far the cheapest at ~AU$5 a pair.  You can just about see through them, and wind and water will pass right through.  Maybe you can get some in a bright colour, in which case they’re about as good as a high-vis vest.  For keeping wind and water out: useless.

Class 5 coveralls are made from slightly heavier material such as SMS fabric and are more expensive (~AU$8 a pair).  They’re more opaque (although you can still see clothing through these), will repel water and light spray and block a small amount of wind.  If you’re like me, and a bit self-conscious, you could wear these over the top of more conventional cycle clothing.

I found that water will pool on the fabric, and they are a bit more breathable.  However, the slight transparency is a little disconcerting.  They’re worth a look.

Class 4 coveralls are used for things like asbestos removal.  Materials vary, but amongst these are the Tyvek ones recommended by Will’s article.  They can be had for about AU$10 a pair.

I decided to start with these, buying 3 pairs.  I noted that the seams were taped in a bright orange; the fact they were taped seemed to suggest that someone had noticed this particular failure mode and had paid particular attention to the problem.  These ones I think are the Hazguard MP4 type material: similar to Tyvek, but with a plastic-like coating.

As I’m after a single-piece suit, I dispensed with the scissors.  When I got home, I grabbed a pair, turned a tap on and ran the water over them to see what the waterproofing was like.  The water pooled, and running my hand under the pool did not reveal any leaks.  So from that perspective, they should do exactly what I’m after.

Things were getting draughty outside, so I put the pair on.  After wearing them for a few hours basically just pottering around the house, I hadn’t broken out into a ball of sweat, so the breathability was there; a PVC suit would have had me sweating like a pig by then.  I then wore them on my way into work to try them out properly.

First experiments with Class 4 coveralls

First thing that became apparent: as I cycled, the back part ballooned out.  Not necessarily a bad thing, as it made me very obvious to drivers by enlarging my apparent size.  Pedalling appeared to act like a pump, pushing air into the suit, and the air appeared to be trapped.  Like in Will’s experiment, I found that I was starting to sweat after about 20 minutes, and when I got to work, I was noticeably more sweaty.  However, it was just humidity: I didn’t feel like I was overheating, nor did I feel cold when the wind blew.

So not quite there, but close.  I can buy Tyvek material on a roll cheap enough, so maybe with some work, we can improve on this.

Class 5 coveralls experiment

Since the humidity really did build up quickly, I thought maybe there was something a little more breathable.  I bought a pair of coveralls that were an SMS-type fabric.  The seams are not taped, so I suspect these will probably have a blow-out at some point.  I did the same waterproofing test and found the water pooled there also; however, they’re only considered splash resistant, so I suspect the water would seep through eventually.

It was at this point I noticed they were slightly more transparent.  So the following Monday I cycled in them, with one of my lycra stinger suits underneath.  I got to work, not quite as sweaty as the previous week, but still with a noticeable amount of moisture.

One hypothesis: with the Breathalon suit, I also had my stinger suit underneath.  Maybe that was helping by soaking up the sweat rather than letting it bead up on my skin, and allowing it to be more efficiently evaporated?

Class 4 + stinger suit

I tried the stinger suit underneath the class 4 coveralls, and found that the amount of sweat hadn’t changed.  In fact, doing this made things worse: the moist air didn’t dissipate fast enough, and once I cooled down, the cold sweat kept me a little too cool.  Without the stinger suit, I’d eventually dry out inside the coveralls after about 15 minutes, but with the stinger suit, I was still damp after 30.

Alternative options

So I hit the web again.  Was the answer to buy another pair of spray coveralls like the Breathalon pair?  There aren’t too many options around here in Australia.  Elliots did make some out of their Zetel material, but they’ve stopped making those (pity, they had pockets!).  Castle Clothing over in the UK make something that looks ideal.  Alas, I tried emailing them to see if they had an Australian distributor — I’m yet to hear back.

Neither of these options is meant for cycling.  Looking around I saw the BikeSuit.  Clearly Olaf Wit had a similar idea, and actually got his to production.  A few comments:

  • The bikesuit comes in one colour: black.  There are some reflective stripes, so I guess that’s kinda class N (night-time: i.e. reflective) high visibility, but I’d like class D (daytime: i.e. bright colour) too.  In fact, if I had to choose between them, I’ll take class D over class N.
  • The idea of using ventilation to prevent sweat build-up looks like just what the doctor ordered.  That said, it’s worn over regular clothes; I sweat in regular clothes without any waterproof gear over the top, so surely this will not improve the situation?
  • The suit packs up into a bag about the volume of two soccer balls.
  • Watching the video, it appeared cumbersome to put on.  There are zips everywhere.  The fellow takes it out of its bag at time 0:20.  At 0:50, he’s still adjusting things.  10 seconds later, he’s ready to start cycling.
  • They cost over US$340.  Sure breathable and durable fabric can be expensive, but Ouch!

The class 4 coveralls: I timed myself, and it took about 50 seconds before I was zipped up, with work boots on that I did not remove.  About the only things the BikeSuit has over the disposable coveralls are ventilation, durability and built-in shoe covers.  It loses on price, availability and visibility.

Poor man’s “bike suit”?

That got me thinking: could I turn these coveralls into a poor man’s bike suit?  I had observed how the back of my coveralls ballooned out; what if I made some ventilation holes?

I tried making 10 small holes just below the line of elastic at the back.  I covered the area over with plastic tape first to give the material some reinforcing, then punched the holes.  The next day I got to work not quite sweat-free, but certainly much drier than before.  About on par with my experiment in the Breathalon suit.

I’m thinking if I cut a slit horizontally about 30cm long, then glue (sewing is not good with Tyvek) a triangular patch of mesh fabric maybe 40cm wide and 60cm tall to the inside, that would allow the coveralls to vent.  Fold the material over at the bottom so the bottom of the slit is covered by a layer of material, or use some sheet Tyvek to make a flap, and I think I might be onto a low-cost alternative.  Tier Gear sell sheet Tyvek, so a metre or two of that would suffice for adding the extra flaps needed.

As for day/night high visibility: they exist.  More expensive obviously, but they do exist.

The only real question is one of durability.  Thankfully these things pack up so small and are lightweight enough, I can have a spare pair on the bike for wardrobe malfunction emergencies.  They should be good for WICEN events too: often I’m out on a checkpoint in the wind and rain.  Time will be the ultimate test, we shall see.

Apr 12 2014

Well, it seems user interfaces are going around in circles again.  Just recently, I’ve been coming up against the Windows 8.1-style UI.  One of my colleagues at work uses it as his main OS, and we also have a VM image of Windows Server 2012 R2 (a 180-day evaluation).  One thing I will say: it’s a marginal improvement on Windows 8.  But is it all that new?

[Screenshot: win2012r2]

Now, the start screen is one of the most divisive aspects of the Windows 8 UI.  Some love it, some hate it.  Me?  Well the first incarnation of it was an utter pigsty, with no categorisation or organisation.  That has improved somewhat now that you can organise tiles into groups.

But hang on, where have I seen that before?

[Screenshot: winnt31-1]

That looks… familiar… Let me scroll over to the right a bit…

[Screenshot: winnt31-2]

Ohh, how rude of me!  Allow me to introduce you to an old acquaintance…

[Screenshot: winnt31-3]

This is Windows NT 3.1, what one might call the first real ancestor of Windows 8.1.  Everything that came before was built on DOS; Windows 8.1 basically calls itself Windows NT 6.3 beneath the UI.  This is the great-great-great-great-great-great-great-great-grandparent of today’s Windows desktop OS.  Yeah, it’s been that many releases: 8.0, 7, Vista, XP, 2000, NT 4, NT 3.51 and NT 3.5 all sit in between.

Now the question comes up, how is it different?  Sure there are similarities, and I will admit I exaggerated these to make a point.  The Program Manager normally looks more like this:

[Screenshot: winnt31-4]

The first thing you’ll notice is that the Program Manager (the closest you’ll get to the “start screen”) doesn’t normally occupy the full screen, although it can if you ask it to.  In fact, it will share the screen with other things you might have open:

[Screenshot: winnt31-5]

This is good for novice and power user alike.  The power user is likely wanting to launch some other application and will be thinking about the job at hand.  The novice might have instructions open that they’re trying to follow.  In neither case do you want to interrupt them with an in-your-face full-screen launcher; a desktop computer is not a smartphone.

The shortcoming of the original Program Manager interface in terms of usability was that you were limited to two levels of hierarchy, with the top level containing only program groups, and the lower level containing only program icons.  The other shortcoming was in switching applications: you either had to know the ALT+TAB shortcut, or minimise/restore full-screen applications so you could see the other applications or their icons.  There was no status area either.

Windows 95 improved things in that regard: the Start menu could show arbitrary levels, and the taskbar provided both a status area and a screen region dedicated to window selection.  As the installer put it, changing applications was “as easy as changing channels on TV”.  (Or words to that effect.  I’ve run the Windows 95 installer enough times to memorise two OEM keys off by heart, but not well enough to remember every message word-for-word.)  Windows NT 4.0 inherited this same interface.

This remained largely unchanged until Windows XP, which did re-arrange some aspects of the Start menu.  Windows 2000/ME made some retrograde changes with respect to network browsing (that is, “My Network Places” vs “Network Neighborhood”[sic]) but the general desktop layout was the same.  Windows Vista was the last version to offer the old “classic” menu with it disappearing altogether in Windows 7.  The Vista/7 start menu, rather than opening out across the desktop as you navigated, confined itself to a small corner of the screen.  Windows 8 is the other extreme.

In Windows 8, the start screen takes over.  There’s no restore button on this “window”; it’s all or nothing.  Now, before people point me out to be some kind of Microsoft-hater (okay, I do indulge in a bit of Microsoft-bashing, but they ask for it sometimes!) I’d like to point out there are some good points.

The Start screen isn’t a bad initial start point when you first log in, or if you’ve just closed the last application and wish to go someplace else.  It’s a good dashboard, with the ability to display status information.  It’s also good for touch use, as the UI elements are well suited to manipulation by even the chubbiest of digits.

What it is not, is a good launcher whilst you’re in the middle of things.  A more traditional Start menu with an item to open the Start Screen would be better here.  It also does a very poor job of organising applications: something Windows has never been good at.

So I guess you’ve all worked it out by now that I’m no fan of the Windows UIs in general, and this is especially true of Windows 8.1/2012 R2.  Not many UIs have a lower approval rating in my opinion.  The question now, of course, is “how would I do things differently?”  It’s all very well to have an opinion about what I don’t like, and so far the IT industry has pushed back on Microsoft and said: “We don’t like this”.  The question is: how do they improve?

Contrary to what some might think, my answer is not to roll the UI back to what came in Windows 7.  The following are some mock-ups of some ideas I have decided to share.

Meet a desktop that for now we’ll call the “Skin The Cat” desktop.  The idea is a desktop that provides more than one way to perform typical actions to suit different usage scenarios.  A desktop that follows the mantra… “there’s more than one way to skin a cat”.  (And who hasn’t wanted to go skin one at some point!)  No code has been written; the screenshots here are entirely synthetic, using The Gimp to draw what doesn’t exist.

[Mock-up: mockup-desktop]

So here we see the STC desktop, with two text editors going and a web browser (it’s a quiet day for me).  I’ve shown this on a 640×480 screen just for the sake of reducing the amount of pixel art, but this would represent a typical wide-screen form-factor display.

You’ll notice a few things:

  • Rather than having the “task bar” down the bottom, I’ve put it up the left side.
  • The launch bar (as I’ll call it) has two columns, one shows current applications, the other shows that application’s windows.
  • Over on the right, we have our status area, here depicting volume control, WiFi, battery and mail icons, and a clock down the bottom.
  • Bottom left is a virtual desktop pager.  (MacOS X users would call these spaces.)

Why would I break Microsoft tradition and use this sort of layout?  My usual preference is to have the launch bar up the top of the screen rather than down the side, as that’s where your application menus are.  However, monitors are, for better or worse, getting wider rather than taller.  So while there’s plenty of space width-wise, stacking bars horizontally leads to one being forced to peer at their work through the proverbial letter-box slot.  This leaves the full height for the application.

I group the windows by the application that created them.  All windows have a border and title bar, with buttons for maximizing, iconifying (minimising, for you Windows folk) and closing, as well as a window options menu button, which also displays the application’s icon.

So how does this fit the “skin the cat” mantra?  Well the proposal is to support multiple input methods, and to be able to switch between them on the fly.  Thus, icons should be big enough you can get your thumb onto them with reasonable accuracy, most common window operations should be accompanied by keyboard actions, and the controls for a window should be reasonably apparent without needing special guidance.  More importantly, the window management should stay out of the way until the user explicitly requests its attention by means of:

  • clicking on any title bar buttons or panel icons with the mouse
  • tapping on the title bar of a window or on panel icons with one’s finger
  • pressing an assigned key on the keyboard

Tapping on the title bar (which is big enough to get one’s finger on) would enable other gestures momentarily to allow easier manipulation with one’s fingers.  The title bar and borders would be enlarged, and the window manager would respond to flick (close), pinch (restore/iconify or maximise) and slide (move) gestures.

Tapping the assigned keyboard key or keystroke (probably the logo/”Windows”/command key, the inference being that you wish to give the window manager a command) would bring up a pop-up menu with common window operations, as well as an option for the launch menu.

Now this is good and well, but how about the launcher?  That was one of my gripes with Windows for many years after all… Well let’s have a look at the launch menu.

[Mock-up: mockup-launcher]

So here we’ve got a user that uses a few applications a lot.  The launcher provides three views, and the icons are suggestive of what will happen.  What we see is the un-grouped view, which is analogous to the tiles in Windows 8, with the exception that these are just program launchers; no state is indicated here.

The other two views are the category view, and the full-screen view.  These are identical except that the latter occupies the full screen like the present start screen in Windows 8 does.  Optionally the category view (when not full screen) could grow horizontally as the tree is traversed.  Here, we see the full-screen version.  Over on the right is the same list of frequent applications shown earlier.  As you navigate, this would display the program items belonging to that category.

[Mock-up: mockup-launcher-fullscrn]

Here, we choose the Internet category.  There aren’t any program icons here, but the group does have a few sub-groups.  The “Internet” category name is displayed vertically to indicate where we are.

[Mock-up: mockup-launcher-fullscrn-inet]

Looking under Web Browsers, we see this group has no sub-groups, but it does have two program icons belonging to it.  The sub-group name, “Web Browsers”, is displayed vertically.  Tapping/clicking “Internet” would bring us back up to that level.

[Mock-up: mockup-launcher-fullscrn-inet-web]

In doing this, we avoid a wall of icons, as is so common these days on Android and iOS.  The Back and Home links go up one level, and up to the top respectively.

The launcher would remember what state it was last in for the next time you call it up.  So you might organise specialised groups for given tasks, with the applications you need for each task in them.  It’d remember you opening that group last time, so you’d be able to recall those applications as needed.

Window management is another key feature that needs to be addressed.  The traditional GUI desktop has been a cascading one, where dialogue boxes and windows are drawn overlapping one another.  Windows 8 was a throw-back to Windows 2.1 in some ways, in that the “Modern” (god I hate that name) interface is fundamentally a tiling one.

There are times when this mode of display is the most appropriate, and indeed, I use tiling a lot.  I did try Awesome, an automatic tiling window manager, for a while, but found the forced style didn’t suit me.  A user should be able to define divisions on a given desktop and have windows tiled within them.  The model would be similar to how spreadsheet cells are resized and optionally merged.  A user might initiate tiled mode by defining the initial number of rows and columns (which can be added to or subtracted from later).  They can then “snap” individual windows to groups of cells, resizing the divisions as required.  Resizing and moving a window would then move in units of one “cell”.
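To make the cell model concrete, the snapping arithmetic amounts to little more than rounding each window edge to the nearest cell boundary.  The fragment below is purely an illustration of that idea, not part of any implementation.

    #!/usr/bin/perl
    # Illustration only: snap a window rectangle to the nearest group of
    # cells on a desktop divided into $cols x $rows equal cells.
    use strict;
    use warnings;
    use POSIX qw(floor);

    sub snap_to_cells {
        my ($win, $screen_w, $screen_h, $cols, $rows) = @_;
        my ($cw, $ch) = ($screen_w / $cols, $screen_h / $rows);

        # Round each edge to the nearest cell boundary...
        my $left   = floor($win->{x} / $cw + 0.5);
        my $top    = floor($win->{y} / $ch + 0.5);
        my $right  = floor(($win->{x} + $win->{w}) / $cw + 0.5);
        my $bottom = floor(($win->{y} + $win->{h}) / $ch + 0.5);

        # ...but always occupy at least one cell.
        $right  = $left + 1 if $right  <= $left;
        $bottom = $top  + 1 if $bottom <= $top;

        return { x => $left * $cw,             y => $top * $ch,
                 w => ($right - $left) * $cw,  h => ($bottom - $top) * $ch };
    }

    # A roughly half-screen window on a 2x2 grid snaps to the left column.
    my $snapped = snap_to_cells({ x => 10, y => 5, w => 300, h => 470 },
                                640, 480, 2, 2);
    printf "%dx%d at %d,%d\n", @{$snapped}{qw(w h x y)};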

At the request of the user, individual windows can then be “floated” from the cellular layout allowing them to be cascaded.  Multiple windows may also occupy a cell group, with the title bar becoming “tabbed” (much like in FluxBox) to allow selection of the windows within that cell group.

I haven’t got any code for the above; these are just some ideas I’ve been thinking about for a while, particularly this afternoon.  If I get motivated, we may see a “skin the cat desktop” project come into existence.  For now though, I’ll continue to do battle with what I use now.  If someone (commercial or open-source motivated) wants to try and tackle the above, you’re welcome to.  However, the fact the ideas are documented here means there is prior art (in fact, there is prior art in some of the above), and I’d appreciate a little attribution.

Mar 27 2014

Well, the search goes on.  This saga started out as questions about where MH370 had gone.

As if a large aircraft could just disappear into thin air.

There’s no question that the plane has crashed. Almost certainly it crashed out in the ocean, a particularly nasty stretch of ocean that’s home to some of the roughest seas.

Sure, they’ll have had life rafts and other safety devices on board, but anyone caught out in the ocean for this long would be long dead by now.  To the families of those on board, I do offer my condolences.

The question remains as to why this happened though. I was thinking about this, this morning. The plane was destined for Beijing. It is understood it crashed somewhere south-west of Perth.

Far be it from me to raise conspiracy theories, but it got me thinking: was someone on that plane trying to take an Al Qaeda-esque swipe at Perth?

Update: Discussing the thought here, we rather suspect the 2-hour-long recording on the flight recorder will be largely silent.  It’s too far west for Perth, although there might’ve been intentions to alter course later.  Something caused the course for the autopilot to be set down south, and the plane flew more-or-less dead straight.  There are questions being raised about one of the pilots’ marital situation, which might hint at a suicide attempt.  I guess I’ll stop here, and let the authorities do their investigation.

Mar 08 2013

Just recently, I managed to kill yet another hand-held. Not deliberately, just a combination of conditions and not adapting my behaviour to suit.

I have a Yaesu VX-8DR, which I mainly use on the bicycle for APRS.  It isn’t bad: the GPS could be faster, and the Bluetooth is more of a gimmick (in that it only works with some Bluetooth headsets and is intermittent at best), but my biggest nit with it is that you can’t charge the thing while it’s turned on.

This led me to the bad habit of just leaving a DC power lead semi-permanently plugged into the side, with the other end plugged into the 12V supply on the bicycle.  You guessed it… one bad day of rain, some water got in via the DC jack and basically destroyed it.  I’m pretty sure warranty doesn’t cover that kind of abuse.

I’m not in a hurry to buy another one.  In fact, I probably won’t.  I’m too clumsy to look after an expensive one, so better just to keep the two Chinese cheapies going (Wouxun KG-UVD1Ps).  This led me to thinking about what I specifically like in a hand-held, and what features I’d look for.

Looking around, it seems the vast majority of sets out there are evolutionary.  An extra handful of memory channels, higher power, bigger battery, ohh look Bluetooth, and this one has {insert some semi-proprietary-digital-mode here}.  Yawn!

Most of them have tiny screens which can’t show a decent amount of information at a glance.  Digital voice is a long way from being usable, with about 3 or 4 proprietary or semi-proprietary competing standards.  What about D-Star, you say?  Well, what about it.  Nice mode, pity about the codec.  How about P25?  Same deal.

If a digital mode is going to succeed in Amateur radio, it’ll be necessary for a home base to be able to implement it with nothing more than a desktop or laptop computer loaded with appropriate Free Software and a sound card interface.  Not a silly proprietary “DV-Dongle” or some closed-source blob that speaks gibberish no other software can understand.

As for portable use: it should be possible for a hand-microphone that implements the mode on a DSP to be plugged into an existing hand-held (like the Wouxun or Yaesu sets I mentioned earlier) to make it interoperable — open standards will help keep costs down here.

Until such a mode comes along (and they’re working on it — already making excellent progress on HF, keep it up guys!) there’s no point in pouring money into a digital mode that will be a white elephant in a few years.

By far the most popular mode on VHF and UHF is plain old FM, the mode Armstrong made.  It’s everywhere: from your cheap $100 Chinese firecracker set to the most expensive SDR, they all offer it.  Repeaters abound, and it’s available to pretty much all amateur licence classes.  And it works well enough for most.

The big problem with FM is interfacing with repeaters.  In particular, the big use case with hand-helds and repeaters is being able to recall the settings for a repeater wherever you happen to be.  Now, you can carry around a booklet with the settings written in, and punch them into your radio each time.

This works better for some than others.  On the KG-UVD1P with its horrid UI, it is a tiresome affair.  The Yaesu VX-8DR and Kenwood TH-F7E aren’t bad, once you get used to them.  It’s still fiddly and time consuming, definitely not an option while mobile.  This is where memory channels come in.

Now I realise that sets which stored more channels than you could count on one digit-challenged hand were considered a revolution about 10 years ago.  Back then the idea that you could basically control a digital counter, which would supply an address to an EEPROM that would spit out the settings to drive a PLL synthesizer and other control circuitry was truly remarkable.

Today, the EEPROM and counter have been replaced by an MCU that reads the keypad matrix and outputs to an LCD panel, but we’re still basically incrementing a counter that acts as an address offset into non-volatile memory.  The only change has been the number of channels.  The Kenwood set I had gave you 400.  The VX-8 gives you 1000 — which can be optionally grouped into 24 banks (by far the best system I’ve seen to date).  The Wouxun gives you a paltry 128.

The hardest thing about this is finding a given repeater in a list.  128 is more than enough if you don’t travel, or if you pre-programme the set with the appropriate channels in some logical ordering before you leave.  Therein hints another factor, “logical ordering”, since there’s no way to sort the memory channels by anything other than channel number.

In this day and age, 1000 channels, linearly indexed, is a joke.  I can buy a 2GB MicroSD card from the supermarket for $10.  How much repeater data could you store on one of those?  FAT file system drivers are readily implementable in modern MCUs, and a simple CSV file is not that big a deal for an MCU to parse.

It wouldn’t be difficult to build up a few indexes of byte locations to store in NVRAM, and have the CSV store the frequency, call sign, a Maidenhead locator and other settings for all the repeaters in the country, then allow the user to choose one by searching by frequency, by call sign, or, if the user gives their current grid square (or it’s derived from a GPS), by proximity.  That would be a revolution.  The same card could also store a list of Echolink and IRLP nodes, and make a note of such nodes heard via RF, so it can automatically suggest the nearest IRLP node, take you there, then dial whatever node for you after you announce yourself on the frequency.
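To give an idea of how little is involved, here’s the nearest-repeater part sketched out in Perl.  The real thing would be C on the MCU, and the CSV columns here (callsign, frequency, offset, tone, locator) are invented purely for the sake of the example.

    #!/usr/bin/perl
    # Sketch only: pick the closest repeater from a simple CSV "database",
    # given the operator's 6-character Maidenhead locator.
    use strict;
    use warnings;

    # Convert a 6-character Maidenhead locator to the lat/long of its centre.
    sub locator_to_latlon {
        my ($loc) = @_;
        my @c = split //, uc $loc;
        my $lon = (ord($c[0]) - ord('A')) * 20 - 180
                + $c[2] * 2
                + (ord($c[4]) - ord('A')) / 12 + 1/24;
        my $lat = (ord($c[1]) - ord('A')) * 10 - 90
                + $c[3] * 1
                + (ord($c[5]) - ord('A')) / 24 + 1/48;
        return ($lat, $lon);
    }

    # Great-circle distance in km (haversine).
    sub distance_km {
        my ($lat1, $lon1, $lat2, $lon2) = @_;
        my $rad = 3.14159265358979 / 180;
        my ($p1, $p2) = ($lat1 * $rad, $lat2 * $rad);
        my ($dp, $dl) = (($lat2 - $lat1) * $rad, ($lon2 - $lon1) * $rad);
        my $a = sin($dp / 2)**2 + cos($p1) * cos($p2) * sin($dl / 2)**2;
        return 6371 * 2 * atan2(sqrt($a), sqrt(1 - $a));
    }

    my ($my_lat, $my_lon) = locator_to_latlon($ARGV[0] // 'QG62lm');
    my ($best, $best_km);

    open my $fh, '<', 'repeaters.csv' or die "repeaters.csv: $!";
    while (<$fh>) {
        chomp;
        my ($call, $freq, $offset, $tone, $loc) = split /,/;
        my ($lat, $lon) = locator_to_latlon($loc);
        my $km = distance_km($my_lat, $my_lon, $lat, $lon);
        ($best, $best_km) = ([$call, $freq, $offset, $tone], $km)
            if !defined $best_km || $km < $best_km;
    }
    close $fh;

    printf "Nearest: %s on %s MHz (offset %s, tone %s), %.1f km away\n",
           @$best, $best_km if $best;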

I’ve seen more elaborate software written for 8-bit micros like the Apple II, the Commodore 64 and the Sinclair ZX Spectrum back in the day, so this is clearly not beyond today’s equally powerful AVR, PIC, MSP430 and ARM chips.  An STM32F103RE packs 64KB of RAM, 512KB of flash and an SDIO interface in a nice small TQFP64 package, and costs less than $8.  Even for a Wouxun, that’d maybe add no more than 20% to the price, still keeping it rather competitive with the opposition.

As for user interface?  We don’t need Android on there, although that could be nice.  A decent size resistive touch-screen with a reflective dot-matrix LCD would more than suffice.  This technology, thanks to mobile phones, is cheap enough to implement in this application.  The MCUs needed to drive them have also come down in cost greatly.

Even without the touch-screen, an LCD bigger than a matchbox would allow for text that is easily readable, menus that aren’t constrained in their presentation, and a generally nicer user experience.

SDR hand-helds will likely be the next big revolution, if they are affordable, but I feel that’ll be a way off, and for rag chew on a local repeater, I doubt SDR will be that much superior.  It certainly will push the price up though.

I suppose a start will be to try and come up with a suitable front-end device that can be bolted onto existing transceiver hardware, maybe something that drives the computer control port of a mobile rig such as the FT-857 or IC-706.

From there, it just takes one brave manufacturer to package such a device up with a suitable transceiver in a hand-held form factor to put something to market.  If they did so in a way that could keep this UI module open-source, even better.  Bonus points if there’s a bit of an interface that can take a DSP for digital modes.

Want D-Star, P25, FreeDV, Wongi?  You got it: just slot in the right module, load the firmware onto the UI module, and away you go.  Want to do something special?  Break out the text editor and compiler and start hacking.  The RF side of things can still be as it was before, so it shouldn’t pose any more of a problem for regulators than a transceiver with a digital-modes jack and computer control interface.

I’m not sure if anyone has worked on such a front-end.  Another option would be a cradle that takes a modern smart-phone or tablet, interfaces via USB to the set, and uses the smart-phone as the UI, also extending the phone’s battery at the same time by supplying the 5V it needs to charge.  Bonus points if it can feed the audio signal to/from the phone for digital modes and/or interfacing with BlueTooth.  A pocket APRS I-Gate and Echolink node, perhaps?  Whatever takes your fancy.

I guess the real answer here will be to come up with something and see if there’s any interest — the “throw it against the wall and see what sticks” approach.

Mar 03 2013

This isn’t the first time I’ve commented about user interfaces.  This weekend has been particularly wet, and while I did have plans to get out of the house for a few hours, I’ve spent the weekend indoors.

This meant time to go tinker with a few projects, amongst those being my desktop configuration.  In particular, key bindings, and panel placement.

My desktop is largely based on what I was running when I was at university.  At that time, I was mostly limping around with aging Pentium II class laptops with display resolutions of 800×600 pixels.  No space for big spacious panels there.  I was running KDE back then.

I frequently had a need to move windows around without the use of an external mouse.  The little writing boards that the university provides in its lecture theatres are barely big enough to accommodate a writing pad, let alone a laptop!  The laptops I had generally had the small track-point type cursor, which, while usable, wasn’t great.

That, and small screen real-estate dictated a desktop which was space efficient and mostly keyboard driven.

Now, I could have gone the route of tiling window managers.  I recall trying out Ratpoison at one point, decided it wasn’t for me, and let it go.  I do like a little eye candy!  KDE 3.5 at the time provided a good balance, and wasn’t too bad on these machines.  Crucially, I was able to set up a couple of small panels around the sides of the screen, and set up key bindings to perform most operations.  Some of the common operations I needed were:

Key binding Action
Logo + Shift + M Move a window
Logo + Shift + R Resize a window
Logo + Shift + S Shade (roll up) a window
Logo + Shift + X Maximise a window
Logo + Shift + C Close a window
Logo + Shift + I Iconify (minimise) a window
Logo + Menu (now broken) Bring up launcher menu

You’ll note here I use the term Logo to refer to the additional modifier on some keyboards. Apple users will know this as the Command key. Microsoft users will of course call this the Windows key. On my Yeeloong, the key looks like a house in a white circle. FVWM calls it Mod4. I call it the Logo key because it usually has some sort of OS-vendor-specific logo associated with it.

I use this, rather than Control and Alternate, since most applications bind their operations to those keys, and ignore the Logo key. By requiring the Logo key to be used in command sequences, it leaves Control and Alternate for applications. Control and Alternate of course, still get used in combination with Logo to extend the command set. The key bindings in bold get used the most.

One key stroke did change; originally I had Logo + Menu as the key sequence to bring up the launcher menu. I found KDE 4 broke this, I could no longer assign the Menu key (the one that brings up context menus in Windows) to this action. I haven’t really settled on a suitable substitute, and of course, my current Apple MacBook does not have this key, so I use Logo+Launch instead (what shows the dashboard in MacOS X), a little more of a stretch, but it works.

The layout I settled on was to have a menu and task bar up the top of the screen, then status notification and pager down the right-hand side.  Each panel occupied no more than 16 pixels.  I made extensive use of virtual desktops, configuring 12 virtual desktops and assigning these to each of the function keys. So Logo + F5 meant, go to desktop 5. Logo + Shift + F5 meant, move window to desktop 5.

Thus I could juggle the laptop in my hands, launching applications, moving them between desktops, figuring out where I needed to be next and getting things ready on my way between lectures and tutorials.

Over time, KDE became unsuitable as a full blown desktop due to its memory footprint. I found myself looking around and for a while, I was running FVWM. After a bit of research I figured out how to set up the bindings in much the same way. I managed to get FVWM’s BarButtons to emulate the side panel, but have the panel just lurk in the background and be recalled when necessary. And of course, the root menu and icons shown on the desktop mostly removed the need for a task bar.

This worked well, until I got the MacBook. The MacBook’s keyboard does have function keys, but requires you to hold the Fn key to access them. So the “Move to desktop 3” operation which I do so commonly, became Logo+Shift+Fn+F3. A real finger-twister. I’ve just put up with it until now.

What’s replaced it?  Well, I’ve done some re-working.  Icons on the desktop for iconified windows are all good and well, but you’ve got to be able to get at them.  That, and getting at the menu with the mouse in one hand, proved to be a hassle, so for those tasks the task-bar is back with its launcher button.

The BarButtons has been given a title bar, and the EWMH working area has been set so that applications do not cover the task bar, or the BarButtons title bar, allowing those to be accessed with the mouse alone. The BarButtons can be raised or lowered with a new key sequence, Logo+Z.

The application switching was always a chore.  This is probably the one exception to the Logo-for-window-manager-commands rule; I used the very prevalent Alternate+Tab to switch applications.  FVWM does this out of the box, unsurprisingly.  I found with a lot of applications open, though, that going Alt+Tab, Tab, Tab, Tab was just a little annoying.

Logo+A brings up the window list and the window list stays there until a choice is made. The windows are numbered with a digit, and one can use arrow keys to select. Much better.

As for the virtual desktops. FVWM has desktops and pages. What I was using as “desktops” before, were just pages of a single desktop. I decided that once again we’d do proper virtual desktops, but only 4 of them, with 4 pages each. That’s 16 “spaces” in total. But how to switch between them? With new key bindings:

Key binding Action
Logo + 1 Jump to Desktop 1
Logo + 2 Jump to Desktop 2
Logo + 3 Jump to Desktop 3
Logo + 4 Jump to Desktop 4
Logo + Q Jump to Page 1
Logo + W Jump to Page 2
Logo + E Jump to Page 3
Logo + R Jump to Page 4
Logo + T Jump to last page
Logo + Backtick Jump to last desktop
Logo + Tab Jump to last desktop and page
Logo + Shift + 1 Move window to Desktop 1
Logo + Shift + 2 Move window to Desktop 2
Logo + Shift + 3 Move window to Desktop 3
Logo + Shift + 4 Move window to Desktop 4
Logo + Shift + Q Move window to Page 1
Logo + Shift + W Move window to Page 2
Logo + Shift + E Move window to Page 3
Logo + Shift + R Move window to Page 4
Logo + Shift + T Move window to last page
Logo + Shift + Backtick Move window to last desktop
Logo + Shift + Tab Move window to last desktop and page

Then there’s the key binding Logo + Escape, which brings up a Jump To menu, where I can just press one of the numeric keys 1-4, which pops up a menu allowing me to select the page 1-4.  So to jump to Desktop 4 Page 2, I just hit Logo+Escape, 4, 2.  I’m still working on the move bit: while FVWM’s GotoDeskAndPage is documented and works, I’m having trouble with MoveToDeskAndPage, which seems to be undocumented.

We shall see on Monday how the new arrangement goes.

Accessing applications I think will be the next point to work on. I’m thinking about how to code a suitable launcher that can be summoned by FVWM. There are elements of common launchers such as the Windows Start menu, and its replacement, the Start Screen in Windows 8 that are good. The Program Manager was limited by its MDI interface, but the program groups meant there was a hierarchy that the Windows 8 start screen, and indeed, other contemporary launchers, lack.

The FVWM launcher isn’t bad — but it makes it hard to re-arrange items; there the old Program Manager really does shine.  Want a new program group?  Choose File, New, select Program Group, click OK, done.  Want to add an icon to that group?  Similar process.  Adding and managing applications really does need to be that simple, so the end user can put things where they want them.

That, and keyboard accessibility is a must. FVWM has a little bug which is evident in this screenshot…

[Screenshot: FVWM 2.6.5 as I have it now]

If you look at the menu, you’ll notice two things:

  1. There are multiple menu items with the same access key
  2. Where a menu item contains an ampersand, the space following is treated as the access key.

That’s a minor bug with FVWM’s AutomaticHotKeys. Fixable for sure. That particular menu is generated by a Perl script distributed with FVWM; I have a patch that tweaks it to put ampersands in appropriate places to give all the items access key assignments, but I think there are better ways.
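The idea behind that patch is simple enough: walk the labels in a menu and drop the ampersand in front of the first character that hasn’t already been claimed as an access key.  The following is a sketch of the idea only, not the actual patch.

    #!/usr/bin/perl
    # Sketch: assign each menu label a unique access key by inserting '&'
    # before the first character not already claimed within the menu.
    use strict;
    use warnings;

    sub assign_hotkeys {
        my @labels = @_;
        my (%used, @out);
        for my $label (@labels) {
            my $assigned;
            for my $i (0 .. length($label) - 1) {
                my $ch = lc substr($label, $i, 1);
                next unless $ch =~ /[a-z0-9]/;   # never pick spaces or punctuation
                next if $used{$ch};
                $used{$ch} = 1;
                $assigned = substr($label, 0, $i) . '&' . substr($label, $i);
                last;
            }
            push @out, $assigned // $label;      # no free character: leave as-is
        }
        return @out;
    }

    # Entries that would otherwise collide on 'X' now get distinct keys:
    print "$_\n" for assign_hotkeys('XTerm', 'XClock', 'XCalc');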

In the meantime, for those who are interested, my FVWM configuration is here.  It will expand into the .fvwm directory.

Feb 03 2013

Warning, this is a long post written over some hours. It is a brain dump of my thoughts regarding user interfaces and IT in general.

Technology is a funny beast.  Largely because people so often get confused about what “technology” really is.  Wind back the clock a few hundred millennia, and the concept of stone tools was all the rage.  Then came metallurgy, mechanisation, industrialisation; with each successive age comes a new wave of “technology”.

Today though, apparently it’s only these electronic gadgets that need apply. Well, perhaps a bit unfair, but the way some behave, you could be forgiven for thinking this.

What is amusing though, is when some innovation gets dreamt up, becomes widespread (or perhaps not), but then gets forgotten about, and re-discovered.  Nowhere have I noticed this more than in the field of user interfaces.

My introduction to computing

Now I’ll admit I’m hardly an old hand in the computing world.  Not by a long shot.  My days of computing go back only as far as the late 80s.  My father, working for Telecom Australia (as they were then known), brought home a laptop computer.

A “luggable” by today’s standards, it had no battery, required 240V AC, and had a smallish monochrome plasma display with CGA graphics.  The machine sported an Intel 80286 with an 80287 maths co-processor, maybe 2MB of RAM tops, maybe a 10MB HDD, and a 3.5″ floppy drive.  The machine was about 4 inches high when folded up.

It, of course, ran the DOS operating system.  Not sure what version, maybe MS-DOS 5.  My computing life began with simple games that you launched by booting the machine up, sticking a floppy disk (a device you now only ever see in pictorial form next to the word “Save”) into the drive, firing up XTree Gold, and hunting down the actual .exe file to launch stuff.

Later on I think we did end up setting up QuikMenu but for a while, that’s how it was. I seem to recall at one point my father bringing home something of a true laptop, something that had a monochrome LCD screen and an internal battery. A 386 of some sort, but too little RAM to run those shiny panes of glass from Redmond.

Windows

I didn’t get to see this magical “Windows” until about 1992 or ’93 or so, when my father brought home a brand-new desktop: an Intel 486DX running at 33MHz, 8MB RAM, something like a 150MB HDD, and a new luxury, a colour VGA monitor.  It also had Windows 3.1.

So, as many may have gathered, I’ve barely known computers without a command line.  Through my primary school years I moved from just knowing the basics of DOS, to knowing how to maintain the old CONFIG.SYS and AUTOEXEC.BAT boot scripts, dealing with WIN.INI, and fiddling around with the PIF editor to get cantankerous DOS applications working.

Eventually I graduated to QBasic and learning to write software. Initially with only the commands PRINT and PLAY, baby steps that just spewed rubbish on the screen and made lots of noise with the PC speaker, but it was a start. I eventually learned how to make it do useful things, and even dabbled with other variants like CA Realizer BASIC.

My IT understanding, however, entirely revolved around DOS and the IBM PC clone.  I did from time to time get to look at Apple’s offerings; at school there was the odd Apple computer, I think one Macintosh, and a few Apple IIs.  With the exception of old-world MacOS, I had not experienced a desktop computer OS lacking a command line.

About this sort of time frame, a second computer appeared.  This new one was an AMD Am486DX4 100MHz with I think 16MB or 32MB RAM, can’t recall exactly (it was 64MB and an Am5x86 133MHz by the time the box officially retired).  It was running a similar, but different OS: Windows NT Workstation 3.1.

At this point we had a decent little network set up with the two machines connected via RG58 BNC-terminated coax. My box moved to Windows for Workgroups 3.11, and we soon had network file sharing and basic messaging (Chat and WinPopup).

Windows 95

Mid 1996, and I graduated to a new computer.  This one was a Pentium 133MHz, 16MB RAM, 1GB HDD, and it ran the latest consumer OS of the day, Windows 95.  Well, throw out everything I knew about the Program Manager.  It took me a good month or more to figure out how to make icons on the desktop to launch DOS applications without resorting to the Start → Run → Browse dance.

After much rummaging through the Help, looking at various tutorials, I stumbled across it quite by accident — the right-click on the desktop, and noticing a menu item called “New”, with a curious item called “Shortcut”.

I found out some time later that yes, Windows 95 did in fact have a Program Manager, although the way Windows 95 renders minimised MDI windows meant it didn’t have the old feel of the earlier user interface.  I also later found out how to actually get at that start menu, and re-arrange it to my liking.

My father’s box had seen a few changes too. His box moved from NT 3.1, to 3.5, to 3.51 and eventually 4.0, before the box got set aside and replaced by a dual Pentium PRO 200MHz with 64MB RAM.

Linux

It wasn’t until my father was going to university for a post-grad IT degree that I got introduced to Linux.  In particular, Red Hat Linux 4.0.  Now, if people think Ubuntu is hard to use, my goodness, you’re in for a shock.

This got tossed onto the old 486DX4/100MHz box, where I first came to experiment.

Want a GUI?  Well, after configuring XFree86, type ‘startx’ at the prompt.  I toiled with a few distributions; we had these compilation sets which came with Red Hat, Slackware and Debian (potato, I think).  The first thing I noticed was the desktop: it sorta looked like Windows 95.

The window borders were different, but I instantly recognised the “Start” button.  It was FVWM2 with the FVWMTaskBar module.  My immediate reaction was “Hey, they copied that!”, but then it was pointed out to me that this desktop environment pre-dated the early “Chicago” releases by at least a year.

The machines at the uni were slightly different again; these ones did have more Win95-ish borders on them: FVWM95.

What attracted me to this OS initially was the games. Not the modern first person shooters, but games like Xbilliard, Xpool, hextris, games that you just didn’t see on DOS. Even then, I discovered there were sometimes ports of the old favourites like DOOM.

The OS dance

The years that followed were, for me, an oscillation between Windows 3.1/PC DOS 7, Windows 95, Slackware Linux, Red Hat Linux, a little later on Mandrake Linux, Caldera OpenLinux, SuSE Linux, SCO OpenServer 5, and even OS/2.

Our choice was mainly versions of SuSE or Red Hat, as the computer retailer near us sold boxes of them. At the time our Internet connection was via a beloved 28.8kbps dial-up modem link with a charge of about $2/hr. So downloading distributions was simply out of the question.

During this time I became a lot more proficient with Linux, in particular when I used Slackware. I experimented with many different window managers including: twm, ctwm, fvwm, fvwm2, fvwm95, mwm, olvwm, pmwm (SCO’s WM), KDE 1.0 (as distributed with SuSE 5.3), Gnome + Enlightenment (as distributed with Red Hat 6.0), qvwm, WindowMaker, AfterStep.

I got familiar with the long-winded xconfigurator tool, and even got good at making an educated guess at modelines when I couldn’t find the specs in the monitor documentation. In the early days it was also not just necessary to know what video card you had, but also what precise RAMDAC chip it had!
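For those who never had the pleasure: a modeline is a single line of timing numbers describing the pixel clock and the horizontal and vertical sync geometry. A hand-written XF86Config Monitor section looked something like the sketch below; the identifier and sync ranges here are placeholders, and the modeline shown is just the standard VESA 1024×768 at 60Hz timing rather than anything exotic.

Section "Monitor"
    # Identifier and sync ranges are placeholders; use the real specs from the manual
    Identifier   "Generic 15 inch"
    HorizSync    31.5 - 57.0
    VertRefresh  50 - 90
    # The standard VESA 1024x768 @ 60Hz timing:
    # pixel clock (MHz), then the horizontal timings, then the vertical timings
    Modeline "1024x768" 65.00  1024 1048 1184 1344  768 771 777 806 -hsync -vsync
EndSection

Get the numbers wrong and at best the monitor displayed garbage; reputedly, a sufficiently wrong modeline could damage some older monitors, hence the educated guessing.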

Over time I settled on KDE as the desktop of choice under Linux. KDE 1.0 had a lot of flexibility and ease of use that many of its contemporaries lacked. Gnome+Enlightenment looked alright at first, but the inability to change how the desktop looked without making your own themes bothered me; the point-and-click of KDE’s control panel just suited me, as it was what I was used to in Windows 3.1 and 95. Not having to fuss around with the .fvwm2rc (or equivalent) was a nice change too. Even adding menu items to the K menu was easy.

One thing I had grown used to on Linux was how applications install themselves in the menu in a logical manner. Games got stashed under Games, utilities under Utilities, internet stuff under Internet, office productivity tools under Office. Every Linux install I had, the menu was neatly organised. Even the out-of-the-box FVWM configuration had some logical structure to it.

As a result, whenever I did use Windows on my desktop, a good amount of time was spent re-arranging the Start menu to make the menu more logical. Many a time I’d open the Start menu on someone else’s computer, and it’d just spew its guts out right across the screen, because every application thinks itself deserving of a top-level place below “Programs”.

This was a hang-over from the days of Windows 3.1. The MDI-style interface that was Program Manager couldn’t manage anything other than program groups at the top level, with program items below that. Add to this a misguided belief that their product was more important than anyone else’s, and application vendors got used to this, simply repeating the status quo when Windows 95/NT4 turned up.

This was made worse if someone installed Internet Explorer 4.0. It invaded like a cancer. Okay, now your screenful of Start menu didn’t spew out across the screen; it just crammed itself into a single column with tiny little arrows at the top and bottom to scroll past the program groups one by one.

Windows 95 Rev C even came with IE4; however, there was one trick. If you left the Windows 95 CD in the drive on the first boot, it’d pop up a box telling you that the install was not done; you’d click that away and IE4 Setup would do its damage. Eject the CD first, and you were left with pristine Windows 95. Then when IE5 came around, it could safely be installed without it infecting everything.

Windows 2000

I never got hold of Windows 98 on my desktop, but at some point towards the very end of last century, I got my hands on a copy of Windows 2000 Release Candidate 2. My desktop was still a Pentium 133MHz, although it had 64MB RAM now and a few more GB of disk space.

I loaded it on, and it surprised me just how quick the machine seemed to run. It felt faster than Windows 95. That said, it wasn’t all smooth sailing. Where was Network Neighbourhood? That was a nice feature of Windows 95. Ohh no, we have this thing called “My Network Places” now. I figured out how to kludge my own with the quick-launch, but it wasn’t as nice since applications’ file dialogues knew nothing of it.

The other shift was that Internet Explorer still lurked below the surface, and unlike Windows 95, there was no getting rid of it. My time using Linux had made me a Netscape user, so for me it was unnecessary bloat. Windows 2000 did similar Start Menu tricks, including “hiding” applications that it thought I didn’t use very often.

If there’s one thing that irritates me, it’s a computer hiding something from me arbitrarily.

Despite this, it didn’t take me as long to adapt as the move from Windows 3.1 to 95 had, as the UI was still much the same. A few tweaks here and there.

In late 2001, my dinky old Pentium box got replaced. In fact, we replaced both our desktops. Two new dual Pentium III 1GHz boxes, 512MB RAM. My father’s was the first, with an nVidia Riva TNT2 32MB video card and a CD-ROM drive. Mine came in December as a Christmas/18th birthday present, with an ATI Radeon 7000 64MB video card and a DVD drive.

I was to run the old version of Windows NT 4.0 we had. Fun ensued with Windows NT not knowing anything about >8GB HDDs, but Service Pack 6a sorted that out, and the machine ran. I got Linux on there as well (SuSE initially) and apart from the need to distribute ~20GB of data between many FAT16 partitions of about 2GB each (Linux at this time couldn’t write NTFS), it worked. I had drive letters A through to O all occupied.

We celebrated by watching a DVD for the first time (it was our first DVD player in the house).

NT 4 wasn’t too bad to use, it was more like Windows 95, and I quickly settled into it. That said, its tenure was short-lived. The moment anything went wrong with the installation, I found I was right back to square one, as the Emergency Repair Disk did not recognise the 40GB HDD. I rustled up that old copy of Windows 2000 RC2 and found it worked okay, but it wouldn’t accept the Windows 2000 drivers for the video card. So I nicked my father’s copy of Windows 2000 and ran that for a little while.

Windows XP was newly released, and so I did enquire about a student-license upgrade, but as I was only a high-school student, Microsoft’s resellers would have none of that. Eventually we bought a new “Linux box” with an OEM copy of Windows 2000 with SP2, and I used that. All legal again, and everything worked.

At this point, I was dual-booting Linux and Windows 2000. Just before the move to ADSL, I had about 800 hours to use up (our dial-up account was one that accumulated unused hours) and so I went on a big download-spree. Slackware 8.0 was one of the downloaded ISOs, and so out went SuSE (which I was running at the time) and in went Slackware.

Suddenly I felt right at home. Things had changed a little, and I even had KDE there, but I felt more in control of my computer than I had in a long while. In addition to using an OS that just lets you do your thing, I had also returned from my OS travels, having gained an understanding of how this stuff works.

I came to realise that point-and-click UIs are fine when they work, hell when they don’t. When they work, any dummy can use them. When they don’t, they cry for the non-dummies to come and sort them out. Sometimes we can, sometimes it’s just reload, wash-rinse-repeat.

Never was this brought home to me more than when we got hold of a copy of Red Hat 8.0. I tried it for a while, but was immediately confronted by the inability to play the MP3s that I had acquired (mostly via the sneakernet). Ogg/Vorbis was in its infancy, and I noticed that at the time there didn’t seem to be any song metadata like what ID3 tags provided, or at least XMMS didn’t show it.

A bit of time back on Slackware had taught me how to download sources, read the INSTALL file and compile things myself. So I just did what I always did. Over time I ran afoul of the Red Hat Package Manager, and found myself going around in circles, running rpm again and again trying to resolve dependency hell by hand.
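The ritual itself hasn’t changed much since. For a typical autoconf-based package of the era it went something along these lines (the package name here is just a placeholder, and not everything used autoconf; some tarballs just shipped a Makefile):

tar xzf some-package-1.0.tar.gz
cd some-package-1.0
less INSTALL                      # the obligatory reading of the instructions
./configure --prefix=/usr/local   # most autoconf-based packages of the day
make
su -c 'make install'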

On top of this, there was now the need to man-handle the configuration tools that expected things the way the distribution packagers intended them.

Urgh, back to Slackware I go. I downloaded Slackware 9.0 and stayed with that a while. Eventually I really did go my own way with Linux From Scratch, which was good, but a chore.

These days I use Gentoo, and while I do have my fights with Portage (ohh slot-conflict, how I love you!!!), it does usually let me do what I want.
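For the curious, the day-to-day routine with Portage is nothing exotic; a world update is along the lines of the following, using only standard emerge flags (nothing here is specific to my setup):

# Update everything installed, pulling in new dependencies and rebuilding
# packages whose USE flags have changed
emerge --ask --verbose --update --deep --newuse @world

# A dry run; slot conflicts, when they strike, show up in this output
emerge --pretend --tree --update --deep --newuse @world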

A time for experimentation and learning

During this time I was mostly a pure KDE user. KDE 2.0, then 3.0. I was learning all sorts of tricks reading HOWTO guides on the Linux Documentation Project. I knew absolutely no one around me that used Linux, in fact on Linux matters, I was the local “expert”. Where my peers (in 2002) might have seen it once or twice, I had been using it since 1996.

I had acquired some more computers by this time, and I was experimenting: setting up dial-up routers with proxy servers (Squid, then IP masquerading), turning my old 386 into a dumb terminal with XDMCP, and getting the SCO OpenServer box (our old 486DX4/100MHz) to interoperate with the rest.

The ability of the Windows boxes to play along steadily improved over this time, from ethernet frames that passed like ships in the night (circa 1996: NetBEUI and IPX/SPX on the Windows 3.1 side, TCP/IP on Linux) through to begrudging communications over TCP/IP with newer releases of Windows.

Andrew Tridgell’s SAMBA package made its debut in my experimentation, and suddenly Windows actually started to talk sensible things to the Linux boxes and vice versa.
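Getting a basic file share going didn’t take much, either. A minimal smb.conf along these lines (the workgroup name and share path are just placeholders) was about all it took to have a Linux box show up in Network Neighbourhood, with users added via smbpasswd -a before Windows would let them in:

[global]
   workgroup = WORKGROUP
   server string = Linux file server
   security = user

[share]
   path = /srv/share
   browseable = yes
   read only = no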

Over time the ability of Linux machines and Windows boxes to interoperate has improved, each year working a little further up the OSI model. I recall some time in 1998 getting hold of an office suite called ApplixWare, but in general when I wanted word processing I turned to Netscape Composer and Xpaint as my nearest equivalents.

It wasn’t until 2000 or so that I got hold of StarOffice, and finally had an office suite that could work on Windows, Linux and OS/2 that was comparable to what I was using at school (Microsoft Office 97).

In 2002 I acquired an old Pentium 120MHz laptop, and promptly loaded that with Slackware 8 and OpenOffice 1.0. KDE 3.0 chugged with 48MB RAM, but one thing the machine did well was suspend and resume. A little while later we discovered eBay and upgraded to a second-hand Pentium II 266MHz, a machine that served me well into the following year.

For high-school work, this machine was fine. OpenOffice served the task well, and I was quite proficient at using Linux and KDE. I was even a trend-setter… listening to MP3s on the 15GB HDD a good year before the invention of the iPod.

Up to this point, it is worth mentioning that in the Microsoft world, the UI hadn’t changed all that much between Windows 95 and 2000/ME. Network Neighbourhood was probably the change I noticed the most. At this time I was unusual amongst my peers in that I had more than one computer at home, and they all talked to each other. Hence Windows 95/98 through to 2000/ME didn’t create such an uproar.

What people DID notice was how poorly Windows ME (and the first release of 98) performed “under the hood”. More so for the latter than the former.

Windows XP

Of course, I did mention trying to get hold of Windows XP earlier. It wasn’t until later in 2002, when the school was replacing a lab of old AMD K6 machines with brand new boxes (and their old Windows NT 4 server with a Novell one), that I got to see Windows XP up close.

The boxes were actually meant to run Windows 2000, but IBM had just sent them preloaded with XP, and we were to set up Zenworks to image them with Windows 2000. This was during my school holidays and I was assisting in the transition. So I fired up a box and had a poke around.

Up came the initial set-up wizard, with the music playing, animations, and a little question-mark character which at first did its silly little dance telling you how it was there to help you. Okay, if I had never used Windows before I’d probably be thankful this was there, but this just felt like a re-hash of that sodding paperclip. At least it did eventually shut up and sit in the corner where I could ignore it. That wasn’t the end of it though.

Set up the user account, logged in, and bam, another bubble telling me to take the tour. In fact, to this day that bubble is one of the most annoying things about Windows XP, because 11 years on, a new install still insists on bugging you for the first 3 log-ins as if you’ve never used the OS before!

The machines had reasonable 17″ CRT monitors; the first thing I noticed was just how much extra space was taken up by the nice rounded corners and the absence of the My Computer and My Network Places icons on the desktop. No, these were in the Start menu now. Where are all the applications? Hidden under All Programs, of course.

Hidden and not even sorted in any logical order, so if you’ve got a lot of programs it will take you a while to find the one you want, and even longer to find it’s not actually installed.

I took a squiz at the control panel. Now the control panel hadn’t changed all that much from Windows 3.1 days. It was still basically a window full of icons in Windows 2000/ME, albeit since Windows 95, it now used Explorer to render itself rather than a dedicated application.

Do we follow the tradition so that old hands can successfully guide the novices? No, we’ll throw everyone in the dark with this Category View nonsense! Do the categories actually help the novices? Well for some tasks, maybe, but for anything advanced, most definitely not!

Look and feel? Well if you want to go back to the way Windows used to look, select Classic theme, and knock yourself out. Otherwise, you’ve got the choice of 3 different styles, all hard-coded. Of course somewhere you can get additional themes. Never did figure out where, but it’s Gnome 1.0-style visual inflexibility all over again unless you’re able to hack the theme files yourself.

No thank-you, I’ll stick with the less pixel-wasting Classic theme if you don’t mind!

Meanwhile in Open Source land

As we know, it was a long wait between Windows XP and its successor, and over the coming years we heard much hype about what was to become Vista. For years to come though, I’d be seeing Windows XP everywhere.

My university workhorses (I had a few) all ran Linux exclusively, however. If I needed Windows there was a plethora of boxes at uni to use, and most of the machines I had were incapable of running anything newer than Windows 2000.

I was now more proficient in front of a Linux machine than in front of any version of Windows. During this time I was using KDE most of the time. Gnome 2.0 was released and I gave it a try, but it didn’t really grab me. One day I recall accidentally breaking the KDE installation on my laptop. Needing a desktop, I just reached for whatever I had, and found XFCE3.

I ran XFCE for about a month or two. I don’t recall exactly what brought me back to KDE; perhaps the idea of a dock for launching applications didn’t grab me. AfterStep, after all, did something similar.

In 2003, one eBay purchase landed us with a Cobalt Qube2 clone, a Gateway Microserver. Experimenting with it, I managed to brick the (ancient) OS, and turned the thing into a lightish door stop. I had become accustomed to commands like `uname` which could tell me the CPU amongst other things.

I was used to seeing i386, i486, i586 and i686, but this thing came back with ‘mips’. What’s this? I did some research and found that there was an entire Linux port for the architecture. I also found some notes on bootstrapping an SGI Indy. Well, this thing isn’t an Indy, but maybe the instructions have something going for them. I toiled, but didn’t get far…

Figuring it might be an idea to actually try these instructions on an Indy, we hit up eBay again, and after a few bids we were the proud owners of a used SGI Indy R4600 133MHz with 256MB RAM running IRIX 6.5. I toiled a bit with IRIX; the 4DWM seemed okay to use, but certain parts of the OS were broken. Sound never worked, and there was a port of Doom, but it’d run for about 10 seconds then die.

We managed to get some of the disc set for IRIX, but never did manage to get the Foundation discs needed to install. My research, however, led me onto the Debian/MIPS port. I followed the instructions, installed it, and hey presto, Linux on an SGI box, and almost everything worked. VINO (the video capture interface) was amongst the things that didn’t at the time, but never mind. Sound was one of the things that did, and my goodness, does it sound good for a machine of that vintage!

Needless to say, the IRIX install was history. I still have copies of IRIX 6.5.30 stashed in my archives, lacking the foundation discs. The IRIX install didn’t last very long, so I can’t really give much of a critique of the UI. I didn’t have removable media so didn’t get to try the automounting feature. The shut down procedure was a nice touch, just tap the OFF button, the computer does the rest. The interface otherwise looked a bit like MWM. The machine however was infinitely more useful to me running Linux than it ever was under IRIX.

As I toiled with Debian/MIPS on the Indy, I discovered there was a port of this for the Qube2. Some downloads later and suddenly the useless doorstop was a useful server again.

Debian was a new experience for me, and I quite liked APT. The version I installed was evidently the unstable release, so it had modern software. Liking this, I tried it on one of the other machines, and was met with, Debian Stab^Hle. Urgh… I didn’t know enough about the releases at the time, and on my own desktop I was already using Linux From Scratch.

I was considering my own distribution that would target the Indy, amongst other systems. I was already formulating ideas, and at one point I had a mishmash of about 3 different distributions on my laptop.

Eventually I discovered Gentoo, along with its MIPS port. Okay, not as much freedom as LFS, but very close to it. In fact, it gives you the same freedom if you can be arsed to write your own portage tree. One by one the machines got moved over, and that’s what I’ve used.

The primary desktop environment was KDE for most of them. Build times for this varied, but for most of my machines it was an overnight build. Especially for the Indy. Once installed though, it worked quite well. It took a little longer to start on the older machines, but was still mostly workable.

Up to this point, I had my Linux desktop set up just the way I liked it. Over the years the placement of desktop widgets and panels has moved around as I borrowed ideas I had seen elsewhere. KDE was good in that it was flexible enough for me to fundamentally change many aspects of the interface.

My keybindings were set up to be able to perform most window operations without the need of a mouse (useful when juggling the laptop in one’s hands whilst figuring out where the next lecture was), notification icons and the virtual desktop pager were placed to the side for easy access. The launcher and task bar moved around.

Initially down the bottom, it eventually wound up on the top of the screen much like OS/2 Warp 4, as that’s also where the menu bar appears for applications — up the top of the window. Thus minimum mouse movement. Even today, the Windows 7 desktop at work has the task bar up the top.

One thing that frustrated me with Windows at the time was the complete inability to change many aspects of the UI. Yes you could move the task bar around and add panels to it, but if you wanted to use some other keystroke to close the window? Tough. ALT-F4 is it. Want to bring up the menu items? Hit the logo key, or failing that, CTRL-ESC. Want to maximise a window? Either hit ALT-Space, then use the arrows to hit Maximise, or start reaching for the rodent. Far cry from just pressing Logo-Shift-C or Logo-Shift-X.
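For the curious, this sort of thing in FVWM is just a couple of lines in the config file. The following is only a sketch of the idea rather than my exact bindings; in FVWM’s Key syntax, A means any context, 4 is the logo (Mod4) key, S is Shift and M is Alt/Meta:

# Close the focused window with Logo-Shift-C
Key C   A  4S  Close
# Maximise the focused window (full width and height) with Logo-Shift-X
Key X   A  4S  Maximize 100 100
# Alt-Tab pops up the window list, centred on the screen
Key Tab A  M   WindowList Root c c NoDeskSort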

Ohh, and virtual desktops? Well, people have implemented crude hacks to achieve something like it. In general, anything I’ve used has felt like a bolt-on addition rather than a seamless integration.

I recall commenting about this, and someone pointing out this funny thing called “standardisation”. Yet, I seem to recall the P in PC standing for personal. i.e. this is my personal computer, no one else uses it, thus it should work the way I choose it to. Not what some graphic designer in Redmond or Cupertino thinks!

The moment you talk about standardisation or pining for things like Group Policy objects, you’ve crossed the boundary that separates a personal computer from a workstation.

Windows Vista

Eventually, after much fanfare, Microsoft did cough up a new OS. And cough up would be about the right description for it. It was behind schedule, and as a result, saw many cut backs. The fanciful new WinFS? Gone. Palladium? Well probably a good thing that did go, although I think I hear its echoes in Secure Boot.

What was delivered is widely considered today a disaster. And it came at just the wrong time: just as the market for low-end “netbook” computers exploded, exactly the sort of machine that Windows Vista runs worst on.

Back in the day Microsoft recommended 8MB RAM for Windows 95 (and I can assure you it will even run in 4MB), but no one in their right mind would tolerate the constant rattle from the paging disk. The same could be said for Windows NT’s requirement of 12MB RAM and a 486. Consumers soon learned that “Windows Vista Basic ready” was a warning label: steer clear, or insist on the machine coming with Windows XP.

A new security feature, UAC, ended up making more of a nuisance of itself than anything, and people’s knee-jerk reaction was to shoot the messenger.

The new Aero interface wastes even more screen pixels than the “Luna” interface of Windows XP. And GPU cycles to boot. The only good thing about it was that the GPU did all the hard work putting windows on the screen. It looked pretty when the CPU wasn’t overloaded, but otherwise the CPU had trouble keeping up and the whole effect was lost. Exactly what productivity gain one gets from having a window do three somersaults before landing in the task bar on minimise is lost on me.

Windows Vista was the last release that could do the old Windows 95-style Start menu. The new Vista menu was even more painful than the one in XP. In XP, the All Programs sub-menu opened out much like previous editions did (complete with the annoying “compress myself into a single column” behaviour); in Vista, this menu was now trapped inside a small scrolling pane.

Most of Vista’s problems were below the surface. Admittedly Service Pack 1 fixed a lot of them, but by then it was already too late. No one wanted to know.

Even with the service packs, it still didn’t perform up to par on the netbooks that were common for the period. The original netbook, for what it’s worth, was never intended to run any version of Windows; the entire concept came out of the One Laptop Per Child project, which was always going to be Linux-based.

Asus developed the EeePC as one of the early candidates for the OLPC project. When another design got selected, Asus simply beefed up the spec, loaded on Xandros and pushed it out the door. Later models came with Windows XP, and soon other manufacturers pitched in. This was a form factor with specs that ran Windows XP well; unfortunately Vista’s Aero interface was too much for the integrated graphics typically installed, and the memory requirements had the disk drive rattling constantly, sapping the machine of valuable kilojoules when running from the battery.

As to my run-in with Vista? For my birthday one year I was given a new laptop. This machine came pre-loaded with it, and of course the very first thing I did was spend a good few hours making the recovery discs, then uninstalling every imaginable piece of crap that manufacturers insist on prebloating their machines with.

For what I needed to do, I actually needed Linux. The applications I use and depend on for university work, whilst compatible with Windows, run as second-class citizens due to their Unix heritage. Packages like The Gimp, gEDA, LaTeX and git, to name a few, never quite run as effortlessly on Windows. The Gimp had a noticeable lag when using my Wacom tablet, something that put my handwriting way off.

Linux ran on it, but with no support for the video card, GUI-related tasks were quite choppy. In the end, it proved to be of little use to me. My father at the time was struggling along with his now-aging laptop, using applications and hardware that did not support Windows Vista. I found a way to exorcise Windows Vista from the machine, putting Windows XP in its place.

The bloat becomes infectious

What was not lost on me was that each new iteration of full desktops like KDE brought in more dependencies. During my latter years at university, I bought myself a little netbook. I was doing work for Gentoo/MIPS with their port of Linux, and thus a small machine that could run what I needed for university, and serve as a test machine during my long trips between The Gap and Laidley (where I was doing work experience for Eze Corp), would go down nicely. So I fired off an email and a telegraphic money transfer over to Lemote in China, and on the doorstep turned up a Yeeloong netbook.

I dual-booted Debian and Gentoo on this machine; in fact, I still do. Just prior to buying this machine, I was limping along with an old Pentium II 300MHz laptop. I did have a Pentium 4M laptop, but a combination of clumsiness and age slowly caused that machine’s demise. Eventually it failed completely, and so I just had to make do with the PII, which had been an earlier workhorse.

One thing though: KDE 3.0 was fine on this laptop. Even 3.5 was okay. But when you’ve only got 300MHz of CPU and 160MB RAM, the modern KDE releases were just a bit too much. Parts of KDE were okay, but the main desktop chugged along. Looking around, I needed a workable desktop, so I installed FVWM. I found the lack of a system tray annoyed me, so in went stalonetray. Then came maintaining the menu. Well, modern FVWM comes with a Perl script that automates this, so in that went.

Finally, a few visual touches, a desktop background loader, some keybinding experiments, and I pretty much had what KDE gave me, except it started in a fraction of the time and built much faster. When the Yeeloong turned up and I got Gentoo onto it, that FVWM configuration was the first thing installed, and so I had a sensible desktop there too.
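The additions to the FVWM config that this amounted to were only a handful of lines. The sketch below is indicative rather than a copy of my actual config: the wallpaper path is a placeholder, and the flags accepted by the menu helper vary between FVWM versions.

# Build an applications menu from the installed .desktop entries
# (fvwm-menu-desktop is the helper script that ships with FVWM)
PipeRead 'fvwm-menu-desktop'

# Things to launch alongside the window manager itself
AddToFunc StartFunction
+ I Exec exec stalonetray
+ I Exec exec fvwm-root $[HOME]/.fvwm/wallpaper.png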

Eventually I did get KDE 4 working on the Yeeloong, sort of. It was glitchy on MIPS. KDE 3.5 used to work without issue but 4.0 never ran quite right. I found myself using FVWM with just the bits of KDE that worked.

As time went on, university finished, and the part-time industrial experience became full-time work. My work at the time revolved around devices that needed Windows and a parallel port to program them. We had another spare P4 laptop, so I grabbed that, tweaked Windows XP on there to my liking, and got to work. The P4 “lived” at Laidley and was my workstation of sorts; the Yeeloong came with me to and from there. Eventually that work finished, and through those connections I came to another company (Jacques Electronics). In the new position, it was Linux development on ARM.

The Windows installation wasn’t so useful any more. So in went a copy of the GParted LiveCD to tell Windows to shove off, followed by a Gentoo disc and a Stage 3 tarball. For a while my desktop was just the Linux command line, then I got X and FVWM going, and finally, as I worked, KDE.

I was able to configure KDE once again, and on i686 hardware, it ran as it should. It felt like home, so it stayed. Over time the position at Jacques finished up, I came to work at VRT where I am to this day. The P4 machine stayed at the workplace, with the netbook being my personal machine away from work.

It’s worth pointing out that at this point, although Windows 7 had been around for some time, I had yet to actually use it first-hand.

My first Apple

My initial time at VRT was spent working on a Python-based application to ferry metering data from various energy meters to various proprietary systems. The end result was a package that slotted in alongside MacroView called Metermaster, and it forms one of the core components in VRT’s Wages Hub system. While MacroView can run on Windows, and does for some (even under Cygwin), VRT mainly deploys it on Ubuntu Linux. So my first project was all Linux-based.

During this time, my new work colleagues were assessing my skills and looking at what I could work on next. One of the discussions revolved around working on some of their 3D modelling work using Unity3D. Unity3D at the time ran on just two desktop OSes: Windows and MacOS X.

My aging P4 laptop had an nVidia GeForce 420Go video device with 32MB memory. In short, if I hit that thing with a modern 3D games engine, it’d most likely crap itself. So I was up for a newer laptop. That got me thinking: did I want to try and face Windows again, or did I want to try something new?

MacOS was something I had only had fleeting contact with. MacOS X I had actually never used myself. I knew a bit about it, such as its basis on the FreeBSD userland and the Mach microkernel. I saw a 2008-model MacBook with a 256MB video device built in, going cheap, so I figured I’d take the plunge.

My first impressions of MacOS X 10.5 were okay. I had a few technical glitches at first; namely, MacOS X would sometimes hang during boot, showing nothing more than an Apple logo and a swirling icon. Some updates refused to download; they’d just hang and the time estimate would blow out. Worst of all, a download wouldn’t resume, it’d just start at the beginning.

In the end I wandered down to the NextByte store in the city and bought a copy of MacOS X 10.6. I bought the disc on April 1st, 2011, and it’s the one and only disc the DVD drive in the MacBook won’t accept. The day I bought it I was waiting at the bus stop and figured I’d have a look and see what docs there were. I put the disc in, heard a few noises, and it spat the disc out again. So I put it back in, and out it came. Figuring this was a defective disc, I put it back in once more and marched back down to the shop, receipt in one hand, cantankerous laptop in the other. So much for Apple kit “just working”.

Then the laptop decided it liked the pussy cat disc so much it wouldn’t give it back! Cue about 10 minutes in the service bay getting the disc to eject. Finally the machine relented and spat the disc out. That night it tried the same tricks, so I grabbed an external DVD drive and did the install that way. Apart from this, OSX 10.6 has given me no problems in that regard.

As for the interface? I noticed a few features that I appreciated from KDE, such as the ability to modify some of the standard shortcuts, although not all of them. Virtual desktops are called Spaces in MacOS X, but it’s essentially the same deal.

My first problem was identifying what the symbols on the key shortcuts meant. Command and Shift were simple enough, but the symbol used to denote “Option” was not intuitive, and I can see some people getting confused by the one for Control. That said, once I found where the Terminal lived, I was right at home.

File browsing? Much like what I’m used to elsewhere. Stick a disc in, and it appears on the desktop. But then to eject? The keyboard eject button didn’t seem to work. Then I remembered a sarcastic comment one of my uncles made about using a Macintosh, requiring you to “throw your expensive software in the bin to eject”. So click the CD icon, drag to the rubbish bin icon, voilà, out it comes.

Apple’s applications have always put the menu bar of the application right at the top of the screen. I found this somewhat awkward when working with multiple applications, since you find yourself clicking on (or command-tabbing over to) one window, accessing the menu there, then clicking (or command-tabbing) over to the other and accessing the menu at the top of the screen again.

Switching applications with Command-Tab works by swapping between completely separate applications. Okay if you’re working with two completely separate applications, not so great if you’re working with many instances of the same application. Exposé works, and probably works quite well if the windows are visually distinct when zoomed out, but if they look similar, one is reminded of Forrest Gump: “Life’s like a box of chocolates, you never know what you’re gonna get!”

The situation is better if you hit Command-Tab, then press the down arrow: that gives you an Exposé of just the windows belonging to that application. Still a far cry from hitting Alt-Tab in FVWM to bring up the Window List and just cycling through. Switching between MacVim instances was a pain.

As for the fancy animations: Exposé looks good, but when the CPU’s busy (and I do give it a hiding), the animation just creeps along at a snail’s pace. I’ll tolerate it if it’s over and done with within a second, but when it takes 10 seconds to slowly zoom out, I’m left sitting there going “Just get ON with it!” I’d be fine if it skipped the animation and just switched from normal view to Exposé in a single frame. Unfortunately there’s nowhere to turn this off that I’ve found.

The dock works to an extent. It suffers a bit if you have a lot of applications running all at once; there’s only so much screen real-estate. A nice feature though is the way it auto-hides and zooms.

When the mouse cursor is away from the dock, it drops off the edge of the screen. Since the user configures this and chooses which edge it clings to, this is a reasonable option. As the mouse is brought near the space where the dock resides, it slowly pops out to meet the cursor. Not straight away, but progressively, as the cursor gets closer.

When fully extended, the icons nearest the cursor enlarge, allowing the rest to remain visible without occupying too much screen real-estate. The user is free to move the cursor amongst them, the ones closest zooming in, the ones furthest away zooming out. Moving the cursor away causes the dock to slip away again.

Linux on the MacBook

And it had to happen: eventually Linux did wind up on there. Again KDE initially, but again I found that KDE was just getting too bloated for my liking. It took about 6 months of running KDE before I started looking at other options.

FVWM was of course where I turned first; in fact, it was what I used before KDE had finished compiling. I came to the realisation that I was mostly just using windows full-screen. So I thought, what about a tiling window manager?

Looking at a couple, I settled on Awesome. At first I tried it for a bit, didn’t like it, reverted straight back to FVWM. But then I gave it a second try.

Awesome works okay; it’s perhaps not the most attractive to look at, but it’s functional, and at the end of the day it’s functionality that matters, not looks. Awesome was promising in that it uses Lua for its configuration, and it had a lot of the modern window manager features for interacting with today’s X11 applications. I did some reading of the handbook, did some tweaking of the configuration file and soon had a workable desktop.
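For those who haven’t seen it, the configuration is a Lua script, rc.lua. The excerpt below is only a sketch of the sort of thing involved, using awesome 3.x-era syntax; the particular layouts chosen are just an example, not a recommendation:

-- rc.lua excerpt: pick the layouts to cycle through (Mod4+Space by default)
require("awful")

layouts = {
    awful.layout.suit.tile,      -- one master window, the rest stacked beside it
    awful.layout.suit.max,       -- focused window takes the whole screen
    awful.layout.suit.floating,  -- traditional free placement
}

-- Nine numbered tags per screen, all starting on the first layout
tags = {}
for s = 1, screen.count() do
    tags[s] = awful.tag({ 1, 2, 3, 4, 5, 6, 7, 8, 9 }, s, layouts[1])
end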

The default keybindings were actually a lot like what I already used, so that was a plus. In fact, it worked pretty well. Where it let me down was in window placement; in particular, floating windows and dividing up the screen.

Awesome of course works with a number of canned window layouts. It can make a window full screen (hiding the Awesome bar) or near full-screen, or show two windows above/below each other or side by side. Windows are given numerical tags which cause those windows to appear whenever a particular tag is selected, much like virtual desktops, only multiple tags can be active on a screen.

What irritated me most was trying to find a layout scheme that worked for me. I couldn’t seem to re-arrange the windows in the layout, and so if Awesome decided to plonk a window in a spot, I was stuck with it there. Or I could try cycling through the layouts to see if one of the others was better. I spent much energy arguing with it.

Floating windows were another hassle. Okay, modal dialogues need to float, but there was no way to manually override the floating status of a window. The Gimp was one prime example: okay, you can tell it to not float its windows, but it still took some jiggery-pokery to get each window to sit where you wanted it. And not all applications give you this luxury.

Right now I’m back with the old faithful, FVWM.

FVWM, as I have it set up on Gentoo

Windows 7

When one of my predecessors at VRT left to work for a financial firm down in Sydney, I wound up inheriting his old projects, and the laptop he used to develop them on. The machine dual-boots Ubuntu (with KDE) and Windows 7, and seeing as I already have the MacBook set up as I want it, I use that as my main workstation and leave the other booted into Windows 7 for those Windows-based tasks.

Windows 7 is much like Windows Vista in the UI. Behind the scenes, it runs a lot better. People aren’t kidding when they say Windows 7 is “Vista done right”. However, recall I mentioned Windows Vista being the last able to do the classic Start menu? Maybe I’m dense, but I’m yet to spot the option in Windows 7. It isn’t where they put it in Windows XP or Vista.

So I’m stuck with a Start menu that crams itself into a small bundle in one corner of the screen. Aero has been turned off in favour of a plain “classic” desktop. I have the task bar up the top of the screen.

One new feature of Windows 7 is that the buttons of running applications by default only show the icon of the application. Clicking that reveals tiny wee screenshots with wee tiny title text. More than once I’ve been using a colleague’s computer, he’ll have four spreadsheets open, I’ll click the icon to switch to one of them, and neither of us can figure out which one we want.

Thankfully you can tell it to not group the icons, showing a full title next to the icon on the task bar, but it’s an annoying default.

Being able to hit Logo-Left or Logo-Right to tile a window on the screen is nice, but I find more often than not I wind up hitting that when I mean to hit one of the other meta keys, and thus I have to reach for the rodent and maximise the window again. This is more to do with the key layout of the laptop than Windows 7, but it’s Windows 7’s behaviour and the inability to configure it that exacerbates the problem.

The new Start menu, I’d wager, is why Microsoft saw so many people pinning applications to the task bar. It’s not just for quick access; in some cases it’s the only bleeding hope they’d ever have of finding their application again! Sure, you can type the name of the application, but circumstance doesn’t always favour that option. Great if you know exactly what the program is called, not so great if it’s someone else’s computer and you need to know if something is even there.

Thankfully most of the effects can be turned off, and so I’m left with a mostly Spartan desktop that just gets the job done. I liken using Windows to a business trip abroad, you’re not there for pleasure, and there’s nothing quite like home sweet home.

Windows 8

Now I get to the latest instalment of desktop operating systems. I have yet to actually use it myself, but looking at a few screenshots, a few thoughts:

  • “Modern”: apart from being a silly name for a UI paradigm (what do you call it when it isn’t so “modern” anymore?), it looks like it could work really well on the small screen. However, it relies quite heavily on gestures and keystrokes to navigate. All very well if you set these up to suit how you operate, but not so great when they’re forced upon you.
  • Different situations will call for different interface methods. Sometimes it is convenient to reach out and touch the screen, other times it’ll be easier to grab the rodent, other times it’ll be better to use the keyboard. Someone should be able to achieve most tasks (within reason) with any of the above, and seamlessly swap between these input methods as need arises.
  • “Charms” and “magic corners” make the desktop sound like it belongs on the set of a Harry Potter film.
  • Hidden menus that, by default, jump out without warning when you hit the relevant corner or edge of the screen will likely startle and confuse.
  • A single flat hierarchy of icons^Wtiles for all one’s applications? Are we back to MS-DOS Executive again?
  • “Press the logo key to access the Start screen”, and so what happens if the keyboard isn’t in convenient reach but the mouse is?
  • In a world where laptops are out-numbering desktops and monitors are getting wider faster than they’re getting taller, are extra-high ribbons really superior to drop-down menus for anyone other than touch users?

Apparently there’s a tutorial when you start up Windows 8 for the first time. Comments have been made about how people have been completely lost working with the UI until they saw this tutorial. That should be a clue at least. Keystrokes are really just a shortened form of command line. Even the Windows 7 Start menu, with its search bar, is looking more like a stylised command line (albeit one with minimal capability).

Are we really back to typing commands into pseudo command line interfaces?

The Command line: what is old is new again

I recall the ad campaigns for Windows 7: on billboards, some attractive woman posing with the caption “I’m a PC and Windows 7 was my idea”.

Mmm mmm, so whose idea was Windows 8 then? There are no rounded rectangles to be seen, so clearly not Apple’s. I guess how well it goes remains to be seen.

It apparently has some good improvements behind the scenes, but anecdotal evidence at the workplace suggests that the ability to co-operate with a Samba 3.5-based Windows Domain is not among them. One colleague recently bought herself a new ultrabook running Windows 8.

I’m guessing sooner or later I’ll be asked to assist with setting up the Cisco VPN client and links to file shares. Another colleague, despite getting the machine to successfully connect to the office WiFi, couldn’t manage to bring up a login prompt to connect to the file server; the machine instead just assumed the local username and password matched the credentials to be used on the remote server. I will have to wait and see.

Where to now?

Well I guess I’m going to stick with FVWM a bit longer, or maybe pull my finger out and go my own way. I think Linus has a point when he describes KDE as a bit “cartoony”. Animations make something easy to sell, but at the end of the day, it actually has to do the work. Some effects can add value to day-to-day tasks, but most of what I’ve seen over the years doesn’t seem to add much at all.

User interfaces are not one-size-fits-all. Never have been. Touch screen interfaces have to deal with problems like fat fingers, and so there’s a balancing act between how big to make controls and how much to fit on a screen. Keyboard interfaces require a decent area for a keypad, and in the case of standard computer keyboards, ideally, two hands free. Mice work for selecting individual objects, object groups and basic gestures, but make a poor choice for entering large amounts of data into a field.

For some, physical disability can make some interfaces a complete no-go. I’m not sure how I’d go trying to use a mouse or touch screen if I lost my eyesight for example. I have no idea how someone minus arms would go with a tablet — if you think fat fingers is a problem, think about toes! I’d imagine the screens on those devices often would be too small to read when using such a device with your feet, unless you happen to have very short legs.

Even for those who have full physical ability, there are times when one input method will be more appropriate at a given time than another. Forcing one upon a user is just not on.

Hiding information from a user has to be carefully considered. One of my pet peeves is when you can’t see some feature on screen because it is hidden from view. It is one thing if you yourself set up the computer to hide something, but quite another when it happens by default. Having a small screen area that activates and reveals a panel is fine, if the area is big enough and there is some warning that the panel is about to fly out.

As for organising applications? I’ve never liked the way everything just gets piled into the “Programs” directory of the Start Menu in Windows. It is just an utter mess. MacOS X isn’t much better.

The way things are in Linux might take someone a little discovery to find where an application has been put, but once discovered, it’s then just a memory exercise to get at it, or shortcuts can be created. Much better than hunting through a screen-full of unsorted applications.

Maybe Microsoft can improve on this with their Windows Store, if they can tempt a few application makers from the lucrative iOS and Android markets.

One thing is clear, the computer is a tool, and as such, must be able to be adapted for how the user needs to use that tool at any particular time for it to maintain maximum utility.

Sep 302012
 

Just thought I’d post this up here for “backup” purposes… lately I’ve been doing a lot of AVR programming, and the first step of course was to procure a programmer for the devices.

The following is a schematic for the programmer I have built myself. It can be built out of scrap bits; none of the components are critical in value.

It gets its name as it is essentially identical to the “DASA” serial cables, only all the signals are inverted.  The inverting buffers serve to provide voltage level conversion along with crude tri-state functionality when the AVR device is not being programmed.

Inverted-“DASA” serial programming cable for AVR

The design is released under the TAPR Open Hardware License.
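To drive the cable, avrdude’s serial bit-bang driver is the go. The stock “dasa” programmer definition assumes non-inverted signals, so a custom entry is needed; the sketch below is indicative only, with pin numbers following the DB9 numbering avrdude uses (3=TXD, 4=DTR, 7=RTS, 8=CTS) and the ~ marking each signal as inverted. Check it against your own wiring and avrdude version before trusting it.

# Custom entry appended to avrdude.conf (or a per-user ~/.avrduderc)
# Serial bit-banging with all four signals inverted: "DASA" plus inverting buffers
# Pin mapping assumed: reset=RTS(7), sck=DTR(4), mosi=TXD(3), miso=CTS(8)
# Note: newer avrdude releases expect the type quoted, i.e. type = "serbb";
programmer
  id    = "dasa-inv";
  desc  = "serial port banging, inverted DASA";
  type  = serbb;
  reset = ~7;
  sck   = ~4;
  mosi  = ~3;
  miso  = ~8;
;

Invocation is then the usual sort of thing, e.g. avrdude -c dasa-inv -P /dev/ttyS0 -p attiny2313 -U flash:w:firmware.hex, with the part and hex file being whatever you’re actually programming.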

Apr 212012
 

Some may recall the old set-up I used to record the AWNOI net. A bit fiddly, but it worked, and worked well. However, the machine I used was short-lived. Basically, I wanted to stream in both directions between a sound device connected to an HF radio transceiver and a USB wireless headset, with a feed being recorded to disk.

The problem with newer sound devices is the rather limited sync range possible with the modern audio CODECs. Many will not do sample rates that aren’t a multiple of 44.1kHz or 48kHz, and I have a headset that won’t record at any sample rate other than 16kHz. ALSA’s plug: plugin doesn’t play nice with JACK, and I shelved the whole project for later.

Well, tonight I did some tinkering with gstreamer to see if it could do the routing I needed. Certainly all the building blocks were there, I just had to get the pipeline right. A bit of jiggling of parameters, and I managed to get audio going in both directions, and to a RIFF wave file to boot. I’ve put it in a shell script for readability/maintainability:

#!/bin/sh
# GStreamer bi-directional full-duplex audio routing script
# Stuart Longland VK4MSL
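#
# What this does: capture from the USB headset and the sound card at the
# same time, cross-feed each capture to the other device's output, and
# interleave both streams into a two-channel RIFF wave file on disk.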

CAPT_BUFTIME=50000
CAPT_LATTIME=25000
PLAY_BUFTIME=20000
PLAY_LATTIME=1000

STREAM_FMT=audio/x-raw-int,channels=1,width=16,depth=16,rate=16000
OUTPUT_FMT=audio/x-raw-int,channels=2,width=16,depth=16,rate=16000
OUTPUT="${1:-out.wav}"

HEADSET_DEV=hw:Headset
HEADSET_CAPT_FMT=audio/x-raw-int,channels=1,width=16,depth=16,rate=16000
HEADSET_CAPT_BUFTIME=${CAPT_BUFTIME}
HEADSET_CAPT_LATTIME=${CAPT_LATTIME}
HEADSET_PLAY_FMT=audio/x-raw-int,channels=2,width=16,depth=16,rate=48000
HEADSET_PLAY_BUFTIME=${PLAY_BUFTIME}
HEADSET_PLAY_LATTIME=${PLAY_LATTIME}

SNDCARD_DEV=hw:NVidia
SNDCARD_CAPT_FMT=audio/x-raw-int,channels=2,width=16,depth=16,rate=48000
SNDCARD_CAPT_BUFTIME=${CAPT_BUFTIME}
SNDCARD_CAPT_LATTIME=${CAPT_LATTIME}
SNDCARD_PLAY_FMT=audio/x-raw-int,channels=2,width=16,depth=16,rate=48000
SNDCARD_PLAY_BUFTIME=${PLAY_BUFTIME}
SNDCARD_PLAY_LATTIME=${PLAY_LATTIME}
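
# One gst-launch invocation, three chains:
#   1. Headset capture    -> tee "headset"   -> resample -> sound card playback
#   2. Sound card capture -> tee "soundcard" -> resample -> headset playback
#   3. Both tees -> queue -> interleave -> WAV encoder -> ${OUTPUT}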

exec    gst-launch-0.10 \
    alsasrc device=${HEADSET_DEV} \
            name=headset-capt \
            slave-method=resample \
            buffer-time=${HEADSET_CAPT_BUFTIME} \
            latency-time=${HEADSET_CAPT_LATTIME} \
        ! ${HEADSET_CAPT_FMT} \
        ! audioresample \
        ! audioconvert \
        ! ${STREAM_FMT} \
        ! queue \
        ! tee name=headset \
        ! audioresample \
        ! audioconvert \
        ! ${SNDCARD_PLAY_FMT} \
        ! alsasink device=${SNDCARD_DEV} \
            name=sndcard-play \
            buffer-time=${SNDCARD_PLAY_BUFTIME} \
            latency-time=${SNDCARD_PLAY_LATTIME} \
    alsasrc device=${SNDCARD_DEV} \
            name=sndcard-capt \
            slave-method=resample \
            buffer-time=${SNDCARD_CAPT_BUFTIME} \
            latency-time=${SNDCARD_CAPT_LATTIME} \
        ! ${SNDCARD_CAPT_FMT} \
        ! audioresample \
        ! audioconvert \
        ! ${STREAM_FMT} \
        ! queue \
        ! tee name=soundcard \
        ! audioresample \
        ! audioconvert \
        ! ${HEADSET_PLAY_FMT} \
        ! alsasink device=${HEADSET_DEV} \
            name=headset-play \
            buffer-time=${HEADSET_PLAY_BUFTIME} \
            latency-time=${HEADSET_PLAY_LATTIME} \
    interleave name=recorder-in \
        ! audioconvert \
        ! audioresample \
        ! ${OUTPUT_FMT} \
        ! wavenc \
        ! filesink location="${OUTPUT}" \
    headset. \
        ! queue \
        ! recorder-in.sink1 \
    soundcard. \
        ! queue \
        ! recorder-in.sink0

Now the fun begins ironing out the kinks in my data cable for the FT-897. At present, it works for receive, and seems to work for transmit. I use VOX on the radio itself and keep the headset’s microphone on mute when I don’t want to transmit.

At present I’m getting a bit of distorted audio coming back through the headset when I transmit, almost certainly RF pick-up in the cable and the line-in circuitry of the computer’s sound card. I’ll have to see if I can filter it out, but the real test will be seeing whether such distortion is present on the outgoing signal, or rather, whether it’s significantly audible to be a problem.