Open Standards

People who know me will know I’m quite a keen supporter of open source projects. I’m not nearly as fanatical about it as others, such as Richard Stallman, but I try to support open source as much as I can.

However, I suppose I’m a much bigger supporter of open standards than of open source. I don’t mind if a project implementing a standard is proprietary commercial software — if the underlying standards it is built on are open, an open source implementation can be created. This gives users a choice: they may choose a commercial solution for various reasons, or they may choose open source; it’s entirely up to them.

Now I realise that many of you will be reading this on planet.gentoo.org, and thus I’m likely preaching to the converted. I’m mainly aiming this at organisations that are completely blind to these issues. I’m hoping some of them might see this post.

Some might ask: what’s wrong with closed standards? There are a number of issues.

  • Vendor lock-in: it locks people into buying from particular vendors, for better or worse.
  • Inflexibility: If you don’t know how it works, how can you modify it to make it do what you want?
  • Control: Who controls what you do with the application? Or the data produced?

If you’re using some closed system and you run into technical difficulties, the only people who can help are the makers of that product. You can’t easily switch to another product, and you’re completely at that vendor’s mercy. Some charge extortionate rates to fix even trivial problems, if they help at all. Now granted, there are some good players out there, and if you strike one, great… but if things change for the worse, you’re stuffed.

The ability to understand how a system works is particularly important, not just for troubleshooting, but also for experimentation. Users of a system may have ideas that you as a company have not even considered. Now if the system is open, they can either modify it themselves, or hire someone to modify it, to suit their needs.

Experimentation in one’s spare time is a great way to learn too — university can’t teach you everything. But if the system is closed, how can anyone experiment? The ability to learn about a system is greatly stifled when you can’t play with the deep internals at the protocol level.

Control over what you can do with the data produced by a system is another concern. Remember that you, as the vendor, do not own the data produced by someone using your product. As far as the user is concerned, it’s their data. If I put an audio or video clip of my own work up on my site (which I have done on occasion), it’s not companies like Fraunhofer, or Microsoft, or Apple that own the content, it’s me. And I want the right to share that clip under my terms.

The only reason the Internet is popular today is open standards. You would likely not be reading this, had it not been for open protocols such as IEEE 802.11b, OpenVPN, Ethernet, TCP/IP and HTTP, and open formats such as HTML. Look at what happened to CompuServe… The Microsoft Network… AOL… Ring a bell? They were all closed networks that died out because the open wild of the Internet was more appealing to their users.

It isn’t just an issue in the information technology realm. Allow me to look at the problem in another context: amateur radio would not exist today as a hobby if it were not for open communications standards.

If you look past the obvious social and competitive aspects of amateur radio, you see there’s another aspect: the experimentation side. As defined by the ACMA LCD (Licence Conditions Determination; I’m sure it’s similar in other countries) …

6. Use of an amateur station

The licensee:

  1. must use an amateur station solely for the purpose of:
    1. self training in radiocommunications; or
    2. intercommunications; or
    3. technical investigations into radiocommunications; or
    4. transmitting news and information services related to the operation of amateur stations, as a means of facilitating intercommunication

Two of those points, self training and technical investigations, are rather important. Put in layman’s terms: if you’re not in the hobby to talk to people, it’s mainly there for experimenting with the technology.

There’s another restriction here too: we’re not allowed to use cryptography, or any kind of secret code; anything we transmit must be publicly documented. (I could, for instance, theoretically use UTF-8 on CW, encoding ones as a dash, zeros as a dot, and using RS-232-like encapsulation. Morse users would get confused, however.)
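To show just how tongue-in-cheek that scheme is, here’s a rough Python sketch (purely illustrative, and entirely hypothetical; nobody should actually key this out): each UTF-8 byte is framed RS-232-style with a start bit, eight data bits sent LSB-first, and a stop bit, and each bit becomes a Morse element.

```python
def byte_to_frame(b):
    """RS-232-style framing: start bit (0), 8 data bits LSB-first, stop bit (1)."""
    return [0] + [(b >> i) & 1 for i in range(8)] + [1]

def encode_utf8_cw(text):
    """Encode text as UTF-8, then map each framed bit to a Morse element:
    1 -> dash, 0 -> dot. Frames are separated by spaces."""
    symbols = {0: '.', 1: '-'}
    frames = []
    for b in text.encode('utf-8'):
        frames.append(''.join(symbols[bit] for bit in byte_to_frame(b)))
    return ' '.join(frames)

print(encode_utf8_cw('Hi'))
```

So “Hi” comes out as `....-..-.- .-..-.--.-`, ten elements per byte — perfectly decodable by machine, and utterly baffling to anyone copying it by ear.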

Now suppose FM, for example, were a closed standard — that is, you had to pay some company royalty fees to use it. (Yes, I know that almost did happen way back in the 1930s, but anyway.) How well do you think that’d sit with radio amateurs, who typically like to build homebrew equipment? I don’t think it’d be liked much at all. In fact, if it were secret, it may very well be illegal in some countries. Thankfully this isn’t the case, and even emerging standards like D-Star are fully open.

Now… back to the IT situation. We can see that a system where the protocols and standards used are fully open can work. I have to ask why IT thinks it’s special and insists on closed standards.

Looking at the educational environment… it’s here, more than any other place, where we need open standards. How can students be expected to learn about something if they can’t conduct their own experiments? Experimenting in one’s own time is a good way to gain a better understanding of the topic of study. It’s the people graduating from these universities that will be carrying the industry forward, and I really do think the present industry should assist by being as open as possible.

Why is it that universities like inflicting this poor choice of closed systems on their students? Yes, I’m looking at you, Queensland University of Technology, with your extensive use of Microsoft Office, Windows Media codecs (for recorded lectures), Cisco VPNs, the Microsoft .NET framework, and numerous proprietary apps/standards.

QUT has a number of labs for each faculty, as well as central labs. The central labs have OpenOffice installed; however, the labs for the Faculty of Engineering and the Faculty of IT do not. So sure, I can work on an assignment on my personal laptop (running Gentoo Linux, of course) — but if I have to email it to the lecturer, I have to convert it to a PDF (my preferred method), and some have the gall to ask for it in Microsoft Office formats.

If I comment that I don’t have the money to purchase Microsoft Office, the response is usually something along the lines of, “Ohh, well you’ll just have to use the computers here.” Yeah, well… how about I email my stuff in OpenDocument (ISO 26300) format, and see how YOU like walking out of your cosy little office, into the library, to use a computer other than your own to view some file you’re expected to read. Exactly, you don’t like it… so why should we be expected to put up with it?!

If that isn’t bad enough, they’ve now apparently dropped Java as a teaching language. They instead use Scheme for the first years, then throw them in the deep end with .NET. Way to go for consistency! It’s probably worth noting that they know nothing about Mono, and expect everyone to use Visual Studio .NET.

I really do think this is highly hypocritical of the university, and it’s an attitude that really disgusts me. Sadly, I know they’re not the only ones doing this — some are even worse in this regard. (Then again, some are really open source friendly.) I have good reasons for using the software I do. I at least give you the choice of using anything that opens OpenDocument formats — which is quite a lot — it’s just sad that your office suite of choice isn’t among them by default. That’s not my fault, and you shouldn’t blame me for it.

I’ve complained directly to them about this before … so I’m now taking this complaint onto the world stage. Don’t like it? Tough.

I try to practise what I preach. One site I maintain, the Asperger Services Australia site, does make use of open standards. Sure, Microsoft Office is used internally to write the documents that get uploaded (I’m working on that, give me time), but they are converted to PDF. PDF, of course, is another open standard, ISO 32000.

Any multimedia on the site uses the Xiph.Org Foundation codecs Theora and Vorbis. Sure, I get the odd question from a Windows or Mac user about how to play the files, but thanks to the Cortado player applet and iTheora, I’m able to make the video play out-of-the-box for 99% of users, and cater for the other 1% by allowing them to download the file and play it in any of the many players that support Theora and Vorbis.

This is handled automatically in most cases; the user isn’t even aware of the underlying architecture. However, if they’re curious, that architecture is open and there for them to look at.

I find it somewhat ridiculous that science fiction shows such as Star Trek depict (fictional) alien craft, produced by completely different lifeforms, that are somehow 100% compatible at every layer of the OSI stack. We haven’t even achieved that today, and every computer on this planet was built by the same species!

I really do think this closed-standards war is hurting more than it’s helping. It’s about time we cut the nonsense and actually started working together. The protocols and formats used by systems really should be open for anyone to implement. I don’t mind closed implementations of those standards; that’s fine, but the standards themselves should be open.

Anyway… that’s enough of my ranting… glad to get that out of my system. 🙂