Feb 28, 2008
 

People who know me will know I’m quite a keen supporter of open source projects. I’m not nearly as fanatical about it as others, such as Richard Stallman, but I try to support open source as much as I can.

However, I suppose I’m a much bigger supporter of open standards than of open source. I don’t mind if a project implementing a standard is proprietary commercial software — if the underlying standards it is built on are open, that makes it possible for an open source implementation to be created. This gives users a choice — they may choose for various reasons to go for a commercial solution, or they may choose open source; it’s entirely up to them.

Now I realise that many of you will be reading this on planet.gentoo.org, and thus I’m likely preaching to the converted. I’m mainly aiming this at organisations that are completely blind to the issues faced. I’m hoping some of those might see this post.

Some might ask, what’s wrong with closed standards? There are a number of issues regarding closed standards.

  • Vendor lock-in: it locks people in to buying from particular vendors, for better or worse.
  • Inflexibility: If you don’t know how it works, how can you modify it to make it do what you want?
  • Control: Who controls what you do with the application? Or the data produced?

If you’re using some closed system and you run into technical difficulties, the only people who can help are the makers of that product. You can’t easily switch to another product, and you’re completely at that vendor’s mercy. Some charge extortionate rates to fix even trivial problems, if they help at all. Now granted, there are some good players out there, and if you strike one, great… but if things change for the worse, you’re stuffed.

The ability to understand how a system works is particularly important. Not just for troubleshooting… but also for experiments. Users of a system may have ideas that you as a company have not even considered. Now if it’s open, they can either modify the system themselves, or hire someone to modify it, to suit their needs.

Experimentation in one’s spare time is a great way to learn too — university can’t teach you everything. But if the system is closed, how can you experiment? The ability to learn about a system is greatly stifled when you can’t play with the deep internals at the protocol level.

Control over what you can do with the data produced by a system is a hassle. Remember that you, as the vendor, do not own the data produced by someone using your product. As far as the user is concerned, it’s their data. If I put an audio or video clip of my own work up on my site (which I have done on occasion), it’s not companies like Fraunhofer, or Microsoft, or Apple that own the content, it’s me. And I want the right to share that clip under my terms.

The only reason the Internet is popular today is open standards. You would likely not be reading this, had it not been for open protocols such as IEEE 802.11b, OpenVPN, Ethernet, TCP/IP and HTTP, and open formats such as HTML. Look at what happened to CompuServe… The Microsoft Network… AOL… Ring a bell? They were all closed networks that died out because the open wild of the Internet was more appealing to their users.

It isn’t just an issue in the information technology realm. Allow me to look at the problem in another context. Amateur radio would not exist today as a hobby if it were not for open communications standards.

If you look past the obvious social and competitive aspects of amateur radio, you see there’s another aspect: the experimentation side. As defined by the ACMA LCD (I’m sure it’s similar in other countries)…

6. Use of an amateur station

The licensee:

  1. must use an amateur station solely for the purpose of:
    1. **self training in radiocommunications**; or
    2. intercommunications; or
    3. **technical investigations into radiocommunications**; or
    4. transmitting news and information services related to the operation of amateur stations, as a means of facilitating intercommunication

The two points I’ve highlighted in bold above are rather important. Put in layman’s terms… if you’re not in the hobby to talk to people, it’s mainly there for experimenting with the technology.

There’s another restriction here too… we’re not allowed to use cryptography or any kind of secret code; it must be public domain. (I could, for instance, theoretically use UTF-8 on CW, encoding ones as a dash, zeros as a dot, and using RS-232-like encapsulation. Morse users would get confused, however.)
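Just to illustrate the idea (this is purely a hypothetical sketch of my own framing — nothing standardised, and the function name is made up), such an encoding might look like this:

```python
def utf8_to_cw(text):
    """Frame each UTF-8 byte RS-232 style: one start bit (0), eight
    data bits sent least-significant-bit first, then one stop bit (1).
    A one is keyed as a dash, a zero as a dot."""
    words = []
    for byte in text.encode('utf-8'):
        bits = [0] + [(byte >> i) & 1 for i in range(8)] + [1]
        words.append(''.join('-' if b else '.' for b in bits))
    return ' '.join(words)

print(utf8_to_cw('A'))  # .-.....-.-
```

Entirely public, entirely reproducible by anyone listening in — which is exactly what the licence conditions ask for.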

Now suppose FM, for example, were a closed standard — that is, you had to pay some company royalty fees to use it. (Yes, I know that almost did happen way back in the 1930s, but anyway.) How well do you think that’d sit with radio amateurs, who typically like to build homebrew equipment? I don’t think it’d be liked much at all. In fact, if it were secret, it may very well be illegal in some countries. Thankfully this isn’t the case, and even emerging standards like D-STAR are fully open.

Now… back to the IT situation. We can see that a system where the protocols and standards used are fully open can work. I have to ask: why does IT think it’s special, and insist on closed standards?

Looking at the educational environment… it’s here, more than any other place, that we need open standards. How can students be expected to learn about something if they can’t conduct their own experiments? Experimenting in one’s own time is a good way to gain a better understanding of the topic of study. It’s the people graduating from these universities that will be carrying the industry forward, and I really do think the present industry should assist by being as open as possible.

Why is it that universities like inflicting this poor choice of closed systems on their students? Yes, I’m looking at you, Queensland University of Technology, with your extensive use of Microsoft Office, Windows Media codecs (for recorded lectures), Cisco VPNs, the Microsoft .NET framework, and numerous proprietary apps/standards.

QUT have a number of labs for each faculty, but also central labs. The central labs have OpenOffice installed; however, the labs for the Faculty of Engineering and the Faculty of IT do not. So sure, I can work on some assignment on my personal laptop (running Gentoo Linux of course) — but if I have to email it to the lecturer, I have to convert it to a PDF (my preferred method), and some even have the gall to ask for it in Microsoft Office formats.

If I comment that I don’t have the money to purchase Microsoft Office, the response is usually something along the lines of, “Ohh, well you’ll just have to use the computers here.” Yeah well… how about I email my stuff in OpenDocument (ISO/IEC 26300) format, and see how YOU like walking out of your cozy little office, into the library, and using a computer other than your own to view some file you’re expected to read. Exactly, you don’t like it… why should we be expected to put up with it?!

If that isn’t bad enough, they’ve now apparently dropped Java as a teaching language. They instead use Scheme for the first years, then throw them in the deep end with .NET. Way to go for consistency! It’s probably worth noting that they know nothing about Mono, and expect everyone to use VisualStudio.NET.

I really do think this is highly hypocritical of the university, and it’s an attitude that really disgusts me. Sadly I know they’re not the only ones doing this — some are even worse in this regard. (Then again, some are really open source friendly.) I have good reasons for using the software I do. I at least give you the choice of using anything that opens OpenDocument formats — which is quite a lot — it’s just sad that your office suite of choice isn’t among them by default. That’s not my fault, and you shouldn’t blame me for it.

I’ve complained directly to them about this before … so I’m now taking this complaint onto the world stage. Don’t like it? Tough.

I try to practice what I preach. One site I maintain, the Asperger Services Australia site, does make use of open standards. Sure, Microsoft Office is used internally to write the documents that get uploaded (I’m working on that, give me time), but they are converted to PDF. PDF of course is another open standard, ISO 32000.

Any multimedia on the site uses the Xiph.Org Foundation codecs Theora and Vorbis. Sure, I get the odd question from a Windows or Mac user about how to play the files, but thanks to the Cortado player applet and ITheora, I’m able to make the video play for 99% of users out-of-the-box, and cater for the other 1% by allowing them to download the file and play it in any of the numerous players that support Theora and Vorbis.

In most cases this is handled automatically; the user isn’t even aware of the underlying architecture. However, if curious, the underlying architecture is open and there for them to look at.

I find it somewhat ridiculous when science fiction shows such as Star Trek depict (fictional) alien craft, produced by completely different lifeforms, as somehow 100% compatible at every layer of the OSI stack. We haven’t even got that today, and every computer on this planet was built by the same species!

I really do think this closed-standards war is hurting more than it’s helping. It’s about time we cut the nonsense and actually started working together. Protocols and formats used by systems really should be open for anyone to implement. I don’t mind closed implementations of those standards, that’s fine, but the standards themselves should be open.

Anyway… that’s enough of my ranting… glad to get that out of my system. 🙂

  5 Responses to “Open Standards”

  1. I feel the same when I’m expected to use MS Office formats, or worse, when I’m supposed to use a shiny feature of the very latest MS Office version. Or when we use Oracle for just a tiny introduction to SQL, instead of using PostgreSQL or something that every student can have at home if he wants.

    One of the worst cases was when I had to use a closed source CASE tool with its own obscure closed format… the tool sucked badly at usability and I couldn’t use any alternative.

    That being said, it depends on the department. There are professors who use open standards, open source, and even tools that don’t suck.

  2. This is a very good point; however, I would like to mention that the .NET in 003 would have been fine with Scheme if they had used Nemerle (a superset of .NET C# with Lisp constructs; this would have added consistency between the two units). I am not sure if you are aware of this, but both the subjects you mentioned are, from this year onwards, teaching Python (which seems to be an open source, open standard to me). As for the lack of OpenOffice inside the IT Faculty, I believe the Data Comm labs in S622 have OpenOffice on their Ubuntu image.

    Cisco VPN works fine with vpnc or kvpnc, and MPlayer does play both the wma and wmv formats with the win32codecs package installed. C# is actually an open standard, hence the existence of Mono. QUT is also migrating from VPN to WPA2, but I find the VPN more robust with my bcm43xx or b43 wireless card. Sorry if this information disappoints, but it seems standards are slowly being improved at QUT, and people are finally responding to your feedback.

  3. Well… with .NET… it’s true that C# is an open standard… I believe there’s an ECMA standard that defines it.

    However, VisualStudio.NET uses a proprietary system for defining projects, and in general, makes it quite difficult to work with other IDEs and platforms. This was my experience anyway, back when I was learning .NET for an IT subject in 2004.

    I’m sure OpenOffice might be present on some workstations, but the fact is, lecturers aren’t using it. PDFCreator is also there, but again, lecturers aren’t using it. This is especially true in the Engineering department. I’ve pointed out their presence on many an occasion, with little success.

    OpenOffice can open Microsoft Office documents, this is true, but with somewhat limited capabilities. I’ve had formulae totally messed up and diagrams made unintelligible when using OpenOffice — not good enough for my needs. This isn’t the fault of OpenOffice, but rather, the fault of Microsoft for making its format so difficult to implement.

    As far as Windows Media codecs go… yes, mplayer will play them, with win32codecs, which require an x86 processor. Tough luck if you use AMD64, or in my case, MIPS. In addition, they run damn awfully even on x86. No thanks… there are perfectly serviceable codecs they could use out there (such as x264, Theora… etc.) and practically no reason why they couldn’t use them.

    Up until very recently, WPA2 was no good to me. I’ve only *just* managed to upgrade to a card that can theoretically do WPA2. I normally use an Enterasys/Cabletron 802.11DS PCMCIA wireless card, a rebadge of the Orinoco Silver card. It doesn’t do WPA. It doesn’t even do 128-bit WEP — it does 40-bit WEP in hardware. Without native WPA support, I can’t connect, so that network is useless to me. Prior to moving to the Cisco VPN, they were using a system called BlueSocket, which was based on PPTP — quite trivial to set up and use. There’s also OpenVPN. Why they chose to move to the Cisco nonsense, I have no idea.

  4. The way things are going, the digital dark age is coming.

    “The digital Dark Age is a term used to describe a possible future situation where it will be difficult or impossible to read historical documents, because they have been stored in an obsolete digital format. This could cause the period around the turn of the 21st century to be comparable to the Dark Ages during the Middle Ages in the sense that there will be a relative lack of written record. The term is not limited to text documents, but applies equally to photos, video, audio and other kinds of electronic documents.

    The concern leading to the use of the term is that documents are stored on physical media which require special hardware in order to be read and that this hardware will not be available in a few decades from the time the document was created. An example is that already today the necessary disk drive to read a 5¼-inch floppy disk is not readily available. The digital Dark Age also applies to the problems which arise due to obsolete file formats. In this case it is the lack of the necessary software which causes problems when desiring to retrieve stored documents. One example is that a word processor document saved in the WordStar format popular in the 1980s cannot be read by software typically installed on modern PCs.” — http://en.wikipedia.org/wiki/Digital_Dark_Age

    The best example of this is the BBC Domesday Project http://en.wikipedia.org/wiki/BBC_Domesday_Project
    “Acorn BBC Master expanded with an SCSI controller and an additional coprocessor controlled a Philips VP415 “Domesday Player”, a specially-produced laserdisc player.”
    “The software for the project was written in BCPL to make cross platform porting easier, although BCPL never attained the popularity that its early promise suggested it might.”
    Luckily this project was saved.

  5. […] And I’m noticing there are some bad habits that lecturers seem to be keen on repeating… again, and again.  Here’s some of my pet hates, as a student.  These relate to the presentation of the material we’re given, the actual format they’re provided in is another matter. […]