I have been doing general computer support (along with a lot of other IT related jobs, such as web site design and general programming) for many years now so I have had a lot of interesting experiences in that area. In my last blog entry I discussed some of the reasons for computer problems in regards to the computer hardware and software. Today I want to comment on (or maybe whinge about) how the user contributes to the whole situation.
I mainly work in a university which is (theoretically at least) filled with intelligent, capable people. You might assume, in this situation, that supporting their computing requirements would be fairly easy. Well no, not necessarily!
If I can actually work on the computer then there are usually few problems. With ubiquitous networking it is usually possible to take control of a computer without being physically present, and in many cases I can actually visit to tackle the problem directly. But there are many other situations where phone support is required, and that’s where things get interesting.
Recently I spent 20 minutes trying to establish whether a computer was plugged in to the power or not. I know that sounds ridiculous but think about it: even with an iMac, which has fewer cables than many other computers, there are still a few to worry about, such as keyboard, ethernet, and printer cables. Some users look behind the computer, see some cables, and assume that it is plugged into the power.
And yes, I did describe the power cable as a “quite thick (about 5 mm) grey one which probably goes through a big hole in the computer’s stand and connects in the middle”. The user saw a fairly thin green one which didn’t go through a hole and connected on the right, and thought that was close enough!
So when I try to figure out why the computer won’t start, after being told the power is connected and switched on, you can imagine my confusion. I often feel like my time is being wasted by help systems which ask if the computer is plugged in to the power. I think “of course it is, get on to the helpful suggestions” but clearly you can’t assume anything!
Sometimes users just seem to be totally blind to what is on their screen. I asked another user to open the Applications folder and tell me if “Image Capture” was there. He said it wasn’t and listed some other programs with similar names: iPhoto, iTunes, etc. I was surprised by this because Image Capture is a standard program installed with the operating system (an install which I had done earlier).
So I said “just type I, then M together quickly”. He said that it had highlighted Mail. Obviously he had typed too slowly and instead of highlighting a program starting with “IM” it had highlighted one starting with “I” followed by another starting with “M”. So I asked him to type a bit more quickly but before he did that he said “Oh, here it is, next to iPhoto”. Right, so it had been there all along?
But that was nothing compared with a similar issue I wasted about 10 minutes on (these may not sound like long periods of time but if you get 2 or 3 in a row suddenly you start thinking about how you might prefer to be doing something else). All I wanted to do was confirm which program a user was currently using.
On a Mac that is quite easy because the second menu in the menu bar (which is always across the top of the screen) is the name of the program, and that menu is always to the right of the Apple menu which is at the top-left of the screen. So all the user has to do is look at the top-left of the screen, recognise the small Apple symbol and read the word immediately to its right. How hard can that be?
Well, as I said, it took 10 minutes to prise that piece of information from one user I recently worked with.
First of all I assumed a certain amount of basic knowledge on the user’s part and asked her directly which program she was in (that terminology usually works better than asking which is the “active” or “frontmost” program). She didn’t know, of course.
Right, so again assuming a certain amount of basic knowledge, I asked what the name of the second menu was. She said she had no menus. Now I really don’t know if she knew what a menu was in this context so I needed to dumb it down a bit. I said “do you see the small Apple symbol at the top-left of the screen?”. No, she didn’t see that. I said “what’s at the top-left of your screen?”. She replied “some coloured dots”.
OK, that was progress: there are coloured dots at the top-left of each window. So I said “no, at the top-left of the whole screen, not just the window” (of course, I knew for sure she wouldn’t know what a window was but I’m eternally optimistic). No, there was nothing there according to her.
So it sounded like the program was in full-screen mode but this was an older machine with a system which didn’t support that and, even if it did, the coloured buttons wouldn’t be visible then.
So I said “what happens when you move the mouse pointer to the top of the screen?” Apparently it disappeared. Now, I’m fairly sure it didn’t disappear – more likely she just briefly lost sight of it – but what could I say?
So I was thinking about what I could do next when she said “what does Finder mean?” I asked her where she saw that and she said near the top-left of the screen. I asked “to the right of the Apple symbol” and she replied “yes”. I asked “what did you do to make that appear?” and she didn’t know. She thought it must have been there all along.
At this point a career in – well just about anything except IT – seemed like a good idea, but at least I knew which program was currently active so I could proceed to the next step. But I really wonder to this day what it was she was looking at when I asked her to look for the Apple symbol at the top-left of the screen. And I guess she still doesn’t know what a menu or a window actually is!
Other users seem to take a long time to do basic things, like select from a menu. When it takes a minute to choose an item from a menu I get worried that maybe the user is really doing something else.
Here’s an example: I asked a user if he could see the Apple menu. Yes he could see that fine (obviously this guy is like Alan Turing compared with the previous user). So I said “click on that menu and choose System Preferences”. I waited a few seconds then said “now…”. But he interrupted me: “wait”. OK, I waited, then I said “OK now?”. He replied “No, just wait a minute, you’re going too fast”. At this stage I wondered if this guy was erasing his hard disk or writing a shell script to hack into NASA in the time between clicking Apple, moving down 100 pixels and clicking System Preferences, but after about 30 seconds I could continue. I still don’t know what he was doing during that time.
Finally there is the most creative and dangerous user of all: the user who actually thinks he or she knows what they are doing! This is usually bad… very bad.
I often like to explain why I am doing certain things just so that the user is reassured and can possibly learn something from the experience, but usually it’s just easier to list a series of actions and have the user repeat them on their computer.
Sometimes it becomes apparent that the user hasn’t got to the expected place, so I need to backtrack and find out what’s gone wrong. Often it is because they have taken a “shortcut” or “applied their knowledge to make things easier”. So I think: yes, well you called me because you can’t solve the problem, let’s just try it my way for a change, OK?
Some users have learned a few computer words and are keen to show them off. But their explanations, rather than clarifying the true situation, often just make things a lot worse! For example, I had one user tell me she had “pointed the font at the window and clicked the pointer but nothing happened”. Well I sort of understand all of those individual words but I can’t make a lot of sense out of how they have been combined!
So yes, computers can be bizarre and difficult to understand, but compared with users, computers are a trivial problem. At least computers make a certain amount of sense and follow some basic rules which can be understood after a few years of intense study of computer science. Users on the other hand (despite the fact that I majored in psychology as well as computer science at university) will always be a mystery to me!
I recently listened to an item which featured Steve Jobs’ first boss, from the company Atari. He thought that Jobs was an unusual and difficult person to work with, and that he might have a lot of trouble even getting a job in the modern work environment. He thinks most employers reject individuality and difficult and critical personalities in favour of people who are easier to get on with and more compliant.
Clearly Jobs was an awkward person and it’s easy to see why he might have been seen as difficult to manage, so there is an obvious reason why he might have had trouble being hired, but whose fault is that really? Sure Jobs was difficult but he was also brilliant. It seems to me that most modern personnel management policies favour people who will fit in a mould rather than do genuinely brilliant work.
Of course having an awkward personality in no way guarantees that a person is brilliant but there does seem to be a correlation between the two. It seems to make sense that people who are going to be able to make a genuinely unique contribution to a company are likely to “think different” from the rest and those people are unlikely to fit in with the standard profile most managers are looking for.
There is also the possibility, which I have discussed in the past, that managers might feel threatened by someone who would be employed in a position below themselves but might be far more capable than they are.
The ultimate example of the failure of conventional, mediocre leadership was the “bad times” at Apple. During the time when Jobs wasn’t there and the “suits” controlled the company they almost destroyed it. Apple is an exceptional case and relies on constant innovation and cutting edge design but it does make me wonder whether every company being run by suits (that is, almost all of them) is achieving well below its potential and could do so much more if they were just prepared to take on an exceptional person instead of just another one from the same old mould.
In my experience I have seen this phenomenon a lot. I see very mediocre people with no innovative ideas at all in senior roles and far more capable and original people being controlled by them. So the less brilliant people are not only enjoying the benefits of seniority themselves but they are also holding back those below them who might otherwise really achieve something.
I do recognise, especially in large organisations, that creative people do present a risk because while they might be theoretically capable of excellent original work, that might not fit in with the “bigger picture”. I also recognise that most bigger companies are very risk averse, and would generally prefer to sacrifice the possibility of a very positive new innovation if there is also a chance it could go wrong.
This problem (if it really is a problem) extends to all levels of human organisation: from national politics all the way down to small groups. Despite the claims to the contrary there is generally very little chance of anything genuinely innovative coming out of a typically organised company or other institution.
It’s difficult to say where the cause of the problem lies. It could be, as I have suggested above, that innovative people are blocked from advancement because they are seen as a risk or a threat. It could be that innovative people do get promotions but they are forced to become part of the “machine” once they do gain senior status so their ideas are wasted. And it could be that creative people just aren’t interested in politics or management. I suspect it is all three.
There is no obvious answer to the problem because the people who need to make the changes are exactly the ones who can’t see that there is a problem which needs to be solved. The best we can realistically hope for is that the power of big corporations and senior business and political leaders is kept under control. But how realistic that is, I really don’t know.
Maybe we’re all doomed to living in a world of increasing mediocrity, where people like Steve Jobs are often wasted. It certainly seems that way to me.
Don’t be concerned, there hasn’t been another nuclear accident in Japan! Actually it’s much worse than that… not really. What has happened is that the hard drive in my Mac laptop has failed and it involves a major effort to get things working again. In fact, I am writing this blog entry on my iPad because my Mac is busy reloading files. When you have a million files totalling hundreds of gigabytes of data, recovery from hardware failure is not going to be quick!
Before anyone thinks “oh those Macs are so unreliable” I should say that it is not the Apple supplied drive which has failed (although they do occasionally). It is a very expensive and very fast solid state drive which I fitted myself which is the problem. This has been working brilliantly for the last few months but has just suddenly died horribly.
Also don’t think “he’s an IT professional, why doesn’t he have a backup” because I do have several backups, but they only store my data files, so I still need to reconstruct the operating system and applications, although I also have most of those on an old disk.
The problem is my laptop is very complex. It has a massive collection of Mac and Unix programs installed which I update every day, plus virtual machines for Windows XP, Windows 7, Windows 8, Ubuntu Linux, Mac OS X Server, and Chrome OS. I challenge anyone to come up with a more complete and finely tuned collection of stuff! Oh, and I hardly ever use the Windows systems just in case you were wondering. They are mainly for testing web sites using IE, and other minor tasks like that.
Until recently I used Retrospect to backup my computers at home but I have recently switched to ChronoSync, so this will be a good test of how well that has worked. I find it interesting that many people run backup systems without ever testing a restore. This will be my first “real world” test of ChronoSync.
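A restore test doesn’t have to be elaborate either. The core idea, sketched below in shell, is just to compare checksums of the backup copies against the originals. This is only an illustration using temporary directories and a made-up file so it can run anywhere; a real test would point at the actual source and backup volumes.

```shell
#!/bin/sh
# Minimal sketch of verifying a backup by comparing checksums.
# Temp directories stand in for the real source and backup volumes.
SRC=$(mktemp -d)
BAK=$(mktemp -d)

echo "my data" > "$SRC/notes.txt"     # pretend this is a real user file
cp "$SRC/notes.txt" "$BAK/notes.txt"  # pretend this is the backup tool's copy

# A backup you have never compared against the source is unverified.
if [ "$(cksum < "$SRC/notes.txt")" = "$(cksum < "$BAK/notes.txt")" ]; then
    echo "backup verified"
else
    echo "backup MISMATCH"
fi
```

Even a spot-check like this on a handful of files catches the worst failure mode: a backup job that has been silently doing nothing for months.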
It has been a day and a half now that my laptop has been unavailable but I have managed to do a lot of what I would usually do on it using my iPhone and iPad. That is another interesting test of technology which I have now been forced into. Of course, I can’t practically do programming work on the iPad so that gives me an excuse to take a weekend off!
I am often asked if it’s practical to use an iPad as a substitute for a computer. I guess, depending on the specific requirements, in many cases it is. I really can type just as well (or just as badly because my typing has never been great) on the virtual keyboard of an iPad as I do on the real keyboard of the laptop. Plus the iPad is easier to use in different locations even though the 15 inch laptop is also fairly portable itself.
So whether this IT meltdown will take as long to recover from as Chernobyl or Fukushima I still don’t know. It is inconvenient, but it is also a learning opportunity. In the future I will run a disk clone overnight every day (or maybe once a week if it turns out to be inconvenient) as well as my existing daily backup.
I was hoping the solid state drive would be more reliable than a conventional hard disk as well as being faster. And I was hoping cloud storage would reduce the need for backups too. But maybe not. Backups are still as important as they ever were.
One of the most common frustrations of working in the modern environment is the constant cut-backs, downsizing, and budgeting which makes getting things done more difficult. These are generally justified through an argument such as: in these difficult economic times we must tighten our belts, or public (or shareholder) funds must be spent responsibly, or we must become more efficient to compete in the market. While politicians and managers love this sort of stuff, in most cases it’s all crap.
In every case I am familiar with there is a remarkable inequality and inconsistency in these “austerity” styled processes. When there are redundancies, reduced pay, or less generous conditions it very rarely affects top management. Sure, sometimes middle and lower management are reduced in numbers, but far more often it is those at the bottom who are the victims, and they are the least responsible for the organisation’s alleged difficult situation.
When cutbacks are made it is surprising how often that later it is discovered that the same organisation is wasting large amounts on worthless nonsense. My colleague Fred (not his real name but he is an IT professional in a similar job to mine) reported this phenomenon recently where the workers’ conferences were cancelled, they had great difficulty getting the equipment their job required, and they were forced to keep old equipment well past its useful lifetime, yet management spent large sums (how much they would not reveal) on pathetically childish management motivational and organisational material. Fred claims it looks like the sort of stuff that a bunch of kindergarten kids with a box of crayons could produce. I thought that was being a bit unkind to the intellect of 5 year olds!
The same thing happens at government level. Here in New Zealand there are constant claims that there just isn’t the money to carry out many projects which would otherwise be very worthwhile. Yet the same government wastes huge amounts on stuff that benefits very few. For example, research budgets are being cut here yet the government can still give big overseas corporations like Warners tens of millions of dollars in hand-outs.
I think austerity measures are doomed to failure in most cases whatever the circumstances are but at least I would be more prepared to support them if there wasn’t this obvious double standard. If I was in an organisation afflicted with this problem I would be prepared to accept cut-backs to expensive equipment purchases and conferences if that was necessary. But I would not be happy to do that and then find management has just spent a small fortune on some nonsense which some “criminal” masquerading as a “management consultant” has produced.
I would regretfully accept the necessity to keep using old equipment even past its useful lifetime, but if I then found management being given fancy new gadgets which they barely even knew how to use I would be less than impressed. This is the sort of thing which does happen regularly unfortunately, and it’s about time workers stopped just accepting it.
But there is one other factor here. It’s difficult for many people – even some who are very skilled – to get a job in the current environment. So people just tend to put up with whatever outrageous nonsense is going on even when they know it is wrong, because if they were fired or resigned they would have a lot of trouble finding another job.
And I think that is one reason why the current New Zealand government is so reluctant to do anything about unemployment. I think they actually want unemployment to be high because that gives employers a huge advantage in pay negotiations. There is also the standard dogma they have about not interfering in the markets of course, but that seems to be set aside when it comes to welfare for corporates.
With the conservatives it seems to be something like this: welfare for those who least need it, don’t interfere except when you really should, blame those who have the least responsibility, have the least accountability for those who demand the most from everyone else, and above all make sure the pain of cutbacks is inflicted on those who can least bear it.
There’s the standard recipe for modern conservative governments and modern management. Why do we put up with it? Sure beats me!
If you have read my previous blog entries you will be very aware of my lack of respect for the process of management in general and most managers in particular. In fact I have been discussing this with my colleague Fred (not his real name) again and he has made some interesting observations which certainly have some resonance for me, specifically on the topic of change management. Here are some of his more astute comments…
When he debates (I get the impression these “debates” are often more like arguments) with the management team at his place of work an accusation often made against him is that he can’t cope with change. In fact this seems to be a very common criticism of anyone who is hesitant to endorse a new way of doing things. I’m sure that in some cases it is true, because many people really don’t like change, but I think more often this is just an excuse used to try to justify new policies and procedures which really don’t have a lot of merit.
In Fred’s case for example it seems counterintuitive that he would reject change when he works in an area (computing) where constant change is the norm. Where else is it necessary to be so open to change or be left behind? So the simple accusation that he doesn’t like change in general is ridiculous. In fact what he doesn’t like is change for the worse, or change for no good reason, or change without consultation.
But the problem with most conversations between a manager and a worker is that the manager doesn’t have to justify anything they do. Generally they initially go with the old justification of “you just don’t like change” and if that fails there’s always the classic follow up of “you could always work somewhere else”.
How would the manager feel if they had a complaint about a staff member’s work and the only response they got was “you just don’t like the new way I do things” or “you could always manage someone else”. They would find that unacceptable wouldn’t they? Yet they think the staff member should accept it when they use exactly equivalent statements. If anyone doesn’t have a good way to justify their conclusions then maybe they should re-examine them.
I guess every organisation does need some ultimate way to make decisions and enforce necessary change but if that is going to happen I think it’s really important that the change be open to criticism and should be able to be defended. Most people’s experience with change in bureaucratic organisations (and that is basically every organisation) is that it is forced on staff against their will, is unsupported with any real proof that it is necessary or advantageous, and is never properly evaluated later to see if it has been successful.
So let’s look at some of the changes Fred has had to endure in recent years to get an idea of why he might be resistant to them…
His salary in real terms has gone down, allegedly because funding is decreasing, but at the same time there is always plenty of money for managers and bureaucrats. Is this the sort of change anyone should be happy with? I can imagine being unhappy but accepting of sacrifice of that sort if it was applied fairly and gross waste wasn’t obvious elsewhere, but when it’s just a cynical ploy like this why should anyone accept it?
Fred has worked at the same organisation for many years and has noticed a lot of change over that time. In general the change has been in the direction of a huge increase in administration and loss of independence. A significant part of his time is now spent filling in time sheets, completing charging forms, attending meaningless meetings, and other activities which aren’t really part of his job. Interestingly the people who forced the cost recovery system on the technical staff are never required to work that way themselves, maybe because they do nothing so would never charge out any time! Is forcing a skilled professional into doing a lot of meaningless administrivia the sort of change which he should be happy about? I don’t think so.
When Fred started work at his organisation his professional skills were quite trusted. It was assumed (quite rightly) that he knew how to solve the problems he encountered and could consult with colleagues where necessary. But over the years his work has been forced into more of a “template”. Now he has to follow policies formulated by people who have no clues at all about the real issues he encounters. Is trusting a bureaucrat’s opinion instead of a professional’s the sort of change he should be happy with? Who would be?
I’m sure Fred’s situation isn’t unique and, as I said earlier, I can identify with why he is frustrated with the types of changes that have been inflicted on him.
Finally, to emphasise my point, imagine the following hypothetical situation…
Manager: Hello Fred, I’ve asked for this meeting so we can discuss the new corporate direction and policy framework the management has been working on.
Fred: (extremely worried about what nonsense is about to ensue) Err, OK, what are these changes?
Manager: Well we have decided the professional staff should be paid more and we are going to fund that by reducing the number of administrators in our organisation. Also, we trust your professional skills so you can use our policies as a guideline but bypass them where that will get a better outcome for the client. And we also are throwing out the cost recovery system, the inefficient user pays mechanism, and we will have an administrator to do any of the remaining paper work for you.
Fred: That sounds fair, I am happy to fully cooperate with the new direction.
Manager: Excellent, I had heard you don’t like change but clearly that’s not true.
Fred: Not like change? Where would you get that idea?
But that wouldn’t happen, would it? Because change is almost never positive. It’s always the opposite of what this manager was proposing. But Fred lives in the real world where change is (almost) always bad. It’s not the change which is the problem, it’s the type of change.
I’m writing this blog entry on my iPad because I’m doing an upgrade on my laptop. Specifically I’m switching it from a conventional hard disk to a solid state drive. I have done the same for a client’s machine recently and I was very impressed with the increased performance, lower noise, less weight, decreased heat, and better battery life. I have also used computers with SSDs pre-fitted – like the new MacBook Pro – and the speed is awesome.
Sounds great, doesn’t it? But if SSDs are so great you might wonder why all computers (especially laptops) don’t use them. Well there is one problem: cost. SSDs cost about 5 times as much as an equivalent capacity hard disk, so they are generally reserved for compact laptops (like the MacBook Air) and for premium devices where cost is less of a factor.
In fact all Apple laptops now use SSDs so I hope greater production will mean the price will continue to drop, although it will be a while before conventional hard disks are completely replaced. An intermediate step is Apple’s “fusion drive” which combines a conventional disk and an SSD to give the speed of an SSD and the relatively cheap high capacity of a conventional hard disk. That doesn’t help much in a laptop though because the weight, heat, and power issues are not improved like they would be in a pure SSD.
So how does it work? As the name suggests, an SSD is a solid state device – it has no moving parts – which explains the reduced weight and power use. Because there’s no need to wait for moving parts to access different parts of the disk the speed is also much better – and I mean a lot better: in my testing programs launch, the system boots, and files open up to 5 times faster – it really is a huge improvement.
The technology in SSDs is flash memory, similar to what is in flash drives. It’s not just a simple matter of scaling that technology up though, because most flash drives are designed for a limited number of write cycles, but a hard disk replacement must be able to handle a lot more. So paying about NZ$600 for a 512 GB drive is actually quite reasonable.
I am expecting that reliability should be improved as well because the lack of moving parts should improve robustness. That is a bit of an unknown factor because SSDs haven’t been widely used for long enough yet to get an accurate idea of reliability. All I will say is that I have a box with about 100 dead hard disks in my office but no dead SSDs… yet!
I’m finishing off this entry on the laptop – after the 5 hour process of the data copying from the old disk to the new one – and I am happy to say the performance is as good as I had hoped for. I click an icon and a second later the program is ready, plus the area of my laptop above the hard disk is cool instead of warm like it used to be.
So my recommendation for anyone wanting to improve the performance of their computer is to consider an SSD – in many cases it will provide a better boost than a fast hard disk, more memory, or even a faster CPU or GPU. Of course it will depend on the specific machine, what it is used for, and how much data you have to store, but I’m convinced SSDs are the easiest way to really make a difference.
It’s common for people to defend their profession: lawyers tell us they do a useful job and that they can be trusted, police say they are just enforcing the law and keeping us safe, even used car salesmen say they provide a service which we need. And politicians? Well I suppose some professions are just indefensible!
But what about IT (information technology or computer) experts? Many people are a bit distrustful of technology and the “geeks” who support it. Most don’t understand computer technology very well and can easily be persuaded to do the wrong thing by “experts”. And many of those experts don’t actually have a particularly high level of expertise, although it is usually higher than the average user. Plus there’s the all too common problem of IT professionals charging too much simply because the customer has no idea of what might be really involved in a project.
And then there’s the (to use the technical term) major cock-ups! A large proportion of the technical/professional disasters we have heard about in the news recently have been related to computer problems. Just here in New Zealand there is the Ministry of Social Development public computer kiosk security fiasco and the Ministry of Education payroll disaster. But these are just the tip of the iceberg. Every day I come across examples of poorly designed and dysfunctional web sites, overpriced and poorly performing internet services, unnecessarily complex and unreliable software, and many other computer-related issues.
As an IT professional myself you might think I would defend my profession and I do to a certain extent. Many of the problems I know some of the details about aren’t really entirely the fault of the IT geeks who are implementing them. More often than not the problems arise because the technical expert isn’t listened to and is overridden by the client or by management, or the programmer isn’t given a proper specification for what is required to be done, or a senior management decision means that the system has to be implemented with serious and unnecessary compromises.
Interestingly, the compromises are often not related to cost. Sometimes the exact opposite is true in fact. Often management insist on using “industry standard” solutions based around Microsoft (or Oracle, or Cisco just as examples) technology which is not only slower and less secure than many alternatives, but is also a lot more expensive! Why do they do this? Because they are too ignorant to make real decisions based on the facts and instead resort to “best practice” which, as I have said on past occasions, is generally a formula for mediocrity at best and disaster at worst.
I know it is easy to be critical, and I’m sure people could find fault with my work if they looked hard enough, but the poor standards seem so unnecessary to me. I see web sites created at great cost by large teams of professionals which are far less flexible, reliable, easy to use, and elegant than stuff I have created as a single programmer. Of course we all know that sometimes a single person can do more than a team which might spend more time in meetings and creating business plans than actually creating the required solution.
Let me cite an example (I’m going to be deliberately vague here because obviously I can’t reveal any of the parties involved). There was a project I was involved with a few years back which the client initially contracted out to an IT company who put a team on to the task and created a web-based solution for about $40,000. I had quoted for about an eighth of this price. The company created the system which also required expensive hardware and hosting and it was used for a few months before it was realised it was unusable. So I was asked to look at the issue again. I ended up building something ten times better (in my opinion and according to feedback from users) at under a quarter the cost and it ran on a basic Mac which required no specialised hosting.
But why was the expensive option selected in the first place? The client threw away tens of thousands of dollars when they could have had the whole job done for a few thousand. No one could really say but I suspect it was because of the “suits”. I’m sure the representatives of the IT company came in wearing their expensive suits and gave a very professional (and meaningless) PowerPoint presentation before being awarded the contract. The client wouldn’t have known any better and many people would choose that over an individual enthusiast wearing an Apple t-shirt. But they would be wrong.
There are two types of IT professionals out there: those who work in IT because they love it and the pay is just an added bonus, and those who do it for the money and see computers as just a means to an end (making more money). Guess which I am and guess which the suits are. And guess which will almost always get the better result. But guess which usually gets the work!
I’m sometimes asked why I don’t adopt a more “professional” image and turn up at meetings in a fancy suit and with a meaningless PowerPoint presentation. Well it’s because I have too much self-respect to do that. And if the client can’t see through the fake professionalism of the others then I really don’t want them as a client anyway. Now you might be able to see why I am primarily an employee of a large organisation and only do part time private consulting! My personality is not exactly well suited to working in the business world – I’m just too honest.
So my solution for most of our IT problems, and for most of our other problems today, is this: first, get rid of all the managers; second, get rid of all the suits; and third, make PowerPoint illegal. Yeah, that should do it.
Today I listened to a Radio NZ podcast which discussed computer science. One of the topics they talked about was recursion, but in my humble opinion they didn’t explain it very well. Many years ago I did a computer science degree and I have worked as a computer consultant and programmer ever since, so I want to offer my own contribution to the topic here: recursion, and programming in general.
First I want to say why I love programming so much. To me it is the ideal combination of art and science. That suits my personality best because I am an analytical and precise person but I also like to be creative. There is no other profession that I know of which combines both of those elements in quite the same way. Writing a program involves solving a problem in an analytical way but many of the elements of creating a program also involve a lot of creativity and “beauty”.
In programming (as in many fields where beauty seems a strange word to use as a description, such as maths) beauty refers to the elegance, simplicity, and subtlety of a solution to a problem rather than any outward manifestation of the item being created. In programming there is often an opportunity to create a visual interface which can be described as beautiful, but that’s not really what I am talking about here. It’s deeper and more subtle than that.
When I write a program I don’t just try to solve the initial problem, I try to make the solution extendable, tolerant of errors, fast, compact, and easy to understand. Usually a short program is a far more impressive achievement than a long one which has the same function. And every moderately complex problem has an infinite (or so close to infinite that it doesn’t matter) number of possible solutions, some of which are elegant and beautiful and some which aren’t.
Of course there is a certain amount of subjectivity in judging how good a program is but, as in most areas of expertise, skilled programmers will generally agree on what is good and what isn’t.
Now getting back to recursion. First of all, what is it? Well it’s a way to solve a problem by creating a series of steps (what computer scientists call an algorithm) and allowing that algorithm to refer to itself. The nerdy joke in the computer world is that if you look up recursion in the dictionary the definition will include “see recursion”. There are also little “in” jokes where some languages have recursive names. For example the name of the scripting language “PHP” is a recursive acronym for “PHP: Hypertext Preprocessor”, and GNU is a recursive acronym for “GNU’s Not Unix”.
The serious example given in the podcast was an algorithm to climb stairs which went like this: (I have given the following steps the name “climb”)…
go up one step
are you at the top?
– if no: “climb”
end (of climb)
You can see in this example that the algorithm called “climb” has a step which refers to itself (the last step “climb”). But this is a bad example because it could also be done like this…
start: go up one step
are you at the top?
– if no: go back to “start”
This is what we call an iterative algorithm: it iterates or “loops” around until it stops at a particular step. Generally these are more efficient than recursive algorithms.
By the way, I realise that neither of these work properly if you start at the top of the steps already. That sort of thing is a common programming problem and one which is obvious and quite easy to fix here but often not in more complex algorithms.
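To make that fix concrete, here is a sketch of both versions in Python (the function names and the language itself are my choice, since no particular language is used here), with the “are you at the top?” check moved to the front so they also behave sensibly if you start at the top:

```python
def climb_recursive(position, top):
    """Climb one step at a time by having the algorithm refer to itself."""
    if position >= top:        # already at the top? then there is nothing to do
        return position
    position += 1              # go up one step
    return climb_recursive(position, top)   # not at the top yet: "climb" again

def climb_iterative(position, top):
    """The same task as a loop: go back to the start until at the top."""
    while position < top:      # are you at the top?
        position += 1          # if no: go up one step and loop back
    return position

print(climb_recursive(0, 5))   # 5
print(climb_iterative(5, 5))   # 5 (already at the top, so nothing happens)
```

Checking the condition before moving, rather than after, is what stops both versions climbing a step that isn’t there.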
So what about an example where recursion does make sense? There is a classic case often used in computing which involves processing a binary tree. So what is a binary tree? It’s a structure in a computer’s memory which contains information in a way which makes it easy to search, sort, and manipulate in other ways. Imagine a series of words where each word has two links to two other words (either link can also be empty). The words are put in the tree so that the left link always goes to words alphabetically before the current word, and the right link to words after.
If the first word is “computers” for example and the second word is “are”, the second word would be accessed from the left link from “computers”. If the third word was “fun” then that would go on the right link from “computers”. If the fourth word was “sometimes” it couldn’t go on the right link from “computers” because “fun” is already there, so it would go on the right link from “fun” instead (“s” comes after “f”). If the next word was “but” that would go right from “are”. Continuing the sentence with the words “can be tricky too” we would get this…
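As a sketch of how such a tree could be built (in Python, with hypothetical names like `Node` and `insert` that I have chosen for illustration), each word is compared against existing words and sent left or right until an empty link is found:

```python
class Node:
    def __init__(self, word):
        self.word = word
        self.left = None    # link to words alphabetically before this one
        self.right = None   # link to words alphabetically after this one

def insert(node, word):
    """Walk down the tree until an empty link is found, then attach the word."""
    if node is None:
        return Node(word)
    if word < node.word:
        node.left = insert(node.left, word)
    else:
        node.right = insert(node.right, word)
    return node

root = None
for word in "computers are fun sometimes but can be tricky too".split():
    root = insert(root, word)

print(root.word)               # computers
print(root.left.word)          # are ("are" comes before "computers")
print(root.right.word)         # fun
print(root.right.right.word)   # sometimes ("s" comes after "f")
print(root.left.right.word)    # but (right from "are")
```

Following the links by hand reproduces exactly the placements described above.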
Now let’s say I wanted to display the words in alphabetical order. First I make a link pointing to the top of the tree. Now I create some steps called “sort” which I give a link to the current word (initially “computers”). Here are the steps…
Algorithm “sort” using “link”:
Is there a left link at the word for the link you are given?
– if yes, “sort” with the left link first
Display the current word
Is there a right link?
– if yes, “sort” with the right link
end (of sort)
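The sort algorithm can be sketched in Python like this (again the names `Node`, `insert`, and `sort` are just my illustrative choices; here `sort` collects the words into a list rather than printing them, which makes the result easy to check):

```python
class Node:
    def __init__(self, word):
        self.word = word
        self.left = None
        self.right = None

def insert(node, word):
    """Build the tree: earlier words go left, later words go right."""
    if node is None:
        return Node(word)
    if word < node.word:
        node.left = insert(node.left, word)
    else:
        node.right = insert(node.right, word)
    return node

def sort(node):
    """In-order traversal: each call waits while the calls it makes finish."""
    if node is None:
        return []
    words = sort(node.left)      # is there a left link? "sort" with it first
    words.append(node.word)      # display (collect) the current word
    words += sort(node.right)    # is there a right link? "sort" with it
    return words

root = None
for word in "computers are fun sometimes but can be tricky too".split():
    root = insert(root, word)

print(sort(root))
# ['are', 'be', 'but', 'can', 'computers', 'fun', 'sometimes', 'too', 'tricky']
```

Each call to `sort` sits waiting for the calls it made to return, which is exactly the “sticking around until the end” behaviour described below.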
That’s it! That will display the words alphabetically. The first link points to “computers”, but there is a left link so we send that to sort, and it points to “are”. “are” has no left link, so we display “are”, then send its right link (to “but”) to sort. “but” has a left link, so we send that (to “be”) to sort. “be” has no left link, so we display “be”, and it has no right link either, so that sort ends. But the previous sort which started this one is still active, so we go back to it: that sort pointed to “but”, and we had just finished with its left link, so we display “but” and then check its right link using sort. That takes the next sort to “can”, etc…
The key thing here is that each time “sort” is used it sticks around until the end, so there can be any number of sorts waiting to continue where they left off before a “sort” further down the tree was started.
Wow, that sounds so complicated but it’s really quite simple. I did all of this from memory and it’s quite easy when you understand the concept. Without recursion, sorting a binary tree would be difficult because there is no reverse link back up the tree and no easy way to remember what has already been done at each word. With recursion, when one version of sort launches another, its own information is kept behind, creating a way back up the tree once the sort it started has finished running.
The recursive algorithm in this case is efficient and elegant. It’s also very simple because all of the complexity that might be required otherwise is available as an intrinsic part of how recursion works. It’s a simple example of “beauty” in programming.
Recently a friend who works for a large organisation has been telling me about the experiences he has had in his workplace. He has had a few disagreements with the bureaucrats he works with and seems to have found the experience quite annoying and even upsetting.
I guess these issues can be seen from many points of view, and I haven’t got all the facts, but the friend seems to be quite dedicated to offering the best service he can for his clients. Unfortunately that sometimes involves doing things in a way which is unprofessional in the opinion of some of the managers he works with.
It is just one opinion against another, but managers don’t have to be right, they just have to force their opinion on others through seniority.
Of course many people have similar experiences in their place of work and it’s easy to rant about these things in places like this blog. But I think we should look at it in perspective. I watched a news report on the situation in Syria today. Anyone who thinks a disagreement with some frivolous administrator is important in the least way when there are so many atrocities going on around the world has got his perspectives terribly messed up.
My advice to the friend was to continue to try to do his best for his clients but to at least appear to be doing things the way the bureaucrats want. If he really expects much sympathy from the rest of us after watching the reports on the violence in Syria (or the reports on famine in Western Africa; or floods, earthquakes, fires, and other disasters in various places) then, as I said above, he needs to get some perspective.
Naturally the world would be a far better place if brainless bureaucrats didn’t stifle the genuine efforts of the people who actually do the work, but in the greater scheme of things it’s really not that important. Everyone should get some perspective.
According to American author and essayist Edward Abbey, “a patriot must always be ready to defend his country against his government.” It’s one of my favourite quotes and one which I have mentioned in this blog before. I have also mentioned the idea of extending it to include defence from other power structures.
For example I would say that every employee should be prepared to defend his clients against his corporation (or company or organisation). Or every doctor should be prepared to defend his patients against his hospital or health authority.
A recent case in the news here in New Zealand is a good example. A doctor from the far north of New Zealand has resigned because of a disagreement with the authority which runs the practice he worked in. The disagreement was mainly about the doctor treating people for free when they couldn’t pay. A specific example involved him giving advice to one person who was accompanying another person he was treating.
This doctor is very highly regarded and has had very positive, effective ideas in the past, yet he was forced to leave because he didn’t fit in with the accounting requirements of the organisation. That’s disgusting. Not only was the management prepared to stop people being treated and potentially suffering a worse outcome (possibly death) in the future, but they also ignored the distinct possibility that $50 worth of free treatment early could save $5000 of intervention in the future.
Plus they ignored the moral requirements of the Hippocratic Oath, but of course managers and accountants aren’t bound by moral rules. If they were moral they probably wouldn’t be managers or accountants!
I realise that there is a possibility that giving free treatment is a privilege which might be abused. The person getting the treatment might have enough to pay but just be too greedy to, or they might spend the money they saved on medical treatment on cigarettes or alcohol instead. But I don’t think that possibility should be used as an excuse. The advantages of treatment for those who genuinely can’t afford it surely outweigh the disadvantages of possible abuse.
My thought on the subject is this: get rid of some of the bureaucrats who forced the doctor out and use the money saved by not paying them to fund some free treatment. Seems like a good idea to me! In fact I suspect that if all the layers of mindless bureaucracy were stripped away from the health system, medical treatment could be made much cheaper and the occasional free consultation would be perfectly viable.
It’s not only in the area of health that this sort of thing happens. In my own job (IT support and programming) I was forced into a “cost recovery” system many years ago and since then some people would contend that the organisation I work for has become increasingly bureaucratic and out of touch with what our clients really need. So the option then naturally arises of bypassing procedures and using “creative record keeping” when it is necessary to do the best job for clients.
The same issue has recently been discussed in relation to New Zealand’s ACC system. There has been discussion over what should be the main priority for its employees: customer service, following the specifics of the law, or following the dictates of management. The answer seemed to be a balance of all of those things which I entirely agree with.
Of course for many the standard answer would be that an employee’s obligation is to do what management wants him to do. It should be up to management to make sure that those instructions automatically lead to the best legal and service outcomes. But few people would see this as being anything but a convenient fantasy. And the equally unrealistic fantasy that problems of that sort can be solved through standard management procedures can also be rejected.
So yes, everyone should be prepared to stand up to authority to ensure he is really doing what’s right. I would go further and say that unless a person finds himself in trouble for that sort of behaviour occasionally he probably isn’t doing his job very well – unless he is very skilled at subterfuge, of course!
So continuing my theme from a few blog entries back (Who Are the Heroes? on 2012-05-31) I would say that this is another example of how the best people are the ones who refuse to play the corrupt games they find themselves involved with. If the game is unfair then just make up your own rules!
So in summary what I think is this: if you are a moral and competent person it is up to you to defend what’s important from the forces of orthodoxy, mediocrity, immorality and corruption.