I have been doing general computer support (along with a lot of other IT-related jobs, such as website design and general programming) for many years now, so I have had a lot of interesting experiences in that area. In my last blog entry I discussed some of the reasons for computer problems relating to hardware and software. Today I want to comment on (or maybe whinge about) how the user contributes to the whole situation.
I mainly work in a university which is (theoretically at least) filled with intelligent, capable people. You might assume, in this situation, that supporting their computing requirements would be fairly easy. Well no, not necessarily!
If I can actually work on the computer then there are usually few problems. With ubiquitous networking it is usually possible to take control of a computer without being physically present, and in many cases I can actually visit to tackle the problem directly. But there are many other situations where phone support is required, and that’s where things get interesting.
Recently I spent 20 minutes trying to establish whether a computer was plugged into the power or not. I know that sounds ridiculous, but think about it: even with an iMac, which has fewer cables than many other computers, there are still a few to worry about, such as keyboard, ethernet, and printer cables. Some users look behind the computer, see some cables, and assume that it is plugged into the power.
And yes, I did describe the power cable as a “quite thick (about 5 mm) grey one which probably goes through a big hole in the computer’s stand and connects in the middle”. The user saw a fairly thin green one which didn’t go through a hole and connected on the right, and thought that was close enough!
So when I try to figure out why the computer won’t start, after being told the power is connected and switched on, you can imagine my confusion. I often feel like my time is being wasted by help systems which ask if the computer is plugged in to the power. I think “of course it is, get on to the helpful suggestions” but clearly you can’t assume anything!
Sometimes users just seem to be totally blind to what is on their screen. I asked another user to open the Applications folder and tell me if “Image Capture” was there. He said it wasn’t and listed some other programs with similar names: iPhoto, iTunes, etc. I was surprised by this because Image Capture is a standard program installed with the operating system (an install which I had done earlier).
So I said “just type I, then M together quickly”. He said that it had highlighted Mail. Obviously he had typed too slowly, and instead of highlighting a program starting with “IM” it had highlighted one starting with “I” followed by another starting with “M”. So I asked him to type a bit more quickly, but before he did that he said “Oh, here it is, next to iPhoto”. Right, so it was there all along after all?
But that was nothing compared with a similar issue I wasted about 10 minutes on (these may not sound like long periods of time but if you get 2 or 3 in a row suddenly you start thinking about how you might prefer to be doing something else). All I wanted to do was confirm which program a user was currently using.
On a Mac that is quite easy because the second menu in the menu bar (which is always across the top of the screen) is the name of the program, and that menu is always to the right of the Apple menu which is at the top-left of the screen. So all the user has to do is look at the top-left of the screen, recognise the small Apple symbol and read the word immediately to its right. How hard can that be?
Well, as I said, it took 10 minutes to prise that piece of information from one user I recently worked with.
First of all I assumed a certain amount of basic knowledge on the user’s part and asked her directly which program she was in (that terminology usually works better than asking which is the “active” or “frontmost” program). She didn’t know, of course.
Right, so again assuming a certain amount of basic knowledge, I asked what the name of the second menu was. She said she had no menus. Now I really don’t know if she knew what a menu was in this context, so I needed to dumb it down a bit. I said “do you see the small Apple symbol at the top-left of the screen?” No, she didn’t see that. I said “what’s at the top-left of your screen?” She replied “some coloured dots”.
OK, that was progress: there are coloured dots at the top-left of each window. So I said “no, at the top-left of the whole screen, not just the window” (of course, I knew for sure she wouldn’t know what a window was, but I’m eternally optimistic). No, there was nothing there according to her.
So it sounded like the program was in full-screen mode but this was an older machine with a system which didn’t support that and, even if it did, the coloured buttons wouldn’t be visible then.
So I said “what happens when you move the mouse pointer to the top of the screen?” Apparently it disappeared. Now, I’m fairly sure it didn’t disappear – more likely she just briefly lost sight of it – but what could I say?
So I was thinking about what I could do next when she said “what does Finder mean?” I asked her where she saw that and she said near the top-left of the screen. I asked “to the right of the Apple symbol” and she replied “yes”. I asked “what did you do to make that appear?” and she didn’t know. She thought it must have been there all along.
At this point a career in – well just about anything except IT – seemed like a good idea, but at least I knew which program was currently active so I could proceed to the next step. But I really wonder to this day what it was she was looking at when I asked her to look for the Apple symbol at the top-left of the screen. And I guess she still doesn’t know what a menu or a window actually is!
Other users seem to take a long time to do basic things, like select from a menu. When it takes a minute to choose an item from a menu I get worried that maybe the user is really doing something else.
Here’s an example: I asked a user if he could see the Apple menu. Yes, he could see that fine (obviously this guy is like Alan Turing compared with the previous user). So I said “click on that menu and choose System Preferences”. I waited a few seconds then said “now…”. But he interrupted me: “wait”. OK, I waited, then I said “OK now?”. He replied “No, just wait a minute, you’re going too fast”. At this stage I wondered if this guy was erasing his hard disk or writing a shell script to hack into NASA in the time between clicking Apple, moving down 100 pixels, and clicking System Preferences, but after about 30 seconds I could continue. I still don’t know what he was doing during that time.
Finally there is the most creative and dangerous user of all: the user who actually thinks he or she knows what they are doing! This is usually bad… very bad.
I often like to explain why I am doing certain things just so that the user is reassured and can possibly learn something from the experience, but usually it’s just easier to list a series of actions and have the user repeat them on their computer.
Sometimes it becomes apparent that the user hasn’t got to the expected place, so I need to backtrack and find out what’s gone wrong. Often it is because they have taken a “shortcut” or “applied their knowledge to make things easier”. So I think: yes, well you called me because you can’t solve the problem, let’s just try it my way for a change, OK?
Some users have learned a few computer words and are keen to show them off. But their explanations, rather than clarifying the true situation, often just make things a lot worse! For example, I had one user tell me she had “pointed the font at the window and clicked the pointer but nothing happened”. Well I sort of understand all of those individual words but I can’t make a lot of sense out of how they have been combined!
So yes, computers can be bizarre and difficult to understand, but compared with users, computers are a trivial problem. At least computers make a certain amount of sense and follow some basic rules which can be understood after a few years of intense study of computer science. Users on the other hand (despite the fact that I majored in psychology as well as computer science at university) will always be a mystery to me!
When you are an IT consultant there are two elements of the job which you have to be aware (or maybe afraid) of: the first is the computer and all its complex, and possibly conflicting, parts; and the second is the user, generally an even more bizarre and unfathomable part of the equation.
Many people think computers give far more problems than they should, and wonder why this is so common. They compare computers with other machines which seem to have far fewer faults, and accuse computer experts of being somehow negligent for allowing these problems.
There is a certain element of truth in this criticism. Computers do seem to develop more faults than most other technology, but I would say there are several really good reasons for this which I will go through here.
First, computer technology is relatively new. The computer as a common workplace tool is only about 20 or 30 years old. Some common functions of computers, such as use of the internet by non-specialists, were developed much more recently than that. So I think there is a partial excuse in saying that computers are still being developed to be more reliable and easy to use and maintain.
Look at how good they are now compared with 10 years ago and it’s obvious considerable progress has already been made. I agree that there is still room for improvement, but how good were cars (for example) just 25 years after they were first mass produced? I would suggest they had progressed nowhere near as far as computers have in the same time.
Second, computers are extremely flexible and tend to be configured with components from a large number of different manufacturers. The computer might come from one company, the operating system from another, various drivers from a third, software from several others, and various peripherals from still others. It is almost inevitable that there will be some compatibility issues when all of these components interact.
Imagine if you bought a car from Ford, then put a Toyota engine in it, with an engine management unit from Mitsubishi and a gearbox from VW. Would you expect these parts to all work together easily? Clearly there are likely to be more issues with this approach, but that is basically what happens with many computers, especially Windows PCs. One reason Macs tend to give far fewer problems is that Apple provides more of the components (hardware, OS, drivers, and some of the software), which are therefore more likely to work together in harmony.
Third, people tend to fine-tune and customise their computers far more than other technology. Traditionally computers have been completely open to reconfiguration by the owner, which has provided clear benefits, but a lot of problems as well. There are exceptions to this approach, for example the iPhone which is a much more closed system, but one where software conflicts, security issues, and crashes are almost unheard of.
To use my car analogy again, it would be like the owner being able to change the way components of the car worked. Would we be surprised if a car stopped working after the owner started fine tuning the engine management system, or disabled the cooling system, for example?
Finally, there has been a trend (especially in the past, though not so much today) for major software and hardware companies to engage in a battle with their competitors to see who can create the computer with the best specs or the program with the most features, without worrying too much about how relevant those specs are to the average user, or how well those features work together.
We now have programs like Microsoft Word which can do almost everything but do many of those things really badly, and we have stuff like Flash which comes from the distant past and has just been added to in an attempt to keep it relevant. Both approaches result in more functionality in theory but less useful functionality in practice.
So when all of these difficulties are considered it’s fairly impressive that computers are as good as they are. It’s not clear which direction these trends will go in future. Apple is making its systems more closed which makes them more reliable, secure and consistent, but also less flexible and configurable. Microsoft is also reducing some of the flexibility of the past. But Google seems determined to offer maximum openness in its Android OS (and that has always been the case with Linux).
As I said above, it’s not clear which is the best approach, but it does seem to me that every other area of technology has become more closed off to the user (few people can service their cars now for example) while becoming more reliable and sophisticated. If that trend also applies to IT then maybe Google is taking the wrong approach.
I think computers will continue to become more reliable and less prone to the problems we still get today, but even now I think a lot of progress has been made. Considering what we ask of them, modern computers (even PCs) are remarkably problem-free.
I seem to have spent a lot of time describing the first difficult element in computer support. The user is in many ways a far more fascinating topic but that will have to wait for another entry.
I’m writing this blog entry on my iPad because I’m doing an upgrade on my laptop. Specifically I’m switching it from a conventional hard disk to a solid state drive (SSD). I recently did the same for a client’s machine and was very impressed with the increased performance, lower noise, reduced weight, decreased heat, and better battery life. I have also used computers with SSDs pre-fitted – like the new MacBook Pro – and the speed is awesome.
Sounds great, doesn’t it? But if SSDs are so great you might wonder why all computers (especially laptops) don’t use them. Well, there is one problem: cost. SSDs cost about 5 times as much as a hard disk of equivalent capacity, so they tend to be used only in compact laptops (like the MacBook Air) and in premium devices where cost is less of a factor.
In fact all Apple laptops now use SSDs, so I hope greater production volume will mean the price continues to drop, although it will be a while before conventional hard disks are completely replaced. An intermediate step is Apple’s “Fusion Drive”, which combines a conventional disk and an SSD to give the speed of an SSD with the relatively cheap high capacity of a conventional hard disk. That doesn’t help much in a laptop, though, because the weight, heat, and power issues are not improved as they would be with a pure SSD.
So how does it work? As the name suggests, an SSD is a solid state device – it has no moving parts – which explains the reduced weight and power use. Because there’s no need to wait for moving parts to access different areas of the disk, the speed is also much better – and I mean a lot better: in my testing, programs launch, the system boots, and files open up to 5 times faster. It really is a huge improvement.
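If you want to put a rough number on the difference on your own machine, here is the sort of crude Python timer you could use – a sketch only, with a file name and size that are just my choices, not anything official. Use a file bigger than your installed RAM, otherwise the read-back may be served from the disk cache and you will be measuring memory speed rather than the drive.

import os, time

TEST_FILE = "ssd_test.bin"   # arbitrary name; created in the current folder
SIZE_MB = 1024               # make this larger than your RAM for a fair result

# Write the test file in 1 MB chunks.
chunk = os.urandom(1024 * 1024)
with open(TEST_FILE, "wb") as f:
    for _ in range(SIZE_MB):
        f.write(chunk)

# Time a sequential read of the whole file.
start = time.time()
with open(TEST_FILE, "rb") as f:
    while f.read(1024 * 1024):
        pass
elapsed = time.time() - start

print(f"Read {SIZE_MB} MB in {elapsed:.1f} s ({SIZE_MB / elapsed:.0f} MB/s)")
os.remove(TEST_FILE)

Run it once on the old disk and once on the SSD and the raw sequential numbers alone usually tell the story, even before you factor in the much bigger difference in seek times.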
The technology in SSDs is flash memory, similar to what is in flash drives. It’s not just a simple matter of scaling that technology up though, because most flash drives are designed for a limited number of read/write cycles, whereas a hard disk replacement must be able to handle a lot more. So paying about NZ$600 for a 512 GB drive is actually quite reasonable.
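To put the cost argument into numbers, here is a back-of-the-envelope comparison; the hard disk price below is just my rough assumption, chosen to match the “about 5 times” figure above, not a quote.

# Rough cost per GB; the hard disk price is an assumption, not a quote,
# picked to match the "about 5 times" figure mentioned above.
capacity_gb = 512
ssd_price_nzd = 600
hdd_price_nzd = 120

print(f"SSD: NZ${ssd_price_nzd / capacity_gb:.2f} per GB")  # about NZ$1.17
print(f"HDD: NZ${hdd_price_nzd / capacity_gb:.2f} per GB")  # about NZ$0.23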
I am expecting that reliability should be improved as well because the lack of moving parts should improve robustness. That is a bit of an unknown factor because SSDs haven’t been widely used for long enough yet to get an accurate idea of reliability. All I will say is that I have a box with about 100 dead hard disks in my office but no dead SSDs… yet!
I’m finishing off this entry on the laptop – after the five hour process of copying the data from the old disk to the new one – and I am happy to say the performance is as good as I had hoped for. I click an icon and a second later the program is ready, plus the area of my laptop above the hard disk is cool instead of warm like it used to be.
So my recommendation for anyone wanting to improve the performance of their computer is to consider an SSD – in many cases it will provide a better boost than a fast hard disk, more memory, or even a faster CPU or GPU. Of course it will depend on the specific machine, what it is used for, and how much data you have to store, but I’m convinced SSDs are the easiest way to really make a difference.
Where is the world of computers going in the next few years? If I, or anyone else, knew that, they could become quite rich as an investor, or maybe extremely well known as a future technology commentator. But the fact is that experts are often hardly better than anyone else at predicting where technology trends will go, and IT must be the most prominent example of this.
But despite what I said above I am going to offer a few observations on current trends and where I think they will lead. I guess I can check back on this blog in a few years and see how close I got to reality but until then my guess is as good as any other, and better than most!
The general trend will be for more people to use tablets and smart phones and for traditional computers to become less relevant. That is clear to most of us already so it’s hardly a brilliant revelation. I think Android phones and tablets will become the basic unit that most people use, just like Windows is now on computers. Apple’s iOS devices – the iPhone and iPad – will be the premium devices, just like the Mac is now in computers. And Microsoft won’t be very relevant.
Microsoft seem to think they can take a device which is really a poorly designed laptop, add a few tablet-like features, and call it a tablet or hybrid device. I don’t think this approach works. To use a Surface tablet to its strengths, people would end up using it just like a laptop: seated at a desk, with a physical keyboard, in landscape mode, and avoiding the touch features.
So why have a Microsoft tablet at all? Why not just buy a compact laptop? There is no reason at all but Microsoft are just so intent on maintaining their existing advantage that they can’t move on. Well those who don’t move on disappear and it’s obvious this is already happening to Microsoft. They never have innovated, they’ve always been followers, but they have succeeded anyway because of historical factors. That won’t happen again. Goodbye Microsoft. You won’t be missed!
And Windows 8 hasn’t exactly been an outrageous success. Sure, a lot of people have switched to it, but they are the people trapped in the old paradigm, and that market is shrinking every year. Already there are many bad reviews of Windows 8 (after all, it’s just Windows 7 with some confusing coloured tiles thrown on top with no real thought to integrated design), plus a critical security flaw has already been patched and it was hacked the day it was released.
I admit I haven’t used Windows 8 very much, but I have had it on a virtual machine on my Mac laptop since a month before most of my PC-using friends got it. When I did use it I just found it confusing and pointless.
What about Android? Again I have only used Android devices for short periods of time so I don’t claim to be an expert. My conclusion, based on this short period of trial, would be that Android is quite good but it’s nowhere near as intuitive, consistent, smooth, or trouble-free as iOS.
So Android is a bit like what Windows used to be: a fairly decent system for the majority of people who want something fairly cheap and who don’t demand the ultimate in security, elegance, or reliability. That does sound quite condescending, doesn’t it? Typical Apple fan-boy stuff. But I think I can defend that point.
There are many advantages to Android over iOS (just like there are advantages to Windows over OS X) but I don’t think they outweigh the disadvantages. The advantages include that Android runs on a wide range of hardware, it runs on cheaper devices, it’s more open, and it’s more configurable. But those same points are also disadvantages: because the Android system isn’t as closely tied to the hardware, consistency, security, and reliability inevitably suffer. It’s exactly the same situation as occurred with Windows (notice how I refer to Windows in the past tense, as if it just doesn’t matter any more).
So in 5 years time here’s where we’ll be with IT: most people will use tablets and smartphones and most of those will run Android. Apple’s devices will be a significant factor at the top end of the market. In computers Microsoft will still have the biggest share but Apple will make steady gains. Linux will be even less relevant on the desktop, but increasingly relevant in supercomputing and servers.
And new super-compact devices, such as phones with built-in projectors and devices which project an image directly onto the user’s eye, will begin to appear.
Everything will connect to the internet wirelessly: appliances, vehicles, everything. And the internet will increase its importance as the major source of information, including books, TV, radio, and news.
None of the above is particularly outrageous but I am only predicting the future in 5 years. If I was thinking ahead 10 or more years instead I might be tempted to contemplate far more dramatic changes. But those are the ones which no one ever predicts so why should I try?
All I can say is that working in IT is great. There’s always something new and interesting happening. Whatever happens it will be a wild ride!
The Mac’s current operating system, Mac OS X, has been quite successful since its introduction about 10 years ago. It has powered a wide range of Mac computers, from single core G3 PowerPC machines all the way up to modern 12 core Intel machines. And iOS, a variation of Mac OS X (well not technically, but based on the same core technology at least), has been powering the iPhone and iPad for years too.
But while Mac OS X is far more reliable and capable than the systems which preceded it, the way that users interact with it isn’t that much different. As Mac OS X (now OS X) becomes more mature it’s natural to wonder what will come next. What will Apple give us in OS XI?
I think I see where they are going. A conspicuous trend with the latest iteration of the system, Mountain Lion, has been to split bigger programs into smaller, single function apps. For example Mail no longer handles notes or RSS feeds, and iCal doesn’t handle reminders any more. Other programs perform these functions instead, so instead of a few big programs there are now many small ones.
This can be convenient because it gives the user the ability to choose which program to use for a single function instead of being locked in to a single program like Microsoft’s Outlook which does email, calendars, notes, address books, and reminders. Because the programs know how to communicate with each other most of the convenience and interoperability of a single big program is maintained. And several smaller programs are generally easier to use, more reliable, and faster than one big program which tries to do everything.
But there are disadvantages as well. For example, it can be inconvenient to swap between programs to get to different functions.
So what is the answer? I think Apple are heading towards component software, in a similar way to what they tried to do with OpenDoc back in the 90s. With OpenDoc the user could create their own program by mixing small components. The interface was centred around the document, and whatever software components were required could be combined to create a single complex document which, in a conventional system, would have to be assembled from many smaller parts.
At the time the operating system and hardware weren’t really up to the task and OpenDoc failed, but what about now?
The object architecture of OS X is a natural fit for this approach. Already there are system components, such as WebKit (the Mac’s built-in web engine), which can be used by programmers. Why not extend that idea so that users can combine higher-level components to make their own programs?
When I create web databases and apps I usually have a web browser, a text editing program, a PDF viewer for documentation, and a graphics program open. I would like to create my own web projects using a tool I design myself by mixing my favourite apps which have those abilities. Not only would the whole thing run inside a single window, but all the components could freely exchange information. As I typed the name of a PHP function, for example, the PDF program would show the syntax for that function.
Some of this functionality is already available in monolithic tools but I already have a text editor and other programs I want to use. Why can’t I link up my existing programs to make them work together more smoothly?
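Just to make the idea concrete, here is a toy sketch in Python. The names and the message bus are entirely my own invention – nothing to do with OpenDoc or any real Apple framework – but it shows how loosely coupled components could exchange information, with a “documentation viewer” reacting to what a “text editor” publishes.

from collections import defaultdict

class Bus:
    """Minimal publish/subscribe hub shared by all the components."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self.subscribers[topic]:
            handler(payload)

class EditorComponent:
    """Stands in for my text editor: announces each word as it is typed."""
    def __init__(self, bus):
        self.bus = bus

    def type_word(self, word):
        self.bus.publish("word-typed", word)

class DocViewerComponent:
    """Stands in for the PDF/documentation viewer: shows syntax help."""
    SYNTAX = {"strlen": "int strlen ( string $string )"}  # tiny fake lookup table

    def __init__(self, bus):
        bus.subscribe("word-typed", self.show_help)

    def show_help(self, word):
        if word in self.SYNTAX:
            print(f"Documentation pane: {self.SYNTAX[word]}")

bus = Bus()
editor = EditorComponent(bus)
viewer = DocViewerComponent(bus)
editor.type_word("strlen")  # the doc viewer reacts without the editor knowing it exists

The point is that the editor never needs to know the documentation viewer exists; the components only have to agree on the messages they exchange, which is roughly what a user-assembled program would need.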
I discussed this idea a few years back with a fairly senior Apple engineer and he seemed skeptical so maybe there are good reasons it can’t be done, or maybe Apple just think it’s a bad idea from a user perspective. Maybe they don’t want their users to have too much control. That certainly seems to be the case superficially.
But that is where I think OS XI should go. Eventually I would like only one app on my whole computer which I created by mixing components I like to use. It’s quite a neat idea and if Apple want to use it in a future OS, hey they can have it for free! I just want to be able to create that sort of environment on my next Mac (and iPhone and iPad).
Apple like to push the boundaries of technology. As an Apple “fan boy” I would say that but I think even Apple’s detractors would grudgingly admit that it is often the first to introduce new stuff and, more often, remove old stuff which is no longer necessary – or at least is in the process of becoming less necessary.
One of Apple’s strengths is knowing what not to put into its new machines. The Apple experience is often defined as much by what you don’t get as what you do get. And the lack of certain functionality can seem painful initially but in the end Apple is always right. I can’t think of any situation where Apple has removed a function and then had to re-introduce it later.
The classic example is the floppy drive. There was a period of about a year when not having a floppy was a nuisance but who misses them now? Flash drives are so much faster, higher capacity, smaller, and more reliable. I’m sure that the demise of the floppy on Macs accelerated the move to flash media, both on Macs and PCs.
Sometimes the discontinued feature is software rather than hardware. The most conspicuous example recently was the dropping of the “Rosetta” environment. This allowed Intel based Macs to run software designed for the previous processor Apple used: the PowerPC. It was a remarkable piece of software engineering because it was fast and almost invisible to the user: even quite complex and sometimes fairly flakey older games ran well in Rosetta.
Now it’s gone, and that will force anyone who still (after all these years) hasn’t converted their programs to run on Intel processors to make that update. No matter how good Rosetta was, programs running that way still weren’t as good as those running directly on the Intel processor.
I think Apple does engage in a little bit of “social engineering” when it does this stuff. It does force people to make a break with the past and to move on. It’s one reason why the Mac environment is so much cleaner and more reliable than Windows: there’s just less legacy code to support. In some ways I admire the way Microsoft has continued to support old stuff: it’s both good and bad, and just a different approach to Apple’s.
Sometimes changes can cause a bit of consternation for users and support staff. I recently had trouble diagnosing why one of the latest MacBook Pros wouldn’t run properly in 32 bit mode. It would crash after a short time running but was fine in 64 bit mode. Unfortunately I needed to start in 32 bit mode to support an old piece of network software our infrastructure here required.
After Googling the problem a bit and finding nothing meaningful I called Apple support. Even they were initially stumped but after going away and discussing it for a short time the support person got back to me and said “we don’t recommend running the new machines in 32 bit mode”. Apparently this is due to performance enhancements related to the new processor and how it does memory access.
It did seem odd though that 32 bit mode wasn’t totally blocked. You can start that way but the machine will die horribly after 30 seconds. So Apple don’t recommend having your machine die before you can get anything done? Why not avoid confusion and just force the machine to start in 64 bit mode every time?
Anyway it means that the odd piece of 32 bit software will now need to be upgraded to 64 bit. And that will be Intel only of course! More social engineering from Apple maybe?
It’s interesting why some of the changes are necessary. The latest “retina display” MacBook Pro is so thin that there is physically no space for an ethernet port on it … or an optical drive either … or a Firewire port! Plus the power connector has been changed to make it fit. On the other hand it does have USB3 and Thunderbolt so all that stuff can be connected at high speed with the right adapter. And do we really need CDs, DVDs or wired networks any more? I can’t remember the last time I used any of that old stuff!
I’m sure that in a few years PCs will do the same things that Apple are doing now. In the past that has been the case: to see what a PC will be like in 5 years look at a Mac now. Actually I don’t know if that’s necessarily true, most of the PC laptops I’ve seen are more like 10 years behind Apple, not just 5!
Apple does take risks and there could easily be a negative user reaction to the loss of all of these previously useful technologies. But that doesn’t seem to happen. Apple really does have an amazing ability to know what its users want before even the users themselves know, and more significantly it knows what the user doesn’t need any more!
How is Apple going after the unfortunate loss of Steve Jobs about 6 months ago now? Well by all indicators I have seen, it seems to be coping really well. I’m not trying to diminish the lasting legacy of Jobs in any way but it does seem that maybe this time Apple can transcend the loss of its inspirational leader and move on without losing its own special culture.
We should remember that a lot of what is happening now (the new iPad, improvements to laptops making them lighter and faster, better integration between iDevices and Macs) was planned by Jobs in the past. Apple do plan a long way ahead so it might be more reasonable to evaluate their progress without Jobs in another year or two.
Andrew Grove said that “only the paranoid survive”. I think he could be right but I don’t know whether paranoia is really the state of mind that describes the situation at Apple. Paranoia usually refers to persecution from the outside or to an exaggerated sense of your own importance. I don’t think Apple suffer from that. They are very aware that the greatest danger comes from within and I’m sure that after the disaster of the 90s they are very aware of the company’s potential for total failure.
This danger from within could take several forms…
First, Apple could become another Microsoft and settle into a state of protecting its existing assets without doing anything genuinely innovative. Sure, we all know Microsoft claim they are innovating but do they really? (there are exceptions: the new Windows Phone OS is quite good and their Kinect technology is impressive but they bought that from another company). Apple really do innovate and aren’t scared to threaten an existing product by introducing a new one (notice how the iPod is in decline after the introduction of the iPhone, for example).
Another problem could be Apple becoming another IBM. It could become excessively corporatised and lose the unique culture which has made it so successful. I’m not saying IBM hasn’t done great things, it clearly has, but I get the impression its bureaucracy prevents it from really using some of the brilliant things its research division produces.
And we also don’t want them to become like companies which rely too much on focus groups (I couldn’t find a specific example, probably because no one wanted to admit to it!) Apple have always “told people what they want” instead of reacting to what the user thinks they want. How could any company produce innovative products by listening to users? As I have said before, if Henry Ford had done that he would have made a faster horse!
Finally Apple could become more like Google and try to do too much. Of course Google does great things, but it has had a lot of failures too, maybe because it tries to do too much. Apple are heading that way too, although currently all of their product lines are successful. It might be difficult in future to be a technology leader in computers, phones, tablets, cloud services, media sales, app distribution, software, operating systems, and retail stores. When you consider what they are currently doing so well it really is quite amazing that they are at or near the front in all of these areas.
Tim Cook seems to be a good CEO. The best CEOs are either those with genuine vision who can really inspire talented people to make brilliant products, or good administrators who can provide an environment where those with real talent can excel. Jobs was probably the first type and Cook the second. Either way Apple should avoid the error of having a CEO who mistakenly believes he has any real talent in the areas the company is working in. That is a guaranteed disaster just waiting to happen.
So things look good for Apple at this point. I hope they do succeed, not just because I am an Apple consultant and programmer but because there aren’t a lot of other companies working in the computing field which are doing anything useful. Everyone needs Apple to succeed, even if they don’t even use their products.
Note: This blog entry was written in mid January but somehow failed to get posted. Since I went to all the trouble of writing it I thought it was still worth presenting here, even if it is a bit late…
After a break of about three weeks over Christmas I am back at work today. That is always a challenge but today has been particularly interesting! Before I even arrived I got a cell phone call from someone experiencing problems with Microsoft Word hanging – I know, that hardly ever happens! (sarcasm)
And now I have just finished setting up a PC (I am a Mac specialist and hate PCs) which was hideously slow and yes, Internet Explorer (which I only ran for 30 seconds) crashed, followed by the printer installation program hanging. So it’s got to get better from here.
One positive note is that I am spending the last three days of the week in Auckland working on some computers up there, so that will be a nice break (although still work).
I do find that, even though I am a Mac consultant and programmer, most of my problems are caused by Microsoft products. In all the time I have used Pages it has never crashed on me, whereas I almost expect Word to crash several times if I’m working on a document of any complexity. Of course, I don’t use Word myself unless I really have to, but I do support a lot of people who do.
And I get more issues with Exchange based email than any other type. Again Microsoft products destroy the elegance and reliability of the Mac experience. I do avoid using Exchange for email as much as possible but that isn’t always totally practical.
I often wonder how much better the world would be if Microsoft hadn’t wormed its way into the dominant position it occupies now. What would have happened if we had genuine innovation going on? Look at the progress in areas not dominated by Microsoft (tablets, cell phones, etc) and things seem a lot healthier than in the world of PCs.
Not that I should complain too much because, as I intimated above, I refuse to use Microsoft junk unless I am really backed into a corner. Generally that means I am helping out a client who has to use a Microsoft program for some reason.
The most common reasons for using Microsoft software are interesting. In my experience the most common is “that’s what everyone else uses”, closely followed by “that’s what I was given” and “I didn’t know there was an alternative”.
Licensing agreements sometimes make using Office almost compulsory. If an organisation pays a large sum every year for a site license, it makes sense to use that licensed software, even though it might be more economical overall if there were a genuine choice.
There are genuine reasons to use it too. The “everyone else uses it” argument is valid because sharing documents with someone using a different program does introduce an extra layer of complexity. And I will admit that Microsoft programs do tend to be fairly feature-rich (even if they do a lot of those things badly), which is an important factor for some people.
So it is a tough one. I am convinced that if Microsoft hadn’t reached the monopoly position it now holds we would all be better off. But now that it is there, in many ways it is just easier for most people to accept the inevitable and use its products, despite how frustrating an experience that is – especially for someone who has seen how much better similar software from other companies usually is.
Today I finished the (600 page) biography of Steve Jobs. I should have spent more time working on my programming projects but I found the book so well written and so full of interesting details that I couldn’t stop reading it. And an ironic aspect of this was that it was a real, paper book. This is the first “real” book I have read for a while. Most of my reading now is in the form of audio books on my iPhone and eBooks on my iPad (two of Steve Jobs’ last great creations).
Most people agree that Jobs was a genius. He wasn’t a genius in the same way that Einstein was – having exceptional intelligence – but he had a unique combination of artistic and engineering skill, amazing intuition about how his devices should work, and an unstoppable ambition to make Apple the greatest technology company ever.
His professional life seemed to have two stages: the early years which were full of disasters, ridiculous fiascos, and crazy decisions which made Apple a huge success story and then almost destroyed it a few years later; and the later years when he returned to Apple when the company just produced one exceptional product after another: the iMac, Mac OS X, the iPod, the iPhone, the iPad, the iTunes store, and the Apple stores.
How did he do it? I don’t know, but there were elements of genius, elements of luck, and maybe most important: elements of pure determination and persistence. Apple became the world’s most important and biggest (by some measures) technology company under his leadership. Could anyone else have done it? I doubt it. Will Apple be able to continue succeeding now that Jobs is gone? Only if they remember the lessons he has given them.
There was one big reason Apple did so well while other great technology companies gradually sank into oblivion (I’m thinking about HP, IBM, Microsoft, and RIM, but I’m sure there are many others). That reason was that Apple did not follow the standard corporate business model. It was better than that.
I guess the number one edict of the standard model is: profits first. Given the huge profits Apple makes you might think that philosophy applies there too, but I don’t think it does. Jobs himself said so and I think his actions showed that. Jobs’ philosophy was “products first”. He created products people wanted and were prepared to pay for. Because of the success of the products, profit naturally followed.
Another unusual aspect of Apple’s strategy was that it didn’t give its customers what they asked for. Jobs said that if Henry Ford had given his customers what they asked for, he would have given them a faster horse; instead he gave them what they really wanted (even though they didn’t know it). Of course it’s easy to create something you think people might want, but only Jobs consistently created new products which genuinely are so highly loved by their users.
Apple does no product testing, it does no focus groups, it doesn’t do PowerPoint presentations, and it’s very suspicious of the advice and opinions of experts in business, management and marketing. Many of these are contrary to “best practice” and that’s why other companies don’t succeed like Apple does: they just follow the crowd. And many of them have the audacity to claim they are innovators or entrepreneurs. What a joke!
For Apple to continue to succeed it must continue valuing the opinion of the engineers and artists (people like Jony Ive) who have the real talent. The CEO is currently Tim Cook who has a degree in industrial engineering (that has to be good) and an MBA (that is very, very bad). Let’s just hope he forgets all the crap he learnt when he did that MBA!
Being a computer consultant and programmer provides its fair share of challenges. First, there is the temperamental nature of some computers, then there is the constantly changing nature of the IT world, and then there is the ultimate challenge: the users!
I work almost entirely with Macs so I’m not exposed to the same level of troublesome behaviour that my PC colleagues have to put up with. I’m not necessarily saying Macs are totally free from odd and unexplained problems (they certainly aren’t) but Apple’s control over the hardware, operating system, and some of the software means that most Mac systems suffer less from bizarre behaviour than Windows PCs.
The constant change in the computer world can be seen as both its greatest challenge and as its greatest attraction. Having new technologies appearing so quickly does make working in IT interesting but it also makes it hard to keep up. Supporting whole new technology areas, such as iPads and the extremely capable smart phones we now have, is a challenge but would we really want to do without these cool new toys?
And then there’s the users. Few people realise how difficult it can be to support some computer users. It’s not so bad if you have direct access to the computer in need of your intervention, or even if you have screen sharing or terminal access to it, but trying to support computer users by “remote control” over the phone is probably the ultimate exercise in frustration!
It’s not just computers where this happens, because other forms of technology can suffer from similar problems. A friend recently described an experience she had trying to explain, over the phone, how to change the settings on a new TV, for example. And it’s probably significant that TVs (along with almost everything else) are actually controlled by small computers, and their on-screen control systems suffer from similar issues to conventional computers.
Ironically it was easier in the “old days” when the primary way to control a computer was through a command-line interface. Asking someone to type a command like “cd /” is often easier than asking them to find the icon for the hard disk and double-click on it. Issues with the “visual” approach include: is the HD icon visible? what does it look like? what is it called? can the user double-click at the correct speed? what view is the hard disk window set to? (and, no doubt, many more). And yes, I know you can control modern computers through a command-line (I love the Mac terminal) but explaining how to launch that can be a major process in itself!
I sometimes wonder what users are thinking. These aren’t stupid people, but when it comes to working on their computer they can do some odd things. Here are a few examples which illustrate the problem…
First there’s the phenomenon of inappropriate use of terminology. A user I was trying to help once told me something like “I pointed my font at the box and clicked but the mouse didn’t appear.” Say what? I recognise all of those words but I have no idea what they mean in that context!
Then there are the users who just can’t respond appropriately when asked a question. I once asked a user “Is the Finder at the front? You can tell because the first menu at the top-left (next to the Apple) is called Finder.” I was assured it was, so it was then safe to say “go to the Go menu and choose Connect to Server”. But there was no Go menu. That was odd. So I tried a new approach. I said “press command-K” and was informed “it just beeped”. Stranger still! Anyway, after a while I said: look at the top-left of the screen and read out what it says. The response was “an Apple symbol, then Mail, then…” What? Did you say the second word was Mail? I thought it said Finder? Who knows what the explanation for that slight inconsistency was. It’s still a mystery!
Many users can’t describe real physical objects much better. Recently I was trying to find out what type of computer a person had. She said it was something like a Mac 72. A Mac 72? What is that? The closest thing I could think of was a Power Mac 7200, but that was from the distant past. Anyway, it turned out it had a built-in screen, was quite heavy, and didn’t have a CD drive. That didn’t seem to fit anything either, but then the name “eMac” came up. So I showed this person an old eMac waiting to be recycled and was assured hers was like that, except blue and with no CD drive. When I finally saw the computer it was a white iMac with a CD drive. And one other thing: the person wanted to replace the old machine because it had no ethernet to connect to a broadband router. Except, of course, all iMacs have ethernet built in! Another mystery!
So, as you can see, working with users is a real treat. It’s like a game where they try to deceive you as much as possible and it’s your job to help them despite their best efforts to stop you from doing so. It’s great fun and I really enjoy it when I finally see through the deception and the truth is fully revealed!