The story so far.... In 1975, Ed Roberts invented the Altair personal computer. It was a pain to use until 19 year-old pre-billionaire Bill Gates wrote the first personal computer language. Still, the public didn't care. Then two young hackers -- Steve Jobs and Steve Wozniak -- built the Apple computer to impress their friends. We were all impressed and Apple was a stunning success. By 1980, the PC market was worth a billion dollars. Now, view on.....
We are nerds.
Most of the people in the industry were young because the guys who had any real experience were too smart to get involved in all these crazy little machines.
It really wasn't that we were going to build billion dollar businesses. We were having a good time.
I thought this was the most fun you could possibly have with your clothes on.
When the personal computer was invented twenty years ago it was just that - an invention - it wasn't a business. These were hobbyists who built these machines and wrote this software to have fun, but that has really changed, and now this is a business, this is a big business. It just goes to show you that people can be bought.

How the personal computer industry grew from zero to 100 million units is an amazing story. And it wasn't just those early funky companies of nerds and hackers, like Apple, that made it happen. It took the intervention of a company that was trusted by the corporate world. Big business wasn't interested in the personal computer. In the boardrooms of corporate America a computer still meant something the size of a room that cost at least a hundred thousand dollars. Executives would brag that my mainframe is bigger than your mainframe. The idea of a $2,000 computer that sat on your desk in a plastic box was laughable - that is, until that plastic box had three letters stamped on it: IBM.

IBM was, and is, an American business phenomenon. Over 60 years, Tom Watson and his son, Tom Jr., built what their workers called Big Blue into the top computer company in the world. But IBM made mainframe computers for large companies, not personal computers - at least not yet. For the PC to be taken seriously by big business, the nerds of Silicon Valley had to meet the suits of corporate America. IBM never fired anyone, requiring only undying loyalty to the company and a strict dress code. IBM hired conservative hard-workers straight from school. Few IBMers were at the summer of love. Their turn-ons were giant mainframes and corporate responsibility. They worked nine to five and on Saturdays washed the car. This is intergalactic HQ for IBM - the largest computer company in the world...but in many ways IBM is really more a country than it is a company.
It has hundreds of thousands of citizens, it has a bureaucracy, it has an entire culture everything in fact but an army. OK Sam we're ready to visit IBM country, obviously we're dressed for the part. Now when you were in sales training in 1959 for IBM did you sing company songs?
Former IBM Executive
BOB: Well just to get us in the mood let's sing one right here.
SAM: You're kidding.
BOB: I have the IBM - the songs of the IBM - and we're going to try for number 74, 'Our IBM Salesmen,' sung to the tune of Jingle Bells.
Bob & Sam singing
'IBM, happy men, smiling all the way, oh what fun it is to sell our products night and day. IBM, Watson men, partners of TJ. In his service to mankind - that's why we are so gay.'
Now gay didn't mean then what it means today, remember that, OK?
BOB: Right ok let's go.
SAM: I guess that was OK.
When I started at IBM there was a dress code, that was an informal oral code of white shirts. You couldn't wear anything but a white shirt, generally with a starched collar. I remember attending my first class, and a gentleman said to me as we were entering the building, are you an IBMer, and I said yes. He had a three piece suit on, vests were in vogue, and he said could you just lift your pants leg please. I said what, and before I knew it he had lifted my pants leg and he said you're not wearing any garters. I said what?! He said your socks, they're not pulled tight to the top, you need garters. And sure enough I had to go get garters.
IBM is like Switzerland -- conservative, a little dull, yet prosperous. It has committees to verify each decision. The safety net is so big that it is hard to make a bad decision - or any decision at all. Rich Seidner, computer programmer and wannabe Paul Simon, spent twenty-five years marching in lockstep at IBM. He feels better now.
Former IBM Programmer
I mean it's like getting four hundred thousand people to agree what they want to have for lunch. You know, I mean it's just not going to happen - it's going to be lowest common denominator you know, it's going to be you know hot dogs and beans. So ahm so what are you going to do? So IBM had created this process and it absolutely made sure that quality would be preserved throughout the process, that you actually were doing what you set out to do and what you thought the customer wanted. At one point somebody kind of looked at the process to see well, you know, what's it doing and what's the overhead built into it, what they found is that it would take at least nine months to ship an empty box.
By the late seventies, even IBM had begun to notice the explosive growth of personal computer companies like Apple.
The Apple II - small, inexpensive and simple to use - the first computer.....
What's more, it was a computer business they didn't control. In 1980, IBM decided they wanted a piece of this action.
Former IBM Executive
There were suddenly tens of thousands of people buying machines of that class and they loved them. They were very happy with them and they were showing up in the engineering departments of our clients as machines that were brought in because you can't do the job on your mainframe kind of thing.
JB wanted to know why I'm doing better than all the other managers...it's no secret...I have an Apple - sure there's a big computer three flights down but it won't test my options, do my charts or edit my reports like my Apple.
The people who had gotten it were religious fanatics about them. So the concern was we were losing the hearts and minds - get me a machine to win back the hearts and minds.
In business, as in comedy, timing is everything, and time looked like it might be running out for an IBM PC. I'm visiting an IBMer who took up the challenge. In August 1979, as IBM's top management met to discuss their PC crisis, Bill Lowe ran a small lab in Boca Raton Florida.
Hello Bob nice to see you.
BOB: Nice to see you again. I tried to match the IBM dress code how did I do?
BILL: That's terrific, that's terrific.
He knew the company was in a quandary. Wait another year and the PC industry would be too big even for IBM to take on. Chairman Frank Carey turned to the department heads and said HELP!!!
Head, IBM PC Development Team 1980
He kind of said well, what should we do, and I said well, we think we know what we would like to do if we were going to proceed with our own product, and he said no, he said at IBM it would take four years and three hundred people to do anything, I mean it's just a fact of life. And I said no sir, we can provide you with a product in a year. And he abruptly ended the meeting, he said you're on, Lowe - come back in two weeks and tell me what you need.
An IBM product in a year! Ridiculous! Down in the basement Bill still has the plan. To save time, instead of building a computer from scratch, they would buy components off the shelf and assemble them -- what in IBM speak was called 'open architecture.' IBM never did this. Two weeks later Bill proposed his heresy to the Chairman.
And frankly this is it. The key decisions were to go with an open architecture, non IBM technology, non IBM software, non IBM sales and non IBM service. And we probably spent a full half of the presentation carrying the corporate management committee into this concept. Because this was a new concept for IBM at that point.
BOB: Was it a hard sell?
BILL: Mr. Carey bought it. And as result of him buying it, we got through it.
With the backing of the chairman, Bill and his team then set out to break all the IBM rules and go for a record.
We'll put it in the IBM section.
Once IBM decided to do a personal computer and to do it in a year - they couldn't really design anything, they just had to slap it together, so that's what we'll do. You have a central processing unit and eh let's see you need a monitor or display and a keyboard. OK a PC, except it's not, there's something missing.

Time for the Cringely crash course in elementary computing. A PC is a boxful of electronic switches, a piece of hardware. It's useless until you tell it what to do. It requires a program of instructions...that's software. Every PC requires at least two essential bits of software in order to work at all. First it requires a computer language. That's what you type in to supply instructions to the computer. To tell it what to do. Remember it was a computer language called BASIC that Paul Allen and Bill Gates adapted to the Altair...the first PC. The other bit of software that's required is called an operating system, and that's the internal traffic cop that tells the computer itself how the keyboard is connected to the screen or how to store files on a floppy disk instead of just losing them when you turn off the PC at the end of the day. Operating systems tend to have boring unfriendly names like UNIX and CP/M and MS-DOS, but though they may be boring, it's an operating system that made Bill Gates the richest man in the world. And the story of how that came about is, well, pretty interesting.

So the contest begins. Who would IBM buy their software from? Let's meet the two contenders - the late Gary Kildall, then aged 39, a computer Ph.D., and a 24 year old Harvard drop-out - Bill Gates. By the time IBM came calling in 1980, Bill Gates and his small company Microsoft was the biggest provider of computer languages in the fledgling PC industry.
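The "traffic cop" role the crash course describes can be made concrete with a toy sketch. This is a loose illustration only - every name in it is invented, and it is nothing like real DOS or CP/M code - but it shows the two jobs the narration singles out: routing keystrokes to the screen, and keeping files on disk so they survive switching the machine off.

```python
# Toy illustration of an operating system as a "traffic cop".
# All class and method names here are invented for this sketch.

class ToyOS:
    def __init__(self):
        self.screen = []    # what the display currently shows
        self.memory = {}    # volatile RAM: lost at power-off
        self.disk = {}      # persistent storage: survives power-off

    def key_pressed(self, char):
        # Route keyboard input to the display.
        self.screen.append(char)

    def save_file(self, name, data):
        # Copy data out of volatile memory onto the "floppy disk".
        self.disk[name] = data

    def power_off(self):
        # The screen and RAM vanish; only the disk keeps its contents.
        self.screen = []
        self.memory = {}

os_ = ToyOS()
for c in "DIR":
    os_.key_pressed(c)
os_.save_file("REPORT.TXT", "quarterly numbers")
os_.power_off()
print(os_.disk)   # the saved file survives the power cycle
```

The point of the sketch is the division of labour: the language (BASIC) is what you type programs in, while the operating system is the standing machinery underneath that every program relies on.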
'Many different computer manufacturers are making the CPM Operating System standard on most models.'
For their operating system, though, the logical guy for the IBMers to see was Gary Kildall. He ran a company modestly called Intergalactic Digital Research. Gary had invented the PC's first operating system, called CP/M. He had already sold 600,000 copies, so he was the big cheese of operating systems.
Founder Digital Research
Speaking in 1983
In the early 70s I had a need for an operating system myself and eh it was a very natural thing to write and it turns out other people had a need for an operating system like that and so eh it was a very natural thing I wrote it for my own use and then started selling it.
In Gary's mind it was the dominant thing, and it would always be the dominant thing. Of course Bill did languages and Gary did operating systems, and he really honestly believed that would never change.
But what would change the balance of power in this young industry was the characters of the two protagonists.
Founder West Coast Computer Faire 1978
So I knew Gary back when he was an assistant professor at Monterey Post Grad School and I was simply a grad student. And went down, sat in his hot tub, smoked dope with him and thoroughly enjoyed it all, and commiserated and talked nerd stuff. He liked playing with gadgets, just like Woz did and does, just like I did and do.
He wasn't really interested in how you drive the business, he worked on projects, things that interested him.
He didn't go rushing off to the patent office and patent CPM and patent every line of code he could, he didn't try to just squeeze the last dollar out of it.
Gary was not a fighter, Gary avoided conflict, Gary hated conflict. Bill I don't think anyone could say backed away from conflict.
Nobody said future billionaires have to be nice guys. Here, at the Microsoft Museum, is a shrine to Bill's legacy. Bill Gates hardly fought his way up from the gutter. Raised in a prosperous Seattle household, his mother was a homemaker who did charity work, his father a successful lawyer. But beneath the affluence and comfort of a perfect American family, a competitive spirit ran deep.
President, The Paul Allen Group
I ended up spending Memorial Day Weekend with him out at his grandmother's house on Hood Canal. She turned everything in to a game. It was a very very very competitive environment, and if you spent the weekend there, you were part of the competition, and it didn't matter whether it was hearts or pickleball or swimming to the dock. And you know and there was always a reward for winning and there was always a penalty for losing.
CEO Corporate Computing Intl.
One time, it was funny. I went to Bill's house and he really wanted to show me his jigsaw puzzle that he was working on, and he really wanted to talk about how he did this jigsaw puzzle in like four minutes, and like on the box it said, if you're a genius you will do the jigsaw puzzle in like seven. And he was into it. He was like I can do it. And I said don't, you know, I believe you. You don't need to break it up and do it for me. You know.
Bill Gates can be so focused that the small things in life get overlooked.
Former VP, Corporate Comms, Microsoft
If he was busy he didn't bathe, he didn't change clothes. We were in New York and the demo that we had crashed the evening before the announcement, and Bill worked all night with some other engineers to fix it. Well it didn't occur to him to take ten minutes for a shower after that, it just didn't occur to him that that was important, and he badly needed a shower that day.
The scene is set. In California, laid-back Gary Kildall, already making the best-selling PC operating system, CP/M. In Seattle, Bill Gates, maker of BASIC, the best-selling PC language, but always prepared to seize an opportunity. So IBM had to choose one of these guys to write the operating system for its new personal computer. One would hit the jackpot, the other would be forgotten...a footnote in the history of the personal computer. And it all starts with a telephone call to an eighth floor office in that building - the headquarters of Microsoft in 1980.
At about noon I guess I called Bill Gates on Monday and said I would like to come out and talk with him about his products.
Bill said well, how's next week, and they said we're on an airplane, we're leaving in an hour, we'd like to be there tomorrow. Well, hallelujah. Right oh.
Steve Ballmer was a Harvard roommate of Gates. He'd just joined Microsoft and would end up its third billionaire. Back then he was the only guy in the company with business training. Both Ballmer and Gates instantly saw the importance of the IBM visit.
You know IBM was the dominant force in computing. A lot of these computer fairs discussions would get around to, you know, I.. most people thought the big computer companies wouldn't recognise the small computers, and it might be their downfall. But now to have one of the big computer companies coming in and saying at least the - the people who were visiting with us that they were going to invest in it, that - that was er, amazing.
And Bill said Steve, you'd better come to the meeting, you're the only other guy here who can wear a suit. So we figure the two of us will put on suits, we'll put on suits and we'll go to this meeting.
We got there at roughly two o'clock and we were waiting in the front, and this young fella came out to take us back to Mr. Gates office. I thought he was the office boy, and of course it was Bill. He was quite decisive, we popped out the non-disclosure agreement - the letter that said he wouldn't tell anybody we were there and that we wouldn't hear any secrets and so forth. He signed it immediately.
IBM didn't make it easy. You had to sign all these funny agreements that sort of said I...IBM could do whatever they wanted, whenever they wanted, and use your secrets however they - they felt. But so it took a little bit of faith.
Jack Sams was looking for a package from Microsoft containing both the BASIC computer language and an Operating System. But IBM hadn't done their homework.
They thought we had an operating system. Because we had this Soft Card product that had CPM on it, they thought we could licence them CPM for this new personal computer they told us they wanted to do, and we said well, no, we're not in that business.
When we discovered we didn't have - he didn't have the rights to do that and that it was not...he said but I think it's ready, I think that Gary's got it ready to go. So I said well, there's no time like the present, call up Gary.
And so Bill right there with them in the room called Gary Kildall at Digital Research and said Gary, I'm sending some guys down. They're going to be on the phone. Treat them right, they're important guys.
The men from IBM came to this Victorian house in Pacific Grove, California, headquarters of Digital Research, headed by Gary and Dorothy Kildall. Just imagine what it's like having IBM come to visit - it's like having the Queen drop by for tea, it's like having the Pope come by looking for advice, it's like a visit from God himself. And what did Gary and Dorothy do? They sent them away.
Gary had some other plans and so he said well, Dorothy will see you. So we went down the three of us...
Former Head of Language Division, Digital Research
IBM showed up with an IBM non-disclosure and Dorothy made what I...a decision which I think it's easy in retrospect to say was dumb.
We popped out our letter that said please don't tell anybody we're here, and we don't want to hear anything confidential. And she read it and said and I can't sign this.
She did what her job was, she got the lawyer to look at the nondisclosure. The lawyer, Gerry Davis, who's still in Monterey, threw up on this non-disclosure. It was uncomfortable for IBM, they weren't used to waiting. And it was an unfortunate situation - here you are in a tiny Victorian house, it's overrun with people, chaotic.
So we spent the whole day in Pacific Grove debating with them and with our attorneys and her attorneys and everybody else about whether or not she could even talk to us about talking to us, and we left.
This is the moment Digital Research dropped the ball. IBM, distinctly unimpressed with their reception, went back to Microsoft.
BOB: It seems to me that Digital Research really screwed up.
STEVE BALLMER: I think so - I think that's spot on. They made a big mistake. We referred IBM to them and they failed to execute.
Bill Gates isn't the man to give a rival a second chance. He saw the opportunity of a lifetime.
Digital research didn't seize that, and we knew it was essential, if somebody didn't do it, the project was going to fall apart.
We just got carried away and said look, we can't afford to lose the language business. That was the initial thought - we can't afford to have IBM not go forward. This is the most exciting thing that's going to happen in PCs.
And we were already out on a limb, because we had licensed them not only Basic, but Fortran, Cobol, Assembler er, Typing Tutor and Adventure. And basically every - every product the company had we had committed to do for IBM in a very short time frame.
But there was a problem. IBM needed an operating system fast and Microsoft didn't have one. What they had was a stroke of luck - the ingredient everyone needs to be a billionaire. Unbelievably, the solution was just across town. Paul Allen, Gates's programming partner since high school, had found another operating system.
There's a local company here in Seattle called Seattle Computer Products, run by a guy named Tim Patterson, and he had done an operating system, a very rudimentary operating system, that was kind of like CP/M.
And we just told IBM look, we'll go and get this operating system from this small local company, we'll take care of it, we'll fix it up, and you can still do a PC.
Tim Patterson's operating system, which saved the deal with IBM, was, well, adapted from Gary Kildall's CPM.
So I took a CPM manual that I'd gotten from the Retail Computer Store five dollars in 1976 or something, and used that as the basis for what would be the application program interface, the API for my operating system. And so using these ideas that came from different places I started in April and it was about half time for four months before I had my first working version.
This is it, the operating system Tim Patterson wrote. He called it QDOS, the Quick and Dirty Operating System. Microsoft and IBM called it PC DOS 1.0, and under any name it looks an awful lot like CP/M. On this computer here I have running PC DOS and CP/M 86, and frankly it's very hard to tell the difference between the two. The command structures are the same, so are the directories; in fact the only obvious external difference is the floppy drive is labelled A in PC DOS and C in CP/M. Some difference - and yet one generated billions in revenue and the other disappeared. As usual in the PC business the prize didn't go to the inventor but to the exploiter of the invention. In this case that wasn't Gary Kildall - it wasn't even Tim Patterson.
There was still one problem. Tim Patterson worked for Seattle Computer Products, or SCP. They still owned the rights to QDOS - rights that Microsoft had to have.
Former Vice-President Microsoft
But then we went back and said to them look, you know, we want to buy this thing, and SCP was like most little companies, you know. They always needed cash and so that was when they went in to the negotiation.
And so ended up working out a deal to buy the operating system from him for whatever usage we wanted for fifty thousand dollars.
Hey, let's pause there. To savour an historic moment.
For whatever usage we wanted for fifty thousand dollars.
It had to be the deal of the century, if not the millennium. It was certainly the deal that made Bill Gates and Paul Allen multi-billionaires and allowed Paul Allen to buy toys like these - his own NBA basketball team and arena. Microsoft bought outright for fifty thousand dollars the operating system they needed, then turned around and licensed it to the world for up to fifty dollars per PC. Think of it - one hundred million personal computers running MS DOS software, funnelling billions into Microsoft, a company that back then was fifty kids managed by a twenty-five year old who needed to wash his hair. Nice work if you can get it, and Microsoft got it.

There are no two places further apart in the USA than south eastern Florida and Washington State, where Microsoft is based. This - this is Florida, Boca Raton, and this building right here is where the IBM PC was developed. Here the nerds from Seattle joined forces with the suits of corporate America, and in that first honeymoon year they pulled off a fantastic achievement.
After we got a package in the mail from the people down in Florida...
As August 1981 approached, the deadline for the launch of the IBM Acorn, the PC industry held its breath.
Supposedly, maybe at this very moment eh, IBM is announcing the personal computer. We don't know that yet.
Software writers like Dan Bricklin, the creator of the first spreadsheet VisiCalc waited by the phones for news of the announcement. This is a moment of PC history. IBM secrecy had codenamed the PC 'The Floridian Project.' Everyone in the PC business knew IBM would change their world forever. They also knew that if their software was on the IBM PC, they would make fortunes.
Please note that the attached information is not to be disclosed prior to any public announcement. (It's on the ticker) It's on the ticker OK so now you can tell people.
What we're watching are the first few seconds of a $100 billion industry.
After years of thinking big, today IBM came up with something small. Big Blue is looking for a slice of Apple's market share. Bits and bytes mean nothing? Try this one: now they're going to sell $1,000 computers to millions of customers. I have seen the future, said one analyst, and it computes.
Today an IBM computer has reached a personal......
Nobody was ever fired for buying IBM. Now companies could put PCs with the name they trusted on desks from Wisconsin to Wall Street.
When the IBM PC came and the PC became a serious business tool, a lot of them, the first of them went into those buildings over there and that was the real ehm when the PC industry started taking off, it happened there too.
Can learn to use it with ease...
Former IBM Executive
What IBM said was it's okay corporate America for you to now start buying and using PCs. And if it's okay for corporate America, it's got to be okay for everybody.
For all the hype, the IBM PC wasn't much better than what came before. So while the IBM name could create immense demand, it took a killer application to sustain it. The killer app for the IBM PC was yet another spreadsheet. Based on Visicalc, but called Lotus 1-2-3, its creators were the first of many to get rich on IBM's success. Within a year Lotus was worth $150 million bucks. Wham! Bam! Thank you IBM!
Time to rock time for code...
IBM had forecast sales of half a million computers by 1984. In those 3 years, they sold 2 million.
Euphoric I guess is the right word. Everybody believed that they were not going to... At that point, two million or three million, you know - they were now thinking in terms of a hundred million, and they were probably off the scale in the other direction.
What did all this mean to Bill Gates, whose operating system, DOS, was at the heart of every IBM PC sold? Initially, not much, because of the deal with IBM. But it did give him a vital bridgehead to other players in the PC marketplace, which meant trouble in the long run for Big Blue.
The key to our...the structure of our deal was that IBM had no control over...over our licensing to other people. In a lesson on the computer industry in mainframes was that er, over time, people built compatible machines or clones, whatever term you want to use, and so really, the primary upside on the deal we had with IBM, because they had a fixed fee er, we got about $80,000 - we got some other money for some special work we did er, but no royalty from them. And that's the DOS and Basic as well. And so we were hoping a lot of other people would come along and do compatible machines. We were expecting that that would happen because we knew Intel wanted to vend the chip to a lot more than just than just IBM and so it was great when people did start showing up and ehm having an interest in the licence.
IBM now had fifty per cent market share and was defining what a PC meant. There were other PCs that were sorta like the IBM PC, kinda like it. But what the public wanted was IBM PCs. So to be successful, other manufacturers would have to build computers exactly like the IBM. They wanted to copy the IBM PC, to clone it. How could they do that legally? Well, welcome to the world of reverse engineering. This is what reverse engineering can get you if you do it right: the modest Aspen, Colorado ski shack of Rod Canion, one of the founders of Compaq, the company set up to compete head-on with the IBM PC. Back in 1982, Rod and three fellow engineers from Texas Instruments sketched out a computer design on a place mat at the House of Pies restaurant in Houston, Texas. They decided to manufacture and market a portable version of the IBM PC using the curious technique of reverse engineering.
Reverse engineering is figuring out after something has already been created how it ticks, what makes it work, usually for the purpose of creating something that works the same way or at least does something like the thing you're trying to reverse engineer.
Here's how you clone a PC. IBM had made it easy to copy. The microprocessor was available off the shelf from Intel and the other parts came from many sources. Only one part was IBM's alone, a vital chip that connected the hardware with the software. Called the ROM-BIOS, this was IBM's own design, protected by copyright and Big Blue's army of lawyers. Compaq had to somehow copy the chip without breaking the law.
First you have to decide how the ROM works, so what we had to do was have an engineer sit down with that code and through trial and error write a specification that said here's how the BIOS ROM needs to work. It couldn't be close it had to be exact so there was a lot of detailed testing that went on.
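The trial-and-error method Canion describes - treat IBM's ROM as a black box, write down exactly what it does, then demand the rewrite match on every case - can be sketched in a few lines. This is a hypothetical illustration of the technique only: neither function below is real BIOS code, and the arithmetic inside them is made up for the example.

```python
# Hypothetical sketch of behavioral (black-box) specification testing.
# "original" stands in for IBM's ROM BIOS; "clean_room" for the rewrite.
# Both are invented stand-ins - the point is the method, not the code.

def original(interrupt, value):       # the black box being studied
    return (value * 2 + interrupt) % 256

def clean_room(interrupt, value):     # independently written replacement
    return (interrupt + value + value) % 256

# Step 1: the "dirty" engineer probes the original, recording what it
# does for every input - that record is the written specification.
spec = {(i, v): original(i, v) for i in range(4) for v in range(256)}

# Step 2: the clean-room version must match the spec on every case -
# "it couldn't be close, it had to be exact."
mismatches = [k for k, expected in spec.items()
              if clean_room(*k) != expected]
print(len(mismatches))   # 0 - identical behavior, independent code
```

The legal trick, as the next scene with the lawyer shows, is that only the behavioral specification crosses the wall: the engineers who write the replacement never see IBM's copyrighted code.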
You test how that all-important chip behaves, and make a list of what it has to do - now it's time to meet my lawyer, Claude.
Silicon Valley Attorney
BOB: I've examined the internals of the ROM BIOS and written this book of specifications now I need some help because I've done as much as I can do, and you need to explain what's next.
CLAUDE: Well, the first thing I'm going to do is go through the book of specifications myself, but the first thing I can tell you, Robert, is that you're out of it now. You are contaminated, you are dirty. You've seen the product that's the original work of authorship, you've seen the target product, so now from here on in we're going to be working with people who are not dirty. We're going to be working with so-called virgins, who are going to be operating in the clean room.
BOB: I certainly don't qualify there.
CLAUDE: I imagine you don't. So what we're going to do is this. We're going to hire a group of engineers who have never seen the IBM ROM BIOS. They have never seen it, they have never operated it, they know nothing about it.
Claude interrogates Mark
CLAUDE: Have you ever before attempted to disassemble decompile or to in any way shape or form reverse engineer any IBM equipment?
MARK: Oh no.
CLAUDE: And have you ever tried to disassemble....
This is the Silicon Valley virginity test. And good virgins are hard to find.
CLAUDE: You understand that in the event that we discover that the information you are providing us is inaccurate you are subject to discipline by the company and that can include but not limited to termination immediately do you understand that?
MARK: Yes I do.
After the virgins are deemed intact, they are forbidden contact with the outside world while they build a new chip -- one that behaves exactly like the one in the specifications. In Compaq's case, it took 15 senior programmers several months and cost $1 million to do the reverse engineering. In November 1982, Rod Canion unveiled the result.
What I've brought today is a Compaq portable computer.
When Bill Murto, another Compaq founder, got a plug on a cable TV show, their selling point was clear - 100 percent IBM compatibility.
It turns out that all major popular software runs on the IBM personal computer or the Compaq portable computer.
Q: That extends through all software written for IBM?
A: Eh Yes.
Q: It all works on the Compaq?
The Compaq was an instant hit. In their first year, on the strength of being exactly like IBM but a little cheaper, they sold 47,000 PCs.
In our first year of sales we set an American business record. I guess maybe a world business record. Largest first year sales in history. It was a hundred and eleven million dollars.
So Rod Canion ends up in Aspen, famous for having the most expensive real estate in America and I try not to look envious while Rod tells me which executive jet he plans to buy next.
ROD: And finally I picked the Lear 31.
BOB: Oh really?
ROD: Now that was a fun airplane.
BOB: Oh yeh.
Poor Big Blue! Suddenly everybody was cashing in on IBM's success. The most obvious winner at first was Intel, maker of the PC's microprocessor chip. Intel was selling chips like hotcakes to clonemakers -- and making them smaller, quicker and cheaper. This was unheard of! What kind of an industry had Big Blue gotten themselves into?
Former Head, IBM PC Division
Things get less expensive every year. People aren't used to that in general. I mean, you buy a new car, you buy one now and four years later you go and buy one it costs more than the one you bought before. Here is this magical piece of an industry - you go buy one later it costs less and it does more. What a wonderful thing. But it causes some funny things to occur when you think about an industry. An industry where prices are coming down, where you have to sell it and use it right now, because if you wait later it's worth less.
Where Compaq led, others soon followed. IBM was now facing dozens of rivals - soon-to-be-familiar names began to appear, like AST, Northgate and Dell. It was getting spectacularly easy to build a clone. You could get everything off the shelf, including a guaranteed-virgin ROM BIOS chip. Every Tom, Dick & Bob could now make an IBM-compatible PC and take another bite out of Big Blue's business. OK, we're at Dominos Computers in Los Altos, California - Silicon Valley - and this is Yukio, and we're going to set up the Bob and Yukio Personal Computer Company making IBM PC clones. You're the expert; I, of course, brought all the money. So what is it that we're going to do?
OK first of all we need a motherboard.
BOB: What's a motherboard?
YUKIO: That's where the CPU is set in...that's the central processing unit.
YUKIO: In fact I have one right here. OK so this is the video board...
BOB: That drives the monitor.
BILL LOWE: Oh, of course. I mean we were able to sell a lot of products but it was getting difficult to make money.
YUKIO: And this is the controller card which would control the hard drive and the floppy drive.
And the way we did it was by having low overhead. IBM had low cost of product but a lot of overhead - they were a very big company.
YUKIO: Right this is a high density recorder.
BOB: So this is a hard disk drive.
And by keeping our overhead low even though our margins were low we were able to make a profit.
YUKIO: OK I have one right here.
BOB: Hey...OK we have a keyboard which plugs in right over here.
BOB: People build them themselves - how long does it take?
YUKIO: About an hour.
BOB: About an hour.
And where did every two-bit clone-maker buy his operating system? Microsoft, of course. IBM never imagined Bill Gates would sell DOS to anyone else. Who else was there? But by the mid-'80s it was boom time for Bill. The teenage entrepreneur had predicted a PC on every desk and in every home, running Microsoft software. It was actually coming true. As Microsoft mushroomed there was no way that Bill Gates could personally dominate thousands of employees, but that didn't stop him. He still had a need to be both industry titan and top programmer. So he had to come up with a whole new corporate culture for Microsoft. He had to find a way to satisfy both his adolescent need to dominate and his adult need to inspire. The average Microsoftee is male and about 25. When he's not working, well, he's always working. All his friends are Microsoft programmers too. He has no life outside the office, but all the sodas are free. From the beginning, Microsoft recruited straight out of college. They chose people who had no experience of life in other companies. In time they'd be called Microserfs.
Chief Programmer, Microsoft
It was easier to create a new culture with people who were fresh out of school rather than people who came from other companies and other cultures. You can rely on it, you can predict it, you can measure it, you can optimise it, you can make a machine out of it.
I mean everyone like lived together, ate together dated each other you know. Went to the movies together it was just you know very much a it was like a frat or a dorm.
Everybody's just push push push - is it right, is it right, do we have it right keep on it - no that's not right ugh and you're very frank about that - you loved it and it wasn't very formal and hierarchical because you were just so desirous to do the right thing and get it right. Why - it reflects Bill's personality.
And so a lot of young, I say people, but mostly it was young men, who just were out of school saw him as this incredible role model or leader, almost a guru I guess. And they could spend hours with him and he valued their contributions and there was just a wonderful camaraderie that seemed to exist between all these young men and Bill, and this strength that he has and his will and his desire to be the best and to be the winner - he is just like a cult leader, really.
As the frenzied 80's came to a close, IBM reached a watershed - they had created an open PC architecture that anyone could copy. This was intentional, but IBM always thought their inside track would keep them ahead - wrong. IBM's glacial pace and high overhead put them at a disadvantage to the leaner clone makers - everything was turning into a nightmare as IBM lost its dominant market share. So in a big gamble, they staked their PC future on a new line of computers with proprietary closed hardware and their very own operating system. It was war.
Start planning for operating system 2 today.
IBM planned to steal the market from Gates with a brand new operating system, called - drum roll please - OS/2. IBM would design OS/2, yet they asked Microsoft to write the code. Why would Microsoft help create what was intended to be the instrument of their own destruction? Because Microsoft knew IBM was the source of their success, and they would tolerate almost anything to stay close to Big Blue.
It was just part of, as we used to call it at the time, riding the bear. You just had to try to stay on the bear's back, and the bear would twist and turn and try to buck you and throw you, but darn, we were going to ride the bear, because the bear was the biggest, the most important - you just had to be with the bear, otherwise you would be under the bear in the computer industry. And IBM was the bear, and we were going to ride the back of the bear.
It's easy for people to forget how pervasive IBM's influence over this industry was. When you talked to people who've come into the industry recently, there's no way you can get that into their head - that was the environment.
The relationship between IBM and Microsoft was always a culture clash. IBMers were buttoned-up organization men. Microsoftees were obsessive hackers. With the development of OS/2 the strains really began to show.
In IBM there's a religion in software that says you have to count K-LOCs, and a K-LOC is a thousand lines of code. How big a project is it? Oh, it's sort of a 10K-LOC project. This is a 20K-LOCer. And this is 50K-LOCs. And IBM wanted to sort of make it the religion about how we got paid. How much money we made off OS/2, how much they did. How many K-LOCs did you do? And we kept trying to convince them - hey, if we have - a developer's got a good idea and he can get something done in 4K-LOCs instead of 20K-LOCs, should we make less money? Because he's made something smaller and faster, less K-LOC. K-LOCs, K-LOCs, that's the methodology. Ugh, anyway, that always makes my back just crinkle up at the thought of the whole thing.
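The K-LOC metric Ballmer describes really is this crude: add up every line of source code and divide by a thousand, with no credit for doing the same job in less. A minimal sketch (a hypothetical helper, not IBM's actual tooling) makes the point:

```python
import os

def count_klocs(root, extensions=(".c", ".h")):
    """Count thousands of lines of code (K-LOCs) under a directory tree.

    Every line counts equally, regardless of what it does -- which is
    exactly why a tighter 4K-LOC solution scores 'worse' than a bloated
    20K-LOC one under this metric.
    """
    total_lines = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(extensions):
                path = os.path.join(dirpath, name)
                with open(path, encoding="utf-8", errors="ignore") as f:
                    total_lines += sum(1 for _ in f)
    return total_lines / 1000.0
```

Paying by this number rewards volume, not quality, which is the absurdity Ballmer is complaining about.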
When I took over in '89 there was an enormous amount of resources working on OS/2, both in Microsoft and the IBM company. Bill Gates and I met on that several times. And we pretty quickly came to the conclusion together that that was not going to be a success, the way it was being managed. It was also pretty clear that the negotiating and the contracts had given most of that control to Microsoft.
It was no longer just a question of styles. There was now a clear conflict of business interest. OS/2 was planned to undermine the clone market, where DOS was still Microsoft's major money-maker. Microsoft was DOS. But Microsoft was helping develop the opposition? Bad idea. To keep DOS competitive, Gates had been pouring resources into a new program called Windows. It was designed to provide a nice user-friendly facade to boring old DOS. Selling it was another job for shy, retiring Steve Ballmer.
Steve Ballmer (Commercial)
How much do you think this advanced operating environment is worth - wait, just one minute before you answer - watch as Windows integrates Lotus 1-2-3 with Miami Vice. Now we can take this...
Just as Bill Gates saw OS/2 as a threat, IBM regarded Windows as another attempt by Microsoft to hold on to the operating system business.
We created Windows in parallel. We kept saying to IBM, hey, Windows is the way to go, graphics is the way to go, and we got virtually everyone else, enthused about Windows. So that was a divergence that we kept thinking we could get IBM to - to come around on.
It was clear that IBM had a different vision of its relationship with Microsoft than Microsoft had of its vision with IBM. Was that Microsoft's fault? You know, maybe some, but IBM's not blameless there either. So I don't view any of that as anything but just poor business on IBM's part.
Bill Gates is a very disciplined guy. He puts aside everything he wants to read and twice a year goes away for secluded reading weeks - the decisive moment in the Microsoft/IBM relationship came during just such a retreat. In front of a log fire, Bill concluded that it was no longer in Microsoft's long-term interests to blindly follow IBM. If Bill had to choose between OS/2, IBM's new operating system, and Windows, he'd choose Windows.
We said ooh, IBM's probably not going to like this. This is going to threaten OS 2. Now we told them about it, right away we told them about it, but we still did it. They didn't like it, we told em about it, we told em about it, we offered to licence it to em.
We always thought the best thing to do is to try and combine IBM promoting the software with us doing the engineering. And so it was only when they broke off communication and decided to go their own way that we thought, okay, we're on our own, and that was definitely very, very scary.
We were in a major negotiation in early 1990, right before the Windows launch. We wanted to have IBM on stage with us to launch Windows 3.0, but they wouldn't do the kind of deal that would allow us to profit it would allow them essentially to take over Windows from us, and we walked away from the deal.
Jack Sams, who started IBM's relationship with Microsoft with that first call to Bill Gates in 1980, could only look on as the partnership disintegrated.
Then at that point I think they agreed to disagree on the future progress of OS/2 and Windows. And internally we were told thou shalt not ship any more products on Windows. And about that time I got the opportunity to take early retirement, so I did.
Bill's decision by the fireplace ended the ten-year IBM/Microsoft partnership and turned IBM into an also-ran in the PC business. Did David beat Goliath? The Boca Raton, Florida birthplace of the IBM PC is deserted - a casualty of diminishing market share. Today, IBM is again what it was before - a profitable, dominant mainframe computer company. For a while IBM dominated the PC market. They legitimised the PC business, created the standards most of us now use, and introduced the PC to the corporate world. But in the end they lost out. Maybe it was to a faster, more flexible business culture. Or maybe they just threw it away. That's the view of a guy who's been competing with IBM for 20 years, Silicon Valley's most outspoken software billionaire, Larry Ellison.
I think IBM made the single worst mistake in the history of enterprise on earth.
Q: Which was?
LARRY: Which was being the first manufacturer and distributor of the Microsoft/Intel PC, which they mistakenly called the IBM PC. I mean, they were the first manufacturer and distributor of that technology. It's just simply astounding that they could basically supply a third of their market value to Intel and a third of their market value to Microsoft by accident. I mean, those two companies today are worth, you know, approaching a hundred billion dollars. Not many of us get a chance to make a $100 billion mistake.
As fast as IBM abandons its buildings, Microsoft builds new ones. In 1980 IBM was 3000 times the size of Microsoft. Though still a smaller company, today Wall Street says Microsoft is worth more. Both have faced anti-trust investigations about their monopoly positions. For years IBM defined successful American corporate culture - as a machine of ordered bureaucracy. Here in the corridors of Microsoft it's a different style, it's personal. This company - in its drive, its hunger to succeed - is a reflection of one man, its founder, Bill Gates.
Bill wanted to win. Incredible desire to win and to beat other people. At Microsoft we, the whole idea was that we would put people under, you know. Unfortunately that's happened a lot.
Computer Industry Analyst
Bill Gates is special. You wouldn't have had a Microsoft with a random other person, like Gary Kildall. On the other hand, Bill Gates was also lucky. But Bill Gates knows that, unlike a lot of other people in the industry, and he's paranoid. Every morning he gets up and he doesn't feel secure, he feels nervous about this. They're trying hard, they're not relaxing, and that's why they're so successful.
And I remember, I was talking to Bill once and I asked him what he feared, and he said that he feared growing old because you know, once you're beyond thirty, this was his belief at the time, you know once you're beyond thirty, you know, you don't have as many good ideas anymore. You're not as smart anymore.
If you just slow down a little bit who knows who it'll be, probably some company that may not even exist yet, but eh someone else can come in and take the lead.
And I said well, you know, you're going to age, it's going to happen, it's kind of inevitable, what are you going to do about it? And he said I'm just going to hire the smartest people and I'm going to surround myself with all these smart people, you know. And I thought that was kind of interesting. It was almost - it was like he was like oh, I can't be immortal, but like maybe this is the second best and I can buy that, you know.
If you miss what's happening then the same kind of thing that happened to IBM or many other companies could happen to Microsoft very easily. So no-one's got a guaranteed position in the high technology business, and the more you think about, you know, how could we move faster, what could we do better, are there good ideas out there that we should be going beyond, it's important. And I wouldn't trade places with anyone, but the reason I like my job so much is that we have to constantly stay on top of those things.
The Windows software system that ended the alliance between Microsoft and IBM pushed Gates past all his rivals. Microsoft had been working on the software for years, but it wasn't until 1990 that they finally came up with a version that not only worked properly, it blew their rivals away. And where did the idea for this software come from? Well, not from Microsoft, of course. It came from the hippies at Apple. Lights! Camera! Boot up! In 1984, they made a famous TV commercial. Apple had set out to create the first user-friendly PC just as IBM and Microsoft were starting to make a machine for businesses. When the TV commercial aired, Apple launched the Macintosh.
Glorious anniversary of the information...
The computer and the commercial were aimed directly at IBM - which the kids in Cupertino thought of as Big Brother. But Apple had targeted the wrong people. It wasn't Big Brother they should have been worrying about, it was big Bill Gates.
We are one people....
To find out why, join me for the concluding episode of Triumph of the Nerds.
...........we shall prevail.
IBM Corp. and PricewaterhouseCoopers (PWC) have separately agreed to pay the U.S. government a total of US$5.3 million to settle allegations that the companies solicited and provided improper payments on technology contracts with government agencies, the U.S. Department of Justice announced Thursday.
IBM has agreed to pay just under $3 million and PWC will pay $2.3 million to settle the complaints, the DOJ said in a news release. The allegations of improper contracting practices came from a group of whistleblower lawsuits targeting several tech vendors and systems integrators filed in U.S. District Court for the Eastern District of Arkansas in September 2004.
The lawsuits, filed by men who worked at Accenture LLP and PWC, alleged a multimillion-dollar kickback scheme involving work on numerous U.S. government contracts. Other companies named in the lawsuits include Accenture, Hewlett-Packard Co. and Sun Microsystems Inc.
IBM and PWC knowingly requested or made kickback payments under programs called "alliance benefits," shared between systems integrators and tech vendors, said the DOJ, which joined the lawsuits in April.
Both PWC and IBM denied the allegations. "IBM did not engage in any kickbacks, false claims, or any other illegal conduct in any of the claims that have been filed in this matter," said IBM spokesman Fred McNeese. "We're not going to discuss the reasons for the settlement."
PWC, in a statement, said the settlement covers the action of its consulting business, sold in October 2002 to IBM. "PWC has cooperated fully with the federal government officials conducting this investigation," said the statement from spokesman David Nestor. "PWC believes that the allegations of the complaint characterizing conduct as 'kickbacks' are completely without merit as to the firm, but chose to settle the case, without any admission of wrongdoing, in order to avoid the expense, distraction and uncertainty of litigation."
The lawsuits allege that several other prominent tech vendors received kickbacks disguised as alliance benefits. Accenture and HP in April denied wrongdoing. Sun said it was cooperating with a government investigation. Sun is "proud of what we have achieved for the American taxpayer," the company said last month.
Court documents allege a cozy relationship between systems integrators, hired by government agencies to oversee large tech projects, and tech vendors. The lawsuits accuse the companies of exchanging millions of dollars in illegal rebates and other payments since the late 1990s.
The companies also failed to disclose their conflict-of-interest relationships, the DOJ said Thursday.
"The payment of kickbacks or illegal inducements undermines the government procurement process," Peter Keisler, assistant attorney general for the DOJ's Civil Division, said in a statement. "The Justice Department is acting in these cases and in the overall investigation to protect the integrity of the procurement process for technology products and services."
The DOJ investigation into other contracts continues, the agency said.
The lawsuits were filed by Norman Rille, who had worked as a senior manager in Accenture's information delivery architecture group, and Neal Roberts, who investigated the relationship between systems integrators and IT vendors while a partner at PWC.
(Ben Ames in Boston contributed to this report.)
IBM’s annual Cost of a Data Breach Report for 2022 is packed with revelations, and as usual none of them are good news. Headlining the report is the record-setting cost of data breaches, with the global average now at $4.35 million. The report also reveals that much of that expense comes with the data breach version of “long Covid”: expenses that are realized more than a year after the attack.
Most organizations (60%) are passing these added costs on to consumers in the form of higher prices. And while 83% of organizations now report experiencing at least one data breach, only a small minority are adopting zero trust strategies.
The IBM report draws on input from 550 global organizations surveyed about the period between March 2021 and March 2022, in partnership with the Ponemon Institute.
Though the average cost of a data breach is up, it is only by about 2.6%; the average in 2021 was $4.24 million. This represents a total climb of 13% since 2020, however, reflecting the general spike in cyber crime seen during the pandemic years.
Organizations are also increasingly not opting to absorb the cost of data breaches, with the majority (60%) compensating by raising consumer prices separate from any other recent increases due to inflation or supply chain issues. The report indicates that this may be an underreported upward influence on prices of consumer goods, as 83% of organizations now say that they have been breached at least once.
Brad Hong, Customer Success Manager for Horizon3.ai, sees a potential consumer backlash on the horizon once public awareness of this practice grows: “It’s already a breach of confidence to lose the confidential data of customers, and sure there’s bound to be an organization across those surveyed who genuinely did put in the effort to protect against and curb attacks, but for those who did nothing, those who, instead of creating a disaster recovery plan, just bought cyber insurance to cover the org’s operational losses, and those who simply didn’t care enough to heed the warnings, it’s the coup de grâce to then pass the cost of breaches to the same customers who are now the victims of a data breach. I’d be curious to know what percent of the 60% of organizations who increased the price of their products and services are using the extra revenue for a war chest or to actually reinforce their security—realistically, it’s most likely just being used to fill a gap in lost revenue for shareholders’ sake post-breach. Without government regulations outlining restrictions on passing cost of breach to consumer, at the least, not without the honest & measurable efforts of a corporation as their custodian, what accountability do we all have against that one executive who didn’t want to change his/her password?”
Breach costs also have an increasingly long tail, as nearly half now come over a year after the date of the attack. The largest of these are generally fines that are levied after an investigation, and decisions or settlements in class action lawsuits. While the popular new “double extortion” approach of ransomware attacks can drive long-term costs in this way, the study finds that companies paying ransom demands to settle the problem quickly aren’t necessarily seeing a large amount of overall savings: their average breach cost drops by just $610,000.
Sanjay Raja, VP of Product with Gurucul, expands on how knock-on data breach damage can continue for years: “The follow-up attack effect, as described, is a significant problem as the playbooks and solutions provided to security operations teams are overly broad and lack the necessary context and response actions for proper remediation. For example, shutting down a user or application or adding a firewall block rule or quarantining a network segment to negate an attack is not a sustainable remediation step to protect an organization on an ongoing basis. It starts with a proper threat detection, investigation and response solution. Current SIEMs and XDR solutions lack the variety of data, telemetry and combined analytics to not only identify an attack campaign and even detect variants on previously successful attacks, but also provide the necessary context, accuracy and validation of the attack to build both a precise and complete response that can be trusted. This is an even greater challenge when current solutions cannot handle complex hybrid multi-cloud architectures leading to significant blind spots and false positives at the very start of the security analyst journey.”
In spite of over four out of five organizations now having experienced some sort of data breach, only slightly over 20% of critical infrastructure companies have moved to zero trust strategies to secure their networks. Cloud security is lagging as well, with a little under half (43%) of all respondents saying that their security practices in this area are either “early stage” or do not yet exist.
Those that have onboarded security automation and AI elements are the only group seeing massive savings: their average cost of data breach is $3.05 million lower. This particular study does not track average ransom demands, but refers to Sophos research that puts the most recent number at $812,000 globally.
The study also notes serious problems with incident response plans, especially troubling in an environment in which the average ransomware attack is now carried out in four days or less and the “time to ransom” has dropped to a matter of hours in some cases. 37% of respondents say that they do not test their incident response plans regularly. 62% say that they are understaffed to meet their cybersecurity needs, and these organizations tend to suffer over half a million more dollars in damages when they are breached.
Of course, cost of data breaches is not distributed evenly by geography or by industry type. Some are taking much bigger hits than others, reflecting trends established in prior reports. The health care industry is now absorbing a little over $10 million in damage per breach, with the average cost of data breach rising by $1 million from 2021. And companies in the United States face greater data breach costs than their counterparts around the world, at over $8 million per incident.
Shawn Surber, VP of Solutions Architecture and Strategy with Tanium, provides some insight into the unique struggles that the health care industry faces in implementing effective cybersecurity: “Healthcare continues to suffer the greatest cost of breaches but has among the lowest spend on cybersecurity of any industry, despite being deemed ‘critical infrastructure.’ The increased vulnerability of healthcare organizations to cyber threats can be traced to outdated IT systems, the lack of robust security controls, and insufficient IT staff, while valuable medical and health data— and the need to pay ransoms quickly to maintain access to that data— make healthcare targets popular and relatively easy to breach. Unlike other industries that can migrate data and sunset old systems, limited IT and security budgets at healthcare orgs make migration difficult and potentially expensive, particularly when an older system provides a small but unique function or houses data necessary for compliance or research, but still doesn’t make the cut to transition to a newer system. Hackers know these weaknesses and exploit them. Additionally, healthcare orgs haven’t sufficiently updated their security strategies and the tools that manufacturers, IT software vendors, and the FDA have made haven’t been robust enough to thwart the more sophisticated techniques of threat actors.”
Familiar incident types also lead the list of the causes of data breaches: compromised credentials (19%), followed by phishing (16%). Breaches initiated by these methods also tended to be a little more costly, at an average of $4.91 million per incident.
Though the numbers are never as neat and clean as averages would indicate, it would appear that the cost of data breaches is cut dramatically for companies that implement solid automated “deep learning” cybersecurity tools, zero trust systems and regularly tested incident response plans. Mature cloud security programs are also a substantial cost saver.
Watson is already very good at recognizing images. Drop in an image of a building and it will tell you the type of building and even what materials it's likely made of. But New York City-based design studio SOFTlab wanted to know if Watson could do more than just recognize art. Could Watson help create it?
It turns out it can. IBM is calling it the first “thinking sculpture” – an art piece that helped pick its own materials, shapes, and colors.
Antoni Gaudí was a 19th century Spanish architect whose avant-garde work has become synonymous with the look and feel of Barcelona. Inspired by naturally-occurring forms, Gaudí was known for his unique treatment of materials, including ceramics, that has given his pieces, including his most well-known work – the Sagrada Família – their distinctive look.
As MWC 2017 is being held in Barcelona this week, SOFTlab decided creating a sculpture inspired by Gaudí's work would be the perfect task to set Watson to for the event. The team at SOFTlab fed Watson a plethora of academic and artistic work around Gaudí and the city of Barcelona, including images, articles, literature, and even music - teaching it to become an expert on Gaudí and his design process. From there Watson was able to identify themes and patterns in Gaudí's work, including his use of materials, and was then able to suggest designs based on its knowledge.
“Watson was able to recognize structures, elements, and features in [Gaudí's] art and his work,” Jonas Nwuke, Manager, IBM Watson, told Design News. “It gets to the essence of an image and when it looks at another it tries to make sense of that image through what it's been taught.”
Essentially by teaching Watson the difference between two categories (i.e. an image of a Gaudí structure and a non-Gaudí structure), via its Visual Recognition API, Watson is able to learn the difference between the two. The more examples it has, the better it gets. It can then take in new images and figure out what category they belong in. The other half was performed by Watson's AlchemyLanguage API, which analyzes text and language for keywords, taxonomy, and concepts that it is taught. Again, the more text about Gaudí the system is exposed to, the better it gets at recognizing words, phrases, and even emotions associated with his work.
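The core idea described here -- learn each category from labeled examples, then assign new inputs to the closer category -- can be sketched with a tiny nearest-centroid classifier on made-up feature vectors. This is purely illustrative and is not the Watson Visual Recognition API; the feature values and labels below are invented:

```python
# Illustrative two-category learner: average the feature vectors of each
# labeled class into a centroid, then classify a new example by whichever
# centroid is nearer. More examples per class -> more stable centroids.

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(features, gaudi_examples, non_gaudi_examples):
    """Return 'gaudi' or 'non-gaudi' by nearest centroid (squared distance)."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    c_gaudi = centroid(gaudi_examples)
    c_other = centroid(non_gaudi_examples)
    return "gaudi" if sq_dist(features, c_gaudi) < sq_dist(features, c_other) else "non-gaudi"
```

Real visual recognition extracts far richer features and uses far more capable models, but the train-on-two-labeled-sets, classify-new-inputs loop is the same shape.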
While certain patterns in Gaudí's structures, such as waves and arches, would be clear to any architect well-versed in his work, Watson was also able to draw on its existing database and find less obvious connections in forms found in things like crabs, spiders, shells, and even candies. It also helped the designers with their material selection based on their criteria, helping them to arrive at the color scheme (ultramarine blue, jade green, yellow and orange) as well as the iridescent dichroic film material used throughout the sculpture.
As an added layer, the sculpture is also being fed social media data from MWC attendees via Twitter and it is able to move and reshape itself based on the emotions it reads from the tweets by utilizing Watson's Tone Analyzer API.
“As we've opened up the Watson platform for developers and makers what we found was there were some creative pursuits that presented themselves,” Nwuke said. “Our engineering team got involved in creating food recipes, music, fashion, even movie trailers ... 2017 has become the year that we are going to see what Watson can do in the architectural field.”
Nwuke said a lot of what IBM looks at when bringing Watson into the real world is very constrained. This collaboration with SOFTlab presented an opportunity to see how well Watson could be applied to a purely creative endeavor. And though this particular instance was centered around Gaudí, Nwuke added that Watson could be trained in expertise in any artist and could even be trained on multiple artists in order to mix and match influences.
The same concept could be extended into areas of design including product engineering. Perhaps a design engineer wants to create a product inspired by a certain artist, form, or even other product, or maybe they're looking to find patterns and associations in existing product designs. Watson could be taught to become an expert on a particular product and design and assist engineers in the design process, including material selection.
Nwuke pointed to another project, OmniEarth, as an example of how robust and flexible Watson's visual recognition is. OmniEarth is leveraging Watson's services to analyze satellite images for water conservation, by being able to classify irrigable, irrigated and non-irrigated areas, agricultural zones, lawns, and even swimming pools.
But the goal is not to have Watson design something, according to Nwuke. It's part of an initiative IBM is calling “augmented intelligence.” “The endgame is not to replace [architects], it's to provide a way to augment them,” Nwuke said.
Chris Wiltz is the Managing Editor of Design News.
The IBM PC spawned the basic architecture that grew into the dominant Wintel platform we know today. Once heavy, cumbersome and power-thirsty, it’s a machine that you can now emulate on a single board with a cheap commodity microcontroller. That’s thanks to work from [Fabrizio Di Vittorio], who has shared a how-to on YouTube.
The full playlist is quite something to watch, showing off a huge number of old-school PC applications and games running on the platform. There’s QBASIC, FreeDOS, Windows 3.0, and yes, of course, Flight Simulator. The latter game was actually considered somewhat of a de facto standard for PC compatibility in the 1980s, so the fact that the ESP32 can run it with [Fabrizio’s] code suggests he’s done well.
It’s amazingly complete, with the ESP32 handling everything from video and audio output to keyboard and mouse inputs. It’s a testament to the capability of modern microcontrollers that this is such a simple feat in 2021.
We’ve seen the ESP32 emulate 8-bit gaming systems before, too. If you remember [Fabrizio’s] name, it’s probably from his excellent FabGL library. Videos after the break.
Android announced its updateable, fully-integrated ML inference stack for developers to get built-in on-device inference essentials, optimal performance on all devices and a consistent API that spans Android versions.
TensorFlow Lite will be available on all devices with Google Play Services and will no longer require developers to include the runtime in their apps.
Also, automatic acceleration is a new feature in TensorFlow Lite for Android that enables per-model testing to create allowlists for specific devices, taking performance, accuracy and stability into account.
IBM announced plans to acquire BoxBoat Technologies, a DevOps consultancy and enterprise Kubernetes certified service provider.
“Our clients require a cloud architecture that allows them to operate across a traditional IT environment, private cloud and public clouds. That’s at the heart of our hybrid cloud approach,” said John Granger, senior vice president of Hybrid Cloud Services at IBM. “No cloud modernization project can succeed without a containerization strategy, and BoxBoat is at the forefront of container services innovation.”
BoxBoat will join IBM Global Business Services’ Hybrid Cloud Services business to enhance IBM’s capacity to meet rising client demand for container strategy.
Additional details are available here.
Aqua Security announced that it is acquiring the cloud security company tfsec to add infrastructure as code (IaC) security capabilities to its open-source portfolio and cloud-native security platform.
The unique approach tfsec takes to loading code ensures that one’s IaC is interpreted exactly as Terraform does, meaning that regardless of complexity, users get a comprehensive view of any vulnerabilities before deployment, according to Aqua Security.
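As a rough illustration of what an IaC scanner catches, consider a Terraform resource with an over-permissive ACL; a scanner like tfsec would typically flag this kind of misconfiguration before deployment. The resource and bucket name below are invented for the example.

```terraform
# Hypothetical Terraform snippet: an S3 bucket with a public ACL.
# An IaC scanner such as tfsec would typically flag the "public-read"
# ACL as a potential exposure before this is ever deployed.
resource "aws_s3_bucket" "example" {
  bucket = "example-data-bucket"   # invented name
  acl    = "public-read"          # flagged: bucket contents world-readable
}
```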
“Aqua Trivy has become the industry standard for open source vulnerability scanning thanks to its simple user experience and rich functionality. Now Trivy brings the same superior experience into Infrastructure as Code scanning to provide even more value to container and code scanning,” says Itay Shakury, the director of open source at Aqua Security. “By integrating tfsec and Trivy, our users can scan code repositories and container images for vulnerabilities and IaC configuration issues – all using a single tool that can integrate into their CI tool or even be used as a GitHub action.”
Devart adds new data connectivity tool
Devart added a new tool to their data connectivity product line, ODBC Driver for HubSpot, which has enterprise-level features for accessing HubSpot from ODBC-compliant reporting, analytics, BI and ETL tools.
The tool provides full support for standard ODBC API functions and data types and for all HubSpot objects and data types. It can also be connected to HubSpot directly through HTTPS or through a proxy server.
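From code, such a driver is reached through a standard ODBC connection string. The sketch below assembles one in Python; the driver name and option keywords are placeholders (check Devart's documentation for the exact ones), and actually opening the connection requires the driver to be installed plus an ODBC library such as pyodbc.

```python
# Illustrative sketch: building a DSN-less ODBC connection string.
# The driver name and option keywords below are placeholders -- consult
# the vendor's documentation for the keywords your driver expects.

def build_odbc_conn_str(driver, **options):
    """Assemble a key=value ODBC connection string."""
    parts = ["DRIVER={%s}" % driver]
    parts += ["%s=%s" % (k, v) for k, v in options.items()]
    return ";".join(parts)

conn_str = build_odbc_conn_str(
    "Devart ODBC Driver for HubSpot",   # placeholder driver name
    AuthenticationType="OAuth 2.0",     # placeholder option keyword
)
# With pyodbc installed, you would then open the connection:
#   import pyodbc
#   conn = pyodbc.connect(conn_str)
print(conn_str)
```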
“Our ODBC driver is a standalone installation file that doesn’t require the user to deploy and configure any additional software such as a database client or a vendor library. Deployment costs are reduced drastically, especially when using the silent install method with an OEM license in large organizations that have hundreds of machines,” the company stated on its website.
This week the Apache Software Foundation (ASF) saw the release of ShardingSphere ElasticJob 3.0.0. ShardingSphere is an ecosystem of distributed database solutions consisting of three independent products: JDBC, Proxy, and Sidecar (planned).
Also new this week are AntUnit 1.4.1, CloudStack 184.108.40.206 LTS, Tika 1.27, UIMA Java SDK 2.11.0, Qpid Proton 0.35.0, Dispatch 1.16.1 and more. Apache Sqoop is now retired.
Additional details on all of the new releases from the ASF are available here.
The majority of UK business leaders (81%) expect industry disruption from quantum computing by 2030, according to EY research. That’s a bold claim for a technology that’s yet to have a proven business case. The UK, though, is betting the near future will see the practical application of quantum computing in a wide range of industries.
Last year also saw the establishment of the National Quantum Computing Centre (NQCC). Commenting on EY’s research when it was published, Dr Simon Plant, deputy director for Innovation at the NQCC, said: "Quantum computing is expected to significantly speed up the time to solution for certain tasks, addressing computational problems that are currently intractable using conventional digital technologies. As a result, the pace of development is accelerating, and the question is how and when – not if – quantum computing can address industrially-relevant use cases.”
We must first quantify the claim that quantum computing will be disruptive, and define this within the context of business processes. The pharmaceutical, finance and automotive sectors, for instance, are likely to be the first adopters. Moving away from classical computing architectures could revolutionise the development of new drugs, for example, radically cutting the typical ten-year development cycle and slashing the $2 billion average cost of bringing a new drug to market.
Quantum advantage, however, is fast approaching for all businesses, according to managing director of Accenture Technology in the UK and Ireland, Maynard Williams. “The power that this new generation of machines will create will start to make these core challenges achievable, with quantum at the pinnacle of next-generation problem solving. The single biggest watershed moment for computing will be when quantum computers solve the problems that were considered quite literally intractable – making the impossible possible.”
The key to harnessing quantum computers will be to place them in context and use these machines alongside binary computers. Classical computers still have their own place in the computing landscape, and combining the two is how enterprises of every size can take full commercial advantage of the next technological wave.
“I hope the detailed quantum strategy will put a strong focus on industry skills in addition to research and development,” says Richard Hopkins, distinguished engineer and fellow of the Institute of Engineering and Technology from IBM. His company is one of many striving to achieve quantum supremacy alongside the likes of D-Wave and Google. "There's a lot of value the UK could derive by being the first to apply quantum computing to real-world industrial problems, but that will need us to invest in the associated skills."
Enthusiasm for the potential of adopting quantum computing is widespread, as research by Fujitsu and the teknowlogy Group – looking closely at how quantum computing could be applied across a diverse business landscape – found. The study saw 81% of business leaders across sectors including manufacturing, life sciences, retail, transport, and utilities saying optimising business processes can help them to tackle digital transformation challenges. It’ll also allow such businesses to remain competitive in a fast-changing market.
It’s telling that there's been a profound shift in the quantum computing discourse, with the conversation moving from theory to how to apply the technology in practice. Now that the worst of COVID-19 has passed, businesses must also be more agile and innovate at speed, which includes establishing how to adopt emerging and once-gimmicky technologies like quantum computing. Business leaders – including those at BASF and Volkswagen, which are already using quantum algorithms within their businesses – now see this new frontier for computing as a tool in the arsenal for boosting innovation.
In March this year, HSBC detailed its intention to join the IBM Quantum Accelerator programme, which would supply the bank access to the company's 127-qubit processor known as Eagle. “This technology has the potential to transform how we run areas of the bank by addressing challenges which classical computers may never be able to solve alone,” commented HSBC Bank plc and HSBC Europe chief executive officer (CEO), Colin Bell, on the announcement.
Accenture’s Maynard Williams also advises that businesses should begin their strategic planning to shape their quantum future. "The quickest action business leaders can take on the journey to quantum readiness is to begin evaluating how these technologies will shape the operations of their enterprise,” he says. “This will allow them to identify their knowledge and operational gaps, and begin filling them in before it's too late.”
The cost of investment has, until today, positioned quantum computing almost exclusively within the corporate realm. Smaller and medium-sized businesses (SMBs) and startups, however, will also be eager to tap into its potential. Is the quantum door closed to them?
Access to technology has always shaped business cases. What we are seeing in the marketplace right now is the major cloud service providers developing quantum as a service (QaaS) platforms that offer the high level of computing capability to the broadest possible audience. IBM has been offering its quantum computers through the cloud for some time, with the likes of AWS offering a rival Braket quantum cloud service and Microsoft offering Pasqal machines through Azure.
Digital Annealer, produced by Fujitsu, is another quantum-inspired technology architecture designed to help businesses solve complex combinatorial challenges beyond the capabilities of today's computers. Although not a genuine deployment of quantum computing, this technology does point the way to the quantum future many businesses see as part of their development over the next decade.
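To make "complex combinatorial challenges" concrete, here is a rough classical illustration of that problem class (not Fujitsu's algorithm): simulated annealing applied to a tiny number-partitioning problem, splitting a list into two groups whose sums are as close as possible. The numbers, cooling schedule, and step count are all invented for the example.

```python
import math
import random

# Classical simulated annealing on a toy combinatorial problem:
# partition a list of numbers into two groups minimising the
# difference between the group sums. Annealing hardware targets
# much larger instances of this class of problem.

def partition_cost(numbers, assignment):
    """Absolute difference between the two group sums."""
    a = sum(n for n, side in zip(numbers, assignment) if side)
    return abs(2 * a - sum(numbers))

def anneal_partition(numbers, steps=20000, seed=0):
    rng = random.Random(seed)
    assignment = [rng.random() < 0.5 for _ in numbers]
    cost = partition_cost(numbers, assignment)
    for step in range(steps):
        temp = max(1e-3, 1.0 - step / steps)  # linear cooling schedule
        i = rng.randrange(len(numbers))
        assignment[i] = not assignment[i]     # propose flipping one item
        new_cost = partition_cost(numbers, assignment)
        # Metropolis rule: always accept improvements; accept worse
        # moves with a temperature-dependent probability.
        if new_cost > cost and rng.random() >= math.exp((cost - new_cost) / temp):
            assignment[i] = not assignment[i]  # reject: undo the flip
        else:
            cost = new_cost
    return cost

print(anneal_partition([4, 5, 6, 7, 8]))  # prints the best difference found
```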
In some sectors in which there are many smaller businesses – notably financial services and fintech – quantum computing is seen as a potential business differentiator, and certainly a technology that could deliver commercial advantages. Despite the progress made in recent years, however, the real-world practical applications of quantum computing won’t arrive until 2030, according to McKinsey. Until then, quantum-inspired algorithms will continue to be developed and deployed in association with high performance computing (HPC) models.
Do all businesses have a quantum future? “Given the difference in the nature of this technology, organisations may wish to ‘phone a friend’ to help them develop their quantum strategy and start their initial investigations,” Deloitte’s Scott Buchholz, national emerging technology research director and CTO for government and public services tells IT Pro. “Done with the right partner, that approach will save time and speed up the learning process.”
Jan Beitner, lead data scientist at Boston Consulting Group, also advises businesses should take practical action today: "Businesses should start by examining their value chain and identifying the use cases most relevant to them. From there, they need to quantify them and, with the support of experts, work out when computers can address them. From that starting point companies can then build a roadmap to prepare for the application and integration of quantum as its potential gets unlocked by advances in the technology.”
As businesses look to re-draw their digital transformation roadmaps in the wake of COVID-19, quantum computing should be a component of that journey. As development continues, we’re not far from a breakthrough that will see quantum computers enter perhaps not the mainstream, per se, but become a transformative technology all businesses can access. The key is having a clear deployment strategy and defined business cases to harness quantum computing's unique properties.
Even as businesses and individuals seek a return to something like normalcy following the COVID-19 pandemic, ripple effects of the pandemic are here to stay. One major trend has been the accelerated pace of technology adoption by businesses. In fact, a McKinsey & Co. report found the pandemic has caused businesses to bring their digital transformations forward by seven years or more.
Changes like the normalization of hybrid workforces and work-from-anywhere arrangements have drastically shifted office culture and security practices. Simultaneously, technologies that further these practices, such as workplace virtualization and cloud computing, have expanded in popularity. These shifts have enabled greater flexibility for employees and a wider talent pool for companies, but they have also increased security concerns.
IBM’s most recent report on the subject found the average cost of a data breach for a small or midsize business in 2021 was $2.98 million, a 26.8% increase from 2020. However, IBM also found that one of the most effective methods for mitigating a breach, or at least reducing the impact and cost of one, was a zero-trust security model. [Related article: Worried About a Cyberattack? What It Could Cost Your Small Business]
With that in mind, we’ve put together this guide to explain exactly what zero trust is, the benefits of using it, and why SMBs need to focus more on their security.
The U.S. National Institute of Standards and Technology (NIST) defines zero-trust architecture (ZTA) as a security practice focused on users, assets, and resources, as opposed to a more static practice of network-based perimeter defense. Essentially, ZTA assumes that any network is already breached and that administrators should focus on authenticating and validating users.
ZTA focuses less on defending a network through older cybersecurity tools like firewalls (although this is still good practice) than on encouraging business owners and IT administrators to find ways to protect data within a network. It achieves this by continuously authenticating, authorizing, and validating user identity before granting access to any data, resources, or networks.
This shift in thinking is largely due to the way working conditions have changed. ZTA allows a business to craft more effective security policies and procedures for remote users, cloud-based systems not within a business’s own network, BYOD (bring your own device) policies, and employees or third parties potentially using other devices to access workplace resources.
Key takeaway: ZTA is an updated security framework focusing on user authentication, authorization, and validation, whether or not that user is within the business’s network. Unlike earlier security concepts, it does not assume a traditional network edge; networks can now extend into the cloud, so cybersecurity systems must as well.
As ZTA assumes any user account within a network – or the network itself – is already breached, the security model requires some changes in thinking from past security models. Per IBM’s Cost of a Data Breach Report 2021, ZTA requires rigorous use of analytics and AI systems to continuously authorize and validate user connections. This authorization and validation relies on established behavioral and environmental baselines, against which security tools can compare any suspicious activity.
For example, a network following a ZTA system establishes baselines for when an employee typically tries to access work data (behavioral) and from what locations they work (environmental). Knowing this information, a network using ZTA would be able to flag and potentially deny access attempts outside of the employee’s normal work hours or from a different state or country. This sort of continuous authorization and validation, even for resources an employee routinely has access to, can help prevent data breaches and other cyberattacks.
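The behavioural and environmental baseline check described above can be sketched in a few lines. This is a toy illustration, not any vendor's implementation; the user profile, signal names, and decision tiers are invented for the example.

```python
from datetime import time

# Toy sketch of baseline-driven access decisions: compare an access
# attempt against an employee's usual working hours (behavioural) and
# usual locations (environmental). Real ZTA products score many more
# signals; everything here is invented for illustration.

BASELINE = {
    "alice": {
        "work_hours": (time(8, 0), time(18, 0)),
        "locations": {"GB", "IE"},
    }
}

def access_decision(user, attempt_time, country):
    profile = BASELINE.get(user)
    if profile is None:
        return "deny"              # unknown users are never trusted
    start, end = profile["work_hours"]
    in_hours = start <= attempt_time <= end
    in_region = country in profile["locations"]
    if in_hours and in_region:
        return "allow"             # still subject to continuous re-validation
    if in_hours or in_region:
        return "step-up"           # one anomalous signal: require extra MFA
    return "deny"                  # both signals anomalous: flag and block

print(access_decision("alice", time(9, 30), "GB"))  # allow
print(access_decision("alice", time(2, 0), "RU"))   # deny
```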
While setting baselines is one of the key principles of ZTA, you’ll need to follow other critical tenets:
A zero-trust model is one of the easiest and most efficient ways to improve your business’s overall security.
According to cloud security company Zscaler, one of the largest benefits of adopting zero trust is limiting the impact and severity of cyberattacks. It mitigates your business’s overall risk of various cyberattacks and incidents, like supply chain attacks, ransomware attacks, and insider threats. [Brush up on common attacks and the best defenses in our guide to small business cybersecurity.]
If a cyberattack still happens, ZTA reduces the associated cost to your business. IBM’s report found that businesses with mature ZTA paid 42.3% less per breach on average than businesses that hadn’t started employing ZTA. [Has your business suffered a data breach? Learn how to mitigate the damage.]
Besides reducing your business’s potential attack area, ZTA can help you closely adhere to compliance initiatives. Zscaler noted that ZTA supports initiatives like PCI DSS for businesses that accept credit cards and NIST’s SP 800-207, which makes it easier to prove your business’s compliance during security audits.
Cybersecurity company CrowdStrike also noted that ZTA can improve visibility on a network for administrators while helping to contain potential breaches. Both of these benefits come from ZTA only allowing access to approved users after proper authentication and validation. By implementing ZTA, you can see exactly who is on your network and why, and limit them to certain resources or segments of the network.
Did you know?: Business data encryption is a key component of ZTA. According to IBM’s Cost of a Data Breach Report, strong encryption (at least AES-256) was the third most efficient means of lowering the cost of a breach in 2021. Businesses that encrypted their data had 29.4% lower data breach costs than businesses that used low-standard or no encryption.
ZTA is a critical tool for small businesses’ security, as SMBs face an overall worsening cybersecurity environment. According to the Cyber Readiness Report 2022 compiled by the insurance company Hiscox, SMBs with annual revenues of just $100,000 to $500,000 can now expect as many cyberattacks per year as businesses earning $1 million to $9 million.
These cyberattacks can have long-term implications for business. In Hiscox’s survey of IT professionals, 21% said their company’s solvency was threatened after an attack. Additionally, 22% of these businesses recorded a loss of customers, as well as greater difficulty attracting new customers. Approximately 27% of respondents also noted a negative impact on their brand and company reputation following a cyberattack.
Unfortunately, SMBs are likely to be targeted repeatedly. In Fortinet’s survey for The State of Small Business Security 2022, 71% of businesses with fewer than 500 employees confirmed or believed they’d been a victim of a cyberattack in the past year. Among companies with fewer than 25 employees, almost a quarter experienced at least four cyberattacks throughout 2021.
All sizes of SMBs reported social engineering as their top security concern, as well as the top reason they believed a security incident affected them in 2021. Fortunately, as Fortinet noted, implementation of ZTA limits the chances of success and the ultimate impact of a social engineering attack. Tools like multifactor authentication – a pillar of ZTA – can also mitigate credential theft and potential attacks.
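Multifactor codes of the kind mentioned above are commonly generated with time-based one-time passwords (TOTP, RFC 6238). The sketch below is a minimal standard-library implementation, checked against the RFC's published test vector.

```python
import base64
import hashlib
import hmac
import struct
import time

# Minimal TOTP (RFC 6238) using only the Python standard library.
# A phone authenticator app and the server both derive the same
# short-lived code from a shared secret and the current time.

def totp(secret_b32, for_time=None, digits=6, step=30):
    """Derive the one-time code for the current 30-second window."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if for_time is None else for_time) // step)
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: secret "12345678901234567890" at t=59s
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, for_time=59, digits=8))  # prints 94287082
```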
Key takeaway: According to Hiscox, the most common point of entry for a cyberattack in 2021 was a cloud server. ZTA can help prevent this sort of attack.
Whole-cloth implementation of ZTA can be difficult if your business lacks technical expertise or dedicated IT staff. In this case, it may be easiest to hire an outside IT department, like a managed service provider. Qualified MSPs should be able to install the necessary tools and configurations for ZTA within your business.
If you do have a dedicated IT team capable of establishing ZTA, the best method is to break down the transition to zero trust into small action items. Different security vendors and cybersecurity companies break down the ZTA journey into different numbers of steps. RSI Security, a leader in cybersecurity compliance, lays out seven steps businesses can take to establish ZTA, while network security company Palo Alto Networks breaks down the creation of ZTA into a five-step method.
Whichever methodology your SMB follows, you need to take a few fundamental steps for ZTA to work:
While ZTA can mitigate various security incidents, it is not foolproof and should be just one measure in your defense-in-depth strategy. Fortunately, ZTA works well with other security best practices that are easy to implement at a low cost:
The webinar outlined what AI and ML mean in today’s world and how students could get involved
Mr Subhendu Dey also laid out a comprehensive roadmap for those looking to start a career in AI and ML
Artificial Intelligence and Machine Learning as disciplines have taken the world by storm, particularly in the 21st century. While many youngsters have drawn inspiration from some of the best science fiction featuring AI and robots, the real world of AI and ML has been growing by leaps and bounds. But what does the world of AI and ML have to offer? How can you transition from campus to career with AI & ML? And how can you become an expert in AI & ML? To answer these and many other questions, The Telegraph Online Edugraph organised a webinar with Subhendu Dey, a Cloud Architect and advisor on Data and AI.
The webinar saw participants from class 8 right up to those in advanced degrees, as well as teachers. Hence, the subject matter of the webinar contained takeaways that would be relevant at all stages. Mr Dey also highlighted that he would be focusing on showing how things that have always existed around us contribute to AI - giving students a more intuitive idea of AI and making it more interesting.
The webinar started by taking a look at a simple action like sending a text. People would find that their mobiles would keep suggesting words to them. Be it as soon as they have typed a few letters or after they have typed a few words, they would get suggestions that are surprisingly accurate. This is called Language Modelling and requires an intuitive understanding of language. A human may be able to do it from his or her extensive knowledge of words and language, but in this case, it is a fine demonstration of the intuitiveness of AI.
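The next-word suggestion behaviour described above can be illustrated with a toy bigram language model: count which word most often follows each word in some text, then suggest the most frequent follower. Real keyboards use far larger n-gram or neural models, but the principle is the same; the training text is invented.

```python
from collections import Counter, defaultdict

# Toy bigram language model: learn which word most often follows
# each word, then use those counts to suggest the next word.

def train_bigrams(text):
    follows = defaultdict(Counter)
    words = text.lower().split()
    for current, nxt in zip(words, words[1:]):
        follows[current][nxt] += 1
    return follows

def suggest_next(follows, word):
    """Return the most frequent follower of `word`, or None if unseen."""
    counts = follows.get(word.lower())
    return counts.most_common(1)[0][0] if counts else None

model = train_bigrams(
    "see you soon . see you at the office . see you at lunch"
)
print(suggest_next(model, "see"))  # "you" -- it always follows "see" here
print(suggest_next(model, "you"))  # "at" -- the most frequent follower
```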
Let’s look at another aspect of AI. When we keyed a question into the Google search bar a decade or so ago, Google would have analysed the keywords and thrown up a list of links that featured those keywords. Fast-forward to this decade, and natural language search is today capable of not just matching the keywords but also working out the intent behind the query. This means that Google will, in addition to giving you the links, also give you the answer, as well as other questions that have the same or related intent. In fact, Google also has a system for taking feedback, which enables Google’s AI to learn to be even more intuitive and better at giving suggestions.
One need only look at the digital assistant - Siri, Google Assistant or Alexa - to understand the advancements in AI. From understanding spoken queries to giving intuitive, and often very witty, answers, these assistants communicate in a surprisingly human-like manner. Of course, there is a cycle of tasks that they must perform behind the scenes, which Mr Dey spoke about in detail.
While these changes that we can observe are new, AI has been around for a long time now. One of the earliest feats was in 1997, when the IBM supercomputer Deep Blue beat world chess champion Garry Kasparov in a six-game match.
Today artificial intelligence is a booming area of development and the Ministry of Electronics & Information Technology projects the addition of about 20 million jobs in the sector by 2025. In fact, this is also underscored by multiple studies and reports prepared by global firms like Deloitte and PwC and industry bodies like NASSCOM.
However, one question that has always baffled scientists and engineers working in the domain of AI is how to strike a balance between behaviour and reasoning on the one hand, and human-like (sometimes irrational) and purely rational approaches on the other, when designing Artificial Intelligence agents. It has, however, been found that more intuitive AI agents with better user-experience interfaces achieve higher penetration in human society.
Next we take a look at Machine Learning. When an AI agent learns on its own from the interactions it has, this is known as Machine Learning. When humans learn something, it registers in some form in the mind. However, machines perceive data in the form of functions and variables. With Machine Learning, AI agents create models which exist as executable software components made up of a sequence of mathematical variables and functions. Hence, becoming an expert in AI and ML usually requires a person to have a sound understanding of mathematics and statistics.
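The point about models being executable sequences of mathematical variables and functions can be made concrete with the smallest possible example: fitting a straight line by gradient descent, where "learning" is literally the repeated adjustment of two variables. The data and learning rate are invented for illustration.

```python
# A minimal "model": two variables (w, b) inside a function.
# Learning adjusts those variables to fit the data. Here plain
# gradient descent recovers y = 2x + 1 from four example points.

data = [(1.0, 3.0), (2.0, 5.0), (3.0, 7.0), (4.0, 9.0)]  # y = 2x + 1
w, b = 0.0, 0.0
lr = 0.02  # learning rate

for _ in range(5000):
    # Gradients of mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # converges towards 2.0 and 1.0
```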
Speaking of building a career in AI and ML, Mr Dey threw light on three avenues into the industry. These are:
Let’s take a look at each of these.
As a Scientist
As mentioned above, to communicate with AI, your query must be represented in a mathematical/logical format. Hence, when choosing your educational degrees or courses, go for courses that cover the following subjects which contribute to the core of AI:
Choosing a major which covers these aspects should arm you with the knowledge and skills you need to become a scientist in AI.
As an Engineer
Being a scientist is not your only option, though. AI also depends heavily on engineers to grow and develop. From the engineering perspective, here is a list of functions that need to be carried out:
As a Contributor
If you find you are not interested in being a scientist or an engineer, there are other significant ways you can contribute to AI. That could be in the following areas:
Mr Dey discusses all these avenues at length in the course of the webinar with examples. At the same time, he lays out the basic qualities that one must have - irrespective of which role one chooses to pursue. And these are creative vision, innate curiosity and perseverance.
Here are some courses that you should explore if you want to build a career in the core AI aspects:
The webinar ended with a detailed Q&A session which opened with some questions received from participants submitted at the time of registration and carried on to questions asked by participants in the course of the webinar. The Q&A covered a range of interesting subjects like:
To learn the answer to these and many more questions, watch our video recording of the live webinar.
A career in AI and ML is an excellent choice now - and this small initiative of The Telegraph Edugraph was aimed at providing the right guidance for you to make the transition from Campus to Career. Best of luck!
Last updated on 26 Jul 2022
The trophies have been lifted, the Pimm’s drunk and the dust has just about settled on Centre Court at the All England Lawn Tennis & Croquet Club in Wimbledon.
Serbian champion Novak Djokovic lifted the trophy for the seventh time on Sunday, beating Australian Nick Kyrgios to the title, while the day before saw Kazakhstani player Elena Rybakina snatch victory over Tunisian Ons Jabeur in the women’s final.
The 2022 tournament was the first full-capacity event since 2019, with Covid-19 cutting attendances in 2021 and the tournament cancelled entirely the year before. But while all eyes were on the action on court, there was a lot more going on in the background.
In the basement of the broadcast centre, tucked away behind Court 18, lies the nerve centre for a sophisticated operation that collects and analyses data from every game.
That means monitoring every shot, rally and fault from different angles, the type of shot, success rate and so on. This huge pool of data underpins the insights and statistics that are fed to the Wimbledon website and app, and supply data to the broadcasters covering the tournament.
This is where tech giant IBM carries out its operations. The company has worked with the All England Lawn Tennis Club on the championship for three decades, with 2022 marking the 33rd year of the technology partnership. Since 1990, IBM has provided technology and consulting services to bring innovation to what is viewed as a traditional organisation, using the technology IBM has developed for its business customers to help Wimbledon on its quest for digitalisation, powering its fan platforms and keeping the tournament’s systems secure.
And evidence of its success is all around. In the past few years, the tournament has seen the introduction of AI-powered predictions for matches, the power index that ranks players, recommendations and AI-powered highlight reels that are generated automatically.
It’s all in the name of fan engagement, attracting more people to the tournament through its digital platforms.
This network of rooms in the media centre, out of the heatwave that hit Wimbledon, is where the operations are co-ordinated. Banks of screens display stats and matches on all the courts being played at Wimbledon. Shots are examined. On the wall of one room is a commemoration of the fastest serve recorded by IBM at Wimbledon: 148 miles per hour, served by Taylor Dent in 2010; second to that was Andy Roddick.
If statisticians watching the games are unsure about a particular shot, they may call down here for clarification from the people behind the screens, who can replay all the points on all the courts around Wimbledon.
Over the course of the championship, IBM collects and analyses 125,000 data points, about 2,000 per match: how the ball bounces, for example, or a player’s gestures during the match. In total, it has collected more than nine million data points at Wimbledon, an astonishing amount of data that IBM has at its fingertips.
“That data is what drives the statistics that go on the screens when you watch the BBC or coverage,” says Simon Boyden, digital architect with IBM. “All of that data is coming from this kind of nerve centre.”
In 2017, IBM Watson was brought into the mix. That is IBM’s artificial intelligence for business, which helps organisations predict future outcomes and automate complex processes. Now, it can also help make predictions for matches.
“Every player ends up with a whole set of structured data about them,” said Boyden. “We have a historical archive of it. What we’ve been really focusing on over the last few years is using Watson to tap into some of the unstructured data.”
A combination of IBM Watson Discovery and IBM Cloud analyses performance data, looks through media commentary, and measures player momentum.
Last year saw the introduction of the IBM Power Index, which uses natural language processing to analyse player performance before and during the championship. The same system was used for prematch insights, a preview sheet generated by AI that compares players head to head.
This year’s big innovation is the ability in the app for fans to have their say on how they think the matches will go, and compare their thoughts with those of other fans and the AI-generated predictions supplied by IBM.
For fans, these match predictions, and the chance to weigh in with their own, have proven a firm favourite.
Although Watson hits the mark more often than not in its predictions, it is not infallible. A quick scroll through the app shows its predictions and how close to the mark they were. Although it correctly predicted the result of the men’s and women’s semi-finals that were played, the finals were a different story.
Before the players even set foot on the court, Watson had predicted a win for Kyrgios. It was the wrong result, in the end, and proof that the technology isn't infallible. The IBM system had taken a number of factors into account, including Kyrgios's previous record against Djokovic: he had won both of their two previous meetings.
In contrast, the fan predictions swung the other way, proclaiming Djokovic would be the winner.
For the women’s final the previous day, the system had predicted a win for Jabeur, but it was Rybakina who lifted the trophy.
That’s half the fun of it, though.
“Ultimately, it’s about provoking debate,” explains Boyden. “One of the things that we’ve added for this year is this idea of explainable AI. The challenge with AI is often that you have a very complicated big black box that data goes into and results come out. One of the trends we’re seeing is around explainable AI — why did the model think this?”
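One simple way to answer "why did the model think this?" is to use a model whose prediction decomposes into per-feature contributions, as a linear score does. The feature names and weights below are invented for the sketch; real explainability tooling is far more sophisticated, but the principle is the same.

```python
# Minimal "explainable AI" illustration: for a linear score, each
# feature's contribution to the prediction can be reported directly.
# Feature names and weights are made-up examples.
def explain_prediction(features: dict[str, float],
                       weights: dict[str, float]) -> dict[str, float]:
    """Return each feature's contribution (weight * value) to the score."""
    return {name: weights[name] * value for name, value in features.items()}

features = {"head_to_head": 1.0, "recent_form": 0.6, "surface_win_rate": 0.8}
weights = {"head_to_head": 0.4, "recent_form": 0.35, "surface_win_rate": 0.25}

contributions = explain_prediction(features, weights)
# The largest contribution is the model's headline "reason".
top_reason = max(contributions, key=contributions.get)
```

Instead of a single opaque probability, the fan-facing app can then surface the dominant factor, say, the head-to-head record, alongside the prediction.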
Artificial intelligence is also taking over some of the coverage. It can automatically create highlight reels for critical match events as quickly as possible. It uses three main factors to decide if an event is worth highlighting: crowd noise, player gestures, and the match situation, such as if it is a game or match point. A process that would have been done manually a few years ago is now being controlled by algorithms.
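The three-factor selection described above can be sketched as a weighted score per candidate clip. The weights and threshold here are illustrative assumptions; the real system's signal processing (measuring crowd noise, recognising gestures) is the hard part and is omitted.

```python
# Sketch of three-factor highlight selection: crowd noise, player
# gestures, and match situation. Weights/threshold are assumptions.
def highlight_score(crowd_noise: float, gesture_intensity: float,
                    is_break_or_match_point: bool) -> float:
    """Score a clip; the first two inputs are normalised to [0, 1]."""
    score = 0.4 * crowd_noise + 0.3 * gesture_intensity
    if is_break_or_match_point:
        score += 0.3  # match situation acts as a fixed bonus here
    return score

def select_highlights(clips: list[tuple], threshold: float = 0.6) -> list[str]:
    """Keep clips whose score clears the threshold, highest score first.

    Each clip is (name, crowd_noise, gesture_intensity, is_key_point).
    """
    scored = [(highlight_score(*clip[1:]), clip[0]) for clip in clips]
    return [name for score, name in sorted(scored, reverse=True)
            if score >= threshold]
```

A roaring crowd on match point clears the bar easily; a quiet mid-set rally does not, which is essentially the editorial judgment the algorithm replaces.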
“Leveraging technology to help fans become more informed, engaged and involved throughout the Wimbledon fortnight is at the core of our strategy to ensure we are leveraging innovation to keep Wimbledon relevant and deliver outstanding digital experiences for fans, wherever they may be,” said Alexandra Willis, communications and marketing director at the All England Club. “Core to these experiences is our ambition to help fans get closer to Wimbledon by understanding which players to follow and analyse, and inviting them to get involved with new match predictions and insights features, alongside our extensive scoring, news and video content across our channels.”
The whole operation is run on IBM Cloud and hybrid cloud technologies, with more and more functions being shifted to the cloud. Although some functions are still done on site for operational reasons, recent years have seen the balance shift to the cloud. The system also has to be robust enough to take a surge in traffic without collapsing, and secure enough to keep out unwanted intruders.
Keeping innovation alive at the tournament is an ongoing commitment. And although the tournament has just wrapped up, the two organisations are already working on innovations for 2023.
Twice a year, in spring and autumn, IBM’s digital and design consultancy arm IBM iX holds workshops with IBM and All England Lawn Tennis & Croquet Club representatives to discuss where they will go next. Spring is when the new ideas are put forward and discussed; autumn is when they decide what will be implemented the following year.
While the focus may be on driving innovation, though, doing things right is just as much of a priority, says Boyden.
“It’s quality. You’ve got to get this right.”