IBM Expands Its Power10 Portfolio For Mission Critical Applications

It is sometimes difficult to understand the true value of IBM's Power CPUs and the server platforms built around them, even for IT professionals who deploy and manage servers, and the company has written a lot about that value over the past few years. As an industry, we have become accustomed to using x86 as a baseline for comparison: if an x86 CPU has 64 cores, core count becomes the yardstick by which we measure the relative value of other CPUs.

But this is a flawed way of measuring CPUs and a broken system for measuring server platforms. An x86 core is different from an Arm core, which is different from a Power core. While Arm has achieved parity with x86 for some cloud-native workloads, the Power architecture is different still. Multi-threading, encryption, AI enablement: many functions are designed into Power in ways that don't carry the performance cost they impose on other architectures.

I write all this as a setup for IBM's announcement of expanded support for its Power10 architecture. In the following paragraphs, I will provide the details of IBM's announcement and give some thoughts on what it could mean for enterprise IT.

What was announced

Before discussing what was announced, it is a good idea to do a quick overview of Power10.

IBM introduced the Power10 CPU architecture at the Hot Chips conference in August 2020. Moor Insights & Strategy chief analyst Patrick Moorhead wrote about it here. Power10 is developed on the open-source Power ISA and comes in two variants: 15 SMT8 cores or 30 SMT4 cores. For those familiar with x86, SMT8 (eight threads per core) seems extreme, as does SMT4. But this is where the Power ISA is fundamentally different from x86. Power is a highly performant ISA, and the Power10 cores are designed for the most demanding workloads.
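To make the core/thread trade-off concrete, here is a back-of-the-envelope sketch (mine, not from the announcement) showing that both variants expose the same total number of hardware threads per chip:

```python
# Quick arithmetic on the two Power10 variants described above:
# 15 SMT8 cores vs. 30 SMT4 cores.

def total_threads(cores: int, threads_per_core: int) -> int:
    """Hardware threads exposed by one chip."""
    return cores * threads_per_core

smt8_variant = total_threads(cores=15, threads_per_core=8)
smt4_variant = total_threads(cores=30, threads_per_core=4)

print(smt8_variant, smt4_variant)  # -> 120 120
```

The thread budget is identical; what differs is how it is carved up, which is why the two variants target different workload profiles.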

One last note on Power10: SMT8 is optimized for higher throughput on less compute-intensive work, while SMT4 attacks the compute-intensive space at lower throughput.

IBM introduced the Power E1080 in September of 2021. Moor Insights & Strategy chief analyst Patrick Moorhead wrote about it here. The E1080 is a system designed for mission and business-critical workloads and has been strongly adopted by IBM's loyal Power customer base.

Because of this success, IBM has expanded the breadth of the Power10 portfolio and how customers consume these resources.

The big reveal in IBM’s recent announcement is the availability of four new servers built on the Power10 architecture. These servers are designed to address customers' full range of workload needs in the enterprise datacenter.

The Power S1014 is the traditional enterprise workhorse that runs the modern business. For x86 IT folks, think of the S1014 as the equivalent of the two-socket workhorses that run virtualized infrastructure. One of the things IBM points out about the S1014 is that this server was designed with lower technical requirements. That statement leads me to believe the company is softening the barrier to entry for the S1014 in data centers that are not traditional IBM shops, or perhaps for environments that use Power for higher-end workloads but non-Power for traditional infrastructure needs.

The Power S1022 is IBM's scale-out server. Organizations embracing cloud-native, containerized environments will find the S1022 an ideal match. Again, for the x86 crowd – think of the traditional scale-out servers that are perhaps an AMD single socket or Intel dual-socket – the S1022 would be IBM's equivalent.

Finally, the S1024 targets the data analytics space. With lots of high-performing cores and a big memory footprint – this server plays in the area where IBM has done so well.

In addition to these platforms, IBM also introduced the Power E1050, which seems designed for big data and workloads with significant memory throughput requirements.

The E1050 is where I believe the difference in the Power architecture becomes obvious. It is where midrange starts to bump into high performance: IBM claims eight-socket performance in this four-socket configuration and says the E1050 can deliver for those running big data environments, larger data warehouses, and high-performance workloads. Maybe more importantly, the company claims considerable cost savings for workloads that generally require a significant financial investment.

One benchmark IBM showed was the two-tier SAP standard application benchmark. In this test, the E1050 beat an eight-socket x86 server handily, showing a 2.6x per-core performance advantage. We at Moor Insights & Strategy didn't run or certify the benchmark, but the company has been conservative in its disclosures, and I have no reason to dispute it.
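For illustration, here is how a per-core figure like that 2.6x is typically derived. The scores and core counts below are hypothetical placeholders, not IBM's or SAP's published results:

```python
def per_core_score(benchmark_score: float, cores: int) -> float:
    """Benchmark score normalized per core."""
    return benchmark_score / cores

# Hypothetical numbers purely for illustration (not published results):
e1050_score, e1050_cores = 249_600, 96   # four-socket Power10 system
x86_score, x86_cores = 224_000, 224      # eight-socket x86 system

advantage = per_core_score(e1050_score, e1050_cores) / per_core_score(x86_score, x86_cores)
print(f"{advantage:.1f}x per core")  # -> 2.6x per core
```

The normalization matters: a system with fewer, stronger cores can trail on raw score yet still win decisively per core, which drives per-core software licensing costs.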

But the performance and cost savings are not just associated with these higher-end workloads with narrow applicability. In another comparison, IBM showed the Power S1022 performs 3.6x better than its x86 equivalent for running a containerized environment in Red Hat OpenShift. When all was added up, the S1022 was shown to lower TCO by 53%.
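The TCO claim works the same way. A minimal sketch with invented cost figures (the 53% is IBM's claim; the dollar amounts are illustrative only):

```python
def tco_savings(baseline_tco: float, alternative_tco: float) -> float:
    """Fractional savings of the alternative versus the baseline."""
    return (baseline_tco - alternative_tco) / baseline_tco

# Hypothetical multi-year costs purely for illustration:
x86_tco = 1_000_000    # servers + licenses + power for an x86 fleet
s1022_tco = 470_000    # same workload consolidated on Power S1022

print(f"{tco_savings(x86_tco, s1022_tco):.0%}")  # -> 53%
```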

What makes Power-based servers perform so well in SAP and OpenShift?

The value of Power is derived both from the CPU architecture and from the engineering IBM puts into the system and server design. The company is not afraid to design and deploy enhancements it believes will deliver better performance, higher security, and greater reliability for its customers. In the case of Power10, I believe a few design factors have contributed to the performance and price/performance advantages the company claims, including:

  • Use of Differential DIMM technology to increase memory bandwidth, allowing better performance from memory-intensive workloads such as in-memory database environments.
  • Built-in AI inferencing engines that increase performance by up to 5x.
  • Transparent memory encryption with no performance tax (note: AMD has had this technology for years, and Intel introduced it about a year ago).

These seemingly minor differences can add up to deliver significant performance benefits for workloads running in the datacenter. But some of this comes down to a very powerful (pardon the redundancy) core design. While x86 dominates the datacenter in unit share, IBM has maintained a loyal customer base because the Power CPUs are workhorses, and Power servers are performant, secure, and reliable for mission critical applications.

Consumption-based offerings

Like other server vendors, IBM sees the writing on the wall and has opened up its offerings to be consumed in a way that is most beneficial to its customers. Traditional acquisition model? Check. Pay as you go with hardware in your datacenter? Also, check. Cloud-based offerings? One more check.

While there is nothing revolutionary about what IBM is doing with how customers consume its technology, it is important to note that IBM is the only server vendor that also runs a global cloud service (IBM Cloud). This should enable the company to pass on savings to its customers while providing greater security and manageability.

Closing thoughts

I like what IBM is doing to maintain and potentially grow its market presence. The new Power10 lineup is designed to meet customers' entire range of performance and cost requirements without sacrificing any of the differentiated design and development that the company puts into its mission critical platforms.

Will this announcement move x86 IT organizations to transition to IBM? Unlikely. Nor do I believe this is IBM's goal. However, I can see how businesses concerned with performance, security, and TCO of their mission and business-critical workloads can find a strong argument for Power. And this can be the beginning of a more substantial Power presence in the datacenter.

Note: This analysis contains insights from Moor Insights & Strategy Founder and Chief Analyst, Patrick Moorhead.


Wed, 13 Jul 2022, Matt Kimball, https://www.forbes.com/sites/moorinsights/2022/07/14/ibm-expands-its-power10-portfolio-for-mission-critical-applications/
Avnet, IBM announce ASIC distribution agreement

Avnet selected as first channel Business Partner to execute IBM ASIC design methodologies

Phoenix, AZ and East Fishkill, NY, March 11, 2004 - Avnet, Inc. (NYSE: AVT) and IBM today announced the extension of their distribution agreement to include IBM application specific integrated circuit (ASIC) devices and technology in North America.

Under the agreement, Avnet Inc., through Avnet Cilicon, the Americas-based semiconductor distribution specialist division of Avnet's largest operating group, Avnet Electronics Marketing, will provide engineering design services to customers to help accelerate the adoption of IBM ASIC products and to help reduce customers' time to market. The agreement covers ASIC products and technologies from IBM at the .18 micron and .25 micron technology nodes.

In addition, Avnet Inc., through Avnet Cilicon, will provide sales and marketing support for IBM ASIC products to its large distribution customer base, along with providing these customers access to Avnet's materials management capabilities for their particular supply chain requirements.

This announcement marks the first time that IBM has opened up its ASIC design methodologies for execution by a channel business partner. As a result, a broader array of customers will now be able to gain access to IBM industry-leading ASIC technology through Avnet Design Centers.

"The expansion of our existing, successful distribution agreement to now include IBM ASIC products and technology is a big win for Avnet and most importantly for the distribution customer base," said Jeff Ittel, president of Avnet Cilicon. "IBM has the world's leading ASIC products and methodologies, which are proven to enable designs that are right the first time and to help reduce time-to-market for its customers' products. These capabilities will now be widely available to the Avnet distribution customer base."

"Avnet is the leading distributor in this segment and brings over 20 years of experience in the ASIC business and over 1000 completed designs by our ASIC design center engineering team, whose services include architectural design, IP integration, verification, test, timing closure and physical layout," Ittel noted.

"This agreement represents a new business model for IBM and a significant opportunity for our ASIC business," said Tom Reeves, vice president, ASIC product group, IBM Systems and Technology Group. "Avnet offers an established customer base and technical design support via four dedicated Design Centers in North America that can help our ASIC business expand into new opportunities."

About IBM Microelectronics
IBM is a recognized innovator in the semiconductor industry, having been first with advances like more power-efficient copper wiring in place of aluminum and faster SOI and silicon germanium transistors. These and other innovations have contributed to IBM's standing as the number one U.S. patent holder for 11 consecutive years. More information about IBM semiconductors can be found at: http://www.ibm.com/chips.

About Avnet Cilicon
Avnet Cilicon is the semiconductor distribution specialist division of Avnet Electronics Marketing in the Americas, an operating group of Avnet, Inc. (NYSE:AVT). Avnet Cilicon combines semiconductor expertise, technical excellence and deep market knowledge to enhance time to revenue for all supply-chain partners in the electronics arena. Avnet Cilicon's core competencies include materials management, technical support through Avnet Design Services, logistics support through Avnet Supply Chain Services, and customer-centric, dedicated sales channels. Avnet Cilicon, combined with Avnet IP&E, Avnet's interconnect, passive and electromechanical component and services division, delivers Support Across the Board. For more information, visit http://www.em.avnet.com/semi.

Mon, 18 Jul 2022, https://www.design-reuse.com/news/7370/avnet-ibm-asic-distribution-agreement.html
Emulating The IBM PC On An ESP32

The IBM PC spawned the basic architecture that grew into the dominant Wintel platform we know today. Once heavy, cumbersome, and power-thirsty, it's a machine that you can now emulate on a single board with a cheap commodity microcontroller. That's thanks to work from [Fabrizio Di Vittorio], who has shared a how-to on YouTube.

The full playlist is quite something to watch, showing off a huge number of old-school PC applications and games running on the platform. There’s QBASIC, FreeDOS, Windows 3.0, and yes, of course, Flight Simulator. The latter game was actually considered somewhat of a de facto standard for PC compatibility in the 1980s, so the fact that the ESP32 can run it with [Fabrizio’s] code suggests he’s done well.

It’s amazingly complete, with the ESP32 handling everything from video and sound output to keyboard and mouse input. It’s a testament to the capability of modern microcontrollers that this is such a simple feat in 2021.
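For readers curious what "emulating a PC" means at the lowest level: the heart of any CPU emulator is a fetch-decode-execute loop. This toy sketch uses invented opcodes, not the real 8086 instruction encoding or [Fabrizio's] FabGL code:

```python
# Toy fetch-decode-execute loop, the skeleton of any CPU emulator.
# The opcodes here are invented for illustration; a real 8086
# emulator must decode the full, variable-length x86 encoding.

class ToyCPU:
    def __init__(self, program: bytes):
        self.mem = bytearray(program)
        self.pc = 0          # program counter
        self.acc = 0         # single accumulator register
        self.halted = False

    def step(self):
        op = self.mem[self.pc]          # fetch
        self.pc += 1
        if op == 0x01:                  # decode + execute: LOAD imm
            self.acc = self.mem[self.pc]; self.pc += 1
        elif op == 0x02:                # ADD imm (8-bit wraparound)
            self.acc = (self.acc + self.mem[self.pc]) & 0xFF; self.pc += 1
        elif op == 0xFF:                # HALT
            self.halted = True
        else:
            raise ValueError(f"unknown opcode {op:#x}")

    def run(self):
        while not self.halted:
            self.step()
        return self.acc

cpu = ToyCPU(bytes([0x01, 40, 0x02, 2, 0xFF]))  # LOAD 40; ADD 2; HALT
print(cpu.run())  # -> 42
```

A full PC emulator wraps a loop like this with memory-mapped video, interrupt handling, and I/O port emulation, which is where most of the real work lives.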

We’ve seen the ESP32 emulate 8-bit gaming systems before, too. If you remember [Fabrizio’s] name, it’s probably from his excellent FabGL library. Videos after the break.

Wed, 03 Aug 2022, Lewin Day, https://hackaday.com/2021/07/28/emulating-the-ibm-pc-on-an-esp32/
Learning from Failure

Learning from failure is a hallmark of the technology business. Nick Baker, a 37-year-old system architect at Microsoft, knows that well. A British transplant at the software giant's Silicon Valley campus, he went from failed project to failed project in his career. He worked on such dogs as Apple Computer's defunct video card business, 3DO's failed game consoles, a chip startup that screwed up a deal with Nintendo, the never-successful WebTV and Microsoft's canceled Ultimate TV satellite TV recorder.

But Baker finally has a hot seller with the Xbox 360, Microsoft's video game console launched worldwide last holiday season. The adventure on which he embarked four years ago would ultimately prove that failure is often the best teacher. His new gig would once again provide copious evidence that flexibility and understanding of detailed customer needs will beat a rigid business model every time. And so far the score is Xbox 360, one, and the delayed PlayStation 3, nothing.

The Xbox 360 console is Microsoft's living room Trojan horse, purchased as a game box but capable of so much more in the realm of digital entertainment in the living room. Since the day after Microsoft terminated the Ultimate TV box in February 2002, Baker has been working on the Xbox 360 silicon architecture team at Microsoft's campus in Mountain View, CA. He is one of the 3DO survivors who now gets a shot at revenge against the Japanese companies that vanquished his old firm.

"It feels good," says Baker. "I can play it at home with the kids. It's family-friendly, and I don't have to play on the Nintendo anymore."

Baker is one of the people behind the scenes who pulled together the Xbox 360 console by engineering some of the most complicated chips ever designed for a consumer entertainment device. The team labored for years and made critical decisions that enabled Microsoft to beat Sony and Nintendo to market with a new box, despite a late start with the Xbox in the previous product cycle. Their story, captured here and in a forthcoming book by the author of this article, illustrates the ups and downs in any big project.

When Baker and his pal Jeff Andrews joined games programmer Mike Abrash in early 2002, they had clear marching orders. Their bosses — Microsoft CEO Steve Ballmer, at the top of Microsoft; Robbie Bach, running the Xbox division; Xbox hardware chief Todd Holmdahl; Greg Gibson, for Xbox 360 system architecture; and silicon chief Larry Yang — all dictated what Microsoft needed this time around.

They couldn't be late. They had to make hardware that could become much cheaper over time and they had to pack as much performance into a game console as they could without overheating the box.

Trinity Taken

The group of silicon engineers started first among the 2,000 people in the Xbox division on a project that Baker had code-named Trinity. But they couldn't use that name, because someone else at Microsoft had taken it. So they named it Xenon, for the colorless and odorless gas, because it sounded cool enough. Their first order of business was to study computing architectures, from those of the best supercomputers to those of the most power-efficient portable gadgets. Although Microsoft had chosen Intel and NVIDIA to make the chips for the original Xbox the first time around, the engineers now talked to a broad spectrum of semiconductor makers.

"For us, 2002 was about understanding what the technology could do," says Greg Gibson, system designer.

Sony teamed up with IBM and Toshiba to create a full-custom microprocessor from the ground up. They planned to spend $400 million developing the cell architecture and even more fabricating the chips. Microsoft didn't have the time or the chip engineers to match the effort on that scale, but Todd Holmdahl and Larry Yang saw a chance to beat Sony. They could marshal a host of virtual resources and create a semicustom design that combined both off-the-shelf technology and their own ideas for game hardware. Microsoft would lead the integration of the hardware, own the intellectual property, set the cost-reduction schedules, and manage its vendors closely.

They believed this approach would get them to market by 2005, which was when they estimated Sony would be ready with the PlayStation 3. (As it turned out, Microsoft's dreams were answered when Sony, in March, postponed the PlayStation 3 launch until November.)

More important, using an IP ownership strategy with the chips could dramatically cut Microsoft's costs on the original Xbox. Microsoft had lost an estimated $3.7 billion over four years, or roughly a whopping $168 per box. By cutting costs, Microsoft could erase a lot of red ink.
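A quick sanity check using only the figures quoted above shows what that per-box loss implies about volume:

```python
# Back-of-the-envelope check: a $3.7B total loss at roughly $168
# per box implies about 22 million original Xboxes over the period.

total_loss = 3.7e9      # estimated four-year loss, dollars
loss_per_box = 168      # estimated loss per console, dollars

units = total_loss / loss_per_box
print(f"~{units / 1e6:.0f} million consoles")  # -> ~22 million consoles
```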

Balanced Design

Baker and Andrews quickly decided they wanted to create a balanced design, trading off power efficiency and performance. So they envisioned a multicore microprocessor, one with as many as 16 cores — or miniprocessors — on one chip. They wanted a graphics chip with 60 shaders, or parallel processors for rendering distinct features in graphic animations.

Laura Fryer, manager of the Xbox Advanced Technology Group in Redmond, WA, solicited feedback on the new microprocessor. She said game developers were wary of managing multiple software threads associated with multiple cores, because the switch created a juggling task they didn't have to do on the original Xbox or the PC. But they appreciated the power efficiency and added performance they could get.
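The "juggling" the developers described is the burden of explicitly partitioning work across cores, something a single-core target never required. A generic illustration of that kind of work-splitting (plain Python, not Xbox 360 SDK code):

```python
# Sketch of explicit work-splitting across cores -- the new chore a
# multicore console handed to game developers. On a single-core
# machine the whole loop would simply run serially.

from concurrent.futures import ThreadPoolExecutor

def simulate_physics(chunk):
    # Stand-in for real per-object work (collision, animation, ...)
    return sum(x * x for x in chunk)

objects = list(range(1_000))
chunks = [objects[i::3] for i in range(3)]   # one chunk per core

with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(simulate_physics, chunks))

# The partitioned result must match the serial result exactly.
print(sum(results) == sum(x * x for x in objects))  # -> True
```

Getting the partitioning, synchronization, and load balance right is exactly the extra engineering the developers were wary of.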

Microsoft's current vendors, Intel and NVIDIA, didn't like the idea that Microsoft would own the IP they created. For Intel, allowing Microsoft to take the x86 design to another manufacturer was as troubling as signing away the rights to Windows would be to Microsoft. NVIDIA was willing to do the work, but if it had to deviate from its road map for PC graphics chips in order to tailor a chip for a game box, then it wanted to get paid for it. Microsoft didn't want to pay that high a price. "It wasn't a good deal," says Jen-Hsun Huang, CEO of NVIDIA. Microsoft had also been through a painful arbitration on pricing for the original Xbox graphics chips.

IBM, on the other hand, had started a chip engineering services business and was perfectly willing to customize a PowerPC design for Microsoft, says Jim Comfort, an IBM vice president. At first IBM didn't believe that Microsoft wanted to work together, given a history of rancor dating back to the DOS and OS/2 operating systems in the 1980s. Moreover, IBM was working for Microsoft rivals Sony and Nintendo. But Microsoft pressed IBM for its views on multicore chips and discovered that Big Blue was ahead of Intel in thinking about these kinds of designs.

When Bill Adamec, a Microsoft program manager, traveled to IBM's chip design campus in Rochester, NY, he did a double take when he arrived at the meeting room where 26 engineers were waiting for him. Although IBM had reservations about Microsoft's schedule, the company was clearly serious.

Meanwhile, ATI Technologies assigned a small team to conceive a proposal for a game console graphics chip. Instead of pulling out a derivative of a PC graphics chip, ATI's engineers decided to design a brand-new console graphics chip that relied on embedded memory to feed a lot of data to the graphics chip while keeping the main data pathway clear of traffic — critical for avoiding bottlenecks that would slow down the system.

Stomaching IBM

By the fall of 2002, Microsoft's chip architects decided they favored the IBM and ATI solutions. They met with Ballmer and Gates, who wanted to be involved in the critical design decisions at an early juncture. Larry Yang recalls, "We asked them if they could stomach a relationship with IBM." Their affirmative answer pleased the team.

By early 2003, the list of potential chip suppliers had been narrowed down. At that point, Robbie Bach, the chief Xbox officer, took his team to a retreat at the Salish Lodge, on the edge of Washington's beautiful Snoqualmie Falls, made famous by the "Twin Peaks" television show. The team hashed out a battle plan. They would own the IP for silicon that could take the costs of the box down quickly. They would launch the box in 2005 at the same time as Sony would launch its box, or even earlier. The last time, Sony had had a 20-month head start with the PlayStation 2. By the time Microsoft sold its first 1.4 million Xboxes, Sony had sold more than 25 million PlayStation 2s.

Those goals fit well with the choice of IBM and ATI for the two pieces of silicon that would account for more than half the cost of the box. Each chip supplier moved forward, based on a "statement of work," but Gibson kept his options open, and it would be months before the team finalized a contract. Both IBM and ATI could pull blocks of IP from their existing products and reuse them in the Microsoft chips. Engineering teams from both companies began working on joint projects such as the data pathway that connected the chips. ATI had to make contingency plans, in case Microsoft chose Intel over IBM, and IBM also had to consider the possibility that Microsoft might choose NVIDIA.

Hacking Embarrassment

Through the summer, Microsoft executives and marketers created detailed plans for the console launch. They decided to build security into the microprocessor to prevent hacking, which had proved to be a major embarrassment on the original Xbox. Marketers such as David Reid all but demanded that Microsoft try to develop the new machine in a way that would allow the games for the original Xbox to run on it. So-called backward compatibility wasn't necessarily exploited by customers, but it was a big factor in deciding which box to buy. And Bach insisted that Microsoft had to make gains in Japan and Europe by launching in those regions at the same time as in North America.

For a period in July 2003, Bob Feldstein, the ATI vice president in charge of the Xenon graphics chip, thought NVIDIA had won the deal, but in August Microsoft signed a deal with ATI and announced it to the world. The ATI chip would have 48 shaders, or processors that would handle the nuances of color shading and surface features on graphics objects, and would come with 10 Mbytes of embedded memory.

IBM followed with a contract signing a month later. The deal was more complicated than ATI's, because Microsoft had negotiated the right to take the IBM design and have it manufactured in an IBM-licensed foundry being built by contract chip maker Chartered Semiconductor. The chip would have three cores and run at 3.2 GHz. It was a little short of the 3.5 GHz that IBM had originally pitched, but it wasn't off by much.

By October 2003, the entire Xenon team had made its pitch to Gates and Ballmer. They faced some tough questions. Gates wanted to know if there was any chance the box would run the complete Windows operating system. The top executives ended up giving the green light to Xenon without a Windows version.

The ranks of Microsoft's hardware team swelled to more than 200, with half of the team members working on silicon integration. Many of these people were like Baker and Andrews, stragglers who had come from failed projects such as 3DO and WebTV. About 10 engineers worked on "Ana," a Microsoft video encoder chip, while others managed the schedule and cost reduction with IBM and ATI. Others supported suppliers, such as Silicon Integrated Systems, the supplier of the "south bridge," the communications and input/output chip. The rest of the team helped handle relationships with vendors for the other 1,700 parts in the game console.

Ilan Spillinger headed the IBM chip program, which carried the code name Waternoose, after the spiderlike creature from the film "Monsters, Inc." He supervised IBM's chief engineer, Dave Shippy, and worked closely with Microsoft's Andrews on every aspect of the design program.

Games at Center

Everything happened in parallel. For much of 2003, a team of industrial designers created the look and feel of the box. They tested the design on gamers, and the feedback suggested that the design seemed like something either Apple or Sony had created. The marketing team decided to call the machine the Xbox 360, because it put the gamer at the center. A small software team led by Tracy Sharp developed the operating system in Redmond. Microsoft started investing heavily in games. By February 2004, Microsoft sent out the first kits to game developers for making games on Apple Macintosh G5 computers. And in early 2004, Greg Gibson's evaluation team began testing subsystems to make sure they would all work together when the final design came together.

IBM assigned 421 engineers from six or seven sites to the project, which was a proving ground for its design services business. The effort paid off, with an early test chip that came out in August 2004. With that chip, Microsoft was able to begin debugging the operating system. ATI taped out its first design in September 2004, and IBM taped out its full chip in October 2004. Both chips ran game code early on, which was good, considering that it's very hard to get chips working at all when they first come out of the factory.

IBM executed without many setbacks. As it revised the chip, it fixed bugs with two revisions of the chip's layers. The company was able to debug the design in the factory quickly, because IBM's fab engineers could work on one part while the Chartered engineers could debug a different part of the chip. They fed the information to each other, speeding the cycle of revisions. By Jan. 30, 2005, IBM taped out the final version of the microprocessor.

ATI, meanwhile, had a more difficult time. The company had assigned 180 engineers to the project. Although games ran on the chip early, problems came up in the lab. Feldstein said that in one game, one frame of animation would freeze as every other frame went by. It took six weeks to uncover the bug and find a fix. Delays in debugging threatened to throw the beta-development-kit program off schedule. That meant thousands of game developers might not get the systems they needed on time. If that happened, the Xbox 360 might launch without enough games, a disaster in the making.

The pressure was intense. But Neil McCarthy, a Microsoft engineer in Mountain View, designed a modification of the metal layers of the graphics chip. By doing so, he enabled Microsoft to get working chips from the interim design. ATI's foundry, Taiwan Semiconductor Manufacturing Co., churned out enough chips to seed the developer systems. The beta kits went out in the spring of 2005.

Meanwhile, Microsoft's brass was worried that Sony would trump the Xbox 360 by coming out with more memory in the PlayStation 3. So in the spring of 2005, Microsoft made what would become a fateful decision. It decided to double the amount of memory in the box, from 256 Mbytes to 512 Mbytes of graphics Double Data Rate 3 (GDDR3) chips. The decision would cost Microsoft $900 million over five years, so the company had to pare back spending in other areas to stay on its profit targets.

Microsoft started tying up all the loose ends. It rehired Seagate Technology, which it had hired for the original Xbox, to make hard disk drives for the box, but this time Microsoft decided to have two SKUs — one with a hard drive, for the enthusiasts, and one without, for the budget-conscious. It brought aboard both Flextronics and Wistron, the current makers of the Xbox, as contract manufacturers. But it also laid plans to have Celestica build a third factory for building the Xbox 360.

Just as everyone started to worry about the schedule going off course, ATI spun out the final graphics chip design in mid-July 2005. Everyone breathed a sigh of relief, and they moved on to the tough work of ramping up manufacturing. There was enough time for both ATI and IBM to build a stockpile of chips for the launch, which was set for Nov. 22 in North America, Dec. 2 in Europe and Dec. 10 in Japan.

Flextronics debugged the assembly process first. Nick Baker traveled to China to debug the initial boxes as they came off the line. Although assembly was scheduled to start in August, it didn't get started until September. Because the machines were being built in southern China, they had to be shipped over a period of six weeks by boat to the regions. Each factory could build only as many as 120,000 machines a week, running at full tilt. The slow start, combined with the multiregion launch, created big risks for Microsoft.

An Unexpected Turn

The hardware team was on pins and needles. The most-complicated chips came in on time and were remarkable achievements. Typically, it took more than two years to do the initial designs of complicated chip projects, but both companies were actually manufacturing inside that time window.

Then something unexpected hit. Both Samsung and Infineon Technologies had committed to making the GDDR3 memory for Microsoft. But some of Infineon's chips fell short of the 700 MHz specified by Microsoft. Using such chips could have slowed games down noticeably. Microsoft's engineers decided to start sorting the chips, not using the subpar ones. Because GDDR3 700 MHz chips were just ramping up, there was no way to get more chips. Each system used eight chips. The shortage constrained the supply of Xbox 360s.

Microsoft blamed the resulting shortfall of Xbox 360s on a variety of component shortages. Some users complained of overheating systems. But overall, the company said, the launch was still a great achievement. In its first holiday season, Microsoft sold 1.5 million Xbox 360s, compared to 1.4 million original Xboxes in the holiday season of 2001. But the shortage continued past the holidays.

Leslie Leland, hardware evaluation director, says she felt "terrible" about the shortage and that Microsoft would strive to get a box into the hands of every consumer who wanted one. But Greg Gibson, system designer, says that Microsoft could have worse problems on its hands than a shortage. The IBM and ATI teams had outdone themselves.

The project was by far the most successful Nick Baker had ever worked on. One night, hoisting a beer and looking at a finished console, he said it felt good.

J Allard, the head of the Xbox platform business, praised the chip engineers such as Baker: "They were on the highest wire with the shortest net."

Get more information on Takahashi's book.

This story first appeared in the May issue of Electronic Business magazine.

Tue, 26 Jul 2022 12:00:00 -0500 https://www.designnews.com/automation-motion-control/learning-failure
The 8 Most Powerful Computers in The World

We're used to seeing powerful computers in science fiction capable of processing massive amounts of data in a matter of seconds, machines so advanced they make modern personal computers look like toys ...

Mon, 01 Aug 2022 23:00:14 -0500 https://www.msn.com/en-us/news/technology/the-8-most-powerful-computers-in-the-world/ar-AA10dR27

Attentive Robots

Currently, artificial intelligence (AI) technologies are able to exhibit seemingly-human traits. Some are intentionally humanoid, and others perform tasks that we normally associate strictly with humanity — songwriting, teaching, and visual art.

But as the field progresses, companies and developers are re-thinking the basis of artificial intelligence by examining our own intelligence and how we might effectively mimic it using machinery and software. IBM is one such company, as they have embarked on the ambitious quest to teach AI to act more like the human brain.


Many existing machine learning systems are built around the need to draw from sets of data. Whether they are problem-solving to win a game of Go or identifying skin cancer from images, this often remains true. This approach is, however, limited, and it differs from how the human brain works.

We as humans learn incrementally. Simply put, we learn as we go. While we acquire knowledge to pull from as we go along, our brains adapt and absorb information differently from the way that many existing artificial systems are built. Additionally, we are logical. We use reasoning skills and logic to problem solve, something that these systems aren't yet terrific at accomplishing.

IBM is looking to change this. Along similar lines, a research team at DeepMind has created a neural network that reportedly uses relational reasoning to complete tasks.

Rational Machinery

By giving the AI multiple objects and a specific task, "We are explicitly forcing the network to discover the relationships that exist," says Timothy Lillicrap, a computer scientist at DeepMind in an interview with Science Magazine. In a test of the network back in June, it was questioned about an image with multiple objects. The network was asked, for example: "There is an object in front of the blue thing; does it have the same shape as the tiny cyan thing that is to the right of the gray metal ball?"

In this test, the network correctly identified the object a staggering 96 percent of the time, compared to the measly 42 to 77 percent that more traditional machine learning models achieved. The advanced network was also apt at word problems and continues to be developed and improved upon. In addition to reasoning skills, researchers are advancing the network's ability to pay attention and even make and store memories.
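The design described here matches DeepMind's published "relation network" idea: score every pair of objects in the scene together with the question, sum the pairwise results, and map that sum to an answer. A minimal sketch of that idea in plain NumPy follows; all layer sizes, variable names, and the random stand-in weights are illustrative assumptions, not the actual trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_mlp(in_dim, out_dim):
    """Stand-in for a learned network: fixed random weights plus ReLU."""
    W = rng.standard_normal((in_dim, out_dim)) * 0.1
    return lambda x: np.maximum(x @ W, 0.0)

def relation_network(objects, question, g, f):
    """Apply g to every ordered pair of objects (question appended),
    sum the pairwise outputs, then map the sum to answer scores with f."""
    pair_outputs = [
        g(np.concatenate([o_i, o_j, question]))
        for o_i in objects
        for o_j in objects
    ]
    return f(np.sum(pair_outputs, axis=0))

# Four "objects" (e.g. feature vectors for items in a scene) and an encoded question.
objects = [rng.standard_normal(8) for _ in range(4)]
question = rng.standard_normal(6)

g = toy_mlp(8 + 8 + 6, 32)  # relation function g(o_i, o_j, q)
f = toy_mlp(32, 10)         # answer function over the summed relations

scores = relation_network(objects, question, g, f)
print(scores.shape)  # one score per candidate answer: (10,)
```

Because the sum over pairs does not depend on their order, the same small network can handle scenes with any number of objects, which is what lets it answer questions like the one quoted above.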


The future of AI development could be hastened and greatly expanded by using such tactics, according to Irina Rish, an IBM research staff member, in an interview with Engadget, "Neural network learning is typically engineered and it's a lot of work to actually come up with a specific architecture that works best. It's pretty much a trial and error approach ... It would be good if those networks could build themselves."

It might be scary to think of AI networks building and improving themselves, but if monitored, initiated, and controlled correctly, this could allow the field to expand beyond current limitations. Despite the brimming fears of a robot takeover, the advancement of AI technologies could save lives in the medical field, allow humans to get to Mars, and so much more. 


Wed, 29 Dec 2021 18:58:00 -0600 https://futurism.com/ibm-is-modeling-new-ai-after-the-human-brain
Romsey mourns the loss of exceptional Rex Trayhorne

ROMSEY residents are mourning the loss of one of their own, Rex Trayhorne, a man known by many as the ‘heart of the town’.

Rex, a much beloved husband, father and friend to many died peacefully on Thursday, July 14, aged 90, after living an ‘exceptional life’.

Berkshire-born Rex trained as a technical illustrator and, in his early years, worked for British European Airways, the Atomic Energy Authority and various advertising agencies.

Later he taught at Southampton College of Art and worked as a freelance – his last contract was with IBM Hursley.

He then became a full-time professional artist. His skills in graphic design, airbrushing and photo-retouching, used in technical and architectural illustrations, were developed well before the arrival of computer aids.

This influenced many of his paintings, which exhibit realism and often intricate detail while still showing strong artistic interpretation.

Rex produced paintings which range from expressive watercolours to exquisite miniatures. His work has been exhibited at the Royal Institute of Watercolour Painters, The Tate Gallery, The Medici Gallery and the Royal Miniature Society and, in 1989, he was elected a member in recognition of his work.

From 1965 he was a prominent member of Romsey Art Group and the Mountbatten Gallery in Lee. He organised the Wessex Artists Exhibitions in Salisbury for over twenty-five years.

In 1998, then 67, he spent countless days filming a year-long documentary of life in Romsey featuring all aspects of local people and events – a valuable record of those times. This project also raised funds for Romsey Hospital.

Rex wrote magazine articles, appeared on television, held watercolour courses and raised funds for many local charities – too numerous to list.

He was a founder member of the Rotary Club of Romsey Test and in 1990 was awarded ‘Honorary Membership’ of the Rotary movement. He was also a very active member and President of the Romsey Abbey Probus Club.

Rex had a long-standing association with Romsey Abbey. He was confirmed there in 1996 and, in 2000, celebrated 25 years of marriage to his wife and fellow artist, Geraldine, in a thanksgiving service conducted by Canon Crawford-Jones.

For each of the past thirty years he produced a unique Christmas card with an appropriate Romsey winter scene mostly featuring Romsey Abbey - whenever possible encased in snow.

In 2010 Rex’s painting of Romsey Abbey was used on special edition postage stamps as well as on a vast range of well-known postcards and calendars featuring local scenes. One of his most striking works is the detailed view of the interior of the Abbey that took a month to produce. It is now one of the most popular cards on sale.

With his wife, Geraldine, Rex lived in central Romsey for 34 years. His family have said he loved the town, the Abbey and the many friends made there.

He made great contributions to the local artistic community, local organisations and raised funds for local charities. All testament to a life well lived.

His funeral will take place in Romsey Abbey on Monday, August 15 at 1.30pm. Rex’s family have asked for people to donate to Wessex Heartbeat or Romsey Abbey PCC rather than buying flowers. Donations can be made via ahcheater.co.uk.

Written by John Scarborough

Fri, 05 Aug 2022 14:44:00 -0500 https://www.dailyecho.co.uk/news/20601576.romsey-mourns-loss-exceptional-rex-trayhorne/
IBM Annual Cost of Data Breach Report 2022: Record Costs Usually Passed On to Consumers, “Long Breach” Expenses Make Up Half of Total Damage

IBM’s annual Cost of Data Breach Report for 2022 is packed with revelations, and as usual none of them are good news. Headlining the report is the record-setting cost of data breaches, with the global average now at $4.35 million. The report also reveals that much of that expense comes with the data breach version of “long Covid,” expenses that are realized more than a year after the attack.

Most organizations (60%) are passing these added costs on to consumers in the form of higher prices. And while 83% of organizations now report experiencing at least one data breach, only a small minority are adopting zero trust strategies.

Security AI and automation greatly reduce expected damage

The IBM report draws on input from 550 global organizations surveyed about the period between March 2021 and March 2022, in partnership with the Ponemon Institute.

Though the average cost of a data breach is up, it is only by about 2.6%; the average in 2021 was $4.24 million. This represents a total climb of 13% since 2020, however, reflecting the general spike in cyber crime seen during the pandemic years.

Organizations are also increasingly not opting to absorb the cost of data breaches, with the majority (60%) compensating by raising consumer prices separate from any other recent increases due to inflation or supply chain issues. The report indicates that this may be an underreported upward influence on prices of consumer goods, as 83% of organizations now say that they have been breached at least once.

Brad Hong, Customer Success Manager for Horizon3.ai, sees a potential consumer backlash on the horizon once public awareness of this practice grows: “It’s already a breach of confidence to lose the confidential data of customers, and sure there’s bound to be an organization across those surveyed who genuinely did put in the effort to protect against and curb attacks, but for those who did nothing, those who, instead of creating a disaster recovery plan, just bought cyber insurance to cover the org’s operational losses, and those who simply didn’t care enough to heed the warnings, it’s the coup de grâce to then pass the cost of breaches to the same customers who are now the victims of a data breach. I’d be curious to know what percent of the 60% of organizations who increased the price of their products and services are using the extra revenue for a war chest or to actually reinforce their security—realistically, it’s most likely just being used to fill a gap in lost revenue for shareholders’ sake post-breach. Without government regulations outlining restrictions on passing cost of breach to consumer, at the least, not without the honest & measurable efforts of a corporation as their custodian, what accountability do we all have against that one executive who didn’t want to change his/her password?”

Breach costs also have an increasingly long tail, as nearly half now come over a year after the date of the attack. The largest of these are generally fines that are levied after an investigation, and decisions or settlements in class action lawsuits. While the popular new “double extortion” approach of ransomware attacks can drive long-term costs in this way, the study finds that companies paying ransom demands to settle the problem quickly aren’t necessarily seeing a large amount of overall savings: their average breach cost drops by just $610,000.

Sanjay Raja, VP of Product with Gurucul, expands on how knock-on data breach damage can continue for years: “The follow-up attack effect, as described, is a significant problem as the playbooks and solutions provided to security operations teams are overly broad and lack the necessary context and response actions for proper remediation. For example, shutting down a user or application or adding a firewall block rule or quarantining a network segment to negate an attack is not a sustainable remediation step to protect an organization on an ongoing basis. It starts with a proper threat detection, investigation and response solution. Current SIEMs and XDR solutions lack the variety of data, telemetry and combined analytics to not only identify an attack campaign and even detect variants on previously successful attacks, but also provide the necessary context, accuracy and validation of the attack to build both a precise and complete response that can be trusted. This is an even greater challenge when current solutions cannot handle complex hybrid multi-cloud architectures leading to significant blind spots and false positives at the very start of the security analyst journey.”

Rising cost of data breach not necessarily prompting dramatic security action

In spite of over four out of five organizations now having experienced some sort of data breach, only slightly over 20% of critical infrastructure companies have moved to zero trust strategies to secure their networks. Cloud security is also lagging as well, with a little under half (43%) of all respondents saying that their security practices in this area are either “early stage” or do not yet exist.

Those that have onboarded security automation and AI elements are the only group seeing massive savings: their average cost of data breach is $3.05 million lower. This particular study does not track average ransom demands, but refers to Sophos research that puts the most recent number at $812,000 globally.

The study also notes serious problems with incident response plans, especially troubling in an environment in which the average ransomware attack is now carried out in four days or less and the “time to ransom” has dropped to a matter of hours in some cases. 37% of respondents say that they do not test their incident response plans regularly. 62% say that they are understaffed to meet their cybersecurity needs, and these organizations tend to suffer over half a million more dollars in damages when they are breached.

Of course, cost of data breaches is not distributed evenly by geography or by industry type. Some are taking much bigger hits than others, reflecting trends established in prior reports. The health care industry is now absorbing a little over $10 million in damage per breach, with the average cost of data breach rising by $1 million from 2021. And companies in the United States face greater data breach costs than their counterparts around the world, at over $8 million per incident.

Shawn Surber, VP of Solutions Architecture and Strategy with Tanium, provides some insight into the unique struggles that the health care industry faces in implementing effective cybersecurity: “Healthcare continues to suffer the greatest cost of breaches but has among the lowest spend on cybersecurity of any industry, despite being deemed ‘critical infrastructure.’ The increased vulnerability of healthcare organizations to cyber threats can be traced to outdated IT systems, the lack of robust security controls, and insufficient IT staff, while valuable medical and health data— and the need to pay ransoms quickly to maintain access to that data— make healthcare targets popular and relatively easy to breach. Unlike other industries that can migrate data and sunset old systems, limited IT and security budgets at healthcare orgs make migration difficult and potentially expensive, particularly when an older system provides a small but unique function or houses data necessary for compliance or research, but still doesn’t make the cut to transition to a newer system. Hackers know these weaknesses and exploit them. Additionally, healthcare orgs haven’t sufficiently updated their security strategies and the tools that manufacturers, IT software vendors, and the FDA have made haven’t been robust enough to thwart the more sophisticated techniques of threat actors.”

Familiar incident types also lead the list of the causes of data breaches: compromised credentials (19%), followed by phishing (16%). Breaches initiated by these methods also tended to be a little more costly, at an average of $4.91 million per incident.


Cutting the cost of data breach

Though the numbers are never as neat and clean as averages would indicate, it would appear that the cost of data breaches is cut dramatically for companies that implement solid automated “deep learning” cybersecurity tools, zero trust systems and regularly tested incident response plans. Mature cloud security programs are also a substantial cost saver.

Mon, 01 Aug 2022 10:00:00 -0500 Scott Ikeda https://www.cpomagazine.com/cyber-security/ibm-annual-cost-of-data-breach-report-2022-record-costs-usually-passed-on-to-consumers-long-breach-expenses-make-up-half-of-total-damage/
THE TELEVISION PROGRAM TRANSCRIPTS: PART II

The story so far.... In 1975, Ed Roberts invented the Altair personal computer. It was a pain to use until 19 year-old pre-billionaire Bill Gates wrote the first personal computer language. Still, the public didn't care. Then two young hackers -- Steve Jobs and Steve Wozniak -- built the Apple computer to impress their friends. We were all impressed and Apple was a stunning success. By 1980, the PC market was worth a billion dollars. Now, read on.....

Christine Comaford
We are nerds.

Vern Raburn
Most of the people in the industry were young because the guys who had any real experience were too smart to get involved in all these crazy little machines.

Gordon Eubanks
It really wasn't that we were going to build billion dollar businesses. We were having a good time.

Vern Raburn
I thought this was the most fun you could possibly have with your clothes on.

When the personal computer was invented twenty years ago it was just that - an invention - it wasn't a business. These were hobbyists who built these machines and wrote this software to have fun but that has really changed and now this is a business, this is a big business. It just goes to show you that people can be bought.

How the personal computer industry grew from zero to 100 million units is an amazing story. And it wasn't just those early funky companies of nerds and hackers, like Apple, that made it happen. It took the intervention of a company that was trusted by the corporate world. Big business wasn't interested in the personal computer. In the boardrooms of corporate America a computer still meant something the size of a room that cost at least a hundred thousand dollars. Executives would brag that my mainframe is bigger than your mainframe. The idea of a $2,000 computer that sat on your desk in a plastic box was laughable - that is, until that plastic box had three letters stamped on it - IBM.

IBM was, and is, an American business phenomenon. Over 60 years, Tom Watson and his son, Tom Jr., built what their workers called Big Blue into the top computer company in the world. But IBM made mainframe computers for large companies, not personal computers -- at least not yet. For the PC to be taken seriously by big business, the nerds of Silicon Valley had to meet the suits of corporate America.

IBM never fired anyone, requiring only undying loyalty to the company and a strict dress code. IBM hired conservative hard-workers straight from school. Few IBMers were at the summer of love. Their turn-ons were giant mainframes and corporate responsibility. They worked nine to five and on Saturdays washed the car.

This is intergalactic HQ for IBM - the largest computer company in the world...but in many ways IBM is really more a country than it is a company.
It has hundreds of thousands of citizens, it has a bureaucracy, it has an entire culture - everything in fact but an army. OK Sam, we're ready to visit IBM country, obviously we're dressed for the part. Now when you were in sales training in 1959 for IBM did you sing company songs?

Sam Albert
Former IBM Executive
Absolutely.

BOB: Well just to get us in the mood let's sing one right here.
SAM: You're kidding.
BOB: I have the IBM - the songs of the IBM and we're going to try for number 74, our IBM salesmen sung to the tune of Jingle Bells.

Bob & Sam singing
'IBM, happy men, smiling all the way, oh what fun it is to sell our products night and day. IBM Watson men, partners of TJ. In his service to mankind - that's why we are so gay.'

Sam Albert
Now gay didn't mean what it means today then remember that OK?
BOB: Right ok let's go.
SAM: I guess that was OK.
BOB: Perfect.

Sam Albert
When I started at IBM there was a dress code, that was an informal oral code of white shirts. You couldn't wear anything but a white shirt, generally with a starched collar. I remember attending my first class, and a gentleman said to me as we were entering the building, are you an IBMer, and I said yes. He had a three piece suit on, vests were of the vogue, and he said could you just lift your pants leg please. I said what, and before I knew it he had lifted my pants leg and he said you're not wearing any garters. I said what?! He said your socks, they're not pulled tight to the top, you need garters. And sure enough I had to go get garters.

IBM is like Switzerland -- conservative, a little dull, yet prosperous. It has committees to verify each decision. The safety net is so big that it is hard to make a bad decision - or any decision at all. Rich Seidner, computer programmer and wannabe Paul Simon, spent twenty-five years marching in lockstep at IBM. He feels better now.

Rich Seidner
Former IBM Programmer
I mean it's like getting four hundred thousand people to agree what they want to have for lunch. You know, I mean it's just not going to happen - it's going to be lowest common denominator you know, it's going to be you know hot dogs and beans. So ahm so what are you going to do? So IBM had created this process and it absolutely made sure that quality would be preserved throughout the process, that you actually were doing what you set out to do and what you thought the customer wanted. At one point somebody kind of looked at the process to see well, you know, what's it doing and what's the overhead built into it, what they found is that it would take at least nine months to ship an empty box.

By the late seventies, even IBM had begun to notice the explosive growth of personal computer companies like Apple.

Commercial
The Apple 2 - small inexpensive and simple to use the first computer.....

What's more, it was a computer business they didn't control. In 1980, IBM decided they wanted a piece of this action.

Jack Sams
Former IBM Executive
There were suddenly tens of thousands of people buying machines of that class and they loved them. They were very happy with them and they were showing up in the engineering departments of our clients as machines that were brought in because you can't do the job on your mainframe kind of thing.

Commercial
JB wanted to know why I'm doing better than all the other managers...it's no secret...I have an Apple - sure there's a big computer three flights down but it won't test my options, do my charts or edit my reports like my Apple.

Jack Sams
The people who had gotten it were religious fanatics about them. So the concern was we were losing the hearts and minds and give me a machine to win back the hearts and minds.

In business, as in comedy, timing is everything, and time looked like it might be running out for an IBM PC. I'm visiting an IBMer who took up the challenge. In August 1979, as IBM's top management met to discuss their PC crisis, Bill Lowe ran a small lab in Boca Raton Florida.

Bill Lowe
Hello Bob nice to see you.
BOB: Nice to see you again. I tried to match the IBM dress code how did I do?
BILL: That's terrific, that's terrific.

He knew the company was in a quandary. Wait another year and the PC industry would be too big even for IBM to take on. Chairman Frank Carey turned to the department heads and said HELP!!!

Bill Lowe
Head, IBM IBM PC Development Team 1980
He kind of said well, what should we do, and I said well, we think we know what we would like to do if we were going to proceed with our own product and he said no, he said at IBM it would take four years and three hundred people to do anything, I mean it's just a fact of life. And I said no sir, we can provide you with product in a year. And he abruptly ended the meeting, he said you're on Lowe, come back in two weeks and tell me what you need.

An IBM product in a year! Ridiculous! Down in the basement Bill still has the plan. To save time, instead of building a computer from scratch, they would buy components off the shelf and assemble them -- what in IBM speak was called 'open architecture.' IBM never did this. Two weeks later Bill proposed his heresy to the Chairman.

Bill Lowe
And frankly this is it. The key decisions were to go with an open architecture, non IBM technology, non IBM software, non IBM sales and non IBM service. And we probably spent a full half of the presentation carrying the corporate management committee into this concept. Because this was a new concept for IBM at that point.
BOB: Was it a hard sell?
BILL: Mr. Carey bought it. And as result of him buying it, we got through it.

With the backing of the chairman, Bill and his team then set out to break all the IBM rules and go for a record.

Bill Lowe
We'll put it in the IBM section.

Once IBM decided to do a personal computer and to do it in a year - they couldn't really design anything, they just had to slap it together, so that's what we'll do. You have a central processing unit and eh let's see you need a monitor or display and a keyboard. OK a PC, except it's not, there's something missing.

Time for the Cringely crash course in elementary computing. A PC is a boxful of electronic switches, a piece of hardware. It's useless until you tell it what to do. It requires a program of instructions...that's software. Every PC requires at least two essential bits of software in order to work at all. First it requires a computer language. That's what you type in to give instructions to the computer. To tell it what to do. Remember it was a computer language called BASIC that Paul Allen and Bill Gates adapted to the Altair...the first PC. The other bit of software that's required is called an operating system and that's the internal traffic cop that tells the computer itself how the keyboard is connected to the screen or how to store files on a floppy disk instead of just losing them when you turn off the PC at the end of the day. Operating systems tend to have boring unfriendly names like UNIX and CPM and MS-DOS but though they may be boring it's an operating system that made Bill Gates the richest man in the world. And the story of how that came about is, well, pretty interesting.

So the contest begins. Who would IBM buy their software from? Let's meet the two contenders -- the late Gary Kildall, then aged 39, a computer Ph.D., and a 24 year old Harvard drop-out - Bill Gates. By the time IBM came calling in 1980, Bill Gates and his small company Microsoft were the biggest supplier of computer languages in the fledgling PC industry.

Commercial
'Many different computer manufacturers are making the CPM Operating System standard on most models.'

For their operating system, though, the logical guy for the IBMers to see was Gary Kildall. He ran a company modestly called Intergalactic Digital Research. Gary had invented the PC's first operating system called CP/M. He had already sold 600,000 of them, so he was the big cheese of operating systems.

Gary Kildall
Founder Digital Research
Speaking in 1983
In the early 70s I had a need for an operating system myself and eh it was a very natural thing to write and it turns out other people had a need for an operating system like that and so eh it was a very natural thing I wrote it for my own use and then started selling it.

Gordon Eubanks
In Gary's mind it was the dominant thing and it would always be the dominant. Of course, Bill did languages and Gary did operating systems, and he really honestly believed that would never change.

But what would change the balance of power in this young industry was the characters of the two protagonists.

Jim Warren
Founder West Coast Computer Faire 1978
So I knew Gary back when he was an assistant professor at Monterrey Post Grad School and I was simply a grad student. And went down, sat in his hot tub, smoked dope with him and thoroughly enjoyed it all, and commiserated and talked nerd stuff. He liked playing with gadgets, just like Woz did and does, just like I did and do.

Gordon Eubanks
He wasn't really interested in how you drive the business, he worked on projects, things that interested him.

Jim Warren
He didn't go rushing off to the patent office and patent CPM and patent every line of code he could, he didn't try to just squeeze the last dollar out of it.

Gordon Eubanks
Gary was not a fighter, Gary avoided conflict, Gary hated conflict. Bill I don't think anyone could say backed away from conflict.

Nobody said future billionaires have to be nice guys. Here, at the Microsoft Museum, is a shrine to Bill's legacy. Bill Gates hardly fought his way up from the gutter. Raised in a prosperous Seattle household, his mother a homemaker who did charity work, his father was a successful lawyer. But beneath the affluence and comfort of a perfect American family, a competitive spirit ran deep.

Vern Raburn
President, The Paul Allen Group
I ended up spending Memorial Day Weekend with him out at his grandmother's house on Hood Canal. She turned everything in to a game. It was a very very very competitive environment, and if you spent the weekend there, you were part of the competition, and it didn't matter whether it was hearts or pickleball or swimming to the dock. And you know and there was always a reward for winning and there was always a penalty for losing.

Christine Comaford
CEO Corporate Computing Intl.
One time, it was funny. I went to Bill's house and he really wanted to show me his jigsaw puzzle that he was working on, and he really wanted to talk about how he did this jigsaw puzzle in like four minutes, and like on the box it said, if you're a genius you will do the jigsaw puzzle in like seven. And he was into it. He was like I can do it. And I said don't, you know, I believe you. You don't need to break it up and do it for me. You know.

Bill Gates can be so focused that the small things in life get overlooked.

Jean Richardson
Former VP, Corporate Comms, Microsoft
If he was busy he didn't bathe, he didn't change clothes. We were in New York and the demo that we had crashed the evening before the announcement, and Bill worked all night with some other engineers to fix it. Well it didn't occur to him to take ten minutes for a shower after that, it just didn't occur to him that that was important, and he badly needed a shower that day.

The scene is set in California...laid back Gary Kildall already making the best selling PC operating system CPM. In Seattle Bill Gates maker of BASIC the best selling PC language but always prepared to seize an opportunity. So IBM had to choose one of these guys to write the operating system for its new personal computer. One would hit the jackpot the other would be forgotten...a footnote in the history of the personal computer and it all starts with a telephone call to an eighth floor office in that building the headquarters of Microsoft in 1980.

Jack Sams
At about noon I guess I called Bill Gates on Monday and said I would like to come out and talk with him about his products.

Steve Ballmer
Vice-President Microsoft
Bill said well, how's next week, and they said we're on an airplane, we're leaving in an hour, we'd like to be there tomorrow. Well, hallelujah. Right oh.

Steve Ballmer was a Harvard roommate of Gates. He'd just joined Microsoft and would end up its third billionaire. Back then he was the only guy in the company with business training. Both Ballmer and Gates instantly saw the importance of the IBM visit.

Bill Gates
You know IBM was the dominant force in computing. A lot of these computer fairs discussions would get around to, you know, I.. most people thought the big computer companies wouldn't recognise the small computers, and it might be their downfall. But now to have one of the big computer companies coming in and saying at least the - the people who were visiting with us that they were going to invest in it, that - that was er, amazing.

Steve Ballmer
And Bill said Steve, you'd better come to the meeting, you're the only other guy here who can wear a suit. So we figure the two of us will put on suits, we'll put on suits and we'll go to this meeting.

Jack Sams
We got there at roughly two o'clock and we were waiting in the front, and this young fella came out to take us back to Mr. Gates office. I thought he was the office boy, and of course it was Bill. He was quite decisive, we popped out the non-disclosure agreement - the letter that said he wouldn't tell anybody we were there and that we wouldn't hear any secrets and so forth. He signed it immediately.

Bill Gates
IBM didn't make it easy. You had to sign all these funny agreements that sort of said I...IBM could do whatever they wanted, whenever they wanted, and use your secrets however they - they felt. But so it took a little bit of faith.

Jack Sams was looking for a package from Microsoft containing both the BASIC computer language and an Operating System. But IBM hadn't done their homework.

Steve Ballmer
They thought we had an operating system. Because we had this Soft Card product that had CPM on it, they thought we could licence them CPM for this new personal computer they told us they wanted to do, and we said well, no, we're not in that business.

Jack Sams
When we discovered we didn't have - he didn't have the rights to do that and that it was not...he said but I think it's ready, I think that Gary's got it ready to go. So I said well, there's no time like the present, call up Gary.

Steve Ballmer
And so Bill right there with them in the room called Gary Kildall at Digital Research and said Gary, I'm sending some guys down. They're going to be on the phone. Treat them right, they're important guys.

The men from IBM came to this Victorian House in Pacific Grove California, headquarters of Digital Research, headed by Gary and Dorothy Kildall. Just imagine what it's like having IBM come to visit - it's like having the Queen drop by for tea, it's like having the Pope come by looking for advice, it's like a visit from God himself. And what did Gary and Dorothy do? They sent them away.

Jack Sams
Gary had some other plans and so he said well, Dorothy will see you. So we went down the three of us...
Gordon Eubanks
Former Head of Language Division, Digital Research
IBM showed up with an IBM non-disclosure and Dorothy made what I...a decision which I think it's easy in retrospect to say was dumb.

Jack Sams
We popped out our letter that said please don't tell anybody we're here, and we don't want to hear anything confidential. And she read it and said and I can't sign this.

Gordon Eubanks
She did what her job was, she got the lawyer to look at the nondisclosure. The lawyer, Gerry Davis who's still in Monterey threw up on this non-disclosure. It was uncomfortable for IBM, they weren't used to waiting. And it was an unfortunate situation - here you are in a tiny Victorian House, it's overrun with people, chaotic.

Jack Sams
So we spent the whole day in Pacific Grove debating with them and with our attorneys and her attorneys and everybody else about whether or not she could even talk to us about talking to us, and we left.

This is the moment Digital Research dropped the ball. IBM, distinctly unimpressed with their reception, went back to Microsoft.

BOB: It seems to me that Digital Research really screwed up.
STEVE BALLMER: I think so - I think that's spot on. They made a big mistake. We referred IBM to them and they failed to execute.

Bill Gates isn't the man to give a rival a second chance. He saw the opportunity of a lifetime.

Bill Gates
Digital research didn't seize that, and we knew it was essential, if somebody didn't do it, the project was going to fall apart.

Steve Ballmer
We just got carried away and said look, we can't afford to lose the language business. That was the initial thought - we can't afford to have IBM not go forward. This is the most exciting thing that's going to happen in PCs.

Bill Gates
And we were already out on a limb, because we had licensed them not only Basic, but Fortran, Cobol Assembler er, typing tutor and Venture. And basically every - every product the company had we had committed to do for IBM in a very short time frame.

But there was a problem. IBM needed an operating system fast and Microsoft didn't have one. What they had was a stroke of luck - the ingredient everyone needs to be a billionaire. Unbelievably, the solution was just across town. Paul Allen, Gates's programming partner since high school, had found another operating system.

Paul Allen
There's a local company here in Seattle called Seattle Computer Products by a guy named Tim Patterson and he had done an operating system a very rudimentary operating system that was kind of like CPM.

Steve Ballmer
And we just told IBM look, we'll go and get this operating system from this small local company, we'll take care of it, we'll fix it up, and you can still do a PC.

Tim Patterson's operating system, which saved the deal with IBM, was, well, adapted from Gary Kildall's CPM.

Tim Patterson
Programmer
So I took a CPM manual that I'd gotten from the Retail Computer Store five dollars in 1976 or something, and used that as the basis for what would be the application program interface, the API for my operating system. And so using these ideas that came from different places I started in April and it was about half time for four months before I had my first working version.

This is it, the operating system Tim Patterson wrote. He called it QDOS, the quick and dirty operating system. Microsoft and IBM called it PC DOS 1.0 and under any name it looks an awful lot like CPM. On this computer here I have running a PC DOS and CPM 86 and frankly it's very hard to tell the difference between the two. The command structures are the same, so are the directories, in fact the only obvious external difference is the floppy drive is labelled A in PC DOS and C in CPM. Some difference and yet one generated billions in revenue and the other disappeared. As usual in the PC business the prize didn't go to the inventor but to the exploiter of the invention. In this case that wasn't Gary Kildall it wasn't even Tim Patterson.

There was still one problem. Tim Patterson worked for Seattle Computer Products, or SCP. They still owned the rights to QDOS - rights that Microsoft had to have.

Vern Raburn
Former Vice-President Microsoft
But then we went back and said to them look, you know, we want to buy this thing, and SCP was like most little companies, you know. They always needed cash and so that was when they went in to the negotiation.

Paul Allen
And so ended up working out a deal to buy the operating system from him for whatever usage we wanted for fifty thousand dollars.

Hey, let's pause there. To savour an historic moment.

Paul Allen
For whatever usage we wanted for fifty thousand dollars.

It had to be the deal of the century if not the millennium it was certainly the deal that made Bill Gates and Paul Allen multi billionaires and allowed Paul Allen to buy toys like these, his own NBA basketball team and arena. Microsoft bought outright for fifty thousand dollars the operating system they needed and they turned around and licensed it to the world for up to fifty dollars per PC. Think of it - one hundred million personal computers running MS DOS software funnelling billions into Microsoft - a company that back then was fifty kids managed by a twenty-five year old who needed to wash his hair. Nice work if you can get it and Microsoft got it. There are no two places further apart in the USA than south eastern Florida and Washington State where Microsoft is based. This - this is Florida, Boca Raton and this building right here is where the IBM PC was developed. Here the nerds from Seattle joined forces with the suits of corporate and in that first honeymoon year they pulled off a fantastic achievement.

Dan Bricklin
After we got a package in the mail from the people down in Florida...

As August 1981 approached, the deadline for the launch of the IBM Acorn, the PC industry held its breath.

Dan Bricklin
Supposedly, maybe at this very moment eh, IBM is announcing the personal computer. We don't know that yet.

Software writers like Dan Bricklin, the creator of the first spreadsheet VisiCalc waited by the phones for news of the announcement. This is a moment of PC history. IBM secrecy had codenamed the PC 'The Floridian Project.' Everyone in the PC business knew IBM would change their world forever. They also knew that if their software was on the IBM PC, they would make fortunes.

Dan Bricklin
Please note that the attached information is not to be disclosed prior to any public announcement. (It's on the ticker) It's on the ticker OK so now you can tell people.

What we're watching are the first few seconds of a $100 billion industry.

Promo
After years of thinking big today IBM came up with something small. Big Blue is looking for a slice of Apple's market share. Bits and Bytes mean nothing try this one. Now they're going to sell $1,000 computers to millions of customers. I have seen the future said one analyst and it computes.

Commercial
Today an IBM computer has reached a personal......

Nobody was ever fired for buying IBM. Now companies could put PCs with the name they trusted on desks from Wisconsin to Wall Street.

Bob Metcalfe
Founder 3COM
When the IBM PC came and the PC became a serious business tool, a lot of them, the first of them went into those buildings over there and that was the real ehm when the PC industry started taking off, it happened there too.

Commercial
Can learn to use it with ease...

Sparky Sparks
Former IBM Executive
What IBM said was it's okay corporate America for you to now start buying and using PCs. And if it's okay for corporate America, it's got to be okay for everybody.

For all the hype, the IBM PC wasn't much better than what came before. So while the IBM name could create immense demand, it took a killer application to sustain it. The killer app for the IBM PC was yet another spreadsheet. Based on VisiCalc, but called Lotus 1-2-3, its creators were the first of many to get rich on IBM's success. Within a year Lotus was worth $150 million bucks. Wham! Bam! Thank you IBM!

Commercial
Time to rock time for code...

IBM had forecast sales of half a million computers by 1984. In those 3 years, they sold 2 million.

Jack Sams
Euphoric I guess is the right word. Everybody believed that they were not going to... At that point two million or three million, you know, they were now thinking in terms of a hundred million and they were probably off the scale in the other direction.

What did all this mean to Bill Gates, whose operating system, DOS, was at the heart of every IBM PC sold? Initially, not much, because of the deal with IBM. But it did give him a vital bridgehead to other players in the PC marketplace, which meant trouble in the long run for Big Blue.

Bill Gates
The key to our...the structure of our deal was that IBM had no control over...over our licensing to other people. In a lesson on the computer industry in mainframes was that er, over time, people built compatible machines or clones, whatever term you want to use, and so really, the primary upside on the deal we had with IBM, because they had a fixed fee er, we got about $80,000 - we got some other money for some special work we did er, but no royalty from them. And that's the DOS and Basic as well. And so we were hoping a lot of other people would come along and do compatible machines. We were expecting that that would happen because we knew Intel wanted to vend the chip to a lot more than just than just IBM and so it was great when people did start showing up and ehm having an interest in the licence.

IBM now had fifty per cent market share and was defining what a PC meant. There were other PCs that were sorta like the IBM PC, kinda like it. But what the public wanted was IBM PCs. So to be successful other manufacturers would have to build computers exactly like the IBM. They wanted to copy the IBM PC, to clone it. How could they do that legally, well welcome to the world of reverse engineering. This is what reverse engineering can get you if you do it right. It's the modest Aspen, Colorado ski shack of Rod Canion, one of the founders of Compaq, the company set up to compete head-on with the IBM PC. Back in 1982, Rod and three fellow engineers from Texas Instruments sketched out a computer design on a place mat at the House of Pies restaurant in Houston, Texas. They decided to manufacture and market a portable version of the IBM PC using the curious technique of reverse engineering.

Rod Canion
Co-founder Compaq
Reverse engineering is figuring out after something has already been created how it ticks, what makes it work, usually for the purpose of creating something that works the same way or at least does something like the thing you're trying to reverse engineer.

Here's how you clone a PC. IBM had made it easy to copy. The microprocessor was available off the shelf from Intel and the other parts came from many sources. Only one part was IBM's alone, a vital chip that connected the hardware with the software. Called the ROM-BIOS, this was IBM's own design, protected by copyright and Big Blue's army of lawyers. Compaq had to somehow copy the chip without breaking the law.

Rod Canion
First you have to decide how the ROM works, so what we had to do was have an engineer sit down with that code and through trial and error write a specification that said here's how the BIOS ROM needs to work. It couldn't be close it had to be exact so there was a lot of detailed testing that went on.

You test how that all-important chip behaves, and make a list of what it has to do - now it's time to meet my lawyer, Claude.

Claude Stern
Silicon Valley Attorney
BOB: I've examined the internals of the ROM BIOS and written this book of specifications now I need some help because I've done as much as I can do, and you need to explain what's next.
CLAUDE: Well,the first thing I'm going to do is I'm going to go through the book of specifications myself, but the first thing I can tell you Robert is that you're out of it now. You are contaminated, you are dirty. You've seen the product that's the original work of authorship, you've seen the target product, so now from here on in we're going to be working with people who are not dirty. We're going to be working with so called virgins, who are going to be operating in the clean room.
BOB: I certainly don't qualify there.
CLAUDE: I imagine you don't. So what we're going to do is this. We're going to hire a group of engineers who have never seen the IBM ROM BIOS. They have never seen it, they have never operated it, they know nothing about it.

Claude interrogates Mark
CLAUDE: Have you ever before attempted to disassemble decompile or to in any way shape or form reverse engineer any IBM equipment?
MARK: Oh no.
CLAUDE: And have you ever tried to disassemble....

This is the Silicon Valley virginity test. And good virgins are hard to find.

CLAUDE: You understand that in the event that we discover that the information you are providing us is inaccurate you are subject to discipline by the company and that can include but not limited to termination immediately do you understand that?
MARK: Yes I do.
CLAUDE: OK.

After the virgins are deemed intact, they are forbidden contact with the outside world while they build a new chip -- one that behaves exactly like the one in the specifications. In Compaq's case, it took 15 senior programmers several months and cost $1 million to do the reverse engineering. In November 1982, Rod Canion unveiled the result.

Bill Murto
What I�ve brought today is a Compaq portable computer.

When Bill Murto, another Compaq founder, got a plug on a cable TV show, their selling point was clear: 100 percent IBM compatibility.

Bill Murto
It turns out that all major popular software runs on the IBM personal computer or the Compaq portable computer.
Q: That extends through all software written for IBM?
A: Eh Yes.
Q: It all works on the Compaq?

The Compaq was an instant hit. In their first year, on the strength of being exactly like IBM but a little cheaper, they sold 47,000 PCs.

Rod Canion
In our first year of sales we set an American business record. I guess maybe a world business record. Largest first year sales in history. It was a hundred and eleven million dollars.

So Rod Canion ends up in Aspen, famous for having the most expensive real estate in America and I try not to look envious while Rod tells me which executive jet he plans to buy next.
ROD: And finally I picked the Lear 31.
BOB: Oh really?
ROD: Now that was a fun airplane.
BOB: Oh yeh.

Poor Big Blue! Suddenly everybody was cashing in on IBM's success. The most obvious winner at first was Intel, maker of the PCs microprocessor chip. Intel was selling chips like hotcakes to clonemakers -- and making them smaller, quicker and cheaper. This was unheard of! What kind of an industry had Big Blue gotten themselves into?

Jim Cannavino
Former Head, IBM PC Division
Things get less expensive every year. People aren't used to that in general. I mean, you buy a new car, you buy one now and four years later you go and buy one it costs more than the one you bought before. Here is this magical piece of an industry - you go buy one later it costs less and it does more. What a wonderful thing. But it causes some funny things to occur when you think about an industry. An industry where prices are coming down, where you have to sell it and use it right now, because if you wait later it's worth less.

Where Compaq led, others soon followed. IBM was now facing dozens of rivals - soon to be familiar names began to appear, like AST, Northgate and Dell. It was getting spectacularly easy to build a clone. You could get everything off the shelf, including a guaranteed-virgin ROM BIOS chip. Every Tom, Dick & Bob could now make an IBM compatible PC and take another bite out of Big Blue's business. OK we're at Dominos Computers at Los Altos California, Silicon Valley and this is Yukio and we're going to set up the Bob and Yukio Personal Computer Company making IBM PC clones. You're the expert, I of course brought all the money so what is it that we're going to do.

Yukio
OK first of all we need a motherboard.
BOB: What's a motherboard?
YUKIO: That's where the CPU is set in...that's the central processor unit.
BOB: OK.
YUKIO: In fact I have one right here. OK so this is the video board...
BOB: That drives the monitor.
YUKIO: Right.
BOB: Terror?
BILL LOWE: Oh, of course. I mean we were able to sell a lot of products but it was getting difficult to make money.
YUKIO: And this is the controller card which would control the hard drive and the floppy drive.
BOB: OK.

Rod Canion
And the way we did it was by having low overhead. IBM had low cost of product but a lot of overhead - they were a very big company.

YUKIO: Right this is a high density recorder.
BOB: So this is a hard disk drive.

Rod Canion
And by keeping our overhead low even though our margins were low we were able to make a profit.

YUKIO: OK I have one right here.
BOB: Hey...OK we have a keyboard which plugs in right over here.
YUKIO: Right...
BOB: People build them themselves - how long does it take?
YUKIO: About an hour.
BOB: About an hour.

And where did every two-bit clone-maker buy his operating system? Microsoft, of course. IBM never imagined Bill Gates would sell DOS to anyone else. Who was there? But by the mid 80's it was boom time for Bill. The teenage entrepreneur had predicted a PC on every desk and in every home, running Microsoft software. It was actually coming true. As Microsoft mushroomed there was no way that Bill Gates could personally dominate thousands of employees but that didn't stop him. He still had a need to be both industry titan and top programmer. So he had to come up with a whole new corporate culture for Microsoft. He had to find a way to satisfy both his adolescent need to dominate and his adult need to inspire. The average Microsoftee is male and about 25. When he's not working, well he's always working. All his friends are Microsoft programmers too. He has no life outside the office but all the sodas are free. From the beginning, Microsoft recruited straight out of college. They chose people who had no experience of life in other companies. In time they'd be called Microserfs.

Charles Simonyi
Chief Programmer, Microsoft
It was easier to to to create a new culture with people who are fresh out of school rather than people who came from, from from eh other companies and and and other cultures. You can rely on it you can predict it you can measure it you can optimise it you can make a machine out of it.

Christine Comaford
I mean everyone like lived together, ate together dated each other you know. Went to the movies together it was just you know very much a it was like a frat or a dorm.

Steve Ballmer
Everybody's just push push push - is it right, is it right, do we have it right keep on it - no that's not right ugh and you're very frank about that - you loved it and it wasn't very formal and hierarchical because you were just so desirous to do the right thing and get it right. Why - it reflects Bill's personality.

Jean Richardson
And so a lot of young, I say people, but mostly it was young men, who just were out of school saw him as this incredible role model or leader, almost a guru I guess. And they could spend hours with him and he valued their contributions and there was just a wonderful camaraderie that seemed to exist between all these young men and Bill, and this strength that he has and his will and his desire to be the best and to be the winner - he is just like a cult leader, really.

As the frenzied 80's came to a close IBM reached a watershed - they had created an open PC architecture that anyone could copy. This was intentional but IBM always thought their inside track would keep them ahead - wrong. IBM's glacial pace and high overhead put them at a disadvantage to the leaner clone makers - everything was turning into a nightmare as IBM lost its dominant market share. So in a big gamble they staked their PC future on a new system, a new line of computers with proprietary closed hardware and their very own operating system. It was war.

Presentation
Start planning for operating system 2 today.

IBM planned to steal the market from Gates with a brand new operating system, called - drum roll please - OS/2. IBM would design OS/2. Yet they asked Microsoft to write the code. Why would Microsoft help create what was intended to be the instrument of their own destruction? Because Microsoft knew IBM was the source of their success and they would tolerate almost anything to stay close to Big Blue.

Steve Ballmer
It was just part of, as we used to call it, the time riding the bear. You just had to try to stay on the bear's back and the bear would twist and turn and try to buck you and throw you, but darn, we were going to ride the bear because the bear was the biggest, the most important you just had to be with the bear, otherwise you would be under the bear in the computer industry, and IBM was the bear, and we were going to ride the back of the bear.

Bill Gates
It's easy for people to forget how pervasive IBM's influence over this industry was. When you talked to people who've come in to the industry recently there's no way you can get that in to their - in to their head, that was the environment.

The relationship between IBM and Microsoft was always a culture clash. IBMers were buttoned-up organization men. Microsoftees were obsessive hackers. With the development of OS/2 the strains really began to show.

Steve Ballmer
In IBM there's a religion in software that says you have to count K-LOCs, and a K-LOC is a thousand line of code. How big a project is it? Oh, it's sort of a 10K-LOC project. This is a 20K-LOCer. And this is 5OK-LOCs. And IBM wanted to sort of make it the religion about how we got paid. How much money we made off OS 2, how much they did. How many K-LOCs did you do? And we kept trying to convince them - hey, if we have - a developer's got a good idea and he can get something done in 4K-LOCs instead of 20K-LOCs, should we make less money? Because he's made something smaller and faster, less KLOC. K-LOCs, K-LOCs, that's the methodology. Ugh anyway, that always makes my back just crinkle up at the thought of the whole thing.

Jim Cannavino
When I took over in '89 there was an enormous amount of resources working on OS 2, both in Microsoft and the IBM company. Bill Gates and I met on that several times. And we pretty quickly came to the conclusion together that that was not going to be a success, the way it was being managed. It was also pretty clear that the negotiating and the contracts had given most of that control to Microsoft.

It was no longer just a question of styles. There was now a clear conflict of business interest. OS/2 was planned to undermine the clone market, where DOS was still Microsoft's major money-maker. Microsoft was DOS. But Microsoft was helping develop the opposition? Bad idea. To keep DOS competitive, Gates had been pouring resources into a new programme called Windows. It was designed to provide a nice user-friendly facade to boring old DOS. Selling it was another job for shy, retiring Steve Ballmer.

Steve Ballmer (Commercial)
How much do you think this advanced operating environment is worth - wait just one minute before you answer - watch as Windows integrates Lotus 1, 2, 3 with Miami Vice. Now we can take this...

Just as Bill Gates saw OS/2 as a threat, IBM regarded Windows as another attempt by Microsoft to hold on to the operating system business.

Bill Gates
We created Windows in parallel. We kept saying to IBM, hey, Windows is the way to go, graphics is the way to go, and we got virtually everyone else, enthused about Windows. So that was a divergence that we kept thinking we could get IBM to - to come around on.

Jim Cannavino
It was clear that IBM had a different vision of its relationship with Microsoft than Microsoft had of its vision with IBM. Was that Microsoft's fault? You know, maybe some, but IBM's not blameless there either. So I don't view any of that as anything but just poor business on IBM's part.

Bill Gates is a very disciplined guy. He puts aside everything he wants to read and twice a year goes away for secluded reading weeks - the decisive moment in the Microsoft/IBM relationship came during just such a retreat. In front of a log fire Bill concluded that it was no longer in Microsoft's long term interests to blindly follow IBM. If Bill had to choose between OS/2, IBM's new operating system, and Windows, he'd choose Windows.

Steve Ballmer
We said ooh, IBM's probably not going to like this. This is going to threaten OS 2. Now we told them about it, right away we told them about it, but we still did it. They didn't like it, we told em about it, we told em about it, we offered to licence it to em.

Bill Gates
We always thought the best thing to do is to try and combine IBM promoting the software with us doing the engineering. And so it was only when they broke off communication and decided to go their own way that we thought, okay, we're on our own, and that was definitely very, very scary.

Steve Ballmer
We were in a major negotiation in early 1990, right before the Windows launch. We wanted to have IBM on stage with us to launch Windows 3.0, but they wouldn't do the kind of deal that would allow us to profit - it would allow them essentially to take over Windows from us, and we walked away from the deal.

Jack Sams, who started IBM's relationship with Microsoft with that first call to Bill Gates in 1980, could only look on as the partnership disintegrated.

Jack Sams
Then they at that point I think they agreed to disagree on the future progress of OS 2 and Windows. And internally we were told thou shalt not ship any more products on Windows. And about that time I got the opportunity to take early retirement so I did.

Bill's decision by the fireplace ended the ten year IBM/Microsoft partnership and turned IBM into an also-ran in the PC business. Did David beat Goliath? The Boca Raton, Florida birthplace of the IBM PC is deserted - a casualty of diminishing market share. Today, IBM is again what it was before - a profitable, dominant mainframe computer company. For a while IBM dominated the PC market. They legitimised the PC business, created the standards most of us now use, and introduced the PC to the corporate world. But in the end they lost out. Maybe it was to a faster, more flexible business culture. Or maybe they just threw it away. That's the view of a guy who's been competing with IBM for 20 years, Silicon Valley's most outspoken software billionaire, Larry Ellison.

Larry Ellison
Founder, Oracle
I think IBM made the single worst mistake in the history of enterprise on earth.
Q: Which was?
LARRY: Which was the manufacture - being the first manufacturer and distributor of the Microsoft/Intel PC which they mistakenly called the IBM PC. I mean they were the first manufacturer and distributor of that technology I mean it's just simply astounding that they could ah basically give a third of their market value to Intel and a third of their market value to Microsoft by accident - I mean no-one, no-one I mean those two companies today are worth close to you know approaching a hundred billion dollars I mean not many of us get a chance to make a $100 billion mistake.

As fast as IBM abandons its buildings, Microsoft builds new ones. In 1980 IBM was 3000 times the size of Microsoft. Though still a smaller company, today Wall Street says Microsoft is worth more. Both have faced anti-trust investigations about their monopoly positions. For years IBM defined successful American corporate culture - as a machine of ordered bureaucracy. Here in the corridors of Microsoft it's a different style, it's personal. This company - in its drive, its hunger to succeed - is a reflection of one man, its founder, Bill Gates.

Jean Richardson
Bill wanted to win. Incredible desire to win and to beat other people. At Microsoft we, the whole idea was that we would put people under, you know. Unfortunately that's happened a lot.

Esther Dyson
Computer Industry Analyst
Bill Gates is special. You wouldn't have had a Microsoft with take a random other person like Gary Kildall. On the other hand, Bill Gates was also lucky. But Bill Gates knows that, unlike a lot of other people in the industry, and he's paranoid. Every morning he gets up and he doesn't feel secure, he feels nervous about this. They're trying hard, they're not relaxing, and that's why they're so successful.

Christine Comaford
And I remember, I was talking to Bill once and I asked him what he feared, and he said that he feared growing old because you know, once you're beyond thirty, this was his belief at the time, you know once you're beyond thirty, you know, you don't have as many good ideas anymore. You're not as smart anymore.

Bill Gates
If you just slow down a little bit who knows who it'll be, probably some company that may not even exist yet, but eh someone else can come in and take the lead.

Christine Comaford
And I said well, you know, you're going to age, it's going to happen, it's kind of inevitable, what are you going to do about it? And he said I'm just going to hire the smartest people and I'm going to surround myself with all these smart people, you know. And I thought that was kind of interesting. It was almost - it was like he was like oh, I can't be immortal, but like maybe this is the second best and I can buy that, you know.

Bill Gates
If you miss what's happening then the same kind of thing that happened to IBM or many other companies could happen to Microsoft very easily. So no-one's got a guaranteed position in the high technology business, and the more you think about, you know, how could we move faster, what could we do better, are there good ideas out there that we should be going beyond, it's important. And I wouldn't trade places with anyone, but the reason I like my job so much is that we have to constantly stay on top of those things.

The Windows software system that ended the alliance between Microsoft and IBM pushed Gates past all his rivals. Microsoft had been working on the software for years, but it wasn't until 1990 that they finally came up with a version that not only worked properly, it blew their rivals away. And where did the idea for this software come from? Well, not from Microsoft, of course. It came from the hippies at Apple. Lights! Camera! Boot up! In 1984, they made a famous TV commercial. Apple had set out to create the first user friendly PC just as IBM and Microsoft were starting to make a machine for businesses. When the TV commercial aired, Apple launched the Macintosh.

Commercial
Glorious anniversary of the information...

The computer and the commercial were aimed directly at IBM - which the kids in Cupertino thought of as Big Brother. But Apple had targeted the wrong people. It wasn't Big Brother they should have been worrying about, it was big Bill Gates.

Commercial
We are one people....

To find out why, join me for the concluding episode of Triumph of the Nerds.

Commercial
...........we shall prevail.

Killexams : How to be an AI & ML expert: A Webinar with Cloud Architect Subhendu Dey

Summary

The webinar outlined what AI and ML mean in today’s world and how students could get involved

Mr Subhendu Dey also laid out a comprehensive roadmap for those looking to start a career in AI and ML

Artificial Intelligence and Machine Learning as disciplines have taken the world by storm, particularly in the 21st century. While many youngsters have drawn inspiration from some of the best science fiction featuring AI and robots, the actual world of AI and ML has been growing by leaps and bounds. But what does the world of AI and ML have to offer? How can you transition from campus to career with AI & ML? And how can you be an expert in AI & ML? To answer these and many other questions, The Telegraph Online Edugraph organised a webinar with Subhendu Dey, a Cloud Architect and advisor on Data and AI.

The webinar saw participants from class 8 right up to those in advanced degrees, as well as teachers. Hence, the subject matter of the webinar contained takeaways that would be relevant at all stages. Mr Dey also highlighted that he would be focusing on showing how things that have always existed around us contribute to AI - giving students a more intuitive idea of AI and making it more interesting.

The webinar started by taking a look at a simple action like sending a text. As we type, our phones keep suggesting words, whether we have typed just a few letters or a few words, and the suggestions are surprisingly accurate. This is called Language Modelling, and it requires an intuitive understanding of language. A human may be able to do it from his or her extensive knowledge of words and language, but in this case it is a fine demonstration of the intuitiveness of AI.
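To make the idea concrete, here is a toy sketch (our illustration, not from the webinar) of how next-word suggestion can work: count which word follows which in a training corpus, then suggest the most frequent followers. Real keyboards use far larger statistical or neural language models, but the principle is the same.

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count, for each word, which words follow it in the corpus."""
    model = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1
    return model

def suggest(model, word, k=3):
    """Return up to k most likely next words after `word`."""
    return [w for w, _ in model[word.lower()].most_common(k)]

corpus = [
    "I am going home",
    "I am going out",
    "I am happy",
]
model = train_bigrams(corpus)
print(suggest(model, "am"))  # ['going', 'happy'] - 'going' follows 'am' most often
```

The suggestions get better simply by counting over more text, which is why the predictions on a phone improve as the model sees more of what people actually type.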

Let’s look at another aspect of AI - keying a question into the Google search bar. A decade or so ago, Google would have analysed the keywords and thrown up a list of links that feature them. Fast-forward to this decade, and Natural Language search is capable of not just reading the keywords but also finding out the intent behind the query. This means that Google will, in addition to giving you the links, also give you the answer, as well as other questions that have the same or related intent. In fact, Google also has a system for taking feedback, which helps the Google AI learn to be even more intuitive and better at giving suggestions.

One need only look at the digital assistant - Siri, Google Assistant or Alexa - to understand the advancements in AI. From understanding spoken queries to giving intuitive, and often very witty, answers, these assistants communicate in a surprisingly human-like manner. Of course, there is a cycle of tasks that they must perform behind the scenes, which Mr Dey spoke about in detail.

While these changes that we can observe are new, AI has been around for a long time now. One of the earliest feats came in 1997, when the IBM supercomputer Deep Blue beat world chess champion Garry Kasparov in a six-game match.

Today artificial intelligence is a booming area of development, and the Ministry of Electronics & Information Technology projects the addition of about 20 million jobs in the sector by 2025. In fact, this is also underscored by multiple studies and reports prepared by global firms and industry bodies like Deloitte, NASSCOM and PwC.

However, one question has always occupied scientists and engineers working in the domain of AI: when designing AI agents, should they be modelled on human behaviour or on reasoning, and should they act humanly (even irrationally) or rationally? It has been found, though, that more intuitive AI agents with better user experience interfaces have a higher penetration in human society.

Next we take a look at Machine Learning. When an AI agent learns on its own from the interactions it has, this is known as Machine Learning. When humans learn something, it registers in some form in the mind. However, machines perceive data in the form of functions and variables. With Machine Learning, AI agents create models which exist as executable software components made up of a sequence of mathematical variables and functions. Hence, becoming an expert in AI and ML usually requires a person to have a sound understanding of mathematics and statistics.
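As an illustration of that point (our own minimal sketch, not something shown in the webinar), here is a "model" consisting of just two variables and one function. Learning is nothing more than repeatedly nudging the variables until the function fits the data:

```python
# Toy data drawn from the rule y = 2x + 1; the model must discover 2 and 1.
data = [(x, 2 * x + 1) for x in range(10)]

w, b = 0.0, 0.0          # the model's variables (initially wrong)
lr = 0.01                # learning rate: size of each correction
for _ in range(2000):    # repeated small corrections = "learning"
    for x, y in data:
        pred = w * x + b          # the model's function
        err = pred - y
        w -= lr * err * x         # gradient of squared error w.r.t. w
        b -= lr * err             # gradient of squared error w.r.t. b

print(round(w, 2), round(b, 2))  # close to 2.0 and 1.0
```

After training, the learned model is exactly what the article describes: an executable component made of mathematical variables (`w`, `b`) and a function (`w * x + b`), and understanding why the corrections work requires the calculus and statistics mentioned above.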

Speaking of building a career in AI and ML, Mr Dey threw light on three avenues into the industry. These are:

  • As a scientist
  • As an engineer
  • As a contributor

Let’s take a look at each of these.

As a Scientist

As mentioned above, to communicate with AI, your query must be represented in a mathematical/logical format. Hence, when choosing your educational degrees or courses, go for courses that cover the following topics, which contribute to the core of AI:

  • Vectors and Matrices
  • Probability
  • Relation and Function
  • Differential Calculus
  • Statistical Analysis

Choosing a major which covers these aspects should arm you with the knowledge and skills you need to become a scientist in AI.

As an Engineer

Being a scientist is not your only option, though. AI also depends heavily on engineers to grow and develop. From the engineering perspective, here is a list of functions that need to be carried out:

  • Visualisation/representation of data
  • Collection of data from multiple sources
  • Building pipelines to prepare data to scale
  • Using Machine Learning services/frameworks available on clouds to scale up
  • Test, audit and explain to various stakeholders the Machine Learning output
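The data-preparation steps in the list above can be sketched in miniature. The snippet below (an illustrative sketch with made-up field names, not a production pipeline) collects records from two sources, drops incomplete rows, and scales a numeric feature to the 0-1 range before it would reach a model:

```python
# Two hypothetical "sources" of raw records (field names are invented).
source_a = [{"id": 1, "hours": 2.0}, {"id": 2, "hours": None}]
source_b = [{"id": 3, "hours": 10.0}, {"id": 4, "hours": 6.0}]

def collect(*sources):
    """Stage 1: merge records from multiple sources."""
    for src in sources:
        yield from src

def clean(records):
    """Stage 2: drop rows with missing values."""
    return [r for r in records if r["hours"] is not None]

def scale(records):
    """Stage 3: min-max scale the feature into the 0-1 range."""
    vals = [r["hours"] for r in records]
    lo, hi = min(vals), max(vals)
    return [{**r, "hours": (r["hours"] - lo) / (hi - lo)} for r in records]

prepared = scale(clean(collect(source_a, source_b)))
print([r["hours"] for r in prepared])  # [0.0, 1.0, 0.5]
```

At real scale the same three stages are built with distributed pipelines and the managed ML services on cloud platforms, but the engineering responsibility - getting trustworthy, well-shaped data to the model - is the same.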

As a contributor

If you find you are not interested in being a scientist or an engineer, there are other significant ways you can contribute to AI. That could be in the following areas:

  • User experience design
  • Process modelling
  • Domain knowledge
  • Linguistic details
  • Social aspects

Mr Dey discusses all these avenues at length in the course of the webinar with examples. At the same time, he lays out the basic qualities that one must have - irrespective of which role one chooses to pursue. And these are creative vision, innate curiosity and perseverance.

Here are some courses that you should explore if you want to build a career in the core AI aspects:

  • A Bachelors or Masters degree in Computer Science or Engineering or Mathematics or Statistics.
  • A specialisation in any of the following areas:
    • Artificial Intelligence
    • Machine Learning
    • Data Science
    • Automation and Robotics
  • B Tech/ BE in other engineering fields, followed by work experience in the field of software or IT.

The webinar ended with a detailed Q&A session which opened with questions submitted by participants at the time of registration and carried on to questions asked in the course of the webinar. The Q&A covered a range of interesting topics, like:

  • Neural networks/deep learning
  • Importance of Maths and Statistics in AI/ML
  • How valuable are practical projects for developing skills needed to work in AI/ML
  • Which programming language is the best to learn for a career working with AI/ML
  • Which are the best courses to consider as a student - traditional degrees or online certification courses
  • How does AI compare to the human brain
  • Will AI and automation endanger human jobs in the future
  • What are intelligent agents and how are they useful in AI

To learn the answer to these and many more questions, watch our video recording of the live webinar.

A career in AI and ML is an excellent choice now - and this small initiative of The Telegraph Edugraph was aimed at providing the right guidance for you to make the transition from Campus to Career. Best of luck!

Last updated on 26 Jul 2022
