For the last few decades, the computer keyboard has been seen as just another peripheral. There’s no need to buy a quality keyboard, conventional wisdom goes, because there’s no real difference between the fancy, ‘enthusiast’ keyboards and ubiquitous Dell keyboards that inhabit the IT closets of offices the world over.
Just like the mechanic who will only buy a specific brand of wrenches, the engineer who has a favorite pair of tweezers, or the amateur woodworker who uses a hand plane made 150 years ago, some people who use keyboards eight or twelve hours a day have realized the older tools of the trade are better. Old keyboards, or at least ones with mechanical switches, aren’t gummy, they’re precise, you don’t have to hammer on them to type, and they’re more ergonomic. They sound better. Even if it’s just a placebo effect, it doesn’t matter: there’s an effect.
This realization has led to the proliferation of high-end keyboards and keyboard aficionados hammering away on boards loaded up with Cherry MX, Alps, Gateron, Topre, and other purely ‘mechanical’ key switches. Today, there are more options available to typing enthusiasts than ever before, even though some holdouts are still pecking away at the keyboard that came with the same computer they bought in 1989.
The market is growing, popularity is up, and with that comes a herculean effort to revive what could be considered the greatest keyboard of all time. This is the revival of the IBM 4704 terminal keyboard. Originally sold to banks and other institutions, this 62-key IBM Model F keyboard is rare and coveted. Obtaining one today means finding one behind a shelf in an IT closet, or bidding $500 on an eBay auction and hoping for the best.
Now, this keyboard is coming back from the dead, and unlike the IBM Model M that has been manufactured continuously for 30 years, the 62-key IBM Model F ‘Kishsaver’ keyboard is being brought back to life by building new molds, designing new circuit boards, and remanufacturing everything IBM did in the late 1970s.
The first computer keyboards, found on the Commodore 64, the original Macintosh, and the first PCs, were not the keyboards you would find in use today. These keyboards used mechanical switches with moving parts and a satisfying clack, but they were exceptionally expensive. There was also little standardization; the DEC VT100 terminal used complicated leaf spring switches made by Hi-Tek, while the original Macintosh, Commodore Amiga, and Atari 800 used coiled spring switches from Mitsumi. IBM, through their typewriter division, manufactured their own key switches from the 1970s onward, beginning with beam spring switches adapted from the Selectric typewriter in the 1970s and moving to buckling spring switches in the 1980s.
Where the rest of the computer industry would slowly move away from mechanical key switches towards membrane or ‘rubber dome’ switches, IBM was surprisingly steadfast. The IBM Model F, the first keyboard to use buckling springs, would appear on the market with the System/23 in August of 1981, a month before the release of the original IBM PC. The IBM Model M keyboard would replace the Model F in 1985, but the basic mechanism remained the same: a spring would press against a rocker in the base of the key, and when sufficient force was applied, the spring would ‘buckle’, pushing the rocker down onto capacitive contacts or a membrane switch.
The usual mechanical key switch mechanisms found in high-end keyboards today – based on the Cherry MX and ALPS switches – do double duty as leaf springs and electrical contacts. The buckling spring mechanism is baroque in its complexity, relying on a spring and paddle to register a keypress. This complexity is assuredly not the result of building a keyboard down to a price point, and gives the buckling spring keyboards a satisfying tactile feel that only gets better after a few decades of use.
IBM sold the unit responsible for manufacturing the Model M to Lexmark in 1991, and in 1996, the employees at this plant purchased the rights and injection molds to create Unicomp. Yes, you can still buy a Model M keyboard made in Lexington, Kentucky for about the same price as what IBM was selling these keyboards for back in 1986. These are Model M keyboards, though, and the tooling and technology required to produce the arguably superior Model F keyboard disappeared off the face of the planet sometime in the 1980s.
IBM’s Model F keyboards look odd to the modern eye. Most keyboards from the 1970s and 1980s do as well. This was a time when keyboard layouts were in flux. Even the most common elements of the computer keyboard were constantly changing in this era. The ‘inverted T’ layout of the arrow keys was first popularized by the DEC LK201 keyboard from 1982, most commonly found in conjunction with the VT-220 terminal. Numpads were fairly common, but the keys underneath the left hand pinky – Control, Alt, Caps Lock, and Tab – wouldn’t be standardized until the mid 1980s. Even the Windows key, or whatever the key between the left Control and Alt keys is called, wouldn’t appear until 1994 with the Microsoft Natural Keyboard.
Although keyboard layouts would standardize around the ANSI and ISO specifications sometime in the late 80s, there were still enthusiasts looking for compact and minimalist keyboards. Numpads weren’t necessary for these people, and a smaller board meant less hand travel and more desk space. The Control key was where the Caps Lock key was, as God intended. The Happy Hacking Keyboard, a tiny 60-key keyboard designed for *nix operating systems, became a status symbol. The age of keyboard enthusiasts had arrived.
Sometime around 2012, on a few of the Internet’s largest mechanical keyboard communities, news of a strange IBM keyboard made the rounds. It was small, with just 62 keys, made out of metal, and used the buckling spring key switches of the Model F. It was the IBM 4704, Part Number 6019284. This keyboard was re-discovered by a forum member named Kishy. Since IBM was wont to put a ‘space saver’ label on their smaller keyboards, the portmanteau ‘Kishsaver’ stuck.
The Kishsaver was a true Model F, made of metal, and could serve as an impromptu melee weapon in the event of a zombie apocalypse. It had the IBM aura about it, and it was rare. This wasn’t a keyboard that anyone buying a home computer would have had – this was a keyboard hooked up to terminals and connected to the main office’s mainframe. This was a keyboard that also had a relatively modern layout, and played to the minimalist proclivities of the mechanical keyboard enthusiast. Needless to say, demand outstripped supply, and today a Kishsaver will cost you about $500, if you can find it.
Find any piece of popular, discontinued tech, and you’ll find replicas and reproductions. In the computer world, it’s easy to find a reproduction of the Apple I. The same holds for classic cars, motorcycles, and even hand tools. It was almost inevitable that a classic IBM keyboard would eventually be cloned and remanufactured; all it would take is someone with a little bit of experience to pool their resources, find the people who could do the work, and start a business.
This person happened to be [Ellipse] on the Geekhack and Deskthority keyboard forums. He’s handled group buys for forum members for open source keyboard controllers that convert the ancient electronics inside IBM keyboards into something that speaks USB. These keyboard controllers were designed by [xwhatsit] as open source alternatives to old electronics boards, but as with so many electronics projects, a lot of people don’t want to deal with Mouser or Digikey orders, and they can’t handle a soldering iron anyway.
With a little bit of manufacturing under his belt, [Ellipse] turned to the Model F Kishsaver. He’s disassembled, repaired, cleaned, and restored a lot of these keyboards over the years, and after carefully measuring his collection of 77- and 122-key Model Fs, he had the data to start talking to manufacturers.
Like any manufacturing project these days, [Ellipse] turned to China. Of course the original molds and dies for the Kishsaver don’t exist anymore, but a lot of the legwork has already been done. The keyboard controller board was already taken care of thanks to [xwhatsit]’s modernization. The Model F keyboard is built around a huge PCB, and with a few Kishsavers floating around, reverse engineering this PCB proved relatively easy.
The problem, as with so many projects, is the mechanical design. Molds needed to be made, the right type of plastic for the conductive ‘flippers’ of the buckling spring mechanism needed to be sourced, parts were cast, and dies were formed. New metal cases for the Kishsaver were created and powder coated. If you have an old Kishsaver, every new part is a drop-in replacement for the old. Even the styrofoam packaging is a replica; [Ellipse] took measurements of the original IBM packaging materials and replicated them with new molds.
The remanufacturing of the IBM Model F 62-key keyboard is among the top achievements of the vintage computer community. The only comparable project would be an Apple 1 replica built up from parts with late 1975 date codes. It’s an exceptional achievement for the mechanical keyboard community, made even more impressive by the fact that no keyboard manufacturer has taken up Model F manufacturing.
These efforts have culminated in a group buy on the Internet’s major keyboard forums, and an online shop that will sell the 62-key and 77-key Model F for $325 to $376, depending on options. Both the powder coated eggshell and the somewhat anachronistic industrial gray cases are available, with key caps sourced from Unicomp, the current manufacturers of the Model M.
Still, booting up an entire manufacturing line through what is effectively the trust of a community isn’t easy. [Ellipse] says he would like to do die-cast aluminum enclosures instead of the machined aluminum cases he has now. Die casting would greatly reduce manufacturing costs, trading the relatively expensive machining of each individual enclosure for the one-time but high cost of producing a steel mold. Still, individually machined enclosures allow for some experimentation, such as the ‘ultra compact’ case design that capitalizes on the modern, smaller controller board and appeals to modern design sensibilities.
For all the hullabaloo about hardware entrepreneurs, the maker movement, crowdfunding, rapid prototyping, and Chinese manufacturing, there aren’t many real success stories. Sure, there are hardware startups coasting on Y Combinator funding, but when it comes to actually producing something people want, there really aren’t many companies out there. For someone who is just an enthusiast, someone who isn’t a programmer, engineer, or product designer to pull a team together to remanufacture the best that came out of IBM in the 1970s is remarkable. It’s a testament to what a community can do, and what a single, dedicated person can achieve.
Title image source: murium on Deskthority
With the PowerToys utility for Windows 11/10, you can easily remap any key on the keyboard to another key or some system function. Remapping means that when you press a key, instead of executing its default function, the key will execute an entirely different action. This way, you can potentially use any key for many different tasks. If you prefer using an older classic keyboard such as the IBM Model M that doesn’t include a physical Windows key, this post will show you how to map the Windows key on an older classic keyboard on Windows 11/10.
To remap a key to act as the Windows key, do the following:
Now you have to decide which key you want to double as the Windows key. The right Alt key works very well (if you have one), because it is easy to use for one-handed Windows shortcuts and most people use the left Alt key more frequently. You could also choose a seldom-used key, such as Scroll Lock or right Ctrl instead.
On the left, you have to select the key you’d like to function as your Windows key. In this example, we’re using Alt (Right).
Windows will probably warn you that the key you’re remapping won’t be usable because you’ve reassigned it to another function. In that case, click Continue Anyway.
Once done, the new Windows key mapping should be active. You can close PowerToys, and you can use your computer as usual. You won’t have to log out or restart your PC; your change will take effect immediately.
If at any time you want to discard the mapping, navigate to the Remap Keyboard window in PowerToys, then click the Trash icon beside the mapping to remove it.
That’s it on how to map the Windows key on an older classic keyboard on Windows 11/10!
Related post: How to Remap and Launch any Program with Scroll Lock key.
Advanced technologies, such as artificial intelligence and cloud computing, among others, are increasingly being adopted. These technologies, used appropriately, can reinvent government services, education, healthcare, and the way businesses interact with their customers. If not responsibly applied, however, they have the potential to cause harm.
With so much at stake, it is essential that we, as a society, continue to advance the practice of infusing ethics and responsibility through the entire life cycle of technology. To this end, the Ethics Center and World Economic Forum (WEF) have collaborated on the Responsible Use of Technology initiative. The ongoing project aims to provide practical resources for organizations to operationalize ethics in their use of technology and features a series of case studies designed to surface lessons that can help organizations advance their own responsible innovation practices.
To date, Microsoft and IBM have been profiled with Ethics Center staff Don Heider, Brian P. Green, and Ann Skeet contributing to the work. WEF and the Ethics Center hope that the initiative will encourage organizations to not only adopt and operationalize technology ethics, but also to share their experience with the global community.
Ethics by Design: An organizational approach to responsible use of technology, December 2020
Responsible Use of Technology: The Microsoft Case Study, February 2021
Responsible Use of Technology: The IBM Case Study, September 2021
Hitoshi Kume, a recipient of the 1989 Deming Prize for use of quality principles, defines problems as "undesirable results of a job." Quality improvement efforts work best when problems are addressed systematically using a consistent and analytic approach; the methodology shouldn't change just because the problem changes. Keeping the steps to problem-solving simple allows workers to learn the process and how to use the tools effectively.
Easy to implement and follow up, the most commonly used and well-known quality process is the plan/do/check/act (PDCA) cycle (Figure 1). Other processes are a takeoff of this method, much in the way that computers today are takeoffs of the original IBM system. The PDCA cycle promotes continuous improvement and should thus be visualized as a spiral instead of a closed circle.
Another popular quality improvement process is the six-step PROFIT model in which the acronym stands for:
P = Problem definition.
R = Root cause identification and analysis.
O = Optimal solution based on root cause(s).
F = Finalize how the corrective action will be implemented.
I = Implement the plan.
T = Track the effectiveness of the implementation and verify that the desired results are met.
If the desired results are not met, the cycle is repeated. Both the PDCA and the PROFIT models can be used for problem solving as well as for continuous quality improvement. In companies that follow total quality principles, whichever model is chosen should be used consistently in every department or function in which quality improvement teams are working.
Figure 1. The most common process for quality improvement is the plan/do/check/act cycle outlined above. The cycle promotes continuous improvement and should be thought of as a spiral, not a circle.
Once the basic problem-solving or quality improvement process is understood, the addition of quality tools can make the process proceed more quickly and systematically. Seven simple tools can be used by any professional to ease the quality improvement process: flowcharts, check sheets, Pareto diagrams, cause and effect diagrams, histograms, scatter diagrams, and control charts. (Some books describe a graph instead of a flowchart as one of the seven tools.)
The concept behind the seven basic tools came from Kaoru Ishikawa, a renowned quality expert from Japan. According to Ishikawa, 95% of quality-related problems can be resolved with these basic tools. The key to successful problem resolution is the ability to identify the problem, use the appropriate tools based on the nature of the problem, and communicate the solution quickly to others. Inexperienced personnel might do best by starting with the Pareto chart and the cause and effect diagram before tackling the use of the other tools. Those two tools are used most widely by quality improvement teams.
Flowcharts describe a process in as much detail as possible by graphically displaying the steps in proper sequence. A good flowchart should show all process steps under analysis by the quality improvement team, identify critical process points for control, suggest areas for further improvement, and help explain and solve a problem.
The flowchart in Figure 2 illustrates a simple production process in which parts are received, inspected, and sent to subassembly operations and painting. After completing this loop, the parts can be shipped as subassemblies after passing a final test or they can complete a second cycle consisting of final assembly, inspection and testing, painting, final testing, and shipping.
Figure 2. A basic production process flowchart displays several paths a part can travel from the time it hits the receiving dock to final shipping.
Flowcharts can be simple, such as the one featured in Figure 2, or they can be made up of numerous boxes, symbols, and if/then directional steps. In more complex versions, flowcharts indicate the process steps in the appropriate sequence, the conditions in those steps, and the related constraints by using elements such as arrows, yes/no choices, or if/then statements.
Check sheets help organize data by category. They show how many times each particular value occurs, and their information is increasingly helpful as more data are collected. More than 50 observations should be available to be charted for this tool to be really useful. Check sheets minimize clerical work since the operator merely adds a mark to the tally on the prepared sheet rather than writing out a figure (Figure 3). By showing the frequency of a particular defect (e.g., in a molded part) and how often it occurs in a specific location, check sheets help operators spot problems. The check sheet example shows a list of molded part defects on a production line covering a week's time. One can easily see where to set priorities based on results shown on this check sheet. Assuming the production flow is the same on each day, the part with the largest number of defects carries the highest priority for correction.
Figure 3. Because it clearly organizes data, a check sheet is the easiest way to track information.
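The tally logic behind a check sheet is simple enough to sketch in a few lines. Here is a minimal example, with a hypothetical week of molded-part defect observations (the defect names and counts are illustrative, not taken from Figure 3):

```python
from collections import Counter

# Hypothetical defect observations from a molding line; each entry is
# one tally mark an operator would make on the prepared sheet.
observations = [
    "flash", "short shot", "flash", "sink mark", "flash",
    "short shot", "flash", "warp", "sink mark", "flash",
]

# Counter does the clerical work: one tally per category.
tally = Counter(observations)
for defect, count in tally.most_common():
    print(f"{defect:<12}{'|' * count}  ({count})")
```

Listing the categories with `most_common` already hints at the priority order, which is exactly what the next tool, the Pareto diagram, formalizes.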
The Pareto diagram is named after Vilfredo Pareto, a 19th-century Italian economist who postulated that a large share of wealth is owned by a small percentage of the population. This basic principle translates well into quality problems—most quality problems result from a small number of causes. Quality experts often refer to the principle as the 80-20 rule; that is, 80% of problems are caused by 20% of the potential sources.
A Pareto diagram puts data in a hierarchical order (Figure 4), which allows the most significant problems to be corrected first. The Pareto analysis technique is used primarily to identify and evaluate nonconformities, although it can summarize all types of data. It is perhaps the diagram most often used in management presentations.
Figure 4. By rearranging random data, a Pareto diagram identifies and ranks nonconformities in the quality process in descending order.
To create a Pareto diagram, the operator collects random data, regroups the categories in order of frequency, and creates a bar graph based on the results.
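Those three steps can be sketched directly. The defect categories and counts below are made up for illustration; the point is the reordering and the cumulative percentage that a Pareto diagram adds over a plain bar graph:

```python
# Raw counts per category, as a check sheet might produce them.
counts = {"scratches": 7, "flash": 45, "warp": 12, "short shot": 28, "other": 8}

total = sum(counts.values())
# Regroup the categories in descending order of frequency.
ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)

# Walk the ranked list, accumulating the running percentage of all defects.
cumulative = 0
for category, count in ranked:
    cumulative += count
    print(f"{category:<12}{count:>4}  {100 * cumulative / total:5.1f}% cumulative")
```

The first one or two rows of the ranked output typically account for most of the total, which is the 80-20 rule made visible.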
Cause and effect diagrams
The cause and effect diagram is sometimes called an Ishikawa diagram after its inventor. It is also known as a fish bone diagram because of its shape. A cause and effect diagram describes a relationship between variables. The undesirable outcome is shown as the effect, and related causes are shown as leading to, or potentially leading to, that effect. This popular tool has one severe limitation, however, in that users can overlook important, complex interactions between causes. Thus, if a problem is caused by a combination of factors, it is difficult to use this tool to depict and solve it.
A fish bone diagram displays all contributing factors and their relationships to the outcome to identify areas where data should be collected and analyzed. The major areas of potential causes are shown as the main bones, e.g., materials, methods, people, measurement, machines, and design (Figure 5). Later, the subareas are depicted. Thorough analysis of each cause can eliminate causes one by one, and the most probable root cause can be selected for corrective action. Quantitative information can also be used to prioritize means for improvement, whether it be to machine, design, or operator.
Figure 5. Fish bone diagrams display the various possible causes of the final effect. Further analysis can prioritize them.
The histogram plots data in a frequency distribution table. What distinguishes the histogram from a check sheet is that its data are grouped into intervals, so the identity of individual values is lost. Commonly used to present quality improvement data, histograms work best with small amounts of data that vary considerably. When used in process capability studies, histograms can display specification limits to show what portion of the data does not meet the specifications.
After the raw data are collected, they are grouped in value and frequency and plotted in a graphical form (Figure 6). A histogram's shape shows the nature of the distribution of the data, as well as central tendency (average) and variability. Specification limits can be used to display the capability of the process.
Figure 6. A histogram is an easy way to see the distribution of the data, its average, and variability.
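The grouping step, raw values into equal-width intervals, is the part that distinguishes a histogram from a check sheet. A minimal sketch, using made-up rod lengths in millimetres rather than the data behind Figure 6:

```python
# Hypothetical rod-length measurements (mm).
data = [49.8, 50.1, 50.3, 49.9, 50.0, 50.2, 49.7, 50.4, 50.0, 50.1]

low, high, bins = 49.45, 50.45, 5
width = (high - low) / bins

# Tally each value into its interval; individual values lose their identity.
freq = [0] * bins
for x in data:
    index = min(int((x - low) / width), bins - 1)  # clamp the top edge
    freq[index] += 1

# Print a sideways bar per interval.
for i, count in enumerate(freq):
    left = low + i * width
    print(f"{left:.2f}-{left + width:.2f}: {'#' * count}")
```

The shape of the printed bars shows the central tendency and spread at a glance; overlaying the specification limits on the same scale shows how much of the process output falls outside them.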
A scatter diagram shows how two variables are related and is thus used to test for cause and effect relationships. It cannot prove that one variable causes the change in the other, only that a relationship exists and how strong it is. In a scatter diagram, the horizontal (x) axis represents the measurement values of one variable, and the vertical (y) axis represents the measurements of the second variable. Figure 7 shows part clearance values on the x-axis and the corresponding quantitative measurement values on the y-axis.
Figure 7. The plotted data points in a scatter diagram show the relationship between two variables.
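The strength of the relationship a scatter diagram shows can be quantified with the Pearson correlation coefficient. A short sketch with illustrative clearance/measurement pairs (not the data of Figure 7):

```python
import math

# Paired observations: x is part clearance, y is the measured characteristic.
x = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
y = [2.1, 2.9, 4.2, 5.1, 5.8, 7.2, 7.9]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
# Sums of squared deviations and of cross-deviations.
sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
sxx = sum((a - mx) ** 2 for a in x)
syy = sum((b - my) ** 2 for b in y)

r = sxy / math.sqrt(sxx * syy)
print(f"Pearson r = {r:.3f}")
```

An r near +1 or -1 indicates a strong linear relationship; as the article notes, even a strong r proves association, not causation.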
A control chart displays statistically determined upper and lower limits drawn on either side of a process average. This chart shows if the collected data are within upper and lower limits previously determined through statistical calculations of raw data from earlier trials.
The construction of a control chart is based on statistical principles and statistical distributions, particularly the normal distribution. When used in conjunction with a manufacturing process, such charts can indicate trends and signal when a process is out of control. The center line of a control chart represents an estimate of the process mean; the upper and lower critical limits are also indicated. The process results are monitored over time and should remain within the control limits; if they do not, an investigation is conducted for the causes and corrective action taken. A control chart helps determine variability so it can be reduced as much as is economically justifiable.
In preparing a control chart, the mean upper control limit (UCL) and lower control limit (LCL) of an approved process and its data are calculated. A blank control chart with mean UCL and LCL with no data points is created; data points are added as they are statistically calculated from the raw data.
Figure 8. Data points that fall outside the upper and lower control limits lead to investigation and correction of the process.
Figure 8 is based on 25 samples or subgroups. For each sample, which in this case consisted of five rods, measurements are taken of a quality characteristic (in this example, length). These data are then grouped in table form (as shown in the figure) and the average and range of each subgroup are calculated, as are the grand average and the average of all ranges. These figures are used to calculate the UCL and LCL: the control limits are the grand average ± A2R̄, where R̄ is the average range and A2 is a constant determined from the table of constants for variable control charts. The constant is based on the subgroup sample size, which is five in this example.
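The limit calculation can be sketched in a few lines. The subgroup data here are illustrative (three subgroups rather than the figure's 25), but A2 = 0.577 is the standard table constant for a subgroup size of five:

```python
# Each subgroup is five rod-length measurements from one sampling interval.
subgroups = [
    [50.1, 49.9, 50.0, 50.2, 49.8],
    [50.0, 50.1, 49.9, 50.0, 50.0],
    [49.8, 50.2, 50.1, 49.9, 50.0],
]
A2 = 0.577  # table constant for variable control charts, subgroup size n = 5

# Average and range of each subgroup.
averages = [sum(s) / len(s) for s in subgroups]
ranges = [max(s) - min(s) for s in subgroups]

# Grand average and average range across all subgroups.
grand_average = sum(averages) / len(averages)
average_range = sum(ranges) / len(ranges)

# Control limits: grand average plus or minus A2 times the average range.
ucl = grand_average + A2 * average_range
lcl = grand_average - A2 * average_range
print(f"UCL = {ucl:.3f}, LCL = {lcl:.3f}")
```

New subgroup averages are then plotted against these limits; a point falling outside them triggers the investigation described above.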
Many people in the medical device manufacturing industry are undoubtedly familiar with many of these tools and know their application, advantages, and limitations. However, manufacturers must ensure that these tools are in place and being used to their full advantage as part of their quality system procedures. Flowcharts and check sheets are most valuable in identifying problems, whereas cause and effect diagrams, histograms, scatter diagrams, and control charts are used for problem analysis. Pareto diagrams are effective for both areas. By properly using these tools, the problem-solving process can be more efficient and more effective.
Those manufacturers who have mastered the seven basic tools described here may wish to further refine their quality improvement processes. A future article will discuss seven new tools: relations diagrams, affinity diagrams (K-J method), systematic diagrams, matrix diagrams, matrix data diagrams, process decision programs, and arrow diagrams. These seven tools are used less frequently and are more complicated.
Ashweni Sahni is director of quality and regulatory affairs at Minnetronix, Inc. (St. Paul, MN), and a member of MD&DI's editorial advisory board.
You may not remember this, but Nintendo hardware used to be a pretty big deal. The original Game Boy and NES both had remarkable industrial design that, like the Apple II and IBM ThinkPad, wasn’t fully appreciated until many years after production ended. But, like many of you, [daftmike] had nostalgia-fueled memories of the NES experience still safely locked away.
Memories like lifting the cartridge door, blowing on the cartridge, and the feel of the cartridge clicking into place. So, understandably, reliving those experiences was a key part of [daftmike’s] Raspberry Pi-based NES build, though at 40% of the original size. He didn’t just want to experience the games of his youth, he wanted to experience the whole NES just as he had as a child.
Now, like any respectable hacker, [daftmike] didn’t let gaps in his knowledge stop him. This project was a learning experience. He had to teach himself a lot about 3D design and modeling, using Linux, and programming. But, the end result was surely worth the work; the attention to detail shows in features like the USB placement, the power and reset buttons, and of course the game cartridges which work with the magic of NFC and still include the insert and toggle action of the original cartridge carriage.
If you have a 3D printer and Raspberry Pi available, you could build a similar NES emulator yourself. But if you don’t have a 3D printer, but do have an original NES lying around, you could pull off the Raspberry Pi in an NES case hack. Whichever you do, the NES’s beauty deserves to be displayed in your home.
By WP Creative Group
The U.S. is facing a shortage of microchips that power everything from coffeemakers to lifesaving medical equipment. To solve it, industry stakeholders are putting aside rivalries and red tape and embracing public-private partnerships as the model for restoring the country’s leadership in semiconductor research, development and production.
“At this point, we are all rolling up our sleeves and developing the right agenda,” said Mukesh Khare, VP, hybrid cloud at IBM Research. Khare is leading IBM’s role in the American Semiconductor Innovation Coalition (ASIC). ASIC was formed to help revitalize the U.S. chip industry, with members representing private industry, academia and non-profits. In addition to IBM, ASIC’s 70-plus member roster includes notable organizations such as Microsoft, Applied Materials, Siemens-EDA, Cornell University, MIT and Howard University. “No one company, no single state can do this alone,” said David Anderson, president of ASIC member NY CREATES, a state-backed facilitator of high-tech projects in the Empire State.
ASIC’s immediate priority is to persuade Congress to fully fund the CHIPS for America Act. The act sets aside $52 billion for federal investment in semiconductor research and development. With both the House and Senate having passed their own versions, it’s now up to Congress to deliver a reconciled version to President Biden for his signature. Biden has publicly signaled his support for the act. Congress is also mulling the FABS Act, which provides a tax credit for semiconductor investment. The concern is that without Congressional support, the country could fall further behind in the global semiconductor market. Europe, for example, recently passed its own package, known as the EU Chips Act, to fund chip development on the Continent. The U.S. government “has not been involved in really fostering the semiconductor industry, while governments overseas have had no problems in doing that,” said Jesús del Alamo, professor of engineering at MIT.
As recently as 1990, the U.S. held 37 percent of the world’s semiconductor manufacturing capacity, according to the Semiconductor Industry Association. That’s now down to 12 percent even as some of the most sophisticated processors continue to be designed by U.S. companies. Meanwhile, Asia has become the dominant producer—with one Taiwanese company in particular manufacturing over 90 percent of the world’s advanced microchips. The U.S. is currently experiencing the consequences of becoming overly reliant on a handful of overseas suppliers. The pandemic boosted chip demand as more consumers and businesses adopted digital-first lifestyles and operations. At the same time, covid reduced shipments as lockdowns disrupted global supply chains. With demand and supply moving in opposite directions, there are now critical shortages for chips that power a vast range of products. One major automaker recently blamed the shortage for a $3.1 billion loss. Meanwhile, medical device manufacturers have warned that the situation could stall production of vital products such as defibrillators and pacemakers.
And with foundries running at full tilt just to meet current demand, some of the larger overseas facilities have cut back on prototyping services. This hurts U.S. startups that need to test chips designed for cutting-edge applications like AI, virtual reality and quantum computing. “Addressing this chip shortage goes beyond just being able to buy the next computer or car. It really is key to fostering future innovation,” said del Alamo.
ASIC is looking to revitalize the American chip industry by focusing on a number of key areas, including infrastructure, innovation and skills development. The CHIPS Act calls for the establishment of a National Semiconductor Technology Center, and ASIC believes it should be distributed across a network of innovation hubs around the country. These hubs would provide “an innovative ecosystem for research, development and prototyping with first-class resources, scientists, facilities and partners who can work quickly and efficiently to demonstrate and transfer breakthrough technology to manufacturing to secure a strong, domestic chip supply chain for the future,” ASIC said in a recent white paper.
A strong candidate for inclusion in the NSTC is the Albany Nanotech Complex, which exemplifies the innovation part of the equation. Last December, IBM and Samsung researchers at the publicly owned facility in Albany, New York, announced a new design known as a vertical transistor. It has the potential to reduce energy consumption by 85 percent compared to conventional chips, in which transistors are arrayed across a flat surface. Potential applications range from autonomous vehicles to smartphones. “You wouldn’t have to charge your phone for three days,” said IBM’s Khare.
The CHIPS Act also supports research into advanced packaging methods that would allow manufacturers to combine a variety of chipsets into one, low cost and energy efficient package.
Investments in infrastructure alone won’t restore the U.S. to preeminence in chipmaking. The country is on pace to see a shortage of about 1.1 million STEM workers as soon as 2024, according to the American Action Forum. To help close this gap, ASIC members are working to increase and diversify the supply of skilled labor for chipmaking and related industries.
Officials at Howard University, an ASIC member, note that giving minority students equal access to STEM programs is essential to ensuring that U.S. industry has the technologically skilled workforce it needs. Currently, just nine percent of STEM workers are Black while only seven percent are Hispanic, according to Pew. Those numbers haven’t changed much over the years, despite the fact that both groups are growing as a portion of the overall population. “The country’s demographics are changing,” said Michaela Amoo, an assistant professor at Howard’s College of Engineering and Architecture. Amoo said that progress in diversifying STEM opportunities needs to start well before college. “What we really need to do is get into the whole K-12 pipeline,” she said.
Restoring America’s chip manufacturing leadership would create a ripple effect of benefits for the entire nation. Every semiconductor job generates 5.7 additional jobs across the economy, according to SIA. Having a more robust, domestic chipmaking industry would also alleviate supply chain shortages and it would strengthen national defense by eliminating U.S. reliance on overseas suppliers for chips that power military assets like ships, tanks and aircraft. “Ultimately it will lead to greater economic and national security for our future,” said NY CREATES’ Anderson.
Click here for more information about how ASIC is working to ensure America’s innovation future.
Credits: By WP Creative Group
By using artificial intelligence, agencies can improve their procurement processes.
Perhaps no governmental process creates the perception of a complex bureaucracy more than the U.S. government’s procurement system.
Agencies follow the Federal Acquisition Regulation—or FAR, for short—or its U.S. Department of Defense counterpart, DFAR. The FAR consists of 37 chapters at over 2,000 pages, plus various agency supplements.
Within the context of this highly regulated procurement process, agencies have experimented over the last decade with using artificial intelligence (AI) to innovate and streamline how the U.S. government makes acquisitions. These efforts show promise and agencies will benefit from learning the lessons of past efforts at innovation, as well as envisioning future possibilities for improving procurement.
Such efforts become even more necessary given that the page count of the FAR does not even include agency-specific interpretations from chief acquisition officers, legal opinions that guide agency actions, protest rulings by the Government Accountability Office, which agencies must account for to avoid future risk, and more.
And changing this stock of existing rules and guidance itself requires undergoing a complex process that, as with any regulatory action, must follow the Administrative Procedure Act’s provisions for proposing a rule change for public comment, reviewing comments, and finalizing the rule. Any significant new additions or changes to procurement rules must be reviewed by the Office of Management and Budget before proposal and final action, a process that also includes interagency review and can involve separate public comment.
Any time procurement officials in government set out to propose a change to the federal procurement rules, they must assess the change in the context of these processes. And any time an industry official seeks to bid on a contract or to comply with procurement regulations, these processes set the rules of the road for engagement with the government. Industry must also remain aware of and respond to agency enforcement actions, even though these are not released to the public in any consistent manner.
Government agencies have experimented with AI to improve the procurement process. The IBM Center for the Business of Government has released four lessons learned from these innovative efforts, which can help agencies, companies, and researchers understand past precedents and envision future possibilities.
First, the foundation of AI involves developing capabilities for adapting AI to government functions. In a pioneering report, Kevin DeSouza—then with Arizona State University and now with the Queensland University of Technology in Australia—identified success factors for any application of AI by government, focusing on data literacy about AI among agency officials, training for knowledge on how to apply AI, and systems to manage risk in implementing AI systems. He identified a variety of opportunities for building a stronger foundation for AI deployment, including the following:
Agencies seeking to develop AI in their procurement process can pursue these opportunities.
Second, experimentation requires AI pilots to improve the procurement process. The IBM Center collaborated with the Partnership for Public Service on a report exploring examples of early applications of AI to improve the acquisition function. One important example in the report centered on the U.S. Air Force’s experimentation with an AI system designed to help acquisition professionals make sense of complex acquisition regulations and speed the process of buying goods and services.
The Air Force’s pilot project involved uploading thousands of regulations, contract cases, acquisition training material and Defense Department policy to a database. AI technology then helped answer queries from federal contract officials and contractors about acquisition rules and regulations, such as how to proceed with a contract, what procedures to follow, and what contract a small business could bid on.
Although this work did not scale to operational capability, the experience demonstrated the power of AI to help government cut through decades of accumulated burdens in implementing procurements.
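The core retrieval step in a pilot like the Air Force's, matching a contracting official's question to the most relevant regulatory text, can be sketched with a simple bag-of-words similarity search. The snippet below is a minimal illustration, not the actual Air Force system; the corpus entries, citation labels, and summaries are invented for the example.

```python
import math
import re
from collections import Counter

# Toy corpus standing in for a database of acquisition regulations.
# The citations and summaries below are hypothetical placeholders.
DOCS = {
    "FAR 13.003": "Agencies shall use simplified acquisition procedures "
                  "for purchases below the simplified acquisition threshold.",
    "FAR 19.502": "Small business set-asides reserve certain contracts "
                  "for small business concerns to bid on.",
    "DFARS 208.74": "Defense agencies order commercial software through "
                    "enterprise software agreements where available.",
}

def tokenize(text):
    """Lowercase and split into alphabetic word tokens."""
    return re.findall(r"[a-z]+", text.lower())

def cosine(a, b):
    """Cosine similarity between two term-frequency Counters."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def best_match(query):
    """Return the ID of the document most similar to the query."""
    qv = Counter(tokenize(query))
    return max(DOCS, key=lambda doc_id: cosine(qv, Counter(tokenize(DOCS[doc_id]))))

print(best_match("What contracts can a small business bid on?"))  # FAR 19.502
```

A production system would add synonym handling, ranking over thousands of documents, and generated answers, but the retrieve-then-answer shape is the same.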
Third, engagement means government working with industry to evolve applications. The Procurement Innovation Laboratory (PIL) at the U.S. Department of Homeland Security has established a center of excellence for procurement innovation, including AI and other emerging technologies among its capabilities.
The IBM Center and the Partnership for Public Service have reported on the PIL model and its process for engaging industry to apply new AI applications for improving the acquisition process. The PIL exemplifies the value of public-private partnerships and helped capitalize on engagement with industry, rather than being limited by the assumption that such interactions cannot be accommodated.
In addition, the PIL adopted a user orientation—adopting the perspective of those who use the procurement system and how they interact with it. The PIL has incorporated elements of an agile approach, iterating innovation in short phases with learning built in. Support from the top agency leadership has helped facilitate innovation and risk-taking.
Finally, maturity occurs with advancing AI capabilities across the acquisition organization. A recent IBM Center report, also authored by Kevin DeSouza, sets out an AI Maturity Model for public sector enterprises—advancing from ad hoc experiments to enterprise-wide transformation. The model includes incorporating AI best practices into acquisition operations at higher levels, such as through agency-wide policies on AI acquisition and the development of a robust ecosystem of external partners that can generate opportunities. Another key best practice is putting in place a formal process for learning and improving AI acquisition, deployment, and maintenance.
A similar transformation model that I and other leaders in government and industry developed several years ago can guide acquisition organizations in maturing to leverage innovation like the application of AI to their craft.
This model—Acquisition of the Future—identifies levels of growth for procurement organizations, with a special focus on innovation and reform of longstanding bureaucratic processes. It is currently housed with the government-industry collaborative American Council for Technology-Industry Advisory Council.
As these and similar initiatives demonstrate, leveraging AI to develop procurement innovations can help agencies and companies work together to develop and implement acquisition strategies that clarify requirements and identify best value bidders. They can cut through the seemingly endless provisions of existing policy and guidance and enable rapid action to meet agency needs.
To help scale and sustain this evolution, hiring and training skilled acquisition professionals will be needed, and initiatives such as the federal Digital IT Acquisition Professional Training program can help.
Overall, innovations in procurement processes can complement programs to acquire AI. The key to a better procurement system involves developing a mutually reinforcing pathway for government and industry to advance best practice and public value.
This essay is part of a nine-part series entitled Artificial Intelligence and Procurement.
Almost half a trillion dollars has been wiped from the valuation of once high-flying financial technology companies that took advantage of a boom in initial public offerings, as concerns about rising interest rates, lack of profits and a potential recession put them at the sharp end of this year’s sell-off.
More than 30 fintechs have listed in the US since the start of 2020, according to CB Insights data, as investors flocked to companies they believed could benefit from a long-term shift toward digitisation accelerated by the pandemic.
However, concerns about rising interest rates, lack of profits and untested business models as the economy heads towards a potential recession have put them at the sharp end of this year’s sell-off.
Shares in recently listed fintechs have fallen an average of more than 50 per cent since the start of the year, according to a Financial Times analysis, compared with a 29 per cent drop in the Nasdaq Composite.
The decline in the valuations of publicly listed companies such as PayPal and Block — formerly known as Square — has filtered through to privately owned groups. Klarna slashed its price tag from $46bn to under $7bn in a private funding round earlier this month, and the Wall Street Journal reported last week that Stripe had cut its internal valuation by more than a quarter.
Here is the rest of the day’s news to start your week — Gordon
1. Central banks embrace big rises A Financial Times analysis has found that central banks are now, more than at any other time this century, opting for larger rate rises, laying bare the challenge of tackling soaring inflation as rate increases by the US Federal Reserve put pressure on peer institutions to follow suit.
2. GSK spin-off becomes world’s biggest standalone consumer health group Shares in Haleon began trading in London today, becoming the largest listing in the city for more than a decade, with a market valuation of £30.5bn. The owner of Sensodyne toothpaste and Panadol painkillers is the world’s biggest standalone consumer health business. The split will leave GSK free to focus on prescription drugs and vaccines.
3. Sri Lanka declares state of emergency Acting president Ranil Wickremesinghe has declared a nationwide state of emergency, two days before the country’s parliamentarians are set to pick a replacement for a president who had fled the country after months of unrest.
4. Volodymyr Zelenskyy fires security chiefs over ‘treasonous’ officials Ukraine’s president fired his head of the state security service and chief prosecutor yesterday for allegedly allowing collaboration with Russian forces by scores of officials in occupied territories.
5. PwC set for record revenues PwC’s global boss Bob Moritz said the firm would report record revenues of about $50bn this year as he defended the model of combining audit and consulting services that Big Four rival EY is threatening to abandon.
Company earnings Goldman Sachs and Bank of America report quarterly results today before the opening bell. The two Wall Street banks are likely to be scrutinised by investors after peers JPMorgan Chase and Morgan Stanley last week reported that declines in investment banking fees contributed to their disappointing earnings for the most recent quarter. Real estate investment trust company Prologis and broker Charles Schwab also report earnings before Wall Street opens. Technology company IBM will report after the market closes.
Economic data US homebuilder confidence is expected to have edged down to a reading of 66 in July from a reading of 67 in June, as rising mortgage rates threaten housing affordability for first-time buyers, weakening demand for new homes.
Market outlook Shares in Asia and Europe rose today as investors scaled back expectations of how high the Federal Reserve will increase interest rates at its next meeting. Futures trading also suggests a positive start for shares on Wall Street. Elsewhere, the euro, which fell below $1 last week for the first time in 20 years, rose 0.5 per cent to $1.014 ahead of a meeting of the European Central Bank on Thursday and oil rose back towards $100 a barrel as the dollar weakened.
Tories whittle down PM candidates Conservative MPs hold another vote to narrow the field from five hopefuls seeking to become the next prime minister to four. A TV debate last night and one on Friday saw former chancellor Rishi Sunak extend his lead.
Farnborough Airshow Boeing chief executive Dave Calhoun has defended his record at America’s largest aerospace company as he struck a bullish tone on the eve of the biennial Farnborough International Air Show which opens today and runs until Friday.
The cryptocurrency industry is at a critical juncture and the FT is here to guide you through the deepening crisis. Premium subscribers can sign up to a new weekly email written by digital assets correspondent Scott Chipolina and sent every Friday. Sign up here.
Joe Biden’s fist bump belies unease between US and Saudi Arabia The US president went to the Middle East to reset rocky relations with Saudi Arabia, win some help bringing down oil prices and to reassure the region Washington was not abandoning them. But according to one analyst whatever the White House thinks, Biden ended up delivering a “huge win” for Prince Mohammed with that fist bump.
Will war in Ukraine transform Europe’s defence sector? Russia’s invasion has prompted the continent’s governments to reverse years of shrinking defence spending. Now, they want to do more to confront an aggressive Moscow — and defence and aerospace companies hope to benefit.
Related news The chief executive of Northrop Grumman warned that western weapons stockpiles had not been built to service a lengthy war and asked for a “clear signal” from governments if the conflict in Ukraine will be protracted.
How to solve the productivity paradox Since the computer age dawned in the 1970s, we have lived with a sense of accelerating progress and innovation. Yet the postwar productivity boom has ended: except for a revival at the turn of the century, productivity has trended downward for more than 50 years. Ruchir Sharma asks, what can we do about it?
Church or cult? Inside the Moonies’ ‘world of delusion’ Tetsuya Yamagami, the man suspected of killing Japan’s former PM Shinzo Abe, is reported to have been seeking revenge against the Unification Church. The connection is the latest controversy for the church, which has built a sprawling multibillion-dollar business empire, with interests ranging from a Brazilian football club to a Californian chinchilla ranch.
Personal branding: we may cringe but it works Like networking, a personal brand is something we are told we should be cultivating. But most of us don’t because we are embarrassed. Viv Groskop suggests breaking down the nebulous term into more tangible components: your professional reputation and visibility.
Chief economics commentator Martin Wolf joined presenter Lilah Raptopoulos to discuss having the confidence to change your mind. Martin reflected on how he forms a worldview, and how his opinions have shifted over the past half-century. There is also a feature on the “gentle parenting” craze from Washington correspondent Courtney Weaver.
Thank you for reading and remember you can add FirstFT to myFT. You can also elect to receive a FirstFT push notification every morning on the app. Send your recommendations and feedback to firstname.lastname@example.org. Sign up here.
IBM was among the first companies to help shape the parameters of the IT department and the ever-evolving role of CIO, and IBM CIO Kathryn Guarini has had a front row seat to watch the role change throughout her 22-year career with the technology giant.
Prior to becoming CIO last year, Guarini was COO of the IBM Research Division, arming her with extensive experience working with emerging technology and other innovations coming from IBM.
“Back in around the 1950s is when the first CIO roles really emerged," Guarini said. "And it was just beginning to be clear that technology would be a differentiator for us — IBM was advocating on behalf of our clients to elevate the role that IT leaders would play in enterprises."
And elevate CIOs have. From their early days as IT operations managers, CIOs have evolved to hold a firm place in the C-suite where they now are “sitting at the table and influencing decisions, whether that be the investment or the strategic decisions, and reacting to those decisions by putting the right investments and priorities in place to support the needs of the business,” Guarini said.
With a background working directly with some of the biggest innovations to come out of IBM, Guarini occupies a unique space at the intersection of technology and business, even among CIOs.
Like most CIOs, Guarini has a strong focus on emerging technologies, sustainability, and employee experience — key facets in delivering business value to her organisation. But in helming IT at IBM, she is also tasked with identifying what technologies make the most sense not only for IBM but also its CIO clients.
The evolving CIO role
In the past decade, IT has catapulted from back-office function to a vital department key to the success of nearly every business.
As strategic business partners, CIOs must now ensure the organisation’s technology agenda is “driving the most meaningful impact to the business,” Guarini said. And today that means not only having a firm grasp on business priorities and how to lead IT to achieve them but also a clear understanding that today’s CIOs must make sure the tools in place are effective and don’t stand in the way of employees getting their jobs done, Guarini said.
“We try to bring the technology to improve the overall outcomes that we’re trying to achieve, … and if we can stay focused on that user experience it usually points us in the right direction on those things that matter most,” she said, adding that a large part of the CIO role involves ensuring technology can “drive efficiencies, reduce friction, and improve user experience” in the workplace.
There may be no greater example of how vital IT is to employee experience than the COVID-19 pandemic. Overnight, organisations around the globe had to pivot to remote work, whether they were prepared to or not. At IBM, which operates globally in 170 countries, the pivot to remote work was “relatively seamless,” said Guarini. The company had already been operating in a largely hybrid model, with flexible workplace best practices in place.
But even with a mostly seamless transition, there were still “additional requirements” that the pandemic put on the IT department, Guarini said.
IBM IT had to evaluate how to support that many “concurrent remote employees” with networking solutions, how to manage workflows that used to take place in person with product development and incident response, and other “interesting new challenges, both technical and process” that had to be dealt with due to the shift to remote work, she said.
Bridging the client/vendor divide
But a key remit for today’s CIOs, and one Guarini is uniquely positioned to understand, is helping organisations figure out where to make the right investments to “drive the innovation agenda,” she said, adding that CIOs must “shape the direction and influence the investments” as they are ultimately the ones responsible for ensuring that those investments adequately support business initiatives.
Here, Guarini operates in an interesting nexus — she is CIO of IBM, but as CIO, she’s also an IBM client.
“I am using the same technology that IBM develops and brings to our clients," she added. "And I want to be a few years ahead of where our clients are. I want to be an early adopter of that technology, help to validate it, make it better, demonstrate how it can scale, and meet the kind of challenges of a large-scale complex enterprise like IBM. When it can work for IBM, it will work for our clients as well."
Key to prioritising emerging technology, Guarini said, is being able to sift through the noise and identify technology that will be valuable to the organisation. Here, the CIO leans on her experience in the research arm of IBM, where evaluating trends to identify technology that will impact business was a central facet of the job.
“I think both automation and AI offer so much promise,” she said. “And we are beginning already to realise that promise and to see the benefits. But it’s not the technology in isolation — we need to think about ‘how do we marry that technology with the business process and the opportunity to really drive something that’s of value’?”
For example, IBM has deployed automation software such as Turbonomic, Red Hat Ansible Tower, and Konveyor Tackle to handle various aspects of IT automation to improve reliability, efficiency, and scale, while also cutting down on human interactions with IT systems.
The company has also embraced IBM robotic process automation (RPA) across several business areas, including management, finance, compliance, and procurement, ultimately reducing manual labor by 234,000 hours and minimising the risks that can come from human error. They have also used RPA to automate invoice processing, to link system identities, to verify access requests, to notify managers for approval, and to spot potential conflicts with assigned duties across different users.
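The duty-conflict check described above can be illustrated with a short sketch. This is not IBM's RPA tooling; the role names and conflict rules below are hypothetical, and the snippet shows only the general separation-of-duties pattern an automation bot might apply.

```python
# Hypothetical separation-of-duties rules: pairs of roles that
# a single user should never hold at the same time.
CONFLICTING_PAIRS = {
    frozenset({"create_invoice", "approve_invoice"}),
    frozenset({"request_access", "grant_access"}),
}

def find_conflicts(assignments):
    """Return {user: sorted role pair} for users whose assigned
    roles violate any separation-of-duties rule."""
    conflicts = {}
    for user, roles in assignments.items():
        held = set(roles)
        for pair in CONFLICTING_PAIRS:
            if pair <= held:  # user holds both roles in a forbidden pair
                conflicts[user] = tuple(sorted(pair))
    return conflicts

assignments = {
    "alice": ["create_invoice", "approve_invoice"],  # violates a rule
    "bob": ["create_invoice"],                       # fine on its own
}
print(find_conflicts(assignments))  # {'alice': ('approve_invoice', 'create_invoice')}
```

A real deployment would pull role assignments from an identity system and route flagged users to a manager for review rather than printing them.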
IBM has also deployed bots to help HR employees manage tasks such as job changes, department transfers, and salary adjustments. Chatbots are also being used to improve IT support, answer simple questions, manage customer feedback, and manage invoicing.
The vendor has also integrated AI into its pricing process to remove inconsistencies or inherent bias. Since deploying AI for customer support issues, Guarini said IBM has seen a 26 per cent reduction in time-to-resolution of customer support cases.
Adopting emerging technologies at IBM
Having come from the IBM Research Division, which consists of around 3,000 technical scientists and engineers who help shape the future of AI, cyber security, quantum computing, and hybrid cloud, Guarini knows a lot must happen between identifying transformational technology and deploying it.
And that responsibility is even more important when internally vetting emerging technology as an enterprise IT provider — because as any client CIO knows, no matter how promising a technology may be, there are several hurdles to clear before it can be adopted and deployed at enterprise scale.
“We have ideas of what technology should do, but sometimes the complexity of deploying some of this technology becomes more challenging than we anticipated in a design dreaming session," she said. "When we begin to deploy it as an early adopter, a sponsor user, or an anchor client; that’s when we can figure out what it will take to make this work in practice in a real-life environment. And that’s really powerful."
Guarini feels her research background makes her more likely to “partner with IBM’s R&D functions, to leverage emerging technologies at scale, and to question how we can drive innovation into our agenda,” she said.
She finds herself in a “unique position to be able to validate enterprise IT use cases,” while also ensuring that the IT department itself is an “early adopter of IBM’s own innovations,” which can ultimately help make the company’s solutions even better.
Moreover, sharing with IBM customers these real-life use-cases of IBM technology in action at scale helps “demonstrate what’s possible, brings credibility to our offerings, and builds confidence with our clients and partners,” said Guarini, who has also launched a blog, Making IT Real, where she details the various technologies IBM has embraced and how they’ve helped the organisation.
“I’ve both learned a lot and been able to bring a new perspective based on my experience in the rest of the company," she said.
"Certainly, the challenge of meeting the unique needs of our large, complex enterprise means that not all innovations are ready for large-scale production deployment. That’s okay. It’s been instructive to experiment with emerging technologies and determine what’s suitable for our needs and where we can provide feedback to enhance solutions."