A bitter battle between so-called “stakeholder” and “shareowner” capitalists is heating up again, with BlackRock CEO Larry Fink and Vivek Ramaswamy pitted against each other. If Milton Friedman had anything to say about it, however, Fink and Ramaswamy would be locking arms rather than trading blows.
Like most conflicts, the perennial stakeholder-vs-shareowner controversy is born largely of a misunderstanding. Milton Friedman’s famous 1970 essay never stated corporations should maximize profits at all costs. Instead, he wrote they should “make as much money as possible while conforming to the basic rules of society, both those embodied in law and ethical custom.” Friedman also said shareowners are the final arbiters of relevant rules and customs. “In a free enterprise, private property system, a corporate executive is an employee of the owners of the business.”
Since 1970, shareowners have broadly operated as Friedman would have hoped. They have repeatedly intervened to improve their own long-term welfare.
For example, in the 70s, the bankruptcy of Penn Central and repeated corporate bribery scandals led shareowners and the SEC to require independent audit committees. It worked. No corporate bankruptcy larger than Penn Central’s took place for more than three decades, until Enron’s in 2001. In the 80s and 90s, institutional investors led by public pension plans like CalPERS imposed performance-based executive pay guidelines and poison-pill takeover protections, in an effort to mute excesses in CEO pay and hostile takeovers. Victor Posner, who was once reported to “have the arrogance of a banana republic dictator,” had grown infamous for buying corporations like Sharon Steel and paying himself multiples of what his takeover targets earned. No more.
From 1991 to 1993, underperforming CEOs at Westinghouse, American Express, IBM, Kodak, and General Motors were removed through the actions of their shareowners, not their boards.
Later, the global financial crisis of 2007-09 uncovered serious misalignments between compensation incentives and shareowner interests, as well as widespread risk management lapses. Stronger balance sheets and professional risk management became core shareowner priorities. So did deferred compensation packages and claw-back provisions.
In the past few years, shareowner resolutions have focused more on environmental and social concerns. In 2021, 81% of Dupont’s shareowners supported a proposal requiring the company to disclose how many of its plastic pellets end up in landfills and oceans each year (apparently, about 10 trillion). In the same year, 95% of Wendy’s shareowners required management to join the “fair food program” to support safer working conditions in the COVID era, something most other fast-food companies had already done.
Each of these shareowner interventions shares the same broad cause and effect: in every case, shareowners imposed some short-term cost in exchange for more probable, long-term gains. One can also appreciate why. Most shareowners are future pensioners, plan sponsors, insurance companies, or sovereign wealth funds–i.e., individuals or institutions with long-term liabilities. These financial needs span generations of corporate leadership, not just the next few quarters. Most shareowners want their long-term welfare to be maximized.
The stakeholder paradigm has always rested on shaky ground. Corporations shouldn’t risk displacing essential growth priorities with potentially superfluous, tenuous, fleeting, and/or socially divisive ones. Stakeholder capitalists have no proven optimization models or track record upon which they could base any broad competency claims.
The same cannot be said of shareowners. They have the legal right to order corporate bosses around coupled with a proven record of success. That’s why Friedman wanted the buck to stop with shareowners, not stakeholders. Shareowners bear the direct consequences of their orders. Incentives are aligned.
It’s a sign of the times that the stakeholder-vs-shareowner debate has been resurrected. Contrived “red-vs-blue” conflicts are a pernicious, modern affliction. Besides, avid students of corporate purpose know this debate was effectively resolved seventeen years ago, by none other than Milton Friedman and Whole Foods CEO John Mackey.
In a friendly 2005 symposium with Milton Friedman, then 93, John Mackey stated that “Free-enterprise capitalism is the most powerful system for social cooperation and human progress ever conceived.” Friedman unhesitatingly agreed. “The differences between John Mackey and me regarding the social responsibility of business are … rhetorical. Strip off all the camouflage and it turns out we are in essential agreement.”
Today, shareowners legally insist companies make as much money as possible. They further expect executives to do so in a manner that is socially and environmentally sustainable. Why? Because the concepts of social and environmental consciousness have increasingly become our ethical norms, as Milton Friedman anticipated.
Free enterprise capitalism will continue to generate the greatest benefits for the greatest number–and mindful shareowners will continue to guide free enterprise capitalism to do so for as long as possible.
Stated more simply, there are no substantive differences between stakeholder welfare and long-term shareowner welfare. Larry Fink and Vivek Ramaswamy can live in peace, after all.
Terrence R. Keeley is the CEO of 1PointSix and the author of SUSTAINABLE by Columbia University Press.
The opinions expressed in Fortune.com commentary pieces are solely the views of their authors and do not necessarily reflect the opinions and beliefs of Fortune.
This story was originally featured on Fortune.com
A new general-purpose optimizer can speed up the design of autonomous systems including walking robots and self-driving vehicles.
Autonomous robots have come a long way since the fastidious Roomba vacuum. In recent years, artificially intelligent systems have been deployed in self-driving cars, warehouse packing, patient screening, last-mile food delivery, hospital cleaning, restaurant service, meal prep, and building security.
Each of these robotic systems is a product of an ad hoc design process specific to that particular system. This means that in designing an autonomous robot, engineers must run countless trial-and-error simulations, often informed by intuition. These simulations are tailored to a particular robot’s components and tasks, in order to tune and optimize its performance. Designing an autonomous robot today is, in some respects, a lot like baking a cake from scratch, with no recipe or prepared mix to ensure a successful outcome.
Now, engineers at MIT have developed a general design tool for roboticists to use as a sort of automated recipe for success. The team has devised optimization code that can be applied to simulations of virtually any autonomous robotic system and used to automatically identify how and where to tweak a system to improve a robot’s performance.
The engineers showed that the tool was able to quickly improve the performance of two very different autonomous systems: one in which a robot navigated a path between two obstacles, and another in which a pair of robots worked together to move a heavy box.
The group hopes the new general-purpose optimizer can help to speed up the development of a wide range of autonomous systems, from walking robots and self-driving vehicles, to soft and dexterous robots, and teams of collaborative robots.
The researchers, MIT graduate student Charles Dawson and ChuChu Fan, assistant professor in MIT’s Department of Aeronautics and Astronautics, presented their findings at the annual Robotics: Science and Systems conference in New York.
Dawson and Fan realized the need for a general optimization tool after observing a wealth of automated design tools available for other engineering disciplines.
“If a mechanical engineer wanted to design a wind turbine, they could use a 3D CAD tool to design the structure, then use a finite-element analysis tool to check whether it will resist certain loads,” Dawson says. “However, there is a lack of these computer-aided design tools for autonomous systems.”
Normally, a roboticist optimizes an autonomous system by first developing a simulation of the system and its many interacting subsystems, such as its planning, control, perception, and hardware components. She then must tune certain parameters of each component and run the simulation forward to see how the system would perform in that scenario.
Only after running many scenarios through trial and error can a roboticist then identify the optimal combination of ingredients to yield the desired performance. It’s a tedious, overly tailored, and time-consuming process that Dawson and Fan sought to turn on its head.
“Instead of saying, ‘Given a design, what’s the performance?’ we wanted to invert this to say, ‘Given the performance we want to see, what is the design that gets us there?’” Dawson explains.
The researchers developed an optimization framework, or a computer code, that can automatically find tweaks that can be made to an existing autonomous system to achieve a desired outcome.
The heart of the code is based on automatic differentiation, or “autodiff,” a programming tool that was developed within the machine learning community and was used initially to train neural networks. Autodiff is a technique that can quickly and efficiently “evaluate the derivative,” or the sensitivity to change, of any parameter in a computer program. Dawson and Fan built on recent advances in autodiff programming to develop a general-purpose optimization tool for autonomous robotic systems.
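The “autodiff” idea can be illustrated with a minimal sketch. The dual-number class below is an invented toy for this article, not the researchers’ implementation or a real autodiff library: each value carries its derivative along with it, so differentiating a program amounts to simply running it with a seeded input.

```python
# Minimal forward-mode automatic differentiation via dual numbers.
# A toy illustration of the autodiff concept, not the MIT tool itself.

class Dual:
    """A number that carries both its value and its derivative
    with respect to one chosen input."""
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u * v)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate df/dx at x by seeding the input with derivative 1."""
    return f(Dual(x, 1.0)).deriv

# Any program built from + and * is differentiated exactly, not
# approximated numerically:
f = lambda x: 3 * x * x + 2 * x + 1   # f'(x) = 6x + 2
print(derivative(f, 4.0))             # 26.0
```

Real autodiff systems extend this same bookkeeping to every operation in a simulator, which is what lets the tool “dig into the code that defines a simulator.”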
“Our method automatically tells us how to take small steps from an initial design toward a design that achieves our goals,” Dawson says. “We use autodiff to essentially dig into the code that defines a simulator, and figure out how to do this inversion automatically.”
The team tested their new tool on two separate autonomous robotic systems, and showed that the tool quickly improved each system’s performance in laboratory experiments, compared with conventional optimization methods.
The first system comprised a wheeled robot tasked with planning a path between two obstacles, based on signals that it received from two beacons placed at separate locations. The team sought to find the optimal placement of the beacons that would yield a clear path between the obstacles.
They found the new optimizer quickly worked back through the robot’s simulation and identified the best placement of the beacons within five minutes, compared to 15 minutes for conventional methods.
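As a rough illustration of this “performance to design” inversion, here is a toy example in the spirit of the beacon-placement task. The one-line simulator, the loss function, and all the numbers are invented for the sketch; this is not the MIT team’s code, and their tool obtains gradients via autodiff rather than the finite differences used here.

```python
# Toy design optimization: instead of asking how a given beacon
# placement performs, ask gradient descent to find the placement
# that yields the desired path. Invented stand-in, not the paper's method.

def simulate(beacon_x):
    """Stand-in simulator: the robot steers toward the beacon, so the
    path's lateral position is a smooth function of beacon placement."""
    return 0.5 * beacon_x + 1.0

def loss(beacon_x, gap_center=3.0):
    """Squared distance between the robot's path and the obstacle gap."""
    return (simulate(beacon_x) - gap_center) ** 2

def optimize(design, lr=0.5, steps=200, eps=1e-6):
    """Finite-difference gradient descent over the design parameter.
    (An autodiff-based tool would get this gradient exactly instead.)"""
    for _ in range(steps):
        grad = (loss(design + eps) - loss(design - eps)) / (2 * eps)
        design -= lr * grad
    return design

best = optimize(design=0.0)
print(round(best, 3))   # the beacon converges to x = 4.0, where loss ~ 0
```

The point of the inversion is visible in the loop: the simulator is never queried over a grid of candidate designs; the gradient tells the optimizer directly which way to nudge the design at each step.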
The second system was more complex, comprising two wheeled robots working together to push a box toward a target position. A simulation of this system included many more subsystems and parameters. Nevertheless, the team’s tool efficiently identified the steps needed for the robots to accomplish their goal, in an optimization process that was 20 times faster than conventional approaches.
“If your system has more parameters to optimize, our tool can do even better and can save exponentially more time,” Fan says. “It’s basically a combinatorial choice: As the number of parameters increases, so do the choices, and our approach can reduce that in one shot.”
The team has made the general optimizer available to download, and plans to further refine the code to apply to more complex systems, such as robots that are designed to interact with and work alongside humans.
“Our goal is to empower people to build better robots,” Dawson says. “We are providing a new building block for optimizing their system, so they don’t have to start from scratch.”
Reference: “Certifiable Robot Design Optimization using Differentiable Programming” by Charles B Dawson and Chuchu Fan, June 2022, Robotics: Science and Systems 2022.
This research was supported, in part, by the Defense Science and Technology Agency in Singapore and by the MIT-IBM Watson AI Lab.
<p>The University of Technology, Sydney (UTS) has selected IBM Cognos 8.3 as its new student management reporting tool to deliver information faster to students and improve productivity.</p>
<p>UTS is transforming the way it uses information, allowing the University to gain valuable insight about what is happening across the organisation to help drive smarter decisions.</p>
<p>UTS Student Administration is responsible for managing information for around 33,000 students via the Student Management portal - from student applications and enrolments through to curriculum documents, examination results and alumni data. To support this flow of data, UTS Student Centre staff process an average of 15 student inquiries per minute.</p>
<p>The Student Management portal application (also known as Ci) introduces .NET technology and workflow management, allowing for better student self-management and staff-to-student interactions, overall giving staff more time to focus on business improvement.</p>
<p>The new business intelligence tool leverages the Student Management source system for reporting, in turn providing greater flexibility of report format and prompting, audit capability to better understand usage, as well as integration with the Student Management portal application.</p>
<p>“Making IBM Cognos the common reporting platform across the University offers savings in licence costs, savings in software administration and maintenance, and capitalises on the existing IBM Cognos skills within the University. It also offers the future potential for a consolidated portal for all strategic University information,” said Miranda Brookes, Student Systems Implementation Project Manager, University of Technology, Sydney.</p>
<p>The ability to generate customised reports and automate standard processes makes it possible for UTS staff to conduct in-depth analysis rather than manual report production, helping the University stay on top of critical daily operations.</p>
<p>This technology will also power the University's management portal which provides performance, benchmarks and other information to UTS decision makers.</p>
<p>The new reporting solution includes web services for greater flexibility in the delivery of reports to staff and improved auditing capabilities through recording of information such as who is using the system, which reports and queries are being run, and the cost of queries to system resources. The University will also be using barcoding functionality to generate individual barcodes for each invoice prepared through the student management system.</p>
<p>Although originally driven by the need for customised reports, the decision to maintain separate reporting and student management applications also supports UTS' risk mitigation strategy by safeguarding reports from any application bugs or crashes.</p>
<p>A major factor in the selection of IBM Cognos was UTS' extensive and successful experience with Cognos over several years. At the same time, IBM Cognos Software Group Services demonstrated a successful prototype of converting reports using IBM Cognos 8.3.</p>
<p>“Using the IBM Cognos Services team in a quality assurance role has been a major help to the project. They've been here in our offices every week giving fast responses to any queries or issues and helping us to understand best practices,” Brookes added.</p>
<p>Deployment of IBM Cognos 8.3 is now under way with the conversion of approximately 170 customised reports. The project is being managed by UTS with support from IBM implementation partner UXC Performance Management and IBM Cognos Guardian Services. After the system goes live at the end of September 2009, UTS staff will begin development of dashboards and drill-down capabilities for more comprehensive analysis.</p>
<p>Over the past twenty years the University of Technology has developed a strong reputation for innovative technologies and solutions, industry collaboration and research.</p>
<p>For more information about the University of Technology, Sydney, visit www.uts.edu.au</p>
<p>For more information on IBM, please visit www.ibm.com</p>
A four-year bachelor’s degree has long been the first rung to climbing America’s corporate ladder.
But the move to prioritize skills over a college education is sweeping through some of America’s largest companies, including Google, EY, Microsoft, and Apple. Strong proponents say the shift helps circumvent a needless barrier to workplace diversity.
“I really do believe an inclusive diverse workforce is better for your company, it’s good for the business,” Ginni Rometty, former IBM CEO, told Fortune Media CEO Alan Murray during a panel last month for Connect, Fortune’s executive education community. “That’s not just altruistic.”
Under Rometty’s leadership in 2016, tech giant IBM coined the term “new collar jobs” in reference to roles that require a specific set of skills rather than a four-year degree. It’s a personal commitment for Rometty, one that hits close to home for the 40-year IBM veteran.
When Rometty was 16, her father left the family, leaving her mother, who’d never worked outside the home, suddenly in the position to provide.
“She had four children and nothing past high school, and she had to get a job to…get us out of this downward spiral,” Rometty recalled to Murray. “What I saw in that was that my mother had aptitude; she wasn’t dumb, she just didn’t have access, and that forever stayed in my mind.”
When Rometty became CEO in 2012 following the Great Recession, the U.S. unemployment rate hovered around 8%. Despite the influx of applicants, she struggled to find employees who were trained in the particular cybersecurity area she was looking for.
“I realized I couldn’t hire them, so I had to start building them,” she said.
In 2011, IBM launched a corporate social responsibility effort called the Pathways in Technology Early College High School (P-TECH) in Brooklyn. It’s since expanded to 11 states in the U.S. and 28 countries.
Through P-TECH, Rometty visited “a very poor high school in a bad neighborhood” that received the company’s support, as well as a community college where IBM was offering help with a technology-based curriculum and internships.
“Voilà! These kids could do the work. I didn’t have [applicants with] college degrees, so I learned that propensity to learn is way more important than just having a degree,” Rometty said.
Realizing the students were fully capable of the tasks that IBM needed moved Rometty to return to the drawing board when it came to IBM’s own application process and whom it was reaching. She said that at the time, 95% of job openings at IBM required a four-year degree. As of January 2021, less than half do, and the company is continuously reevaluating its roles.
For the jobs that now no longer require degrees and instead rely on skills and willingness to learn, IBM had always hired Ph.D. holders from the very best Ivy League schools, Rometty told Murray. But data shows that the degree-less hires for the same jobs performed just as well. “They were more loyal, higher retention, and many went on to get college degrees,” she said.
Rometty has since become cochair of OneTen, a civic organization committed to hiring, promoting, and advancing 1 million Black individuals without four-year degrees within the next 10 years.
If college degrees no longer become compulsory for white-collar jobs, many other qualifications—skills that couldn’t be easily taught in a boot camp, apprenticeship program, or in the first month on the job—could die off, too, University of Virginia Darden School of Business professor Sean Martin told Fortune last year.
“The companies themselves miss out on people that research suggests…might be less entitled, more culturally savvy, more desirous of being there,” Martin said. Rather than pedigree, he added, hiring managers should look for motivation.
That’s certainly the case at IBM. Once the company widened its scope, Rometty said, the propensity to learn quickly became more of an important hiring factor than just a degree.
Digital transformation is never going to be done. As technologies continue to evolve and emerge, companies need to keep up and continue with their own transformations. While some of my digital transformation trends predictions for 2022 were correct, others have just barely scratched the surface and will continue to be trending in 2023. So, what will we be seeing in the year ahead? Will customer experience and data make headlines, or will the metaverse come to the forefront of the conversation? Based upon hundreds of conversations with the world’s most prolific tech companies and consumers, here are 10 trends that I feel will continue to be top of mind in 2023.
Organizations everywhere are doing more with less. Many are still navigating staffing shortages as employees left the workplace in droves because of the pandemic. As a result, the employees that are left in the workplace are burnt out. They are jumping between platforms, searching for information, and spending a lot of time doing repetitive tasks. According to a survey from Asana, employees report spending only one-third of their day on work they were hired to do. That’s not conducive to business which is why I see many organizations finally starting to turn to technology as a solution.
The ability to streamline processes and drive efficiency can be huge for the bottom line, which is why I think automation and intelligent automation will be key in 2023. The Automation Now & Next Report, a survey our team at Futurum Research did in conjunction with Automation Anywhere, found that 61% of organizations are turning to automation to deal with staffing issues. For the year ahead, 94% say shifting employees to higher value work is a priority — which makes sense. Get employees working on tasks that advance the business instead of time-consuming, repetitive ones, and your bottom line is bound to improve.
I think there will be a focal point on solutions like Microsoft’s Power Platform, Red Hat’s Ansible, ServiceNow, and other low-code or no-code solutions that will fit easily into any organization’s tech stack. These solutions will streamline processes across the enterprise and will enable organizations to do more with less, and with more efficiency.
This was a trend that I predicted last year, and it will remain true for the coming year. Court rulings, antitrust legislation, and other regulations will continue to impact Amazon, Apple, Google, Meta, and other big tech companies as governments look to eliminate or mitigate the monopolistic practices in ecommerce, digital ads, search results, acquisitions, and app stores.
While most antitrust legislation has stalled or made very slight movement in Congress here in the U.S., the EU has passed the Digital Markets Act, which will be enforced starting in March 2023. The DMA will regulate big tech companies that act as “gatekeepers” due to their market position, in areas such as data collection, platform interoperability, bundled offerings, and pre-installed apps.
We will have to wait and see how this is enforced, but if history is any indication, we will likely see some fallout that could impact several companies. But will it trickle down to consumers? Or will consumers get more choice and competition in the market, as the act sets out to accomplish? Again, only time will tell. It’s a complex topic, especially because I believe recent anticompetition efforts have been less about protecting consumers and more about protecting competition, given society’s attachment to ubiquitous platforms like Apple, Amazon, and Google. Regardless, I will also be watching to see how U.S. legislators respond at either the federal or state level, much as happened once the GDPR passed.
Big tech is crucial in our lives — we can’t live without it. So, I don’t think the cycle of alleged abuses, lawsuits, legislation, appeals, and more lawsuits will ever truly end. It will just be interesting to see what comes to fruition in the coming year or if big tech can continue their streak winning cases and avoiding strict regulations.
As the adoption of cloud-native infrastructure and serverless, container technologies have boomed in the last few years, IT departments have had to change their monitoring practices. What worked for legacy systems, no longer works for these new technologies. Observability enables complete visibility across complex infrastructures for everyone in IT from system administrators to developers. And with that visibility, IT can detect and fix errors before they become bigger problems, and security can detect and mitigate threats before they attack. Which is why observability will be a huge trend in 2023.
Companies like Splunk, IBM, Cisco, and ServiceNow have all made big bets on observability. ServiceNow, for instance, has made several acquisitions, from Lightstep last year to Era Software just this week, to unify observability throughout its platform, while Cisco shared its big bets on observability at this year’s Cisco Live event.
Did someone say environmental, social, governance? This was a major trend in 2021 and 2022 and will continue likely for years to come. Investments to make operations more sustainable, new partnerships to develop technologies to reduce our impact on the climate, and pledges to reduce carbon footprints have been in headlines this year and I believe they will be again in 2023.
The tech community has really taken climate efforts into its own hands with pledges to achieve carbon neutrality by 2050. And these efforts can’t start soon enough. A recent study from the U.N. showed that we have not done enough to stem the climate crisis and are currently on track to see a global 2.7°C temperature increase in the coming decades, a seemingly slight change that would have detrimental consequences for everyone.
Which is why I’m pleased to see so many companies continue to take part in Amazon’s climate pledge, which now has over 350 signatories. Other companies like Microsoft, Intel, and SAP (among many others) have beefed up their sustainability offerings, making it easier for other companies to track and report on their sustainability initiatives. This will likely continue in 2023 as consumers and shareholders alike put pressure on businesses to create ESG programs.
This trend will continue to be at the forefront, and while the tech industry still has much to prove that this is more than just talk, it is encouraging to see progress, so long as companies figure out how to balance sustainable practices, customer satisfaction, and profitable business operations.
The metaverse was an honorable mention last year but has made the full list of predictions for 2023. The metaverse has slowly started to grow in the last year with more companies making plans for business in the metaverse. While I still think we are a few years away from a full-blown metaverse, I think we are going to see more practical plans for what it might look like.
We’ve already seen changes to how money flows in the metaverse, with NFTs, cryptocurrency, and decentralized finance capturing more headlines in 2021. Companies will likely look to continue to capitalize on this growth with more offerings as they position themselves wisely for the future of the metaverse. And while companies like Apple and Meta will squabble over what the future will look like, the fact that there is already proposed legislation for management of the metaverse in the EU, South Korea, and Japan shows that this fad, as some have called it, isn’t going away any time soon.
For me, the metaverse is less about what we are hearing from Mark Zuckerberg and more in line with what CEOs like NVIDIA’s Jensen Huang are sharing in the build-out of its Omniverse offering. It’s going to be about bringing immersion into our physical world more than the other way around. We are in the era of creation. From data replication to the build-out of virtual, autonomous, simulated environments, the metaverse for industry, in areas like smart cities and digital twins, is something that is being done now—and this will continue to gain strong momentum in 2023.
In 2022, hybrid work has really taken center stage with more and more organizations allowing employees to take advantage of flexible work schedules. As a result, collaboration companies have released new features and adjusted platforms to make collaboration easy, seamless, and equitable regardless of location.
This will be our new normal. And I think it will only get better as collaboration platforms will begin to be integrated with systems of record, making it easier for employees to do work across tech stacks in the enterprise. Siloes are collapsing everywhere and collaboration is now at the center of how we operate — and as businesses find more success, it will never change.
Microsoft Teams will continue to lead the pack, bringing together synchronous and asynchronous collaboration and merging with applications and productivity tools. However, I like the recent moves made by Salesforce with Slack Canvas and Huddles. Of course, Zoom was the pandemic darling and has been aggressively expanding its platform as well. HP bought Poly last year to bring greater convergence of hardware and personal devices, and Cisco has always been deeply entrenched in the collaboration space as well.
Collaboration will increasingly span the physical and digital—true immersion—and it will be further enhanced by AI, the metaverse, and 5G connectivity. It is also more than just meetings and events: it extends to collaboration on experiences, which is what drove the Adobe-Figma deal to be one of the largest and most talked about of 2022.
Automotive technologies have gone into hyperdrive in the last year with partnerships, investments, and technology developments from companies like Qualcomm, Mobileye, NVIDIA, Marvell, Luminar, and Plus happening left and right. The software-defined vehicle is the vehicle of the future. We have seen semiconductor companies take charge of the future of the vehicle. Tesla has done it with its largely home-grown technology, but the likes of Mercedes, VW, BMW, and GM are turning to chipmakers like those mentioned above to get it done. Qualcomm saw its design pipeline swell to $30 billion as large automakers are seeking to build the cars of the future on intelligent computing platforms. This trend is locked and will be a story line throughout 2023.
In 2022, we have seen the debut of impressive chips to power everything from infotainment systems to ADAS as well as more advanced LiDAR technology that will make self-driving cars safer. We’ve also seen a number of agreements with OEMs and automotive tech companies to equip everything from cars to semitrucks with next-generation technology. I think 2023 will be the year we will see sustained progress in getting safer, highly autonomous vehicles on the road. We will also probably see more tests of higher level ADAS (L3, L4) in a variety of road conditions as the safety of these systems will still be a major question for a lot of consumers.
As for electric vehicle production, now that California has adopted a rule ending the sale of new gas-powered vehicles by 2035, there will likely be major infrastructure shifts in the year to come. This rule will bring complexity to a state that already has electric infrastructure challenges, but it is indicative of what is to come as we seek to be more sustainable. With this, we will need more batteries, more charging stations, and grids that can support the influx of electric demand. We are just at the start of this change and, as a bit of a car guy, I can't wait to see what will be developed this year.
Much like automation, analytics and artificial intelligence will continue to make their way into every part of our business and lives. Indeed, AI deeply shapes the other trends here, from autonomous vehicles to multicloud to better collaboration experiences.
As I see it, AI goes from being a self-contained subject of interest to a largely embedded technology that impacts more and more of our everyday work and life. For instance, we are seeing the continued improvement of conversational AI systems making our day-to-day brand interactions more valuable. From chatbots that can handle multi-turn conversations to smarter Alexa devices, we are having thoughtful interactions with machines, and it has happened almost seamlessly with new software and hardware updates.
Furthermore, recommender engines powered by technologies from companies like NVIDIA are making our digital interactions better, perhaps to the point of being a little too good. The ability of AI and ML to understand our behavior and make intelligent suggestions for what we buy, where we eat, who we talk to, and how we work is becoming more and more integrated into our lives. This is improving our in-app experiences as well as delivering better, more proactive customer experiences.
At the core of AI will continue to be our semiconductor designers and manufacturers. Software gets the credit, but it will be the continued innovation of Intel, AMD, Qualcomm, NVIDIA, and more that powers the CPUs, GPUs, IPUs, and DPUs enabling data to drive insight, optimization, and real-time interactions.
We are seeing the likes of Microsoft, Salesforce, Apple, Google, and Amazon embed AI deeply into our work apps, vehicles, and personal devices—this will snowball in 2023 as AI is part of almost every digital experience in our lives.
While 2022 was all about hybrid cloud, 2023 will see a further shift from hybrid to multicloud. Organizations want to optimize their cloud usage by tapping into offerings from numerous providers like AWS, HPE, Google, Microsoft Azure, Dell, Oracle, IBM (Red Hat), and more. This is becoming more and more normal as organizations want to leverage the best that is available on the market, at greater price efficiency—especially in a tougher macroenvironment.
Public cloud players have made their offerings more extensible over the last few years, offering that desired flexibility, especially to deal with redundancy, scalability, compliance, and other challenges that favor multicloud. This will continue as providers offer more open-source solutions and modular offerings that make it easier to orchestrate workloads across the IT environment. And as enterprise organizations find success, this trend will only continue to grow. Which is a great transition to my last trend…
Last year, I predicted we would witness peak Everything-as-a-Service in 2022, and boy did we. Capital expenditures and overspending on unused software and infrastructure have been replaced by operational expenses that are purchased as needed. This has led to major company pivots like the one HPE made with its GreenLake offering, moving almost its entire portfolio to an as-a-service model. Other traditionally capex-heavy hardware and software providers like Splunk, Cisco, and Dell have all done the same.
2023 will see this continue to grow. As we face a looming recession and organizations slash budgets and limit expenditures, I think we are going to see more XaaS offerings, from the multicloud offerings I just discussed to security and collaboration services. Pay-as-you-go allows organizations to scale up and down as needed. As a result, tech companies — which are also struggling with reduced revenue and tempered guidance — are realizing the value in transitioning from licensing to more flexible offerings. It will be interesting to see, though, how tech companies position and market themselves to win that revenue.
While we will have to wait to see how the economy fluctuates in 2023 — and right now it’s not looking too good — I think the convenience of consumption models will continue to be the reason that we see XaaS offerings last, once the economy does eventually rebound.
Ten is never enough, and I can think of others that deserve at least an honorable mention. I'm certain that cybersecurity will be a hot commodity in 2023 as companies look to shore up their data environments. And speaking of data, we will almost certainly see a bigger focus on first-party data and consumer privacy. Quantum computing will continue to gain momentum and grow, albeit it is not yet primetime-ready. And semiconductor companies will take this slowing period in chips and the economy to continue to innovate on process and manufacturing capacity to reduce the risk of another shortage.
2023 is setting up to be another fascinating year in technology and while markets continue to make us all a bit uneasy, it’s all but certain technology is our best path forward as we seek to return to the next period of economic growth.
By The Valuentum Team
International Business Machines Corporation (NYSE:IBM) has become a fundamentally different business in the past few years, one focused on providing hybrid cloud computing offerings. The company is a stellar free cash flow generator which enables IBM to reward investors via generous dividend increases, with shares of IBM yielding ~5.1% as of this writing. Substantial near-term headwinds remain, largely due to the various exogenous shocks seen of late (such as major inflationary pressures, rising interest rates, supply chain hurdles, and raging geopolitical tensions), though IBM is still worth considering as a high-yielding income generation idea.
IBM solves business problems via integrated hardware/software solutions that leverage IT and its knowledge of business processes. Its solutions help reduce a client's costs or enable new capabilities that generate revenue. The company was founded in 1924 and is headquartered in New York.
Back in 2019, IBM bought Red Hat (a top provider of open-source cloud software) in a ~$34 billion deal that made IBM a contending hybrid cloud provider. IBM is looking to seize what it describes as a ~$1 trillion hybrid cloud opportunity, and recent growth in this area has been encouraging. IBM's revamped management team is working hard to turn things around after the company made various blunders during the 2010s. Its current Chairman and CEO, Arvind Krishna, has done a solid job righting the ship at IBM since taking on the top role in 2020.
In November 2021, IBM spun off its legacy business tax-free to shareholders as a new publicly traded entity, Kyndryl Holdings, Inc. (KD). Initially, IBM retained a 19.9% stake in Kyndryl though the firm intends to exit that position within 12 months of the spinoff.
On July 18, IBM reported earnings for the second quarter of 2022 that beat both consensus top- and bottom-line estimates. Its GAAP revenues rose by 9% year-over-year to hit $15.5 billion with strong growth at its Red Hat, various consulting services, and hybrid infrastructure offerings being key here. When removing foreign currency headwinds arising from the strong US dollar seen of late from the picture, IBM's non-GAAP constant currency revenues were up 16% year-over-year last quarter. IBM's portfolio optimization efforts are having a very powerful impact on its financial performance.
The firm's GAAP gross margin fell by ~185 basis points year-over-year last quarter, to 55.4%. However, economies of scale helped drive its GAAP income from continuing operations up by 81% year-over-year in the second quarter, rising to $1.5 billion. There is some noise here due to the separation of IBM's legacy businesses (via the spinoff of Kyndryl) from its core operations. Keeping that noise in mind, IBM's underlying operations have performed quite well of late.
During its second quarter earnings call, IBM's management team noted the firm now forecasted that its full-year free cash flows would come in near $10.0 billion in 2022, at the low end of its previous forecast. IBM generated $3.6 billion in free cash flow (defined as net operating cash flow less 'payments for property, plant, and equipment' and 'investment in software') while spending $3.0 billion covering its dividend obligations during the first half of 2022. Its modest share repurchases during this period were related to tax withholding purposes as the new IBM is focused on retaining cash to invest in the business. We appreciate that IBM's dividend obligations remain well-covered by its traditional free cash flows.
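Those reported figures imply straightforward dividend coverage arithmetic. A minimal sketch using the article's free cash flow definition (variable names are illustrative, not IBM's reporting labels):

```python
# Free cash flow per the article's definition: net operating cash flow
# less payments for property, plant, and equipment and software investment.
# First-half 2022 figures, in $ billions, as reported above.
fcf_h1 = 3.6        # free cash flow generated in H1 2022
dividends_h1 = 3.0  # cash dividends paid in H1 2022

# Coverage ratio: how many times free cash flow covers the dividend.
coverage = fcf_h1 / dividends_h1
print(f"H1 2022 FCF dividend coverage: {coverage:.2f}x")  # 1.20x
```

A ratio above 1.0 is what "well-covered by its traditional free cash flows" means in practice, though a 1.2x first-half figure leaves only modest headroom.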
The company exited June 2022 with a net debt load of $42.8 billion (inclusive of short-term debt, exclusive of restricted cash). One of the biggest risks to IBM's dividend is its large net debt load. IBM had $7.6 billion in cash, cash equivalents, and current marketable securities on hand at the end of June 2022 which provides the company with ample liquidity to meet its near-term funding needs.
IBM continues to expect that its constant currency revenues will grow decently this year (in the mid-single digit range), though sustained foreign currency headwinds are expected to offset strong demand for its offerings, to a degree. Over the long haul, we forecast that under its new management team, IBM will return to stable revenue growth which in turn should see the company's free cash flows swell higher. That would allow IBM to boost its dividend in a sustainable manner going forward, though we caution that its net debt load could limit the size of any future payout increases.
The Dividend Cushion Ratio Deconstruction, shown in the image up above, reveals the numerator and denominator of the Dividend Cushion ratio. At the core, the larger the numerator, or the healthier a company's balance sheet and future free cash flow generation, relative to the denominator, or a company's cash dividend obligations, the more durable the dividend.
The Dividend Cushion Ratio Deconstruction image puts sources of free cash in the context of financial obligations next to expected cash dividend payments over the next 5 years on a side-by-side comparison. Because the Dividend Cushion ratio and many of its components are forward-looking, our dividend evaluation may change upon subsequent updates as future forecasts are altered to reflect new information.
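The mechanics described above can be sketched in a few lines. This is a stylized illustration only: the function and all forecast inputs are hypothetical placeholders, not Valuentum's actual model or IBM's forecast figures; only the $42.8 billion net debt figure comes from the article.

```python
def dividend_cushion(net_cash_now, future_fcf_5yr, future_dividends_5yr):
    """Stylized Dividend Cushion ratio: balance-sheet net cash plus
    expected 5-year free cash flow (numerator), divided by expected
    5-year cash dividend obligations (denominator). Values above 1
    suggest a durable payout; values below 1 suggest potential strain."""
    numerator = net_cash_now + sum(future_fcf_5yr)
    denominator = sum(future_dividends_5yr)
    return numerator / denominator

# Hypothetical example: negative net cash (net debt) weighs on the
# numerator even when free cash flow forecasts are solid.
ratio = dividend_cushion(
    net_cash_now=-42.8,                            # net debt, $B (reported)
    future_fcf_5yr=[10, 10.5, 11, 11.5, 12],       # illustrative only, $B
    future_dividends_5yr=[6, 6.1, 6.2, 6.3, 6.4],  # illustrative only, $B
)
print(f"Dividend Cushion ratio: {ratio:.2f}")
```

With placeholder inputs like these, the large net debt load drags the ratio below 1, which is the pattern the article describes for IBM.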
In the context of the Dividend Cushion ratio, IBM's numerator is smaller than its denominator, which suggests weak forward-looking dividend coverage. However, given IBM's strong and stable cash flow profile, we view its forward-looking dividend coverage favorably when taking into account IBM's ability to tap capital markets. Should IBM stumble for any reason, however, its ability to make good on its payout could be in danger.
The best measure of a firm's ability to create value for shareholders is expressed by comparing its return on invested capital ['ROIC'] with its weighted average cost of capital ['WACC']. The gap or difference between ROIC and WACC is called the firm's economic profit spread. IBM's 3-year historical return on invested capital (without goodwill) is 41.6%, which is above the estimate of its cost of capital of 9.2%.
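With the figures cited above, the economic profit spread is simple arithmetic:

```python
roic = 0.416  # IBM's 3-year historical ROIC (without goodwill), per above
wacc = 0.092  # estimated weighted average cost of capital, per above

# Economic profit spread: a positive spread indicates the firm earns
# returns on invested capital above its cost of capital, creating value.
spread = roic - wacc
print(f"Economic profit spread: {spread:.1%}")  # 32.4%
```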
In the chart down below, we show the probable path of ROIC in the years ahead based on the estimated volatility of key drivers behind the measure. The solid grey line reflects the most likely outcome, in our opinion, and represents the scenario that results in our fair value estimate. Assuming IBM's latest portfolio optimization efforts go as planned, the firm's ability to generate shareholder value (which historically has been impressive) should continue to improve.
Our discounted cash flow process values each firm on the basis of the present value of all future free cash flows, net of balance sheet considerations. We think IBM is worth $136 per share with a fair value range of $101-$171 per share. Shares of IBM are trading moderately below our fair value estimate as of this writing.
The near-term operating forecasts used in our enterprise cash flow model, including revenue and earnings, do not differ much from consensus estimates or management guidance. Our model reflects a compound annual revenue growth rate of 3.4% during the next five years, a pace that is higher than the firm's 3-year historical compound annual growth rate of -10.3%.
Our model reflects a 5-year projected average operating margin of 17.6%, which is above IBM's trailing 3-year average. Beyond Year 5, we assume free cash flow will grow at an annual rate of 2% for the next 15 years and 3% in perpetuity. For IBM, we use a 9.2% weighted average cost of capital to discount future free cash flows.
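The assumptions above can be assembled into a toy three-stage DCF. This is a sketch only: the $10 billion starting free cash flow is a hypothetical placeholder, and the actual enterprise cash flow model also nets out balance sheet items and other adjustments not shown here.

```python
def dcf_value(fcf_year1, growth_5yr, growth_next15, growth_perp, wacc):
    """Toy three-stage DCF: 5 years at growth_5yr, 15 more years at
    growth_next15, then a Gordon-growth terminal value at growth_perp.
    Returns the present value of all future free cash flows."""
    pv, fcf = 0.0, fcf_year1
    for year in range(1, 21):
        if year > 1:
            fcf *= (1 + growth_5yr) if year <= 5 else (1 + growth_next15)
        pv += fcf / (1 + wacc) ** year
    # Terminal value at end of year 20, discounted back to today.
    terminal = fcf * (1 + growth_perp) / (wacc - growth_perp)
    return pv + terminal / (1 + wacc) ** 20

# Growth and discount assumptions from the text; starting FCF is illustrative.
value = dcf_value(fcf_year1=10.0, growth_5yr=0.034,
                  growth_next15=0.02, growth_perp=0.03, wacc=0.092)
print(f"Present value of the FCF stream: ${value:.0f}B")
```

Note that the perpetuity stage only converges because the 3% terminal growth rate sits below the 9.2% discount rate; raising the WACC lowers the valuation, which is why fair value estimates carry a range rather than a point.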
Although we estimate IBM's fair value at about $136 per share, every company has a range of probable fair values that's created by the uncertainty of key valuation drivers (like future revenue or earnings, for example). After all, if the future were known with certainty, we wouldn't see much volatility in the markets as stocks would trade precisely at their known fair values.
In the graphic up above, we show this probable range of fair values for IBM. We think the firm is attractive below $101 per share (the green line), but quite expensive above $171 per share (the red line). The prices that fall along the yellow line, which includes our fair value estimate, represent a reasonable valuation for the firm, in our opinion.
The steady decline in IBM's legacy business since 2010 represents a major reason why the firm spun off Kyndryl in November 2021. Going forward, IBM will need to prove that as a leaner and more focused enterprise, it can maintain solid revenue and operating income growth over the long haul. We think that will be the case, though substantial near-term headwinds remain. Investors looking for an income generation idea backed up by a strong cash flow profile should take a closer look at IBM.
Though quantum computing technology is still new, JPMorgan Chase, Ally Bank, Credit Agricole and other banks are actively testing and in some cases using it, according to speakers at the HPC + AI on Wall Street conference in New York this week.
"We realize that if a company doesn't do anything about the market right now, and just waits for quantum advantage to become a reality, when quantum advantage becomes real, it might be too late," said Marco Pistoia, managing director, distinguished engineer, head of global technology applied research and head of quantum computing at JPMorgan Chase. "We want to be ready when quantum advantage becomes possible on a higher level."
These banks are not attempting to buy and use quantum computers directly. They are using cloud-based quantum-computing-as-a-service offerings from companies like D-Wave, IBM, Google, Amazon, Rigetti, Microsoft and QC Ware. They're testing the advanced computing power on complex problems like portfolio optimization and index tracking.
The banks are seeking improvements in speed, as well as greater precision in simulations and calculations for risk analysis, fraud detection and pricing of complex derivatives.
"Classical computing is reaching its problem-solving and data analytical limits," said Heather West, research manager, infrastructure systems, platforms and technologies at IDC. "As a result, the financial industry, as well as many other industries, is looking for a way to expand compute capabilities. As a result, quantum computing is seen as an industry disruptor. In the financial industry, it's offering a way to solve new problems, it's offering a way to obtain processing speeds that currently aren't available using classical infrastructure, including high-performance computing and supercomputers."
Using quantum computing, "financial institutions will be able to produce better, more accurate predictions and risk assessments in almost-real time," she said.
In a survey of financial institution leaders West conducted in 2021, 25% said they are currently investing in quantum computing technology and 43% said they planned to invest in 2022. The surveyed bankers are experimenting with the use of quantum computing for a wide variety of use cases that include ATM cash allocation, credit scoring, derivative pricing, fraud detection, compliance and transaction settlement.
"While today's quantum computing technology is nascent, it is well suited for experimenting with optimization problems, making this a prime time for financial institutions to begin experimenting and identifying use cases suitable for running on quantum computing systems," West said. Banks should also be developing the quantum algorithms and applications that will be needed to run such problems once quantum systems are scaled to a point where quantum advantage can be achieved, she said.
Quantum computing directly leverages quantum mechanics, the laws of physics that govern the smallest particles in the universe, to solve certain problems at high speed. A traditional computer's bits can hold only one state (0 or 1) at a time. A quantum computer uses qubits (quantum bits), which can exist in a superposition of 0 and 1 simultaneously. The result is a computation system that can manipulate and assess many combinations of information concurrently.
In principle, a sufficiently large quantum computer could explore on the order of 10 to the 154th power potential answers to a problem in microseconds.
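That figure corresponds to the state space of roughly 512 qubits, since a register of n qubits spans 2^n basis states. A back-of-the-envelope check, not a statement about any specific machine:

```python
import math

def state_space(n_qubits):
    """Number of basis states an n-qubit register spans: 2**n."""
    return 2 ** n_qubits

# 512 qubits already put the state space at roughly the 10^154
# figure cited above; each added qubit doubles it again.
exponent = math.log10(state_space(512))
print(f"2^512 is about 10^{exponent:.0f}")  # about 10^154
```

This exponential growth is also why quantum hardware is hard to build: every additional qubit must maintain coherence with all the others.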
But the technology still has challenges to overcome. McKinsey analysts noted in a recent white paper that manufacturers are still trying to scale the number of qubits in a quantum computer while achieving a sufficient level of qubit quality.
"The most important milestone will be the achievement of fully error-corrected, fault-tolerant quantum computing, without which a quantum computer cannot provide exact, mathematically accurate results," the authors said. "Five manufacturers have announced plans to have fault-tolerant quantum-computing hardware by 2030. If this timeline holds, the industry will likely establish a clear quantum advantage for many use cases by then."
In the same white paper, McKinsey analysts said the most promising use cases for quantum computing in finance are in portfolio and risk management. "For example, efficiently quantum-optimized loan portfolios that focus on collateral could allow lenders to improve their offerings, possibly lowering interest rates and freeing up capital," the authors stated.
"In finance, you have a lot of use cases with exponential complexity," Pistoia said. "As the level of complexity explodes and the data set becomes big enough, classical computing cannot solve that problem anymore."
Another reason the financial industry needs quantum computing is for speed, he said.
"In finance, we need answers right away, because the market is changing so quickly," Pistoia said. "The market is volatile and a computation that takes three days is totally useless. So we need answers right away and we need accurate answers."
The quantum computing research and engineering team at JPMorgan Chase is exploring the use of quantum computing for risk analysis, option pricing, portfolio optimization, fraud detection and merger analysis.
The bank is still in the research phase.
"I think quantum computing is very important," Pistoia said. "It's not yet completely at the stage at which it can be used in production. Quantum computers are not yet powerful enough. When we are in a scientific stage with a certain technology, that's the best moment to actually collaborate with other companies and publish our results and form partnerships so that we can learn from other groups and other groups can learn from us."
Vendors at the conference, even traditional computer and chip companies like Dell and Intel, seemed to feel that a shift in high-performance computing toward quantum computing was inevitable and that they were compelled to invest in quantum technology.
"You don't have a choice," said William Hurley, founder and CEO of quantum computing startup Strangeworks. "It's coming whether you want it to or not."
IBM continues to spend millions buying hybrid cloud companies, unveiling its acquisition of engineering consulting specialist Dialexa, its sixth purchase of 2022, to boost its cloud charge.
Since IBM CEO Arvind Krishna took the reins in April 2020, IBM has acquired more than 25 companies, including many hybrid cloud businesses.
In February alone, IBM acquired cloud consultant services standout Sentaca, as well as Microsoft Azure consultancy all-star Neudesic—with the two purchases squarely aimed at boosting IBM’s hybrid and multi-cloud services capabilities.
[Related: UK To Probe Amazon, Google, Microsoft’s Cloud Dominance]
Looking at the Armonk, N.Y.-based company’s purchase of Dialexa, IBM will gain 300 skilled product managers, designers, full-stack engineers and data scientists. Dialexa will become part of IBM’s Consulting business unit, which spearheads the company’s digital product engineering services in the Americas.
“Dialexa’s product engineering expertise, combined with IBM’s hybrid cloud and business transformation offerings, will help our clients turn concepts into differentiated product portfolios that accelerate growth,” said John Granger, senior vice president of IBM Consulting, in a statement.
Dialexa marks IBM’s sixth purchase in 2022 with the goal of boosting its hybrid cloud and artificial intelligence abilities.
Along with buying Dialexa, Sentaca and Neudesic, IBM has also acquired Randori, an attack surface management cybersecurity specialist that helps protect hybrid cloud environments.
Earlier this year, IBM’s CEO said hybrid cloud and artificial intelligence are top of mind for his company in terms of investment and the future.
“We are integrating technology and expertise—from IBM, our partners and even our competitors—to meet the urgent needs of our clients, who see hybrid cloud and AI as crucial sources of competitive advantage,” Krishna said in March. “And we are ready to be the catalyst of progress for our clients as they pursue the digital transformation of the world’s mission-critical businesses.”
In 2021, IBM’s hybrid cloud revenue jumped 19 percent compared with 2020, comprising 35 percent of its total revenue.
Based in Dallas and Chicago, Dialexa delivers a suite of digital product engineering services to help customers create transformative products to drive business outcomes.
Dialexa's 300-strong team of engineers and skilled IT experts advises on and creates custom digital products for customers, which include Deere & Company, Pizza Hut U.S. and Toyota Motor North America. Financial terms of the Dialexa deal were not disclosed.
IBM said Dialexa provides deep experience delivering end-to-end digital product engineering services consisting of strategy, design, build, launch and optimization services across cloud platforms including Amazon Web Services and Microsoft Azure.
“Digital product engineering represents the tip of the spear for competitive advantage,” said Dialexa CEO Scott Harper in a statement. “IBM and Dialexa’s shared vision for delivering industry-defining digital products could be a game-changer.”
IBM (NYSE:IBM) acquired Dialexa, a Dallas TX and Chicago, IL-based digital product engineering services firm.
The amount of the deal was not disclosed. The transaction is expected to close in the fourth quarter of this year and is subject to customary closing conditions and regulatory clearances.
The acquisition is expected to enhance IBM’s product engineering expertise and provide end-to-end digital transformation services for clients. Upon close, Dialexa will join IBM Consulting, strengthening IBM’s digital product engineering services presence in the Americas.
Founded in 2010 and led by CEO Scott Harper, Dialexa delivers a suite of digital product engineering services, enabling organizations to create new products to drive business outcomes. The company has deep experience delivering end-to-end digital product engineering services consisting of strategy, design, build, launch, and optimization services across cloud platforms including AWS and Microsoft Azure. Its team of 300 product managers, designers, full-stack engineers and data scientists, based in Dallas and Chicago, advise and create custom, commercial-grade digital products for clients such as Deere & Company, Pizza Hut US, and Toyota Motor North America.