Designing, testing and provisioning updates to data networks still depend on numerous manual, error-prone processes. Digital twins are starting to play a crucial role in automating more of this work and bringing digital transformation to network infrastructure. These efforts are already driving automation for campus networks, wide area networks (WANs) and commercial wireless networks.
The digital transformation of the network infrastructure will take place over an extended period of time. In this two-part series, we’ll be exploring how digital twins are driving network transformation. Today, we’ll look at the current state of networking and how digital twins are helping to automate the process, as well as the shortcomings that are currently being seen with the technology.
In part 2, we’ll look at the future state of digital twins and how the technology can be used when fully developed and implemented.
At its heart, a digital twin is a model of any entity kept current by constant telemetry updates. In practice, multiple overlapping digital twins are often used across various aspects of the design, construction and operation of networks, their components, and the business services that run on them.
Peyman Kazemian, cofounder of Forward Networks, argues that the original Traceroute program, written by Van Jacobson in 1987, is the oldest and most widely used tool for understanding a network. Although it neither models nor simulates the network, it helps engineers understand network behavior by sending a representative packet through the network and observing the path it takes.
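The path-tracing idea can be shown with a minimal sketch, assuming an invented three-router topology and static forwarding tables (none of this reflects any vendor's actual implementation): model each device's forwarding table as data and follow a representative packet hop by hop.

```python
# Toy model of tracing a packet's path through a network of routers.
# Each router maps a destination prefix to its next hop; "local" means
# the destination is directly attached. All names are invented.

FORWARDING_TABLES = {
    "r1": {"10.0.2.0/24": "r2", "10.0.3.0/24": "r2"},
    "r2": {"10.0.2.0/24": "local", "10.0.3.0/24": "r3"},
    "r3": {"10.0.3.0/24": "local"},
}

def trace_path(src: str, dst_prefix: str, max_hops: int = 16) -> list:
    """Follow a representative packet from src toward dst_prefix."""
    path, current = [src], src
    for _ in range(max_hops):
        next_hop = FORWARDING_TABLES.get(current, {}).get(dst_prefix)
        if next_hop is None:
            raise ValueError(f"{current} has no route to {dst_prefix}")
        if next_hop == "local":
            return path          # packet delivered at this device
        path.append(next_hop)
        current = next_hop
    raise RuntimeError("hop limit exceeded; possible forwarding loop")

print(trace_path("r1", "10.0.3.0/24"))  # ['r1', 'r2', 'r3']
```

The same "send a synthetic packet through a model" approach, scaled to real forwarding state, is what lets a network model answer reachability questions without touching production traffic.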
Later, other network simulation tools were developed, such as OPNET (1986), NetSim (2005) and GNS3 (2008), which can simulate a network by running the same code as genuine network devices.
“These kinds of solutions are useful in operating networks because they provide you a lab environment to try out new ideas and changes to your network,” Kazemian said.
Teresa Tung, cloud first chief technologist at Accenture, said that the open systems interconnection (OSI) conceptual model provides the foundation for describing networking capabilities along with separation of concerns.
This approach can help focus simulation and modeling on different layers. For example, a use case may center on RF models at the physical layer, packet- and event-level behavior at the network layer, or quality of service (QoS) and mean opinion score (MOS) measures at the presentation and application layers.
Today, network digital twins typically only help model and automate pockets of a network isolated by function, vendors or types of users.
The most common use case for digital twins is testing and optimizing network equipment configurations. However, differences in how equipment vendors implement networking standards can lead to subtle variances in routing behavior, said Ernest Lefner, chief product officer at Gluware.
Lefner said the challenge for everyone attempting to build a digital twin is that they must have detailed knowledge of every vendor, feature, and configuration and customization in their network. This can vary by device, hardware type, or software release version.
Some network equipment providers, like Extreme Networks, let network engineers build a network that automatically synchronizes the configuration and state of that provider’s specific equipment.
Today, Extreme’s product only supports streamlining the staging, validation and deployment of Extreme switches and access points. The digital twin feature doesn’t currently support SD-WAN customer-premises equipment or routers. In the future, Extreme plans to add support for testing configurations, OS upgrades and troubleshooting problems.
Other network vendor offerings like Cisco DNA, Juniper Networks Mist and HPE Aruba Netconductor make it easier to capture network configurations and evaluate the impact of changes, but only for their own equipment.
“They are allowing you to stand up or test your configuration, but without specifically replicating the entire environment,” said Mike Toussaint, senior director analyst at Gartner.
You can test a specific configuration, and artificial intelligence (AI) and machine learning (ML) will allow you to understand if a configuration is optimal, suboptimal or broken. But they have not automated the creation and calibration of a digital twin environment to the same degree as Extreme.
Until digital twins are widely adopted, most network engineers use virtual labs like GNS3 to model physical equipment and assess the functionality of configuration settings. This tool is widely used to train network engineers and to model network configurations.
Many larger enterprises physically test new equipment at the World Wide Technology Advanced Test Center. The firm has a partnership with most major equipment vendors to provide virtual access for assessing the performance of genuine physical hardware at their facility in St. Louis, Missouri.
Network equipment vendors are adding digital twin-like capabilities to their equipment. Juniper Networks’ Mist acquisition automatically captures and models different properties of the network that inform AI and ML optimizations. Similarly, Cisco’s network controller serves as an intermediary between business and network infrastructure.
Balaji Venkatraman, VP of product management, DNA, Cisco, said what distinguishes a digital twin from early modeling and simulation tools is that it provides a digital replica of the network and is updated by live telemetry data from the network.
“With the introduction of network controllers, we have a centralized view of at least the telemetry data to make digital twins a reality,” Venkatraman said.
However, network engineering teams will need to evolve their practices and cultures to take advantage of digital twins as part of their workflows. Gartner’s Toussaint told VentureBeat that most network engineering teams still create static network architecture diagrams in Visio.
And when it comes to rolling out new equipment, they either test it in a live environment with physical equipment or “do the cowboy thing and test it in production and hope it does not fail,” he said.
Even though network digital twins are starting to virtualize some of this testing workload, Toussaint said physically testing the performance of cutting-edge networking hardware that includes specialized ASIC, FPGA and TPU chips will remain critical for some time.
Eventually, Toussaint expects networking teams to adopt the same devops practices that helped accelerate software development, testing and deployment processes. Digital twins will let teams create and manage development and test network sandboxes as code that mimics the behavior of the live deployment environment.
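The "sandbox as code" idea can be sketched roughly as follows: the desired topology lives in version control as plain data, and a validation step runs like a CI test before any change reaches the live network. Device names and the validation rules below are invented for the example.

```python
# Sketch of a devops-style pre-deployment check for a network sandbox.
# The desired topology is declared as data; a test validates it before
# anything is pushed to real devices. All names are illustrative.

DESIRED_TOPOLOGY = {
    "devices": {"core-sw1", "core-sw2", "access-sw1"},
    "links": [
        ("core-sw1", "core-sw2"),
        ("core-sw1", "access-sw1"),
        ("core-sw2", "access-sw1"),
    ],
}

def validate_topology(topo: dict) -> list:
    """Return a list of problems; an empty list means the spec is sane."""
    problems = []
    devices = topo["devices"]
    for a, b in topo["links"]:
        if a not in devices or b not in devices:
            problems.append(f"link {a}<->{b} references an unknown device")
        if a == b:
            problems.append(f"self-link on {a}")
    # Every declared device should appear in at least one link.
    linked = {d for link in topo["links"] for d in link}
    for orphan in sorted(devices - linked):
        problems.append(f"{orphan} has no links")
    return problems

assert validate_topology(DESIRED_TOPOLOGY) == []
```

Treating the topology as reviewable, testable data is the same shift that version-controlled infrastructure-as-code brought to server provisioning.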
But the cultural shift won’t be easy for most organizations.
“Network teams tend to want to go in and make changes, and they have never really adopted the devops methodologies,” Toussaint said.
They tend to keep track of configuration settings on text files or maps drawn in Visio, which only provide a static representation of the live network.
“There have not really been the tools to do this in real time,” he said.
Getting a network map has been a very time-intensive manual process that network engineers hate, so they want to avoid doing it more than once. As a result, these maps seldom get updated.
Toussaint sees digital twins as an intermediate step as the industry uses more AI and ML to automate more aspects of network provisioning and management. Business managers are likely to be more enthused by more flexible and adaptable networks that keep pace with new business ideas than a dynamically updated map.
But in the interim, network digital twins will help teams visualize and build trust in their recommendations as these technologies improve.
“In another five or 10 years, when networks become fully automated, then digital twins become another tool, but not necessarily something that is a must-have,” Toussaint said.
Toussaint said these early network digital twins are suitable for vetting configurations but limited in their ability to grapple with more complex issues. He likened them to using Google Maps as a digital twin of your commute: good at predicting different routes under current traffic conditions, but unable to tell you the effect of the trip on your tires or the impact of wind on your car’s aerodynamics.
This is the first of a two-part series. In part 2, we’ll outline the future of digital twins and how organizations are finding solutions to the issues outlined here.
Security at the speed of cyber: What is CISA’s Binding Operational Directive (BOD) 22-01?
The Biden Administration is continuing efforts to adopt new cybersecurity protocols in the face of ongoing attacks that threaten to disrupt critical public services, infringe on citizen data privacy and compromise national security.
On November 3, 2021, the Cybersecurity and Infrastructure Security Agency (CISA) issued a directive for federal agencies and contractors who manage hardware or software on an agency’s behalf to fix nearly 300 known cyber vulnerabilities that malicious actors can use to infiltrate and damage federal information systems. These known exploited vulnerabilities fall into two categories, each with a deadline for remediation:
• 90 vulnerabilities that were discovered in 2021 must be remediated by November 17, 2021
• About 200 security vulnerabilities that were first identified between 2017 and 2020 must be remediated by May 3, 2022
As part of the directive, CISA also created a catalog of known exploited vulnerabilities that carry “significant risk” and outlined requirements for agencies to fix them. The catalog includes software and configurations supplied by software providers like SolarWinds and Kaseya, and large tech companies like Apple, Cisco, Google, Microsoft, Oracle and SAP.
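To make the deadlines concrete, here is a minimal sketch of triaging such a catalog by remediation due date. The records are invented, and the field names are only loosely modeled on CISA's published Known Exploited Vulnerabilities feed.

```python
from datetime import date

# Invented catalog entries; field names loosely follow CISA's KEV feed.
CATALOG = [
    {"cveID": "CVE-2021-0001", "dueDate": "2021-11-17"},
    {"cveID": "CVE-2019-0002", "dueDate": "2022-05-03"},
    {"cveID": "CVE-2020-0003", "dueDate": "2022-05-03"},
]

def due_by(catalog: list, deadline: date) -> list:
    """Return CVE IDs whose remediation deadline falls on or before `deadline`."""
    return sorted(
        entry["cveID"]
        for entry in catalog
        if date.fromisoformat(entry["dueDate"]) <= deadline
    )

# Which items fall under the near-term November 17, 2021 deadline?
print(due_by(CATALOG, date(2021, 11, 17)))   # ['CVE-2021-0001']
```

A real implementation would pull the live catalog rather than a hardcoded list, but the triage step, sorting findings into deadline buckets, is the same.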
Improving the nation’s cybersecurity defenses continues to be a top priority as the country has experienced an unprecedented year of cyberattacks. Malicious actors are continuing to target remote systems and prey on known vulnerabilities as the pandemic continues, leading to public service disruptions in telecommunications and utilities.
This directive comes just shy of six months after President Biden issued his Executive Order on Improving the Nation’s Cybersecurity, which aims to modernize cybersecurity defenses by protecting federal networks, strengthening information-sharing on cyber issues, and improving the United States’ ability to quickly respond to incidents when they occur.
While the Biden Administration and many federal agency heads agree that these actions are necessary to improve cybersecurity protocols, they can be extraordinarily difficult to implement without the right tools.
In the next section, we will explore how federal agencies and their security teams can gain visibility across distributed environments to remediate vulnerabilities outlined in the directive.
Gaining visibility into federated IT environments
While most federal agencies are headquartered in Washington, D.C., field offices and agency staff are spread across the country, using many different endpoints (laptops, desktops, and servers) to access federal networks. This distributed IT environment can make it difficult for CISOs and their security teams to gain visibility into their agency’s environment in real time.
To comply with CISA’s BOD 22-01, security teams first need to gain visibility across federated IT environments and be able to answer a few basic questions, including:
• How many endpoints are on the network?
• Are these endpoints managed or unmanaged?
• Do any known exploited vulnerabilities cataloged in the directive exist in our environment? If so, do we currently have the tools to patch them quickly and at scale?
• Do we have the capability to confirm whether deployed patches were applied correctly?
While these questions may seem straightforward, they often take agencies weeks or months to answer. Highly federated IT environments and the nature of IT management, which often includes tool sprawl and conflicting data sets, are at odds with the aggressive timelines outlined in the directive.
With Tanium, CISOs and their security teams can discover previously unseen or unmanaged endpoints connected to federal networks, and then search for all applicable Common Vulnerabilities and Exposures (CVEs) listed in the directive in minutes. With Tanium, it only takes a single agent on the endpoint to obtain compliance information, push patches and update software. Tanium provides a “single pane of glass” view to help align teams and prevent them from spending time gathering outdated endpoint data from various sources.
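The core of that kind of search can be sketched as a set intersection between the directive's CVE list and per-endpoint findings. The endpoint inventory below is invented (though the CVE IDs are real, well-known examples); this illustrates the general idea, not Tanium's actual implementation.

```python
# Invented endpoint findings checked against a directive's CVE list.
# CVE IDs are real, well-known vulnerabilities; the hosts are made up.

DIRECTIVE_CVES = {"CVE-2021-44228", "CVE-2021-26855", "CVE-2020-1472"}

ENDPOINT_FINDINGS = {
    "hq-file-server": {"CVE-2020-1472", "CVE-2018-9999"},
    "field-laptop-042": {"CVE-2021-44228"},
    "kiosk-07": set(),
}

def endpoints_needing_patches(findings: dict, directive: set) -> dict:
    """Map each endpoint to the CVEs it must remediate under the directive."""
    return {
        host: sorted(cves & directive)
        for host, cves in findings.items()
        if cves & directive
    }

report = endpoints_needing_patches(ENDPOINT_FINDINGS, DIRECTIVE_CVES)
print(report)
```

The hard part in practice is not the intersection but keeping the left-hand side, the per-endpoint findings, accurate and current across a distributed fleet, which is exactly the visibility gap described above.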
As CISA has committed to maintaining the catalog and alerting agencies of updates for awareness and action, having a unified endpoint management platform that provides visibility across an organization gives CISOs and their teams the tools they need to scan and patch future vulnerabilities at scale.
In the next section, we will explore how federal agencies and their security teams can prioritize actions and deploy patches to meet deadlines outlined in the directive.
Prioritizing actions and patching known vulnerabilities quickly
Once agency heads and their security teams have a clear picture of the state of their endpoints, the next step is to pinpoint known vulnerabilities and fix them fast based on associated deadlines in the directive.
With Tanium, federal agencies can search for the specific vulnerabilities listed in the directive and then patch those vulnerabilities in minutes, with confidence that patches were applied correctly. Because Tanium uses a single lightweight agent, it doesn’t weigh down the network. Remediation typically takes less than a day if an agency is already using Tanium. Existing customers should reference this step-by-step technical guidance on how to address the vulnerabilities laid out in the directive.
In addition to fixing known vulnerabilities, the directive also outlines other actions federal agencies must take, including:
Reviewing and updating internal vulnerability management procedures within 60 days. At a minimum, agency policies must:
• Pave the way for automation around a single source of truth with high-fidelity data and remediate vulnerabilities that CISA identifies within a set timeline
• Assign roles and responsibilities for executing agency actions to align teams around a single source of truth
• Define necessary actions required to enable prompt responses
• Establish internal validation and enforcement procedures to ensure adherence to the directive
• Set internal tracking and reporting requirements to evaluate adherence to the directive and provide reporting to CISA, as needed

Reporting on the status of vulnerabilities listed in the catalog:
• Agencies are expected to automate data exchanges and report their respective directive implementation status through the CDM Federal Dashboard
As new threats and vulnerabilities are discovered, CISA will update the catalog of known vulnerabilities and alert agencies of updates for awareness and action.
Many federal agencies already use Tanium to provide visibility and maintain compliance across their distributed IT environment. Federal agencies can count on Tanium to be a valuable tool in discovering, patching and remediating future known critical vulnerabilities.
Tanium in action: scanning distributed networks and remediating at scale
While CISA has previously imposed cybersecurity mandates on federal agencies to immediately fix a critical software problem, this new directive is notable for its sheer scope and respective deadlines. Leveraging Tanium, federal agencies and contractors who manage hardware or software on an agency’s behalf can patch known critical vulnerabilities and comply with the deadlines in a fraction of the time.
The Tanium platform unifies security and IT operations teams using a “single pane of glass” approach of critical endpoint data, so that federal agencies can make informed decisions and act with lightning speed to minimize disruptions to mission-critical operations.
With Tanium, you can get rapid answers, real-time visibility and quickly take action when addressing current vulnerabilities in BOD 22-01. As CISA adds more vulnerabilities to the catalog, you can have confidence that Tanium is constantly checking for compliance and patching your endpoints quickly across your environment.
To learn more about how Tanium can help your agency remediate known vulnerabilities outlined in the CISA directive, visit Tanium.com/cisa
Todd Thibodeaux uses ChannelCon 2022 state of the industry remarks to unveil CompTIA's Project Agora; invites broad industry participation in the effort to fight for tech talent
CHICAGO, Aug. 3, 2022 /PRNewswire/ -- CompTIA, the nonprofit association for the information technology (IT) industry and workforce, is undertaking an expansive effort to create the most resource-rich source of information and support for anyone interested in starting, staying in and succeeding in a career in technology.
CompTIA President and CEO Todd Thibodeaux revealed the association's Project Agora during his state of the industry remarks at ChannelCon 2022.
"The goal of Project Agora is to create the most respected place to start, build and supercharge your tech career," he said. "With amazing resources and broad support from our members, partners and industry Project Agora will help people find success in the tech workforce."
The labor market is in a period of unprecedented transition, characterized in large part by the volume of frictional unemployment as individuals search for new jobs or transition from one job to another. One in four US workers was actively seeking a new job or pursuing other career options during Q2 2022, CompTIA research reveals.1 While tech is among the top five industries job seekers were considering, it ranked behind several other sectors, including sales, real estate, healthcare, hospitality and finance. A lack of confidence in technical skills, concerns about the cost and time it will take to learn those skills, and perceptions about the tech industry's culture all contribute to reluctance to consider tech as a career option.
"Our challenge is to convert more career intent people to tech intent," Thibodeaux said. "We need to tell better stories, more consistently, about how truly great it is to work in tech. The way we get the talent we need is by fighting for it."
Project Agora will help in that effort, first by enabling individuals to explore tech jobs and careers in great depth. CompTIA has identified 30 different job roles covering 90% of tech employment. The next step is creating resources to engage users and convert them from career intent to tech intent. Thibodeaux issued a call to action for the industry to get involved in this effort to build the best, most comprehensive collection of tech career resources available anywhere.
"Confidence gaps, career transition gaps and reskilling gaps are not insurmountable barriers but rather opportunities to chart a new course for individuals and the companies that employ them," Thibodeaux concluded. "Project Agora is all about unlocking potential, for the industry, and for millions of people we want and need working in it."
Organizations interested in getting involved in Project Agora can contact CompTIA at firstname.lastname@example.org
The Computing Technology Industry Association (CompTIA) is a leading voice and advocate for the $5 trillion global information technology ecosystem, and the estimated 75 million industry and tech professionals who design, implement, manage, and safeguard the technology that powers the world's economy. Through education, training, certifications, advocacy, philanthropy, and market research, CompTIA is the hub for unlocking the potential of the tech industry and its workforce. https://www.comptia.org/
1 This encompasses those currently employed plus those actively looking (classified as part of the labor market by the US Bureau of Labor Statistics). Job Seeker Trends, July 2022.
With the Keystone Pipeline dominating the news, and America's addiction to oil showing no signs of waning, it is more urgent than ever that we reconsider our energy needs in light of economic reality. In this New York Times bestseller, Jeremy Rifkin explores how Internet technology and renewable energy are merging to create a powerful new engine of economic growth, wherein hundreds of millions of people will produce their own green energy in their homes, offices, and factories and share it with each other in an 'energy Internet.' This process will usher in a fundamental reordering of human relationships, from hierarchical to lateral, that will impact the way we conduct commerce, govern society, educate our children, and engage in civic life. The Third Industrial Revolution is an insider's account of the next great economic era, including a look into the personalities and players - heads of state, global CEOs, social entrepreneurs, and NGOs - who are pioneering its implementation around the world.
"Jeremy Rifkin was always ahead of his time. The New Industrial Revolution confirms that the times have caught up with him. It is no longer possible to ignore his vision for the future of humankind."--Calestous Juma, Harvard Belfer Center for Science and International Affairs, John F. Kennedy School
"Jeremy Rifkin argues that green energy and the Internet will revolutionize society and the environment . . . With the European Union already on board, this is a big idea with backbone."--"Nature"
"Sobering reading...lays out a comprehensive plan to realize the third industrial revolution...a big, brash, bold book."--"Barnes and Noble Reviews"
"Impeccably argued...a compelling and cogent argument to overhaul our society and economy in favor of a distributed and collaborative model."--"Publishers Weekly"
"Rifkin connects the two defining technologies of the 21st Century -- the Internet and renewable energies -- giving us a powerful new economic vision for the future. As we look to regrow the economy, generate millions of jobs, and create a sustainable future for our children, the Third Industrial Revolution offers an indispensable roadmap."--Arianna Huffington--President and Editor-in-Chief of The Huffington Post Media Group
"Mr. Rifkin clearly outlines the challenges facing our global community, and creates a vision for business leaders, government and citizens."--John Chambers--Chairman and CEO of Cisco
"The creative thinking of Jeremy Rifkin has been inspiring policy makers and citizens alike. This book shows the key role renewables and modern technologies can play in our transition to a low-carbon economy."--Jose Manuel Barroso--President of the European Commission
"This is a remarkable piece of work from one of the foremost thinkers of our time...Rifkin has come up with a visionary and innovative economic development model that ensures the sustainability of our natural resources and ecosystems."--Rajendra Pachauri--Chairman of the UN Intergovernmental Panel on Climate Change

Book Description:
A revealing look at how a new global 'Energy Internet' will soon replace the old electrical grids, end fossil fuel wars, avert climate change and transform markets and governments.