Why your business' data strategy probably needs an overhaul

By Dinesh Nirmal, General Manager, Data, AI and Automation, IBM

The age of data-driven work is here. Virtually all employees today say they are expected to work with data on the job to some extent, from retail employees needing to keep tabs on sales and supply chains to professional athletes using data to fine tune their game.

Despite these expectations, most people have a hard time getting the data they need. Research shows that up to 82% of enterprises are inhibited by data silos, meaning data that's left in inaccessible repositories and databases. Simply put, the prevalence of data silos can make working with data a real pain, which perhaps helps explain why employees spend an average of one hour per week procrastinating on data-related tasks alone.

As more people need access to data – and the volume of data within an organization continues to dramatically grow – the task of governing that data, ensuring it is secure and compliant with privacy standards, is increasingly difficult.

Unlocking Innovation

It doesn't have to be this way. A clear data strategy defines how to make sense of vast amounts of data, align data initiatives to business strategy, and build solutions that span the entire organization. It helps organizations realize their data's potential and gather insights and identify efficiencies while complying with increasingly complex regulations.

But a critical part of implementing a data strategy vision is getting the right technology in place. That's exactly why companies around the world are investing heavily in a solution that can weave their disparate data sources together, also known as a data fabric. A data fabric is a type of data architecture that automates data discovery, governance, and consumption, allowing enterprises to elevate the value of their data by providing access to the right data, at the right time, regardless of where it resides. 
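In practice, the pattern a data fabric implements can be sketched in a few lines of Python: a single catalog spanning many repositories, with governance enforced at the point of access. The class and policy names below are illustrative assumptions, not any vendor's API.

```python
# Illustrative sketch of data-fabric concepts: one catalog over many sources,
# with governed access. All names here are hypothetical, not a vendor API.

class DataFabric:
    def __init__(self):
        self.sources = {}   # source name -> {dataset name -> rows}
        self.policies = {}  # dataset name -> roles allowed to read it

    def register_source(self, source, datasets):
        """Discovery: make a repository's datasets visible in the catalog."""
        self.sources[source] = datasets

    def set_policy(self, dataset, allowed_roles):
        """Governance: record who may read a dataset."""
        self.policies[dataset] = set(allowed_roles)

    def read(self, dataset, role):
        """Consumption: locate a dataset in any source and enforce policy."""
        if role not in self.policies.get(dataset, set()):
            raise PermissionError(f"{role} may not read {dataset}")
        for datasets in self.sources.values():
            if dataset in datasets:
                return datasets[dataset]
        raise KeyError(dataset)

fabric = DataFabric()
fabric.register_source("warehouse", {"sales": [{"region": "EU", "units": 120}]})
fabric.register_source("crm", {"contacts": [{"name": "A. Okoye"}]})
fabric.set_policy("sales", ["analyst"])
rows = fabric.read("sales", role="analyst")  # allowed, wherever the data lives
```

Adding a new repository only requires registering it; consumers keep calling read() the same way, which is the "right data, at the right time, regardless of where it resides" idea in miniature.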

Research shows organizations that have adopted a data fabric architecture are significantly more innovative. Companies that have actively deployed and are seeing value from AI, for example, are 283% more likely to have a data fabric architecture in place than those that have not.

Making Work Easier

A data fabric unlocks innovation because it makes employees' jobs easier. With a data fabric, employees don't need to spend time hunting down, validating, and verifying the data they need to do their jobs. A data fabric architecture is also critical for feeding the models, AI, and automation needed to eliminate repetitive, low-value tasks and divert resources to more interesting, value-adding work.

A data fabric also enables a comprehensive approach to privacy and data access. Consumers need assurances that their data is being protected and that companies are not running the risk of breaches, abuses or other misuse that could damage their reputation.

The data-driven enterprise is more than just a buzzword – access to quality data can be transformative and differentiating. But ensuring access to quality data remains challenging for numerous reasons, and more employees need access to more data than ever before. A data strategy that takes advantage of data fabric architecture is the best way to democratize access to data so that data consumers —whether they are in human resources, customer service, manufacturing, or marketing — can get the data they need, while empowering IT teams to work smarter, not harder. 

Optimize your data strategy and democratize data access with a data fabric. Check out IBM's new guide for data leaders here.

This post was created by IBM with Insider Studios.

CIOReview Names Cobalt Iron Among 10 Most Promising IBM Solution Providers 2022

LAWRENCE, Kan.--(BUSINESS WIRE)--Jul 28, 2022--

Cobalt Iron Inc., a leading provider of SaaS-based enterprise data protection, today announced that the company has been deemed one of the 10 Most Promising IBM Solution Providers 2022 by CIOReview Magazine. The annual list of companies is selected by a panel of experts and members of CIOReview Magazine’s editorial board to recognize and promote innovation and entrepreneurship. A technology partner for IBM, Cobalt Iron earned the distinction based on its Compass® enterprise SaaS backup platform for monitoring, managing, provisioning, and securing the entire enterprise backup landscape.

Cobalt Iron Compass® is a SaaS-based data protection platform leveraging strong IBM technologies for delivering a secure, modernized approach to data protection. (Graphic: Business Wire)

According to CIOReview, “Cobalt Iron has built a patented cyber-resilience technology in a SaaS model to alleviate the complexities of managing large, multivendor setups, providing an effectual humanless backup experience. This SaaS-based data protection platform, called Compass, leverages strong IBM technologies. For example, IBM Spectrum Protect is embedded into the platform from a data backup and recovery perspective. ... By combining IBM’s technologies and the intellectual property built by Cobalt Iron, the company delivers a secure, modernized approach to data protection, providing a ‘true’ software as a service.”

Through proprietary technology, the Compass data protection platform integrates with, automates, and optimizes best-of-breed technologies, including IBM Spectrum Protect, IBM FlashSystem, IBM Red Hat Linux, IBM Cloud, and IBM Cloud Object Storage. Compass enhances and extends IBM technologies by automating more than 80% of backup infrastructure operations, optimizing the backup landscape through analytics, and securing backup data, making it a valuable addition to IBM’s data protection offerings.

CIOReview also praised Compass for its simple and intuitive interface to display a consolidated view of data backups across an entire organization without logging in to every backup product instance to extract data. The machine learning-enabled platform also automates backup processes and infrastructure, and it uses open APIs to connect with ticket management systems to generate tickets automatically about any backups that need immediate attention.
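The open-API ticketing integration described here follows a common pattern: a monitoring event arrives, and only states that need human attention become tickets. The sketch below is hypothetical; the event fields and the post_ticket hook are assumptions, not Compass's actual API.

```python
# Hypothetical sketch of backup-event-to-ticket automation.
# Event fields and ticket schema are illustrative, not Compass's API.

def make_ticket(event):
    """Map a backup-monitoring event to a ticket-system payload."""
    severity = "high" if event.get("consecutive_failures", 0) >= 2 else "medium"
    return {
        "title": f"Backup needs attention: {event['job']}",
        "severity": severity,
        "body": f"Job {event['job']} reported '{event['status']}' "
                f"at {event['timestamp']}.",
    }

def on_backup_event(event, post_ticket):
    """Open a ticket only for states that need immediate attention."""
    if event["status"] in {"failed", "missed", "corrupt"}:
        return post_ticket(make_ticket(event))
    return None

sent = []
ticket = on_backup_event(
    {"job": "nightly-erp", "status": "failed",
     "timestamp": "2022-07-28T02:00Z", "consecutive_failures": 3},
    post_ticket=lambda t: (sent.append(t) or t),
)
```

In a real integration, post_ticket would call the ticket system's REST API; healthy backups generate no tickets at all.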

To ensure the security of data backups, Cobalt Iron has developed an architecture and security feature set called Cyber Shield for 24/7 threat protection, detection, and analysis that improves ransomware responsiveness. Compass is also being enhanced to use several patented techniques that are specific to analytics and ransomware. For example, analytics-based cloud brokering of data protection operations helps enterprises make secure, efficient, and cost-effective use of their cloud infrastructures. Another patented technique — dynamic IT infrastructure optimization in response to cyberthreats — offers unique ransomware analytics and automated optimization that will enable Compass to reconfigure IT infrastructure automatically when it detects cyberthreats, such as a ransomware attack, and dynamically adjust access to backup infrastructure and data to reduce exposure.

Compass is part of IBM’s product portfolio through the IBM Passport Advantage program. Through Passport Advantage, IBM sellers, partners, and distributors around the world can sell Compass under IBM part numbers to any organizations, particularly complex enterprises, that greatly benefit from the automated data protection and anti-ransomware solutions Compass delivers.

CIOReview’s report concludes, “With such innovations, all eyes will be on Cobalt Iron for further advancements in humanless, secure data backup solutions. Cobalt Iron currently focuses on IP protection and continuous R&D to bring about additional cybersecurity-related innovations, promising a more secure future for an enterprise’s data.”

About Cobalt Iron

Cobalt Iron was founded in 2013 to bring about fundamental changes in the world’s approach to secure data protection, and today the company’s Compass® is the world’s leading SaaS-based enterprise data protection system. Through analytics and automation, Compass enables enterprises to transform and optimize legacy backup solutions into a simple cloud-based architecture with built-in cybersecurity. Processing more than 8 million jobs a month for customers in 44 countries, Compass delivers modern data protection for enterprise customers around the world.

Product or service names mentioned herein are the trademarks of their respective owners.

CONTACT: Agency Contact:

Sunny Branson

Wall Street Communications

Tel: +1 801 326 9946

Web: www.wallstcom.com

Cobalt Iron Contact:

Mary Spurlock

VP of Marketing

Tel: +1 785 979 9461



SOURCE: Cobalt Iron

Copyright Business Wire 2022.

Top 10 data lake solution vendors in 2022

As the world becomes increasingly data-driven, businesses must find suitable solutions to help them achieve their desired outcomes. Data lake storage has garnered the attention of many organizations that need to store large amounts of unstructured, raw information until it can be used in analytics applications.

The data lake solution market is expected to grow rapidly in the coming years and is driven by vendors that offer cost-effective, scalable solutions for their customers.

Learn more about data lake solutions, what key features they should have and some of the top vendors to consider this year. 

What is a data lake solution?

A data lake is defined as a single, centralized repository that can store massive amounts of unstructured and semi-structured information in its native, raw form. 

It’s common for an organization to store unstructured data in a data lake if it hasn’t decided how that information will be used. Some examples of unstructured data include images, documents, videos and audio. These data types are useful in today’s machine learning (ML) and advanced analytics applications.

Data lakes differ from data warehouses, which store structured, filtered information for specific purposes in files or folders. Data lakes were created in response to some of the limitations of data warehouses. For example, data warehouses are expensive and proprietary, cannot handle certain business use cases an organization must address, and may lead to unwanted information homogeneity.

On-premise data lake solutions were commonly used before the widespread adoption of the cloud. Now, cloud-based platforms are widely considered among the best hosts for data lakes because of their inherent scalability and highly modular services. 

A 2019 report from the Government Accountability Office (GAO) highlights several business benefits of using the cloud, including better customer service and the acquisition of cost-effective options for IT management services.

Cloud data lakes and on-premise data lakes have pros and cons. Businesses should consider cost, scale and available technical resources to decide which type is best.

5 must-have features of a data lake solution

It’s critical to understand what features a data lake offers. Most solutions come with the same core components, but each vendor may have specific offerings or unique selling points (USPs) that could influence a business’s decision.

Below are five key features every data lake should have:

1. Various interfaces, APIs and endpoints

Data lakes that offer diverse interfaces, APIs and endpoints make it much easier to upload, access and move information. These capabilities are important because they let unstructured data serve a wide range of use cases, depending on a business’s desired outcome.

2. Support for or connection to processing and analytics layers

ML engineers, data scientists, decision-makers and analysts benefit most from a centralized data lake solution that stores information for easy access and availability. This characteristic can help data professionals and IT managers work with data more seamlessly and efficiently, thus improving productivity and helping companies reach their goals.

3. Robust search and cataloging features

Imagine a data lake with large amounts of information but no sense of organization. A viable data lake solution must incorporate generic organizational methods and search capabilities, which provide the most value for its users. Other features might include key-value storage, tagging, metadata, or tools to classify and collect subsets of information.
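The tagging, metadata and search features described above can be sketched as a tiny in-memory catalog. All names are illustrative, not any particular product's API.

```python
# Minimal sketch of data-lake cataloging: objects carry tags and metadata,
# and the catalog answers tag/metadata queries. Names are illustrative.

class LakeCatalog:
    def __init__(self):
        self.objects = {}  # object key -> {"tags": set, "meta": dict}

    def put(self, key, tags=(), **meta):
        """Register an object with its tags and key-value metadata."""
        self.objects[key] = {"tags": set(tags), "meta": meta}

    def find(self, tag=None, **meta_filters):
        """Return keys matching a tag and/or exact metadata values."""
        hits = []
        for key, entry in self.objects.items():
            if tag is not None and tag not in entry["tags"]:
                continue
            if any(entry["meta"].get(k) != v for k, v in meta_filters.items()):
                continue
            hits.append(key)
        return sorted(hits)

catalog = LakeCatalog()
catalog.put("raw/video/cam1.mp4", tags=["video", "raw"], source="cam1")
catalog.put("raw/docs/q2.pdf", tags=["document", "raw"], source="finance")
catalog.put("curated/sales.parquet", tags=["table"], source="finance")
finance_raw = catalog.find(tag="raw", source="finance")  # ["raw/docs/q2.pdf"]
```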

4. Security and access control

Security and access control are two must-have features of any digital tool. The threat landscape keeps expanding, making it easier for threat actors to exploit a company’s data and cause irreparable damage. Only authorized users should have access to a data lake, and the solution must have strong security to protect sensitive information.

5. Flexibility and scalability

More organizations are growing larger and operating at a much faster rate. Data lake solutions must be flexible and scalable to meet the ever-changing needs of modern businesses working with information.

Top 10 data lake solution vendors in 2022

Some data lake solutions are best suited for businesses in certain industries. In contrast, others may work well for a company of a particular size or with a specific number of employees or customers. This can make choosing a potential data lake solution vendor challenging. 

Companies considering investing in a data lake solution this year should check out some of the vendors below.

1. Amazon Web Services (AWS)

The AWS Cloud provides many essential tools and services that allow companies to build a data lake that meets their needs. The AWS data lake solution is widely used, cost-effective and user-friendly. It leverages the security, durability, flexibility and scalability that Amazon S3 object storage offers to its users. 

The data lake also features Amazon DynamoDB to handle and manage metadata. The AWS data lake offers an intuitive, web-based console user interface (UI) for managing the data lake easily. The console also manages data lake policies, adds or removes data packages, creates manifests of datasets for analytics purposes, and searches data packages.
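A manifest of a dataset is essentially a structured description of a package's files that analytics tools can consume. The sketch below is illustrative; the field names are assumptions, not the AWS solution's real schema.

```python
# Illustrative dataset-package manifest builder. Field names are assumptions,
# not the actual schema used by the AWS data lake solution.

import json

def build_manifest(package_id, files, owner):
    """Describe a data package so analytics tools can locate its files."""
    return {
        "packageId": package_id,
        "owner": owner,
        "fileCount": len(files),
        "totalBytes": sum(f["bytes"] for f in files),
        "entries": [{"key": f["key"], "bytes": f["bytes"]} for f in files],
    }

manifest = build_manifest(
    "pkg-0042",
    files=[{"key": "s3://lake/raw/2022/07/a.csv", "bytes": 1024},
           {"key": "s3://lake/raw/2022/07/b.csv", "bytes": 2048}],
    owner="analytics-team",
)
as_json = json.dumps(manifest, indent=2)  # ready to store alongside the data
```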

2. Cloudera

Cloudera is another top data lake vendor that will create and maintain safe, secure storage for all data types. Some of Cloudera SDX’s Data Lake Service capabilities include:

  • Data schema/metadata information
  • Metadata management and governance
  • Compliance-ready access auditing
  • Data access authorization and authentication for improved security

Other benefits of Cloudera’s data lake include product support, downloads, community and documentation. GSK and Toyota leveraged Cloudera’s data lake to garner critical business intelligence (BI) insights and manage data analytics processes.

3. Databricks 

Databricks is another viable vendor, and it also offers a handful of data lake alternatives. The Databricks Lakehouse Platform combines the best elements of data lakes and warehouses to provide reliability, governance, security and performance.

Databricks’ platform helps break down silos that normally separate and complicate data, which frustrates data scientists, ML engineers and other IT professionals. Aside from the platform, Databricks also offers its Delta Lake solution, an open-format storage layer that can improve data lake management processes. 

4. Domo

Domo is a cloud-based software company that can provide big data solutions to all companies. Users have the freedom to choose a cloud architecture that works for their business. Domo is an open platform that can augment existing data lakes, whether it’s in the cloud or on-premise. Users can use combined cloud options, including:

  • Choosing Domo’s cloud
  • Connecting to any cloud data
  • Selecting a cloud data platform

Domo offers advanced security features, such as BYOK (bring your own key) encryption, data access controls and governance capabilities. Well-known corporations such as Nestle, DHL, Cisco and Comcast leverage the Domo Cloud to better manage their needs.

5. Google Cloud

Google is another big tech player offering customers data lake solutions. Companies can use Google Cloud’s data lake to analyze any data securely and cost-effectively. It can handle large volumes of information and IT professionals’ various processing tasks. Companies that don’t want to rebuild their on-premise data lakes in the cloud can easily lift and shift their information to Google Cloud. 

Some key features of Google’s data lakes include Apache Spark and Hadoop migration, which are fully managed services, integrated data science and analytics, and cost management tools. Major companies like Twitter, Vodafone, Pandora and Metro have benefited from Google Cloud’s data lakes.

6. HP Enterprise

Hewlett Packard Enterprise (HPE) is another data lake solution vendor that can help businesses harness the power of their big data. HPE’s solution, called GreenLake, offers organizations a truly scalable, cloud-based solution that simplifies their Hadoop experience. 

HPE GreenLake is an end-to-end solution that includes software, hardware and HPE Pointnext Services. These services can help businesses overcome IT challenges and spend more time on meaningful tasks. 

7. IBM

Business technology leader IBM also offers data lake solutions for companies. IBM is well-known for its cloud computing and data analytics solutions, making it a natural choice for organizations seeking a suitable data lake. IBM’s cloud-based approach operates on three key principles: embedded governance, automated integration and virtualization.

These are some data lake solutions from IBM: 

  • IBM Db2
  • IBM Db2 BigSQL
  • IBM Netezza
  • IBM Watson Query
  • IBM Watson Knowledge Catalog
  • IBM Cloud Pak for Data

With so many data lakes available, there’s surely one to fit a company’s unique needs. Financial services, healthcare and communications businesses often use IBM data lakes for various purposes.

8. Microsoft Azure

Microsoft offers its Azure Data Lake solution, which features easy storage methods, processing, and analytics using various languages and platforms. Azure Data Lake also works with a company’s existing IT investments and infrastructure to make IT management seamless.

The Azure Data Lake solution is affordable, comprehensive, secure and supported by Microsoft. Companies benefit from 24/7 support and expertise to help them overcome any big data challenges they may face. Microsoft is a leader in business analytics and tech solutions, making it a popular choice for many organizations.

9. Oracle

Companies can use Oracle’s Big Data Service to build data lakes to manage the influx of information needed to power their business decisions. The Big Data Service is automated and will provide users with an affordable and comprehensive Hadoop data lake platform based on Cloudera Enterprise. 

This solution can be used as a data lake or an ML platform. Because it is built on open-source Hadoop technologies, it ranks among the more open data lake offerings available, and it comes with Oracle-based tools that add even more value. Oracle's Big Data Service is scalable, flexible and secure, and it meets data storage requirements at a low cost.

10. Snowflake

Snowflake’s data lake solution is secure, reliable and accessible, and it helps businesses break down silos to improve their strategies. The top features of Snowflake’s data lake include a central platform for all information, fast querying and secure collaboration.

Siemens and Devon Energy are among the companies that have offered positive testimonials about Snowflake’s data lake solutions. Another benefit of Snowflake is its extensive partner ecosystem, including AWS, Microsoft Azure, Accenture, Deloitte and Google Cloud.

The importance of choosing the right data lake solution vendor 

Companies that spend extra time researching which vendors will offer the best enterprise data lake solutions for them can manage their information better. Rather than choose any vendor, it’s best to consider all options available and determine which solutions will meet the specific needs of an organization.

Every business uses information, some more than others. However, the world is becoming highly data-driven — therefore, leveraging the right data solutions will only grow more important in the coming years. This list will help companies decide which data lake solution vendor is right for their operations.

By Shannon Flynn

How a CIO’s approach to cloud, AI and ML is transforming Nigeria’s Dangote Industries

Prasanna Burri began his career as a mechanical engineer in India but had a secret passion for IT. His endeavour to get into the technology space led him to learn about enterprise application platforms and he eventually started his ERP career at IBM. Later he joined SAP Labs in the US where he was immersed in product management for cloud technology. Since 2013 he’s been overseeing the Dangote Group’s IT operations across Africa.  

In a big group like Dangote, how challenging is it to manage IT in all parts of the company?

It’s a complex organisation with a lot of diversified business lines and regions. We embrace proven technologies like the Microsoft platform for endpoint management, servers, Active Directory, email and endpoint protection. It was a modest start, but today we spend almost 10 times more than in 2014 on Azure Cloud subscriptions. Almost all our applications run in the cloud; I can say about 95% of our operations run in different cloud environments, whether SAP or Microsoft. We continue to expand, implementing new processes and going through continuous improvement cycles, and we’re a certified SAP Center of Excellence. We also have a very strong hybrid cloud infrastructure and a dedicated in-house talent base that’s open to embracing newer technologies like AI and ML.

Can you describe the importance of cloud technologies across Africa today?

There’s an increasing appetite, even though scaling infrastructure locally is time-consuming because of the logistics of bringing in equipment. A lot of good talent from Africa has been migrating to greener pastures, especially in the last few years. With cloud technologies, though, companies can scale despite talent shortages in the region, and supporting tech talent can be found anywhere in the world. We no longer suffer the delays of procuring equipment. It’s a big shot in the arm, especially in the environment we operate in, where we can scale fast and have more visibility and control over what’s happening thanks to the remote management options these platforms provide. It’s also necessary to have a hybrid setup in case of any large-scale disruption, even though they’re rare. Cloud is the way to go, but it depends on the industry and the region.

With cloud technologies come the opportunity to implement AI and ML. How is the company taking advantage of this?

We have at least three use cases we’ve been working on: logistics, which is fleet management for our trucks; master data management and data clean-up, where AI can do a better job; and invoice processing, where we run optical character recognition (OCR) automatically on vendor invoices using the Microsoft Power Platform in the cloud, along with ML services. We’re also trying to leverage the capabilities of AI and ML in security on the Azure platform, where Microsoft Endpoint Manager and Intune orchestrate security of servers, endpoints and mobile devices across the group from the cloud.
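The OCR engine itself is a managed cloud service, but the downstream step, turning OCR'd invoice text into structured fields, is easy to sketch. The patterns below are illustrative, not Dangote's actual rules.

```python
# Sketch of the step after OCR: extracting structured fields from the raw
# text an OCR service returns. The regex patterns are illustrative only.

import re

def extract_invoice_fields(ocr_text):
    """Pick out a few common fields from OCR'd invoice text."""
    fields = {}
    number = re.search(r"Invoice\s*(?:No\.?|#)\s*[:\s]*(\S+)", ocr_text, re.I)
    total = re.search(r"\bTotal\s*[:\s]*\$?\s*([\d,]+\.\d{2})", ocr_text, re.I)
    if number:
        fields["invoice_number"] = number.group(1)
    if total:
        fields["total"] = float(total.group(1).replace(",", ""))
    return fields

sample = """ACME SUPPLIES
Invoice No: INV-2208
Subtotal: $1,150.00
Total: $1,207.50"""
fields = extract_invoice_fields(sample)
```

Note that the \b in the total pattern keeps "Subtotal" from matching; OCR'd text is noisy, so production rules would be more defensive than this.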

What is the state of connectivity that strings all these technologies together?

We don’t see as many disruptions and downtime for business solely due to network connectivity. It’s a lot less nowadays than maybe five years ago. There’s also a continuous growth in bandwidth. A lot of times, there are issues in the last mile. The trunk routes are generally okay, but there’s still always room for growth and optimisation, and there’s reasonable capacity, especially in urban regions with the advent of some newer technologies like Starlink. I expect that in a year or two, we’ll start seeing a greater prevalence of connectivity in the remote parts as well. 

What other challenges do you face when implementing these technologies in the company?

Technology in itself is never a problem. The hardest part is acquiring talent. Sometimes you don’t find engineering talent or the talent you have has matured, which may leave you short because they’ve found better opportunities in the West or Middle East, for instance. Then there’s user adoption and change management in processes and new technologies. Those are the two challenges that revolve around implementing new technologies.

How then do you find talent and screen them for suitability?

Most of the time, we hire people based on their attitude and knowledge, and in some cases also for their experience. We use recruiting tools for job board postings, and online assessment tools to pick qualified people based on the job specification. Then in some cases, they might even receive some additional assignments to ensure that the aptitude is there.

How big is the ICT team at Dangote Group?

We are close to 150 personnel and a good part of our team belongs to endpoint security and last-mile tech support. We have one of the leanest shops from that aspect, but we’re looking to hire more local talent and be more resilient due to the changing pressure of acquiring talent from the marketplace. We also have a constant flurry of training from original equipment manufacturers (OEMs) and subscriptions for LinkedIn Learning for the majority of our information work staff.

What would be your parting shot to other Africa-based enterprises looking to adopt cloud, AI and machine learning?

The primary goal is to sustain and enable businesses to operate efficiently with certain proven innovations. Also, the target is to expand the presence of the organisation in the market, and keep customers happy. IT experts should know the goal of the business before adopting technology. They can think through challenges like how to make sure the dispatch operations run without stopping, ensure there’s adequate disaster resilience, and that end users are being productive with such tools and services. Successful IT leaders have a consultancy and advisory approach. They understand the needs of the business and can conceive solutions and relay them in a way that gets the buy-in from the leadership.

By Vincent Matinde

IBM Flash Storage and Cyber Resiliency

Flash storage has historically had a reputation for delivering large amounts of storage capacity and high performance in a relatively small package. But with the current threat landscape, it has become important to focus on the resilience of flash. 

IBM's 2021 Cost of a Data Breach Report found that the average cost of a customer data breach is more than four million dollars, and recovery from such an event can take days or even weeks. IBM is responding to the need for protection and rapid recovery from ransomware and other cyber threats by releasing new data resilience capabilities for its FlashSystem family of all-flash arrays.

Flash storage with the power of data protection

Even if your company has a robust security strategy, you still need to be prepared if and when an attack succeeds. IBM empowers organizations to recover from this eventuality by enhancing its FlashSystem storage with IBM Safeguarded Copy. 

Safeguarded Copy enables flash storage to play a role in recovery by automatically creating point-in-time snapshots on production storage on an administrator-defined schedule. Once snapshots have been created, they cannot be changed or deleted. These protections prevent malware and internal threats from tampering with backups.
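The behavior described, point-in-time copies that cannot be changed or deleted once taken, can be modeled in a few lines. This is a toy illustration, not IBM's implementation.

```python
# Toy model of safeguarded snapshots: point-in-time copies that cannot be
# altered or deleted after creation. Illustrative only, not IBM's code.

from types import MappingProxyType

class SafeguardedStore:
    def __init__(self):
        self.snapshots = []  # (label, read-only snapshot) pairs

    def take_snapshot(self, label, volume):
        """Capture an immutable point-in-time copy of a volume's blocks."""
        frozen = MappingProxyType(dict(volume))  # read-only view of a private copy
        self.snapshots.append((label, frozen))
        return frozen

    def delete_snapshot(self, label):
        raise PermissionError("safeguarded snapshots cannot be deleted")

volume = {"block0": "payroll v1"}
store = SafeguardedStore()
snap = store.take_snapshot("02:00", volume)
volume["block0"] = "ENCRYPTED-BY-RANSOMWARE"  # later tampering with production...
clean = snap["block0"]                        # ...never touches the snapshot
```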

With Safeguarded Copy, companies can recover from an attack quickly and completely. Safeguarded Copy snapshots reside on the same FlashSystem storage as operational data, which dramatically reduces recovery time when compared to tiered or offsite copy-based recovery solutions.

Rapid recovery with IBM FlashSystem Cyber Vault

IBM has also enhanced its FlashSystem storage with IBM FlashSystem Cyber Vault to enable it to quickly perform all three stages of the recovery process: detection, response and recovery. 

Cyber Vault runs continuously, monitoring snapshots as Safeguarded Copy creates them and using standard database tools and other software to verify that the snapshots haven't been compromised. If Cyber Vault finds that snapshots have been corrupted, it interprets that as a sign of an attack. By quickly determining which snapshots are safe to use, Cyber Vault can reduce recovery time from days to hours.
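The check Cyber Vault performs amounts to asking whether each snapshot still matches what was originally written. A checksum-based sketch (illustrative only; Cyber Vault uses database tools and other software rather than this code):

```python
# Sketch of snapshot integrity checking: record a hash at creation time,
# then flag any snapshot whose content no longer matches. Illustrative only.

import hashlib

def fingerprint(data: bytes) -> str:
    """Content hash recorded when a snapshot is created."""
    return hashlib.sha256(data).hexdigest()

def find_corrupted(snapshots, recorded):
    """Return labels whose current hash differs from the recorded one."""
    return [label for label, data in snapshots.items()
            if fingerprint(data) != recorded[label]]

snapshots = {"02:00": b"orders...", "03:00": b"orders..."}
recorded = {label: fingerprint(data) for label, data in snapshots.items()}
snapshots["03:00"] = b"\x00garbage\x00"        # simulate post-snapshot tampering
suspect = find_corrupted(snapshots, recorded)  # flags "03:00" as a likely attack
```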

Flash storage designed for resiliency

IBM has added members to its FlashSystem family that are built to deliver on performance while also providing resilience: FlashSystem 9500 and 7300. 

The FlashSystem 9500 is IBM's flagship enterprise storage array, designed for environments that need the highest capability and resilience. It offers twice the performance, connectivity and capacity of its predecessor and 50 percent more cache. The 9500 also provides data resilience with numerous safeguards, including multi-factor authentication (MFA) and secure boot to help ensure that only IBM-authorized software runs on the system. Additionally, IBM's FlashCore Modules (FCMs) offer real-time hardware-based encryption and up to a 7x increase in endurance compared to commodity SSDs.

The IBM FlashSystem 7300 offers about 25 percent better performance than the previous generation of FlashSystem storage. It has a smaller footprint than the 9500 but runs the same software and features, including 3:1 real-time compression and hardware encryption. The FlashSystem 7300 supports up to 2.2PB effective capacity per 2U control enclosure. 
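The "effective capacity" figures here follow from simple arithmetic: raw capacity multiplied by the data-reduction ratio. A quick sketch, assuming the stated 3:1 compression applies to all data (real workloads may reduce less):

```python
# Effective-capacity arithmetic behind figures like 2.2PB per 2U enclosure,
# assuming the stated 3:1 data-reduction ratio holds across the data set.

def effective_capacity(raw_tb, reduction_ratio=3.0):
    """Usable logical capacity after data reduction."""
    return raw_tb * reduction_ratio

def raw_needed(effective_tb, reduction_ratio=3.0):
    """Physical capacity required for a target effective capacity."""
    return effective_tb / reduction_ratio

print(effective_capacity(100))   # 100TB raw stores ~300TB of logical data
print(round(raw_needed(2200)))   # ~733TB raw behind 2.2PB (2200TB) effective
```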

The IBM FlashSystem family offers two- and three-site replication along with configuration options that can include an optional 100 percent data availability guarantee for business continuity.

Explore next-generation flash storage

The IBM FlashSystem family is continuously evolving with expanded capabilities around capacity, performance and data protection. 

WWT can help your company evaluate and choose the right flash storage solution to meet your needs. WWT is an IBM-designated global and regional systems integrator (SI) and solution provider, and we know how important data protection is for modern companies. We encourage your organization to take a holistic approach to data resilience.

Sun, 24 Jul 2022
How 'living architecture' could help the world avoid a soul-deadening digital future

(The Conversation is an independent and nonprofit source of news, analysis and commentary from academic experts.)

(THE CONVERSATION) My first Apple laptop felt like a piece of magic made just for me – almost a part of myself. The rounded corners, the lively shading, the delightful animations. I had been using Windows my whole life, starting on my family’s IBM 386, and I never thought using a computer could be so fun.

Indeed, Apple co-founder Steve Jobs said that computers were like bicycles for the mind, extending your possibilities and helping you do things not only more efficiently but also more beautifully. Some technologies seem to unlock your humanity and make you feel inspired and alive.

But not all technologies are like this. Sometimes devices do not work reliably or as expected. Often you have to change to conform to the limitations of a system, as when you need to speak differently so a digital voice assistant can understand you. And some platforms bring out the worst in people. Think of anonymous flame wars.

As a researcher who studies technology, design and ethics, I believe that a hopeful way forward comes from the world of architecture. It all started decades ago with an architect’s observation that newer buildings tended to be lifeless and depressing, even if they were made using ever fancier tools and techniques.

Tech’s wear on humanity

The problems with technology are myriad and diffuse, and widely studied and reported: from short attention spans and tech neck to clickbait and AI bias to trolling and shaming to conspiracy theories and misinformation.

As people increasingly live online, these issues may only get worse. Some recent visions of the metaverse, for example, suggest that humans will come to live primarily in virtual spaces. Already, people worldwide spend on average seven hours per day on digital screens – nearly half of waking hours.

While public awareness of these issues is on the rise, it’s not clear whether or how tech companies will be able to address them. Is there a way to ensure that future technologies are more like my first Apple laptop and less like a Twitter pile-on?

Over the past 60 years, the architectural theorist Christopher Alexander pursued questions similar to these in his own field. Alexander, who died in March 2022 at age 85, developed a theory of design that has made inroads in architecture. Translated to the technology field, this theory can provide the principles and process for creating technologies that unlock people’s humanity rather than suppress it.

How good design is defined

Technology design is beginning to mature. Tech companies and product managers have realized that a well-designed user interface is essential for a product’s success, not just nice to have.

As professions mature, they tend to organize their knowledge into concepts. Design patterns are a great example of this. A design pattern is a reusable solution to a problem that designers need to solve frequently.

In user experience design, for instance, such problems include helping users enter their shipping information or get back to the home page. Instead of reinventing the wheel every time, designers can apply a design pattern: clicking the logo at the upper left always takes you home. With design patterns, life is easier for designers, and the end products are better for users.
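To make the idea concrete, here is a minimal sketch of that logo-links-home pattern as a reusable component; the class name and markup are illustrative assumptions, not taken from any particular framework.

```python
class HomeLink:
    """Reusable 'clicking the logo takes you home' pattern: solved once,
    then applied on every page instead of being reinvented each time."""

    def __init__(self, home_url: str = "/", logo: str = "/logo.png"):
        self.home_url = home_url
        self.logo = logo

    def render(self) -> str:
        # Upper-left logo wrapped in a link back to the home page.
        return f'<a href="{self.home_url}"><img alt="Home" src="{self.logo}"></a>'

# Every page header reuses the same tested solution.
print(HomeLink().render())
# -> <a href="/"><img alt="Home" src="/logo.png"></a>
```

The payoff is exactly the one described above: designers stop re-solving the problem, and users get consistent behavior across products.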

Design patterns facilitate good design in one sense: They are efficient and productive. Yet they do not necessarily lead to designs that are good for people. They can be sterile and generic. How, exactly, to avoid that is a major challenge.

A seed of hope lies in the very place where design patterns originated: the work of Christopher Alexander. Alexander dedicated his life to understanding what makes an environment good for humans – good in a deep, moral sense – and how designers might create structures that are likewise good.

His work on design patterns, dating back to the 1960s, was his initial effort at an answer. The patterns he developed with his colleagues included details like how many stories a good building should have and how many light sources a good room should have.

But Alexander found design patterns ultimately unsatisfying. He took that work further, eventually publishing his theory in his four-volume magnum opus, “The Nature of Order.”

While Alexander’s work on design patterns is very well known – his 1977 book “A Pattern Language” remains a bestseller – his later work, which he deemed much more important, has been largely overlooked. No surprise, then, that his deepest insights have not yet entered technology design. But if they do, good design could come to mean something much richer.

On creating structures that foster life

Architecture was getting worse, not better. That was Christopher Alexander’s conclusion in the mid-20th century.

Much modern architecture is inert and makes people feel dead inside. It may be sleek and intellectual – it may even win awards – but it does not help generate a feeling of life within its occupants. What went wrong, and how might architecture correct its course?

Motivated by this question, Alexander conducted numerous experiments throughout his career, going deeper and deeper. Beginning with his design patterns, he discovered that the designs that stirred up the most feeling in people, what he called living structure, shared certain qualities. This wasn’t just a hunch, but a testable empirical theory, one that he validated and refined from the late 1970s until the turn of the century. He identified 15 qualities, each with a technical definition and many examples.

The qualities are:

- Levels of scale

- Strong centers

- Boundaries

- Alternating repetition

- Positive space

- Good shape

- Local symmetries

- Deep interlocking and ambiguity

- Contrast

- Gradients

- Roughness

- Echoes

- The void

- Simplicity and inner calm

- Not-separateness

As Alexander writes, living structure is not just pleasant and energizing, though it is also those. Living structure reaches into humans at a transcendent level – connecting people with themselves and with one another – with all humans across centuries and cultures and climates.

Yet modern architecture, as Alexander showed, has very few of the qualities that make living structure. In other words, over the 20th century architects taught one another to do it all wrong. Worse, these errors were crystallized in building codes, zoning laws, awards criteria and education. He decided it was time to turn things around.

Alexander’s ideas have been hugely influential in architectural theory and criticism. But the world has not yet seen the paradigm shift he was hoping for.

By the mid-1990s, Alexander recognized that for his aims to be achieved, there would need to be many more people on board – and not just architects, but all sorts of planners, infrastructure developers and everyday people. And perhaps other fields besides architecture. The digital revolution was coming to a head.

Alexander’s invitation to technology designers

As Alexander doggedly pursued his research, he started to notice the potential for digital technology to be a force for good. More and more, digital technology was becoming part of the human environment – becoming, that is, architectural.

Meanwhile, Alexander’s ideas about design patterns had entered the world of technology design as a way to organize and communicate design knowledge. To be sure, this older work of Alexander’s proved very valuable, particularly to software engineering.

Because of his fame for design patterns, in 1996 Alexander was invited to provide a keynote address at a major software engineering conference sponsored by the Association for Computing Machinery.

In his talk, Alexander remarked that the tech industry was making great strides in efficiency and power but perhaps had not paused to ask: “What are we supposed to be doing with all these programs? How are they supposed to help the Earth?”

“For now, you’re like guns for hire,” Alexander said. He invited the audience to make technologies for good, not just for pay.

Loosening the design process

In “The Nature of Order,” Alexander defined not only his theory of living structure, but also a process for creating such structure.

In short, this process involves democratic participation and springs from the bottom up in an evolving progression incorporating the 15 qualities of living structure. The end result isn’t known ahead of time – it’s adapted along the way. The term “organic” comes to mind, and this is appropriate, because nature almost invariably creates living structure.

But typical architecture – and design in many fields – is, in contrast, top-down and strictly defined from the outset. In this machinelike process, rigid precision is prioritized over local adaptability, project roles are siloed apart and the emphasis is on commercial value and investment over anything else. This is a recipe for lifeless structure.

Alexander’s work suggests that if living structure is the goal, the design process is the place to focus. And the technology field is starting to show inklings of change.

In project management, for example, the traditional waterfall approach followed a rigid, step-by-step schedule defined upfront. The turn of the century saw the emergence of a more dynamic approach, dubbed agile, which allows for more adaptability through frequent check-ins and prioritization, progressing in “sprints” of one to two weeks rather than longer phases.

And in design, the human-centered design paradigm is likewise gaining steam. Human-centered design emphasizes, among other elements, continually testing and refining small changes with respect to design goals.

A design process that promotes life

However, Alexander would say that both these trajectories are missing some of his deeper insights about living structure. They may spark more purchases and increase stock prices, but these approaches will not necessarily create technologies that are good for each person and good for the world.

Yet there are some emerging efforts toward this deeper end. For example, design pioneer Don Norman, who coined the term “user experience,” has been developing his ideas on what he calls humanity-centered design. This goes beyond human-centered design to focus on ecosystems, take a long-term view, incorporate human values and involve stakeholder communities along the way.

The vision of humanity-centered design calls for sweeping changes in the technology field. This is precisely the kind of reorientation that Alexander was calling for in his 1996 keynote speech. Just as design patterns suggested in the first place, the technology field doesn’t need to reinvent the wheel. Technologists and people of all stripes can build up from the tremendous, careful work that Alexander has left.

This article is republished from The Conversation under a Creative Commons license. Read the original article here:

Tue, 09 Aug 2022
Telos Corporation's (TLS) CEO John Wood on Q2 2022 Results - Earnings Call Transcript

Telos Corporation (NASDAQ:TLS) Q2 2022 Earnings Conference Call August 9, 2022 8:30 AM ET

Company Participants

Christina Mouzavires - Investor Relations

John Wood - Chairman and Chief Executive Officer

Mark Bendza - Executive Vice President and CFO

Mark Griffin - Executive Vice President, Security Solutions

Conference Call Participants

Zach Cummins - B. Riley

Rudy Kessinger - D.A. Davidson

Alex Henderson - Needham & Company

Nehal Chokski - Northland Capital Markets

Brad Clark - BMO


Operator

Good day and thank you for standing by. Welcome to the Telos Corporation Second Quarter 2022 Earnings Conference Call. At this time, all participants are in a listen-only mode. After the speakers’ presentation, there will be a question-and-answer session. [Operator Instructions]

Please be advised, today’s conference is being recorded. I would now like to hand the conference over to your speaker today, Christina Mouzavires. Please go ahead.

Christina Mouzavires

Good morning. Thank you for joining us to discuss Telos Corporation’s second quarter 2022 financial results. With me today is John Wood, Chairman and CEO of Telos; and Mark Bendza, Executive Vice President and CFO of Telos.

Let me quickly review the format of today’s presentation. John will begin with brief remarks on our 2022 second quarter results and Telos’ strategic priority, and Mark will cover the financials and guidance for the third quarter and full year 2022. Then we will open the line for questions-and-answers where Mark Griffin, Executive Vice President of Security Solutions will also join us.

The earnings press release was issued earlier today and is posted on the Telos Investor Relations website where this call is being simultaneously webcast. Additionally, we have provided presentation slides on our Investor Relations website.

Before we begin, we want to emphasize that some of our statements on this call are forward-looking statements and are made under the safe harbor provisions of the federal securities laws. These statements are based on current expectations and assumptions that are subject to risks and uncertainties.

Actual results could materially differ for various reasons, including the factors described in today’s earnings press release and the comments made during this conference call and in our SEC filings. We do not undertake any duty to update any forward-looking statements.

In addition, during today’s call we will discuss non-GAAP financial measures, which we believe are useful as supplemental and clarifying measures that help investors understand Telos’ financial performance.

These non-GAAP financial measures should be considered in addition to, and not as a substitute for or in isolation from, GAAP results. You can find additional disclosures regarding these non-GAAP measures, including reconciliations with comparable GAAP results, in our earnings press release and on the Investor Relations portion of our website. Please also note that financial comparisons are year-over-year unless otherwise specified.

The webcast replay of this call will be available for the next year on our company website under the Investor Relations page.

With that, I will turn the call over to John.

John Wood

Thank you, Christina, and good morning, everyone. Let’s begin today on slide three. I am pleased to report that Telos over delivered again on key financial metrics in the second quarter of 2022. Mark will discuss our financial performance later in this call, but at a high level, we delivered $55.8 million of revenue in the second quarter, above our guidance range of $50 million to $54 million, up 4% year-over-year and 11%, sequentially. Gross margin was 37.5%, above our guidance range of 33% to 35%. Finally, we delivered $4.5 million of adjusted EBITDA, above the high end of our guidance range of negative $2 million to positive $2 million and $0.04 of adjusted EPS.

Now, let’s turn to slide four to discuss our recent business highlights and updates. This quarter we announced a new strategic partnership with IBM. Telos is the launch partner for the new active governance service, or AGS, offering with IBM Security.

Telos and IBM are teaming to provide capabilities to address the significant challenges organizations are facing with cybersecurity and risk compliance. AGS is a unique and comprehensive offering, coupling the Xacta suite of tools with IBM’s services and security expertise to significantly improve the efficiency of clients’ approach to cybersecurity risk management in today’s increasingly challenging cyber environment. Target customers include large enterprise organizations in global markets such as financial services, healthcare, telecommunications and energy.

We are very excited about this opportunity to partner with IBM, a leading global organization that brings recognized thought leadership and leading capability in the cybersecurity management space. This relationship also enables us to effectively broaden our reach in the global marketplace for sales of our Xacta suite of tools to drive future growth for Telos.

Beyond the IBM partnership, we have continued to maintain momentum in the current environment. Within the Security Solutions business, Telos received Xacta renewals with several key customers, including the Central Intelligence Agency, The U.S. Department of the Interior, The U.S. Environmental Protection Agency, our U.S. Federal Reserve Bank and The U.S. Department of Energy, as well as Salesforce.

The company was also awarded new contracts with a foreign government customer, The U.S. Army Space and Missile Command, The U.S. Department of Homeland Security, Palantir Technologies and OmniHealth.

We continue to focus on the government and commercial space, and in particular, prioritizing regulated industries. The company also received an important Ghost renewal with a classified customer to continue providing support. Additionally, we were awarded up to a 10-year contract to continue and expand our aviation security practice with the U.S. Transportation Security Administration.

Our ONYX technology won first place in the Mobile Fingerprint Information Challenge posted by the National Institute of Standards and Technology.

Finally, the Secure Networks business continued to add to its backlog with new wins, including a new contract to support The U.S. Air Force SIPRNet Enterprise Modernization.

Let me turn now to some comments on the industry landscape and a number of recent initiatives in Washington, D.C., that present opportunities for Telos. There are indications that Congress plans to boost spending above the level called for by President Biden in his proposed FY 2023 budget.

The House and Senate versions of the Annual Defense Authorization Bill provides for increasing topline defense spending, respectively $37 billion to $45 billion above the level proposed by the President.

We still have to see how the appropriations process plays out this fall to know how much funding will actually be provided for our military customers, but signs are there that the FY 2023 defense budget will see a meaningful increase.

On the non-defense side, as with defense, we will have to wait for Congress to agree on appropriations legislation. But so far, the spending bills under consideration reflect a consensus that more funding is needed for cybersecurity throughout the various departments and agencies.

A great example of this is with CISA, The Department of Homeland Security’s cybersecurity agency. CISA works to detect and mitigate the effects of cyber attacks on federal, state and local governments, and the private sector, and then manage cyber risks to our critical infrastructure. We understand that recognizing the importance of this mission, the draft Senate Appropriations Bill for DHS seeks to provide CISA a 16% increase above last year’s funding.

Congress clearly recognizes that more resources are needed by federal departments and agencies to combat challenges they face in the cyberspace. A major factor in that thinking is the Ukraine situation, which has resulted in continued warnings of potential cyber attacks against U.S. interests, including against U.S. critical infrastructure.

So far, the United States has done an excellent job in preventing what had been expected to be widespread impacts from cyber attacks in retaliation for our support for Ukraine. The policy makers and companies like ours know that the public and private sectors can’t let up and they must continue to follow cybersecurity best practices, including deploying and updating effective cyber defenses.

I will now turn the call over to Mark who will discuss second quarter 2022 financial results and our guidance for the third quarter and full year 2022. Mark?

Mark Bendza

Thank you, John, and thank you everyone for joining us today. Let’s turn to slide five. As John mentioned, we delivered a strong second quarter, with results that exceeded our guidance on key financial metrics.

We reported revenue, gross margin and adjusted EBITDA above the high end of our guidance range. We also delivered $5.4 million of free cash flow, representing a nearly four-fold increase in free cash flow year-over-year.

Before I turn to the year-over-year comparison, I just wanted to remind everyone again, as I did on our last earnings call, that we had a large delivery on a lower margin program in our Secure Networks business last year that was pulled forward from the second quarter of 2021 to the first quarter of 2021 per the request of our customer.

The accelerated delivery caused the Secure Networks contribution to total revenue to shift from 60% in the first quarter of 2021 to 40% in the second quarter of 2021 and gross margin to shift from 25.9% in the first quarter of 2021 to 42% in the second quarter of 2021, thereby skewing some of the second quarter year-over-year comparisons this year.

So I will provide year-over-year comparisons for the second quarter as usual and also for the first half overall to normalize for the accelerated shipment from the second quarter to the first quarter of last year.

Okay, with that backdrop, I will go into details. For the second quarter, total sales were $55.8 million, up 11% sequentially and up 4% year-over-year. Performance above the high end of the guidance range of $50 million to $54 million was driven by favorable timing variances on pre-existing higher margin programs in Security Solutions and strong supply chain management in Secure Networks.

Security Solutions sales were $30.8 million, up 15% sequentially and down 4% year-over-year, due to lower revenues on a classified program and the completion of the U.S. Census program, partially offset by growth in other pre-existing programs.

Secure Network sales were $25 million, up 7% sequentially and up 17% year-over-year, due to continued strong supply chain management, higher revenues on major programs and favorable year-over-year comparison due to the previously mentioned large delivery that pulled forward from the second quarter of 2021 to the first quarter of 2021.

Turning to profitability and cash flow, second quarter gross margin was 37.5%, above our guidance range of 33% to 35%, primarily due to the margin outperformance in Security Solutions.

Gross margin contracted 449 basis points year-over-year and gross profit declined 7%. The gross margin contraction was driven by a less favorable sales mix between Security Solutions and Secure Networks compared to last year, as well as gross margin contraction within Secure Networks, both of which were the result of the previously mentioned early shipments in 2021.

Security Solutions revenues as a percentage of total company revenues declined from 60% in 2021 to 55% in 2022, and Secure Networks gross margin contracted nearly 700 basis points to 18%. Security Solutions gross margin held constant at 53.3%.

Adjusted EBITDA declined by approximately $700,000 due to lower gross profit, partially offset by lower below the line expenses.

Free cash flow improved nearly four-fold to $5.4 million. The improvement in free cash flow continued the trends from the first quarter of more favorable working capital dynamics compared to last year and created an opportunity to begin returning capital to shareholders.

On May 24th, we announced that our Board of Directors authorized a share repurchase program for up to $50 million of the company’s stock. During the second quarter, we deployed $3 million to repurchase over 360,000 shares at a weighted average price of $8.33, and we have continued repurchasing stock daily during the third quarter. In the third quarter, through last Friday, we deployed an additional $1.1 million to repurchase nearly 143,000 shares at a weighted average price of $7.86.

Now let’s recap the first half overall to normalize for the accelerated shipments from the second quarter to the first quarter of 2021. First half revenues declined 3%. Secure Networks revenues declined 11%, as expected, due to the headwind associated with the ongoing wind down of two large programs in 2022. Security Solutions revenues grew 5%, primarily due to the ramp up of a confidential program.

First half gross margin expanded 374 basis points to 37.6% and gross profit increased 8%. The gross margin expansion was driven by a more favorable sales mix within Security Solutions and Secure Networks, as well as gross margin expansion within Security Solutions.

Security Solutions revenues as a percentage of total company revenues increased from 50% in 2021 to 54% in 2022 and Security Solutions, gross margin expanded 638 basis points to 54.5% due to the ramp of high margin progress. Secure Networks gross margin contracted 206 basis points to 17.2%.

Adjusted EBITDA declined $1.3 million due to higher SG&A, offsetting $2.8 million of higher gross profit.

Lastly, free cash flow was $10.3 million higher due to favorable working capital dynamics, driving significantly better cash flow from operations in the first and second quarters. Overall, our first half has performed ahead of forecast and guidance, primarily due to favorable timing differences -- variances between the second half and the first half in orders and deliveries on pre-existing programs and diligent supply chain management.

Now, let’s turn to slide six to discuss our outlook for the third quarter. For the third quarter, we forecast sales in a range of $58 million to $62 million, up 4% to 11% sequentially and down 10% to 15% year-over-year.

We forecast Security Solutions revenues to be down mid- to high-teens year-over-year, primarily due to the completion of the 2020 Census Program in 2021, lower orders expected on a single pre-existing program and lumpiness of perpetual licensing.

We continue to make good progress on the TSA PreCheck program, but revenues for this program in 3Q, if any, are expected to be de minimis. We expect Secure Networks revenues to be down mid-single digits to mid-teens year-over-year due to the ongoing wind down of two large programs coming to a successful completion.

We expect gross margins to be down approximately 350 basis points to 500 basis points year-over-year, primarily due to a slightly lower weighting of revenues to our high margin Security Solutions segment and revenue within both Security Solutions and Secure Networks mixing lower in the quarter.

Below the line expenses, excluding stock compensation expense, are expected to be approximately $1 million higher due to the ramp of R&D and G&A investments during 2021. Adjusted EBITDA is expected to be $3.5 million to $5 million, representing a 6% to 8% margin.

Now, let’s turn to slide seven to discuss our updated outlook for 2022. For the full year, we have narrowed our revenue range from our prior guidance of $226 million to $257 million to our updated range of $226 million to $242 million. There is no change to the low end of the revenue range.

The reduction at the high end of the range reflects lower assumptions on TSA PreCheck revenues and new business in the second half, partially offset by higher revenues on pre-existing programs within Security Solutions.

We have lowered and narrowed our adjusted EBITDA range from our prior guidance of $21 million to $28 million to our updated range of $18 million to $24 million. The reduction at the high end of the range reflects lower gross profit associated with the corresponding revenue reduction, partially offset by lower than previously forecasted below-the-line expenses. The reduction at the low end of the range primarily reflects the impact of lower than previously forecasted gross margins on Secure Networks in the second half, including on new business.

Overall, we have performed ahead of forecast in the first half, our core business is performing well and we expect that to continue, pre-existing programs are performing well, sequential sales growth is expected to continue into the third and fourth quarters as originally planned, and we are taking a slightly more cautious approach to new business in the second half in part as a result of the more complex macro environment, which could create some headwinds for our new business growth initiatives in the short-term.

With that, I will pass it back to John who will wrap up on slide eight.

John Wood

Thanks, Mark. In summary, we delivered a solid second quarter during which we formed a new strategic partnership with IBM and outpaced guidance on our key financial metrics. We also delivered gross margin expansion and strong free cash flow in the first half of the year and have begun to return free cash flow to shareholders through share repurchases.

Our core business and pre-existing programs are performing well and we expect that to continue for the balance of the year. We are taking a slightly more cautious approach to new business in the second half of the year and are managing our forecast and expenses accordingly.

With that, we are happy to take questions.

Question-and-Answer Session


Operator

Thank you. [Operator Instructions] Our first question comes from the line of Zach Cummins with D.A. Davidson, oh, I am sorry, with B. Riley. Your line is open. Please go ahead.

Zach Cummins

Yeah. Thanks. Good morning. Hi, John. Hi, Mark. Thanks for taking my questions. Mark, I -- my question is really geared towards the updated guidance for the year. I mean can you provide a little more granularity around the assumptions you are making for TSA PreCheck and maybe why you are taking a slightly more cautious approach to new business wins here in the second half of the year?

Mark Bendza

Yeah. Sure, Zach. Thanks for your question. So why don’t I dissect that a little bit for you? So, at the high end of the guidance range, we are taking sales down by $15 million, $11 million of the $15 million is PreCheck net revenue. So we previously assumed $12 million of net revenues for PreCheck at the high end of the guidance, now we are assuming $1 million.

The PreCheck process is progressing well. Obviously, we don’t have the ATO yet and so we felt it appropriate to take that guide down, but certainly, wanted to leave revenue in there as a recognition that we still expect the ATO this year.

The balance of the $4 million, the other $4 million, is really net reductions across the rest of the portfolio, primarily driven by lower assumptions on new business in the second half. The thought there is, even though we are not seeing impact from the more complicated macro environment right now in our core business, our core business is performing very well, it’s not being impacted by the macro environment and you are seeing that in the second quarter results.

But we wanted to acknowledge, at least as we scrubbed the forecast for PreCheck, that we wanted to take a broader look at some of the higher-risk items in the forecast. For example, anywhere we are selling new solutions, or pre-existing solutions to new customers in new end markets, we wanted to take a slightly more cautious approach there.

So that’s the $4 million of additional net reduction. To put that in perspective, at the midpoint of the range that would represent about 80 basis points of year-over-year growth, so a very modest reduction as a nod, in part to the macro environment, but very modest nonetheless.

On adjusted EBITDA, at the high end of the guidance, we are taking it down by $4 million. That is the reduction in gross profit corresponding to the revenue reduction, partially offset by a reduction in below-the-line expense.

And then at the low end, no change to sales, but what you are seeing in the $3 million of lower adjusted EBITDA is lower gross margin on Secure Networks, primarily in new business in the second half.
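The breakdown in this answer can be checked against the guidance ranges from the prepared remarks; a quick arithmetic sketch, with all figures in $ millions as stated on the call:

```python
# Figures stated on the call, in $ millions.
prior_rev_high, updated_rev_high = 257, 242
prior_precheck, updated_precheck = 12, 1
prior_ebitda, updated_ebitda = (21, 28), (18, 24)

rev_cut = prior_rev_high - updated_rev_high       # total high-end revenue reduction
precheck_cut = prior_precheck - updated_precheck  # portion attributable to PreCheck
other_cut = rev_cut - precheck_cut                # net reductions across the rest
ebitda_cut_low = prior_ebitda[0] - updated_ebitda[0]
ebitda_cut_high = prior_ebitda[1] - updated_ebitda[1]

print(rev_cut, precheck_cut, other_cut, ebitda_cut_low, ebitda_cut_high)
# -> 15 11 4 3 4
```

The results match the $15 million total cut, the $11 million PreCheck component, the $4 million balance, and the $3 million/$4 million adjusted EBITDA reductions quoted on the call.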

Zach Cummins

Understood. That’s helpful. Much appreciated and best of luck in the coming quarter.

John Wood

Thanks a lot.

Mark Bendza

Thanks, Zach.


Operator

Our next question comes from the line of Rudy Kessinger with D.A. Davidson. Your line is open. Please go ahead.

Rudy Kessinger

Hey, guys. So just following up on that question there, on the $4 million reduction at the top end, the more conservative assumptions ex-TSA across the rest of the portfolio. I guess I would just ask: the channel and the direct sales reps, are they meeting your expectations from, say, the start of this year on pipeline build and sales production as we get into the second half of the year? And then, secondly, on IBM, do you have anything incremental baked into the guide this year for IBM? And bigger picture, how big of a driver of growth can IBM be, say, in 2023?

John Wood

Hey, Rudy. This is John. I will take first -- I will take the second question and I will ask Mark Griffin to answer the first one. As it relates to IBM, we have a couple of hundred thousand dollars in our model for purposes of this year.

As it relates to how big it can be, we think it could be quite sizable, but I am not able to provide you a modeling perspective as yet. What I can say, however, is that their pipeline is filling up quite rapidly with what I would consider to be Tier 1 names: large car manufacturers, large banks, large pharmaceutical companies, countries, et cetera, places that I think would be very difficult for us to get into on our own. Really what's happened is that they have embedded Xacta as their launch partner in their governance solutions. So I think it's got a lot of potential in front of us.

As we put out our guide for 2023, I am sure we will provide you much greater detail. But I am quite happy with how that relationship is coming out in a fully blossoming way, much like I had hoped it was going to be with the cloud service providers, although they have been, as you are well aware, quite a bit slower. Here, IBM has completely embraced it. They are also looking at using it internally. So I think there is a great opportunity for us with IBM over the next five to 10 years. And Mark, if you have a mic, can you answer the first question on the sales force?

Mark Griffin

Sure. Hello, Rudy. Mark Griffin. Commercial adoption is happening, but obviously we took a more cautious and slower approach than initially planned. We are continuing to fine-tune the staff, not only in the sales area but also by increasing the capture and business development areas, to achieve operational efficiencies and maximize our potential.

So, yes, we are seeing progress. The pipeline is increasing. We are seeing some opportunities that will close in late Q3 and in Q4. But we continue to fine-tune that staff and look for additional opportunities and growth across operations in the sales and capture/BD areas.

John Wood

Go ahead.


Operator

Thank you. And our next question comes from the line of Alex Henderson with Needham & Company. Your line is open. Please go ahead.

Alex Henderson

Thanks. I am going to break rank a little bit and ask two questions; one is just why you think there's any improvement in TSA. The primary question is on Xacta. It's very difficult looking at the numbers to cut through the noise and understand exactly what's going on with the product. Can you provide us some sense of the growth rate, based on your current guidance, for Xacta on a full year basis? Is it actually producing double-digit growth, is it flat, is it up 20%? Can you just provide some parameters around what the true underlying growth rate is, because it's kind of lost in the numbers?

Mark Bendza

Yeah. Hey, Alex. It's Mark. So on our Information Assurance business for 2022, as you know, we don't guide at that level. But I would say we are probably going to end somewhere in the, call it, low-to-mid single digits on the year, higher at the high end of the range, but at the midpoint, kind of mid-single digits.

Alex Henderson

And the reason for the TSA optimism, that it actually is going to close? I mean, you thought it was going to close in September, then you thought it was going to close at the end of the year, and now we are still thinking it's somehow going to close and that things have improved. What makes you think that?

Mark Griffin

Sure, Alex. This is Mark Griffin. So ultimately we follow TSA guidelines and schedule for launch. We are engaged with them extensively on a daily basis going through their launch plan and their security approvals.

We are getting to the end of that schedule, and we are in the process now of deploying to our enrollment sites and gearing up training and operational enrollment capabilities for those sites. So every indication is we are following TSA's schedule. They are positive on our results at this point, and we fully expect to launch this year.

Alex Henderson

So just so I understand, when you say gearing up training, they have instructed you to train your employees, and they understand that that's an expense you are carrying, and therefore they wouldn't stretch that…

Mark Griffin


Alex Henderson

…ask you to do that if it wasn't imminent. Is that the right way we should be reading that?

John Wood

Yeah. Would you explain a little more about...

Mark Griffin


John Wood

…if you could.

Mark Griffin

Alex, yeah, the entire program is under guidance and policy and procedures from TSA. So every aspect of the program is reviewed and approved by TSA. And so everything we do from approval of sites, to training of personnel, to our soft launch, to our security process and procedures are all controlled by TSA.

So, yes, TSA reviews every document. There are contractual deliverables that we have to adhere to on every aspect of this launch. So, yes, TSA is the ultimate approver of when we launch, but we are meeting their schedules and we are doing everything that they are asking, in the time frame they are asking, for a launch this year.

Alex Henderson

Great. Thanks.


Operator

Thank you. And our next question comes from the line of Nehal Chokski with Northland Capital Markets. Your line is open. Please go ahead.

Nehal Chokski

Yeah. Thank you, and congrats on the solid results, and I commend you, Mark, on an especially clear guidance deck. Thank you very much for that. Where are you guys in terms of the percent of software billings sold on a term basis versus a perpetual basis now, relative to one, two and four quarters ago?

John Wood

That’s a good question. I would say, the majority of what we are selling now, Nehal, is subscription or term versus perpetual and that’s true in our pipeline as well that the vast majority of our pipeline are subscription oriented.

There are a couple of exceptions, a couple of government examples, but the vast majority of the remaining pipeline, whether you are talking about ACA or Ghost or Xacta, is going to be subscription-based or term-based licenses versus perpetual.

Nehal Chokski

Okay. Great. And how much of an impact does that transition have on the projection of low-to-mid single-digit growth for Xacta?

John Wood

It definitely has an impact. I don't know the number off the top of my head. But in the past, when we did, say, $6.5 million in revenue, that was all perpetual. My guess right now is we are at about 50% or 60% perpetual currently, and I think going forward the vast majority is going to be term or subscription.

Nehal Chokski

And then to be clear, what is -- for every dollar of perpetual that’s capitalized into term, what the...

John Wood

Basically, what that means is, if I am delivering on a $6 million number for the year and it's all term, I have got to deliver $12 million of orders no later than June 30th.
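A back-of-the-envelope way to see the "deliver $12 million of orders by June 30th for a $6 million year" point is to model a term license as recognizing ratably from its booking date. This is an illustrative simplification, not Telos' actual revenue-recognition policy, which is not specified on the call:

```python
from datetime import date

def in_year_revenue(order_value, booking_date, year=2022):
    """In-year revenue from a term order recognized ratably over
    twelve months, starting at the booking date (illustrative model)."""
    year_end = date(year, 12, 31)
    days_remaining = (year_end - booking_date).days + 1
    return order_value * min(days_remaining / 365, 1.0)

# A perpetual order recognizes essentially up front, so $6M of orders
# yields $6M of revenue. A term order booked June 30 has only about half
# the year left to recognize, so roughly $12M of term orders is needed
# to produce ~$6M of in-year revenue.
term_revenue = in_year_revenue(12_000_000, date(2022, 6, 30))
```

Under this model the $12 million of mid-year term orders produces just over $6 million of in-year revenue, matching the doubling described above.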

Nehal Chokski

Got it. Great. Thank you. And then my last question is that, Mark, you alluded to a more cautious outlook on the macro being part of the $15 million take-down at the high end of the guidance, but said that you are not seeing any impact yet. Why do you think you are not seeing any impact yet?

Mark Bendza

Correct. So what I am distinguishing between there is our core business. Our core business has been very strong through the first half of the year and including in the second quarter as the macro became choppier. So we are not seeing any impact there. I think it’s really just the nature of our portfolio and the customers and markets that we serve.

And then, for the second half, slightly outside of our core business, where we are selling either new solutions or pre-existing solutions to new end markets and customers, we just wanted to put a finer point on that forecast. And again, the net effect is only 80 basis points of year-over-year growth.

Nehal Chokski

Thank you.


Operator

Thank you. And our next question comes from the line of Brad Clark with BMO. Your line is open. Please go ahead.

Brad Clark

Hi. Thanks for taking my question. I want to ask a question about the new business slowdown and how it's reflected in the guide; it's more of a clarification. What I am trying to understand is whether the deals out there are being pushed back by the customers, or whether, from Telos' perspective, given the proposed margin profile, it's just not good business at this point. Basically, I am trying to understand between those two: is it more from the customer side or from Telos' side that new business is being pushed back and delayed? That's it from me. Thank you.

John Wood

So, it depends, Brad. On the government side, it always takes longer than people think, and we have mainly built that into our guide. On the commercial side, I think we are actually having success, but what's happening is customers are starting small and building out over time.

So we landed another commercial customer in this quarter. It started out at a six-digit, if you will, starting place, but we expect it to be more like a seven-digit-plus opportunity for us per year as they roll out Xacta throughout their offerings.

So I would say that on the commercial side, there is more of a try-it-and-buy-it approach: they are going to buy it small and then build out over time. Whereas in the markets where we are more well known, as in the Federal Government, there is some level of doing a pilot, but it's a much more controlled pilot, and it typically has a very, very specific beginning, middle and end. And there the customers will go to an enterprise-wide license more quickly, just based on the reputation that we have.


Operator

Thank you. And we do have a follow-up question from the line of Alex Henderson with Needham & Company. Your line is open. Please go ahead.

Alex Henderson

Great. Thank you very much. So I was hoping you could talk a little bit about what's going on with AWS. A big chunk of the story when you guys came out was that they were going to be reselling it starting at the beginning of this year, and that they thought it was a big driver of acceleration of their services business, yet that doesn't seem to be materializing. Can you talk about what the environment is there and why it's taking so long or not metastasizing?

John Wood

Metastasizing. That's a good word. Thank you, Alex. I think it is taking longer. It is frustrating. They continue to use it internally. There are pockets of the organization that still want to build their own capabilities, and it is moving, but slowly. Whereas, on the other hand, IBM made the decision not to build but buy, using Xacta as their launch partner.

And so, there we have a situation where a service provider is using us in the way that I was hoping the cloud providers were going to use us. It doesn't mean the cloud providers aren't going to get there; it's just that they have not got there yet. They do continue to use us, more and more.

On one of the recent awards we had, where we haven't announced the name, it started out in the intelligence community. They saw the value in the intelligence community, and now they are bringing us into the Department of Defense side of their business. And then, ultimately, we want to be in the commercial world. So each of the cloud providers has looked at it and gone about it in a little bit of a different way.

In the case of Azure, there has been quite a bit of turnover on the security and compliance side of their house, so we have had to sort of start over with Azure. Each cloud provider has a little bit of a story associated with it, but it is frustrating.

Alex Henderson

Similarly, can you talk a bit about the Ghost product and the progress or what’s going on there in terms of commercializing it into a product that’s used outside of the government security infrastructure play?

John Wood

Sure. And actually you made a comment that I'd like to extend a little bit. One of the things that we have learned about Xacta is that it's in the language of the government, and one of the things we have had to do is really change the verbiage, how we describe things that we do inside of Xacta. I will provide you an example.

There is something called a POA&M in the government world. A POA&M doesn't mean anything to the commercial guys; remediation is the commercial equivalent of a POA&M. So we had to make changes in the product itself so that it better reflected what the commercial world wanted, which was also something that we had to build in.

As it relates to Ghost, we have continued progress with JCI offering Ghost as an embedded option with their cameras. Those cameras will, if you will, be hidden on the internet, and their security product sales continue to be a very healthy, growing business.

We expect a small level of sales out of that to happen late this year with this offering. And again, you have had some, not turnover, but promotions over there, so getting it off the ground has just taken longer than we would have liked.

Having said that, there are other organizations that are looking to do very similar things with JCI and we are in the midst of negotiating those -- with those other players and our hope is that we will be able to roll out some other announcements about how we are building that capability inside of these other players.

Now, just to remind you, with advanced cyber analytics, all of that activity is hidden behind Ghost as well. So there are opportunities for us with Ghost, both within our existing customer set, as well as selling through other players.

Alex Henderson

Since we are going around into the second round of questions, I am going to ask one more, if it's okay; if not, then just let me know. But I was hoping you could talk a little bit about the security networking business. It sounds like some projects were pulled forward in that business into the first half, given the favorable timing comment. Does that mean that you are expecting a little less in the back half of the year from Secure Networks?

Mark Griffin

So, no, not in Secure Networks. The dynamic within Secure Networks is that the team there is doing a really terrific job of managing their supply chain risk. When we set guidance, we account for their supply chain risk, and they have been outperforming that risk. So the program management teams there are doing a terrific job and outperforming guidance.

The pull-forward I think that you are referencing is more on the Security Solutions side. We did have a higher-margin order on one program in particular within that business that came into the second quarter that we were otherwise expecting to come more so in the second half. So that's the favorable…

Alex Henderson

Mark, but….

Mark Griffin


Alex Henderson

But if you pulled forward the availability of supply then you deployed products sooner than expected. Doesn’t that come out of your pipeline?

Mark Griffin

I am not sure I understand the question.

Alex Henderson

You have got an order from a government agency to deploy a, I don’t know, choose a location…

Mark Griffin

Oh! I think that lapped….

Alex Henderson

… and you are told you can't deploy because you don't have the product…

Mark Griffin

Alex I think you are ….

Alex Henderson


Mark Griffin

…referring to last year. You are talking about the pull-forward last year in 2021, the pull-forward on the Secure...

Alex Henderson

No, I am not. Mark, I am talking about the current environment. I used that as an example because I can't know specifically which projects are involved. But you have a pipeline of business that you need to deploy gear for in order to get the revenue. If you get the parts sooner than expected…

Mark Griffin


Alex Henderson

…then that does reduce your pipeline into the forward period, correct?

John Wood

That assumes that the pipeline is static, Alex. So the pipeline is not static.

Alex Henderson

Okay. So there’s no erosion in the outlook for the back half of the year within that because of the pull-forward of parts?

John Wood

Not the revenue line.

Mark Griffin


Alex Henderson

Thank you. That’s what I was looking for.

Mark Griffin

Yeah. Thanks, Alex.


Operator

Thank you. And I am showing no further questions, so I would like to turn the conference back over to John Wood for any further remarks.

John Wood

Oh! Thank you very much, Operator. Well, first, I really want to thank our shareholders for your ongoing support. Despite the current environment, I am pleased with our recent performance, and our year-to-date has progressed as we expected. We are taking a balanced approach to the second half, and we remain very focused on delivering for our customers and our shareholders. Again, I just want to say thank you to all of you for listening, and to the analysts for asking questions and covering our stock. Thanks a lot, everybody.


Operator

This concludes today's conference call. Thank you for participating. You may now disconnect. Everyone, have a great day.

InterPro Solutions and Customer Southern Company Win Award

InterPro Solutions is lauded for their innovative Maximo mobile solution and technology leadership.

AUSTIN, Texas, Aug. 09, 2022 (GLOBE NEWSWIRE) -- From MaximoWorld 2022, InterPro Solutions, which offers the first and only suite of mobile solutions designed exclusively for IBM Maximo®, announced today that, along with its client Southern Company, it has been named winner of the MaximoWorld Award for Best Mobility.

InterPro offers the suite of mobile apps built exclusively for Maximo that O&M teams need to do their jobs efficiently and effectively without the cost, complexity, and service impacts of available alternatives. InterPro's EZMaxMobile provides companies with a mobile app that extends and enhances the capabilities of Maximo for technicians in the field, both online and offline.

Southern Company’s Power Delivery division implemented EZMaxMobile in 2021 while simultaneously implementing Maximo. Southern Company’s Power Delivery group provides distribution and transmission of electricity to over nine million customers across multiple states. The Power Delivery team maintains assets such as poles, power lines, and transformers spread across a wide geographic area, which led to Southern Company’s need for a mobile work management tool to accompany Maximo. Southern Company chose EZMaxMobile as their mobile solution because it had the best fit with Maximo.

With EZMaxMobile, Southern Company’s technicians are able to easily perform inspections, oversee work management and make updates to work orders in the field. They are also able to view multi-layer maps that show the location of power lines, transformers, sub-stations, and poles to more easily complete their work. Productivity has increased substantially since they can complete all their work in the field – even in areas without data connectivity when using EZMaxMobile’s patented offline mode.

"This award highlights the Power Delivery group's achievements with Maximo and mobility. We are very proud to be recognized as a leader in our industry with our success using EZMaxMobile to better perform inspections, maintenance, and repairs across our wide geographic area," said Matthew Dudley, Application Analyst at Southern Company.

“We are very honored to receive this MaximoWorld Award in conjunction with Southern Company,” said Bill Fahey, CEO, InterPro Solutions. “This award again recognizes InterPro Solutions as a leader in the Maximo mobile space and showcases our continued commitment to our clients’ success.”

In addition, InterPro’s Jeffrey Smith, Senior Director of Business Process Solutions (formerly Executive Director for Facilities Services at Harvard University), will speak at MaximoWorld 2022 on “Next Generation Mobility for Field Supervisors” discussing how field supervisors are being empowered with next generation work management tools that are able to leverage high value information and processes beyond the bounds of Maximo, on Aug 9 at 3:30pm CST and “New Generation Planning and Scheduling Tools,” exploring how American Electric Power (AEP) solved planning and scheduling challenges across Fleet Maintenance Operations, on Aug 10 at 3:30pm CST.

To learn more about EZMaxMobile and InterPro’s EZMax Suite of products, visit

About Southern Company
Southern Company (NYSE: SO) is a leading energy company serving 9 million customers through its subsidiaries. The company provides clean, safe, reliable and affordable energy through electric operating companies in three states, natural gas distribution companies in four states, a competitive generation company serving wholesale customers across America, a leading distributed energy infrastructure company, a fiber optics network and telecommunications services. Southern Company brands are known for excellent customer service, high reliability and affordable prices below the national average. For more than a century, we have been building the future of energy and developing the full portfolio of energy resources, including carbon-free nuclear, advanced carbon capture technologies, natural gas, renewables, energy efficiency and storage technology. Through an industry-leading commitment to innovation and a low-carbon future, Southern Company and its subsidiaries develop the customized energy solutions our customers and communities require to drive growth and prosperity. Our uncompromising values ensure we put the needs of those we serve at the center of everything we do and govern our business to the benefit of our world. Our corporate culture and hiring practices have been recognized nationally by the U.S. Department of Defense, G.I. Jobs magazine, DiversityInc, Black Enterprise, Forbes and the Women's Choice Award. To learn more, visit

About InterPro Solutions
InterPro Solutions, an IBM Business Partner, offers the first and only suite of mobile Enterprise Asset Management (EAM) solutions designed exclusively for IBM Maximo – using native Maximo rules, permissions and datastores – eliminating double updates, data lags, and synchronization failures. InterPro’s EZMax Suite expands upon native Maximo capabilities to mirror the way people actually work – with intuitive navigation, rapid app response, and rich functionality – allowing operations and maintenance professionals to effectively communicate with their community members and manage tasks, technicians, and vendors in a way that improves responsiveness to their organizations. To learn more, visit

Media contact:
Melissa Tyler
[email protected]

Primary Logo


Amentum Recognized as the Best Maximo Asset Data Governance Program at the 2022 MaximoWorld Conference

AUSTIN, Texas, Aug. 9, 2022 /PRNewswire-PRWeb/ -- Amentum, a premier global government and private-sector partner supporting the most critical missions of government and commercial organizations worldwide, was awarded Best Maximo Asset Data Governance Program at the 2022 MaximoWorld Conference in Austin, TX, today for their work with Interloc Solutions, Inc. (Interloc) in optimizing data and decision making through a progressive and reliability centered data governance strategy. MaximoWorld, hosted by, for over 20 years, has been the largest cross-industry gathering for Maximo users, partners, and subject matter experts.

Through an IBM Maximo Asset Management data governance strategy, elevated and enabled by Interloc's Mobility-first philosophy to EAM, Amentum fosters excellent data quality, asset knowledge, and decision-making for its clients around the world. By employing a program that enhances data quality through real-time visibility and improved inspections and data readings, Amentum increases its asset knowledge, predictive maintenance capabilities, and analysis, giving its clients a proactive edge. Amentum's award-winning approach decreases mean time to repair (MTTR) and increases mean time between failures (MTBF) for its clients' key assets by taking advantage of the quality Maximo data gained via Mobile Informer and analyzing it through a robust data analytics platform.
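The MTTR and MTBF figures referenced above have standard textbook definitions: MTTR is total repair time divided by the number of repairs, and MTBF is operating (up) time divided by the number of failures. As a rough illustration of the calculation (this is not Amentum's actual analytics, and the work-order schema here is invented for the sketch, not Maximo's real data model):

```python
def mttr_mtbf(work_orders, observation_hours):
    """Mean time to repair and mean time between failures for one asset,
    computed from a list of corrective work orders (illustrative schema:
    each dict carries 'repair_hours')."""
    failures = len(work_orders)
    downtime = sum(wo["repair_hours"] for wo in work_orders)
    mttr = downtime / failures                        # average repair duration
    mtbf = (observation_hours - downtime) / failures  # average uptime per failure
    return mttr, mtbf

# One year in service (8,760 hours) with four failures totalling 40 repair hours:
history = [{"repair_hours": h} for h in (10, 5, 20, 5)]
mttr, mtbf = mttr_mtbf(history, 8_760)
# mttr = 10.0 hours, mtbf = 2180.0 hours
```

Better field data quality moves these metrics in the directions described: accurate repair timestamps shrink measured MTTR, and predictive maintenance stretches the interval between failures.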

Amentum's emphasis on mobility has also resulted in significant gains for its clients' sustainability initiatives. Mobile Informer's ability to eliminate reliance on paper-based procedures and immensely improve data collection and quality led to one client in particular saving tens of thousands of dollars in annual paper, toner, and labor costs, as well as a nearly 20,000 lb reduction in annual CO2 emissions.

Data drives decision-making, and thanks to powerful capabilities and an innovative and progressive approach to Maximo, Amentum is making the best possible decisions based on the highest quality data for its clients around the world.

About Amentum

Headquartered in Germantown, Md., Amentum is a premier global services partner supporting critical programs of national significance across defense, security, intelligence, energy, commercial and environmental sectors. Amentum employs approximately 57,000 people on all seven continents and draws from a century-old heritage of operational excellence, mission focus, and successful execution. Amentum's reliability-centered and data-driven approach to asset management has proven successful at critical industrial and manufacturing facilities across various industries and facilities, such as pharmaceutical, life sciences, heavy industrial manufacturing, chemical refinement, aviation, automotive production, data centers, consumer production, industrial production, and more.

Learn more about Amentum at .

About Interloc Solutions

Since 2005, Interloc Solutions, an IBM Gold Business Partner and the largest independent IBM Maximo Enterprise Asset Management systems integrator in North America, has been helping clients and partners realize the greatest potential from their Maximo investment, providing application hosting, innovative consulting, and managed services. Interloc has enhanced the implementation and adoption of Maximo through its transformative Mobile Informer solution, which is currently in use across a wide range of disciplines and industries— including U.S. Federal Agencies, Utilities, Transportation, Airport Operations, Manufacturing, Healthcare, and the Oil and Gas.

As a consulting organization of highly qualified technology and maintenance professionals, experienced in all versions of Maximo, Interloc excels in delivering comprehensive, best-practice Maximo EAM consulting services and mobile solutions.

Learn more about Interloc's award-winning services and solutions at .

Media Contact

Scott Peluso, Interloc Solutions, 9168174590,

SOURCE Interloc Solutions

© 2022 Benzinga does not provide investment advice. All rights reserved.

Ad Disclosure: The rate information is obtained by Bankrate from the listed institutions. Bankrate cannot guaranty the accuracy or availability of any rates shown above. Institutions may have different rates on their own websites than those posted on The listings that appear on this page are from companies from which this website receives compensation, which may impact how, where, and in what order products appear. This table does not include all companies or all available products.

All rates are subject to change without notice and may vary depending on location. These quotes are from banks, thrifts, and credit unions, some of whom have paid for a link to their own Web site where you can find additional information. Those with a paid link are our Advertisers. Those without a paid link are listings we obtain to Strengthen the consumer shopping experience and are not Advertisers. To receive the rate from an Advertiser, please identify yourself as a Bankrate customer. Bank and thrift deposits are insured by the Federal Deposit Insurance Corp. Credit union deposits are insured by the National Credit Union Administration.

Consumer Satisfaction: Bankrate attempts to verify the accuracy and availability of its Advertisers' terms through its quality assurance process and requires Advertisers to agree to our Terms and Conditions and to adhere to our Quality Control Program. If you believe that you have received an inaccurate quote or are otherwise not satisfied with the services provided to you by the institution you choose, please click here.

Rate collection and criteria: Click here for more information on rate collection and criteria.

IBM Acquires Databand.ai to Boost Data Observability Capabilities

IBM is acquiring Databand.ai, a leading provider of data observability software that helps organizations fix issues with their data, including errors, pipeline failures, and poor quality. The acquisition further strengthens IBM's software portfolio across data, AI, and automation to address the full spectrum of observability. Databand.ai is IBM's fifth acquisition in 2022 as the company continues to bolster its hybrid cloud and AI skills and capabilities. Databand.ai's open and extendable approach allows data engineering teams to easily integrate and gain observability into their data infrastructure.

This acquisition will unlock more resources for Databand.ai to expand its observability capabilities for broader integrations across more of the open source and commercial solutions that power the modern data stack.

Enterprises will also have full flexibility in how to run Databand.ai, whether as software-as-a-service (SaaS) or a self-hosted software subscription.

The acquisition of Databand.ai builds on IBM's research and development investments as well as strategic acquisitions in AI and automation. By using Databand.ai with IBM Observability by Instana APM and IBM Watson Studio, IBM is well-positioned to address the full spectrum of observability across IT operations.

"Our clients are data-driven enterprises who rely on high-quality, trustworthy data to power their mission-critical processes. When they don't have access to the data they need in any given moment, their business can grind to a halt," said Daniel Hernandez, general manager for data and AI, IBM. "With the addition of Databand.ai, IBM offers the most comprehensive set of observability capabilities for IT across applications, data and machine learning, and is continuing to provide our clients and partners with the technology they need to deliver trustworthy data and AI at scale."

The acquisition of Databand.ai further extends IBM's existing data fabric solution by helping ensure that the most accurate and trustworthy data is being put into the right hands at the right time, no matter where it resides.

Headquartered in Tel Aviv, Israel, Databand.ai employees will join IBM Data and AI, further building on IBM's growing portfolio of Data and AI products, including its IBM Watson capabilities and IBM Cloud Pak for Data. Financial details of the deal were not disclosed. The acquisition closed on June 27, 2022.

For more information about this news, visit
