Free P8010-034 PDF and VCE at

Make sure that you have IBM P8010-034 exam dumps of actual questions for the Tealeaf Technical Mastery Test v1 practice questions before you choose to take the real test. We give the most up-to-date and valid P8010-034 questions and answers, which contain P8010-034 real exam questions. We have collected and built a database of P8010-034 exam questions from actual exams, with the specific end goal of giving you an opportunity to prepare and pass the P8010-034 exam on the first try. Simply memorize our P8010-034

Exam Code: P8010-034 Practice exam 2022 by team
Tealeaf Technical Mastery Test v1
IBM Technical learner
Killexams : IBM Technical learner - BingNews

Killexams : Data Science - the new kid on the block

According to a report by the US Bureau of Labour Statistics, the rise of data science will create roughly 11.5 million job openings by 2026. Data science has also topped LinkedIn’s Emerging Jobs ...

Sat, 16 Jul 2022, en-in

Killexams : IBM extends Power10 server lineup for enterprise use cases

We are excited to bring Transform 2022 back in-person July 19 and virtually July 20 - 28. Join AI and data leaders for insightful talks and exciting networking opportunities. Register today!

IBM is looking to grow its enterprise server business with the expansion of its Power10 portfolio announced today.

IBM Power is a RISC (reduced instruction set computer) based chip architecture that is competitive with other chip architectures, including x86 from Intel and AMD. IBM’s Power hardware has been used for decades to run IBM’s AIX Unix operating system, as well as the IBM i operating system that was once known as the AS/400. In more recent years, Power has increasingly been used for Linux, and specifically in support of Red Hat and its OpenShift Kubernetes platform, which enables organizations to run containers and microservices.

The IBM Power10 processor was announced in August 2020, with the first server platform, the E1080 server, coming a year later in September 2021. Now IBM is expanding its Power10 lineup with four new systems, including the Power S1014, S1024, S1022 and E1050, which are being positioned by IBM to help solve enterprise use cases, including the growing need for machine learning (ML) and artificial intelligence (AI).

What runs on IBM Power servers?

Usage of IBM’s Power servers could well be shifting into territory that Intel today still dominates.



Steve Sibley, VP of IBM Power product management, told VentureBeat that approximately 60% of Power workloads currently run AIX Unix. The IBM i operating system accounts for approximately 20% of workloads. Linux makes up the remaining 20% and is on a growth trajectory.

IBM owns Red Hat, which has its namesake Linux operating system supported on Power, alongside the OpenShift platform. Sibley noted that IBM has optimized its new Power10 system for Red Hat OpenShift.

“We’ve been able to demonstrate that you can deploy OpenShift on Power at less than half the cost of an Intel stack with OpenShift because of IBM’s container density and throughput that we have within the system,” Sibley said.
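Deploying OpenShift on Power in practice means steering container workloads onto Power (ppc64le) worker nodes. A minimal, hypothetical sketch of how that is typically done with a standard Kubernetes node selector follows; the field names are standard Kubernetes, but the deployment and image names are illustrative, not from the article:

```python
import json

# Hypothetical Deployment manifest, built as a Python dict. The
# "kubernetes.io/arch" label is a standard well-known Kubernetes label;
# setting it to "ppc64le" schedules pods only onto Power nodes.
deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "demo-app"},
    "spec": {
        "replicas": 2,
        "selector": {"matchLabels": {"app": "demo-app"}},
        "template": {
            "metadata": {"labels": {"app": "demo-app"}},
            "spec": {
                # Pin the workload to Power (ppc64le) worker nodes.
                "nodeSelector": {"kubernetes.io/arch": "ppc64le"},
                "containers": [
                    {"name": "web", "image": "registry.example/demo:latest"}
                ],
            },
        },
    },
}

manifest_json = json.dumps(deployment, indent=2)
```

The same manifest would normally be written as YAML and applied with `oc apply` or `kubectl apply`; the dict form above just makes the structure explicit.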

A look inside IBM’s four new Power servers

Across the new servers, the ability to access more memory at greater speed than previous generations of Power servers is a key feature. The improved memory is enabled by support of the Open Memory Interface (OMI) specification that IBM helped to develop, and is part of the OpenCAPI Consortium.

“We have Open Memory Interface technology that provides increased bandwidth but also reliability for memory,” Sibley said. “Memory is one of the common areas of failure in a system, particularly when you have lots of it.”

The new servers announced by IBM all use technology from the open-source OpenBMC project that IBM helps to lead. OpenBMC provides secure code for managing the baseboard of the server in an optimized approach for scalability and performance.


Among the new servers announced today by IBM is the E1050, which is a 4RU (4 rack unit) sized server, with 4 CPU sockets, that can scale up to 16TB of memory, helping to serve large data- and memory-intensive workloads.

S1014 and S1024

The S1014 and the S1024 are also both 4RU systems, with the S1014 providing a single CPU socket and the S1024 integrating a dual-socket design. The S1014 can scale up to 2TB of memory, while the S1024 supports up to 8TB.


Rounding out the new servers is the S1022, which is a 1RU server that IBM is positioning as an ideal platform for OpenShift container-based workloads.
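The lineup described above can be summarized programmatically. The specs below are the ones stated in this article (fields the article does not state are marked `None`), and the helper function is purely illustrative:

```python
# Power10 server specs as stated in the article; None marks details
# the article does not give for a model.
POWER10_SERVERS = {
    "E1050": {"rack_units": 4, "sockets": 4, "max_memory_tb": 16},
    "S1014": {"rack_units": 4, "sockets": 1, "max_memory_tb": 2},
    "S1024": {"rack_units": 4, "sockets": 2, "max_memory_tb": 8},
    "S1022": {"rack_units": 1, "sockets": None, "max_memory_tb": None},
}

def servers_with_min_memory(min_tb):
    """Return names of servers whose stated max memory meets a requirement."""
    return sorted(
        name
        for name, spec in POWER10_SERVERS.items()
        if spec["max_memory_tb"] is not None and spec["max_memory_tb"] >= min_tb
    )
```

For example, `servers_with_min_memory(8)` picks out the S1024 and E1050 as the models suited to the largest memory-intensive workloads.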

Bringing more Power to AI and ML

AI and ML workloads are a particularly good use case for all the Power10 systems, thanks to optimizations that IBM has built into the chip architecture.

Sibley explained that all Power10 chips benefit from IBM’s Matrix Math Accelerator (MMA) capability. The enterprise use cases that Power10-based servers can help to support include organizations that are looking to build out risk analytics, fraud detection and supply chain forecasting AI models, among others.
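On-chip matrix engines like Power10's MMA accelerate dense multiply-accumulate work, the core operation behind most ML inference. The plain Python below (no IBM-specific APIs; purely illustrative) shows the operation such a unit offloads:

```python
def matmul(a, b):
    """Naive dense matrix multiply: the repeated multiply-accumulate
    pattern that on-chip matrix engines accelerate in hardware."""
    rows, inner, cols = len(a), len(b), len(b[0])
    assert all(len(row) == inner for row in a), "inner dimensions must match"
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for k in range(inner):
            aik = a[i][k]
            for j in range(cols):
                out[i][j] += aik * b[k][j]  # one multiply-accumulate step
    return out
```

Frameworks like PyTorch and TensorFlow never run loops like this in Python, of course; the point is that the same arithmetic, dispatched to an MMA unit, is what CPU-side inference speeds up.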

IBM’s Power10 systems support and have been optimized for multiple popular open-source machine learning frameworks including PyTorch and TensorFlow.

“The way we see AI emerging is that a vast majority of AI in the future will be done on the CPU from an inference standpoint,” Sibley said.

VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Learn more about membership.

By Sean Michael Kerner, Mon, 11 Jul 2022
Killexams : Top 10 data lake solution vendors in 2022


As the world becomes increasingly data-driven, businesses must find suitable solutions to help them achieve their desired outcomes. Data lake storage has garnered the attention of many organizations that need to store large amounts of unstructured, raw information until it can be used in analytics applications.

The data lake solution market is expected to grow rapidly in the coming years and is driven by vendors that offer cost-effective, scalable solutions for their customers.

Learn more about data lake solutions, what key features they should have and some of the top vendors to consider this year. 

What is a data lake solution?

A data lake is defined as a single, centralized repository that can store massive amounts of unstructured and semi-structured information in its native, raw form. 



It’s common for an organization to store unstructured data in a data lake if it hasn’t decided how that information will be used. Some examples of unstructured data include images, documents, videos and audio. These data types are useful in today’s advanced machine learning (ML) and advanced analytics applications.

Data lakes differ from data warehouses, which store structured, filtered information for specific purposes in files or folders. Data lakes were created in response to some of the limitations of data warehouses. For example, data warehouses are expensive and proprietary, cannot handle certain business use cases an organization must address, and may lead to unwanted information homogeneity.

On-premise data lake solutions were commonly used before the widespread adoption of the cloud. Now, it’s understood that some of the best hosts for data lakes are cloud-based platforms, because of their inherent scalability and highly modular services.

A 2019 report from the Government Accountability Office (GAO) highlights several business benefits of using the cloud, including better customer service and the acquisition of cost-effective options for IT management services.

Cloud data lakes and on-premise data lakes have pros and cons. Businesses should consider cost, scale and available technical resources to decide which type is best.

Read more about data lakes: What is a data lake? Definition, benefits, architecture and best practices

5 must-have features of a data lake solution

It’s critical to understand what features a data lake offers. Most solutions come with the same core components, but each vendor may have specific offerings or unique selling points (USPs) that could influence a business’s decision.

Below are five key features every data lake should have:

1. Various interfaces, APIs and endpoints

Data lakes that offer diverse interfaces, APIs and endpoints make it much easier to upload, access and move information. These capabilities are important because they let unstructured data serve a wide range of use cases, depending on a business’s desired outcome.

2. Support for or connection to processing and analytics layers

ML engineers, data scientists, decision-makers and analysts benefit most from a centralized data lake solution that stores information for easy access and availability. This characteristic can help data professionals and IT managers work with data more seamlessly and efficiently, thus improving productivity and helping companies reach their goals.

3. Robust search and cataloging features

Imagine a data lake with large amounts of information but no sense of organization. A viable data lake solution must incorporate generic organizational methods and search capabilities, which provide the most value for its users. Other features might include key-value storage, tagging, metadata, or tools to classify and collect subsets of information.
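As a sketch of the cataloging features described above, here is a toy catalog: objects stored by key with free-form metadata, plus a tag index for search. This is illustrative only; production lakes rely on dedicated services (e.g., AWS Glue or a Hive metastore) for this job.

```python
class LakeCatalog:
    """Toy data lake catalog: key-value object metadata plus a tag index."""

    def __init__(self):
        self._objects = {}  # object key -> metadata dict
        self._by_tag = {}   # tag -> set of object keys

    def put(self, key, metadata=None, tags=()):
        """Register an object with optional metadata and tags."""
        self._objects[key] = dict(metadata or {})
        for tag in tags:
            self._by_tag.setdefault(tag, set()).add(key)

    def metadata(self, key):
        """Return the metadata recorded for an object."""
        return self._objects[key]

    def find_by_tag(self, tag):
        """Return all object keys carrying a tag, in stable sorted order."""
        return sorted(self._by_tag.get(tag, ()))
```

Even a minimal index like this is what turns a pile of raw files into something analysts can actually query and subset.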

4. Security and access control

Security and access control are two must-have features with any digital tool. The current cybersecurity landscape is expanding, making it easier for threat actors to exploit a company’s data and cause irreparable damage. Only certain users should have access to a data lake, and the solution must have strong security to protect sensitive information.

5. Flexibility and scalability

More organizations are growing larger and operating at a much faster rate. Data lake solutions must be flexible and scalable to meet the ever-changing needs of modern businesses working with information.

Also read: Unlocking analytics with data lake and graph analysis

Top 10 data lake solution vendors in 2022

Some data lake solutions are best suited for businesses in certain industries. In contrast, others may work well for a company of a particular size or with a specific number of employees or customers. This can make choosing a potential data lake solution vendor challenging. 

Companies considering investing in a data lake solution this year should check out some of the vendors below.

1. Amazon Web Services (AWS)

The AWS Cloud provides many essential tools and services that allow companies to build a data lake that meets their needs. The AWS data lake solution is widely used, cost-effective and user-friendly. It leverages the security, durability, flexibility and scalability that Amazon S3 object storage offers to its users. 

The data lake also features Amazon DynamoDB to handle and manage metadata. AWS data lake offers an intuitive, web-based console user interface (UI) to manage the data lake easily. It also forms data lake policies, removes or adds data packages, creates manifests of datasets for analytics purposes, and lets users search data packages.

2. Cloudera

Cloudera is another top data lake vendor that will create and maintain safe, secure storage for all data types. Some of Cloudera SDX’s Data Lake Service capabilities include:

  • Data schema/metadata information
  • Metadata management and governance
  • Compliance-ready access auditing
  • Data access authorization and authentication for improved security

Other benefits of Cloudera’s data lake include product support, downloads, community and documentation. GSK and Toyota leveraged Cloudera’s data lake to garner critical business intelligence (BI) insights and manage data analytics processes.

3. Databricks 

Databricks is another viable vendor, and it also offers a handful of data lake alternatives. The Databricks Lakehouse Platform combines the best elements of data lakes and warehouses to provide reliability, governance, security and performance.

Databricks’ platform helps break down the silos that normally separate and complicate data, which frustrates data scientists, ML engineers and other IT professionals. Aside from the platform, Databricks also offers its Delta Lake solution, an open-format storage layer that can improve data lake management processes.

4. Domo

Domo is a cloud-based software company that can provide big data solutions to all companies. Users have the freedom to choose a cloud architecture that works for their business. Domo is an open platform that can augment existing data lakes, whether it’s in the cloud or on-premise. Users can use combined cloud options, including:

  • Choosing Domo’s cloud
  • Connecting to any cloud data
  • Selecting a cloud data platform

Domo offers advanced security features, such as BYOK (bring your own key) encryption, control data access and governance capabilities. Well-known corporations such as Nestle, DHL, Cisco and Comcast leverage the Domo Cloud to better manage their needs.

5. Google Cloud

Google is another big tech player offering customers data lake solutions. Companies can use Google Cloud’s data lake to analyze any data securely and cost-effectively. It can handle large volumes of information and IT professionals’ various processing tasks. Companies that don’t want to rebuild their on-premise data lakes in the cloud can easily lift and shift their information to Google Cloud. 

Some key features of Google’s data lakes include Apache Spark and Hadoop migration, which are fully managed services, integrated data science and analytics, and cost management tools. Major companies like Twitter, Vodafone, Pandora and Metro have benefited from Google Cloud’s data lakes.

6. HP Enterprise

Hewlett Packard Enterprise (HPE) is another data lake solution vendor that can help businesses harness the power of their big data. HPE’s solution is called GreenLake — it offers organizations a truly scalable, cloud-based solution that simplifies their Hadoop experiences. 

HPE GreenLake is an end-to-end solution that includes software, hardware and HPE Pointnext Services. These services can help businesses overcome IT challenges and spend more time on meaningful tasks. 

7. IBM

Business technology leader IBM also offers data lake solutions for companies. IBM is well-known for its cloud computing and data analytics solutions, and it’s a great choice if an organization is looking for a suitable data lake solution. IBM’s cloud-based approach operates on three key principles: embedded governance, automated integration and virtualization.

These are some data lake solutions from IBM: 

  • IBM Db2
  • IBM Db2 BigSQL
  • IBM Netezza
  • IBM Watson Query
  • IBM Watson Knowledge Catalog
  • IBM Cloud Pak for Data

With so many data lakes available, there’s surely one to fit a company’s unique needs. Financial services, healthcare and communications businesses often use IBM data lakes for various purposes.

8. Microsoft Azure

Microsoft offers its Azure Data Lake solution, which features easy storage methods, processing, and analytics using various languages and platforms. Azure Data Lake also works with a company’s existing IT investments and infrastructure to make IT management seamless.

The Azure Data Lake solution is affordable, comprehensive, secure and supported by Microsoft. Companies benefit from 24/7 support and expertise to help them overcome any big data challenges they may face. Microsoft is a leader in business analytics and tech solutions, making it a popular choice for many organizations.

9. Oracle

Companies can use Oracle’s Big Data Service to build data lakes to manage the influx of information needed to power their business decisions. The Big Data Service is automated and will provide users with an affordable and comprehensive Hadoop data lake platform based on Cloudera Enterprise. 

This solution can be used as a data lake or an ML platform. Another important feature of Oracle is it is one of the best open-source data lakes available. It also comes with Oracle-based tools to add even more value. Oracle’s Big Data Service is scalable, flexible, secure and will meet data storage requirements at a low cost.

10. Snowflake

Snowflake’s data lake solution is secure, reliable and accessible, and helps businesses break down silos to improve their strategies. The top features of Snowflake’s data lake include a central platform for all information, fast querying and secure collaboration.

Siemens and Devon Energy are two companies that provide testimonials regarding Snowflake’s data lake solutions and offer positive feedback. Another benefit of Snowflake is its extensive partner ecosystem, including AWS, Microsoft Azure, Accenture, Deloitte and Google Cloud.

The importance of choosing the right data lake solution vendor 

Companies that spend extra time researching which vendors will offer the best enterprise data lake solutions for them can manage their information better. Rather than choose any vendor, it’s best to consider all options available and determine which solutions will meet the specific needs of an organization.

Every business uses information, some more than others. However, the world is becoming highly data-driven — therefore, leveraging the right data solutions will only grow more important in the coming years. This list will help companies decide which data lake solution vendor is right for their operations.

Read next: Get the most value from your data with data lakehouse architecture


By Shannon Flynn, Fri, 15 Jul 2022
Killexams : IBM Power Systems Academy Takes Charge at 300 Institutions

IBM's Power Systems Academic Initiative reached a milestone in February: More than 300 higher education institutions worldwide have signed on to the program.

The initiative delivers access to new technologies that help students dive deep into lessons on Big Data, cloud, mobile and social. Higher education coursework based on IBM's Power Systems gives students a wide range of experiences. IBM's Power Systems servers are used across several industries. The company's Watson computer is even based on these machines and is forging a new path for IBM in the realm of cognitive computing.

Students learning through Power Systems will be oriented to Linux-, IBM i- and AIX-based operating systems. Participating schools are offered coursework, software, access to technical libraries and experts, along with remote, virtual access to a Power Systems environment, according to an IBM news release.

“Linux and other open innovation platforms have become a primary source of development in today’s technology marketplace, and companies are looking to colleges and universities to produce a workforce equipped with the skills required to innovate in these environments,” Terri Virnig, vice president of Power Systems ecosystem and strategy at IBM, said in the news release.

The company's initiative has grown 152 percent since 2012 and is now being used in more than 300 schools around the world, including more than 150 in the U.S.

A graduate-level course in enterprise data management is taught at New York University's Polytechnic School of Engineering using IBM's approach. Raman Kannan, an adjunct professor of technology management at the school, said the academic initiative allows students the chance to "focus on the techniques and principles instead of infrastructure." Learn more about IBM’s Power Systems Academic Initiative on the official website.

By D. Frank Smith, Sun, 26 Jun 2022
Killexams : Digital Realty: 4% Dividend With Huge Data Center Growth


The pandemic has caused an acceleration in the adoption of Digital technology. It made digital services a "must have" and not just an optional extra. From Mobile Phones to Ecommerce, Social Media to Video Conferencing, this technology has become part of our everyday life. When we use digital technology, it just "works" but we don't think about the gigabytes of data flowing from Data Center to Data Center all over the world.

However, the innovation doesn't stop there: from 5G to AI, Internet of Things (IoT) to the Cloud, all these services require even more infrastructure to function at scale. Picking the winning AI or 5G company can be a challenge, and thus I prefer to invest in the "backbone" digital infrastructure which supports these players. Think of this like a "Digital Toll Road" collecting rents and dishing out dividends while the tech companies battle it out for market share. "Digital Transformation" is also a key term which has become popular. Enterprises are realising their legacy IT infrastructure is costly, complicated and not flexible, and thus many are moving to the "cloud", which is a fancy word for a Data Center.

"Looking toward 2023, most companies will need to build new digital businesses to stay economically viable." - McKinsey Global Survey, May 2021

The global data center market size was worth $216 billion in 2021 and is predicted to reach $288 billion by 2027, growing at a CAGR of 4.95%.
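The quoted growth rate can be sanity-checked with a one-line compound annual growth rate (CAGR) calculation:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by two endpoint values."""
    return (end / start) ** (1 / years) - 1

# Endpoints cited above: $216B in 2021 growing to $288B by 2027 (6 years).
implied = cagr(216, 288, 6)  # roughly 4.9%, in line with the cited ~4.95% CAGR
```

The small gap between the implied ~4.9% and the cited 4.95% presumably comes from rounding in the underlying market figures.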

Thus, in this report, I'm going to dive into Digital Realty Trust (NYSE:DLR), a best-in-class owner of Data Centers with over 290 facilities globally. The company acts as the backbone infrastructure for many larger technology providers, from Facebook to Oracle. The stock price has plummeted 30% from the highs in December 2021 and is now trading at the March 2020 lows. From my valuation, the stock is undervalued relative to two large competitors in the industry. Let's dive into the Business Model, Financials and Valuation for the juicy details.

DLR stock Chart
Data by YCharts

Business Model

Digital Realty Trust serves a vast customer base of over 4,000 customers, diversified across multiple industries from Social Media to IT, Finance and Telecoms. Its largest customers include giants such as Facebook, Oracle, Verizon, IBM, JPMorgan, LinkedIn, AT&T and many more. These established customers demand best-in-class speed, redundancy and security.

Digital Realty overview

Digital Realty (Investor Report June 2022)

The company's facilities are well diversified globally across 25 countries and 50 metro areas. With 58% of its facilities in North America, 27% in EMEA, 10% in APC and 5% in Latin America. This global and customer diversification should help to ensure stable cash flows, as when one country or industry is going through a recession, others will be thriving.

Global Diversification

Global Diversification (Investor presentation 2022)

Digital Realty also has a variety of equipment offerings which can appeal to various customers at different price points and stages of their digital transformation journey. From the Network Access Nodes to Aggregation and vast server farms for Hyper-scalers.

Digital Realty - connected campus

Technical details (Investor presentation 2022)

An example of a customer use case can be seen on the animation below. Here a "Self Service" option enables Coverage, Connectivity and Capacity to be dynamically controlled. While its "Any To Any" Interconnection enables low latency data transfer between multiple services.

Digital Realty - Platform Digital 2.0

Platform Digital (Investor Presentation 2022)

The Data Hub is another high-growth area, as legacy companies historically have an issue with "siloed data". Companies are generating more "big data" than ever, but it's usually stuck in various departments and not utilised. However, by moving data services to the cloud, it can be aggregated and analyzed with various analytics and machine learning processes more easily. For instance, Hewlett Packard Enterprise (HPE) and its GreenLake technology have recently partnered with Digital Realty to help companies with digital transformation and data aggregation.

Digital Realty Data Hub

Data Hub (Investor presentation June 2022)

To expand its portfolio, Digital Realty is growing via acquisition and has made a vast number of global investments over the past few years. A few years back I wrote a post on the "consolidation" of the data center industry and the "land grab" tactics companies are using, and now it seems these are playing out.

Digital Realty Acquisitions

Digital Realty Acquisitions (Investor report 2022)

Growing Financials

Digital Realty has experienced strong growth in bookings over the past few years, with a major upwards trend seen since the pandemic, which acted as a catalyst for companies to "Digitally transform".

Digital Realty Bookings

Bookings (Investor presentation 2022)

Revenue saw a sharp uptick at the end of 2021 and for the first quarter of 2022, it popped to $1.19 billion, up 11% year over year.

Digital Realty revenue estimates
Data by YCharts

Funds from operations (FFO) per share saw a slight downtick between the high of $1.78 in Q221 and $1.54 in Q321. However, it has since started to recover and was up 6.6% year over year to $1.60 for Q122. Management expects a further uptick for the rest of 2022, due to a record backlog of $391 million in the first quarter.

Digital Realty FFO per share
Data by YCharts

Digital Realty has had a strong retention rate of ~78% historically, which is fantastic as it means tenants are finding immense value in the service. Transforming legacy IT infrastructure to the cloud is a technical, time-consuming and costly process. Thus, I believe that once a customer is installed and set up with a cloud provider, moving providers (even if another is slightly cheaper) is rare. Immense "stickiness" is therefore seen in the industry, as you can see with the high retention rate. There has been a slight dip in the trailing 12-month rate, but it does look to be recovering.

Digital Realty Retention

Retention (Earnings Q122)

Digital Realty has staggered lease expiration dates, with 17.8% expiring in 2022 and 18.3% in 2023, so I expect some volatility in the next two years. However, the remaining lease expiration dates are well diversified across many years ahead, which should help ensure stable operations.

Digital Realty Lease Expirations

Lease Expirations (Investor presentation)

The Enterprise value of Digital Realty has increased by 47% since 2018, from $35.5 billion to $52.1 billion, which is a testament to the company's acquisition strategy. The top 20 tenant concentration has also decreased by 430 bps to 49%, which is a positive sign as it means more diversification.

Digital Realty financials

Digital Realty (Investor Presentation June 2022)

REITs are well-known for having a large amount of debt, and Digital Realty is no different, with $14.4 billion in long-term debt. The good news is the Net Debt/Adjusted EBITDA ratio has decreased to 5.9x from 6.2x in 2018, which is a positive sign.

The forward dividend yield is steady at 4%. Don't be alarmed by the very high 97% payout ratio, as by law all REITs must pay out at least 90% of their taxable income.

Digital Realty Dividend Yield

Dividend Yield (Investor presentation)


Valuation

In order to value this REIT, I will compare the price to funds from operations (P/FFO) across a few data center REITs in the industry. As you can see from the chart I created below, Digital Realty is the cheapest data center REIT of the three, with a P/FFO = 17.7. By comparison, Equinix (EQIX) has a P/FFO = 35 and CyrusOne (CONE) has a P/FFO = 21.8.

Data Center REITs valuation

Data Center REIT Valuation (created by author Ben at Motivation 2 Invest)
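The comparison above can be reproduced in a few lines. The multiples are the ones quoted in this article; the `implied_price` helper is a generic illustration (not from the article) of how a P/FFO multiple maps to a share price:

```python
# P/FFO multiples quoted above; on this metric, a lower multiple is cheaper.
P_TO_FFO = {"Digital Realty": 17.7, "CyrusOne": 21.8, "Equinix": 35.0}

# The name with the lowest multiple is the cheapest on a P/FFO basis.
cheapest = min(P_TO_FFO, key=P_TO_FFO.get)

def implied_price(ffo_per_share, multiple):
    """Share price implied by an FFO multiple: price = FFO/share * P/FFO."""
    return ffo_per_share * multiple
```

For instance, at a hypothetical $6.40 of annual FFO per share, a 17.7x multiple would imply a price of about $113.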

As a general comparison, Digital Realty trades at an average valuation relative to the entire real estate sector. However, I prefer to use the first valuation as the entire real estate sector includes Commercial office buildings, which are trading at cheap multiples.

Digital Realty Valuation

Digital Realty Valuation (Seeking Alpha)


Jim Chanos Short Sell Thesis

Infamous short seller Jim Chanos is raising capital to bet against data center REITs. In an interview with the Financial Times on June 29th, Chanos believes the value of the cloud is going to the hyperscalers (Amazon Web Services, Microsoft Azure, Google Cloud) and they will build their own. Thus, he believes the legacy data centers will not be utilized as much. This is an interesting point, but one in which Wells Fargo Analyst, Eric Luebchow, calls "misguided."

He makes the point that hyperscalers are struggling to build due to long lead times for equipment and power. He notes that the big cloud providers "outsource up to 60% of their capacity demands." Thus, the short selling thesis by Chanos is contrary to the facts on the ground. However, it is still a risk to be aware of.

Note: Jim Chanos risk from my recent post on DigitalBridge.

Recession/IT spending cutbacks

Analysts are predicting a "shallow but long" recession, which is forecasted to start in the fourth quarter of 2022. Enterprises may decide to cut back or delay IT spending due to rising input costs and increasing uncertainty. This could mean some volatility is expected within the next year.

Final Thoughts

Digital Realty is a tremendous REIT which provides the backbone infrastructure for many large-scale cloud providers. The REIT's high Funds from Operations and industry diversified Data Centers offer both quality and stability. The stock is undervalued relative to two large Data Center REITs and thus looks to be a great investment for the long term.

Sun, 17 Jul 2022
Killexams : The Evolving Role Of CISOs

Sriram Tarikere is a Cybersecurity Executive Leader with Alvarez & Marsal, New York, with more than a decade of experience in the field.

Traditionally, the role of a chief information security officer (CISO) has been to monitor information security both digitally and physically. Their technical skills were enough to help them grow and excel in their roles. But as the global landscape kept evolving with the innovation of new technologies and tools, the cyber world has become immensely complex.

The world we now live in is full of challenges and cybersecurity threats. Numerous high-profile data breaches and ransomware attacks have cost millions of dollars to American businesses. The onset of the Covid-19 pandemic and organizations' rapid shift to working from home and remote work models have only aggravated cybersecurity threats.

Cyber attackers are now trying to cash in on the opportunity, and the lapse in cyber hygiene is acting as a catalyst. IBM's Cost of a Data Breach Report 2021 revealed that the average cost of a data breach in the U.S. alone is $9.05 million, while the average total cost of a data breach worldwide is $4.24 million.

The Evolving Role Of CISOs

Cyber attacks have become more sophisticated today, and cybercrimes like the SolarWinds and Microsoft supply chain attacks are a testament to it. Despite investing heavily in cybersecurity, companies such as CNA Insurance, Kronos Group and Kaseya, among many others, have suffered major ransomware attacks in the recent past.

Such cybersecurity challenges have forced the expansion of the role of CISOs beyond their traditional responsibilities. Today, CISOs must be strategic thinkers, decision makers, influencers and much more. They can no longer rely on technical knowledge alone to respond to cyberattacks of the magnitude we see today, nor is cybersecurity the concern of only the information technology teams anymore.

The scale of ransomware attacks seen in the recent past has made cybersecurity a company-wide concern. As a result, the role of a CISO has changed drastically from what it used to be.

The Modern Role Of A CISO

A CISO is now more involved in the overall cyber risk management of the company, mitigation of risks and the decision-making process. The CISO is now closely aligned with C-level executives and the board of directors to keep them informed about cybersecurity risks and initiatives to mitigate the threat. The board of directors has become increasingly cyber aware and expects the CISO to present the organization's cybersecurity posture to them more frequently than ever.

Today, technical skills alone are not enough for a CISO. They are expected to have a strategic vision and a broader perspective on what's happening in the cybersecurity space. A CISO doesn't need to be a technology expert but should be aware of all the latest technologies and security areas that can impact the overall business.

Modern-Day CISO Expectations

Modern-day CISOs need a lot more than a solid technical foundation. Cyberdegrees notes that a candidate will likely need a degree in computer science and experience in a management role.

Candidates with an MBA or business background also tend to have better employment opportunities, as the role involves management and advising company leadership on cybersecurity. They need good communication skills because they play a critical role in communicating risks and threats to other C-level executives and the board.

Regulatory Changes And The Role Of A CISO

As regulatory regimes like NYDFS, CCPA, GDPR and FedRAMP have come into existence, the responsibilities of a CISO have grown as well. Some of them include:

Cyber Intelligence

A CISO should be aware of all the latest cybersecurity threats. They should evaluate and gain visibility based on the following parameters.

• Who are the cyber hackers who might be interested in your data? What is their history? What kind of attacks have they performed in the past?

• Why are these groups interested in stealing your data? Can they cause productivity loss, monetary loss, reputation loss or all of them?

• Classify the nature of the data. What are they trying to steal?

• How are they going to find you and access your data? What kind of tools or strategies have they used in the past that they can use against you?
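A security team might capture those four questions as structured threat-intelligence records that can be compared against current controls. The sketch below is a generic illustration; the actor, tactics and field names are invented for the example, not drawn from any real intelligence feed:

```python
from dataclasses import dataclass, field

@dataclass
class ThreatProfile:
    """One record answering the who/why/what/how questions listed above."""
    actor: str                      # who might be interested in your data
    history: list                   # attacks they have performed in the past
    motivation: str                 # why they want the data
    targeted_data: str              # classification of what they try to steal
    known_tactics: list = field(default_factory=list)  # how they gain access

    def uncovered_tactics(self, mitigated: set) -> set:
        """Tactics in the adversary's playbook not yet covered by our controls."""
        return set(self.known_tactics) - mitigated

# Hypothetical profile for illustration only.
profile = ThreatProfile(
    actor="Example ransomware group",
    history=["double-extortion campaigns"],
    motivation="monetary and reputation loss to the victim",
    targeted_data="customer PII",
    known_tactics=["phishing", "RDP brute force"],
)
print(profile.uncovered_tactics({"phishing"}))  # the remaining gap to close
```

Keeping the answers in one structure makes the gap analysis (tactics minus mitigations) a mechanical step rather than a judgment call buried in a report.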

Build Security Architecture

One of the main roles of a CISO is establishing a security framework. This involves identifying the right security hardware and software, justifying the investment to the board and other C-level executives, and implementing it.

Adhering To Compliance

Organizations must now adhere to a range of government and state regulations, and it is the CISO's responsibility to ensure that all compliance requirements are met.

What Is Expected From Next-Gen CISOs?

The new normal of working from remote locations has profoundly impacted the role of a CISO. With this paradigm shift in the operating model, IT security has become one of the major concerns for organizations. CISOs now have to step up to ensure their organizations can transition smoothly while embracing digital transformation. Continuous learning, mastering new skills and acquiring deep domain knowledge will be the main factors shaping the next generation of CISOs.

CISOs have a key role to play in any organization. Once considered a technical role, CISOs today influence the other C-suite leaders in the organization and are emerging as key leaders for the future of cybersecurity.

Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives.

Published: Tue, 05 Jul 2022, by Sriram Tarikere
FEOL Nanosheet Process Flow & Challenges Requiring Metrology Solutions (IBM Watson)

A new technical paper titled "Review of nanosheet metrology opportunities for technology readiness," from researchers at the IBM Thomas J. Watson Research Center (United States).

Abstract (partial):
“More than previous technologies, then, nanosheet technology may be when some offline techniques transition from the lab to the fab, as certain critical measurements need to be monitored in real time. Thanks to the computing revolution the semiconductor industry enabled, machine learning has begun to permeate in-line disposition, and hybrid metrology systems continue to advance. Of course, metrology solutions and methodologies developed for prior technologies will also still have a large role in the characterization of these structures, as effects such as line edge roughness, pitch walk, and defectivity continue to be managed. We review related prior studies and advocate for future metrology development that ensures nanosheet technology has the in-line data necessary for success.”
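The in-line disposition the abstract mentions ultimately reduces to statistical checks on streams of measurements, holding a lot when a monitored parameter drifts. As a generic illustration (not taken from the paper), an EWMA-style control check on a critical dimension might look like this; the readings, target and tolerance are hypothetical:

```python
def ewma_disposition(measurements, target, tolerance, alpha=0.3):
    """Return the index at which the exponentially weighted moving average
    of a critical dimension drifts outside target +/- tolerance, or None.
    A hit would hold the lot for engineering review."""
    ewma = target
    for i, x in enumerate(measurements):
        ewma = alpha * x + (1 - alpha) * ewma
        if abs(ewma - target) > tolerance:
            return i
    return None

# Hypothetical nanosheet thickness readings in nanometers.
readings = [5.0, 5.1, 4.9, 5.6, 5.8, 6.1]
print(ewma_disposition(readings, target=5.0, tolerance=0.5))  # flags drift at index 5
```

The smoothing factor `alpha` trades sensitivity to sudden excursions against tolerance of measurement noise; real fabs tune it per parameter.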

Find the open access technical paper here. Published April 2022.

Mary A. Breton, Daniel Schmidt, Andrew Greene, Julien Frougier, Nelson Felix, “Review of nanosheet metrology opportunities for technology readiness,” J. Micro/Nanopattern. Mats. Metro. 21(2) 021206 (18 April 2022)

Related Reading:
HBM, Nanosheet FETs Drive X-Ray Fab Use
X-ray tools monitor chip alignment in HBM stacks and Si/SiGe composition in nanosheet transistors.
Transistors Reach Tipping Point At 3nm
Nanosheets are likeliest option throughout this decade, with CFETs and other exotic structures possible after that.
Next-Gen Transistors
Why nanosheets and gate-all-around FETs are the next big shift in transistor structures.
Highly Selective Etch Rolls Out For Next-Gen Chips
Manufacturing 3D structures will require atomic-level control of what’s removed and what stays on a wafer.
Nanosheet FETs Knowledge Center

Published: Fri, 08 Jul 2022
IBM acquires Israeli startup

While the headlines are filled with talk of market uncertainty and cuts, today we are sharing some other news. Yesterday (Wednesday) IBM announced the acquisition of the Israeli data startup, the developer of data observability (real-time data monitoring) software that helps organizations fix problems with their data, including errors, data flow failures, and poor data quality.

Israeli monitoring tools join IBM

Upon completion of the acquisition, Databand employees will join IBM's Cloud and AI Division and work from its offices in Tel Aviv. Databand will become part of IBM's research and development division and, according to the acquiring company, will work alongside the other data-monitoring tools IBM offers its customers.

The system developed by, which will now be part of the services offered by IBM, will alert data teams and engineers when the data they use to run applications is inaccurate, alongside other tools developed by IBM such as Instana. What's more, the Databand system will be able to develop significantly by gaining access to the many resources a company like IBM has, including the open-source and commercial products currently used by data divisions in various organizations.

With the acquisition, IBM will let customers choose how they want to use the system: they can run the software self-hosted or move everything to the cloud and use the SaaS product. was founded by data engineers who experienced firsthand the problems arising from faulty data processes, be it wasted time, malfunctions, or harm to customers. Since the data world has gained momentum in recent years and almost every company runs data processes, Databand decided to develop a platform that uncovers and anticipates data problems, even before developers notice them, by collecting, monitoring, and learning from occurrences in the organization's data processes.

The platform they developed addresses problems as soon as they happen. By pointing out the source of a problem, Databand not only saves time but also prevents a technical problem from becoming a business, company-wide problem, the kind that normally bounces from team to team and is so critical that it must be addressed at any hour of the night.
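Databand's product is proprietary, but the kind of check such a data-observability layer automates on each pipeline run can be sketched generically. The field names and the null-rate threshold below are invented for illustration and are not Databand's actual API:

```python
def check_batch(rows, required_fields, max_null_rate=0.05):
    """Return alert strings for fields whose null rate in this pipeline run
    exceeds the threshold -- a minimal data-quality check of the kind a
    data-observability platform runs automatically."""
    alerts = []
    total = len(rows)
    for field in required_fields:
        nulls = sum(1 for row in rows if row.get(field) in (None, ""))
        rate = nulls / total if total else 1.0
        if rate > max_null_rate:
            alerts.append(f"{field}: null rate {rate:.0%} exceeds {max_null_rate:.0%}")
    return alerts

batch = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},            # a bad record slipped in upstream
]
print(check_batch(batch, ["id", "email"]))
```

Running checks like this at ingestion time is what lets the alert fire before a downstream team discovers the broken dashboard.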

In a conversation with Geektime, one of the company's founders and its CPO, Victor Shafran, said that with the move to remote work and digital transformation, more and more organizations need platforms like Databand's more than ever before. was founded in 2018 by Shafran, Josh Benamram (CEO) and Evgeny Shulman (CTO), and its offices are located in Tel Aviv. The companies did not disclose the size of the deal, but according to estimates, IBM paid between $100 million and $150 million for the company, which had raised about $22 million to date. Its investors include several well-known funds, including Accel, Blumberg Capital, Bessemer Venture Partners, Ubiquity Ventures, Differential Ventures, F2 Capital and Lerer Hippeau.

Published: Sun, 10 Jul 2022
Salesforce's AI Economist research wants to explore the equilibrium between equality and productivity

2016 was a pivotal year for Salesforce. That was when the company acquired MetaMind, "an enterprise AI platform that worked in medical imaging and eCommerce images and NLP and a bunch of other things, a horizontal platform play as a machine learning tool for developers," as founder Richard Socher described it.

If that sounds interesting today, it was probably ahead of its time then. The acquisition propelled Socher to Chief Data Scientist at Salesforce, leading more than 100 researchers and many hundreds of engineers working on applications that were deployed at Salesforce scale and impact. AI became an integral part of Salesforce's efforts, mainly via Salesforce Einstein, a wide-ranging initiative to inject AI capabilities into Salesforce's platform.

Besides market-oriented efforts, Salesforce also sponsors "AI for good" initiatives. These include what Salesforce frames as a moonshot: building an AI social planner that learns optimal economic policies for the real world. The project, which goes under the name "AI Economist," has recently published new results. Stephan Zheng, Salesforce Lead Research Scientist and Senior Manager of the AI Economist team, shared more on the project's background, results and roadmap.

Reinforcement learning as a tool for economic policy

Zheng was working towards his PhD in physics around the time that deep learning exploded -- 2013. The motivation he cited for his work at Salesforce is twofold: "to push the boundaries of machine learning to discover the principles of general intelligence, but also to do social good".

Zheng believes that social-economic issues are among the most critical of our time. What attracted him to this particular line of research is the fact that economic inequality has been accelerating in recent decades, negatively impacting economic opportunity, health, and social welfare.

Taxes are an important government tool to improve equality, Zheng notes. However, he believes it's challenging for governments to design tax structures that promote equality while also driving economic productivity. Part of the problem, he adds, has to do with economic modeling itself.

"In traditional economics, if people want to optimize their policy, they need to make a lot of assumptions. For instance, they might say that the world is more or less the same every year. Nothing really changes that much.

That's really constraining. It means that a lot of these methods don't really find the best policy if you consider the world in its full richness if you look at all the ways in which the world can change around you", Zheng said.

The Salesforce AI Economist team tries to tackle this by applying a particular type of machine learning called reinforcement learning (RL). RL has been used to build systems such as AlphaGo and is different from the supervised learning approach that is prevalent in machine learning.

"In supervised learning, somebody gives you a static data set, and then you try to learn patterns in the data. In reinforcement learning, instead, you have this simulation, this interactive environment, and the algorithm learns to look at the world and interact with the simulation. And then from that, it can actually play around with the environment, it can change the way the environment works", Zheng explained.
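The contrast Zheng draws can be sketched as the standard RL interaction loop. The toy environment below is invented purely to make the loop runnable and has nothing to do with the AI Economist's actual simulation:

```python
def run_episode(env, policy, max_steps=100):
    """Generic RL loop: the algorithm acts on the environment, observes the
    response, and accumulates reward -- unlike supervised learning, which
    only fits patterns in a static dataset."""
    obs = env.reset()
    total_reward = 0.0
    for _ in range(max_steps):
        action = policy(obs)
        obs, reward, done = env.step(action)
        total_reward += reward
        if done:
            break
    return total_reward

class ToyEnv:
    """Minimal stand-in environment: reward 1 when the action matches
    the current timestep's parity; the episode lasts five steps."""
    def reset(self):
        self.t = 0
        return self.t
    def step(self, action):
        reward = 1.0 if action == self.t % 2 else 0.0
        self.t += 1
        return self.t, reward, self.t >= 5

print(run_episode(ToyEnv(), policy=lambda obs: obs % 2))  # a perfect policy scores 5.0
```

The key point is that `env.step` can change the world in response to the agent's actions, which is exactly the flexibility static datasets lack.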

This flexibility was the main reason why RL was chosen for the AI Economist. As Zheng elaborated, there are three parts to this approach. There's the simulation itself, the optimization of the policy, and then there is data, too, because data can be used to inform how the simulation works. The AI Economist focused on modeling and simulating a simplified subset of the economy: income tax.

A two-dimensional world was created, modeling spatial and temporal relations. In this world, agents can work, mining resources, building houses, and making money that way. The income that the agents earn through building houses is then taxed by the government. The task of the AI Economist is to design a tax system that can optimize for equality (how similar people's incomes are) and productivity (sum of all incomes).
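The two objectives can be made concrete. Productivity is the sum of incomes, as the article states; equality can be derived from the Gini coefficient. The normalization below is one common formulation and an assumption on my part, not necessarily the paper's exact definition:

```python
def gini(incomes):
    """Gini coefficient: 0 for perfect equality, approaching 1 as income
    concentrates in one agent."""
    n, total = len(incomes), sum(incomes)
    if total == 0:
        return 0.0
    pair_diffs = sum(abs(a - b) for a in incomes for b in incomes)
    return pair_diffs / (2 * n * total)

def productivity(incomes):
    """Sum of all agent incomes."""
    return sum(incomes)

def equality(incomes):
    """Rescaled to [0, 1]: 1 when everyone earns the same, 0 when a
    single agent earns everything."""
    n = len(incomes)
    return 1.0 - n * gini(incomes) / (n - 1)

equal = [10.0, 10.0, 10.0, 10.0]   # everyone earns the same
skewed = [40.0, 0.0, 0.0, 0.0]     # one agent earns everything
print(equality(equal), productivity(equal))    # 1.0 40.0
print(equality(skewed), productivity(skewed))  # 0.0 40.0
```

Both example economies have identical productivity, which is precisely why a planner needs a second metric: the tax policy's job is to trade one objective against the other.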

AI modeling vs. the real world

Salesforce's research shows that AI can improve the trade-off between income equality and productivity when compared to three alternate scenarios: a prominent tax formula developed by Emmanuel Saez, progressive taxes resembling the US tax formula, and the free market (no taxes). As Zheng explained, those three alternatives were coded into the system, and their outcomes were measured against the ones derived from the AI via the RL simulation.

Although this sounds promising, we should also note the limitations of this research. First off, the research only addresses income tax in a vastly simplified economy: there is no such thing as assets, international trade and the like, and there is only one type of activity. In addition, the total number of agents in the system is a maximum of 10 at this point.

The AI Economist is an economic simulation in which AI agents collect and trade resources, build houses, earn income, and pay taxes to a government.


Zheng noted that the research considered many different spatial layouts and distributions of resources, as well as agents with different skill sets or skill levels. He also mentioned that the current work is a proof of concept, focusing on the AI part of the problem.

"The key conceptual issue that we're addressing is the government trying to optimize this policy, but we can also use AI to model how the economy is going to respond in turn. This is something we call a two-level RL problem.

From that point of view, having ten agents in the economy and the government is already quite challenging to solve. We really have to put a lot of work in to find the algorithm, to find the right mix of learning strategies to actually make the system find these really good tax policy solutions", Zheng said.
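The two-level structure Zheng describes can be caricatured in a few lines: an inner level where each agent best-responds to the announced tax rate, and an outer level where the planner searches over rates. Everything here (the quadratic effort cost, the welfare objective, the grid search standing in for RL) is a toy assumption, far simpler than the actual formulation:

```python
def agent_labor(tax_rate, skill):
    """Inner level: each agent picks the effort (0-10) maximizing its own
    after-tax income minus a quadratic effort cost."""
    return max(range(11), key=lambda h: (1 - tax_rate) * skill * h - 0.5 * h * h)

def social_welfare(tax_rate, skills):
    """Outer level: the planner scores a candidate tax rate by total
    after-tax income plus equally redistributed revenue, penalized by
    the spread between the richest and poorest agents."""
    labors = [agent_labor(tax_rate, s) for s in skills]
    incomes = [(1 - tax_rate) * s * h for s, h in zip(skills, labors)]
    revenue = sum(tax_rate * s * h for s, h in zip(skills, labors))
    final = [i + revenue / len(skills) for i in incomes]
    return sum(final) - (max(final) - min(final))

skills = [1.0, 2.0, 4.0]
best_rate = max((r / 10 for r in range(10)), key=lambda r: social_welfare(r, skills))
print(best_rate)  # the planner's preferred rate under these toy assumptions
```

The hard part the researchers describe is hidden in this nesting: every candidate policy the outer level tries changes the incentives the inner level is simultaneously learning to exploit, which is why naive optimization of either level alone fails.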

Looking at how people use RL to train systems to play video games or chess, those are already really hard search and optimization problems, even though they involve just two or ten agents, Zheng added. He claimed that the AI Economist is more efficient than those systems.

The AI Economist team are confident that now that they have a good grasp on the learning part, they are in a great position to think about the future and extend this work also along other dimensions, according to Zheng.

In an earlier version of the AI Economist, the team experimented with having human players participate in the simulation, too. This resulted in more noise, as people behaved in inconsistent ways; according to Zheng, however, the AI Economist still achieved higher quality and productivity levels.

Economics and economists

Some obvious questions about this research are what economists think of it and whether their insights were modeled in the system as well. No member of the AI Economist team is actually an economist, but some economists were consulted, according to Zheng.

"When we first started out, we didn't have an economist on board, so we partnered with David Parkes, who sits both in computer science and economics. Over the course of the work, we did talk to economists and got their opinions and feedback. We also had an exchange with [economist and best-selling author] Thomas Piketty. He's a very busy man, so I think he found the work interesting.

He also raised questions about, to some degree, how the policies could be implemented. And you can think of this from many dimensions, but overall he was interested in the work. I think that reflects the broader response from the economic community. There's both interest and questions on whether this is implementable. What do we need to do this? It's food for thought for the economics community", Zheng said.

As for the way forward, Zheng believes it's "to make this broadly useful and have some positive social impact". Zheng added that one of the directions the team is headed towards is how to get closer to the real world.

On the one hand, that means building bigger and better simulations, so they're more accurate and more realistic. Zheng believes that will be a key component of frameworks for economic modeling and policy design. A big part of that for AI researchers is to prove that you can trust these methods.

"You want to show things like robustness and explainability. We want to tell everyone here are the reasons why the AI recommended this or that policy. Also, I strongly believe in this as an interdisciplinary problem. I think really the opportunity here is for AI researchers to work together with economists, to work together with policy experts in understanding not just the technical dimensions of their problem, but also to understand how that technology can be useful for society", Zheng said.

Two aspects that Zheng emphasized about this research were goal-setting and transparency. Goal-setting, i.e. what outcomes to optimize for, is done externally. This means that whether the system should optimize for maximum equality, maximum productivity, their equilibrium, or potentially in the future, incorporate other parameters such as sustainability as well is a design choice up to the user.

Zheng described "full transparency" as the cornerstone of the project. If in the future iterations of these types of systems are going to be used for social good, then everyone should be able to inspect, question and critique them, according to Zheng. To serve this goal, the AI Economist team has open-sourced all the code and experimental data based on the research.

Another part of the way forward for the AI Economist team is more outreach to the economist community. "I think there's a fair bit of education here, where today economists are not trained as computer scientists. They typically are not taught programming in Python, for instance. And things like RL might also not be something that is part of their standard curriculum or their way of thinking. I think that there's a really big opportunity here for interdisciplinary research," Zheng said.

The AI Economist team is constantly conversing with economists and presenting this work to the scientific community. Zheng said the team is working on a number of projects, which they will be able to share more about in the near future. He concluded that a bit of education to make people familiar with this approach and more user-friendly UI/UX may go a long way.

Published: Wed, 13 Jul 2022
Autonomous Mayflower backed by IBM completes recreation of original's historic voyage

PLYMOUTH, Mass. – A crewless robotic boat retracing the 1620 sea voyage of the Mayflower landed near Plymouth Rock on June 30.

The sleek Mayflower Autonomous Ship met with an escort boat as it approached the Massachusetts shoreline Thursday, more than 400 years after its namesake’s historic journey from England.

It was towed into Plymouth Harbor — per U.S. Coast Guard rules for crewless vessels — and docked near a replica of the original Mayflower that brought the Pilgrims to America.

Piloted by artificial intelligence technology, the 50-foot (15-meter) trimaran didn’t have a captain, navigator or any humans on board.

The solar-powered ship’s first attempt to cross the Atlantic in 2021 was beset with technical problems, forcing it back to its home port of Plymouth, England — the same place the Pilgrim settlers sailed from in 1620.


It set off from the southwest English coast again in April but mechanical difficulties diverted it to Portugal’s Azores islands and then to Canada.

“When you don’t have anybody onboard, you obviously can’t do the mechanical, physical fixes that are needed,” said Rob High, a software executive at IBM helping to work on the project. “That’s also part of the learning process.”

On Monday, it departed Halifax, Nova Scotia for a successful 4-day journey to Plymouth Harbor.

Nonprofit marine research organization ProMare worked with IBM to build the ship and has been using it to collect data about whales and microplastics pollution, and for other scientific research. Small autonomous experimental vessels have crossed the Atlantic before, but researchers describe this as the first ship of its size to do so.

The voyage’s completion “means we can start analyzing data from the ship’s journey” and dig into the AI system’s performance, High said. He said the prospect of such crewless vessels navigating the seas on a continuous basis will make it easier to collect “all the kinds of things that marine scientists care about.”

Published: Wed, 29 Jun 2022