A consultancy business is only as good as the know-how it can impart to clients. So it speaks volumes that, when asked what sets IBM Consulting apart, Lee-Han Tjioe, General Manager for Hong Kong and Macau, points to its rich and varied expertise.
“We have both business consulting and technology consulting in our scope,” Lee-Han says. “We have business consultants in our team that help clients with their strategy, with their new propositions, with defining or optimising business processes. That's one part of our practice. The other part is where we also advise on specific technology topics. So we have consultants that are very specialised in key technologies like AI and Hybrid Cloud that can help clients to achieve technology enabled major operational improvements. We basically have both deep business and technology skill sets to deliver end-to-end solutions.”
IBM has been a trusted advisory and delivery partner with a strong market reputation for decades, and has developed further into an ecosystem provider, using recent major corporate acquisitions to expand its AI and Hybrid Cloud skill sets and support clients in implementing differentiating industry and technology solutions. Today, IBM works closely and collaboratively with companies and ecosystem partners to achieve the business model changes enabled by modern digital solutions, which would be hard to scale at pace without reliable partners.
In many cases, IBM’s clients are international and local conglomerates across multiple key industries. “We are co-creating with our clients and ecosystem partners to develop new propositions and experiences, and applying best practices in a fast fashion with fast go-to-market. One example is with an insurance company that we work with on IoT for “pay-how-you-drive” insurance. And so we helped the client to actually get the right technologies into the cars to track driving behaviours and attach that to very innovative insurance propositions.”
IBM’s partnership with AXA
Another example is IBM’s strong partnership with insurance company AXA, which has endured for many years. Initially, AXA had its applications managed by providers around the world but was looking to consolidate, recognising that it was very hard to achieve consistent levels of service as well as cost-effectiveness. AXA brought IBM on board to manage those applications but also to help it innovate.
“Our partnership with AXA means that we are delivering multi-year support for the business-critical applications that AXA has,” Lee-Han continues. “Those applications are supporting distribution, sales, and key internal operations. We have transferred knowledge of 60 applications within four months and now support about 100+ applications. This is the foundation for our partnership with AXA. We are now helping with further accelerated deployment of API-based services on AXA’s digital platform to meet fast-developing new market needs.”
IBM Consulting and Our Transformation
IBM Consulting’s business can be broken down into four pillars: 1. Strategy Consulting, where it works with clients to define a vision and blueprint for their future; 2. Experience Consulting, where Garage and Design Thinking approaches are applied to redefine experiences for clients and their customers; 3. Operations Consulting, in which it examines how current business activity can be optimised with automation and new technologies such as AI and IoT; and lastly 4. Technology Consulting, where IBM helps clients implement or manage enterprise solutions and leverage Cloud technologies to optimise application management.
It’s a diverse remit – but at the heart of everything the firm does is the Virtual Enterprise, IBM’s framework that helps clients in their pursuit of digital transformation. Transformation is not just about taking on technical hurdles: IBM’s depth of transformation experience and understanding of key industry opportunities show that a Virtual Enterprise approach can be achieved with an end-to-end vision for business growth.
IBM is looking to grow its enterprise server business with the expansion of its Power10 portfolio announced today.
IBM Power is a RISC (reduced instruction set computer) based chip architecture that competes with other chip architectures, including x86 from Intel and AMD. IBM’s Power hardware has been used for decades to run IBM’s AIX Unix operating system, as well as the IBM i operating system that was once known as the AS/400. In more recent years, Power has increasingly been used for Linux, and specifically in support of Red Hat and its OpenShift Kubernetes platform, which enables organizations to run containers and microservices.
The IBM Power10 processor was announced in August 2020, with the first server platform, the E1080 server, coming a year later in September 2021. Now IBM is expanding its Power10 lineup with four new systems, including the Power S1014, S1024, S1022 and E1050, which are being positioned by IBM to help solve enterprise use cases, including the growing need for machine learning (ML) and artificial intelligence (AI).
Usage of IBM’s Power servers could well be shifting into territory that Intel today still dominates.
Steve Sibley, VP of IBM Power product management, told VentureBeat that approximately 60% of Power workloads currently run AIX Unix. The IBM i operating system accounts for approximately 20% of workloads, while Linux makes up the remaining 20% and is on a growth trajectory.
IBM owns Red Hat, which has its namesake Linux operating system supported on Power, alongside the OpenShift platform. Sibley noted that IBM has optimized its new Power10 system for Red Hat OpenShift.
“We’ve been able to demonstrate that you can deploy OpenShift on Power at less than half the cost of an Intel stack with OpenShift because of IBM’s container density and throughput that we have within the system,” Sibley said.
Across the new servers, the ability to access more memory at greater speed than previous generations of Power servers is a key feature. The improved memory is enabled by support for the Open Memory Interface (OMI) specification, which IBM helped to develop as part of the OpenCAPI Consortium.
“We have Open Memory Interface technology that provides increased bandwidth but also reliability for memory,” Sibley said. “Memory is one of the common areas of failure in a system, particularly when you have lots of it.”
The new servers announced by IBM all use technology from the open-source OpenBMC project that IBM helps to lead. OpenBMC provides secure code for managing the baseboard of the server in an optimized approach for scalability and performance.
Among the new servers announced today by IBM is the E1050, which is a 4RU (4 rack unit) sized server, with 4 CPU sockets, that can scale up to 16TB of memory, helping to serve large data- and memory-intensive workloads.
The S1014 and the S1024 are also both 4RU systems, with the S1014 providing a single CPU socket and the S1024 integrating a dual-socket design. The S1014 can scale up to 2TB of memory, while the S1024 supports up to 8TB.
Rounding out the new servers is the S1022, which is a 1RU server that IBM is positioning as an ideal platform for OpenShift container-based workloads.
AI and ML workloads are a particularly good use case for all the Power10 systems, thanks to optimizations that IBM has built into the chip architecture.
Sibley explained that all Power10 chips benefit from IBM’s Matrix Math Accelerator (MMA) capability. The enterprise use cases that Power10-based servers can help to support include organizations looking to build out AI models for risk analytics, fraud detection and supply chain forecasting, among others.
IBM’s Power10 systems support and have been optimized for multiple popular open-source machine learning frameworks including PyTorch and TensorFlow.
“The way we see AI emerging is that a vast majority of AI in the future will be done on the CPU from an inference standpoint,” Sibley said.
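For readers who want a concrete picture of what CPU-side inference looks like in practice, the sketch below is a minimal, generic PyTorch example: a hypothetical fraud-scoring network run in inference mode on the host processor. Nothing in it is specific to Power10 or MMA; frameworks such as PyTorch simply pick up whatever CPU matrix acceleration the platform's math libraries expose.

```python
import torch
import torch.nn as nn

# Hypothetical fraud-scoring model: a small feed-forward network.
model = nn.Sequential(
    nn.Linear(32, 64),
    nn.ReLU(),
    nn.Linear(64, 1),
    nn.Sigmoid(),
)
model.eval()  # inference mode

# A batch of 8 transactions, each with 32 numeric features (dummy data).
batch = torch.randn(8, 32)

with torch.no_grad():       # no gradients needed for inference
    scores = model(batch)   # forward pass runs on the CPU by default

print(scores.squeeze().tolist())
```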
IBM Quantum System Two artist concept (Image: IBM)

In my previous article on IBM quantum computers, I wrote about IBM’s plans to expand access to its quantum computers. This article describes the update to IBM’s quantum computing roadmap revealed by Darío Gil, IBM Senior Vice President and Director of Research, at IBM Think in June.
IBM is building accessible, scalable quantum computing by focusing on three pillars:
· Increasing qubit counts
· Developing advanced quantum software that can abstract away infrastructure complexity and orchestrate quantum programs
· Growing an ecosystem of quantum-ready enterprises, organizations, and communities
IBM originally announced its quantum development roadmap in 2020. To date, the company has hit its planned releases on the original timeline. In addition to delivering new quantum systems, IBM has sped up execution by 120x compared with previous experiments using Qiskit Runtime, IBM’s containerized quantum computing service and programming model.
The next step in IBM’s goal of building a frictionless development experience will be the release of Qiskit Runtime in 2022, which will allow developers to build workflows in the cloud, offering greater flexibility. Bringing a serverless approach to quantum computing will also provide the flexibility to distribute workloads intelligently and efficiently across quantum and classical systems.
To help speed the work of developers, IBM launched Qiskit Runtime primitives earlier this year. The primitives implement common quantum hardware queries used by algorithms to simplify quantum programming. In 2023, IBM plans to expand these primitives, as well as the capability to run on the next generation of parallelized quantum processors.
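To give a rough sense of what programming against these primitives looks like, here is a minimal sketch using Qiskit's local reference Sampler primitive on a two-qubit Bell circuit. The hosted Qiskit Runtime primitives accept similar inputs but execute on IBM systems, and exact import paths vary between Qiskit releases, so treat this as illustrative only.

```python
from qiskit import QuantumCircuit
from qiskit.primitives import Sampler  # local reference implementation

# Prepare a Bell state and measure both qubits.
bell = QuantumCircuit(2)
bell.h(0)
bell.cx(0, 1)
bell.measure_all()

# The Sampler primitive returns quasi-probability distributions over outcomes.
result = Sampler().run(bell).result()
print(result.quasi_dists[0])  # roughly {0: 0.5, 3: 0.5} for |00> and |11>
```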
Quantum Hardware Scaling
Later this year, IBM is scheduled to deliver the 433-qubit Osprey quantum computer and dynamic circuits. IBM used 3D packaging to place a complex tangle of microwave circuit components and wiring on multiple physical levels close to the quantum processors, enabling the faster execution of dynamic quantum circuits. IBM’s experience packaging qubits will then enable construction of the 1121-qubit Condor computer, with minimal impact to individual qubit performance, in 2023. IBM expects Condor to be the first quantum computer with more than 1,000 qubits. After Condor, IBM will use chip-to-chip couplers to build even larger quantum systems.
“Our new quantum roadmap shows how we intend to achieve the scale, quality, and speed of computing necessary to unlock the promise of quantum technology,” said Jay Gambetta, VP of Quantum Computing and IBM Fellow. “By combining modular quantum processors with classical infrastructure, orchestrated by Qiskit Runtime, we are building a platform that will let users easily build quantum calculations into their workflows and so tackle the essential challenges of our time."
To build this new quantum roadmap, IBM is targeting three scalability “regimes” or steps to scale its quantum processors.
The first step requires building capabilities to “classically” communicate and parallelize operations in a non-quantum way across multiple processors. This step opens the door to a broader set of techniques such as improved error mitigation techniques and intelligent workload orchestration, which combine classical compute capabilities with quantum processors.
The next step is building short-range, chip-level couplers between quantum chips. Using these couplers, multiple chips can be connected to effectively form a single larger processor. This multichip modularity is key to scaling.
Ultimately, the third step to reach larger scalability is developing quantum communication links between quantum processors. These quantum communication links connect clusters of quantum processors together into a larger quantum system.
IBM plans to be using all three of these scalability techniques by 2025 to build a 4,000+ qubit processor based on multiple clusters of modularly scaled processors.
IBM Quantum Development Roadmap 2022 (Image: IBM)

Future quantum computing systems will be called IBM Quantum System Two. A central approach to building IBM Quantum System Two will be modularity, which will be necessary to increase the scale of IBM quantum chips in the future.
System Two introduces a new generation of scalable qubit control electronics together with higher-density cryogenic components and cabling. The platform brings the possibility of providing a larger shared cryogenic workspace, opening the door to potential linking of quantum processors through novel interconnects. System Two is a major step toward a true quantum data center. A prototype of this system is targeted to be up and running in 2023.
IBM Quantum System Two artist concept (Image: IBM)

While building systems with more qubits is important for extending the capabilities of quantum computing, the quality of these qubits is also essential to building practical quantum computers. Qubit quality refers to how long the qubits remain entangled (their coherence) and the error rate of the results. IBM has a metric for this called Quantum Volume (QV). IBM says its quantum systems are moving from a QV of 256 last year to a QV of 1024 this year. The Falcon r10 system has an error rate under 1 in 1,000 today. IBM handles its error management in Qiskit Runtime.
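Assuming the standard definition IBM has published, Quantum Volume is reported as a power of two, so the move from a QV of 256 to a QV of 1024 corresponds to passing the benchmark with square circuits two qubits wider and deeper:

$$ QV = 2^{n}, \qquad 256 = 2^{8} \;\longrightarrow\; 1024 = 2^{10}, $$

where $n$ is the largest width (and depth) of random square circuits the system can run while still passing the heavy-output test.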
Progress is being made with error mitigation and suppression techniques to improve the ability of quantum software to minimize the effect of noise on users’ applications. These are important steps on the path towards the error-corrected quantum systems of the future.
Multichip connectivity
The next step to scaling quantum computers will be to make quantum communication links between chips and between cryostats. First, IBM plans to connect three or more 133-qubit Heron chips using classical (non-quantum) logic connections in 2023. With classical interconnects, the quantum state must be resolved to a binary logical result. But with the Crossbill quantum computer in 2024, IBM plans to interconnect chips with quantum-entangled connections, which communicate in a quantum state. The connection between three chips should deliver 408 qubits. IBM will offer both scaling options for experiments.
IBM quantum communication links (Image: IBM)

In addition to the potential for using quantum computing to solve complex problems, the technology could also be used to crack today’s data encryption. While some cryptographers are skeptical that quantum computing can reliably be used to break cryptography within the next decade, IBM is already planning to mitigate the issue by offering quantum-safe cryptography. For example, the recently announced Telum-based z16 mainframe has quantum-safe encryption.
Summary
IBM continues to leverage its traditional computing and quantum expertise, packaging technology, extensive software resources, and new business models to expand the developer reach and market opportunities for quantum computers. IBM’s super-cold qubits are also fast – 1,000 times faster than ion-trap quantum computers. The company has committed to scaling quantum computing and adding greater capabilities over a multi-year roadmap.
More information about IBM’s Quantum Research is found at:
Tirias Research tracks and consults for companies throughout the electronics ecosystem from semiconductors to systems and sensors to the cloud. Members of the Tirias Research team have consulted for IBM, Nvidia, Qualcomm, and other companies throughout the AI and Quantum ecosystems.
IBM expects that Databand.ai, along with IBM Observability by Instana APM and IBM Watson Studio, will help position IBM to address the full spectrum of IT data observability to find bad data issues and resolve them before they impact a customer’s operations.
IBM on Wednesday unveiled its purchase of Databand.ai, an Israel-based developer of a proactive data observability platform that claims to catch bad data before it impacts a customer’s business.
Financial details of the acquisition, which closed June 27, were not disclosed by IBM.
The term “data observability,” as defined by Databand.ai, is the blanket term for understanding the health and the state of data in a system that allows a business to identify, troubleshoot, and resolve data issues in near real-time.
Observability helps not only describe a problem for engineers, but also provides the context to resolve the problem and look at ways to prevent the error from happening again, according to Databand.ai.
“The way to achieve this is to pull best practices from DevOps and apply them to Data Operations. All of that to say, data observability is the natural evolution of the data quality movement, and it’s making DataOps as a practice possible,” the company said.
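As a purely illustrative sketch of the kind of in-pipeline check that data observability tooling automates (this is generic Python, not Databand.ai's actual SDK, and the field names and thresholds are hypothetical), a pipeline stage can profile each batch and flag problems before bad records flow downstream:

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str

def profile_batch(rows: List[Dict]) -> List[CheckResult]:
    """Run simple health checks on one batch of pipeline records."""
    checks = []

    # Volume/freshness: an empty batch usually signals an upstream failure.
    checks.append(CheckResult("non_empty", len(rows) > 0, f"{len(rows)} rows"))

    # Completeness: the hypothetical required field 'amount' must not be null.
    nulls = sum(1 for r in rows if r.get("amount") is None)
    checks.append(CheckResult("amount_not_null", nulls == 0, f"{nulls} nulls"))

    # Validity: amounts should be non-negative.
    negative = sum(1 for r in rows if (r.get("amount") or 0) < 0)
    checks.append(CheckResult("amount_non_negative", negative == 0, f"{negative} negative"))

    return checks

batch = [{"amount": 10.0}, {"amount": None}, {"amount": -3.0}]
for check in profile_batch(batch):
    print("OK  " if check.passed else "FAIL", check.name, "-", check.detail)
```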
IBM expects Databand.ai to strengthen its data, AI, and automation software portfolio to ensure trustworthy data goes to the right user at the right time. The company also expects the acquisition to help Databand.ai take advantage of IBM’s own R&D investments and other IBM acquisitions.
IBM, citing Gartner, said that poor data quality costs organizations an average of $12.9 million every year while increasing the complexity of data ecosystems and leading to poor decision making.
That makes this an exciting acquisition, said Mike Gilfix, vice president of product management for data and AI at IBM.
“Bad data is expensive,” Gilfix told CRN. “We’re excited about the fast-growing data observability market. We know when data stops, companies lose business. If you depend on data to run your company, and that data is corrupt or has other issues, we want Databand.ai to help find the issues and resolve them faster.”
Databand.ai is part of a three-legged way to bring observability to businesses, Gilfix said.
Databand.ai is focused on data observability, and is important to ensuring that data pipelines work as promised, he said.
The second leg is IBM Observability by Instana APM, which IBM acquired in late 2020. Instana brings observability specifically to applications by observing the makeup of the application and the performance of the app itself, he said.
The third is IBM Watson Studio, which brings observability to AI models, he said.
For IBM, Databand.ai is also an important component of a data fabric, which Gilfix defined as an architectural approach that enables consumers of data, including engineers, to access data, discover it, catalog it, build data pipelines, and protect data across multiple data silos.
“Many companies struggle with data silos,” he said. “A data fabric is a good way to connect those silos together.”
IBM’s channel partners are an important part of the company’s observability business, and the way they sell Databand.ai will be no different once it is integrated, Gilfix said.
“The channel is a big part of our business,” he said. “We believe a rich ecosystem is critical. Partners add expertise, make sure customers are successful, and bring in their own value adds to be even more successful.”
Object Management Group (OMG) announced Responsible Computing (RC), a new consortium comprised of technology innovators working together to address sustainable development goals.
Responsible computing is a systemic approach aimed at addressing current and future challenges in computing, including sustainability, ethics, and professionalism, stemming from the belief that we need to think about technology in terms of its impact on people and the planet.
The new consortium’s manifesto defines RC values to restore trust in IT by responsibly applying technology and by sharing experiences with other organizations. These values include sustainability, inclusiveness, circularity, openness, authenticity, and accountability.
The consortium’s RC framework focuses on six domains of responsible computing, including:
Through interviews with over 100 CTOs, concerns were raised about how to develop practical actions to progress Environmental, Social, and Governance (ESG) programs.

The CTOs wanted to contribute to becoming more sustainable businesses and to demonstrate progress through consistent metrics. In November 2020, IBM's Academy of Technology (AoT) responded to these challenges and created the Responsible Computing Council, an international team of technology and computing leaders who collaborate on validating and implementing the RC framework and lead by example in becoming responsible computing providers.
Object Management Group (OMG) was an early member of the council, and shortly after that, the OMG board approved the formation of the RC consortium.
“Now is the time for companies to adopt a holistic approach that places sustainability strategy at the center of their business,” said Sheri Hinish, global lead, IBM Consulting Sustainability Services. “Under IBM Impact, our new ESG strategy, we are working to make a lasting, positive impact in the communities in which we work and live. IBM is proud to be a founding member of the RC consortium, and through this collaboration we hope to help companies establish new and innovative ways to transform their business operations in ethical, impactful ways that can help contribute to a more sustainable future.”
An organization can become more operationally efficient and demonstrate a return on investment (ROI) when meeting sustainability goals. The ROI can potentially include:
RC spans a broad cross-section of industries, including consumer, financial services, travel and transport, insurance, government, energy, environment and utilities, telco and media, and industrial verticals.
For more information about this news, visit https://www.ibm.com.
After five years of tireless and dedicated service, Buena Vista Town Administrator Phillip Puckett is stepping down from the position to focus on family and reconnect with his roots.
Puckett, who moved to Buena Vista (BV) from Texas in 2010, has dedicated the last decade to serving BV and shaping its small-town dynamics and policies.
Puckett, now 42 years old, will be transitioning into the role of Town Treasurer, which he says will allow him to spend more time with his family while simultaneously utilizing his prowess for accounting and bookkeeping.
It’s a win-win for the town of BV, despite the very large shoes he leaves to fill.
Originally from Dallas, Texas, Puckett began visiting BV in the late 90s. He says he immediately fell in love with the dynamic mixture of nature and culture the Arkansas River community had to offer.
“I fell in love with the river area and small town feel. It’s the mixture of community with natural resources and natural beauty in the area,” said Puckett with a cheerful reminiscence.
Puckett saw the potential BV had to drastically transform his life and those of many others, but first he had to get there.
In 2010 Puckett made the move to BV and immediately started looking for ways to get involved in the community. He says he had little luck until one fateful day he attended a BV Board of Trustees meeting.
Puckett was immediately struck by the stark demographic contrast of the members of the board. He noticed there was little to no representation of the young adult community that was quickly beginning to thrive in the town. He realized that this was an opportunity for him to not only get to know the town better but to contribute to its future in a positive way.
In 2012, Puckett decided to run for an open trustee position on the board and won handily. Despite his enthusiasm for the role, Puckett discovered very quickly that being a trustee in a small town is no easy task.
“It was a huge learning curve. There’s so much to local government”, explained Puckett.
With the help of mentors Joel Benson, Duff Lacy, and Keith Baker, Puckett was learning the ropes and quickly becoming immersed in the small town he so much wanted to be involved in.
He says it took him a solid two to three years to learn the subtle, and not so subtle, nuances of being an active and effective member of the board. He admits that learning the ins and outs of small-town politics was no easy task.
Yet Puckett knew he had a chance to represent the younger demographics in BV who did not know how to make their voices heard. In fact, in some ways, the residents of BV seemed to resist the ever-rising inflow of younger and younger people into town. Puckett saw this as an opportunity to improve.
“There’s no gate to close to stop new people and families from coming here”, said Puckett matter-of-factly.
Puckett immediately brought a fresh perspective to the board, and in 2014 he started pushing for more policy-based governance to handle the town’s needs. Up to that point, the board members were almost entirely volunteers who could only do so much on their own.
He says he knew a big change was needed to elevate the board to its utmost potential. So he suggested the board focus its efforts on policy, direction, and town support while simultaneously hiring staff who would direct and guide the transformation of BV into the town he always knew it could be.
“We set a new expectation for future boards”, explained Puckett.
After four years of dedicated service as a trustee, Puckett ran again in 2016 and won his re-election easily. His hard work and dedication were paying off and the town was taking notice.
Then in the spring of 2017, the position of Town Administrator became available.
After working 18 years as a product and project manager at IBM, remotely from BV since 2010, Puckett faced a harrowing decision when IBM wanted to move him and his family to New York. He knew his place was in BV, so he promptly resigned from IBM and pursued the position of Town Administrator.
He knew it was a gamble, but in June of 2017, it paid off, and he was notified that he had been chosen as the new Town Administrator of BV.
“It was extremely exciting!”, Puckett exclaimed passionately.
However, Puckett had his work cut out for him. Staff turmoil and turnover, Puckett says, had left him with a board and town that needed not only a lot of work but also a lot of love and organization. Puckett’s main goal was to build a culture that would keep good people in the right positions to best serve the town, while also building up the staff and filling vital roles. However, mending fences isn’t always easy in small-town politics.
“It was a huge challenge and I was very excited”, recalled Puckett.
But staffing alone would not restore balance to the town’s administration; Puckett knew he had to rebuild trust between the residents of BV and its leaders. He says he went to work transforming the role of Town Administrator into an intermediary between the wills and wishes of the people of BV, the Town’s staff, and the Board of Trustees.
He began explaining the why and the context behind residents’ needs to the board and suggesting the policies needed to enact those changes. Puckett became the mouthpiece of the people of BV, and there appeared to be nobody better suited for the job.
Over the course of five years, Puckett worked tirelessly to create transparency and trust in a way no Town Administrator had done before. His boots-on-the-ground approach and accessibility made it easy for the people of BV to speak up about the ideas they were passionate about, confident they were being taken seriously.
In a way, you could say Puckett was more of a “Town Mediator and Counsel” than simply the Administrator.
However, being a dedicated servant of the town comes with its costs. Puckett, who has three kids ranging in age from eight to sixteen, was finding it harder and harder to carve out time for his family amid his endless list of responsibilities and commitments to the Town. When celebrated Town Treasurer Michelle Stoke announced she was stepping down from the role, the timing couldn’t have been more perfect. Puckett threw his hat in the ring and was promptly chosen for the position.
“It’s the right time for me. I’ve put a lot of myself into this role [Town Administrator]”, Puckett said with what appears to be a sense of relief.
Fortunately, Stoke herself has been able to train Puckett in the role and ease his transition. He says her dedication and hard work as Town Treasurer have laid a solid foundation for him to build on and move forward with.
“I am very excited to be starting into something in such good order and continue what Michelle has done real well,” explained Puckett.
In addition to building on what Stoke has worked so hard to create, Puckett says he’s bringing his own goals and ambitions to the position.
“I look forward to finding creative ways to finance town projects and utilizing our local tax dollars well,” said Puckett.
Puckett also plans to continue the legacy of accuracy, well-run budgets, big projects, and well-funded capital improvements that Stoke facilitated so effectively.
Above all else, Puckett is excited and relieved to have the chance to reconnect with the community of BV in ways he’s been unable to for the last five years.
“As I leave the Town Administrator role, I’m excited to reconnect with the BV community as Phillip Puckett again,” he said, with a touch of excitement.
Puckett will provide an update on the ongoing hiring process of his replacement as Town Administrator at the next Board of Trustees meeting on Tuesday, July 12. Until then, the town of BV and its residents continue to show their gratitude for his dedication to making BV all it can and should be.
IBM plans to acquire Randori, a leading attack surface management (ASM) and offensive cybersecurity provider, further advancing IBM's Hybrid Cloud strategy and strengthening its portfolio of AI-powered cybersecurity products and services.
Randori helps clients continuously identify external-facing assets, both on premises and in the cloud, that are visible to attackers, and prioritize the exposures that pose the greatest risk.
"Our clients today are faced with managing a complex technology landscape of accelerating cyberattacks targeted at applications running across a variety of hybrid cloud environments—from public clouds, private clouds and on-premises," said Mary O'Brien, general manager, IBM Security. "In this environment, it is essential for organizations to arm themselves with attacker's perspective in order to help find their most critical blind spots and focus their efforts on areas that will minimize business disruption and damages to revenue and reputation."
Randori is IBM's fourth acquisition in 2022 as the company continues to bolster its hybrid cloud and AI skills and capabilities, including in cybersecurity. IBM has acquired more than 20 companies since Arvind Krishna became CEO in April 2020.
Randori is a hacker-led company whose software helps security teams discover gaps, assess risks, and improve their security posture over time by delivering an authentic attack experience at scale.
Designed to help security teams zero in on previously unknown exposure points, Randori's attack surface management solution takes into account the logic of an adversary based on real-world attacks, and is the only one to prioritize based on both level of risk and the attractiveness of an asset to potential attackers, using the company's proprietary scoring system.
This approach has led to the development of a cloud-native solution that provides better prioritization of vulnerabilities and reduces noise by focusing on each customer's unique attack surface.
Upon close of the acquisition, IBM plans to integrate Randori's attack surface management software with the extended detection and response (XDR) capabilities of IBM Security QRadar.
By feeding insights from Randori into QRadar XDR, security teams will be able to leverage real-time attack surface visibility for intelligent alert triage, threat hunting, and incident response. This can help eliminate the need for customers to manually monitor new critical applications and respond quickly when new issues or emerging threats arise on their perimeter.
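To illustrate the idea of enriching alert triage with attack surface context (a generic sketch only; these are not Randori or QRadar APIs, and the asset names and scores are hypothetical), exposure scores for external-facing assets can be joined with incoming alerts so that the most exposed, most tempting targets are investigated first:

```python
# Hypothetical attack-surface inventory: asset -> exposure/"temptation" score (0-1).
attack_surface = {
    "vpn.example.com": 0.9,   # internet-facing, attractive target
    "intranet-wiki":   0.2,   # internal, low attacker attractiveness
}

# Hypothetical alerts from a SIEM/XDR queue.
alerts = [
    {"id": 1, "asset": "intranet-wiki",   "severity": 5},
    {"id": 2, "asset": "vpn.example.com", "severity": 4},
    {"id": 3, "asset": "vpn.example.com", "severity": 7},
]

def triage_score(alert):
    """Weight alert severity by how exposed the affected asset is."""
    return alert["severity"] * attack_surface.get(alert["asset"], 0.1)

# Investigate the highest-scoring alerts first.
for alert in sorted(alerts, key=triage_score, reverse=True):
    print(alert["id"], alert["asset"], round(triage_score(alert), 2))
```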
Randori also provides businesses with a solution that uniquely combines attack surface management with continuous automated red teaming (CART) to stress-test defenses and incident response teams. Upon close, IBM will leverage Randori to complement X-Force Red's elite hacker-led offensive security services while further enriching QRadar XDR detection and response capabilities.
This will allow more global customers to benefit from a top-tier attack experience that helps uncover where organizations are most vulnerable. Randori insights will also be leveraged by IBM's Managed Security Services to help improve threat detection for thousands of clients.
For more information about this news, visit www.ibm.com.
DUBLIN--(BUSINESS WIRE)--Jul 12, 2022--
The “Global Smart Building Management Systems Market (2022-2027) by Components, Building Type, Geography, Competitive Analysis and the Impact of Covid-19 with Ansoff Analysis” report has been added to ResearchAndMarkets.com’s offering.
The Global Smart Building Management Systems Market is estimated to be USD 2.38 Bn in 2022 and is projected to reach USD 5.89 Bn by 2027, growing at a CAGR of 19.88%.
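As a quick sanity check on those figures, the implied compound annual growth rate over the five years from 2022 to 2027 is

$$ \left(\frac{5.89}{2.38}\right)^{1/5} - 1 \approx 0.199, $$

which is consistent with the quoted 19.88% once the rounding of the market sizes is taken into account.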
Market dynamics are forces that impact the prices and behaviors of the Global Smart Building Management Systems Market stakeholders. These forces create pricing signals which result from the changes in the supply and demand curves for a given product or service. Forces of Market Dynamics may be related to macro-economic and micro-economic factors. There are dynamic market forces other than price, demand, and supply. Human emotions can also drive decisions, influence the market, and create price signals.
As market dynamics impact the supply and demand curves, decision-makers aim to determine the best way to use various financial tools to support strategies for speeding growth and reducing risk.
Competitive Quadrant
The report includes Competitive Quadrant, a proprietary tool to analyze and evaluate the position of companies based on their Industry Position score and Market Performance score. The tool uses various factors for categorizing the players into four categories. Some of these factors considered for analysis are financial performance over the last 3 years, growth strategies, innovation score, new product launches, investments, growth in market share, etc.
Ansoff Analysis
The report presents a detailed Ansoff matrix analysis for the Global Smart Building Management Systems Market. Ansoff Matrix, also known as Product/Market Expansion Grid, is a strategic tool used to design strategies for the growth of the company. The matrix can be used to evaluate approaches in four strategies viz. Market Development, Market Penetration, Product Development and Diversification. The matrix is also used for risk analysis to understand the risk involved with each approach.
The analyst analyses the Global Smart Building Management Systems Market using the Ansoff Matrix to provide the best approaches a company can take to improve its market position.
Based on the SWOT analysis conducted on the industry and industry players, the analyst has devised suitable strategies for market growth.
Why buy this report?
Report Highlights:
Market Dynamics
Drivers
Restraints
Opportunities
Challenges
Market Segmentations
The Global Smart Building Management Systems Market is segmented based on Components, Building Type, and Geography.
Companies Mentioned
For more information about this report visit https://www.researchandmarkets.com/r/43t9
View source version on businesswire.com: https://www.businesswire.com/news/home/20220712006012/en/
CONTACT: ResearchAndMarkets.com
Laura Wood, Senior Press Manager
press@researchandmarkets.com
For E.S.T Office Hours Call 1-917-300-0470
For U.S./CAN Toll Free Call 1-800-526-8630
For GMT Office Hours Call +353-1-416-8900
SOURCE: Research and Markets