USING E-BOOKS IN SCHOOL:

21st Century Classroom: Transforming the Textbook

In 21st century classrooms, blackboard chalk is on the endangered list, the pop quiz has been replaced with clicker questions, and bowling alley technology (overhead projector transparencies) has disappeared, thanks to digital projectors and document cameras.

But if you’re going to point to any aspect of the classroom that still hasn’t covered much ground on its trip into the 21st century, it has to be the textbook. This ubiquitous accessory has been beset by editorial controversy, as we have seen recently in Texas; has seen consistently high price increases averaging six percent per year; and still inspires parental derision for the outdated information it often contains.

And then there’s the matter of weight. The heft of textbooks was the subject of a 21-page report written in 2004 in California for the state’s board of education. According to researchers, the combined weight of textbooks in the four “core” subjects (social studies, math, reading/ language arts, and science) ran, on average, from eight pounds at the first grade level to 20 pounds at the 11th grade level. Legislation to mandate weight limitations quickly followed in that state.

As this comparison of two school districts on opposite sides of the country and economic spectrum illustrates, in a world rich with alternative methods of delivering content, exemplified by Google Books, the Kindle and the iPad, the textbook is the next classroom object worthy of transformation.

Realigning the Budget with Netbooks

“Everyone has a different 1:1 approach,” says Gary Brantley, chief information systems officer for the Lorain City School District. “Ours was to eliminate the books.”

Lorain City Schools is located in a city 35 miles from Cleveland. The district has 18 schools and 8,400 students. By moving to digital delivery of textbooks Superintendent Cheryl Atkinson saw an opportunity to address several larger district challenges than simply replacing outdated texts. A majority of families are low-income; its schools were struggling to meet yearly academic progress measures; and the district had just come out from under a state-mandated “fiscal watch.”

And, recalls Brantley, Atkinson was sincerely concerned about the weight of the textbooks being hauled around by the kids in her schools.

That was the atmosphere under which initial discussions began, he says. The district quickly realized that adopting a 1:1 program with digital textbooks at the heart of the initiative could reduce textbook expenses and help bring students into the 21st century. “We’re an inner city school district,” says Brantley. “We saw this as a way to level the playing field for our kids and give them equal access and opportunities with technology.”

After a pilot program in 2007 and 2008, the district went after a federal grant to partially fund a full rollout to 9th and 10th graders for the following year. In January 2009, the district used federal Title 1 and Ohio state educational technology grant funds to lease Dell Inspiron 910 netbooks. The following year that program was expanded to 6th, 7th, 8th, and 11th grades, and the district switched to Acer Aspire One AOD150-1577 netbooks. This fall the district hopes to add 12th graders to the program.

The publishers the district is working with on the program are the traditional ones: Pearson Prentice Hall; Holt McDougal; and McGraw-Hill/Glencoe. They have provided versions of the texts, Brantley says, that go beyond simply being a PDF of the book. “It’s interactive. For example, if you have someone like Martin Luther King or John F. Kennedy in a history book, you can click on a picture, and it will tell you information about [that person] or [you can] do a search from the book to get more information about that particular person.”

Brantley is quick with numbers. He says that for 2,600 math books—the number of texts needed for grades nine through 12—the cost was going to be about $182,000. That’s $70 per book. The e-book edition for that same math book was about $15,000. The savings on that one text alone covered a large part of the expense of that first rollout of digital textbooks. The savings don’t stop there. An English textbook was priced at $163,673.05 for 2,475 books—about $66 per book. The digital version of the same volume was less than a fourth of the cost—$36,554.45.
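Brantley’s figures can be checked with quick back-of-the-envelope arithmetic. The numbers below come straight from his account; the script itself is only an illustrative sketch:

```python
# Per-book costs and savings from the Lorain figures Brantley cites.
math_print_total = 182_000        # 2,600 print math books, grades 9-12
math_copies = 2_600
math_ebook_total = 15_000         # digital edition of the same title

english_print_total = 163_673.05  # 2,475 print English books
english_copies = 2_475
english_ebook_total = 36_554.45   # digital edition

print(round(math_print_total / math_copies))          # 70  -> $70 per print book
print(math_print_total - math_ebook_total)            # 167000 saved on one title
print(round(english_print_total / english_copies))    # 66  -> $66 per print book
print(round(english_ebook_total / english_print_total, 2))  # 0.22 of the print price
```

The savings on the single math title ($167,000) really do cover most of the cost of the initial hardware rollout he describes.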

Explains Brantley, Superintendent Atkinson “was very persistent” that the district find a content provider for the program, even if it wasn’t one of the three or four big textbook publishers. The publishers were willing to try the program in pilot mode. “A lot of trust was built on both sides to make this happen,” he says.

Now, says Brantley, students don’t have to travel to labs to gain access to computers. “Basically, there’s a lab in every classroom. Every kid is using that netbook as a textbook and as a computer.”

Brantley knows the technology is making an impact. “I think it’s pushed us a long way. It’s allowing the students to become a lot more creative in what they do and how they do it. It’s also leveled the playing field. A lot of these kids don’t have computers or internet access at home. Because the books are loaded on the hard drive, [Superintendent Atkinson] has given kids the ability to work on things they’d only have access to in a limited time within the classroom or in the lab.”

Although Brantley says student testing scores have gone up, he can’t confidently point to quantifiable results tied directly to the digital textbooks. “We brought different pieces of technology into the district in the same period, so we have to let the program run for a little while,” he explains.

“But Why Do We Care?”

The Campbell Union High School District, next door to San Jose in California’s Silicon Valley, consists of six sites, five of which have been designated by the state as excellent. During the 2009-2010 school year, the district ran a pilot program to experiment with replacing textbooks with e-readers. Director of Technology Charles Kanavel and his IT team of five distributed 270 Sony Reader Touch model PRS-600s into English classes across the district’s sites.

“These kids get technology. They go home and look at YouTube all day. An e-reader isn’t that hard for them,” Kanavel explains. The goal of the pilot was to get a “true sense of what’s it like for the everyday student to use one of these things in terms of wear and tear and what they wanted to see on the device.”

The effort was spurred by the Williams Settlement, Kanavel says. That California statute calls for California schools to have sufficient educational materials and conditions to meet curriculum standards. In order to meet standards of currency, textbooks need to be replaced every seven years—an expensive proposition in a district with 8,000 students. “It’s $180 for a biology textbook. That’s just one. With e-readers and how ubiquitous they’ve become,” Kanavel recalls asking, “Why do they need to carry 80 pounds worth of books around, when we have the technology to do this differently?”

But that initial test might never have come about if Kanavel hadn’t persisted in trying to woo Sony to participate in the proof of concept, a process that took seven months. The Campbell director focused on Sony because of its durability, price, and open platform. “Kindle, if you drop it, it’s game over,” he says. “With the Nook you have to buy everything from Barnes & Noble. The [Apple] iPad with 32 or 64 GB, that’s $600 to $800. With one iPad, I can get four e-readers from Sony at around $200 each.”

But persuading the manufacturer to pay attention to education’s needs wasn’t an easy sell. Kanavel, who has a background in investment banking, studied the company’s financial reports and figured out how many e-readers had probably been sold through its nearby Silicon Valley area store, the largest Sony store in the United States.

When he approached the company about doing a test, it replied, “Yeah, yeah, yeah, interesting. But why do we care?” In response, he used this argument: “You sold 14,000 at the Valley Fair store in a three month period. Those are respectable numbers. But realistically, our district is 8,000 kids. You’d sell me 8,000 units. Then I’d have to buy a quarter of that every year forever. Once I start on it, I can’t get off.” He also pointed out that Campbell was only a medium-sized district. “Take San Jose Unified —55,000 students right next door. That would make your store numbers look like nothing. And there are 32 districts in Santa Clara County alone. Think of the entire country. Then they started caring.”

Once Sony was on board, the next hurdle was the textbook publishers trying to safeguard the pricing model, according to Kanavel. He estimates that a single school might have 300 copies of a particular book. On average the textbook will cost $120 on the low side and $180 on the high side. That’s a total outlay of $36,000 to $54,000 for a single textbook in a single school in the Campbell district.

For English classes, however, many of the books contained classic works of literature that are now in the public domain and available on various digital book websites. “Shakespeare is Shakespeare. The guy’s not writing a new version,” Kanavel says. He has been able to make a deal with Houghton Mifflin Harcourt for some digital textbooks in PDF format; but others—particularly novels—came from the Sony Reader Store, from Project Gutenberg (a good source for Shakespeare, he says), and via the OverDrive School Download Library.

The challenge faced by textbook publishers, he points out, is that they have to change their business model. Kanavel wants to set up a site license with the publishers, but so far those negotiations are still ongoing and, besides, many still have to convert their textbooks into the EPUB format.

But the financials, as this former numbers guy points out, still work out nicely for the district. “For example, historically we have paid $9 a book for paperback copies of Macbeth and 70 to 80 percent of them come back unusable at the end of the year. Now with the e-reader, that replacement cost goes to zero.”

On average 15 out of every 100 books in the district need to be replaced because they’re damaged, lost, or stolen. Often, the same student loses multiple books when he or she loses a backpack. “If you’re a parent, you have to pay to replace all of those books. If your student loses a history book, biology book, math book, and English book, that’s about $600,” Kanavel says. “If they lose an e-reader or it breaks, you pay for the replacement cost of the e-reader —$200 -- then we just get the content.” This, he adds, “has long-term implications for budgeting and funding.”
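Kanavel’s backpack scenario reduces to simple arithmetic. Here is a sketch using the figures quoted above; the $150 average is the midpoint of his $120-to-$180 range:

```python
# What a parent pays when a lost backpack takes four textbooks with it,
# versus losing (or breaking) a single e-reader.
avg_textbook_price = 150   # midpoint of the $120-$180 range Kanavel cites
books_in_backpack = 4

paper_replacement = avg_textbook_price * books_in_backpack  # ~$600, as quoted
ereader_replacement = 200  # the device itself; the content is re-downloaded

print(paper_replacement, ereader_replacement)  # 600 200
print(f"an e-reader loss costs {ereader_replacement / paper_replacement:.0%} "
      "of the paper equivalent")
```

The one-third cost of the worst-case loss is what he means by “long-term implications for budgeting and funding.”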

So far, Kanavel says, the pilot has been successful with students. “They’ve taken good care of them. I’ve only had three break out of 270, which is pretty good.” He plans to add an additional 200 e-readers to the district for the next school year. “One thing I’ve been very focused on with this pilot is offsetting the cost of textbook replacement with this device and making it easier on the kids.” He believes the district is on the right track.

Teachers and students are discovering other advantages. The e-readers have built-in dictionaries. If a reader has a visual impairment, text can be upsized quickly. Users can annotate, draw, and take notes—something that’s forbidden with traditional textbooks. When the year is over, the kids will return the devices, and that added material can be wiped from the hard disk.

But e-readers still aren’t perfect, he adds. First, not every book is available in a digital format. He cites a high school classic, Chinua Achebe’s Things Fall Apart, as an example. Many textbooks have already been put on CD, but those are designed to be used in a PC. Publishers haven’t made huge inroads into converting their materials into the standard EPUB format that works with the major e-readers. But Kanavel is hopeful those gaps will diminish with time.

With the expected expansion of the pilot, negotiations with Sony continue. “We’ve proven that the kids can take care of them. The technology does work,” Kanavel says. “The next thing is to get Sony to build something bigger—an eight and a half by 11 inch format. And there are a lot of features that we don’t use. We’ve given them feedback on those things. There may be ways to cut cost by eliminating feature sets that can help them balance the cost of manufacturing.”

Textbook Smackdown

So given the experiences of these two districts—and others—how does a standard textbook stack up against an e-book? If a publisher needs to repair the mistakes introduced in the text, as happened with math books issued in Sacramento County in spring 2010, it won’t have to arrange to destroy the outdated books and incur shipping costs for the new ones; it can correct the errors and electronically distribute new versions of the content. In the face of a quickly evolving business model, publishers will be forced to adjust their pricing schemes—no doubt, to the advantage of the districts. In the matter of weight— well, the Acer netbook comes in under three pounds, and the Sony device is a little over 10 ounces. Those are metrics anyone can use no matter how much digital content sits on the devices.

Building the E-Book Structure

Although every e-book initiative shares common aspects—hardware, bandwidth, content, and professional development with teacher buy-in—how the program unfolds in your district will be unique. The question of when in implementation a district allows connection to the internet is a case in point. Campbell Union High School District in Silicon Valley wants students to stay on task as it implements e-books. Therefore, the Sony Reader Touch devices being used there don’t include web access. Although Sony does make a model of its e-reader that includes WiFi, according to Director of Technology Charles Kanavel, the decision to leave that feature out helps simplify the transition teachers have to make in integrating the device in the classroom.

“If I’m a teacher and I have these new devices in class, it affects my lesson planning,” he explains. “Without administrative control of access to the internet, some smart kid will make the thing text another e-reader. Then once that kid knows, all the kids will know. In class, instead of reading, they’re texting each other, surfing MySpace, and doing everything else. Have I just disrupted an entire class with this device? So let’s get the adoption in first. Let’s get the hurdles out of the way surrounding usage of content, usage of technology, and how it integrates into your standards in the classroom. Once that’s outlined, then we’ll figure out how to do WiFi.”

That absence of web access has also streamlined professional development. The district had 270 devices, which it handed out in English classes spread fairly evenly across its six sites. To ensure that the pilot wouldn’t get put on the back-burner by teachers uninterested in using the e-reader, Kanavel had the principals at those sites nominate teachers to participate who were a “little bit tech savvy.”

From there, his IT team called teachers in for a demonstration of the Sony product they’d be using with their students. “That was it,” he says. “Maybe 30 minutes of Q&A with teachers, and off we went. The devices aren’t that complicated. You turn it on, pick your book, turn to the page, and that’s it.”

To make sure the program is on track, Kanavel has been doing evaluation of it in “real time.” “It’s not something we threw out there and said we’ll come back to you in six months. Every couple of weeks I’m pinging these teachers. They have direct lines back to me. As they’ve noticed things, they’ve emailed me.” Along with that, device maker Sony has put out surveys for the users too.

It’s Complicated

What complicates implementation of digital content in a 1:1 program is when the device being deployed is used for other purposes too. That’s the case at Lorain City School District in Ohio, which has distributed Acer netbooks to 9th, 10th, and 11th grade students. The goal there is to give its students access to technology and the wider world it can deliver. Many don’t have computers or an internet connection at home. Therefore, Chief Information Systems Officer Gary Brantley has chosen to implement WiFi on the devices.

The devices, which cost about $300 with software and maintenance, are loaded with a gigabyte of RAM, a 150 GB or 160 GB hard drive, an Intel Atom processor, a webcam, Windows XP Professional, Microsoft Office, a couple of calculators, 802.11 b/g WiFi, and, of course, digital textbooks.

Teachers have an interest in educating students about social networking, so, although access to the internet is filtered, the devices do allow access to sites such as Twitter and Facebook. But that, says Brantley, “is being carefully monitored.”

Also, connectivity is necessary for implementation of CompuTrace, a program from Absolute Software that provides a service for tracking down lost, stolen, or missing devices. “We were finding that we were spending a lot of money replacing textbooks,” Brantley explains. “Now, we actually are spending less. If CompuTrace doesn’t find the netbook within 60 or 90 days, they pay for it. I can tell you they have found every single one.”

To simplify operations, the district uses only two images for the netbooks. Every middle school book in use is on every middle school netbook; and the same with all high school books. That approach, says Brantley, makes IT’s work easier since they don’t have to worry about granular inventory or “fool around” with what books any given student should be able to access.

The district has tackled the challenge of teacher acceptance from multiple sides. First, there was a teachers’ union aspect. Would it promote the change in teaching approaches necessary for success? To gain support, Brantley took the head of the union to a 1:1 conference to show her what could be done. After that, he says, “She came on board for the professional development piece.”

The next aspect was putting together programs and teams for professional development. Since the district has an “early release” day once a week, “that’s the block of time that increasingly is being dedicated to helping teachers learn how to integrate the technology into their classes. Gaining traction in that area is a longer haul,” Brantley admits. “It takes a while to get teachers on board with this.”

Next up for the Lorain district: implementation of a teacher recognition program and some type of graduate credit to motivate the teachers to try out new methods of instruction.

An area where Brantley has seen success is having the kids teaching the teachers. “That’s one thing that we’ve been trying to push,” he says. “Don’t be afraid to let the kids show you something as well. It becomes a collaborative effort.”

Challenges have surfaced in two IT areas. First, the sheer number of new devices has put a strain on Brantley’s department, which has 10 employees. “We’ve doubled the number of computers in the district but didn’t add one staff member,” he says. Second, IT has to be able to supply technical support to students in a timely manner. “Turnaround can’t be longer than a day. Even though we have spares, we still have to turn around these machines really quickly, so kids aren’t left without their books.”

But these burdens aren’t slowing down the district’s dreams. Brantley says eventually the netbook and digital textbook program could be expanded to every student in the district, from the fourth grade up.

One year after Literacy Act implementation, test scores show improvement, but almost 12,000 students still falling behind

HUNTSVILLE, Ala. (WAFF) - If an Alabama law were fully implemented, 12,000 students across the state wouldn’t be moving on to the next grade. They’d be held back.

Last year, a portion of the Alabama Literacy Act went into effect. It was created to help improve reading in Alabama public schools and to ensure students are reading on grade level by the end of the third grade.

After one year, there has been only a small improvement in test scores. There is still a long way to go.

“Without that skill at the end of the third grade, they are four times more likely not to complete high school,” said Thomas Spencer, senior research associate at the Public Affairs Research Council of Alabama.

Spencer says the 2022 Alabama Comprehensive Assessment Program test scores show that 22 percent of third graders are not reading at a proficient level.

During the 2021 school year, Alabama implemented the Literacy Act curriculum to sharpen the focus on early grades reading.

“Particularly, students with learning disabilities and also students from economically disadvantaged backgrounds tend to not come into school with quite the level of preparation and exposure to literature and reading that other kids get,” Spencer said.

According to the test scores, Wilcox County had the lowest results, with 58 percent of third graders falling behind, while Mountain Brook City had the highest, with just three percent.

“Parents, teachers, and communities need to work together and identify those students who are struggling in reading and wrap the services around them as early as kindergarten,” Spencer said.

Originally, part of the act was to hold back any third-grade student not reading at a proficient level, but that portion of the act has been delayed until the 2023-24 school year.

You can find a link to the full study here.

Copyright 2022 WAFF. All rights reserved.

How DevOps works in the enterprise

It’s all about rapidity of release, but without sacrificing or compromising on quality in the digital world.

DevOps is an enabler of digital transformation.

How DevOps works in the enterprise is one of the key questions business leaders have been asking.

This relatively new discipline, which Atlassian describes as agile applied beyond the software team, is helping businesses release products fast, but without cutting corners — which is “the name of the game at the moment in the digital world”, according to Gordon Cullum, speaking as CTO at Mastek — now technology director at Axiologik.

Increasingly, DevOps is the style in which businesses want to interact with each other in the digital age; it’s about rapidity of release without sacrificing and compromising on quality.

Patrick Callaghan, vice-president, partner CTO at DataStax, goes one step further.

He suggests that businesses “can’t truly function as an enterprise without applying DevOps software development principles…. DevOps in practice is ideal for organisations looking to streamline production, automate processes and build a culture of collaboration within their software teams. DevOps innovators are confident in their code because they both test it and make it fail in order to produce reliable apps.”

How important is diversity in implementing a successful DevOps and IT strategy?

The importance of new ideas and embracing new ways of thinking can’t be underestimated when thinking about DevOps and IT. Read here

What is DevOps?

Before getting into how DevOps works, it’s important to understand what DevOps is.

Quoting AWS, ‘DevOps is the combination of cultural philosophies, practices, and tools that increases an organisation’s ability to deliver applications and services at high velocity: evolving and improving products at a faster pace than organisations using traditional software development and infrastructure management processes. This speed enables organisations to better serve their customers and compete more effectively in the market.’

This is a very practical explanation, but there are multiple definitions of the term.

It’s often described as a set of evolutionary practices inherited from agile ways of working, tuned to bringing the delivery and operational support communities closer together. This involves processes and tooling developed over the years for test automation, continuous integration and continuous deployment, enabling a faster flow of code. These new releases of code could be new functionality, architectural change or bug fixes.
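The gating behaviour at the heart of that pipeline (test automation feeding continuous integration and deployment) can be sketched in a few lines. This is a toy illustration only, not modeled on the API of any real CI/CD product; the stage names and functions are hypothetical:

```python
# A toy continuous-delivery pipeline: each stage must pass before the
# next runs, which is the gating behaviour CI/CD tooling automates.
def run_pipeline(stages):
    """Run (name, step) pairs in order; stop at the first failure."""
    results = []
    for name, step in stages:
        ok = step()
        results.append((name, ok))
        if not ok:
            break  # a failed stage halts the release
    return results

# Hypothetical stages standing in for real build/test/deploy jobs.
release = [
    ("unit tests",        lambda: True),
    ("integration build", lambda: True),
    ("deploy to staging", lambda: True),
    ("deploy to prod",    lambda: True),
]

for name, ok in run_pipeline(release):
    print(f"{name}: {'passed' if ok else 'FAILED'}")
```

The point of the sketch is the early exit: code only flows to production when every earlier quality gate has passed, which is what lets teams release quickly without sacrificing quality.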

“It’s a combination of keeping the lights on and changing delivery,” says Cullum.

DevOps resources

DevOps or disappear: 5 reasons business leaders need to embrace development and operational IT integration

What is the right storage software needed for DevOps to be a success?

3 DevOps pitfalls and how to avoid them

DevOps and CloudOps: The connection behind digital transformation acceleration

Why DevOps must become BizDevOps for business and IT collaboration

Best DevOps practices for 2019

The future of DevOps

Reinvigorating an old way of working

Bringing delivery and support together is a throwback to the 1980s and 1990s, “where IT just did IT and you didn’t care whether you asked them to fix a bug or deliver functionality,” continues Cullum.

This ethos is being reinvigorated in DevOps. But the reason it works and is more powerful today is because of the emergence of enabling technologies and new ways of working.

“While, 20 to 30 years ago we may have had JFDI approaches for getting stuff into live environments, what we now have are very controlled, measured processes, brought around by tools such as Puppet and Jenkins — these all create the robust, quality, managed pipeline that allows fast delivery,” explains Cullum.

Culturally, the discipline brings lots of old and new ideas together

Why DevOps now?

The reason DevOps has emerged now is that companies are involved in a highly competitive arms race.

Everything is accelerating so fast from a delivery point of view; if businesses can’t release code quickly, then they are probably already being disrupted. This brings challenges, but also provides advantages if you are already on that curve. Agile work patterns, for example, only really work if the organisation already has a relatively modern architecture.

The other area in the acceleration of DevOps is the emergence of cloud services. Over the last five to 10 years, the cloud has enabled very quick, easy and at times cost effective processes and techniques. These can be spun out in environments, infrastructures, platforms or whole services, and can be wired together very easily.

What this means is that architects are more able to build componentised architectures that are independently able to be released, modified and scaled from each other.

“So modern techniques, such as microservices and even serverless architectures, really accelerate the uptake of DevOps capabilities from a delivery and support point of view within an organisation,” says Cullum.

Bringing all these things together; the rise of cloud, the need to get things out faster but at a high quality, the rise of all the tooling that enables fast pipeline deliveries, changing culture and IT, what you’ve got is DevOps.

According to Statista, 21 per cent of DevOps engineers have added source code management to their DevOps practices, in the aim to accelerate the release of code.

DevOps vs Agile: pulling in the same direction in the enterprise

DevOps vs Agile. How do these two disciplines work in the enterprise, and why are they crucial in moving forward in a collaborative, customer-focused way? Read here

How DevOps works in the enterprise

What is the best approach organisations can take to DevOps? “It’s a horses-for-courses type of conversation,” answers Cullum. By this, he means there are a lot of “complications under the hood”.

The first thing for organisations would be to identify why they want to adopt DevOps, so “they can keep their eyes on the prize”.

“It’s not about a marketing term, it’s not about somebody at c-level saying we want to implement DevOps, go away and do it,” suggests Cullum. “You have to know why you’re trying to do it. What is it you want? Do you want repeatable quality? Do you want cheaper or faster deliveries? Do you recognise a need to modify the architecture,” he asks?

Gordon Cullum oversaw digital transformation company Mastek’s technology strategy as its CTO.

The leaders at legacy organisations, such as an older bank with monolithic environments, can’t just send their IT department on a DevOps training programme and expect them to be able to change the way they release software on mainframes. “It isn’t going to work like that,” suggests Cullum. In this scenario, there needs to be an architecture enablement programme that takes place, “which is how these legacy organisations can make sure that the services they deliver through the IT estate can be componentised in a way that delivery teams can run at their own pace.”

So, how DevOps works depends on the journey. There is no simple answer. But, the key takeaways for business leaders would be; don’t underestimate the cultural change required (people have to buy into the idea, similar to digital transformation), don’t rely too much on heavy documentation (you’re not going to know everything up front) and approach risk proactively (don’t be afraid of change).

If businesses then decide to implement DevOps within teams, from a process and method point of view, then these questions must be addressed: is your architecture able to support it? Is a leadership roadmap in place that creates the environment necessary to start delivering fast, high-quality, automated deliveries?

“It’s a good question and requires a very consultative answer,” says Cullum.

Addressing these six steps in the DevOps cycle will lead to organisation success in this discipline. Image source: 6 C’s of DevOps Life Cycle.

The DevOps workforce

As with any new disciple, even traditional ones in technology, the skills gap proves irksome. So, when implementing DevOps, should organisations retrain or bring in new talent?

It’s probably a bit of both, but the biggest thing people need is the right attitude. Mastek soon found this, according to Cullum. The programmers, designers and product managers who have been in the industry for 15 to 20 years are sometimes resistant to the change DevOps brings. They need to embrace a rapid change mindset, and accept that delivery and operations need to get closer together.

Generally, however, if “you aren’t already stuck in the mud at a senior level”, individuals in the industry are already well versed in the pace of change and in learning new techniques — they have to be “cross-skilled,” as Cullum describes.

Top DevOps interview questions revealed

Five experts provide Information Age with their top DevOps interview questions and answers, while revealing the skills and attitudes that impress them the most. Read here

Justifying this, he explains that what Mastek is finding is that it’s easier to train trainee engineers in new techniques, because they haven’t yet been conditioned to think in the older, waterfall-style ways of thinking.

“It’s harder to change attitude than it is to change a technology skill set,” he says. “So, we are cross-training and it’s working quite successfully, but we are seeing an accelerating effect by focusing on DevOps and agile techniques for our trainees.”

To close this gap, there are seven key skills for businesses to consider:

1. Flexibility
2. Security skills
3. Collaboration
4. Scripting skills
5. Decision-making
6. Infrastructure knowledge
7. Soft skills

DevOps: an essential part of digital transformation?

Digital transformation is a wholesale reinvention of business — embracing digital, culturally and technologically.

“If you’re not reinventing your business processes, then you are not doing a transformation,” points out Cullum.

But, if businesses are reinventing business processes, then by definition they’re probably going to be overhauling large chunks of their IT estate, including the aforementioned legacy.

Why do we need DevOps? For the business and consumer

Businesses — especially large enterprises — must embrace DevOps to challenge the competition and meet their consumers’ digital experience demands. Read here

By embarking on this journey, sooner or later, these transformative businesses will be moving into a modern-style architecture with different components and different paces of different deliveries.

“In our case, we often talk about pace-layered deliveries,” says Cullum. “You’re going to put a lot more focus in your systems of differentiation and innovation, and they have to have rapid, relatively robust change going in,” he says.

DevOps is the enabler of that.

If businesses aren’t doing DevOps (they might call it something else), or at least running repeatable, automated deployment and testing processes, then they are not embracing change and are not able to make releases at the speed of change.
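
A minimal sketch of such a repeatable, automated release process is a "release gate" that promotes a build only when every automated check passes. The Python below is purely illustrative (the check names and the gate itself are hypothetical, not from Mastek or the article):

```python
# Hypothetical release gate: a build is promoted only if every
# automated check passes; otherwise the failing checks are reported.

def run_checks(checks):
    """Run each named check; return (passed, list of failed check names)."""
    failures = [name for name, check in checks.items() if not check()]
    return (len(failures) == 0, failures)

def release_gate(build_id, checks):
    """Decide whether a build is released, based only on automated checks."""
    passed, failures = run_checks(checks)
    if passed:
        return f"build {build_id}: released"
    return f"build {build_id}: blocked by {', '.join(failures)}"

if __name__ == "__main__":
    # Simulated checks: in a real pipeline these would invoke test suites.
    checks = {
        "unit_tests": lambda: True,
        "integration_tests": lambda: True,
        "security_scan": lambda: False,  # simulate a failing scan
    }
    print(release_gate("1.4.2", checks))
```

The point is that promotion is decided by the same automated checks on every release, not by a manual sign-off.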

Why DevOps is important

DevOps, like digital, is an assumed norm now. It’s probably a little late to start thinking about it.

“If you aren’t already thinking about it or aren’t already doing it, you’re probably way behind the curve,” warns Cullum.

In digitally-resistant organisations it is likely that there are “guerrilla factions” that are trying DevOps. “In this case, you should probably go and look at what’s going on there and work out how you can industrialise that and scale it out,” he advises. “If you aren’t doing any of that, then you’re probably holding yourself back as a business.”

Some argue, however, it’s never too late to join the DevOps integration race.

The DevOps challenge: outdated IT estate architectures

The biggest DevOps challenge is that not all IT estate architectures are suitable for a DevOps approach… they are not modern. Read here

Business case study

Callaghan suggests that Netflix is a great example of making DevOps work for the business.

He says: “Netflix has used Apache Cassandra™ for its high availability, and to test for this they wrote a series of testing libraries called “Chaos Monkey.” For example, both “Chaos Kong” and “Chaos Gorilla” tests are used to decimate Netflix infrastructure to evaluate the impact on availability and function. As a result of the practice, Netflix is confident in their system and its reliability. DevOps software development practice enables Netflix to effectively speed up development and produce an always-on experience for their users.”
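
The idea behind a Chaos Monkey-style experiment can be sketched in a few lines of Python. This is an illustrative toy, not Netflix code: the Fleet class and its methods are invented, but they capture the practice of injecting a failure and then asserting that the service still serves traffic, rather than assuming it would.

```python
# Toy chaos experiment: kill random instances in a mock fleet and
# assert availability after every injected failure.

import random

class Fleet:
    def __init__(self, instances):
        self.healthy = set(instances)

    def kill_random_instance(self, rng):
        # Pick a victim deterministically given the RNG, then remove it.
        victim = rng.choice(sorted(self.healthy))
        self.healthy.discard(victim)
        return victim

    def serves_traffic(self):
        # The service stays up as long as at least one replica is healthy.
        return len(self.healthy) > 0

def chaos_experiment(fleet, kills, seed=0):
    """Inject `kills` failures; fail loudly if availability is ever lost."""
    rng = random.Random(seed)
    for _ in range(kills):
        fleet.kill_random_instance(rng)
        assert fleet.serves_traffic(), "availability lost under failure"
    return len(fleet.healthy)

if __name__ == "__main__":
    fleet = Fleet(["a", "b", "c", "d"])
    survivors = chaos_experiment(fleet, kills=3)
    print(f"{survivors} instance(s) healthy after chaos run")
```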

The DevOps engineer: fulfilling the software development life cycle

The DevOps engineer is becoming a more common presence in the enterprise. But, what exactly does the role entail and how can you become one? Read here


How to drive impact and change via DevOps — Stephen Magennis, managing director for Expleo Technology (UK technology), discusses how impact and change can be driven via DevOps.

How intelligent software delivery can accelerate digital experience success — Greg Adams, regional vice-president UK&I at Dynatrace, discusses how intelligent software delivery can accelerate digital experience success.

Nick Ismail, 1 August 2022
Himachal Pradesh Approves Drone Policy To Create Holistic Ecosystem, Boost Employment

The Himachal Pradesh Drone policy will help create a holistic drone ecosystem, which will be built upon the foundation of Governance and Reforms Using Drones (GARUD)

Students will reap the benefits of this drone policy by getting access to job opportunities in the drone sector

The policy will also propagate the use of drones and drone-enabled technology to students as well as the general public

The Himachal Pradesh Cabinet, led by Chief Minister Jai Ram Thakur, recently passed the Himachal Pradesh Drone Policy.

The policy will help create a holistic drone ecosystem, which will be built upon the foundation of Governance and Reforms Using Drones (GARUD), according to a press statement. 

It aims to tap digital sky opportunities by aligning with policies and frameworks including the National Education Policy 2020, the HP Industrial Investment Policy, the HP Startup/Innovation Scheme and the National Skill Qualification Framework.

Besides, students will reap the benefits of this drone policy by getting access to job opportunities in the drone sector. The policy will also propagate the use of drones and drone-enabled technology to students as well as the general public.  

The latest development has come at a time when the Centre is making efforts to accelerate the growth of India’s drone industry.

In the past, the Centre amended the Drone Rules 2021, thereby liberalising compliance norms, reducing restrictions on R&D activities and creating a framework for drone deliveries for drone manufacturers and startups.

The Centre additionally introduced the PLI scheme with a corpus of INR 120 Cr for drones and drone components. Under this scheme, the Ministry of Civil Aviation shortlisted five drone manufacturers and nine drone component manufacturers.

In May, the Ministry of Civil Aviation also notified that Digital Sky, an online platform for drone management, will be fully functional by October this year. The platform will offer various services to drone manufacturers and startups, helping them apply for remote pilot certificates, generate type and pilot certificates, and submit flight plans, among others.

Apart from the government, private players are also eyeing the burgeoning drone sector of India. Recently, Indian conglomerate Adani Enterprises’ Adani Defence Systems and Technologies acquired a 50% stake in drone startup General Aeronautics.

In April, foodtech startup Swiggy partnered with four drone startups including Garuda Aerospace, Skyeair Mobility, ANRA+TechEagle Consortia, and Marut Dronetech to pilot its grocery delivery project in Bengaluru and Delhi-NCR. 

Earlier this year, Zypp Electric also announced that it would commence drone deliveries in various cities of India including Bengaluru, Hyderabad, Mumbai, Pune, and Delhi-NCR.

Jaspreet Kaur, 7 June 2022
Understanding NIST’s post-quantum encryption standardization and next steps for CISOs

By Duncan Jones, Head of Cybersecurity at Quantinuum

In a recent National Security Memo (NSM-10), the White House acknowledged the need for immediacy in addressing the threat of quantum computers to our current cryptographic systems and mandated agencies to comply with its initial plans to prepare. It’s the first directive that mandates specific actions for agencies as they begin a very long and complex migration to quantum-resistant cryptography. Many of the actions required of agencies depend on new cryptographic algorithms that have just been chosen by the National Institute of Standards and Technology, although final standardization will take 18 to 24 months.

What should CISOs be doing to prepare for the risks of quantum computers and to comply with NSM-10 requirements? They should start by gaining an understanding of the new algorithm standards, and from there, focus on inventorying the agency’s most important information and assets. 

NIST to the rescue

In as little as a decade, quantum computers will break many of the encryption schemes in use today, such as the popular RSA algorithm that we use for encrypting internet data and for digitally signing transactions. An attacker with a powerful quantum computer will be able to read data encrypted by an RSA public key or forge transactions signed by an RSA private key. Worse, a category of attack known as “hack now, decrypt later” may already be under way. Attackers who record data encrypted with quantum-vulnerable algorithms now can retrospectively decrypt it in the future using quantum computers. For any agency or contractor that shares data with a long sensitivity lifespan, this is a real concern.

Fortunately, the academic world has not been sitting idle. Since 2016, NIST has been working with the cryptographic community to identify and standardize new quantum-proof encryption algorithms. The NIST process will help ensure that these algorithms become standardized in Federal Information Processing Standards publications and are ready for consumption by federal authorities. As such, it’s important for CISOs to familiarize themselves with the new algorithms and their properties.

Each post-quantum algorithm has three different security levels defined—SL1, SL3 and SL5. These levels are very similar to key sizes in today’s algorithms. Much like 4096-bit RSA keys are stronger than 1024-bit RSA keys, SL5 is stronger than SL3 and SL1. However, that increased security comes at a cost. SL5 keys are typically larger to store and result in slower computations. It’s also notable that post-quantum algorithms cannot be used for both encryption and data signing. Instead, they are used for only one task or the other. This means we will be replacing a single algorithm, such as RSA, with two separate algorithms.

The table below shows some of the characteristics of the selected algorithms.

Algorithm            Type               Family          Public Key Size    Ciphertext/Signature Size
CRYSTALS-KYBER       Key Establishment  Lattice-based   1.6KB - 3.1KB      0.8KB - 1.5KB
CRYSTALS-Dilithium   Signature          Lattice-based   2.5KB - 4.8KB      2.4KB - 4.6KB
Falcon               Signature          Lattice-based   1.2KB - 2.3KB      0.7KB - 1.3KB
SPHINCS+             Signature          Hash-based      0.03KB - 0.06KB    7.7KB - 49KB
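
As a toy illustration of the trade-offs in the table above, the Python below models the quoted size ranges (in KB) and picks one key-establishment and one signature algorithm, since, unlike RSA, no single post-quantum algorithm covers both jobs. The selection rule (smallest minimum output size) is invented for this sketch, not a NIST recommendation:

```python
# Size ranges in KB, approximated from the table above. The selection
# logic below is a made-up example for bandwidth-sensitive systems.

ALGORITHMS = [
    {"name": "CRYSTALS-KYBER",     "type": "kem",       "output_kb": (0.8, 1.5)},
    {"name": "CRYSTALS-Dilithium", "type": "signature", "output_kb": (2.4, 4.6)},
    {"name": "Falcon",             "type": "signature", "output_kb": (0.7, 1.3)},
    {"name": "SPHINCS+",           "type": "signature", "output_kb": (7.7, 49.0)},
]

def smallest_output(kind):
    """Pick the algorithm of a given type with the smallest minimum
    ciphertext/signature size."""
    candidates = [a for a in ALGORITHMS if a["type"] == kind]
    return min(candidates, key=lambda a: a["output_kb"][0])["name"]

if __name__ == "__main__":
    # Unlike RSA, one algorithm cannot do both jobs: pick one of each.
    print("KEM:      ", smallest_output("kem"))
    print("Signature:", smallest_output("signature"))
```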

For immediate action

According to NIST’s chief of the Computer Security Division, Matt Scholl, “…don't wait for the standard to be done. Start inventorying your most important information. Ask yourself what is that data that an adversary is going to want to break into first.”

According to NSM-10, leaders from the Office of Management and Budget, the Cybersecurity and Infrastructure Security Agency, NIST and the National Security Agency will be establishing requirements for inventorying all currently deployed cryptographic systems within six months of the May 4 memo. Within a year—and on an annual basis—“…heads of all federal civilian executive branch agencies shall deliver to the director of CISA and the national cyber director an inventory of their IT systems that remain vulnerable to CRQCs.”

Agency inventory requirements will include: 

  • A list of key information technology assets to prioritize
  • Interim benchmarks
  • A common—and preferably automated—assessment process for evaluating progress on quantum-resistant cryptographic migration in IT systems

Migrating an agency or department to a fully post-quantum position is a complex process that will take many years. Although these post-quantum algorithms will not be ready for widespread production use until the standardization process finishes in 2024, considerable work—now mandated under NSM-10 directive—must be done to prepare for these changes, starting with the inventorying process. 

Next steps for federal CISOs

Identify data assets and use of cryptography. Before you can prioritize migration, you need to understand exactly what data you have, and how vulnerable it is to attack. Data that is particularly sensitive and vulnerable to the “hack-now, decrypt-later” attacks should be prioritized above less sensitive data that isn’t transmitted freely. CISOs should start cataloging where quantum-vulnerable algorithms are currently being used. For a variety of reasons, not all systems will be affected equally. CISOs need a very clear picture of the vulnerabilities present in each of their systems.
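
A cataloguing exercise like this can start very simply. The sketch below (asset names, record fields and the ten-year horizon are all invented for illustration) flags assets that both use a quantum-vulnerable algorithm and stay sensitive long enough to be exposed to hack-now, decrypt-later:

```python
# Toy cryptographic inventory: flag and rank assets whose data is both
# protected by a quantum-vulnerable algorithm and sensitive for longer
# than the assumed arrival horizon of a relevant quantum computer.

QUANTUM_VULNERABLE = {"RSA", "ECDSA", "ECDH", "DSA", "DH"}

def prioritize(assets, horizon_years=10):
    """Return at-risk assets, longest sensitivity lifespan first."""
    at_risk = [
        a for a in assets
        if a["algorithm"] in QUANTUM_VULNERABLE
        and a["sensitivity_years"] >= horizon_years
    ]
    return sorted(at_risk, key=lambda a: a["sensitivity_years"], reverse=True)

if __name__ == "__main__":
    assets = [
        {"name": "citizen-records-archive", "algorithm": "RSA",  "sensitivity_years": 50},
        {"name": "public-web-tls",          "algorithm": "ECDH", "sensitivity_years": 1},
        {"name": "hr-payroll-db",           "algorithm": "AES",  "sensitivity_years": 20},
    ]
    for asset in prioritize(assets):
        print(asset["name"])
```

A real inventory would, of course, be populated from scans and configuration data rather than a hand-written list; the value is in making the prioritization criteria explicit and repeatable.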

Speak with vendors. Now is the perfect time to be asking your vendors about their plans for adopting post-quantum algorithms. A good vendor should have a clear roadmap already in place and be testing the candidate algorithms in preparation for 2024.

Test algorithms for home-grown software. Post-quantum algorithms have different properties than the algorithms we use today. The only way to know how they will affect your systems is to implement them and experiment. To assist with potential compatibility issues, NSM-10 encourages agency heads to begin conducting “…tests of commercial solutions that have implemented pre-standardized quantum-resistant cryptographic algorithms.” 

A good place to start is with the Open Quantum Safe project, which provides many different implementations of post-quantum algorithms designed for experimentation. 

Quantum is not all bad news. It is worth remembering that quantum computing also offers new techniques for strengthening existing systems. Quantum computers are already being used today to generate stronger cryptographic keys. In the future, once this migration to post-quantum algorithms is behind us, we’ll view quantum as a gift to cybersecurity, not a threat.

 Duncan Jones is the head of cybersecurity at Quantinuum.

26 July 2022
Surprise Senate vote would overturn Biden environmental rule

WASHINGTON (AP) — In a surprise victory for Republicans, the Senate on Thursday voted to overturn a Biden administration rule requiring rigorous environmental review of major infrastructure projects such as highways, pipelines and oil wells — an outcome aided by Democratic Sen. Joe Manchin of West Virginia.

Manchin, a key player on energy and climate issues and a swing vote in the closely divided Senate, joined Republicans to support the measure, which was approved 50-47. The vote comes as Manchin has proposed a separate list of legislative measures to speed up federal permitting for major projects in return for his support of a Democratic bill to address climate change.

Republicans voted unanimously to overturn the Biden permitting rule, while Manchin was the only Democrat to do so. Three senators were absent: Republican John Cornyn of Texas and Democrats Patrick Leahy of Vermont and Jeff Merkley of Oregon. The vote sends the measure to the Democratic-controlled House, where it is unlikely to move forward.

Still, the vote signaled strong Senate support for action to reform the often onerous federal permitting process, which can take up to eight to 10 years for highways and other major projects. Streamlining federal review is a top Manchin and GOP priority that is not shared by most Democrats.

Sen. Dan Sullivan, an Alaska Republican, sponsored the measure to overturn the Biden rule, saying new regulations under the National Environmental Policy Act, or NEPA, will further bog down the permitting process and delay critical infrastructure projects the country needs.

The Biden rule — which overturns an action by the Trump administration loosening environmental reviews — requires regulators to consider the likely impacts on climate change and nearby communities before approving major projects. The new requirement “is going to add to the red tape” that prevents major infrastructure projects from being approved in a timely manner, Sullivan said.

While President Joe Biden has called infrastructure a priority — and pushed for a $1 trillion bipartisan infrastructure law passed last year — the new NEPA rule actually “makes it harder to build infrastructure projects” in the United States, Sullivan said.

“The only people, in my view, who really like this new system are radical far-left environmental groups that don’t want to build anything ... and probably the Chinese Communist Party,” he said on the Senate floor. China and other competitors likely “love the fact that it takes 9 to 10 years to permit a bridge in the U.S.A.,” Sullivan said.

The White House threatened a veto if the measure reaches the president's desk.

“This action would slow the construction of American infrastructure, lead to the waste of taxpayer resources on poorly designed projects and result in unnecessary and costly litigation and conflict that will delay permitting,” the White House said in a statement Thursday.

Manchin countered that “for years I’ve worked to fix our broken permitting system, and I know the (Biden) administration’s approach to permitting is dead wrong.”

Manchin called Thursday’s vote “a step in the right direction” but said the measure likely “is dead on arrival in the House. That’s why I fought so hard to secure a commitment (from Democratic leaders) on bipartisan permitting reform, which is the only way we’re going to actually fix this problem.”

The new rule, finalized this spring, restores key provisions of NEPA, a bedrock environmental law that is designed to ensure community safeguards during reviews for a wide range of federal projects, including roads, bridges and energy development such as pipelines and oil wells. The longstanding reviews were scaled back under former President Donald Trump in a bid to fast-track projects and create jobs.

The White House Council on Environmental Quality said in implementing the new rule that it should restore public confidence during environmental reviews. The change could speed development by helping to "ensure that projects get built right the first time,” said CEQ Chair Brenda Mallory.

Projects approved by the Trump administration were frequently delayed or defeated by lengthy court battles from groups challenging environmental reviews as inadequate.

Manchin, who brokered a surprise deal last week on climate legislation with Senate Majority Leader Chuck Schumer, said he's won promises from Biden and Democratic leaders in Congress to pursue permitting reforms in the Senate to speed approval of projects in his energy-producing state and across the country. Manchin's wish list includes swift approval of the controversial Mountain Valley natural gas pipeline in his home state and Virginia. The pipeline is nearly complete but has been delayed for years by court battles and other issues.

Manchin’s list includes a number of proposals supported by Republicans, including a two-year deadline on environmental reviews; changes to the Clean Water Act; limitations on judicial review; and prompt action on projects determined by the Energy secretary to be in the national interest.

Environmental groups have decried Manchin's proposals as counter-productive to the climate legislation and a threat to the environment and communities where projects would be built.

Madeleine Foote, deputy legislative director of the League of Conservation Voters, dismissed the Senate vote Thursday as “nothing more than a Republican-led stunt to appease their fossil fuel-industry allies.”

Foote and other environmentalists said strong NEPA review is needed to ensure that those most affected by an energy project have a say in the projects built in their communities.

“Thorough, community-based environmental reviews are critical to helping eliminate environmental racism and making sure low-income communities and communities of color are protected from polluters who want to build dirty, toxic projects in their backyards,” Foote said.

She called on Congress to approve the Manchin-Schumer climate bill as soon as possible. Schumer said votes on the bill are likely this weekend.

Kabir Green, director of federal affairs at the Natural Resources Defense Council, another environmental group, said Americans are “seeing the effects of climate change in catastrophic detail, from the heat waves in Texas to wildfires in New Mexico to the devastating flooding in Kentucky. But the Senate is voting to prevent the federal government from considering climate change when making decisions. This makes no sense.”

4 August 2022
HP Announces Commencement Of Exchange Offer And Consent Solicitation For Plantronics Notes

PALO ALTO, Calif., June 27, 2022 (GLOBE NEWSWIRE) -- HP Inc. (NYSE: HPQ) (“HP” or the “Company”) announced today that it commenced a private exchange offer to certain eligible holders (the “Exchange Offer”) for any and all outstanding notes (the “Poly Notes”) issued by Plantronics, Inc. (NYSE: POLY) (“Poly”) for up to $500,000,000 aggregate principal amount of new notes to be issued by the Company (the “HP Notes”) and cash.

As previously announced, on March 25, 2022, the Company entered into a definitive agreement (“Merger Agreement”) to acquire Poly in an all-cash transaction for $40 per share, implying a total enterprise value of $3.3 billion, inclusive of Poly's net debt (the “Acquisition”). Pursuant to the Merger Agreement, a subsidiary of HP will merge with and into Poly, with Poly surviving the Acquisition as a wholly owned subsidiary of HP. The Exchange Offer and Consent Solicitation (as defined herein) are being conducted in connection with, and are conditioned upon, the completion of the Acquisition.

In conjunction with the Exchange Offer, HP is concurrently soliciting consents (the “Consent Solicitation” and, together with the Exchange Offer, the “Exchange Offer and Consent Solicitation”) to adopt certain proposed amendments to the indenture governing the Poly Notes (the “Poly Indenture”) to, among other things, eliminate from the Poly Indenture (i) substantially all of the restrictive covenants, (ii) certain of the events which may lead to an “Event of Default”, (iii) the restrictions on Poly consolidating with or merging into another person or conveying, transferring or leasing all or any of its properties and assets to any person, (iv) the reporting covenant and (v) the obligation to offer to purchase the Poly Notes upon certain change of control transactions (including the Acquisition) (collectively, the “Proposed Amendments”). The Proposed Amendments require the consent of the holders of not less than a majority in principal amount of the Poly Notes outstanding (the “Requisite Consent”). If the Requisite Consent is obtained, any remaining Poly Notes not tendered and exchanged for HP Notes will be governed by the amended indenture. The Exchange Offer and the Consent Solicitation are subject to the same conditions, and any waiver of a condition by HP with respect to the Exchange Offer will automatically waive such condition with respect to the Consent Solicitation, as applicable.

Upon consummation, the Acquisition will constitute a change of control under the Poly Indenture. Accordingly, pursuant to the existing terms of the Poly Indenture, HP would be obligated to make an offer to purchase the Poly Notes then outstanding at a purchase price equal to 101% of the principal amount of the Poly Notes thereof, plus accrued and unpaid interest, if any, to (but excluding) the date of repurchase, in connection with the consummation of the Acquisition (the “Poly Post-Acquisition Change of Control Offer”). However, if the Proposed Amendments are adopted, HP will no longer be obligated to make a Poly Post-Acquisition Change of Control Offer. The terms of the HP Notes will require HP to make an offer to purchase the HP Notes at a purchase price equal to 101% of the principal amount thereof, plus accrued and unpaid interest to (but excluding) the date of repurchase in connection with the consummation of the Acquisition.

The following table sets forth the Consent Payment (as defined herein), Exchange Consideration (as defined herein), Early Participation Premium (as defined herein) and Total Consideration (as defined herein) for the Poly Notes:

Title of Poly Notes: 4.750% Senior Notes due 2029
CUSIP / ISIN Nos.: 727493AC2 (144A) / U7260PAB7 (Reg S); US727493AC24 (144A) / USU7260PAB77 (Reg S)
Maturity Date: 03/01/2029
Principal Amount Outstanding: $500,000,000
Consent Payment (1): $2.50 in cash
Exchange Consideration (2): $970 principal amount of HP 4.750% Senior Notes due 2029*
Early Participation Premium (3): $30 principal amount of HP 4.750% Senior Notes due 2029*
Total Consideration (4): $1,000 principal amount of HP 4.750% Senior Notes due 2029* and $2.50 in cash


(1) For each $1,000 principal amount of Poly Notes accepted for exchange. On the Settlement Date (as defined herein), the Consent Payment will be paid to each eligible holder that validly tendered and did not validly withdraw Poly Notes at or prior to the Early Participation Date (as defined herein), even if such person is no longer the beneficial owner of such Poly Notes on the Expiration Date (as defined herein).
(2) For each $1,000 principal amount of Poly Notes accepted for exchange.
(*) The HP Notes will include a put right at 101% triggered upon consummation of the Acquisition.
(3) For each $1,000 principal amount of Poly Notes validly tendered and not validly withdrawn at or prior to the Early Participation Date. On the Settlement Date, the Early Participation Premium will be paid to each eligible holder who is a beneficial owner of such Poly Notes at the Expiration Date, and who validly tendered such Poly Notes at or prior to the Early Participation Date and did not validly withdraw such Poly Notes at or prior to the Expiration Date.
(4) For each $1,000 principal amount of Poly Notes validly tendered and not validly withdrawn at or prior to the Early Participation Date. Includes the Consent Payment, $970 of Exchange Consideration and the Early Participation Premium. For the avoidance of doubt, unless the Exchange Offer is amended, in no event will any holder of Poly Notes receive more than $1,000 aggregate principal amount of HP Notes for each $1,000 aggregate principal amount of Poly Notes accepted for exchange.

The Exchange Offer and Consent Solicitation is being made pursuant to the terms and subject to the conditions set forth in the confidential exchange memorandum and consent solicitation statement dated June 27, 2022 (the “Offering Memorandum and Consent Solicitation Statement”), and is conditioned upon, among other things, the closing of the Acquisition. The Exchange Offer will expire at 11:59 p.m., New York City time, on July 25, 2022, unless extended or terminated by HP (such date and time, as may be extended, the “Expiration Date”). Eligible holders of Poly Notes who validly tender and have not validly withdrawn their Poly Notes at or prior to 5:00 p.m., New York time, on July 11, 2022, unless extended or terminated (such date and time, as the same may be extended, the “Early Participation Date”), will be eligible to receive the Early Participation Premium (as defined herein). A consent may not be revoked after the earlier of (i) 5:00 p.m., New York City time, on July 11, 2022, unless extended or terminated, and (ii) the date the supplemental indenture to the Poly Indenture implementing the Proposed Amendments is executed (the earlier of (i) and (ii), the “Consent Revocation Deadline”). The Consent Solicitation will expire at the Early Participation Date. The settlement date (the “Settlement Date”) for the Exchange Offer will be promptly after the Expiration Date and is expected to occur no earlier than the closing of the Acquisition, which is expected to be completed by the end of the calendar year 2022, subject to customary closing conditions, including regulatory approvals.

For each $1,000 principal amount of Poly Notes validly tendered and not validly withdrawn at or prior to the Early Participation Date, eligible holders of Poly Notes will be eligible to receive the total consideration set out in the table above (the “Total Consideration”), which includes a consent payment of $2.50 in cash (the “Consent Payment”) and an early participation premium, payable in principal amount of HP Notes, of $30 (the “Early Participation Premium”). To be eligible to receive the Total Consideration, eligible holders must have validly tendered and not withdrawn their Poly Notes at or prior to the Early Participation Date and beneficially own such Poly Notes at the Expiration Date. For the avoidance of doubt, unless the Exchange Offer is amended, in no event will any holder of Poly Notes receive more than $1,000 aggregate principal amount of HP Notes for each $1,000 aggregate principal amount of Poly Notes accepted for exchange.

For each $1,000 principal amount of Poly Notes validly tendered and not validly withdrawn after the Early Participation Date and prior to the Expiration Date, eligible holders of Poly Notes will be eligible to receive $970 principal amount of HP Notes (the “Exchange Consideration”). To be eligible to receive the Exchange Consideration, eligible holders must validly tender (and not validly withdraw) their Poly Notes at or prior to the Expiration Date. If an eligible holder validly tenders and has not withdrawn their Poly Notes at or prior to the Early Participation Date and beneficially owns such Poly Notes at the Expiration Date, the eligible holder will instead receive the Total Consideration. An eligible holder that validly tenders Poly Notes and delivers (and does not validly revoke) a consent prior to the Early Participation Date, but withdraws such Poly Notes after the Early Participation Date but prior to the Expiration Date, will receive the Consent Payment, even if such eligible holder is no longer the beneficial owner of such Poly Notes on the Expiration Date.
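
For illustration only, the arithmetic of the two outcomes described above can be expressed as a short Python function. This deliberately simplifies the offer terms (for example, it ignores the case of an early tender that is later withdrawn, which still earns the Consent Payment):

```python
# Simplified illustration of the consideration per $1,000 principal:
# early tender  -> $970 Exchange Consideration + $30 Early Participation
#                  Premium in HP Notes, plus $2.50 cash Consent Payment;
# late tender   -> $970 in HP Notes, no cash.

def consideration(principal, tendered_early):
    """Return (HP Notes principal, cash) for a given Poly Notes principal."""
    units = principal / 1000
    if tendered_early:
        return (units * (970 + 30), units * 2.50)
    return (units * 970, 0.0)

if __name__ == "__main__":
    notes, cash = consideration(10000, tendered_early=True)
    print(f"${notes:,.0f} HP Notes + ${cash:.2f} cash")
```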

No accrued and unpaid interest is payable upon acceptance of any Poly Notes in the Exchange Offer and Consent Solicitation. The interest rate, interest payment dates, maturity and redemption terms of the HP Notes to be issued by HP in the Exchange Offer will be the same as those of the Poly Notes to be exchanged. The first interest payment on the HP Notes will include the accrued and unpaid interest from the date of the last interest payment made under the Poly Indenture on the Poly Notes tendered in exchange therefor so that a tendering eligible holder will receive the same interest payment it would have received had its Poly Notes not been tendered in the Exchange Offer and Consent Solicitation; provided that the amount of accrued and unpaid interest shall only be equal to the accrued and unpaid interest on the principal amount of Poly Notes equal to the aggregate principal amount of HP Notes an eligible holder receives, which may be less than the principal amount of corresponding Poly Notes tendered for exchange if such holder tenders (and does not subsequently withdraw) its Poly Notes after the Early Participation Date. For the avoidance of doubt, to the extent the interest payment date for the Poly Notes occurs prior to the Settlement Date, holders who validly tendered and did not validly withdraw Poly Notes in the Exchange Offer and Consent Solicitation will receive accrued and unpaid interest on such interest payment date as required by the terms of the Poly Indenture.

Documents relating to the Exchange Offer and Consent Solicitation will only be distributed to eligible holders of Poly Notes who complete and return an eligibility certificate confirming that they are either a“qualified institutional buyer” under Rule 144A or not a“U.S. person” and outside the United States under Regulation S for purposes of applicable securities laws, and a non U.S. qualified offeree (as defined in the Offering Memorandum and Consent Solicitation Statement). The complete terms and conditions of the Exchange Offer and Consent Solicitation are described in the Offering Memorandum and Consent Solicitation Statement, copies of which may be obtained by contacting D.F. King & Co., Inc., the exchange agent and information agent in connection with the Exchange Offer and Consent Solicitation, at (888) 605-1956 (toll-free) or (212) 269-5550 (banks and brokers), or by email at . The eligibility certificate is available electronically at: and is also available by contacting D.F. King & Co., Inc.

This press release does not constitute an offer to sell or purchase, or a solicitation of an offer to sell or purchase, or the solicitation of tenders or consents with respect to, any security. No offer, solicitation, purchase or sale will be made in any jurisdiction in which such an offer, solicitation or sale would be unlawful. The Exchange Offer and Consent Solicitation is being made solely pursuant to the Offering Memorandum and Consent Solicitation Statement and only to such persons and in such jurisdictions as are permitted under applicable law.

The HP Notes offered in the Exchange Offer have not been registered under the Securities Act of 1933, as amended, or any state securities laws. Therefore, the HP Notes may not be offered or sold in the United States absent registration or an applicable exemption from the registration requirements of the Securities Act of 1933, as amended, and any applicable state securities laws.

About HP Inc.

HP Inc. (NYSE: HPQ) is a technology company that believes one thoughtful idea has the power to change the world. Its product and service portfolio of personal systems, printers, and 3D printing solutions helps bring these ideas to life. Visit

Forward-looking statements

This document contains forward-looking statements based on current expectations and assumptions that involve risks and uncertainties. If the risks or uncertainties ever materialize or the assumptions prove incorrect, the results of HP and its consolidated subsidiaries may differ materially from those expressed or implied by such forward-looking statements and assumptions.

All statements other than statements of historical fact are statements that could be deemed forward-looking statements, including, but not limited to, any statements regarding the consummation of the Acquisition; the potential impact of the COVID-19 pandemic and the actions by governments, businesses and individuals in response to the situation; margins, expenses, effective tax rates, net earnings, cash flows, benefit plan funding, deferred taxes, share repurchases, foreign currency exchange rates or other financial items; any projections of the amount, timing or impact of cost savings or restructuring and other charges, planned structural cost reductions and productivity initiatives; any statements of the plans, strategies and objectives of management for future operations, including, but not limited to, our business model and transformation, our sustainability goals, our go-to-market strategy, the execution of restructuring plans and any resulting cost savings, net revenue or profitability improvements or other financial impacts; any statements concerning the expected development, demand, performance, market share or competitive performance relating to products or services; any statements concerning potential supply constraints, component shortages, manufacturing disruptions or logistics challenges; any statements regarding current or future macroeconomic trends or events and the impact of those trends and events on HP and its financial performance; any statements regarding pending investigations, claims, disputes or other litigation matters; any statements of expectation or belief, including with respect to the timing and expected benefits of acquisitions and other business combination and investment transactions; and any statements of assumptions underlying any of the foregoing. 
Forward-looking statements can also generally be identified by words such as “future,” “anticipates,” “believes,” “estimates,” “expects,” “intends,” “plans,” “predicts,” “projects,” “will,” “would,” “could,” “can,” “may,” and similar terms.

Risks, uncertainties and assumptions include factors relating to the consummation of the Acquisition and HP's ability to meet expectations regarding the accounting and tax treatments of the Acquisition; the effects of the COVID-19 pandemic and the actions by governments, businesses and individuals in response to the situation, the effects of which may give rise to or amplify the risks associated with many of these factors listed here; the need to manage (and reliance on) third-party suppliers, including with respect to component shortages, and the need to manage HP's global, multi-tier distribution network, limit potential misuse of pricing programs by HP's channel partners, adapt to new or changing marketplaces and effectively deliver HP's services; HP's ability to execute on its strategic plan, including the previously announced initiatives, business model changes and transformation; execution of planned structural cost reductions and productivity initiatives; HP's ability to complete any contemplated share repurchases, other capital return programs or other strategic transactions; the competitive pressures faced by HP's businesses; risks associated with executing HP's strategy and business model changes and transformation; successfully innovating, developing and executing HP's go-to-market strategy, including online, omnichannel and contractual sales, in an evolving distribution, reseller and customer landscape; the development and transition of new products and services and the enhancement of existing products and services to meet evolving customer needs and respond to emerging technological trends; successfully competing and maintaining the value proposition of HP's products, including supplies; challenges to HP's ability to accurately forecast inventories, demand and pricing, which may be due to HP's multi-tiered channel, sales of HP's products to unauthorized resellers or unauthorized resale of HP's products or our uneven sales cycle; integration and other 
risks associated with business combination and investment transactions; the results of the restructuring plans, including estimates and assumptions related to the cost (including any possible disruption of HP's business) and the anticipated benefits of the restructuring plans; the protection of HP's intellectual property assets, including intellectual property licensed from third parties; the hiring and retention of key employees; the impact of macroeconomic and geopolitical trends, changes and events, including the Russian invasion of Ukraine and its regional and global ramifications and the effects of inflation; risks associated with HP's international operations; the execution and performance of contracts by HP and its suppliers, customers, clients and partners, including logistical challenges with respect to such execution and performance; changes in estimates and assumptions HP makes in connection with the preparation of its financial statements; disruptions in operations from system security risks, data protection breaches, cyberattacks, extreme weather conditions or other effects of climate change, medical epidemics or pandemics such as the COVID-19 pandemic, and other natural or manmade disasters or catastrophic events; the impact of changes to federal, state, local and foreign laws and regulations, including environmental regulations and tax laws; potential impacts, liabilities and costs from pending or potential investigations, claims and disputes; and other risks that are described (i) in “Risk Factors” in the Offering Memorandum and Consent Solicitation Statement and (ii) in our filings with the SEC, including but not limited to the risks described under the caption “Risk Factors” contained in Item 1A of Part I of our Annual Report on Form 10-K for the fiscal year ended October 31, 2021, as well as in Item 1A of Part II of our Quarterly Reports on Form 10-Q for the fiscal quarter ended January 31, 2022 and the fiscal quarter ended April 30, 2022.
HP does not assume any obligation or intend to update these forward-looking statements.

Media Contacts

HP Media Relations

HP Inc. Investor Relations



IEEE Top 2022 Medal Of Honor Awardee, Dr. Madni Shares Deep Lessons

The IEEE Medal of Honor, established in 1917, is IEEE's highest award. Dr. Asad M. Madni is the 2022 recipient, recognized for decades of outstanding global contributions and innovations. His remarkably compelling career; globally useful insights and lessons; captivating pioneering stories of innovation, invention, and leadership; and predictions and recommendations for the future are explored in this extensive interview, which is unscripted and provided in full below.

The IEEE (Institute of Electrical and Electronics Engineers), its roots dating back to 1884 and with more than 420,000 members in 160-plus countries, is the world's largest technical professional organization dedicated to advancing technology for the benefit of humanity. Dr. Madni embodies the excellence of this iconic organization.

This article is based upon insights from my daily pro bono work, across more than 100 global projects and communities, with more than 400,000 CEOs, investors, scientists, and notable experts.

Dr. Asad M. Madni’s Brief Profile

Dr. Madni’s profile fills volumes due to hundreds of extensive contributions. A summary can be found at the IEEE TEMS (see the interview series - Stephen Ibaraki - “Transformational Leadership and Innovation...”). The direct link to the interview page contains the summary profile and video interview.

Just one example of Asad’s lasting and continuing global impact and influence can be found in my Forbes article on biomedical innovation, which spotlights the work of the Terasaki Institute for Biomedical Innovation (TIBI), ranked among the top institutions in the field. Dr. Asad M. Madni is a founding member of TIBI’s Leadership Board.

Here’s a very short abstract from the profile summary.

A Chat with Dr. Asad M. Madni, 2022 IEEE Medal of Honor Recipient (IEEE's highest award):

-Pioneering Inventor and Globally Transformational Innovator, Entrepreneur;

-Chairman / CEO / President / COO / CTO;

-Top Distinguished Scientist;

-Worldwide Contributions recognized with 84 major honors, 6 honorary doctorates and 6 professorships, 69 issued/pending patents, 200+ refereed publications;

-Top philanthropist with endowed scholarships, educational programs, and initiatives for empowerment involving financial inclusion and underrepresented minorities.

Interview with Dr. Asad M. Madni

AI was employed to generate the transcript, which was then edited for brevity and clarity while preserving the cadence of the chat. The AI transcription is roughly 80% accurate, so viewing the full video interview is recommended for complete precision. Time stamps are provided, with the caveat that they are approximate.

The interview is recommended for all audiences, from students to global leaders in government, industry, NGOs, the United Nations, scientific and technical organizations, academia, education, media, translational research and development, interdisciplinary and multidisciplinary work, and much more.

Stephen Ibaraki 00:00

Hey Asad, thank you for coming in today, and congratulations on achieving IEEE's highest award, the Medal of Honor, through this considerable history of success. You're a serial entrepreneur, you've had exits, you've invented and created many, many different kinds of technologies, which you instantiated into research but also translated into companies. And then you are advising companies and investments and so on. It's just really remarkable to have one person sit across all of these different domains. So again, thank you for coming in (and, first of all, for providing inflection points in your life that led to your remarkable history of success).

Asad Madni 00:36

Stephen, thank you. Actually, I should be the one saying thank you, looking at the outstanding and exemplary interviews that you have conducted with so many people. The unique depth and breadth with which you conduct these interviews has been very inspiring. So let me tell you that I am a fan of yours. And with that, perhaps I should say in response to your question...

The digital revolution of the late 1960s started defining the architectures of next-generation electronic systems and instrumentation. The advent of semiconductor memories, high-speed and high-resolution analog-to-digital converters (ADCs), and microprocessors offered capabilities that were not possible to realize with analog systems alone. This, coupled with electronic miniaturization and digital signal processing (DSP) techniques, allowed systems to be smaller, smarter, cheaper and more reliable. I had graduated from UCLA with my undergraduate and graduate degrees in electrical sciences and engineering, with specialization in electronic systems, and was enthusiastically seeking creative challenges and employment where I could utilize my education to advance my field in a significant way.

The opportunity arrived in 1975 when Systron Donner Corporation's Microwave Division (SDMD) offered me a position as a Project Engineer to develop the company's first spectrum analyzer with a digital storage display. SD was one of three leading companies specializing in Radio Frequency (RF) and Microwave components and instrumentation (the other two being HP and Tektronix). The company's line of spectrum analyzers utilized bulky, expensive, analog, variable-persistence storage tubes, which had severe limitations including display flicker, poor reliability and the inability to view multiple waveforms simultaneously. I was promised a technician, a junior engineer and an assembler who would be devoted solely to my project. Gullible as I was in those days, I believed I would receive this support, only to realize shortly thereafter that due to "emergencies on other projects", I would be the lone warrior championing this project.

In hindsight, I believe that this was the greatest learning experience of my life. Not only did I end up designing and developing the world’s first digital storage spectrum analyzer, but I learnt the value of proper soldering, circuit layout, the interface between analog and digital circuits, noise reduction techniques, and above all designing for cost. This system replaced the analog storage tube with the recently introduced semiconductor Random Access Memory-based digital display. This led to a revolution of features in spectrum analyzer capabilities, including simultaneous viewing of multiple images, adaptive sweep, digital baseline clipper, electronic cursor, data normalization, automatic bandwidth adjustment, network analysis capabilities, voice interaction, etc. It transformed the landscape of not only spectrum analysis but spawned a whole new era of low-cost, highly powerful digital-based instrumentation, including network analyzers, sweep generators, frequency synthesizers, etc.
The innovations and resulting patents, with me as the sole inventor, were truly seminal: they established a multi-million-dollar test and measurement market and the basis for the powerful test and measurement capabilities that we enjoy today.
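The digital-storage concept described above (ADC samples written into semiconductor RAM in place of a storage tube, enabling flicker-free refresh, simultaneous traces, and an electronic cursor) can be sketched in a few lines. This is an illustrative model only; the class and method names are invented for this example and do not reflect the original design.

```python
# Illustrative sketch (not the original design): a RAM-backed trace store
# of the kind that replaced analog variable-persistence storage tubes.
class DigitalStorageDisplay:
    def __init__(self, points_per_sweep=512, num_traces=4):
        self.points = points_per_sweep
        # RAM replaces the storage tube: one buffer per trace, so several
        # stored traces can be displayed simultaneously and refreshed
        # flicker-free from memory.
        self.traces = [[0.0] * points_per_sweep for _ in range(num_traces)]

    def store_sweep(self, trace_index, samples):
        """Write one digitized sweep (a list of ADC samples) into trace memory."""
        if len(samples) != self.points:
            raise ValueError("sweep length must match display resolution")
        self.traces[trace_index] = list(samples)

    def max_hold(self, trace_index, samples):
        """Keep the peak value seen at each frequency bin across sweeps."""
        t = self.traces[trace_index]
        self.traces[trace_index] = [max(a, b) for a, b in zip(t, samples)]

    def cursor_readout(self, trace_index, bin_index):
        """Electronic cursor: read the amplitude stored at one frequency bin."""
        return self.traces[trace_index][bin_index]

disp = DigitalStorageDisplay(points_per_sweep=8, num_traces=2)
disp.store_sweep(0, [0, 1, 2, 3, 4, 3, 2, 1])
disp.max_hold(0, [5, 0, 0, 0, 0, 0, 0, 0])
```

Because the traces live in memory rather than in tube persistence, features like max-hold and cursors become simple array operations, which is the heart of the revolution the passage describes.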

With this success, I was now on my way to bigger challenges. I will focus on 3 truly revolutionary technologies that I spearheaded and that were inflection points in my career.

Stephen Ibaraki 05:39

That's pretty remarkable, the work that you did, coming in and creating and inventing new methods and utilizing the latest chip technologies. Let's continue the story of all of this innovation.

Asad Madni 06:03


The analysis of communications systems composed of transmission lines and antennas was for years considered complex and time consuming by the microwave industry. When multiple faults (impedance mismatches) existed on a line, it was impossible to accurately measure their severities (due to line attenuation losses and power reflections from previous faults) and locations using historical measurement techniques such as conventional reflectometers (CR), time domain reflectometers (TDR), and swept frequency techniques (SF). Additionally, each of these techniques: a) was paralyzed in the presence of in-band, external, interfering signals; b) was unable to separate harmonics from reflections due to equally spaced faults (in general caused by connections between equal-length waveguides or cables); and c) could not provide usable location accuracy when the transmission line was connected to an extremely narrow-band antenna.

In the mid-1970s the Naval Surface Weapons Center (NSWC), Dahlgren, Virginia, established the Combat Readiness Electromagnetic Analysis and Measurement (CREAM) Program. As part of this program, NSWC issued an RFP soliciting innovative solutions to provide an accurate and fast analysis of RF and microwave communication systems that included waveguides, coaxial cables, antennas, and in certain cases directional couplers. It was further mandated that system operation and interpretation be simplified to a point where an E-3 technician would be capable of performing the entire test, reliably determining system faults, and fixing them.

I introduced the concept of using DSP techniques in conjunction with Frequency Domain Reflectometry (FDR) to develop a stand-alone system which would be capable of identifying the true severities and locations of multiple faults along a coaxial or waveguide transmission line/antenna system (within inches and within minutes). It is worthwhile to note here that besides my small team, which included a full-time technician, a junior engineer, and an occasional part-time software programmer, no one in upper management had sufficient confidence or understanding to believe that my concepts would work and result in a usable system. Needless to say, my project was classified as extremely "high-risk" and I was funded in a most frugal manner. I was, however, extremely fortunate that the late Dr. Robert J. Haislmaier of the Office of the Chief of Naval Operations and the CREAM project manager, Robert B. Windle, believed in me and assured me that if I could develop a prototype system that could perform the tasks in the time that I claimed, I would be funded sufficiently well to take this system into full-blown production and that it would become standard equipment for testing the communication system on every US naval vessel. The resulting system that I subsequently invented was referred to as the AN/PSM-40 Antenna Test Set, and its commercial version as the Transline Analyzer®. This was a landmark contribution in the area of RF and microwave instrumentation and system design. It also served as the topic of my doctoral dissertation.
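The core FDR idea (sweep the frequency, record the complex reflection coefficient, and transform to the distance domain, where each fault appears as a peak at its round-trip delay) can be demonstrated with a minimal simulation. This is a textbook illustration only, not the patented correlation and interpolation algorithm of the AN/PSM-40; the lossless-line assumption and all parameter values are mine.

```python
# Textbook FDR sketch: synthesize the reflection coefficient of a line with
# point faults, then IFFT to the distance domain to locate them.
import numpy as np

C = 3e8            # assumed propagation velocity in the line (lossless)
N = 1024           # number of swept frequency points
DF = 1e6           # frequency step (Hz)

def reflection_response(fault_positions_m, fault_magnitudes):
    """Synthesize Gamma(f): each point fault adds a delayed reflection."""
    f = np.arange(N) * DF
    gamma = np.zeros(N, dtype=complex)
    for d, m in zip(fault_positions_m, fault_magnitudes):
        tau = 2 * d / C                      # round-trip delay to the fault
        gamma += m * np.exp(-2j * np.pi * f * tau)
    return gamma

def locate_faults(gamma, threshold=0.05):
    """IFFT to the distance domain; report peaks above the threshold."""
    h = np.abs(np.fft.ifft(gamma))
    per_bin = C / (2 * N * DF)               # range resolution per bin
    return [(i * per_bin, h[i]) for i in range(N // 2) if h[i] > threshold]

dist_per_bin = C / (2 * N * DF)              # ~0.146 m per bin here
d1, d2 = 100 * dist_per_bin, 300 * dist_per_bin
faults = locate_faults(reflection_response([d1, d2], [0.3, 0.2]))
```

Both faults come back at their true distances with their true severities in a single pass, which is the key capability the passage attributes to the new technique; the real system additionally had to handle lossy lines, interfering signals, and narrow-band antennas.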

The patented correlation and interpolation techniques for microwave signals were a major breakthrough in overcoming the limitations identified above, and this stand-alone system replaced nine instruments that had taken highly trained personnel weeks to perform the measurements with much lower accuracy. Key features of the new technique included determination of the severity and location of multiple mismatches in a single pass.

NSWC had me making presentations to various departments of the Navy in order to receive production funding. My big moment came when Dr. Haislmaier arranged for me to make a presentation to a select group of Admirals at the Pentagon, followed by a shipboard test on the USS John F. Kennedy (CV-67) at Norfolk, Virginia. My presentation was extremely well received by the Admirals, but the shipboard test posed some unique challenges. The system showed that there was a discontinuity at a particular location in the waveguide/antenna communication system. The engineers swore that the system had made an erroneous measurement, since at the specified location there was only a curved piece of waveguide (with the curve pointing downwards) and there were no connectors remotely close to it. I requested that they carefully disconnect that entire section of the waveguide without re-orienting it. When I received the waveguide, I turned it vertical only to show that a large amount of water had collected at the curve. We replaced the waveguide after removing the water and re-ran the test, which indicated that the particular location in question was now fine. This convinced every skeptic in the area, and I received my first million-dollar contract for SDMD.

The system's performance was tested extensively under laboratory, field, and shipboard conditions, and it eventually went into full-blown production and continues to be in the US Navy inventory. This system has unlocked revolutionary innovation value from the US Navy's $160B annual budget and has long been standard test equipment for the US Navy, exponentially enhancing its combat readiness as well as that of the allies that adopted it.

When year-end bonus time came, my technician and I received several handshakes and numerous words of praise, while the marketing and engineering upper management shared the cash bonus for winning this major award. I realized then that when I reached a position of authority, I would rectify these types of practices, and I indeed did so.

Stephen Ibaraki 15:40

Again, remarkable innovation, but also sitting at the boundary of what you know, instead of the near impossible, making it possible. And resiliency, perseverance, and commitment; this concept of GRIT, from Angela Duckworth, sort of a variation of GRIT. But combined with perseverance and optimism, and pushing through against incredible challenges, and yet succeeding, and then learning the lessons of reward and recognition and providing another way of balancing that. So let's continue this journey.

Asad Madni 16:20


Since the acquisition of the major assets of Systron Donner by BEI Technologies, Inc. in 1990, I was involved in the development of several advanced systems, each of which posed its own unique challenges. In particular, I would like to highlight the development of an extremely slow-motion, dual-axis servo control system for the Hubble Space Telescope's (HST) star selector, which provided the HST with unprecedented pointing accuracy and stability, resulting in truly remarkable images that have enhanced our understanding of the universe. This system allows a fine lock to the guidance system, thereby providing the highly stable reference required for pointing the HST: a pointing accuracy equivalent to pointing at the face of a US quarter as seen from 200 miles away, and a pointing stability of less than the width of the quarter over a 24-hour period. This required the development of optical encoding technology with previously unachieved accuracies (up to 23 bits of resolution), together with advanced actuation and signal processing techniques that allow the HST to scan a portion of the sky while orbiting Earth at approximately 18,000 mph. The system is still in use, 30 years since it was launched in 1990, with its pointing accuracy and stability resulting in over one million truly remarkable images, such as the discovery of Pluto's moons and the formation of galaxies thousands of light-years away.
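The pointing analogy and the encoder figure above can be checked with quick back-of-envelope arithmetic (my numbers, not from the article): a US quarter (24.26 mm) at 200 miles subtends roughly 0.016 arcseconds, and a 23-bit encoder resolves about 0.15 arcseconds per count over a full revolution.

```python
# Back-of-envelope check of the "US quarter at 200 miles" pointing analogy
# and the 23-bit optical encoder figure quoted in the passage.
import math

QUARTER_DIAMETER_M = 0.02426       # US quarter, 24.26 mm
DISTANCE_M = 200 * 1609.344        # 200 statute miles in meters
RAD_TO_ARCSEC = 180 / math.pi * 3600

# Angle subtended by the quarter: the scale of HST's pointing accuracy.
quarter_arcsec = QUARTER_DIAMETER_M / DISTANCE_M * RAD_TO_ARCSEC

# Smallest step of a 23-bit encoder over a full 360-degree revolution.
encoder_arcsec = 360 * 3600 / 2**23
```

The quarter works out to about 0.0155 arcseconds and one encoder count to about 0.154 arcseconds, which shows why 23-bit resolution combined with interpolating servo control was needed to hold the telescope at milliarcsecond-scale stability.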

Stephen Ibaraki 19:43

You have this revolutionary MEMS gyro chip technology. It's quite remarkable and something that consumers and business people can relate to. It goes beyond the domain of the military, chip guidance and things like that, and systems and testing systems on chips. It's also used in the consumer marketplace with stability control. Can you talk a little bit more about that?

Asad Madni 20:18


(This material is adapted from previously published articles by Madni et al.)

In the early 1990s, after BEI had acquired the major assets of Systron Donner Corp., the SD Inertial Division (SDID) celebrated 40 years of excellence in satisfying the inertial needs of Aerospace & Defense (A&D) markets, primarily with a product line of high-precision accelerometers for space, missile and aircraft applications. The company acquired a new MEMS rate gyroscope technology concept based on a Coriolis force tuning fork, but the technology had yet to be commercialized for high-volume production. The end of the Cold War forced a significant reduction in SDID's overall business as older product demand declined while the new technology had not yet penetrated significant markets.

The Quartz Rate Sensor (QRS) exhibited promise for manufacturing with high-volume methods; however, the low production-volume demand before 1995 could not justify the capital expense to automate the low-volume, labor-intensive manufacturing methods. We clearly needed a growth strategy to take advantage of the promise of the QRS.

After performing an extensive market survey, we identified a significant growth opportunity for an extremely low-cost solid-state rate gyroscope for automotive stability control brake systems. Since gyroscopes had never been engineered and adapted for automotive service, the application represented a challenging new and emerging market. “Stability Control” (SC) systems measure the vehicle's yawing (turning) rate, and a brake computer compares it to the desired yaw rate derived from the driver's steering wheel command. A skid condition is detected by an out-of-tolerance comparison in a software algorithm. This detection causes a momentary automatic application of either the left or right brake(s) to correct or “stabilize” the vehicle. SC systems enhance the safety of traditional Antilock Brake Systems (ABS) for a relatively small increase in cost. The automotive application required a gyroscope with extreme reliability, very low cost, built-in-test capability and high-volume manufacturability. The MEMS QRS conceptually met all of these requirements. In my capacity as President, COO and CTO of BEI, I made the strategic decision, together with BEI Chairman Charles Crocker, to target the automotive requirement and to initiate conversion from an exclusively A&D business.

Among the most formidable challenges the company faced were the massive cultural and infrastructure changes which had to be made over the next five years to accommodate this new business mentality while not abandoning the A&D business. Several areas were impacted, including: the quality system, the Enterprise Resource Planning (ERP) computer system, Electronic Data Interchange (EDI) customer ordering, statistical process controls, factory automation, technology road-mapping techniques for continuous cost reduction, engineering design and validation techniques for lowest unit cost, and development of a global provider and customer base.
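The skid-detection logic described above (compare the gyro-measured yaw rate to the yaw rate implied by the driver's steering command, and command a corrective brake when the difference is out of tolerance) can be sketched as follows. Real electronic stability control uses full vehicle-dynamics models; the bicycle-model estimate, the threshold value, and the command names here are illustrative assumptions.

```python
# Minimal illustrative sketch of the stability-control comparison:
# measured yaw rate vs. the yaw rate the steering command implies.
import math

YAW_TOLERANCE_DEG_S = 4.0   # hypothetical out-of-tolerance threshold

def desired_yaw_rate(steering_angle_deg, speed_mps, wheelbase_m=2.7):
    """Kinematic (bicycle-model) yaw rate implied by the steering command."""
    return math.degrees(speed_mps * math.tan(math.radians(steering_angle_deg))
                        / wheelbase_m)

def stability_command(measured_yaw_deg_s, steering_angle_deg, speed_mps):
    """Return a corrective brake command, or None if the car tracks the driver."""
    error = measured_yaw_deg_s - desired_yaw_rate(steering_angle_deg, speed_mps)
    if abs(error) <= YAW_TOLERANCE_DEG_S:
        return None                 # within tolerance: no intervention
    # Yawing faster than commanded (oversteer) vs. slower (understeer);
    # which individual wheel is braked is simplified to a single label here.
    return "brake_outer" if error > 0 else "brake_inner"
```

For example, a measured yaw of 10 deg/s while driving straight at 20 m/s exceeds the tolerance and triggers a corrective command, while 2 deg/s does not.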

Under my leadership, QRS manufacturing processes and techniques were re-designed for mass-production primarily in fork fabrication, fork balancing and hermetic packaging, and final assembly, calibration and test. All labor-intensive processes were replaced by automation and proofing against human error. Continuous cost reductions were planned with five-year Technology Roadmaps. Products achieved the primary customer needs of performance specifications and continuous fault detection capabilities for safety-critical applications. A common quartz fabrication facility served both A&D and automotive product lines. Other low cost, high volume automotive components were fed into selected A&D products for extremely favorable cost benefits. These major changes (based on leveraging the A&D technology into the automotive and transportation markets, and reverse leveraging high volume, low-cost automotive components back into A&D markets) allowed SDID to dramatically ramp up production, shipping millions of units while it continued serving the A&D market.

The QRS technology, eventually called the GyroChip®, met key performance requirements for the A&D gyroscopes that were orders of magnitude more demanding than the automotive application; these were focused on micro-miniature Inertial Measurement Units (IMUs) and other high-performance solid-state rate gyro applications. The automotive challenge hinged on reducing the unit cost of the GyroChip® to “double-digit” dollar levels from the three- and four-digit levels common to A&D products. The cost reduction occurred primarily through selective investments in automation, significant advances in design techniques, and mass-production techniques.

The classic semiconductor-industry technique of putting more chips on a silicon wafer was embraced by progressively moving from one tuning fork per wafer to 2, 4, 8, 16, and 56, all on the same size wafer. This batch manufacturing, together with laser trimming and electronically programmed calibration techniques, radically reduced the cost per tuning fork. Not only was the performance degradation normally caused by fork-size reduction in such mass-based sensors significantly mitigated; the revolutionary tuning-fork designs actually improved performance.

Another major hurdle was to implement self-monitoring for safety-critical systems. Rate gyroscope applications frequently occur in systems that create a dangerous situation if the gyro fails without the host system detecting that the rate-sensing device is providing faulty information. Automotive stability control brake systems generate direction-changing brake commands independent of the driver. For this reason, this “safety system” may become an “unsafe system” if it erroneously activates the brakes. Unlike other limited self-tests, the BEI GyroChip® was embedded with a patented technique called Continuous Built-in-Test (CBIT), which can monitor end-to-end sensor and electronics health continuously during operation. CBIT is a major contributor to stability control brake system safety.
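The general idea behind continuous built-in test can be sketched abstractly: superimpose a known reference stimulus on the sensing path and verify on every output sample that it survives the end-to-end signal chain. This is a generic illustration of the concept only; the patented CBIT technique itself is not detailed in the article and is not reproduced here, and all names and values below are hypothetical.

```python
# Generic sketch of continuous built-in test: a known reference amplitude
# is expected to appear, within tolerance, in every processed sample.
TEST_TONE = 0.5          # known injected reference amplitude (hypothetical)
TOLERANCE = 0.1          # allowed deviation before declaring a fault

def cbit_check(recovered_test_amplitude):
    """Healthy only if the injected reference survives the full signal chain."""
    return abs(recovered_test_amplitude - TEST_TONE) <= TOLERANCE

def sample_with_cbit(rate_output, recovered_test_amplitude):
    """Return (rate, healthy) for every sample: the monitoring never stops."""
    return rate_output, cbit_check(recovered_test_amplitude)
```

The contrast with a power-on self-test is that the health verdict is produced alongside every rate sample, so a host system such as a brake computer can ignore the gyro the moment its output becomes untrustworthy.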

The first high-volume production of the GyroChip® Yaw Rate Sensor commenced in June 1996 for application in the Cadillac StabiliTrak™ brake system. An unexpected event in the fall of 1997 determined that the market would mushroom at rates that were multiples of 100 percent per year. This event was the loss of control and rollover of a new European vehicle by a Swedish automobile magazine editor while executing a maneuver to simulate avoidance of an animal, such as an elk, crossing the road. This created a firestorm of adverse publicity, and in reaction to the “Elk Maneuver” episode, the manufacturer committed to fitting all future vehicles with stability control. European manufacturers seized on a marketing opportunity and decided to significantly ramp up their offerings of stability control. GyroChip® demand skyrocketed by a factor of 10 to over 400,000 in the first full year of production for Europe. By 2002, SDID had shipped well in excess of 5,000,000 GyroChips®.

We had to learn how to walk and run at the same time. With focused leadership and pioneering technical innovations, the GyroChip® evolved from a single-axis Yaw Rate Sensor to a fully packaged Automotive Inertial Measurement Unit.

Using the success and knowledge gained through this low-cost manufacturing experience, SDID leveraged this engine of low cost, high-volume products back into the A&D market and made major inroads in the commercial aviation market.

The GyroChip® technology revolutionized navigation and stability in aerospace and automotive systems. Its applications range from satellites to airliners and cars. It is in use worldwide in over 80 models of passenger cars for automotive stability and rollover protection systems; in over 90 types of aircraft including the ACE Pitch Stability Control of Boeing 777/stretch 777; as Yaw Damper for over 3000 Boeing 737s; in most business jets for attitude heading and reference; and for guidance, navigation & control in major missile, UAV, helicopter & space programs. It is also the main technology for Rockwell Collins’ Pro-Line 21 & GHC-3000 avionics suites; NASA’s "Sojourner", EVA robotic camera-AerCAM Sprint, numerous satellites; and ARCHER airborne Hyperspectral Imaging System used by Civil Air Patrol in Search/Rescue and Disaster Management missions.

The re-engineering of BEI that I led resulted in one of the most successful "Defense Conversion" stories. The GyroChip® cost was reduced by orders of magnitude, and the sensor met with unprecedented acceptance by the automotive industry. Electronic stability and rollover prevention are of paramount importance to human safety, which, thanks to our efforts, we all enjoy when driving a car or flying on an airplane.

By 2005, over 55 million GyroChips® had been produced; their use for stability augmentation in passenger cars has saved millions of lives around the globe. This technology, together with associated sensing technologies that my team and I developed, generated over $2 billion in revenues for BEI, while enabling hundreds of billions of dollars in revenue for its customers across the globe.

Simply put, the GyroChip is to autonomous vehicles what the atom is to matter: a fundamental building block without which today's progress toward autonomous-vehicle ubiquity would not be possible. We paved the way for a new industry that will span most of the 21st century.

Under my leadership, BEI became the world's largest independent provider of MEMS Yaw and Roll/Stability Control (RSC) sensors for cars. In 2005, together with Charles Crocker, I spearheaded the sale of BEI Technologies to Schneider Electric for approximately $600 million. In addition to serving as President, COO and CTO of BEI Technologies, I was appointed Chief Technology Officer of Schneider Electric's Custom Sensors & Technologies (CST) Group (2005-2006), which included all the BEI companies, Kavlico and Crouzet Automation (revenues of roughly $1 billion and 3000+ employees). I declined the CEO position for Schneider's CST in favor of planned retirement from BEI in 2006 in order to pursue activities with the US National Academy of Engineering and as a Distinguished Adjunct Professor and Distinguished Scientist at UCLA. The commercial portion of CST's business, consisting of products and technologies that my team and I invented or whose research and development I led, was sold to Sensata in 2015 for $1 billion.

Stephen Ibaraki 40:33

So let's take a minute here. I'm just going to summarize a little bit for the audience, because you've covered so much. Initially, you described your tenure as an engineer with Systron Donner Corporation, where you were for 18 years. You worked, developed, and pioneered; had tremendous challenges; seminal and pioneering developments in RF and microwave systems and instrumentation, which proliferated into combat readiness with the US Navy and the Department of Defense, and into simulating more threat-representative ECM environments for current and future advanced warfare training, and so on; just a remarkable history. You eventually became chairman, president and CEO. Because it's just repeated contribution after contribution, after contribution. You then went into a company called BEI Technologies Inc., headquartered in California. You were responsible for the development and commercialization of intelligent micro sensors and systems for aerospace and military, transportation, and commercial applications, including, and let's separate this now, the extremely slow motion servo control system for the Hubble Space Telescope's Star Selector System, which is unprecedented in its accuracy and stability. And you gave some really interesting measures: this idea of, at a long distance, seeing a quarter dollar, and really resulting in all the remarkable images that we see. Separately from that, you led this journey of the revolutionary MEMS GyroChip® technology, which today is used worldwide in commercial aviation, Stability Control and Rollover Protection in passenger vehicles, and future autonomous vehicles; saving millions of lives; a very, very successful innovation. You became president and chief operating officer, CTO of that organization. 
There was a $600 million acquisition by Schneider Electric, but then within Schneider you, in turn, had an executive position and continued this innovation journey of remarkable contribution, of which those portions of Schneider that were acquired later went for over a billion. Just literally a continual series of translational leadership, taking research into commercialization, and taking on that cost factor, which is something that a lot of researchers don't understand. As you indicated, if you have an unlimited budget, you can develop to any kind of performance standard, but when you have to do it with a cost constraint, or to keep it affordable so that it could be scalable globally, that introduces a whole number of other challenges, which you have repeatedly surmounted. I now want to go into the more engineering contribution side, and for example, UCLA and so on. Did I capture that summary correctly?

Asad Madni 43:45

I think you captured it extremely well. The only thing I would add is how BEI Technologies was formed and listed on NASDAQ. When the parent company of Systron Donner Corporation decided to divest all of its US aerospace and defense companies, BEI Electronics, a company in motion control systems, pointing systems, and sensors such as pressure sensors and position sensors, acquired the major assets of Systron Donner Corporation. These included the inertial division and other sensing divisions, but not the microwave division where I was; however, at that point I was leading the sale of Systron Donner's major assets to BEI Technologies, in 1990. So the combined company was known as BEI Technologies. Charles Crocker stayed as the Chairman (and CEO); I as the President, Chief Operating Officer, and CTO, just as a clarification.

Stephen Ibaraki 44:44

I can see this journey; that's a narrative of continuing development but also with colleagues that you trust and form relationships with and a similar mindset, as you continue, from Systron Donner to BEI Technologies, and then to excel. Okay, now let's get into your current areas of interest and your view of trends. We only have about 15 minutes left.

Asad Madni 45:08

Okay, we've got very little time left, so I will now do what you'd asked me: I'll tell you a few things that I'm working on right now, and then I'll give you a little bit of my vision for the future.


One of my current projects is a different technical approach based on the Photonic Time-Stretch Technology that was developed in Prof. Bahram Jalali's lab at UCLA. I provided the innovations in time- and frequency-domain signal processing techniques.

In our digital world we convert real-time analog signals by digitizing them with analog-to-digital converters (ADCs) and then process them. This, however, places a limit on the bandwidth of the signal we wish to digitize, set by the conversion speed of the ADC. As a result, extremely high-frequency signals or one-time rare events cannot be digitized, placing a limit on system performance. As opposed to traditional approaches, time-stretch transformation of wideband waveforms boosts the performance of ADCs and digital signal processors by slowing down analog electrical signals before digitization, enabling ultra-high-throughput, precision capture of wideband analog signals so they can be digitized in real time with a slower, higher-resolution, more energy-efficient ADC.

The photonic time-stretch front-end consists of a femtosecond mode-locked laser (MLL) creating a linearly chirped optical signal, which is modulated by the incoming electrical data. As a second step, the data signal is stretched in time by propagation through a dispersive fiber, which reduces its analog bandwidth to fit within that of the digitizer.
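As a rough numerical analogy (my sketch, not a photonic simulation), the bandwidth-compression idea can be shown in a few lines: a time-stretch by factor M maps a signal s(t) to s(t/M), dividing its frequency content by M so that a slower ADC can digitize it, after which the original frequency is recovered by undoing the stretch. All numbers here (a 40 GHz tone, a 10 GS/s ADC, M = 10) are hypothetical illustrations.

```python
import numpy as np

# Hypothetical parameters, for illustration only.
M = 10            # stretch factor (set by the dispersion ratio in a real system)
f_in = 40e9       # 40 GHz input tone: beyond a 10 GS/s ADC's 5 GHz Nyquist limit
adc_rate = 10e9   # ADC sample rate

# The ADC sees the stretched signal, whose frequency is scaled down by M.
n = 4096
t = np.arange(n) / adc_rate
captured = np.cos(2 * np.pi * (f_in / M) * t)

# Recover the original frequency from the digitized record by undoing the stretch.
spectrum = np.abs(np.fft.rfft(captured))
f_axis = np.fft.rfftfreq(n, 1 / adc_rate)
f_measured = f_axis[np.argmax(spectrum)] * M

print(f"recovered frequency: {f_measured / 1e9:.1f} GHz")  # ~40.0 GHz
```

The same scaling argument is why a one-time 40 GHz transient, unreachable by the ADC directly, becomes a 4 GHz event after a tenfold stretch and can be captured in a single shot.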

Photonic time-stretch has enabled the development of various high-throughput, real-time instruments for science, medicine, and engineering applications. The technology has been employed for the discovery of "rare events" such as optical rogue waves (mariners have known for centuries that freak, giant waves can appear out of the blue in the ocean) and soliton explosions (striking nonlinear dynamics in dissipative systems, in which a dissipative soliton collapses but then returns to its original state). Time-stretch was used to directly observe relativistic electron bunch microstructures with sub-picosecond resolution in a storage ring accelerator. It has enabled record throughput in instruments such as serial time-encoded amplified microscopy (STEAM) and high-speed quantitative phase imaging for label-free detection of cancer cells in blood with a sensitivity of one cell in a million. A time-stretch accelerated processor (TiSAP) was used to perform real-time, in-service signal integrity analysis of 10 Gigabit/s streaming video packets for the first time on a commercial optical networking platform, and for ultra-wideband single-shot instantaneous frequency measurements. Various research groups across the world have adopted time-stretch as a technique for the characterization of ultrafast phenomena and for increasing the resolution limits of high-speed ADCs.


Utilizing AI and ML techniques to create cost-effective, high-performance sensors that can be commercialized for various applications.

Wearable sensors utilizing wireless sensing technologies for non-intrusive monitoring of biological data.


Today we have unbelievable technology at our disposal. Three important examples are: a) low-cost miniaturized sensors utilizing MEMS and nanotechnology, which makes them ubiquitous in everyday applications; b) miniaturization and increased density of memory chips, together with cloud computing, for data storage, computation, manipulation of data, and signal processing; and c) AI and ML to provide intelligence that can handle large amounts of data and perform previously unimagined tasks.

Heterogeneous Integration and Performance Scaling: Interpret and implement Moore's Law to include all aspects of heterogeneous systems, and develop architectures, methodologies, designs, components, materials, and manufacturable integration schemes that will shrink system footprint and improve power and performance.

An area of great importance where advances are being made, as engineering strives to better human lives, is human-centered technologies—enabled by converging engineering advances in sensing, computing, machine learning, and data communication—which will draw on machine intelligence (MI) to help understand, support, and enhance the human experience. The challenge is to create technologies that work for everyone while enabling tools that can illuminate the source of variability or difference of interest.


I believe that AI and ML will play extremely important roles in taking us to the next level in several areas. AI refers to a form of intelligence demonstrated by machines that traditionally has been associated with humans or animals, from simple robots ("Talos" in ancient Greece 2000 years ago) to the self-driving cars of today that seek to replace a human driver. These examples, both ancient and modern, fall under the realm of "weak AI," which is pre-programmed to address tasks that would otherwise have been given to a human.

The question that arises is: where will the field go next? Professor Achuta Kadambi (of UCLA) and I wrote an essay on this topic for the 50th anniversary commemorative edition of the NAE's journal, The Bridge.

We emphasized that the untapped future of AI, where revolutionary progress awaits, lies in “strong AI”, where machines act as a teacher to humans. When humans learn from such machines it is possible to receive unexpected insights that yield a change in practice.

One future of strong AI lies in scientific discovery, a disruptive tool to unblock stagnated fields of science. Where a human can only apply the same known techniques in their arsenal, the unexpected insights from an AI might be the wiggle that is needed to get the wagon wheel out of the rut.

Consider the field of physics. The last 30 years have seen little progress on fundamental questions like explaining wave function collapse (in quantum mechanics, wave function collapse occurs when a wave function, initially in a superposition of several eigenstates, reduces to a single eigenstate due to interaction with the external world; this interaction is called an "observation"). Part of the challenge is that physical observations have become much more expensive to collect (the so-called "Big Science") and also difficult for humans to interpret. From Newton to Einstein, we have seen a remarkable jump in the complexity of the observations required to validate a theory.

Today, however, we have something that neither Einstein nor Newton had: ever-increasing computational power. This motivates a new paradigm for physics, which we refer to as “artificial physics”. The artificial physicist could operate in a way that is almost contradictory to a human. Where a human can test a small set of curated theories on a sparse set of data, a machine can test a huge number of combinatorial possibilities on massive datasets. It is certainly a radical change in approach, but hopefully one that can yield a radical change in results. For example, consider a computer program that can re-discover Einstein’s famous equations. We have not yet observed a technology that can automatically intuit these equations - one of the challenges is that Einstein’s equations are a human-interpretable construct - but a solution might build upon work in symbolic equation generation.
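As a toy illustration of this "artificial physicist" idea (my sketch, not a method from the essay), one can brute-force a space of candidate symbolic power laws against synthetic data and let the machine "rediscover" a known equation. Here the kinetic-energy relation E = ½mv² stands in for a physical law, and the exhaustive search over exponents is a stand-in for real symbolic equation generation.

```python
import itertools
import numpy as np

# Synthetic "observations" generated from a hidden law: E = 0.5 * m * v^2.
rng = np.random.default_rng(0)
m = rng.uniform(1, 10, 50)
v = rng.uniform(1, 10, 50)
E = 0.5 * m * v**2

# The "artificial physicist": test every candidate power law C * m^a * v^b,
# fitting C by least squares for each (a, b), and keep the best-fitting form.
best = None
for a, b in itertools.product(range(0, 4), repeat=2):
    basis = m**a * v**b
    C = np.sum(E * basis) / np.sum(basis**2)   # closed-form least-squares coefficient
    err = np.mean((E - C * basis)**2)
    if best is None or err < best[0]:
        best = (err, a, b, C)

err, a, b, C = best
print(f"E = {C:.2f} * m^{a} * v^{b}")  # recovers E = 0.50 * m^1 * v^2
```

Real symbolic-regression systems search far richer expression grammars than this, but the contrast is the point: where a human curates a handful of theories, the machine exhaustively tests combinatorial possibilities against the data.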

However, the road ahead to scientific discovery is not easy. For the moment, human engineers and computer scientists will have to create the "artificial physicist." We will struggle with questions of interpretability. If the artificial physicist were based on a deep neural network, how would one enforce human interpretability? In other words, how does one guarantee that the artificial physicist's output is an equation that meaningfully maps to something humans can interpret?

The future of AI lies in grappling with these nuanced challenges. There are multiple frontiers that could be explored. A first frontier lies in interpretability. If a machine is to teach humans new insights, both partners must speak the same language. Imagine if a hybrid team could be formed where two physicists work together: one an artificial intelligence, the other real. A second frontier relates to novel algorithms and architectures to implement AI. Today, neural networks ("deep learning") are the dominant approach to implementing weak AI. However, such methods are pre-programmed rather than self-thinking. Yet a third frontier of AI lies in unblocking traditional fields: not just physics, but chemistry, medicine, and engineering. The word choice of "unblocking" is deliberate. It is one thing to use AI as a tool to augment human performance in a field, much as computers augment the author searching for a word definition. It is entirely different to have the AI drive the research field in unexpected and meaningful directions. An example of unblocking in action can be found in the optical sciences. Progress in optical design long held that Fourier coded apertures were optimal. With the advent of AI, optical scientists have been successfully using AI algorithms to create unexpected aperture masks that depart from, yet also outperform, Fourier masks. For thousands of years humans have been teaching AI to do our chores. It might be time that we let AI teach us how to innovate in new and unexpected ways.

Stephen Ibaraki 54:58

I'm going to inject some ideas because we're getting into time constraints now. I think the work of Judea Pearl at UCLA is really interesting, the causal models that he's created. I think it's interesting that LaMDA from Google, GPT-x (beyond the current widely used GPT-3) from OpenAI, Gato from DeepMind, and other hybrid models, multimodal models that are coming out, in terms of AGI, more of a generalization combined with all of the aspects you talked about, and exascale supercomputing in its intersection with quantum computing (see my Forbes article with Jack Dongarra, pioneer in high-performance computing) and perhaps even analog computing, even the work of Pattie Maes at Fluid Interfaces, MIT Media Lab (see my Forbes article with Pattie); I just want to inject into the audience those areas of confluence of many interesting ideas that you just suggested now. Let's now go to your final recommendations to the audience.

Asad Madni 55:58


1. Believe in yourself

2. You will encounter Ultracrepidarians—Ignore them

(Ultracrepidarian is a person who expresses opinions on matters outside the scope of their knowledge or expertise)

3. Don’t operate out of fear of failure

4. Do not become a victim of your successes (Learning Individual and Learning Organization)

“I was educated once—it took me years to get over it” – Mark Twain

(Deeper truth than one initially realizes!)

5. Knowledge is no longer power. What you do with it is.

6. Be creative and imaginative

“Imagination is more important than knowledge” – Albert Einstein

Stephen Ibaraki 58:12

Very profound and really great recommendations for the audience, for everybody to follow in terms of their life, their passions, their career, and so on. Thank you for coming in today and sharing your deep insights with our audience; just a remarkable history of success! And again, congratulations on achieving the IEEE Medal of Honor for 2022, for transformational inventions and pioneering work. And you continue to do so as well. So thank you again for coming in.

Asad Madni 58:44

Stephen, my thanks to you for this interview, for all that you do to keep our community updated on the latest cutting-edge research with the giants of our field, and for the exquisite way, the exemplary manner, in which you conduct your interviews. And I'm honored to have been interviewed by you. Thank you. Take care.

Published: Sun, 10 Jul 2022, by Stephen Ibaraki
Literacy Act implementation test scores show improvement; nearly 12,000 students still falling behind

HUNTSVILLE, Ala. (WAFF) - If an Alabama law were fully implemented, 12,000 students across the state wouldn’t be moving on to the next grade. They’d be held back.

Last year, a portion of the Alabama Literacy Act went into effect. It was created to help improve reading in Alabama public schools and to ensure students are reading on grade level by the end of the third grade.

After one year, there has been only a small improvement in test scores. There is still a long way to go.

“Without that skill at the end of the third grade, they are four times more likely not to complete high school,” said Senior Research Associate for Public Affairs Research Council of Alabama Thomas Spencer.

Spencer says the 2022 Alabama Comprehensive Assessment Program test scores show that 22 percent of third graders are not reading at a proficient level.

During the 2021 school year, Alabama implemented the Literacy Act curriculum to sharpen the focus on reading in the early grades.

“Particularly, students with learning disabilities and also students from economically disadvantaged backgrounds tend to not come into school with quite the level of preparation and exposure to literature and reading that other kids get,” Spencer said.

According to the test scores, Wilcox County had the lowest results, with 58% of third graders falling behind, while Mountain Brook City had the highest, with just three percent falling behind.

“Parents, teachers, and communities need to work together to identify those students who are struggling with reading and wrap services around them as early as kindergarten,” Spencer said.

Originally, part of the act was to hold back any third-grade student not reading at a proficient level, but that portion of the act has been delayed until the 2023-24 school year.


Copyright 2022 WAFF. All rights reserved.

Published: Mon, 25 Jul 2022