0day updated free DP-203 Exam Questions with 100% pass guarantee

killexams.com lets you download a 100% free DP-203 Exam Questions sample and evaluate the quality of the content. Our DP-203 study guide questions contain a complete exam question collection. We offer 3 months of free updates for the Data Engineering on Microsoft Azure test prep questions. Our team works constantly behind the scenes, updating the DP-203 mock exam as and when needed.

Exam Code: DP-203 Practice exam 2022 by Killexams.com team
Data Engineering on Microsoft Azure
Microsoft Engineering Topics
Killexams : Microsoft Cloud and AI chief Scott Guthrie on what's new and next for Microsoft Cloud customers
Credit: Microsoft

Microsoft Executive Vice President Scott Guthrie runs some of the biggest businesses at the company. He's in charge of Microsoft's Cloud + AI Group, and oversees everything from Azure and Microsoft's Data Platform, to Dynamics 365, the Power Platform and GitHub. I had a chance to talk to him on Teams at the end of Day 2 of Microsoft's Ignite 2022 conference, which is focused on the IT pro and dev audiences.

Here's a transcript of our conversation, edited slightly for clarity.  

MJF: For anyone who's interested in the Microsoft Cloud, what do you consider to be the top three or four "hidden gems" at Ignite? I'm thinking of topics which may not have gotten enough attention, or things that people might be interested in if they knew more about them.

Guthrie: I think I'd say one macro trend that we've talked about before this Ignite obviously is the integration of the subcomponents of the Microsoft Cloud, and the fact that you can take advantage of these pieces that are not required but are preintegrated to get faster value.

I think a hidden gem would be a lot of the continued enhancements we're doing with Power Platform, Power Apps, Power Automate, and in particular some of the AI capabilities we're adding into it, including some of the GPT3 work around auto-completing Copilot functions for Power Apps. Because it connects with Teams, it connects with Azure, it connects with Office 365 and Dynamics, that would be a hidden gem from my perspective.

I think the PostgreSQL support that's coming to Cosmos DB is another one that developers are going to be really excited about, and really lets you leverage the global scale and extreme capabilities of Cosmos DB from a performance and scale perspective, with one of the most popular relational database APIs, which is PostgreSQL, on top.

I think the work we're doing with Viva Sales, which again takes advantage of that integration of the Microsoft Cloud and enables you to dramatically improve your sales experience in the context of Teams, and integrates with both Dynamics 365 as the CRM, but also Salesforce as a CRM. And that ability to dramatically improve the productivity of your sales organization, regardless of what existing system of record you have, I think is a hidden gem for a lot of customers. It basically means every customer can benefit from it.

MJF: I wanted to ask you more about Dynamics and Azure together, because as you mentioned, there's a lot of synergy there with the Industry Cloud work that you're doing now. But I'm curious from an engineering standpoint what is going on with Dynamics and Azure, because I feel like the organizations are getting increasingly close. How do the AI capabilities in Dynamics get there? Do they come from Azure or do AI capabilities developed in the Dynamics org come over to Azure?

Guthrie: Both teams work for me, and I think to your question, the fact that they seem increasingly integrated and leveraged across each other is by design. Dynamics 365 is an Azure-native app. It really takes full advantage of Azure up and down the stack, including the AI capabilities. It uses our Azure Communication Service to integrate in Teams and to do video and telephony. It uses our Cosmos DB and SQL DB data services as a PaaS (platform as a service) service. It integrates with Synapse. It uses a common data model. And it integrates with Teams and the Office 365 offerings, obviously in a very deep way as well. It's I think one of the best examples of kind of deep integration up and down the stack that we've done.

And there's times when they are leveraging the core AI capabilities of Azure directly and building on top, and there are definitely times where Dynamics 365 has used the core Azure platform to build something that we then moved down into Azure, because we think it is now broadly applicable to even more customers and more use cases.

And so, I think you're going to continue to see us drive that integration and that synergy up and down the stack, and I think that's part of what makes the Microsoft Cloud special.

MJF: What's an example of something Dynamics developed that came back to Azure?

Guthrie: I think one of the areas that has really helped improve Azure has been the way Dynamics 365 does database management. It's all built on top of SQL DB and it's per tenant, so each customer has its own SQL DB database. And that has been instrumental to building out a more general-purpose SaaS (software as a service) platform inside the Azure SQL database.

Things like elastic pool support, things like bring your own key, things like just the ability to manage millions of databases as a SaaS provider, a lot of those features were built originally in Dynamics, and we've now moved down into the core underlying Azure SQL database platform that now allows any SaaS company out there to be able to leverage that same capability.
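As an aside for readers, here is a minimal, hypothetical sketch of the database-per-tenant pattern Guthrie describes: the application resolves a dedicated database per customer before opening a connection. The tenant names, server, and connection-string format are placeholders for illustration, not Dynamics 365's actual implementation.

```python
# Minimal sketch of database-per-tenant routing (hypothetical names and values).
# Each tenant gets its own database; the app resolves the right connection
# string before connecting, in the spirit of the SaaS pattern described above.

TENANT_DATABASES = {
    "contoso": "tenant-contoso-db",
    "fabrikam": "tenant-fabrikam-db",
}

def connection_string(tenant_id: str,
                      server: str = "example-server.database.windows.net") -> str:
    """Build a per-tenant connection string (placeholder format)."""
    try:
        database = TENANT_DATABASES[tenant_id]
    except KeyError:
        raise ValueError(f"Unknown tenant: {tenant_id}")
    return f"Server={server};Database={database};Authentication=ActiveDirectoryDefault"

if __name__ == "__main__":
    # Each request is served from the requesting tenant's own database.
    print(connection_string("contoso"))
    print(connection_string("fabrikam"))
```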

MJF: I have an all-up AI question. I'm wondering if Microsoft's strategy is to create more commercial Azure AI services, or is it to integrate AI capabilities into existing Microsoft products like Power Automate and Microsoft Designer, or is it both?

Guthrie: Both. I think there's going to be places where we will deliver applications and workloads in use cases that leverage AI to solve very specific problems. GitHub Copilot would be an example, or the new Digital Contact Center work that we're doing would be another example. Nuance with the health care solutions that they're delivering would be a third.

But we are also very much a platform company, and we recognize that one application does not solve every possible use case in the world. And so, to the extent that we can also offer those services as lower-level APIs that either a business can build on directly, or other software vendors can build on and sell to businesses, we think that's the best opportunity for everyone.

For each of those use cases I mentioned, GitHub Copilot, Digital Contact Center, as well as Nuance DAX, each of those, also the underlying APIs and AI platform capabilities are available to be used directly as well.

MJF: I noticed (CEO) Satya Nadella talked at length about the partnership between Microsoft and OpenAI during the opening Ignite keynote. Microsoft also has the Azure service called Azure OpenAI, which Microsoft announced this week is getting DALL-E 2 as an additional capability. What's next for Azure OpenAI, and how much of a role does OpenAI, the company, play in what's going on with Azure OpenAI?

Guthrie: Well, we partner very closely. And so, the underlying models in Azure OpenAI are built together with OpenAI.

The thing that we add with Azure OpenAI on top of the raw models is even more enterprise controls: compliance, security, and additional enterprise capabilities, like locking down virtual networks or bringing your own keys. For the things that a lot of enterprises need in order to productize these models, we provide prebuilt enterprise integration as part of the Azure OpenAI service. But the underlying AI models are the same. And we deeply collaborate together in terms of building those models.

And all those models from OpenAI are trained on top of Azure. And so, not only is Azure OpenAI building on top of OpenAI, but then OpenAI is building on top of Azure. That partnership, as Satya and (OpenAI CEO) Sam Altman talked about in the keynote, is very deeply symbiotic and is a fantastic collaboration across the companies.
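For developers curious what building on the Azure OpenAI Service looks like in code, here is a rough sketch using the Azure mode of the pre-1.0 openai Python package. The endpoint, deployment name, API version, and key are placeholders to replace with your own resource's values, and the exact parameters should be checked against the current service documentation.

```python
# Rough sketch: calling an Azure OpenAI deployment with the (pre-1.0) openai
# Python package in Azure mode. Endpoint, deployment name, API version and
# key are placeholders; check the Azure OpenAI docs for current values.
import os
import openai

openai.api_type = "azure"
openai.api_base = "https://<your-resource>.openai.azure.com/"  # placeholder endpoint
openai.api_version = "2022-12-01"  # placeholder; use the version your resource supports
openai.api_key = os.environ["AZURE_OPENAI_KEY"]

response = openai.Completion.create(
    engine="my-gpt-deployment",  # name of the model deployment you created in Azure
    prompt="Summarize the benefits of managed databases in one sentence.",
    max_tokens=60,
)
print(response["choices"][0]["text"])
```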

MJF: You talked about Cosmos DB a little bit, but I want to ask you about another database, SQL Server. I know a big release of SQL Server is coming soon, very soon. Why are people going to be interested in SQL Server 2022? What's the elevator pitch for this release?

Guthrie: I think part of what we're continuing to do is continue to enhance SQL in lots of different ways, both in terms of improved capabilities, whether it's analytics, whether it's operational, whether it's around more advanced security. We're hearing more and more in terms of confidential compute needs. And so, there's a bunch of announcements we made at Ignite around that.

And then also in terms of scale and hybrid capabilities and the ability to be able to leverage SQL not just in our cloud, but on any cloud, including edge, and really to think and manage that data service holistically across your environment is top of mind as well.

MJF: Another virtual Ignite session I watched this week was (Azure CTO and Technical Fellow) Mark Russinovich's "What's next for Azure." And it made me wonder if there are any things changing in Azure at the infrastructure level that you think will bubble up to customers. I know you do a lot of work in your own data centers to optimize for power and performance and all, but will any of those features and enhancements bubble up into services for customers?

Guthrie: I think certainly how we provide the lowest cost infrastructures is obviously top of mind for customers, particularly right now. But then also, I think other things that we're innovating on that I think customers really care about will be around sustainability. I think increasingly organizations want to understand across their value chain and their supplier chain, what is your carbon footprint, how are you helping me lower my overall net carbon footprint.

And so, a lot of the innovations we're doing that Russinovich talked about in terms of leveraging green energy, water recycling, how do we think about kind of our net zero carbon targets and goals, not just beyond carbon, but really around water sustainability and the entire global supply chain are things where I think we're adding a lot of enhancements that we increasingly see customers asking about because they want to understand how it's helping them on their overall sustainability journey.

MJF: It seems to me like more and more Microsoft developer tools are being sold as services. So I'm wondering, is Microsoft planning to go all-in on subscriptions on the dev side, the same way it's done with Office and with gaming?

Guthrie: We do offer our Visual Studio subscriptions, our Visual Studio offering as a subscription service, and have actually now for several years. And so, we do have the subscription certainly.

I would differentiate maybe the commercial model, which is subscription. Does that mean everything will run in the cloud as a service? I think from a technology perspective, part of what we're trying to do is how do we leverage the cloud with things like Copilot, obviously things like GitHub, testing, build support.

I think we've done some amazing work inside our DevDiv team in terms of really delivering cloud-based services. And at the same time, how do we leverage the local machine that the developer is using? So whether it's with Visual Studio or whether it's with Visual Studio Code, we obviously continue to support local development, and I think where we see the real value and power is when you can leverage both, and they integrate together in a very seamless way.

MJF: Anything else you'd like to highlight from Ignite this week?

Guthrie: Just stepping back for a moment, there's no doubt we're living through a period of sort of historic economic, societal, geopolitical, and technological change. And whether it's inflation that's at a 40-year high, whether it's supply chains that are stretched, whether it's the energy crisis in Europe, the war in Ukraine, there's lots of challenges. We've just gone through a period of massive change with the COVID pandemic, and obviously that's still ongoing.

But I think we've seen that no company is immune to these challenges, including us. I think we've said before, Satya, me, and Amy (Hood) from a CFO perspective, that no company is immune to these global challenges. But I think technology is a unique accelerator that can help ensure that customers can do more with less.

A key part of what we talked about at Ignite this week was really around how do we help our customers maximize the value of their cloud investments, how do we enable them to really experience the full value of the Microsoft Cloud, and we're trying to make sure that we're laser-focused on helping our customers use our platforms and tools so that they can do more with less.

And as we kind of continue to manage through this period, we're going to continue to invest in future growth; at the same time, maintaining intense focus on operational excellence and execution discipline. And I think that's what all of our customers are looking for as well from us, and we're here to help them and support them.

Fri, 14 Oct 2022 11:58:00 -0500 | https://www.zdnet.com/article/microsoft-cloud-and-ai-chief-scott-guthrie-on-whats-new-and-next-for-microsoft-cloud-customers/
Killexams : Microsoft Introduces Azure Cosmos DB for PostgreSQL

During the latest Ignite conference, Microsoft announced Azure Cosmos DB for PostgreSQL, a new generally available (GA) service to build cloud-native relational applications. It is a distributed relational database offering with the scale, flexibility, and performance of Azure Cosmos DB.

Azure Cosmos DB is a fully managed NoSQL database with various APIs targeted at NoSQL workloads, including native NoSQL and compatible APIs. With the support of PostgreSQL, the service now supports relational and NoSQL workloads. Moreover, the company states that Azure is the first cloud platform to support both relational and NoSQL options on the same service.

The PostgreSQL support includes all the native capabilities that come with PostgreSQL, including rich JSON support, powerful indexing, extensive datatypes, and full-text search. In addition to being built on open-source Postgres, Microsoft enabled distributed query execution using the Citus open-source extension. Furthermore, the company also stated in a developer blog post that as PostgreSQL releases new versions, it will make those versions available to its users within two weeks.
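As a concrete illustration of the Citus-based distribution model, the sketch below connects with psycopg2 and distributes a table by a chosen column; create_distributed_table is the Citus function that shards a table across worker nodes. The connection details and table are placeholders for illustration only.

```python
# Sketch: distributing a table on a Citus-enabled PostgreSQL cluster
# (such as Azure Cosmos DB for PostgreSQL). Connection details and the
# example table are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="<your-cluster>.postgres.cosmos.azure.com",  # placeholder host
    dbname="citus",
    user="citus",
    password="<password>",
    sslmode="require",
)
conn.autocommit = True

with conn.cursor() as cur:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS events (
            tenant_id bigint NOT NULL,
            event_id  bigserial,
            payload   jsonb,
            PRIMARY KEY (tenant_id, event_id)
        );
    """)
    # Citus: shard the table across worker nodes by tenant_id.
    cur.execute("SELECT create_distributed_table('events', 'tenant_id');")

conn.close()
```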


Source: https://devblogs.microsoft.com/cosmosdb/distributed-postgresql-comes-to-azure-cosmos-db/

Developers can start building apps on a single node cluster the same way they would with PostgreSQL. Moreover, when the app's scalability and performance requirements grow, it can seamlessly scale to multiple nodes by transparently distributing its tables. This differs from Azure Database for PostgreSQL, as Jay Gordon, a Microsoft Azure Cosmos DB senior program manager, explains in a tweet:

#AzureCosmosDB for #PostgreSQL is a distributed scale-out cluster architecture that enables customers to scale a @PostgreSQL workload to run across multiple machines. Azure Database for PostgreSQL is a single-node architecture.

In addition, the product team behind Cosmos DB tweeted:

We are offering multiple relational DB options for our users across a number of Database services. Our Azure Cosmos DB offering gives you PostgreSQL extensions and support for code you may already be using with PostgreSQL.

And lastly, Charles Feddersen, a principal group program manager of Azure Cosmos DB at Microsoft, said in a Microsoft Mechanics video:

By introducing distributed Postgres in Cosmos DB, we’re now making it easier for you to build highly scalable, cloud-native apps using NoSQL and relational capabilities within a single managed service.

More service details are available through the documentation landing page, and guidance is provided in a series of YouTube videos. Furthermore, the pricing details of Azure Cosmos DB are available on the pricing page.

Sat, 15 Oct 2022 22:39:00 -0500 | https://www.infoq.com/news/2022/10/azure-cosmosdb-postgresql-ga/
Killexams : Microsoft Previews Azure Deployment Environments

During the latest Ignite Conference, Microsoft announced the public preview of Azure Deployment Environments. This managed service enables dev teams to quickly spin up app infrastructure with project-based templates to establish consistency and best practices while maximizing security, compliance, and cost-efficiency.

Setting up environments for applications that require multiple services and subscriptions in Azure can be challenging due to compliance, security, and possible long lead time to have it ready. Yet, with Azure Deployment Environments, organizations can eliminate the complexities of setting up and deploying environments, according to the company, which released the service in a private preview earlier this year.

Organizations can preconfigure a set of Infrastructure as Code templates with policies and subscriptions. These templates are built as ARM (and eventually Terraform and Bicep) files and kept in source control repositories with versioning, access control, and pull request processes. Furthermore, through Azure RBAC and Azure AD security authentication, organizations can establish comprehensive access controls for environments by project and user type. And finally, resource mappings ensure that environments are deployed to the correct Azure subscription, allowing for accurate cost tracking across the organization.
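As an illustrative sketch only, the snippet below models the idea of environment types tied to subscriptions, templates, and allowed roles as plain data. The names and identifiers are hypothetical; this is not the service's actual configuration schema.

```python
# Hypothetical illustration: modeling environment types as data, the way
# Azure Deployment Environments ties a type to a subscription, an IaC template
# source, and the roles allowed to deploy it. Not the service's real schema.
from dataclasses import dataclass

@dataclass(frozen=True)
class EnvironmentType:
    name: str
    subscription_id: str   # where resources land, enabling cost tracking
    template_repo: str     # source-controlled IaC templates (e.g., ARM files)
    allowed_roles: tuple   # who may deploy this environment type

ENVIRONMENT_TYPES = [
    EnvironmentType("dev", "00000000-0000-0000-0000-000000000001",
                    "https://example.com/org/iac-templates", ("Dev Team",)),
    EnvironmentType("prod", "00000000-0000-0000-0000-000000000002",
                    "https://example.com/org/iac-templates", ("Platform Engineering",)),
]

def can_deploy(role: str, env_type: EnvironmentType) -> bool:
    """Simple access check in the spirit of RBAC-scoped environment types."""
    return role in env_type.allowed_roles

print(can_deploy("Dev Team", ENVIRONMENT_TYPES[0]))  # True
print(can_deploy("Dev Team", ENVIRONMENT_TYPES[1]))  # False
```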

In a Tech Community blog post, Sagar Chandra Reddy Lankala, a senior product manager at Microsoft, explains:

By defining environment types for different stages of development, organizations make it easy for developers to deploy environments not only with the right services and resources, but also with the right security and governance policies already applied to the environment, making it easier for developers to focus on their code instead of their infrastructure.

Environments can be deployed manually through a custom developer portal, CLI, or pipelines.


Source: https://techcommunity.microsoft.com/t5/apps-on-azure-blog/announcing-azure-deployment-environments-preview/ba-p/3650223

Azure Deployment Environments are an addition to existing services, such as GitHub Codespaces and Microsoft Dev Box, which the company made available earlier to enhance developer productivity and coding environments. Codespaces allows developers to quickly get a VM with VS Code, and similarly, with Microsoft Dev Box, they can get an entire preconfigured developer workstation in the cloud.

Amanda Silver, CVP of the Product Developer Division at Microsoft, tweeted:

Minimize environment setup time and maximize environment security and compliance with Azure Deployment Environments. Now in public preview! Game changer for platform engineering teams.

More details of Azure Deployment Environments are available on the documentation landing page. Pricing-wise, the service is free during the preview period, and customers will only be charged for other Azure resources like compute storage and networking created in environments.

Fri, 14 Oct 2022 21:34:00 -0500 | https://www.infoq.com/news/2022/10/azure-deployment-environments/
Killexams : Can you solve it? Nick Berry, data dude

Some sad news to report. Nick Berry, the British data scientist who wrote DataGenetics – one of the best and longest-running maths popularisation blogs – has died aged 55 after a long battle with cancer.

Nick was a Yorkshireman who studied Aeronautical Engineering at Southampton University. He later moved to Seattle, where he worked as a data scientist for firms including Microsoft and Facebook. He started DataGenetics in 2009 and it soon gathered a huge following for its accessible posts about interesting topics in maths, physics and computer science.

Nick had a great eye for good subject matter, a gift for effortless explication and a continual joy in the subject. He also liked a good puzzle. Today’s three challenges are taken from his blog, with permission.

1. No-zero heroes

Write 1,000,000 as the product of two numbers, neither of which contains any zeroes.

(You may be interested to know that 10 x 10 x 10 x 10 x 10 x 10 = 1,000,000)

2. Lucy’s secret number

You are at a party and overhear a conversation between Lucy and her friend. In the conversation, Lucy mentions she has a secret number that is less than 100.

She also confesses the following information: “The number is uniquely describable by the answers to the following four questions:”

Q1) Is the number divisible by two?
Q2) Is the number divisible by three?
Q3) Is the number divisible by five?
Q4) Is the number divisible by seven?

She then proceeds to whisper the answers to these questions to her friend. Unfortunately, because of the ambient noise at the party, you only hear the answer to one of the questions. Knowing this one answer allows you to determine the secret number. The answer you hear is "yes." What is Lucy's secret number?

3. Naughty maths elves

I write the whole numbers from 1-9999 (inclusive) on a huge chalkboard. Each number is written once.

During the night the board is visited by a series of naughty maths elves. Each elf approaches the board, selects two numbers at random, erases them, and replaces them with a new number that is the absolute difference of the two numbers erased.

This vandalism continues all night until there is just one number remaining.

I return to the board the next morning and find the single number on the board. Is this remaining number odd or even?

I’ll be back with the answers and solutions at 5pm UK

PLEASE NO SPOILERS

(And if you do check out the DataGenetics blog, which I recommend, don't look straight away at the answers to these puzzles!)

I set a puzzle here every two weeks on a Monday. I’m always on the look-out for great puzzles. If you would like to suggest one, email me.

I deliver school talks about maths and puzzles (online and in person). If your school is interested please get in touch.

Sun, 16 Oct 2022 17:20:00 -0500 | Alex Bellos | https://www.theguardian.com/science/2022/oct/17/can-you-solve-it-nick-berry-data-dude
Killexams : Microsoft

Ishaq Khalid

BBC News, Abuja

Copyright: Getty Images

Image caption: Young people are expected to be the main beneficiaries of the training programme

Nigeria's government has signed an agreement with US technology giant Microsoft to train five million people in digital technology.

Minister of Communication and Digital Economy Isa Ali Pantami said the agreement would boost "job creation and economic development".

Prof Pantami described the initiative as "amazing and also very huge".

He told the BBC it would start immediately, and run for five years.

The deal was signed on the sidelines of an IT exhibition in Dubai on Wednesday, with a Microsoft representative saying the company was willing to work with the Nigerian government to provide "economic opportunities" for young people.

Microsoft and the government have been in talks for more than a year about the project.

Nigeria is Africa’s most populous country and biggest economy.

It has a largely young population of more than 200 million. But unemployment is high and the education sector is struggling. The government says it is trying to make the economy digitally oriented.

Prof Pantami also said that Nigeria would continue to provide an "enabling environment" for Microsoft and other companies to operate in the country by ensuring that regulatory measures were "developmental and flexible".

Sat, 15 Oct 2022 12:25:00 -0500 | https://www.bbc.com/news/topics/cvenzmgyg0lt/microsoft
Killexams : Science & Engineering

What is Artemis? Everything you need to know about NASA's new moon mission

NASA is embarking on a yearslong, multistage, groundbreaking mission to the moon. Here's why NASA is returning to the moon, who's going, what technology is enabling the mission, and more.

Tue, 11 Oct 2022 11:37:00 -0500 | https://www.zdnet.com/topic/science-engineering/
Killexams : Top AI investors reveal State of AI in 2022


If you think artificial intelligence (AI) is moving at a breakneck speed and it’s almost impossible to keep up, you’re not alone. Even if being on top of all things AI is part of your job, it’s getting increasingly hard to do that. Nathan Benaich and Ian Hogarth know this all too well, yet somehow they manage.

Benaich and Hogarth have solid backgrounds in AI as well as tons of experience and involvement in research, community- and market-driven initiatives. AI is both their job and their passion and being on top of all things AI comes with the territory.

Benaich is the general partner of Air Street Capital, a venture capital firm investing in Al-first technology and life science companies. Hogarth is a cofounder at Plural, an investment platform for experienced founders to help the most ambitious European startups.

Since 2018, Benaich and Hogarth have been publishing their yearly State of AI report, aiming to summarize and share their knowledge with the world. This ever-growing and evolving work covers all the latest and greatest across industry, research and politics. Over time, new sections have been added, with this year featuring AI safety for the first time.


Traditionally, Benaich and Hogarth have also ventured predictions, with remarkable success. Equally traditionally, we have been connecting with them to discuss their findings every year upon release of the report. This year was no exception, so buckle up and let the ride begin.

State of AI: Evolution

AI research is moving so fast, it seems like almost every week there are new breakthroughs, with commercial applications quickly following suit. Case in point: AI coding assistants have been deployed, with early signs of developer productivity gains and satisfaction.

Coding assistants

OpenAI’s Codex, which drives GitHub Copilot, has impressed the computer science community with its ability to complete code on multiple lines or directly from natural language instructions. This success spurred more research in this space, including from Salesforce, Google and DeepMind.

Codex quickly evolved from research (July 2021) to open commercialization (June 2022) with (Microsoft’s) GitHub Copilot now publicly available for $10/month or $100/year. Amazon followed suit by announcing CodeWhisperer in preview in June 2022.

Google revealed that it was using an internal machine learning (ML)-powered code completion tool, which Benaich and Hogarth note in the State of AI report could soon lead toward a browser-based AI-powered IDE (integrated development environment). Meanwhile, Tabnine has more than 1 million users, raised $15M and promises accurate multiline code completions.

Text-to-art

And if you think that is old news, or not massive enough, then how about some diffusion-powered AI art? In 2021, diffusion AI models were overtaking GANs, the previously dominant AI models for image generation, on a few benchmarks. Today, diffusion AI models are used to power the likes of DALL-E 2, Imagen, Midjourney and Stable Diffusion, spreading to text-to-video, text generation, audio, molecular design and more.
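For a sense of how accessible these models have become, the sketch below uses the open-source Hugging Face diffusers library to generate an image from text. It assumes a GPU, an installed diffusers/torch stack, and access to a Stable Diffusion checkpoint; the model ID shown is one commonly used checkpoint and may need to be swapped for whatever is currently available.

```python
# Sketch: text-to-image with an open-source diffusion pipeline (Hugging Face
# diffusers). Assumes `pip install diffusers transformers torch` and a GPU;
# the model ID is one commonly used Stable Diffusion checkpoint.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

prompt = "a watercolor painting of a lighthouse at sunrise"
image = pipe(prompt, num_inference_steps=30).images[0]  # iterative denoising
image.save("lighthouse.png")
```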

This meteoric rise has given birth to both opportunities and open questions. It seems anyone can now create stunning imagery (and more) with the click of a button. Does that mean everyone is an artist now? Does that mean graphic designers will be out of work soon? And who owns the AI-generated art? These are just some of the questions that pop up, the answers to which seem to be “no,” “no,” and “we don’t know,” respectively.

Benaich pointed out the obvious: this is an evolving topic and it will take a while for people to figure it out. There will be goofs and controversies, like people winning art contests with AI-generated art, while others are forced to take down AI-generated images. Some art communities are banning AI art altogether, while some artists are not at all happy that their art is included in datasets used to train those AI models.

Benaich thinks we’ll see more formal partnerships between AI companies and generators of these models and corpus owners, especially large corpus owners. Ultimately, it’s a question about the incremental value of additional data points in this broad dataset:

“It’s not clear that an individual contributor to a broad dataset really moves the needle on model performance and to what degree can an individual really influence this debate,” said Benaich. “Or would there have to be an en masse demand to not have work be present in a training dataset to influence this question of ownership?” Long term, he said, “If these systems are trained on everyone’s data, they shouldn’t necessarily be owned by a single party.”

Hogarth, for his part, noted that the economic model around monetization and ownership is currently massively in flux, as a result of alternatives that are popping up fast. “If you were planning to have an API that monetizes a generative image model and you have that behind a paywall, and then suddenly there’s an open-source project that offers the same quality experience in a self-service, non-commercial way, you’re going to see a real tension,” he noted.

Similar questions have also been raised for the case of AI coding assistants. This points toward a so-called “distributed” modus operandi for AI. What remains beyond question, as Benaich and Hogarth’s work in the State of AI report reveals, is the dominant player in the hardware used to generate AI models for all types of applications: Nvidia.

State of AI: Distribution

Compute infrastructure, the substrate that’s enabling all the progress in this field, as Benaich put it, is also seeing lots of innovation. However, he went on to add, despite the fact that there has been a lot of investment and willingness in the community to try and dislodge Nvidia as the giant in this space that powers everybody, that has not really happened.

Nvidia is still the AI hardware leader

It has always been hard to put numbers on that feeling, but this is precisely what Benaich and Hogarth tried to do this year. They scanned academic and open-source AI literature for papers that mentioned the use of a specific hardware platform to train the models that they reported in the results. They enumerated those papers and the results were both expectable and impressive.

What Benaich and Hogarth’s work showed was that the chasm between the sum of papers that mentioned using any form of Nvidia hardware and papers using TPUs or other hardware created by the top five semiconductor companies is sometimes 100 times or 150 times in favor of Nvidia. This, Benaich noted, hasn’t really changed that much in the last few years.

Nvidia has had a head start compared to the competition and they certainly made the best of it. Nvidia has created a massive ecosystem and partnerships around its hardware, investing heavily in its software stack as well. Nvidia also shows a startup-like attitude despite being the incumbent, as Benaich noted. They keep improving their hardware, using techniques such as ML to design new architectures, and their latest H100 generation of GPUs is said to bring a 10 times improvement in performance over their previous A100.

The tipping point for other challengers

On the other hand, GPUs come with certain baggage, as they were never really designed to accommodate AI workloads. It’s easier to innovate if you build something from the ground up and you don’t have backward-compatibility to worry about. That may well mean that eventually there will be a tipping point at which the challengers will start seeing substantial adoption. 

Benaich was adamant:  “I would like to see that happen. I think the industry would like that, too. But the data doesn’t suggest that. I think the question is always — how much better can a new design be and how much better does it have to be if its software is less well understood and the learning curve is higher? And you’re also fighting with a massive install-base that many companies already have.”

Benaich added that it’s “a really hard uphill battle” that has been fought for at least five years. “We would have thought that the chasm would be narrower than where it is if the future would actually look more distributed than pure Nvidia,” he said. “Despite there being a busting up of centralized ownership of models at the software layer, that hasn’t really happened at the hardware layer.”

This “busting up of centralized ownership” that Benaich spoke of at the software layer is another takeaway from the State of AI report. As Benaich noted, the last couple of years the central dogma in ML has been that of centralization. The hypothesis was that the entities that will profit and advance the most are those that can acquire the most resources, whether that’s money, talent, compute or data.

While that still rings true in many ways, it’s also being challenged. For example, Meta concluded that “while the centralized nature of the [AI] organization gave us leverage in some areas, it also made it a challenge to integrate as deeply as we would hope.”

Distributed research collectives

What we have been seeing in the last 12 months or so, Benaich added, is that there is an emergence of what he called "distributed research collectives" such as Adept, Anthropic, Inflection, Eleuther and Cohere. Benaich referred to these as being "broadly defined as either not even companies, or Discord servers that emerge, or nonprofit institutions or startups that are fundamentally open source."

Benaich and Hogarth see those as another pole to do AI research, specifically work that focuses on diffusing and distributing inventions in centralized labs to the masses incredibly quickly. The report includes various examples of open-source alternatives for models — including text-to-image, language and biology models — being released faster than anyone expected.

Benaich believes that this will become the norm: first, closed-source models will appear and then within a matter of a year we’ll start seeing the first open-source models. He thinks this grants access to a broader community of people that otherwise wouldn’t participate [in AI]:

“That is because to get jobs in some of these big tech companies, you need to have a Ph.D., you need to be extremely technically literate and check certain boxes. These open-source collectives care a lot less about that. They care more about the value of each person’s contribution and the contributions can be different,” he said. At the same time, which ones of those initiatives can be viable and how, exactly, is an open question.

There are lots of dynamics at play there. Top-tier talent from the Googles and DeepMinds of the world is breaking loose and becoming entrepreneurial. At the same time, investment in startups using AI has slowed down in 2022 compared to 2021, along with the broader market, but is still higher than 2020. Investment in the USA accounts for more than half of the worldwide venture capital and unicorns, while private valuations are on the rise.

State of AI: Safety

Last but not least, the somewhat forward-looking introduction of AI safety. The report’s section on AI safety starts by quoting AI pioneers like Alan Turing and Marvin Minsky, who warned about the dangers of machine intelligence surpassing human capabilities as early as the 1950s.

AI safety is currently used as an umbrella term that captures the general goal of making powerful AI systems aligned with human preferences and values, as Hogarth noted. Some of the challenges are nearer term, such as taking a computer vision system used by law enforcement and trying to understand where it exhibits bias. Benaich and Hogarth included work on related topics in previous years.

Preventing AI armageddon with alignment

What’s new in 2022, and what made Benaich and Hogarth dedicate an entire section to AI safety, is the other end of AI safety. This is what Hogarth referred to as AI alignment: ensuring that an extremely powerful and superintelligent AI system doesn’t ever go rogue and start treating humanity badly in aggregate. The 2022 State of AI report is very much biased toward that end of safety because, according to Hogarth, the Topic is not receiving enough attention.

“We’re seeing exponential gain in capabilities, exponential use of compute, exponential data being fed into these [AI] models,” Hogarth said. “And yet we have no idea how to solve the alignment problem yet.” It’s still an unsolved technical problem where there are no clear solutions, he added: “That’s what alarms me — and I think that the thing that is probably the most alarming about all of it is that the feedback loops now are so violent. You have huge wealth creation happening in AI. So there’s more and more money flowing into making these models more powerful.”

There’s more geopolitical awareness of the significance of this and there’s competitive dynamics between countries accelerating, he went on, as well as more social prestige. “You get kudos for working at DeepMind or OpenAI,” he said, “so there’s a lot of these powerful feedback loops that are kicking in and making the systems [have] more increasing capabilities at a greater rate and we don’t have the same feedback loop starting to kick in on safety.”

It’s a forward-looking concern, but the thinking in the report seems to be “better safe than sorry.” Many AI researchers share Hogarth’s concern as well. The report quotes a latest survey of the ML community, which found that 69% believe AI safety should be prioritized more than it currently is.

A separate survey of the NLP community was also quoted, which found that a majority believe AGI (artificial general intelligence) is an important concern we are making progress toward. Over 70% believed AI will lead to social change at the level of the Industrial Revolution this century, and nearly 40% believed AI could cause a catastrophe as bad as nuclear war during that time.

The report’s key findings are that AI safety is attracting more talent and funding, but remains relatively neglected and underfunded. Some initial progress had been made toward alignment, via approaches such as learning from human feedback, red-teaming, reverse-engineering neural networks and measuring moral behavior in artificial agents.

Focusing on the AGI angle

The State of AI report also highlights Conjecture, which it notes is the first well-funded startup purely focusing on AGI alignment. Conjecture is a London-based startup, led by Connor Leahy, who previously cofounded Eleuther, the organization that kicked off decentralized development of large AI models.

Conjecture operates under the assumption that AGI will be developed in the next five years, and on the current trajectory will be misaligned with human values and consequently catastrophic for our species. It has raised millions from investors, including the founders of GitHub, Stripe and FTX.

Of course, all of that only makes sense if you believe that we are moving toward AGI. Not everyone shares this belief. Different schools of thought are aptly exemplified by Meta’s Yann LeCun, and prolific AI scholar, author and entrepreneur, Gary Marcus. Besides the debate on how AI could move forward, Hogarth finds that Marcus’s criticism of the capabilities of AI models is “extremely unhelpful … kind of the opposite of ringing the fire alarm.”

Hogarth believes that in order to carve a safe path toward the advancement of AI, there should be more funding of AI safety, as well as some regulatory oversight. He mentioned gain-of-function research as an example of research that is only allowed under certain conditions and AI should be modeled after. For Hogarth, the distinction of what should be regulated cannot be done on the basis of applications, as all applications have potential for misuse, but rather on the basis of capabilities: anything above a certain threshold should be subject to scrutiny.


Fri, 14 Oct 2022 12:01:00 -0500 | George Anadiotis | https://venturebeat.com/ai/top-ai-investors-reveal-state-of-ai-in-2022/
Killexams : Enhanced Microsoft app allows visually impaired to hear Haleon health products information


To mark World Sight Day on 13th October, Haleon and Microsoft are launching a joint project to make health products more accessible for blind and visually impaired consumers, using Artificial Intelligence (AI) technology that narrates the labels of products.

New enhancements in the free Microsoft Seeing AI app will help advance inclusivity and improve accessibility, they say.

Across the UK and USA, consumers will be able hear important label information for over 1500 everyday consumer health products such as Sensodyne, Centrum, Aquafresh, ChapStick and Emergen-C. Haleon, a global leader in consumer health, will be participating in a Brand Challenge at the upcoming AIPIA World Congress on 14/15 November, in Amsterdam.

This new collaboration will help people who are blind, have low vision or have difficulty reading the labels of products due to low literacy. Expanding functionality in the Microsoft Seeing AI app will provide more detailed labelling information for this community of consumers.

With the launch of Haleon’s ‘Always Read the Label’ campaign for World Sight Day, these consumers are able to read labels through Seeing AI by scanning the barcode on Haleon products.

They will be able to hear important information, such as name, ingredients, and use instructions. Through the enhanced functionality that Seeing AI offers, Haleon will help empower people to care for their own health independently.
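As a purely hypothetical illustration of that flow (scan a barcode, look up the product, narrate the label), the sketch below uses an invented lookup table and returns the text a text-to-speech engine would read aloud. It is not how Seeing AI or Haleon's data feed is actually implemented.

```python
# Hypothetical illustration of the barcode-to-narration flow: look up a scanned
# barcode and build the text a screen reader or TTS engine would speak.
# The product data and barcode are invented; this is not the Seeing AI app.
PRODUCT_LABELS = {
    "0000000000017": {
        "name": "Example Sensitive Toothpaste 75ml",
        "ingredients": "fluoride toothpaste; see pack for full list",
        "instructions": "Brush twice daily. Do not swallow.",
    },
}

def narrate_label(barcode: str) -> str:
    product = PRODUCT_LABELS.get(barcode)
    if product is None:
        return "Product not recognised. Please try scanning again."
    return (f"{product['name']}. Ingredients: {product['ingredients']}. "
            f"Directions: {product['instructions']}")

print(narrate_label("0000000000017"))  # text that would be passed to text-to-speech
```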

According to the UK National Health Service (NHS) more than 2 million people are living with sight loss while the National Literary Trust reports 8.5 million people in UK have very poor literacy skills.

In an independent study of visually impaired people commissioned by Haleon, 93% of respondents said they don’t feel health products are accessible enough and almost 1 in 5 have taken the wrong dosage as they couldn’t read the packaging effectively.

The Seeing AI app was developed by a team of Microsoft engineers headed by project lead and engineering manager Saqib Shaikh, who lost his sight at the age of seven, and was driven to develop the app by his passion for using technology to improve people's lives.

He commented: “I’m really excited to see the launch of this enhanced product recognition functionality, developed in collaboration with Haleon. Seeing AI’s intelligent barcode scanner plays audio cues to help users find the barcode, and now the information displayed is coming straight from the manufacturer, providing richer information. This can be invaluable for someone who cannot read the label, leading to greater independence.”

Tamara Rogers, chief marketing officer at Haleon added: “Helping people access vital information on our products is one of our first initiatives to make everyday consumer health more inclusive. We hope the capability to narrate labels across Haleon’s products brings greater independence to our consumers.

“We have set ourselves the goal of helping 50 million people to be more included in opportunities for better everyday health by 2025. We are doing this by tackling three big barriers that we know put everyday health out of reach for too many of the world’s citizens: Health Literacy, Healthcare Accessibility and Bias & Prejudice.”

The Microsoft Seeing AI App is free to download from the Apple App Store, and will be available on Android in the future. Users simply hold their phone camera over the existing barcode on the packaging.

The app will read out the product name and all text on the packaging. The user can skip ahead or move back to the relevant section. There are plans to expand globally and add additional languages in the future.


Wed, 12 Oct 2022 23:33:00 -0500 | https://packagingeurope.com/news/enhanced-microsoft-app-allows-visually-impaired-to-hear-haleon-health-products-information/8897.article
Killexams : The Program

The following are the learning outcomes of the program:

  1. Students will have a deep appreciation for the ways their actions impact human society and the environment at large.

  2. Students will be able to apply a breadth and depth of appropriate engineering sciences knowledge, skills, and techniques from different fields to solve complex engineering problems.

  3. Students will apply their skills and knowledge through creative problem solving.

  4. Students will have an appreciation for engineering as an inherently human endeavor, and will take a human-centered approach as engineers.

  5. Students will be lifelong learners who have the ability and confidence to acquire new skills and knowledge, using technologies yet to be developed, in order to address problems yet to be identified.

  6. Students will work fluently as active contributors in multidisciplinary teams to identify and implement engineering solutions nested within complex problems.

  7. Students will be able to effectively translate their ideas through written, oral, visual, and other forms of communication.

  8. Students will be entrepreneurially-minded and will have the confidence and competence to realize ideas.

Fri, 07 Oct 2022 09:03:00 -0500 | https://www.bc.edu/bc-web/schools/mcas/departments/engineering/academics/the-program.html
DP-203 exam dump and training guide direct download
Training Exams List