The Google Ads help guide has a new AI assistant.
Currently in open beta, the new tool has been designed to help users find answers and solve account issues relating to Google Ads.
Select users are currently being invited to test the AI and provide their feedback.
Here’s a screenshot of the new tool in action, as shared by PPC expert Julie F Bacchini:
Notably, the new AI assistant comes with a disclaimer, warning:
Why we care. Marketers have been widely encouraged to test new AI tools and integrate them into their daily work strategies. However, this tool comes with a clear warning that it isn't yet a finished product and may provide incorrect answers, which suggests it's unreliable and could waste time.
What has Google said? Google sent emails to select users, inviting them to trial the new tool and provide their feedback:
Deep dive. Read Google Ads' Best Practice guide for more information on how to get the most out of Google Ads.
New on Search Engine Land
Google Ads is testing a new generative AI solution for help guides and answers. Google has emailed select advertisers, inviting them "to try AI-generated answers in help guide." This is in addition to the other AI features being tested in Google Ads and in Performance Max.
Julie Bacchini received an invite and posted some examples on Twitter. She said, "Google Ads has an AI assistant in beta in help section. Comes with disclaimer 'While I know a lot, I'm still learning and may make some mistakes. If I do, please leave feedback so I can learn.'"
Here is a screenshot she shared of the email:
Help guide BETA
You're invited to try AI-generated answers in help guide
If you choose to try this new feature, you'll see the option to provide feedback on any AI-generated content.
Your feedback helps us strengthen the quality and accuracy of AI answers. Keep in mind that this is an early technology preview, so it may display inaccurate or inappropriate content. If you don't want to try this feature, you can still get many answers from help guide without it.
Here is a demo of how it works:
Help guide BETA
Hi! I'm Help guide, a digital support assistant. I can answer a variety of questions and solve problems relating to Google Ads.
While I know a lot, I'm still learning and may make some mistakes. If I do, please leave feedback so I can learn.
Here are some options. If you don't see what you need, type your question in the text box below.
Help me optimize
Campaign setup help
Conversion tracking issues
Set up conversion tracking
Add details or ask a question
Here are some more screenshots:
Forum discussion at Twitter.
Google Ads API version 14.1 is now available – two months after version 14 was released.
The minor update includes additional search term data, new recommendation types and account management assistance.
Why we care. Google Ads API version 14.1 offers a range of new tools and features to help you better monitor campaign performance, so you can make data-led decisions to strengthen optimization efficiency.
Upgrade needed. To use some of the new Google Ads API version 14.1 features, you will need to upgrade your client libraries and client code. Google is set to publish the updated client libraries and code examples next week.
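In practice, the upgrade step comes down to checking whether your installed client library is new enough before calling v14.1-only features. Here is a minimal sketch using only the Python standard library; the `google-ads` package name and the `21.3.0` minimum version are assumptions for illustration, not figures confirmed by Google's announcement.

```python
from importlib import metadata


def version_tuple(v: str) -> tuple:
    """Parse a dotted version string like '21.3.0' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))


def client_supports(minimum: str, package: str = "google-ads") -> bool:
    """Return True if the installed client library meets the minimum version."""
    try:
        installed = metadata.version(package)
    except metadata.PackageNotFoundError:
        return False
    return version_tuple(installed) >= version_tuple(minimum)


# Gate any v14.1-only calls behind the version check.
if client_supports("21.3.0"):
    print("Client library is new enough for the v14.1 features")
else:
    print("Upgrade needed: pip install --upgrade google-ads")
```

A check like this is useful in shared codebases where different environments may lag behind on client library versions.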
New features. Although there are no breaking changes, there are several new features available through the updated Google Ads system:
Deep dive. Read the Google Ads API version 14.1 announcement in full for more information.
Google announced today it’s launching a new pilot program to provide enhanced customer service for a select group of small Google Ads customers.
According to a company statement, the goal of the paid pilot program is to:
“… provide agencies and advertisers with specialized one-on-one support tailored to specific customer needs.”
This marks a shift for Google, which has historically reserved this high-touch level of support for its largest advertising clients.
The change comes after increased complaints from small businesses that feel left behind by Google's automated self-service options.
Google states in an email to Search Engine Journal:
“A common complaint from customers is that they want more specialized advice from Google experts and ideas for how they can strengthen their ads campaigns and optimize their budgets. The paid pilot for our smallest customers gives them a level and quality of support that has historically only been accessible to our largest customers.”
Google’s hands-off approach for smaller clients may be more cost-effective but fails to provide the expertise and guidance smaller businesses need to compete with more prominent brands.
While limited to a small number of participants initially, the pilot represents a long-term effort by Google to rethink and strengthen the customer experience for its advertising platform, which makes up the bulk of the company’s revenues.
Google’s statement continues:
“These changes are part of a long-term strategy that we’ll be building on over time, testing, and learning as we go.”
Google plans to solicit in-depth feedback from pilot participants and adjust the program based on what works best for customers.
The goal is to eventually expand the level of support to more small and medium-sized advertisers over time.
Google has significantly upgraded the Google Ads Help Center to help customers resolve more issues on their own more easily.
The upgrades include multimedia additions, such as videos and GIFs, across articles related to editing campaign settings, editing bids, and resolving data inconsistencies in Google Ads accounts.
Agencies working with pilot participants can also access their clients’ specialized paid support consultations.
Google said it will monitor results closely as it continues to explore options to enhance support for ad customers of all sizes.
Google's Ads Liaison provided a detailed rundown of Performance Max features to help advertisers align ads with brand safety requirements.
Google's Ads Liaison, Ginny Marvin, posted an X/Twitter thread clarifying some of the confusion about brand safety controls and reporting supported in PMax (Performance Max) campaigns.
Ginny Marvin wrote, "There’s been some confusion about brand safety controls & reporting supported in PMax. So I want to share a rundown of important levers available to help you control the types of content your PMax ads can appear next to in Search, Shopping, Display & Video inventory."
(1) Search & Shopping suitability controls:
- New PMax campaign-level brand exclusions prevent your ads from serving for specific brand queries in Search & Shopping
- Account-level negative keywords prevent your ads from showing for those queries in Search & Shopping
(2) Display & Video suitability controls:
PMax supports all of your account-level content suitability settings – available from the Tools icon in Google Ads
(3) Content labels allow you to narrow the maturity level of YouTube & GDN content your ads can show on or next to. This is where you'll find the one-click "content suitable for families" option, for example.
(4) Inventory types (expanded, standard, limited) allow you to quickly choose the type of content best suited to your brand on YouTube & GDN.
(5) Content type exclusions prevent ads from showing on certain areas of video content, such as live streams or embedded YouTube videos.
(6) Sensitive content categories allow you to exclude certain types of GDN content such as tragedy and conflict.
(7) Exclude up to 1000 content keywords to prevent ads from showing on YouTube & GDN content related to those exact words.
(8) Placement exclusions prevent your ads from showing on specific YouTube and GDN content. PMax respects account- and MCC-level placement exclusions. For more on content suitability controls see here.
(9) PMax placement reports show the sites & apps where your ads appeared and are built expressly for GDN brand safety tools. They can be found in the Reports editor in Google Ads.
(10) The new Search Terms Insights updates include more categories, API integration, custom date ranges & downloading. More on Search Term Insights for PMax (and more).
Here is the start of that thread:
Forum discussion at Twitter.
After a research report last week found that YouTube’s advertising practices had the potential to undercut the privacy of children watching children’s videos, the company said it limited the collection of viewer data and did not serve targeted ads on such videos.
These types of personalized ads, which use data to tailor marketing to users’ online activities and interests, can be effective for finding the right consumers. Under a federal privacy law, however, children’s online services must obtain parental consent before collecting personal information from users younger than 13 to target them with ads – a commitment YouTube extended to anyone watching a children’s video.
Now Fairplay, a prominent children’s group, is challenging the company’s privacy statements. The group said it had used advertising placement tools from YouTube’s parent company, Google, to run a $10 ad campaign this month targeted at different groups of adults, exclusively on children’s video channels.
The ads were shown to users in consumer segments selected by the children’s group – including motorcycle enthusiasts, high-end computer aficionados and avid investors – on popular channels including “Cocomelon Nursery Rhymes,” “Talking Tom” and “Like Nastya,” according to a placement report Fairplay received from Google. In total, the group’s ads were placed 1,446 times on YouTube children’s video channels.
Adalytics, the company that published the research first reported on by The New York Times last week, said it had analyzed similar ad campaigns on children’s channels from several other media buyers.
On Wednesday morning, Fairplay, the Center for Digital Democracy and two other nonprofit groups lodged a complaint with the Federal Trade Commission, asking the agency to investigate Google and YouTube’s data and advertising practices on videos made for children.
In a letter to Lina Khan, the FTC chair, the groups said the new research “raises serious questions” about whether Google had violated federal children’s privacy rules.
Michael Aciman, a Google spokesperson, said: “The conclusions in this report point to a fundamental misunderstanding of how advertising works on made-for-kids content. We do not allow ads personalization on made-for-kids content, and we do not allow advertisers to target children with ads across any of our products.”
Google said it continued to abide by child privacy commitments it made to the FTC. It added that some YouTube channels feature a mix of videos for children and adults and that, as a result, it was possible that Fairplay had received audience segment reports for ads appearing on videos that were not made for children.
This is not the first time that Fairplay and the Center for Digital Democracy have pressed the FTC to investigate Google and YouTube over children’s privacy. In a complaint to the agency in 2018, the two organizations, along with 21 other groups, accused the company of improperly collecting data from children who watched children’s videos.
In 2019, the FTC and the state of New York found that the company had illegally collected personal information from children watching children’s channels. Regulators said the company had profited from using children’s data to target them with ads.
Google and YouTube agreed to pay a record $170 million to settle regulators’ accusations.
“There are very few legal protections for children online,” said Josh Golin, the executive director of Fairplay. “One of the few obligations that platforms like YouTube have is to not use children’s personal information to track them or serve personalized ads.”
This article originally appeared in The New York Times.
ITHACA, N.Y. – A Cornell University-led research team has discovered that the algorithm behind Google Ads charged significantly more to deliver online ads to Spanish-speaking people in California about the benefits of SNAP, formerly known as food stamps.
“SNAP is a really important resource to get right,” said Allison Koenecke, lead author of the study and assistant professor of information science. “When faced with an algorithm that has disparate impact, our research asks, how do you pick a strategy to interact with the algorithm to equitably recruit SNAP applicants?”
Californians can apply for SNAP benefits using a website called GetCalFresh, which is developed and managed by Code for America, a civic tech nonprofit that builds digital tools and services for community leaders and governments. Code for America primarily recruits GetCalFresh applicants through Google Ads – for example, spending roughly $400 daily to reach anyone from San Diego County who punches key words and phrases like “how to apply for food stamps” into Google.
However, despite GetCalFresh being offered in multiple languages, Spanish-speakers were filling out proportionally fewer applications than English-speakers. In San Diego County, 23% of families living below the poverty line speak Spanish as their primary language, and yet just 7% had applied for SNAP via GetCalFresh, researchers said.
Koenecke and her collaborators discovered one possible reason: the default, dollar-stretching algorithm behind Google Ads was working too efficiently and disregarding Spanish-speaking people in the process.
When Google Ads is configured to garner the most SNAP enrollments per dollar, it ends up delivering fewer ads to prospective Spanish-speaking applicants because such ads cost more than those for English speakers, the team found. At the time, for every $1 spent on Google Ads to “convert” an English-speaking applicant into a SNAP benefits holder, it cost $3.80 to convert a Spanish-speaking person – nearly four times more. Another bidding option on the Google Ads platform cost 1.4 times more to reach Spanish-speakers versus English-speakers.
Koenecke and her collaborators can’t definitively explain the difference, since Google Ads is a black box – a proprietary machine-learning tool outside of public review. It could be attributed to any number of factors, like supply and demand or a bug in the system, she said.
For GetCalFresh, the research findings pose an important ethical question regarding how to spend its limited online advertising budget: Should they reach as many Californians as cheaply as possible, even if that means fewer Spanish-speaking applicants, or advertise more to Spanish-speakers, even if that yields fewer total applicants?
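The trade-off can be made concrete with the cost figures reported in the study ($1 per English-speaking conversion versus $3.80 per Spanish-speaking conversion). The sketch below compares two budget splits; the $400 daily budget echoes the figure cited for San Diego County, and the 25% Spanish-language split is a hypothetical example, not a strategy from the study.

```python
# Cost per converted applicant, as reported in the Cornell study.
COST_EN = 1.00   # per English-speaking applicant
COST_ES = 3.80   # per Spanish-speaking applicant

BUDGET = 400.0   # hypothetical daily budget


def applicants(budget: float, share_es: float) -> tuple:
    """Total applicants and Spanish-speaking fraction for a budget split.

    share_es is the fraction of the budget spent on Spanish-language ads.
    """
    es = (budget * share_es) / COST_ES
    en = (budget * (1 - share_es)) / COST_EN
    total = es + en
    return round(total), es / total


# Strategy A: minimize cost per applicant (all budget to English-language ads).
total_a, frac_a = applicants(BUDGET, 0.0)

# Strategy B: spend a quarter of the budget on Spanish-language ads.
total_b, frac_b = applicants(BUDGET, 0.25)

print(total_a, frac_a)  # most total applicants, no Spanish-speaking share
print(total_b, frac_b)  # fewer total applicants, larger Spanish-speaking share
```

With these numbers, shifting a quarter of the budget to Spanish-language ads costs roughly 74 total applicants per day but raises the Spanish-speaking share from 0% to about 8%, which is the kind of equity-versus-efficiency trade-off the researchers describe.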
Trade-offs such as these are at the heart of Koenecke’s research into fairness and algorithmic systems, which are increasingly being used to help with decision-making in areas with real consequences, like health care, banking and child services. But without additional scrutiny, algorithms – including a seemingly harmless one behind an advertising platform – can exacerbate inequality or produce results that run counter to what people actually want or need, she said.
As a result of the team’s findings, Code for America adjusted its online advertising strategy to directly target more Spanish-speaking prospective applicants.
“It’s important for the field and the public to have productive dialogues about the kinds of metrics we should be using in these algorithmic systems,” she said. “The communities most impacted by the algorithms should be given more power in the decision-making process.”
This research was partly funded by the National Science Foundation and Stanford University.
For additional information, see this Cornell Chronicle story.