Here is the Pass4sure Google-AMA practice test, updated today

We have legitimate and up-to-date Google-AMA PDF downloads and braindumps. killexams.com provides specific and current Google-AMA questions and answers with braindumps that essentially contain all the material you need to pass the Google-AMA test. With the aid of our Google-AMA test dumps, you do not need to take your chances reading reference books; you simply need to spend 10-20 hours memorizing our Google-AMA questions and answers.

Exam Code: Google-AMA Practice exam 2022 by Killexams.com team
Google AdWords Mobile Advertising
Google Advertising information source
Search results: Google Advertising information source - BingNews (https://killexams.com/pass4sure/exam-detail/Google-AMA)

Google Algorithms & Updates Focused On User Experience: A Timeline

As the role of search evolves to touch multiple marketing and consumer touchpoints, optimizing for the user has never been so important.

This is reflected in Google’s continual focus on the searcher experience, whether in its core algorithmic updates, new features, products, or SERP format changes.

While some of these Google changes have involved updates targeting low-quality content, links, and spam, other updates aim to understand consumer behavior and intent.

For example, most recent updates have focused on page speed, Core Web Vitals, and product reviews.

Considering the massive competition for SERP real estate from brands, even slight drops in position will critically impact traffic, revenue, and conversions.

In this article, I examine a combination of some (not all) Google updates and technological advancements that significantly reflect the search engine’s focus on the human user and their experiences online – from Panda in 2011 through to Page and Product Experience in 2021 and 2022.

Google Panda (2011)

First launched in February 2011, Panda saw continuous subsequent updates and was eventually added to Google’s core algorithm.

Panda was announced to target sites with low-quality content; this was one of the first signals that Google focused on content for the user experience.

The focus: producing and optimizing unique and compelling content.

  • Avoid thin content and focus on producing high-quality information.
  • Measure quality over quantity.
  • Content length is not a significant factor but needs to contain information that answers the user’s needs.
  • Avoid duplicate content – initially a big concern for ecommerce sites. Most recently, Google’s John Mueller explained that duplicate content is not a negative ranking factor.

Google Hummingbird (2013)

Following the introduction of the Knowledge Graph came Hummingbird with a focus on semantic search.

Hummingbird was designed to help Google better understand the intent and context behind searches.

As users looked to enter queries more conversationally, it became essential to optimize for user experience by focusing on content beyond the keyword with a renewed focus on the long tail.

This was the first indication of Google using natural language processing (NLP) to identify black hat techniques and create personalized SERP results.

The focus: creating and optimizing content that audiences want and find helpful.

  • Long-tail keywords and intent model strategies became crucial.
  • Content creation is needed to address what users are interested in and would like to learn.
  • Expand keyword research to include conceptual and contextual factors.
  • Avoid keyword-stuffing and producing low-quality content to personalize experiences.

E-A-T (2014)

Although it gained attention in 2018, the Google E-A-T concept first appeared in 2014 in Google’s Quality Guidelines.

Now, it is part of Google’s guidelines on focusing on YMYL – your money or your life.

Marketers were advised to focus on content that could impact their readers’ future happiness, health, financial stability, or safety.

Google established E-A-T guidelines to help marketers tailor on and off-page SEO and content strategies to provide users with an experience containing the most relevant content from sources they could trust.

In other words: Expertise, Authority, and Trust.

The focus: ensuring websites offer expert and authoritative content that users can trust.

  • Create content that shows expertise and knowledge of the subject matter.
  • Focus on the credibility and authority of websites publishing content.
  • Improve the overall quality of websites – structure and security.
  • Earn off-page press coverage on reputable sites, reviews, testimonials, and expert authors.

Mobile Update (2015)

This was the first time Google gave marketers a heads-up (or a warning, for many) that an update was coming.

Focusing on the user’s experience on mobile was a significant signal reflecting the growing use of mobile as part of the customer search journey.

Google clearly communicated that this update would prioritize mobile-friendly websites on mobile SERPs. Many more mobile updates followed.

The focus: mobile content and users’ mobile site experience.

  • Focus on design factors such as responsive design and mobile page structures.
  • Enhance site navigation, so mobile users can quickly find what they need.
  • Avoid format issues on mobile that were different from the desktop experience.
  • Confirm that websites are mobile-optimized.

Just after the mobile update went live, Google quietly issued a Quality update.

Websites that focused on quality content and avoided excessive irrelevant user-generated content and too many ads did well. This was another sign that Google was putting the user experience first.

RankBrain (2015)

Like the Hummingbird principles and NLP mentioned earlier, Google RankBrain was more of a change to the algorithm.

It gave us an indication of how vital machine learning was in all marketing and technology forms.

Utilizing this to learn and predict user behavior, RankBrain powered search results based on an even better understanding of users’ intent.

The focus: ensuring that content reflects user intent and optimizing for conversational search.

  • Place greater focus and emphasis on creating content that matches the user’s intent.
  • Ensure that all aspects of technical SEO are updated (schema markup, for example; a minimal sketch follows this list).
  • Google indicated that RankBrain was the third-most important ranking signal.
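
To make the schema markup point concrete, here is a minimal, hypothetical sketch of JSON-LD structured data generated in Python. The headline, author name, and date are placeholder values, and a real page would use whichever schema.org type actually fits its content.

```python
import json

# Hypothetical values -- swap in your page's real headline, author, and date.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Google Algorithms & Updates Focused On User Experience",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2022-08-03",
}

# JSON-LD is embedded in the page inside a script tag of type application/ld+json.
print('<script type="application/ld+json">')
print(json.dumps(article_schema, indent=2))
print("</script>")
```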

Google Mobile-First Indexing (2018)

The Mobile-First Indexing Update meant that Google would use the mobile version of a webpage for indexation and ranking.

Once again, this was aimed to help enhance the user experience and help users find what they are looking for.

Producing content for mobile and focusing on speed and performance became paramount to success.

The focus: re-affirming the importance of mobile optimization, content, speed, and mobile site performance.

  • Improve AMP and mobile page speed and performance.
  • Ensure that URL structures for mobile and desktop sites meet Google requirements.
  • Add structured data for both desktop and mobile versions.
  • Make sure the mobile site contains the same content as the desktop site.
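
As a rough illustration of the last point, the sketch below fetches the same URL with a desktop and a mobile User-Agent and compares raw word counts. The URL and User-Agent strings are placeholders, and a real parity audit would compare rendered content, structured data, and meta tags rather than raw HTML length.

```python
import urllib.request

URL = "https://www.example.com/"  # hypothetical page to compare

# Illustrative User-Agent strings; any realistic desktop/mobile pair works.
AGENTS = {
    "desktop": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "mobile": "Mozilla/5.0 (Linux; Android 12; Pixel 6)",
}

word_counts = {}
for name, user_agent in AGENTS.items():
    request = urllib.request.Request(URL, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(request, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    word_counts[name] = len(html.split())

# A large gap in word counts hints that the mobile version serves less content.
print(word_counts)
```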

Google had said that March 2021 was the rollout date for its mobile-first index.

Shortly afterward, Google made mobile page speed a ranking factor so website owners would focus on load times and page speed to enhance the user experience.

Broad Core Algorithm Updates (2018)

2018 was a year in which Google released several core algorithm updates, covering areas such as social signals and including the so-called Medic update.

After the August update, in particular, Google’s John Mueller suggested making content more relevant.

While there was some confusion on ranking factors and fixing specific issues, it did bring the concept of E-A-T and content for the user top of mind for many SEO professionals and content marketers.

On the subject of rater guidelines being key to the broad update, Google’s Danny Sullivan suggested:

“Want to do better with a broad change? Have great content. Yeah, the same boring answer. But if you want a better idea of what we consider great content, read our raters guidelines. That’s like almost 200 pages of things to consider.”

BERT (2019)

Following RankBrain, this neural network-based method for natural language processing allowed Google to understand conversational queries better.

BERT allows users to find valuable and accurate information more easily.

According to Google, this represented the most significant leap forward in the past five years and one of the greatest in search history.

The focus: improving the understanding of consumer intent through conversational type search themes.

  • Increase the depth and specifics of the content.
  • Work more with long-tail queries and phrases using more than three words.
  • Ensure that content addresses the users’ questions or queries and is optimized correctly.
  • Focus on writing for humans clearly and concisely so that it is easy to understand.

Read more on BERT and SMITH here.

COVID-19 Pandemic (March 2020)

The global pandemic meant that consumer behavior and search patterns changed forever as Google continued to focus on E-A-T signals.

Google began to emphasize YMYL signals as the internet struggled to cope with misinformation and SEO pros struggled to keep up with the rapid shifts and dips in consumer behavior.

From setting up 24-hour incident response teams with the World Health Organization and policing content to helping people find helpful information and avoid misinformation, the user’s needs had never been more important.

The demand for SEO rose to an all-time high, and Google released a COVID-19 playbook.

Google Page Experience Update And Core Web Vitals Announced (May 2020)

This update focuses on a site’s technical health and on metrics that measure the user experience of a page. These metrics include how quickly page content loads, how quickly a browser loading a webpage can respond to a user’s input, and how unstable the content is as it loads in the browser.

The focus: integrating new Core Web Vitals metrics to measure and improve on-page experiences.

  • Mobile-friendliness, safe browsing, HTTPS, and intrusive interstitials – The Google Page Experience Signal.
  • LCP (Largest Contentful Paint): Improve page load times for large images and video backgrounds.
  • FID (First Input Delay): Ensure your browser responds quickly to a user’s first interaction with a page.
  • CLS (Cumulative Layout Shift): Include size attributes on your images and video elements or reserve the space with CSS aspect-ratio boxes, and ensure content is never inserted above existing content except in response to user interaction.
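
One way to check the metrics listed above for your own pages is Google's public PageSpeed Insights API, which returns both lab and field (CrUX) data. The sketch below is a minimal, hedged example: the page URL and API key are placeholders, and the exact field names in the JSON response are written from memory and may differ from the current v5 output.

```python
import json
import urllib.parse
import urllib.request

PAGE = "https://www.example.com/"   # hypothetical page to test
API_KEY = "YOUR_API_KEY"            # placeholder; heavy use requires a real key

query = urllib.parse.urlencode({"url": PAGE, "strategy": "mobile", "key": API_KEY})
endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?" + query

with urllib.request.urlopen(endpoint) as response:
    report = json.load(response)

# Field (CrUX) data lives under "loadingExperience"; these metric keys are
# assumptions based on the v5 response and may need adjusting.
metrics = report.get("loadingExperience", {}).get("metrics", {})
for key in ("LARGEST_CONTENTFUL_PAINT_MS",
            "CUMULATIVE_LAYOUT_SHIFT_SCORE",
            "FIRST_INPUT_DELAY_MS"):
    metric = metrics.get(key)
    if metric:
        print(key, metric.get("percentile"), metric.get("category"))
```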

Broad Core Algorithm Updates (2020)

The third Google core algorithm update of the year rolled out in December 2020. This came in the form of slight changes that affect the order and weight of certain (not always disclosed) ranking signals.

According to SEJ Contributor Ryan Jones:

“Google aims to serve content that provides the best and most complete answers to searchers’ queries. Relevance is the one ranking factor that will always win out over all others.”

Read more on December’s Core Update here.

Passage Ranking (February 2021)

Google officially rolled out its passage-based indexing, designed to help users find answers to specific questions.

You’ve probably seen this in the wild, but essentially this allows Google to highlight pertinent elements of a passage within a piece of content that fits the question.

This means long-form content that may not be skimmable but provides valuable answers could be surfaced as a result.

Ultimately, this makes it easier for Google to connect users to content without making them hunt for the specific answer to their questions when they click on a page.

The key to success with passage ranking goes back to focusing on creating great content for the user.

Read more on the 16 Key Points You Should Know here.

Product Reviews Update (April 2021)

This new product review update was designed to improve a user’s experience when searching for product reviews.

Marketers were advised to avoid creating thin content, as this update rewards content that users find most helpful.

The focus: rewarding creators who provide users with authentic and detailed review content.

Google shared nine helpful questions to consider when creating and publishing product reviews.

  • Show expert knowledge about products.
  • Differentiate your product compared to competitors.
  • Highlight benefits and any drawbacks clearly and concisely.
  • Show how the product has evolved to fit the needs of the user.

Read more here.

MUM (May 2021)

Following RankBrain and BERT, MUM (Multitask Unified Model) technology utilizes AI and NLP to improve information retrieval.

For the end user, this technological advancement helps provide better information and results as it processes multiple media formats such as video, images, and audio.

Pandu Nayak, Google fellow and vice president of Search, said:

“But with a new technology called Multitask Unified Model, or MUM, we’re getting closer to helping you with these types of complex needs. So in the future, you’ll need fewer searches to get things done.”

Read more here.

Page Experience Update And Core Web Vitals (CWV) Rollout (June 2021)

The much-anticipated Page Experience Update, including Core Web Vitals, rolled out, with further updates to desktop following in March 2022.

Nine months after the rollout of Google’s Core Web Vitals and over a year since BrightEdge launched pre-roll predictive research, new research showed how many industries are adapting and improving their Core Web Vitals.

The focus: improving page experiences for users with speed and precision.

  • Retail giants have made significant strides in improving experiences.
  • In cases like Retail, CWV metrics like input delay have been cut in half.
  • Although Finance was the best prepared last year, it made the least performance gains in the categories evaluated.

Spam Update (June 2021) And Link Spam Algorithm Update (July 2021)

Ensuring users get the right results based on their searches is foundational to a good experience.

In addition, updates and algorithm changes help protect users’ privacy to keep searches safe and secure.

The focus: keeping user experiences safe.

Learn more in this video from Google here.

Local Search Update (November 2021)

Google has always provided updates for local search users and fine-tuned its algorithm for better local results.

Local search is a huge channel, not to be underestimated, but a whole other post.

This also includes guidance on how businesses can improve their local ranking for better customer experiences.

Read more here.

Product Algorithm Update (March 2022)

On March 23, 2022, Google provided an update based on how product reviews had been performing over the past year.

It also informed the community of further rollout updates intended to help users surface accurate and relevant information for purchasing decisions.

The focus: user experience and surfacing results that make purchasing decisions easier for users.

  • As always, showcase your expertise and ensure the content is authentic.
  • Share why you recommend products with evidence to support it.

Read more advice here and here.

Conclusion

A successful user experience requires a combination of content and technical expertise. Updates and guidance help marketers create content for the user.

In addition, algorithms and technological advancements help Google surface better results and showcase accurate, relevant, and trustworthy content.

Google will continue to focus on improving experiences for its users.

As a marketer who wants to optimize for both, ensuring your website performs well (from navigation to speed and reliability) and focusing on content are vital.

Many of Google’s updates signal that technical SEO, data science, and content marketing excellence are coming together.

Stay up to date and read through all of Google’s Updates here on SEJ.


Source: https://www.searchenginejournal.com/ux-google-algorithm-updates/459267/ (Wed, 03 Aug 2022)

Marketing Briefing: ‘A very nervous bunch of CMOs’: How the ripple effects of the current uncertainty are affecting the ad industry

Ad agency execs and pitch consultants say that overall marketers seem to be cautious, pulling back on budgets and focusing on project work rather than agency-of-record pitches. Agencies, meanwhile, have been tossing their hat in the ring for smaller pieces of business than usual and project work they wouldn’t normally go after during sunnier times, according to agency execs and pitch consultants. Some shops have also tapped their more seasoned or well-known creatives to sit in or pitch on pitches where they’d normally have more junior creatives or execs take the reins. 

“Pitches are down 39% this year compared to last year,” said Greg Paull, co-founder and principal of search consultancy R3. “You’ve got a very nervous bunch of CMOs who are probably pushing things back to the second half of the year to make shifts. There’s also been more project-based pitches over large global pitches.” 

Lisa Colantuono, president of search consultancy AAR Partners, echoed that sentiment, adding that she’s seen “cautious continuation” from clients who are dealing not only with the potential recession but also with ongoing supply chain and talent issues, which have had some clients pulling back on budgets.

“Clients are watching their budgets just as we are watching our own,” said Celeste Bell, evp and director of human resources at Deutsch NY, when asked about the current economic climate’s impact on the agency’s clients. Bell noted that Deutsch is still hiring for positions and hasn’t done layoffs as clients are still moving forward with planned work. Even so, the agency is “trying to know what clients know when they know it,” noted Bell. 

The belt tightening at agencies has some job seekers worried. “People are getting nervous,” said ad recruiter Christie Cordes. “Ad agency CFOs are allowing hiring, but acting bearish on capping salaries – in other words, there is very little tolerance right now, to approve more in budgets to ‘secure a candidate hire.’ To me, that looks like bear-ish behavior. The candidate wants $15K over our budget – cut them loose, rescind the offer. It’s not a good sign, especially considering how tight the talent market still is.” 

That’s not to say all agencies will feel the pinch the same way. Some say that shops with specialties – whether that’s in Web3 or influencer marketing or even sustainability – may weather the storm easier than traditional shops as clients increasingly seek out efficiencies associated with working with agencies with a specific niche.

3 Questions with Megan Jones, vp of marketing at January Digital marketing agency and consultancy

As Q4 approaches, what’s top of mind for January Digital?

The focus for us right now continues to be growth. We’re growing very quickly. Since the beginning of the pandemic, we’ve more than doubled in size. For us, it’s maintaining a really solid product while continuing to grow, and also keep true to who we want to be and what culture we want to build. That’s something cool about the pandemic in a way. Alongside everyone else, you’re learning as you go and then something changes.

What does company culture look like as your team is in growth mode?

One of the things we’ve done that’s worked really well is — and we’ve brought in our first real class — we’ve started an intern program. If you intern with us, you can join in as a senior coordinator–an entry level position. But also, leaning into our values in everything we do. A great example of that is, especially if you’re remote, you have to have a centralized place for your learning and development. It has to be easily accessible, searchable. Then, having really good soft skills training. It’s not just about learning those taxable things. If you can teach someone soft skills, then you get them involved in the culture from very early on.

In that respect, what have been some of the biggest challenges or learnings? 

The biggest challenge has been the fast growth combined with this new environment. Even though that’s been a challenge, we were very focused on keeping that growth very diverse and inclusive, and bringing in different perspectives, backgrounds. Communication is the hardest thing to get right in a remote environment and a changing environment. What we’ve also learned is that during change, people just create the minutia. They want so much information. Even if you feel like you’re being transparent, sometimes we would get the feedback that we weren’t being transparent enough. What we learned is we have got to spell it out for people. — Kimeko McCoy

By the Numbers

TikTok isn’t the only thing that’s keeping shoppers glued to their mobile screens. According to new research from software service company Newstore, at least one in three U.S. consumers prefer mobile shopping apps over all other channels. Find a breakdown of the study below:

  • 71% of consumers are ‘very interested’ or interested’ in mini apps (i.e., apps that don’t require downloading), according to the study
  • 60% of consumers who responded to the survey reportedly prefer mobile shopping apps over mobile websites due to improved user experience
  • 45% of consumers would not download a brand’s app if they have concerns about security or privacy, per the survey — Kimeko McCoy

Quote of the Week

“Advertisers are concerned about the ROI they are currently experiencing from walled garden platforms (Snapchat, Twitter, Facebook, Google, etc). Clients are reassessing their spends and searching for more cost efficient digital advertising sources that are more transparent and open in their measurement in delivering an efficient ROI.” 

— said Mark Walker, CEO of ad tech group Direct Digital Holdings, on clients’ reassessment of ad spending and efficiencies amid the economic downturn.


Source: https://digiday.com/marketing/marketing-briefing-a-very-nervous-bunch-of-cmos-how-the-ripple-effects-of-the-current-uncertainty-is-affecting-the-ad-industry/ (Tue, 09 Aug 2022)

Reinventing Google: The underdog search engine with a daring new vision for the web

The company draws from the value pot with both hands, making billions from the ads in its search listings (which are populated with material scraped for free from the web) and earning a commission on ...

Source: https://www.msn.com/en-us/news/technology/reinventing-google-the-underdog-search-engine-with-a-daring-new-vision-for-the-web/ar-AA10nz3q (Sat, 06 Aug 2022)

Negative SEO: How black hat marketers abuse Google’s rules vs toxic backlinks

MANILA, Philippines – When online, you’ve probably encountered text or images on web pages which allow you to navigate to other web pages. They’re called hyperlinks. 

Hyperlinks tell the user, or other web applications, that more information or data about the subject matter can be found on the other page or online address. The more useful information a page has about a particular subject matter, the more chances it gets to be read and receive more links.

Among digital marketing professionals who practice search engine optimization (SEO), these are called backlinks. And they are worth their weight in gold. 

Search engine giants like Google have been known to use these backlinks as one of the signals to gauge the importance of a page in relation to a subject matter. Websites and pages that get a lot of backlinks are placed higher in search engine results pages. 
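
As a toy illustration of why inbound links matter, the sketch below runs a few iterations of a simplified PageRank-style calculation over a made-up link graph. The domain names are hypothetical, and real ranking systems weigh far more signals (and handle dangling pages, link quality, and spam) than this simplification does.

```python
# A made-up link graph: each key links out to the pages in its list.
links = {
    "news-site.example": ["shop.example", "blog.example"],
    "blog.example": ["shop.example"],
    "forum.example": ["shop.example", "news-site.example"],
    "shop.example": [],
}

damping = 0.85
scores = {page: 1.0 / len(links) for page in links}

# A few simple iterations; dangling pages and link quality are ignored here.
for _ in range(20):
    new_scores = {}
    for page in links:
        inbound = sum(
            scores[source] / len(targets)
            for source, targets in links.items()
            if page in targets
        )
        new_scores[page] = (1 - damping) / len(links) + damping * inbound
    scores = new_scores

# shop.example collects the most inbound links, so it ends up with the top score.
for page, score in sorted(scores.items(), key=lambda item: -item[1]):
    print(f"{page:20s} {score:.3f}")
```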

News websites, particularly those that regularly produce updated, unique, credible, and informative content, rank well in search results because they get a lot of backlinks. 

Gaming the system via artificial link building

In contrast, other commercial or marketing websites which do not regularly produce original content on a daily basis have a harder time getting these backlinks. 

So how do marketers get around this? 

They reach out to high-authority sites in hopes of getting them to link back. Rappler, for example, because it ranks well in results pages, has been receiving many of these requests for years.

In time, wily SEO practitioners have taken to gaming the search algorithms to make websites they are promoting more visible in search results pages. A usual practice was to set up numerous sites to artificially build these backlinks. 

This isn’t hard to do.

The World Wide Web abounds with services promising to automate this process of building links to your site. A quick search on Google will lead you to services that even automate the process of building websites. These websites can also be easily populated by tools that take content from other websites and “spin” them to make them seem different from the source site. 

The industry now refers to these unethical, manipulative techniques as black hat SEO. And Google, whose avowed mission is “to organize the world’s information and make it universally accessible and useful,” has been at war with these black hat operators for years.

For these black hat SEO practitioners, it does not matter if the websites built have poor quality to very little content. What matters are the backlinks. 

The search optimization industry refers to backlinks built through these link building schemes as toxic backlinks. Due to the prevalence of these toxic websites on the web, Google released a series of algorithm updates beginning in 2012 that aimed to discourage or minimize the practice in their search results pages.

Attack tool 

Beyond using it to promote sites and pages, this technique has also been used as an attack tool against competition.

Stacy (not her real name), a digital marketing practitioner who used to work for an Australian SEO company, spoke to Rappler about one case in late 2021 where this technique has been used to target competition. 

The marketer recalls finding a sudden dip in search traffic to the product website of their client, a local retailer. To identify the cause of the drop, Stacy said they inspected various indicators. For instance, did the articles with backlinks still exist? Were the stories that had these links removed, or were the hyperlinks broken?
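
A first pass at that kind of check can be automated. The sketch below is a minimal, assumption-heavy example: the client domain and backlink URLs are hypothetical, it relies only on the standard library, and a real audit tool would also follow redirects, respect robots.txt, and inspect rel attributes such as nofollow.

```python
import urllib.request
from html.parser import HTMLParser

TARGET_DOMAIN = "client-shop.example"          # hypothetical client site
BACKLINK_PAGES = [                             # pages that are supposed to link to it
    "https://partner-blog.example/review",
    "https://news-site.example/feature",
]

class LinkCollector(HTMLParser):
    """Collects every href found in <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.hrefs.extend(value for name, value in attrs if name == "href" and value)

for page in BACKLINK_PAGES:
    try:
        with urllib.request.urlopen(page, timeout=10) as response:
            html = response.read().decode("utf-8", errors="replace")
    except Exception as error:                 # page removed, blocked, or timing out
        print(f"{page}: could not fetch ({error})")
        continue

    collector = LinkCollector()
    collector.feed(html)
    still_linked = any(TARGET_DOMAIN in href for href in collector.hrefs)
    print(f"{page}: {'link present' if still_linked else 'link missing'}")
```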

This process led them to one indicator: a huge jump in backlinks from toxic domains. 

“We completely believed it was their competitor paying their online marketers to improve their own ranking,” Stacy said. “It was a competitive industry and they were too targeted to be random.” 

At first, Google said it only devalues the low quality links accumulated by websites through these black hat link building schemes. That should have been the end of it, except that black hat operators found another way to still make themselves relevant: by using the same techniques to sabotage traffic going into the websites of the competitors of their clients. 

Industry experts refer to this as “negative SEO.” For years, Google has been denying that such techniques work. As late as March 2021, John Mueller, Web Trends Analyst at Google, argued that “negative SEO” is nothing but a meme.

Then in October 2021, following a new update to the Google search algorithm, Mueller admitted that in some cases, where there is a clear pattern of spammy and manipulative links by the site, their algorithm may decide to simply distrust the whole site.

Mueller was responding to a question about how “toxic backlinks” affect the visibility of a website in search results. This was his response to the question: “For the most part, when we can recognize that something is problematic or is a spammy link, we will try to ignore it. If our systems recognize that they cannot ignore these links to the website, if they see a very strong pattern there, it can happen that our algorithms say well, we really have lost trust in this website.”  

Mueller conceded that Google tends to be conservative in its approach to this problem. “The Web is very messy and Google ignores the links out there.” He said that this drop usually happens “when there is a clear pattern.” 

Fighting toxic links, going after black hat operators

What do website owners need to do when they are targeted by toxic backlinks?

One way is to disavow these bad links, according to Google and SEO practitioners. 

DISAVOW. Screenshot of Google’s disavow tool, which allows webmasters to request for Google to ignore toxic links to a domain.

Unfortunately, not every website owner will have the staffing or the tools necessary to detect, much less fight spammers on a regular basis. 

The hard part here is sifting through the mess of backlinks and identifying which links are desirable and which ones are not – which can be a tedious process. It’s a tricky business and Google itself advises website managers to use the disavow tool with caution. 
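
For sites that do decide to disavow, the file Google's tool accepts is plain text: one full URL or one "domain:" line per entry, with "#" comments. The sketch below is a deliberately crude illustration with made-up domains and a single volume-based heuristic; real audits combine many signals and manual review before anything is disavowed.

```python
# Hypothetical referring domains pulled from a backlink report:
# domain -> number of pages on that domain linking to the site.
referring_domains = {
    "respected-news.example": 3,
    "industry-forum.example": 12,
    "cheap-links-farm.example": 4200,
    "spun-articles.example": 1800,
}

# Crude heuristic: an implausibly high count of linking pages from a single
# domain is treated as a sign of automated link building. This threshold is
# an assumption for illustration only.
SUSPICIOUS_THRESHOLD = 500
toxic = sorted(domain for domain, count in referring_domains.items()
               if count >= SUSPICIOUS_THRESHOLD)

with open("disavow.txt", "w", encoding="utf-8") as handle:
    handle.write("# Domains disavowed after manual review\n")
    for domain in toxic:
        handle.write(f"domain:{domain}\n")   # 'domain:' disavows the whole domain

print("Flagged for review:", ", ".join(toxic))
```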

Beyond identifying toxic links, holding those responsible for the sabotage can be even harder. As in the rest of digital space, bad actors can lurk behind anonymous accounts and proxies. 

“We didn’t find out if the competitor actively requested this attack,” Stacy said.

It is also likely, she added, that the competitor did not actively request the attack. “It could have just been ‘part of the service’ to improve SEO for the competitor, without explaining the black hat tactics to the client.” 

Buyer and reader, beware. – Rappler.com 

Source: https://www.rappler.com/technology/negative-seo-how-black-hat-marketers-abuse-googles-rules-vs-toxic-backlinks/ (Fri, 05 Aug 2022)

Sick of Google’s tracking? DuckDuckGo just added all these privacy features

DuckDuckGo is known for private searching but had a deal that let Microsoft track users. These new features should quell your fears.

Source: https://www.msn.com/en-us/news/technology/sick-of-google-e2-80-99s-tracking-duckduckgo-just-added-all-these-privacy-features/ar-AA10oCri (Sat, 06 Aug 2022)

Why Google Is Delaying Plans To Block Third-Party Cookies In Chrome
Google announced its Privacy Sandbox initiative back in 2019 with the stated goal of changing the online advertising business that supports much of the web to use more privacy-preserving technologies. At present, online advertising relies heavily on third-party cookies that track users’ behavior across different websites, often without their knowledge or consent. The prevalence and number of third-party tracking cookies has grown over time, making third-party cookies an increasingly large threat for those concerned about privacy and data sovereignty. A recent study found that most government websites serve tracking cookies, which is a particularly concerning finding when it is often necessary to visit government websites to fulfill certain legal requirements or find important information.
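
Mechanically, the "third-party" part of a tracking cookie comes down to a few attributes on the Set-Cookie header. The sketch below, using only Python's standard library, builds a tracking-style cookie marked SameSite=None; Secure, which is what lets a browser send it in cross-site contexts; the domain and values are hypothetical.

```python
from http.cookies import SimpleCookie

# Hypothetical third-party tracking cookie; domain and value are made up.
tracker = SimpleCookie()
tracker["uid"] = "abc123"
tracker["uid"]["domain"] = ".ads.tracker.example"
tracker["uid"]["path"] = "/"
tracker["uid"]["secure"] = True       # required by browsers when SameSite=None
tracker["uid"]["samesite"] = "None"   # allows the cookie to be sent cross-site

# A first-party style cookie would typically use SameSite=Lax instead, which
# browsers withhold from most cross-site requests.
print("Set-Cookie:", tracker["uid"].OutputString())
```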

Google intends for Privacy Sandbox to provide a number of different APIs that will enable online advertising to shift away from third-party cookies, ultimately culminating in the phasing out of third-party cookies entirely. However, Google has encountered some speed bumps while developing these APIs. The company originally planned to phase out cookies and switch over to its new APIs in 2022, but later pushed the phase out back to the second half of 2023. However, Google met strong resistance to its Federated Learning of Cohorts (FLoC) API, which was supposed to inform advertisers about users’ browsing habits. 

The Electronic Frontier Foundation acknowledged that FLoC would do away with the privacy risks associated with third-party trackers, but contended that it would introduce new privacy risks, calling FLoC a “terrible idea.” Aside from privacy concerns, FLoC met resistance from some of the biggest players on the web, including Amazon, which decided to block FLoC across all of its websites. FLoC also drew attention from antitrust bodies who saw it as a means to potentially consolidate online advertising technology under Google’s control.

Google eventually announced that it was abandoning FLoC in favor of a new API known as Topics. This switch to a new API still in development likely delayed Google’s plans for fully implementing its Privacy Sandbox APIs and phasing out of third-party cookies. Anthony Chavez, the vice president of Privacy Sandbox, announced in a blog post today that Google is pushing back the full release of Privacy Sandbox in Chrome and the depreciation of third-party cookies in order to provide the company more time to test the APIs.

These APIs are already available for testing by developers, but Google plans to expand user trials starting in early August by enrolling millions of Chrome users in these trials. The number of participants will continue to expand until Q3 2023, when Google hopes to fully launch the Privacy Sandbox APIs for general use. The company will then let some time pass, presumably giving advertisers time to switch away from third-party cookies to these new APIs, before disabling third-party cookies. The third-party cookie phase out is now scheduled for the second half of 2024. We’ll have to see whether Google manages to hit these new time projections or not.

Source: https://hothardware.com/news/why-google-delaying-plans-block-third-party-cookies-chrome (Thu, 28 Jul 2022)

‘I highly doubt this is the last deadline’: Why Google’s daunting balancing act leaves the cookie’s fate open-ended

Google has confirmed what many have speculated: it will further delay phasing out third-party cookies in its Chrome web browser until the second half of 2024.

The economy being rocky could be making people redeploy resources that have been working on Privacy Sandbox

Paul Bannister, CSO, Cafe Media

It’s the second such postponement in a little more than 12 months. In January 2020, when Google initially said it would phase out third-party cookies, it said it expected to make a move in 2022. Last year, that deadline was moved to late 2023.

Although, for some, the fact that news of the delay was leaked (Insider first reported it on Wednesday) is indicative of just how unwieldy a task Google has ahead of it.

True, Google Chrome is an outlier when compared to the likes of Apple’s Safari, which began rolling back support for third-party cookies years ago. However, the scale of its footprint and Google’s dominance of the media market paints a target on its chest, a reality that requires deft political maneuvering from the online colossus.

In a blog post, Anthony Chavez, vp of Privacy Sandbox at Google, noted how feedback from the media industry in forums such as the W3C, not to mention oversight from antitrust bodies such as the U.K.’s Competition and Markets Authority (CMA), prompted the delay.

“This feedback aligns with our commitment to the CMA to ensure that the Privacy Sandbox provides effective, privacy-preserving technologies and the industry has sufficient time to adopt these new solutions,” he wrote. “For these reasons, we are expanding the testing windows for the Privacy Sandbox APIs before we disable third-party cookies in Chrome.”

High stakes game

Wayne Blodwell, CEO of consultancy firm TPA Digital, said the latest delay in progressing the Privacy Sandbox experiments is not a surprise. This is because Google effectively has to play this “high stakes game” with one arm tied behind its back given the scrutiny it faces from regulators — let’s not forget it has pledged to roll out any measures agreed with the CMA globally.

“I also highly doubt that this is the last deadline pushback we see,” he claimed in a written statement, adding that while “many may sigh” at the latest announcement, much progress has been made since the prospect of sunsetting cookies was first mooted in 2019.

Blodwell added, “It’s still fascinating to me that Apple doesn’t appear to be under any scrutiny, yet Google are having to go through major hoops with regulators before they do anything, I wish someone would really take a look at that.”

Several sources told Digiday that many independent ad tech companies — those that stand to be impacted most by the retirement of traditional identifiers — have made significant progress with potential substitute ad targeting tools.

Although “scalable and democratized technical solutions” that can help monetize long-tail web properties are currently lacking, Blodwell added that he was confident ideal solutions would present themselves eventually.

Why the delay?

Sources involved with Prebid.org, a body geared toward developing industry standards for sell-side players, also noted how staffers at Chrome (a team distinct from their peers at Google Ads) have been interacting with third parties such as ad tech players and publishers.

Paul Bannister, chief strategy officer at Cafe Media, said, “I was somewhat optimistic [of real progress] earlier this year when they announced the Topics API, a new Origins trial and all of the new information came out about FLEDGE because other ad tech companies were testing things… it felt like we had real momentum.”

Bannister, whose team has been in direct discussions with the Google Chrome team, speculated that the ongoing economic uncertainty may have played a role in causing the latest delay as multiple companies in the space (not just Google) have been forced to prioritize generating revenues in the near term.

He added, “I think the economy being rocky could be making people redeploy resources that might have been working on Privacy Sandbox features, which would have been for next year or beyond, to focus on short-term initiatives, I think that may have contributed to a slowing of things.”

Incentivized delay?

However, some expressed a degree of cynicism, indicating a belief that the successive delays are just red meat Google is throwing to keep regulators at bay, now that the potential divestiture of (at least some of) its ad tech offerings is in prospect.

“Isn’t this the headline they [Google] needed to show they were moving in the right direction but they’re just not prepared for it and they’re just delaying it over and over again when they really just don’t want it to happen,” said one source, who was not authorized to speak to press.

A separate source, who similarly declined to be named given their employer’s PR policy, characterized the continued delay as “a distraction tax” which ultimately meant the continuation of the status quo, and Google’s unrivaled hegemony of the online advertising sector.

“Google is very happy to force its rivals to pay [to use its ad tech stack] as long as it possibly can,” added the source. “Even if we could all agree [on a suitable set of ad targeting tools to substitute cookies] tomorrow, Google would benefit from the uncertainty… there’s no incentive for Google to reach a conclusion, I think they’ll delay, yet again.”

Source: https://digiday.com/marketing/i-highly-doubt-this-is-the-last-deadline-why-googles-daunting-balancing-act-leaves-the-cookies-fate-open-ended/ (Wed, 27 Jul 2022)

Google’s new Play Store rules target annoying ads and copycat crypto apps

Google is trying to cut down on annoying, unskippable ads in Android apps and overall bad behavior in the Play Store (via TechCrunch). The company announced wide-ranging policy changes on Wednesday ...

Source: https://www.msn.com/en-us/money/other/google-e2-80-99s-new-play-store-rules-target-annoying-ads-and-copycat-crypto-apps/ar-AA104ppm (Thu, 28 Jul 2022)

Google Ads Revenue Up 13.5% & Microsoft Bing Ads Revenues Up 15% But Slowed Growth

Both Google (Alphabet/GOOG) and Microsoft Bing (MSFT) announced earnings last night, and while both missed expectations, the results were not that bad and their respective stock prices are up after hours. If you zoom into Google Ads revenue, it was up about 13.5%, and Bing Ads were up about 15%.

  • Google Search ad revenue went from $35,845 to $40,689 (in millions), an increase of 13.5 percent (see the quick check after this list)
  • Microsoft Bing search ad revenue increased 15%
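
The year-over-year percentage above is easy to sanity-check from the reported figures; the quick calculation below uses the Google Search numbers quoted in the first bullet.

```python
# Google Search ad revenue, in $ millions, as quoted in the first bullet above.
q2_2021 = 35_845
q2_2022 = 40_689

growth = (q2_2022 - q2_2021) / q2_2021
print(f"YoY growth: {growth:.1%}")   # -> YoY growth: 13.5%
```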

The previous quarter, Google and Microsoft saw their search ad revenue up about 22% each, and the quarter before that, they were both up about 32% each.

So while Google keeps making insane money, there may be a slowdown.

Google reported search ad Q2 revenue up nearly 14% YoY to $40.69B, vs. $40.15B est., suggesting it may withstand a global recession better than smaller rivals. Alphabet's $69.69B revenue marks its slowest sales growth since Q2 2020, missing estimates by almost $190M, as macroeconomic pressures mount on the ad market. Microsoft misses Q4 estimates with $51.87B in revenue, vs. $52.44B estimated, citing changing exchange rates plus challenges in the advertising and PC markets.

Nicole at Search Engine Land dug more into the details after listening to the earnings call.

Forum discussion at WebmasterWorld.

Source: https://www.seroundtable.com/google-microsoft-earnings-33821.html (Barry Schwartz, Tue, 26 Jul 2022)

Google Ads, AdSense, Analytics, and other platforms suffering reporting outages, again

Google’s various ad platforms are suffering yet another reporting outage, with serious delays to the Google Ads console, Google AdSense reporting, and Google Analytics dashboards; other platforms are impacted as well.

Confirmed. Google has confirmed the issues this morning, which we believe started at around 3 am ET on Tuesday, July 19th, based on the complaints we have seen from the search marketing industry. Google wrote, “We’re investigating reports of an issue with Google Ads. We will provide more information shortly. The affected users are able to access Google Ads, but may not have access to the most recent data.”

Ginny Marvin, the Google Ads Liaison, also posted about this on Twitter:

What is impacted? It seems that many Google Ads platforms are impacted, including:

  • Google Ads
  • Google AdSense
  • Google Analytics
  • Search Ads 360
  • Display and Video 360
  • Campaign Manager 360
  • Google Ad Manager

Here is the current incident report at the time of reporting this:

Again. Yes, again. We reported about similar issues this past Friday. Google resolved those issues several hours later but new or existing issues have come up again this morning.

Why we care. As Nicole wrote on Friday, incorrect reporting gives way to incorrect adjustments, optimization, and changes. Check your ads dashboard. If your numbers look incorrect, you may want to hold off on making any big changes until the issue has been resolved.

Resolved. The issues listed above are now all resolved as of the following morning. Here is an update from Google:

About The Author

Barry Schwartz is a Contributing Editor to Search Engine Land and a member of the programming team for SMX events. He owns RustyBrick, a NY-based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on very advanced SEM topics. Barry can be followed on Twitter here.

Source: https://searchengineland.com/google-ads-adsense-analytics-and-other-platforms-suffering-reporting-outages-again-386583 (Barry Schwartz, Tue, 19 Jul 2022)