Clemenger Media Sales (CMS) is Australia’s home of niche media.

Australia has hundreds of smaller niche publications. These consumer, business, and trade publications play a vital role within their specific industries.

If your business wants to connect with its target market, these publications and news portals are the key to managing stakeholders, customers, and prospects.

Content marketing is vital, advertising is vital, and working with the media to position you as a market leader is key to ongoing profitability, communications management, and demand management.

Whether the message concerns a new product, a pending sale, a merger, or a rebranding: if you have a message, you need the media.

Clemenger / CMS is your key to advertising, marketing, media, and sales.

We share this news to help you live and learn. Best, Tony Clemenger.

 

Investigating YouTube’s algorithmic black box

By Rande Price, Research VP – DCN @Randeloo

 

We’ve all taken a trip down the YouTube rabbit hole. It starts off innocently enough: we view a video recommended by a friend. Then, 30 to 60 minutes and countless videos later, an alarm goes off. We realize we are watching a video that clearly violates YouTube’s standards and practices. But wait just a minute: this video was recommended by YouTube itself.

That’s right, the platform recommends videos that violate its own standards. It is difficult to explain how this happens because YouTube provides little transparency into its recommendation algorithms (or any of its algorithms, really). The rabbit hole turns out to be an algorithmic black box.

While there are penalties for video violations, YouTube’s determination process for prohibited content is often unknown. Its actions range from demonetization to the removal of individual videos and suspension of an entire account, or nothing at all. Not much is known about the end results. Meanwhile, there is little a user can do on the platform to prevent the targeting and amplification of regrettable (if not deplorable) content.

RegretsReporter

Consequently, Mozilla, a non-profit creator of browsers, apps, and tools, stepped in to shed light on YouTube’s black box of algorithms. It created the RegretsReporter browser extension, a crowdsourcing tool to help users identify their path to a “regrettable” YouTube video. The extension captures the user’s YouTube browsing behavior, but tracking covers only the five hours prior to initiating a report. Further, the data is shared with Mozilla only if the user actively agrees.
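The report-scoping logic described above, a bounded look-back window plus an explicit consent gate, can be sketched as follows. This is an illustrative sketch only; the names and data model are assumptions, not Mozilla’s actual implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

# Hypothetical constant: RegretsReporter only looks back five hours.
WINDOW = timedelta(hours=5)

@dataclass
class WatchEvent:
    video_id: str
    watched_at: datetime

def build_report(history: List[WatchEvent],
                 reported_at: datetime,
                 user_consented: bool) -> List[WatchEvent]:
    """Return the events eligible for sharing: nothing without consent,
    and never anything older than the five-hour window before the report."""
    if not user_consented:
        return []  # data goes to Mozilla only if the user actively agrees
    cutoff = reported_at - WINDOW
    return [e for e in history if cutoff <= e.watched_at <= reported_at]
```

For example, given a report filed at noon, a video watched six hours earlier is excluded while one watched two hours earlier is included; with consent withheld, nothing is shared at all.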

Mozilla’s newly released report, YouTube Regrets, is an analysis of shared browser data from over 37,000 YouTube users in 91 countries. In all, the pathways and content of 3,362 regrettable videos are explored in this study. It’s the largest-ever crowdsourced investigation into YouTube’s algorithms.

Regrettable videos, often recommended

This analysis finds that 12.2% of the reported videos (roughly one in eight) “should not be on YouTube” or “should not be proactively recommended,” based on YouTube’s Community Guidelines. The most frequently reported videos include misinformation and violent or graphic content. Covid-19 misinformation is categorized separately from “general” misinformation because of the volume of videos; this category alone comprises a third of all categorized video regrets.

Unfortunately, YouTube’s own recommendations account for 71% of the regretted videos reported to Mozilla. And more than two-fifths (43%) of those recommended regrets are completely unrelated to the videos the user had been watching.

Mozilla and other researchers cannot confirm YouTube’s claims of progress in correcting its algorithms. YouTube provides no insight into the design and operation of its recommendation systems. Therefore, Mozilla sees these recommendations as necessary next steps in YouTube’s accountability:

  1. Allow independent audits of recommendation systems.
  2. Provide information about how recommendation systems work.
  3. Give users more control over which of their data is used to generate recommendations.
  4. Implement risk assessment programs to identify and evaluate the possibility and magnitude of harm caused by the recommendation system.
  5. Provide users with an option to opt-out of personalized recommendations in favor of receiving chronological, contextual, or search term-based recommendations.
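The opt-out in recommendation 5 essentially amounts to swapping the ranking key: personalized feeds rank by predicted engagement, while the opt-out path ranks by something behavior-free, such as recency. A minimal illustration, with entirely hypothetical field names:

```python
from typing import Dict, List

def recommend(videos: List[Dict], personalized: bool) -> List[Dict]:
    """Hypothetical feed selector: engagement-ranked when the user opts in,
    plain reverse-chronological when they opt out (no behavioral signals)."""
    if personalized:
        # The opaque, engagement-optimized path Mozilla critiques.
        return sorted(videos,
                      key=lambda v: v["predicted_watch_time"],
                      reverse=True)
    # Opt-out path: newest first, using only public metadata.
    return sorted(videos, key=lambda v: v["published_at"], reverse=True)
```

The same idea extends to the contextual and search-term-based feeds Mozilla mentions: each is just a different, transparent ranking key that does not depend on the user’s watch history.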

Mozilla and other researchers believe there are significant consequences connected to YouTube’s algorithms. Further, many believe the algorithms are optimized in favor of YouTube’s business model: increasing users’ time spent in order to serve more advertising. The amplification of regrettable video content, including pseudo-science, 9/11 conspiracy theories, animal mistreatment, and white supremacy, cannot be an acceptable byproduct of YouTube’s business model. Clearly, transparency is needed to identify and resolve the problems embedded in its recommendation ecosystem.

 

Clemenger Media Sales (CMS), with Clemenger Consulting, is an advertising, marketing, and media buying agency. Whether you want to buy radio, magazine, digital, or TV advertising, or are looking for content marketing or media sponsorship, Clemenger can help. We share this news to help you learn, live, and continually improve. Call or email if you need help with your media, advertising, or marketing investment: we are Australia’s home of niche media.
