Tag: Insights

September 30, 2020

Pinterest Adds New Ad Slots, More Insights for Marketers, Ahead of the Holiday Shopping Push


With the holiday shopping season about to hit, all the major social platforms will be rolling out new tools and features to help businesses maximize their seasonal promotions, and ideally, recoup what they can after a massively interrupted year.

This week, Pinterest has outlined its latest tools to help businesses tap into the rising number of online shoppers, many of whom are now using Pinterest as a kind of virtual shopping mall for unique, artisan products.

First off, Pinterest is adding more ad slots to help businesses reach people when they’re searching for products to buy.

Pinterest ad slots

As explained by Pinterest: 

“As more people use Pinterest to shop and look for ideas and products from brands and retailers, we’re integrating ads into more shopping experiences across Pinterest to deliver relevant content where it’s welcomed by shoppers.”

Now, advertisers will be able to place ads in Pinterest Lens matches (as shown above), the ‘Shop’ tab within Pinterest search, and even shopping matches on Pins.

The new placement options are being rolled out over time, and will first be made available to businesses in the US and UK.

In addition to this, Pinterest is also looking to give Pin marketers more data on the path to conversion from each Pin, with new, in-depth insights on specific Pin performance.

Pinterest conversion insights

“With new conversion insights, Pinterest Verified Merchants and Shopify retailers can easily see the impact of both their paid and organic Pinterest content on their site visits and checkouts, making their shopping efforts not just impactful, but also measurable.”

As you can see above, the new listings provide a more specific split between organic and promoted Pin metrics, with individual Pin listings that highlight the top performers. The format is similar to Facebook’s Page Insights listings, which should make it easier for those already managing a Facebook Page to interpret.

Pinterest is also expanding the availability of its personalized shopping recommendations, which it first launched in the US last year.

Pinterest style ideas

The option will provide more ways for UK Pinners to find related products, and for brands to gain more exposure through related matches.

At this stage, the new features are only being launched in the US and UK, which Pinterest notes are two of its top markets.

For businesses in these markets, the new options provide additional ways to maximize exposure for their products on the platform – and with Pinterest usage increasing more than any other social network’s during the pandemic (outside of TikTok), it’s worth taking a look, considering why more people are turning to the platform for online shopping, and assessing whether it might be a good fit for your brand.

You can read more about Pinterest’s latest updates here.



September 27, 2020

Is Facebook Bad for Society? New Insights on the Company’s Approach Raise Important Questions


Is Facebook a positive or negative influence on society – and does the company, or indeed anybody in a position to enact any type of change, actually care either way?

This question has been posed by many research reports and academic analyses over the years, but seemingly, the broader populace, at least in Western nations, really hadn’t given it a lot of consideration until the 2016 US Presidential Election, when it was revealed, in the aftermath, that foreign operatives, political activists and other groups had been using Facebook ads, posts and groups to influence US voter activity.

Suddenly, many realized that they may well have been manipulated, and while The Social Network has now implemented many more safeguards and detection measures to combat ‘coordinated inauthentic behavior’ by such groups, the concern is that this may not be enough, and it could be too late to stop the dangerous impact that Facebook has had, and is having, on society overall.

Facebook’s original motto of ‘move fast and break things’ could, indeed, break society as we know it. That may seem alarmist, but the evidence is becoming increasingly clear.

Moving Fast

The launch of Facebook’s News Feed in September 2006 was a landmark moment for social media, providing a new way for people to engage with social platforms, and eventually, for the platforms themselves to better facilitate user engagement by highlighting the posts of most interest to users.

At that time, Facebook was only just starting to gain momentum, with 12 million total users, though that was already more than double its total audience count from the previous year. Facebook was also slowly consuming the audience of previous social media leader MySpace, and by 2007, with 20 million users, Facebook was already working on the next stage, and how it could keep people more engaged and glued to its app.

It introduced the Like button in 2009, giving users a simple, low-effort means to indicate their interest in a post or Page, and that same year it rolled out the News Feed algorithm, which took various user behaviors into account to define the order in which posts would appear in each individual’s feed – which, again, focused on making the platform more addictive, and more compelling.

And it worked – Facebook usage continued to rise, on-platform engagement skyrocketed, and by the end of 2009 Facebook had more than 350 million total users. It almost doubled that again by the end of 2010, and hit a billion total users in 2012. Clearly, the algorithm approach was working as intended – but again, in reference to Facebook’s creed at the time, while it was moving fast, it was almost certainly already breaking things in the process.

Though what, exactly, was being broken was not clear at that stage.

This week, in a statement to a House Commerce subcommittee hearing on how social media platforms contribute to the mainstreaming of extremist and radicalizing content, former Facebook director of monetization Tim Kendall criticized the tactics the company used during its growth phase, and continues to employ today, which put massive emphasis on maximizing user engagement while largely ignoring the potential consequences of that approach.

As per Kendall (via Ars Technica):

“The social media services that I and others have built over the past 15 years have served to tear people apart with alarming speed and intensity. At the very least, we have eroded our collective understanding – at worst, I fear we are pushing ourselves to the brink of a civil war.”

Which seems alarmist, right? How could a couple of Likes on Facebook lead us to the brink of civil war? 

But that reality could actually be closer than many expect – for example, this week, US President Donald Trump has once again reiterated that he cannot guarantee a peaceful transfer of power in the event that he loses the November election. Trump says that because the voting process is flawed, he can’t say that he’ll respect the final decision – though various investigations have debunked Trump’s claims that mail-in ballots are riddled with fraud and will be used by his opponents to rig the final result.

Trump’s stance, in itself, is not overly surprising, but the concern now is that he could use his massive social media presence to mobilize his passionate supporter base in order to fight back against this perceived fraud if he disagrees with the result.

Indeed, this week, Trump’s son Don Jr. has been calling on Trump supporters to mobilize an ‘army’ to protect the election, which many see as a call to arms, and potential violence, designed to intimidate voters.

Trump army post

Note where this has been posted – while President Trump has a massive social media following across all the major platforms, Facebook is where he has seen the most success in connecting with his supporters and igniting their passions, by focusing on key pain points and divisive topics in order to reinforce support for the Republican agenda among voter groups.

Why does that seemingly resonate more on Facebook than other platforms? 

Because Facebook prioritizes engagement above all else – posts that generate a lot of comments and discussion get more traction, and thereby more distribution via Facebook’s algorithm. Facebook also offers complex ad targeting options, which have enabled the Trump campaign to home in on the specific pain points and concerns of each group.

Facebook Trump ad

By using custom audiences, the Trump campaign is able to press on the key issues of concern to each specific audience subset, more effectively than it can on other platforms, which then exacerbates specific fears and prompts support for the Trump agenda.

How you view that approach comes down to your individual perspective, but the net result is that Facebook essentially facilitates more division and angst by amplifying and reinforcing divisive content through its News Feed distribution. That’s because it focuses on engagement, and on keeping users on Facebook – and the way to do that, evidently, is by highlighting debate and sparking discussion, no matter how healthy or unhealthy the subsequent interaction may be.

It’s clearly proven to be an effective approach for Facebook over time, and now also the Trump campaign. But it could also, as noted by Kendall, lead to something far worse as a result.

Civil Unrest

But it’s not just in the US that this has happened, and the Trump campaign is not the first to utilize Facebook’s systems in this way. 

For example, in Myanmar back in 2014, a post circulated on Facebook which falsely accused a Mandalay business owner of raping a female employee. That post led to the gathering of a mob, and eventually to civil unrest. The original accusation in this instance was incorrect, but Facebook’s vast distribution in the region enabled it to spread quickly, beyond the control of authorities.

In regions like Myanmar, which are still catching up with the western world in technical capacity, Facebook has become a key connector, an essential tool for distributing information and keeping people up to date. But the capability for anyone to have their say, on anything, can lead to negative impacts – with news and information coming from unofficial, unverified sources, messages can be misconstrued, misunderstood, and untrue claims are able to gain massive traction, without proper checks and balances in place.   

We’ve seen similar impacts in the growth of pseudoscience and conspiracy theories in western regions – the rise of the anti-vax movement, for example, is largely attributed to Facebook distribution.

Anti vax searches chart

As you can see in this chart, based on Google Trends data, searches for ‘anti-vax’ have gained significant momentum over the last decade, and while some of that is attributable to the terms used (people may not have always referred to ‘anti-vax’), it is clear that this counter-science movement has grown in line with the rise of The Social Network.
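For those who want to check the trend themselves, here’s a minimal sketch of how a chart like the one above could be reproduced, using the unofficial pytrends package to query Google Trends. The keyword and date range are illustrative assumptions, not figures taken from this article.

```python
# Minimal sketch: pull Google Trends interest for 'anti-vax' over roughly
# the period discussed above, via the unofficial pytrends package.
# pip install pytrends pandas
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)

# Illustrative keyword and timeframe (assumptions, not from the article)
pytrends.build_payload(["anti-vax"], timeframe="2010-01-01 2020-09-01")

interest = pytrends.interest_over_time()  # pandas DataFrame indexed by date
print(interest["anti-vax"].tail())        # relative search interest, 0-100
```

Plotting that DataFrame over time should show the same general upward curve the chart describes.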

Is that coincidence, or could it be that, by allowing everyone to have such huge potential reach with their comments, Facebook has effectively amplified the anti-vax movement, and others like it, because such topics spark conversation and prompt debate?

That, essentially, is what Facebook’s News Feed is built upon: maximizing the distribution of on-platform discussions that trigger engagement.

As further explained by Kendall:  

“We initially used engagement as sort of a proxy for user benefit, but we also started to realize that engagement could also mean [users] were sufficiently sucked in that they couldn’t work in their own best long-term interest to get off the platform. We started to see real-life consequences, but they weren’t given much weight. Engagement always won, it always trumped.”

Again, Facebook’s race to maximize engagement may indeed have led to things being broken, but various reports from insiders suggest Facebook didn’t consider those broader consequences.

And why would it? Facebook was succeeding, making money, building a massive empire. And it also, seemingly, gives people what they want. Some would argue that this is the right approach – adults should be free to decide what they read, what they engage with, and if that happens to be news and information that runs counter to the ‘official narrative’, then so be it.

Which is fine, so long as there are no major consequences. Like, say, the need to be vaccinated to stop the spread of a global pandemic.

Real World Consequence

This is where things get even more complex, and Facebook’s influence requires further scrutiny.

As per The New York Times:

“A poll in May by The Associated Press-NORC Center for Public Affairs Research found that only about half of Americans said they would be willing to get a coronavirus vaccine. One in five said they would refuse and 31 percent were uncertain.”

US medical leader Dr Anthony Fauci has also highlighted the same concern, noting that “general anti-science, anti-authority, anti-vaccine feeling” is likely to thwart vaccination efforts in the nation.

Of course, the anti-vax movement can’t purely be linked back to Facebook, but again, the evidence suggests that the platform has played a key role in amplifying it in favor of engagement. That could see some regions take far longer than necessary to recover from the COVID-19 pandemic, so while the debate itself may seem relatively limited – and Facebook allowed anti-vax content on its platform until last year, when it took steps to remove it – the actual consequences can be significant. And this is just one example.

The QAnon conspiracy theory was also allowed to gain traction on The Social Network before Facebook took steps to remove it last month; the violent ‘boogaloo’ movement saw mass engagement on the platform until Facebook announced new rules against it back in June; and climate change misinformation has been allowed to circulate on the platform under the guise of opinion. In each case, Facebook had been warned for years of the potential for harm, but the company failed to act until there was significant pressure from outside groups, which forced its response.

Is that because Facebook didn’t consider these as significant threats, or because it prioritized engagement? It’s impossible to say, but clearly, by allowing such to continue, Facebook benefits from the related discussion and interaction on its platform.

History shows that Facebook is far too reactive in these cases, responding only after the damage is done, with apologies and pledges to improve.

Again, as noted by Kendall:

“There’s no incentive to stop [toxic content] and there’s incredible incentive to keep going and get better. I just don’t believe that’s going to change unless there are financial, civil, or criminal penalties associated with the harm that they create. Without enforcement, they’re just going to continue to be embarrassed by the mistakes, and they’ll talk about empty platitudes… but I don’t believe anything systemic will change… the incentives to keep the status quo are just too lucrative at the moment.”     

This is where the true conflict of open distribution platforms arises. Yes, it can be beneficial to give everyone a chance to have their say, to share their voice with the world. But where do you draw the line on such?

Facebook CEO Mark Zuckerberg would clearly prefer that Facebook not intervene:

“People having the power to express themselves at scale is a new kind of force in the world — a Fifth Estate alongside the other power structures of society. People no longer have to rely on traditional gatekeepers in politics or media to make their voices heard, and that has important consequences. I understand the concerns about how tech platforms have centralized power, but I actually believe the much bigger story is how much these platforms have decentralized power by putting it directly into people’s hands. It’s part of this amazing expansion of voice through law, culture and technology.”

And that may reveal the biggest flaw in Facebook’s approach: the company leans so far towards optimism that it seemingly ignores the potential damage its platform can also cause. Zuckerberg would prefer to believe that people are fundamentally good, and that we, as a society, can come together, through combined voice, to talk things out and come to the best conclusion.

The available evidence suggests that’s not what happens. The loudest voices win, the most divisive get the most attention. And Facebook benefits by amplifying argument and disagreement.  

This is a key concern of the modern age, and while many still dismiss the suggestion that a simple social media app, where people Like each other’s holiday snaps and keep tabs on their ex-classmates, can have serious impacts on the future of society, the case, when laid out, is fairly plain to see.

Investigations into such are now taking on a more serious tone, and the 2020 Election will be a key inflection point. After that, we may well see a new shift in Facebook’s approach – but the question is, will that, once again, prove too late?



September 24, 2020

Twitter Shares Insights into the Effectiveness of its New Prompts to Get Users to Read Content Before Retweeting


Back in June, Twitter added a new pop-up alert on articles that users attempt to retweet without actually opening the article link and reading the post.

Twitter read prompt

After a full three months of implementation, Twitter has today shared some new insight into the effectiveness of the prompt, and how it has changed the behavior of users who are shown the alert.

According to Twitter:

  • People open articles 40% more often after seeing the prompt
  • People opening articles before retweeting increased by 33%
  • Some people didn’t end up retweeting after opening the article – “which is fine – some Tweets are best left in drafts”

Those are some pretty impressive numbers, underlining the value of simple prompts like this in getting users to think twice about what it is they’re distributing through their social media activity.

Adding any level of share friction seems to have some effect. Back in 2016, Facebook added similar pop-ups on posts which had been disputed by third-party fact checkers, prompting users to re-think their intention before they hit ‘Share’.

Facebook fact-check prompt

Analysis conducted by MIT researchers found that these labels reduce people’s propensity to share misinformation by around 13%, while Facebook has since also added new prompts when users attempt to share a link that’s more than 90 days old, reducing the spread of outdated content.

It seems that simple pushes like this can actually have a big impact. And while free speech advocates have criticized such labels as being overly intrusive, if the net effect is less blind sharing, and more reading and research into topics, then that’s surely a good thing that can only benefit online discourse.

Given the success of the new prompts, Twitter’s now working to bring them to all users globally (currently only available on Android), while it’s also looking to make the alerts smaller after their initial display to each user.

And clearly, the impacts could be significant. While the above figures may not hold in a broader launch of the option, the numbers do show that the prompts are at least somewhat effective, and can help in reducing ill-informed sharing, and the distribution of misinformation.



September 23, 2020

Facebook Shares New Insights into Gen Z, and How COVID-19 has Changed Their Outlook [Infographic]


The COVID-19 pandemic has had wide-reaching impacts, and while older people are statistically the most susceptible to the virus itself, younger people are also seeing significant shifts in their educational and career trajectories, which could have long-lasting effects.

As noted by Facebook:

“For Gen Zers, the COVID-19 pandemic has struck at a particularly formative time, disrupting educational journeys, career opportunities and more. Yet many Gen Zers seem to be weathering it in a way that leaves them transformed, stronger – energized around who they are, what they stand for and what drives them.”

Indeed, the societal changes happening as a result of COVID-19 seem to have reinforced many traits already evident in younger demographic cohorts, including increased awareness of social causes and their impacts on various aspects of society. And that could alter the way marketers look to connect with these audiences, in order to maximize brand messaging and better align with these significant shifts.

To provide more insight on this, Facebook recently analyzed survey responses from over 15,000 people to get a better understanding of how Gen Z is responding to the pandemic, and what that means for advertising and outreach. You can read Facebook’s full “Meet the Future: Gen Z’s Regeneration” report here, but the key data points are summarized in the graphic below.

Some valuable insight for your planning.  

Facebook Gen Z research
