Elon Musk Provides Update On Who Will Keep Blue Tick—Ahead Of Twitter’s Looming Verification Rollback
Twitter owner Elon Musk on Friday gave additional details on which users would keep their blue checkmarks once the platform begins revoking legacy verified badges in April, a much-maligned move purportedly designed to increase trust in the platform.
There is no silver bullet in social media marketing. But there are some patterns in your audience’s behavior that could help you get higher engagement on Facebook.
Let’s take a look at how to find your best times to post on Facebook to make sure you’re posting when they’re most active on this platform, ready to engage.
Find out YOUR best times to post on Facebook
Get in-depth social media analytics for your own profiles & track your competitors, too! Find out what content performs best and when. Try NapoleonCat for free:
Try NapoleonCat free for 14 days. No credit card required.
Is there a best time to post on Facebook?
It’s the same question as with any best time to post on social media: there’s always a best time and worst time, but it might not be the same for every brand out there, and it’s probably going to change over time.
And that’s because audience habits change, platform algorithms change, and even your brand and your content change. (And let’s not forget, the timing is only a part of the puzzle, with the type of content you post, the topics, or the format and relevance to your audience interests playing a huge role, too.)
All of this will affect your engagement rates at some point, and the engagement you get will, in turn, affect the organic reach of your content.
So, to answer this question for Facebook (or any other social media platform), you’ll need to dig into your analytics and see at what times your audience responds to the different types of content you put out there on your Facebook Page.
But this doesn’t mean there are no benchmarks and universal best times to post on Facebook that seem to work for most brands in one way or another. Actually, there seems to be an abundance of research studies as far as Facebook post timings go. So let’s take a look at those first.
Overall best time to post on Facebook
There is some research that will help you at least establish some benchmarks that you can later start testing yourself. According to this article on the Influencer Marketing Hub, these are the best times to post on Facebook:
Monday: 9:00 AM, 11:00 AM, 3:00 PM
Tuesday: 9:00 AM, 11:00 AM, 3:00 PM
Wednesday: 9:00 AM, 10:00 AM, 1:00 PM, 4:00 PM
Thursday: 9:00 AM, 11:00 AM, 4:00 PM
Friday: 11:00 AM, 4:00 PM, 6:00 PM
But it really depends on multiple factors, like:
audience location and time zones
sector (B2B vs. B2C)
For example, a study by CoSchedule, cited in the above-mentioned article, shows the following differences in the best times to post on Facebook across industries:
Their research found that:
The best times to post on Facebook are 7:00 PM, 7:00 AM, and 3:15 PM.
The best days to post on Facebook are Friday, Wednesday, and Monday.
The worst days to post on Facebook are Saturday, Sunday, and Thursday.
They also identified the worst times to post on Facebook, which you can find in the image below, courtesy of CoSchedule.
There has been more research into this over the years, but the main answer to the “When is the best time to post on Facebook” question stays roughly the same: it depends.
So, shall we look at how to answer it more specifically for your Facebook Pages?
How to find the best time to post on Facebook
The best (and really only) way to figure out the best time to post on Facebook for your brand and your audience is to look into your social media analytics tool. This could be the native Meta Business Suite, or it could be a tool like NapoleonCat, where you can track the statistics for your Facebook Pages along with other social media platforms like Twitter, Instagram, and LinkedIn.
So, let’s take a look at how to do that.
When you go into the analytics section in NapoleonCat, go to the Content tab, and look for these two graphs:
Post engagement by days
Post engagement by hours
Based on your posts’ average engagement rates, these will show you the days of the week and the hours of the day when your Facebook content performs the best.
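If you ever want to sanity-check this kind of breakdown yourself from exported post data, the underlying calculation is just a group-by-weekday average of engagement rates. Here’s a minimal sketch in plain Python with made-up post data (the field names are hypothetical, not NapoleonCat’s actual export format):

```python
from collections import defaultdict
from datetime import datetime
from statistics import mean

# Hypothetical exported post data: publish time and engagement rate (%)
posts = [
    {"published": "2023-03-06 09:00", "engagement_rate": 4.2},  # a Monday
    {"published": "2023-03-08 15:00", "engagement_rate": 6.8},  # a Wednesday
    {"published": "2023-03-13 09:00", "engagement_rate": 5.0},  # a Monday
    {"published": "2023-03-15 11:00", "engagement_rate": 7.4},  # a Wednesday
]

# Group engagement rates by weekday name
by_weekday = defaultdict(list)
for post in posts:
    dt = datetime.strptime(post["published"], "%Y-%m-%d %H:%M")
    by_weekday[dt.strftime("%A")].append(post["engagement_rate"])

# Average per weekday, best day first
averages = {day: mean(rates) for day, rates in by_weekday.items()}
for day, avg in sorted(averages.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{day}: {avg:.1f}%")
```

The same grouping by `dt.hour` instead of weekday gives you the “post engagement by hours” view.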
Make sure you check those periodically – as they’ll probably change over time. It’s also a good idea to set up periodic reports (for example, weekly or monthly) to keep up to date with these and other statistics on your Facebook Pages.
You can set up a report like this in NapoleonCat, too – so you don’t have to keep all of the numbers in a shared spreadsheet somewhere in the cloud.
How to find the best times to post on Facebook based on your competition
Besides looking at the numbers for your own Facebook profiles, you could also follow your closest competitors and see what their results are in terms of the best day and time to post.
This is also something you can do with NapoleonCat’s analytics – follow any Facebook Page you like via the platform, and you can see all the publicly available statistics just like for your own Facebook Pages, along with their post engagement by day of the week and hour of the day.
This could give you a much more complete picture of how and when people respond to posts or, indeed, what time of day is best to post on Facebook. After all, your competitors’ audience (or at least some of it) is also your target audience. So you’re not too far off analyzing your competitors and what they’re doing on Facebook.
So, you know the best time to post on Facebook. Now what?
Testing is key. So play around with the posting times and see if your overall engagement changes as you change your posting schedule.
If you’re using NapoleonCat for your analytics, you can also use it as your social media publishing tool. This way, you don’t have to log in and out of different platforms every time you schedule a post on Facebook and then want to see how it’s doing in terms of performance. It’s all there inside the same social media management platform, which makes things so much easier.
Schedule all your Facebook posts and Reels alongside content for your other social media posts for channels like Instagram, Twitter, and LinkedIn. (And even your Google My Business account.)
Add videos and up to 30 images in one post.
Target your Facebook posts by age, country, region, or city.
All in one visual calendar, where it’s super easy to see what posts are scheduled for different days and different social media profiles.
Plus, you get a whole bunch of team-oriented features such as:
Assigning tasks for the different posts to your team members
Exchanging feedback under each post or draft
Creating content approval workflows
Working with external creators and contributors without giving them direct access to your Facebook Pages.
You can also really easily reschedule the different content pieces inside the calendar when you want to try out new posting times.
Now, your turn to find out your best times to post on Facebook
If you don’t know where to start, you can try the generally popular times mentioned in the research and then drill down into your audience’s reaction and tweak your schedule based on your own analytics. Using social media scheduling tools like NapoleonCat will make it much easier to play around with your content schedule, with easy access to analytics within the same platform, letting you save tons of time and clicks.
Get ready for the next phase of Twitter 2.0’s subscription revenue push, with the platform announcing today that ‘legacy’ blue checkmarks will begin being revoked as of next week.
On April 1st, we will begin winding down our legacy verified program and removing legacy verified checkmarks. To keep your blue checkmark on Twitter, individuals can sign up for Twitter Blue here: https://t.co/gzpCcwOpLp
As per the above tweet, Twitter’s hoping to boost Twitter Blue and Verification for Business subscribers by prompting them to start paying for their blue tick instead.
Twitter’s also alerting blue tick account holders with this in-stream notification.
That could see some legacy verified accounts paying up, bringing in a few more Twitter Blue subscribers – though the number that will move to Verification for Business, which costs $1,000 per month, will be far smaller.
In order to generate 50% of Twitter’s total income, Twitter needs around 24 million users to sign up to the program. So while Twitter Blue is set to bring in more money for Elon and Co. (around $11 million per quarter to be exact), it’s nowhere close to being half of the platform’s intake, which, based on its last revenue report, would be around $590 million every three months.
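The arithmetic behind those figures is easy to check. Here’s a quick back-of-envelope sketch using the article’s own numbers (the implied current subscriber count at the end is my own division, not a figure from the revenue report):

```python
# Back-of-envelope check of the figures above (inputs are the article's estimates)
price_per_month = 8              # Twitter Blue subscription, USD
subscribers_needed = 24_000_000  # subscribers required for ~50% of revenue

quarterly_revenue = subscribers_needed * price_per_month * 3
print(f"${quarterly_revenue / 1e6:.0f}M per quarter")  # close to the ~$590M target

current_quarterly = 11_000_000   # article's estimate of current Blue revenue per quarter
implied_subscribers = current_quarterly / (price_per_month * 3)
print(f"~{implied_subscribers:,.0f} current subscribers implied")
```

So 24 million subscribers at $8 per month works out to roughly $576 million per quarter – in the ballpark of the ~$590 million half-of-revenue target, which is where the article’s figure comes from.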
It also dilutes the value of the very thing Twitter is aiming to sell. The problem with selling blue checkmarks, both on Twitter and Facebook, is that you’re charging users for the exclusivity and perceived reputational value of having a blue tick – but as soon as anyone can buy it, it’s no longer valuable in this respect.
And as more people sign up, it becomes even less valuable over time, and once Twitter removes the legacy blue ticks, that will mean that the only checkmarks left are those that are attached to accounts that are paying for it, which will make it completely worthless in this respect. At that stage, the blue check is only going to show others that you have enough money to afford it, and that you want to support Elon Musk’s mission to change how Twitter works.
Maybe that has some value in itself, and there are some aspects of Twitter Blue that some users will pay for. Though even then, Twitter’s experimenting with a new option that would enable subscribers to hide their blue tick, if they choose – because even Twitter is moving to acknowledge that it’s not the indicator of reputation or exclusivity that it once was.
And it’ll become less so from next week – while it’s also worth noting that even if every legacy checkmark holder were to sign on to pay $8 per month and keep their blue tick, that would still only be another 420k extra subscribers, max.
And I suspect many won’t. I suspect, too, that removing the legacy checkmarks will have a negative impact, in that it will see some of those users tweet even less, because they won’t feel as aligned with a platform that has taken away that marker from their account.
This is why selling verification ticks is a flawed strategy, because its growth and expansion dilutes its own value, and undermines the concept of what it is. Sure, Meta’s trying the same thing, but even Meta staff raised this same concern (as did Twitter staff), and Meta at least offers a truly valuable aspect, in providing additional, in-person support for paying subscribers.
But even then, Meta’s approach is also flawed, because you can’t sell reputation, you can’t charge for authority or recognition.
Some will think that’s what they’re getting, but eventually, when they’re the only ones left, I think you’ll find that it’ll be much easier to dismiss blue checkmark accounts in-stream.
It’s a confused approach, which won’t become a significant revenue driver – at least not without some significant additions that are worth paying for. But Twitter’s pushing ahead either way.
Prepare to pay up, or lose your blue tick, from next week.
Twitter’s Blue Checkmarks To Stop Being Useful ID Verification Tool In April
Starting April 1, Twitter will remove any blue checkmarks from accounts that were “verified” before Elon Musk’s takeover of the company, according to an announcement on Thursday. Anyone who currently has the blue checkmark and wants to keep it will need to sign up for Twitter Blue, a service that doesn’t actually do much beyond giving you a checkmark.
“On April 1st, we will begin winding down our legacy verified program and removing legacy verified checkmarks. To keep your blue checkmark on Twitter, individuals can sign up for Twitter Blue here,” Twitter said in a tweet.
The social media company directed organizations that want to buy a checkmark to yet another page – which speaks to just how confusing Twitter’s “verification” program has become under Musk. There are now many different types of checkmarks in many colors: a gold badge for some businesses, a gray badge for governments. It’s all a confusing mess, to say the least.
In 2009, baseball legend Tony La Russa filed a suit against Twitter over accounts impersonating him. Twitter’s original verification program grew out of concerns like these: a blue checkmark meant an account was confirmed to be owned by the notable person it claimed to be.
Some right-wing media personalities came to view the blue checkmark as a status symbol, which led to calls for the ability to purchase one after Musk acquired the platform in October 2022. Musk’s rollout of the new Twitter Blue, which charges $8 per month (or $11 per month on Apple devices), was a disaster, with several impersonator accounts pretending to be Apple, Tesla, and Nintendo while posting highly offensive material.
One person even pretended to be drugmaker Eli Lilly and announced insulin was going to be free, which sent the company’s stock plummeting. Twitter then stopped letting subscribers modify their display names, which created its own problems. And Twitter doesn’t actually verify that anyone paying for Twitter Blue is who they say they are.
The checkmark’s actual purpose – verifying an account’s identity – will be completely dead in just a couple of weeks. Every time you see the blue checkmark, you can be sure of one thing: that individual, whoever they are, at least has $8.
Despite concerns over its radically high pricing, Twitter is pushing ahead with the rollout of its ‘Verification for Organizations’ offering, which enables brands to purchase a gold checkmark for their main account, and verify their employee profiles as affiliates.
Twitter first put out the call for selected businesses to sign up to the program back in January, as part of its broader revamp of verification, which aims to both democratize access to checkmarks in the app, while also establishing a new revenue stream for the business.
If they’re willing to pay up, that is. Twitter’s currently looking to charge businesses $1,000 per month for the option, which seems like a high price to pay for a different colored tick – and really, not much else. Here’s what the package includes:
An affiliate badge, a smaller version of your brand profile image that’s added to approved accounts in the app
Affiliates display on the main brand page, which shows all the accounts linked to the main brand profile
Twitter Blue access for all brand and affiliated accounts
So you do get access to all the Twitter Blue features, for your main account and any profiles that you approve as affiliates. But you also have to pay for each affiliate you register – if you want to approve your staff and get them both an affiliate marker and a blue tick, you’ll have to pay $50 per month for each profile you add.
That seems like a lot – especially considering you can just pay $8 per month to sign your brand profile up to Twitter Blue and get a regular blue checkmark in the app. Maybe Twitter will eventually look to cut off Twitter Blue access for brand entities, but right now, you’re really paying an extra $992 per month for a different colored tick.
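To make the cost gap concrete, here’s a small sketch of the monthly bill at the prices quoted above (a hypothetical helper for illustration, not an official Twitter calculator):

```python
# Prices as quoted in the article (USD, per month)
blue_price = 8         # regular Twitter Blue
org_price = 1000       # Verification for Organizations base fee
affiliate_price = 50   # per affiliated account

def org_monthly_cost(affiliates: int) -> int:
    """Total monthly cost for a brand account plus its affiliated staff profiles."""
    return org_price + affiliates * affiliate_price

print(org_monthly_cost(0))               # gold tick alone: 1000
print(org_monthly_cost(10))              # gold tick + 10 affiliate badges: 1500
print(org_monthly_cost(0) - blue_price)  # premium over a plain blue tick: 992
```

A brand verifying itself plus ten employees would be paying $1,500 a month – which is the kind of math each business will have to weigh.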
Is that worth it?
I guess Twitter’s hoping it can reach a critical mass of brands that sign up for a gold checkmark, which would make it the new gold standard in brand recognition and, in turn, raise questions about the legitimacy of brand accounts that don’t have the gold tick. That could push more brands to sign up for the program, in order to ensure they’re seen as the official brand entity in the app.
I’m not sure that’s going to work, but that seems to be the principle that Twitter’s going with, effectively using the value of exclusivity that was once afforded to the regular blue checkmark to make the new gold tick more desirable, thus boosting interest.
But it’s a lot. $1,000 a month is likely beyond the reach of most SMBs, and it’ll be hard for any brand to justify the expense, for so little in return.
Some reports have also suggested that Twitter’s giving away the gold checkmark to approved ad partners, as another means to make it a bigger thing, and that could be another effort to further incentivize take-up, by using competitive sensibilities to prompt other brands to want one as well.
Again, I don’t know that it’s the right approach, but Twitter’s, at the least, going to kick the tires on the option, at its current price point.
And it’s coming to more regions – Verification for Organizations is now available in the US, Canada, Australia, New Zealand, Japan, the UK, Saudi Arabia, France, Germany, Italy, Portugal, Spain, India, Indonesia, and Brazil.
With a heap of advertisers still not coming back to Twitter, Elon and Co. definitely need the extra money – but do you need the ‘benefits’ that this program provides?
Everyone is talking and tweeting about AI. You’ve probably heard it all at this point: AI can help you write better. AI can help you tweet faster. AI is coming for your job (let’s just put that theory to rest — AI is a tool, not a replacement).
But can AI help your social captions perform better?
As a busy social media marketer, anything that promises to make the content creation process easier and faster is appealing. But we wanted to find out if using AI actually impacts engagement and reach, specifically on Twitter.
Do human-written captions have better engagement and reach than AI-written ones? Does the Twitter algorithm penalize tweets written by AI? I ran an experiment to find out.
Coming soon 👀
Big news! An AI caption writer for Instagram, Facebook, Twitter, and LinkedIn powered by ChatGPT is coming to Hootsuite. Join our mailing list and be one of the first to know more.
Hypothesis: Human-written Twitter captions will get more engagement and reach than AI-written captions
Our educated guess is that human-written Twitter captions will get more engagement and reach than ones written by AI.
Twitter is a conversational platform. People love tweets with a clear voice, and Twitter users love engaging with other humans. We suspect people won’t be drawn to a tweet that doesn’t sound like it came from a human.
But would people really know if a tweet was written by AI or not? And if so, how would that impact reach and engagement?
Let’s find out.
To test whether human-written Twitter captions generate more engagement and reach than AI-written ones, I shared three sets of tweets on my personal Twitter account. Each set of tweets used the same format so they could be fairly compared.
For this experiment, I posted two tweets that were text only, two tweets that included a link, and two tweets with a visual.
To compose the AI-generated tweets, I used Lately, a tool that repurposes long-form content into bite-sized social captions.
(Bonus: You can connect the Lately app with Hootsuite to finalize, schedule, and auto-publish your posts right inside your Hootsuite dashboard)
Lately will then analyze the content and generate captions for you to review and edit. After you edit, you can send them directly to Hootsuite to be scheduled.
Some things to remember about Lately and any other AI-powered content creation tools: First, you have to train it. For AI tools to generate content that sounds the most like you or your brand, they have to understand your voice, content, and audience. This is why Lately asks you to submit a long-form piece before it can generate captions.
Another thing to note about AI-generated content is that it still needs to be edited by a human.
After Lately comes up with tweets based on the content you submitted, you can edit them for voice, clarity, and context. Lately then uses these insights to create better content the next time. The more you use it, the more it’ll start to sound like something that your brand would actually post.
Once I had my human-written captions created, I used Hootsuite’s Lately integration to generate similar tweets using the same long-form pieces of content. I then drafted them in Hootsuite and used the recommended posting time feature to get them scheduled.
I checked the performance of each tweet about 24 hours after I posted (the Twitter lifecycle moves fast). Here are the results:
Text-only tweet: 14.14% (human-written) vs. 4.06% (AI-written)
Tweet with link: 8.47% (human-written) vs. 7.5% (AI-written)
Tweet with visual link: 5.71% (human-written) vs. ≈0% (AI-written)
I compiled the engagement rate for each tweet as an overview of how they performed. I’ll dig into specific metrics for each one below.
Tweet #1: Text-only tweet
Since I’m using Lately’s AI tool for this experiment, I had to focus on paraphrasing long-form pieces of content for these captions.
Here is the first tweet I wrote for this experiment. For this tweet, I paraphrased a quote I contributed to an article about freelance work routines.
And here are the results of this tweet, according to Hootsuite Analytics. It generated an engagement rate of 14.14%, 28 total engagements, and 198 impressions.
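The engagement rate here appears to be total engagements divided by impressions, expressed as a percentage – that’s my assumption about how Hootsuite Analytics computes it, but it matches the reported numbers:

```python
def engagement_rate(engagements: int, impressions: int) -> float:
    """Engagement rate as a percentage of impressions, rounded to two decimals."""
    return round(engagements / impressions * 100, 2)

# The human-written tweet above: 28 engagements on 198 impressions
print(engagement_rate(28, 198))  # 14.14
```

Running the same formula on the other tweets in this experiment reproduces their reported rates too (give or take rounding).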
As far as the style goes, this is a normal tweet for me. I like to add voice through asterisks, and I’m a sucker for using parentheses in my writing. Parenthetical statements (statements that qualify or clarify something and usually have a set of parentheses around them) are a great way to communicate extra information and provide more context (see what I did there?). Plus, sentences with parentheses just add that extra human element if you ask me.
All that to say, when I had the AI generate a similar tweet from the same article, I didn’t edit it to add my usual human touches because I wanted to see how it’d perform on its own.
Here are the results of the AI-generated caption. It generated an engagement rate of 4.06%, 5 total engagements, and 123 impressions.
For this first tweet comparison, the human-written caption performed better. Is that because of the lack of parentheses? Maybe, but probably not.
There are a few factors that could’ve influenced this tweet’s performance, so let’s try again with another type of tweet.
Tweet #2: Tweet with link
For my next comparison, I wanted to see how a tweet with a link would perform. I paraphrased an article I wrote about types of career paths for people who work in content marketing (specifically freelance vs. full-time roles).
Here is the tweet I wrote:
With all the job uncertainty many are facing lately, this article I wrote for @superpathco feels like a timely reminder.
Freelancing doesn’t have to be an all-or-nothing move! Many folks I spoke to for this piece do it on the side or in between jobs. https://t.co/EI8rePLBuS
And here are the results of this tweet. It generated an engagement rate of 8.47%, 32 total engagements, and 378 impressions.
This tweet included a link to the article and a mention of the brand I wrote it for. This tweet had better reach than my last, and I suspect this is because the brand I mentioned quote retweeted it. I checked Twitter’s native analytics for additional metrics, and this tweet got 21 detail expands and five link clicks. Interesting!
For the AI-written caption, I had Lately paraphrase the same article and generate a tweet for it. It also included the link and a mention (which I had to add), but I didn’t change anything about the text that it generated.
“It’s nice to know freelancing is there in case my career ever takes me down that path again!” It’s not uncommon for FT content marketers to take on freelance work on the side or transition between FT and freelance throughout their careers. @superpathco
Here are the final results from that tweet. It generated a 7.5% engagement rate, 6 total engagements, and 80 impressions. I also checked out Twitter’s analytics for this tweet and it received one detail expand and two link clicks.
My human-written tweet outperformed the AI-generated one for this round. While the engagement rate was about the same, the tweet I wrote received over four times as many impressions, likely because it got retweeted.
Would the AI-generated caption have performed better if it had also been retweeted? Maybe. Would it have received more link clicks if it had more reach? Perhaps. But you could also argue that it wasn’t retweeted or clicked on as much because the caption itself wasn’t as engaging.
Let’s do one more test.
Tweet #3: Tweet with visual link
For my final tweet comparison, I wanted to include something visual to see how that made an impact on engagement and reach. I used two articles I’ve written for Hootsuite for comparison. (Again, since I’m using Lately for this experiment, the tweets had to be based on a long-form piece of content).
Each of them has a similar thumbnail image and would appear similarly in the feed. Plus, both articles cover a similar topic — social accessibility — so the content could be fairly compared.
Here’s the first tweet I shared with a caption that I wrote.
If you’ve worked in social media in the last few years, you know that digital accessibility and inclusivity are top priorities.
This tweet generated a 5.71% engagement rate, 8 total engagements, and 140 impressions. It got a couple of likes and one comment. According to Twitter’s native analytics, this tweet also got one link click and five detail expands.
I had Lately generate a tweet for another article I wrote on a similar topic about TikTok auto captions. I should note that the AI paraphrased the article for the first part of the tweet, but I had to add the second line for some context and to include the mention as my first tweet did.
Want more people to consume your TikTok videos? Make the viewing experience as accessible and enjoyable as possible by adding auto captions.
Overall, this tweet didn’t perform great. It hardly had any reach or engagements — no likes, comments, retweets, or link clicks. But it did get a couple of detail expands.
Once again, the human-written tweet outperformed the AI-generated text.
What do the results mean?
Ultimately, all of my human-written tweets performed better than the AI-written captions. I admittedly don’t have a large following on my personal Twitter account, so none of these metrics are too exciting. But I do think they demonstrate (on a very small scale) that more human-sounding tweets resonate better on the platform.
Here are some of my takeaways from this experiment:
AI-written captions can be a time saver
I’ll be honest: I may be a writer, but it takes me an embarrassing amount of time to write a tweet — especially one that paraphrases a long article. I was amazed at how quickly the AI tool was able to come up with captions, even if they weren’t publish-ready.
Whether writing comes naturally to you or not, coming up with creative captions every week takes time and energy away from your other tasks, like strategizing or engaging with followers. So when it’s your job to write dozens of social captions on any given day or week, using an AI tool can save you a lot of time.
A tool like Lately is especially helpful if you use Twitter to share long-form content like blog posts. This tool can create dozens of captions from one piece, which means you can have a month’s worth of Twitter captions created in a matter of minutes.
AI-written captions still need a human touch
Since Lately generated dozens of captions for each article I gave it, there were a ton of captions that didn’t make the cut. Some of them were decent, but others were just plain confusing and would’ve needed a lot of editing.
If you use a tool to generate captions, you’ll still need to review and edit them, at least in the beginning — especially to add voice.
AI tools usually don’t understand the full context of a tweet and don’t know your brand voice right off the bat. Most AI tools need to be trained to better understand what you want them to do and to get your brand’s tone right. It can take time for these tools to catch on, but the more you use them and tweak what they write, the more they will adapt.
As far as whether or not AI-written captions directly impact reach? The jury’s still out. In my experiment, all of my human-written captions had better reach than the AI-written ones, but that could’ve been because the content itself resonated more. It’s hard to say if the platform knew the AI-written tweets were, in fact, written by AI. But if I had to guess, I’d say that Twitter doesn’t penalize you for using AI to write your captions.
Ultimately, if you’re going to use AI to write Twitter captions, think of them more as a starting point. You’ll have to use your social expertise to make them engaging and publish-ready, but you can knock them out much faster than if you were starting from scratch.
Save time by using Hootsuite to manage your Twitter presence alongside your other social channels. You can run contests, share videos, schedule posts, and monitor your efforts — all from one convenient dashboard! Try it for free today.