Blacklisted swearwords in your article will block you from being indexed in Google. They will also prevent you from reaching users on LinkedIn.
I wrote an article to show the impact of blacklisted terms on Google and social media.
For an article on SEO jokes, I created an initial version that contained a lot of blacklisted words to prove that Google wouldn't index it. After 16 hours of the profanity-laden article being live and posted on LinkedIn and Twitter, Google still had not indexed it.
On top of that, the article did horribly on LinkedIn.
I removed the blacklisted terms and reposted a new identical post on LinkedIn.
Forty-eight minutes later, the article was indexed on Google, and it reached far more users than its foul-mouthed counterpart, proving that both LinkedIn and Google assess the quality of your content before showing it to their users.
This is why you don’t block Twitter and LinkedIn in robots.txt.
Anytime you post a link on these platforms, their bots send a request to view your page.
Their documentation says this is meant to fetch the metadata, including the thumbnail for your post. However, this experiment suggests that they also use it to assess the quality of your content.
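As a concrete illustration, a robots.txt that keeps these crawlers unblocked might look like the sketch below. Twitterbot and LinkedInBot are the user-agent tokens the two platforms document for their link-preview crawlers; the exact rules here are an assumed example, not a copy of any real site's file.

```
# Explicitly allow the social link-preview crawlers.
User-agent: Twitterbot
Allow: /

User-agent: LinkedInBot
Allow: /

# Default rule for everyone else.
User-agent: *
Allow: /
```

Replacing those `Allow: /` lines with `Disallow: /` would stop the bots from fetching your page at all, so you would lose the thumbnail and metadata in the preview, and, if the experiment above is right, the quality check as well.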
After 23 hours, this Twitter post had 277 views.
I just deleted and reposted it at 3:57 PM on March 26. Let's see.
After 20 minutes, the new post already had more views than the previous one.
Twitter also seems to check the content before pushing it.
SEO Strategist at Tripadvisor, ex-Seek (Melbourne, Australia). Specialized in technical SEO. Writes about Python, information retrieval, SEO, and machine learning. Guest author at SearchEngineJournal, SearchEngineLand, and OnCrawl.