Case Study: Reputation Management for a Manufacturing Firm
Category: Case Study
Vertical: Packaged Goods
Est. Read Time: 7 min
How we did it.
How Reputation X improved the online brand of a global manufacturing firm
A multi-billion dollar company. Competitors masquerading as activists. Articles generated to negatively manipulate online sentiment. It was a perfect storm of Google manipulation we helped turn around.
Overview of the issue
Our client is a global brand. Well-funded international detractors had effectively "taken over" our client's Wikipedia page, caused (and possibly paid for) negative articles to be written and promoted, and did everything possible to turn public opinion against them.
- Base measurement (beginning): 60% negative, 20% positive, 20% neutral. Negatives were above the fold.
- Post-campaign measurement: 10% negative, 60% positive, 30% neutral. Remaining negative content ranked in position ten or off page one.
- Duration: 12 months
- Improvement began after 30 days of strategy and content approval
Our objective was four-fold:
- Improve online sentiment for branded searches
- Balance the client's Wikipedia page
- Increase the amount of control our client had over branded search results
- Push down negative content hurting our client's brand
To solve the problem, we used our client's own resources far more efficiently, encouraged like-minded entities to support them, and executed a plan to clean up and defend their branded search results.
Details of the reputation campaign
We worked with the following parties:
- Director of Marketing (Client)
- Public Relations Firm (Client's)
- SEO Team (Client's internal team)
- Wikipedia editors
- Our own team
Before beginning our client's online reputation management campaign, we executed a research program to understand the sentiment landscape. This involved measuring sentiment, measuring client control of branded search results, and much more. The steps involved are listed below:
Measure baseline sentiment and control
Reputation X measured the sentiment (positive, neutral, negative) of the following types of search results:
- Google searches in four countries
- Primary and secondary branded search phrases
- Desktop and mobile
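The baseline measurement above can be sketched as a simple tally over labeled page-one results. This is an illustrative Python sketch, not Reputation X's actual tooling; the (position, label) data structure and the example snapshot are assumptions made for the example.

```python
from collections import Counter

def sentiment_breakdown(results):
    """Summarize labeled SERP results into percentage shares.

    `results` is a list of (position, label) pairs, where label is
    "positive", "neutral", or "negative" -- a hypothetical structure;
    real labels would come from human review or a sentiment model.
    """
    counts = Counter(label for _, label in results)
    total = len(results)
    shares = {k: round(100 * counts[k] / total)
              for k in ("negative", "positive", "neutral")}
    # Flag whether any negative result sits "above the fold" (top ~4 positions).
    above_fold_negative = any(pos <= 4 and label == "negative"
                              for pos, label in results)
    return shares, above_fold_negative

# Example: a page-one snapshot resembling the baseline in this case study.
baseline = [(1, "negative"), (2, "negative"), (3, "positive"),
            (4, "negative"), (5, "neutral"), (6, "negative"),
            (7, "negative"), (8, "positive"), (9, "neutral"), (10, "negative")]
shares, hot = sentiment_breakdown(baseline)
print(shares, hot)  # 60% negative, 20% positive, 20% neutral; negatives above the fold
```

Repeating this tally per country, per search phrase, and per device gives the kind of baseline grid described above.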
Research similar companies to help us set objectives
We queried the client about similar companies, executives with similar titles, existing relationships with journalists, newsworthy items that might arise in the future, differences and similarities between our client and similar companies, and charitable work the client had been involved in. We also asked them to identify known detractors and to educate us on their possible motivations.
Perform a gap analysis to know what to develop
We compared the search and social results of our client to those of similar companies as seen from various locations around the world. Some of our findings:
- Our client had one site. Some competitors divided content among many sites to control more search results.
- Our client's website was inefficient. A few tweaks would improve search visibility.
- We compiled a list of publications that had featured competitors but not our client, for future outreach.
- Competitors had almost certainly paid Wikipedia editors to manage their Wikipedia pages.
- Databases used by search engines were revealed, enabling us to seed them with our client's information as well - improving search results.
- Similar companies used the power of not only their main website, but those of seemingly unrelated subsidiaries, to enhance the online reputation of the parent entity.
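The publication-gap portion of this analysis can be modeled as a set difference between the outlets covering competitors and those covering the client. A minimal sketch, with hypothetical domain names:

```python
def publication_gaps(client_domains, competitor_domains):
    """Return publications that cover competitors but not the client.

    Inputs are sets of domains extracted from branded search results --
    a simplified stand-in for the gap analysis described above.
    """
    return sorted(set(competitor_domains) - set(client_domains))

# Hypothetical domains for illustration only.
client = {"client.example", "industryweek.example"}
competitors = {"industryweek.example", "trade-journal.example",
               "packaging-news.example"}
print(publication_gaps(client, competitors))
# → ['packaging-news.example', 'trade-journal.example']
```

The resulting list becomes the seed for the outreach work mentioned above.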
Following digital clues to identify motivations of negative content authors
When a blogger, journalist, or researcher writes negative content, they often have an honest reason for doing so. Other times they are paid, or given other incentives, to create it.
For example, negative content may be created for:
- To serve political ends, as seen during the 2016 US elections
- To drive a share price down and profit from short selling
- To embarrass a company or its executives
For this client, we needed to discover the roots of authors' motivations to create negative content, and to identify authors who might be working at the behest of others.
Looking for patterns in digital footprints
We first examined the editing history of Wikipedia authors to spot patterns. We looked at contributors to websites to see if they were the real thing or "sock puppets" (shills). This involved reviewing their editorial histories, social media, and profiles. We also searched their screen names to find references to hidden content they may have authored. Finally, we looked at the backlink profiles of negative content (and of its authors) to find date patterns, shared IP addresses, mentions of third parties, and more.
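One of the simpler patterns described above - distinct screen names acting from the same IP address - can be sketched as a grouping pass. This is a toy model: the data shape, names, and addresses are invented, and a real investigation combines many more signals.

```python
from collections import defaultdict
from datetime import date

def shared_footprints(events):
    """Group activity by IP address to surface accounts that may be
    sock puppets (distinct screen names posting from the same IP).

    `events` is a list of (screen_name, ip, date) tuples -- a
    hypothetical shape for the edit/backlink histories examined above.
    """
    by_ip = defaultdict(set)
    for name, ip, _ in events:
        by_ip[ip].add(name)
    # An IP used by more than one screen name is worth a closer look.
    return {ip: sorted(names) for ip, names in by_ip.items() if len(names) > 1}

# Invented example data (documentation IP range).
events = [
    ("editorA", "203.0.113.7", date(2016, 3, 1)),
    ("editorB", "203.0.113.7", date(2016, 3, 1)),
    ("editorC", "198.51.100.2", date(2016, 4, 9)),
]
print(shared_footprints(events))  # {'203.0.113.7': ['editorA', 'editorB']}
```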
This type of reputation research has unearthed dark PR firms, private blog networks designed to hurt clients, troll farms, paid Wikipedia editors (and their real names), and much more. In this case, we discovered a network of activists had been unknowingly recruited by a competing corporate entity.
Find knowledgeable Wikipedia editors to come to our client's aid
Reputation X does not employ any Wikipedia editors. Instead, we often research the history of third-party editors working in the same space as our client. In this case, we identified problems with our client's Wikipedia page that included what might be called "alternative facts", outright lies, references to false planted information, and more.
We then reach out to knowledgeable editors, asking if they believe they can help correct the Wikipedia page. In many cases, Wikipedia editors agreed the page was in fact being vandalized and worked independently to correct and defend it - all without pay. We believe that sometimes all that needs to be done is to highlight a problem, and the community will work to correct it.
Design a more efficient authority flow from client-controlled sites
Our client controlled dozens of websites for itself and its subsidiaries. While the sites had existed for years, they were often built on out-of-date technologies, badly optimized, and far less supportive of the parent brand than they could have been.
We performed SEO analysis on each site, provided a checklist to our client's technical team, and set them to the task of improving each site. This included suggesting new articles on each site, adding new link references, and a social media campaign to support each improvement. We used the "five types of content" promotion plan.
After 90 days, Google began ranking many of these sites on the first page of search results - above negative content. This worked because, while each client-controlled site had good authority, none was optimized for the parent brand or for helping third-party content gain the relevance needed to perform well in search.
Tools used:
- Proprietary tools
- Screaming Frog
- Google Analytics
- Human brains (our best tool)
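As a toy illustration of the kind of checklist item involved, the sketch below flags pages missing a title tag, meta description, or canonical link, using only Python's standard library. It is a stand-in for what crawlers like Screaming Frog report at scale, not our actual audit process; the example page is invented.

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Minimal on-page SEO check: records whether a <title>, a meta
    description, and a canonical link are present in the page head."""
    def __init__(self):
        super().__init__()
        self.found = set()

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self.found.add("title")
        elif tag == "meta" and a.get("name") == "description":
            self.found.add("meta_description")
        elif tag == "link" and a.get("rel") == "canonical":
            self.found.add("canonical")

def audit(html):
    """Return the checklist items still missing from the page."""
    parser = OnPageAudit()
    parser.feed(html)
    required = {"title", "meta_description", "canonical"}
    return sorted(required - parser.found)

# Hypothetical subsidiary page with an incomplete head section.
page = "<html><head><title>Subsidiary Brand</title></head><body></body></html>"
print(audit(page))  # → ['canonical', 'meta_description']
```

Running a check like this across every client-controlled site yields the per-site task list handed to the technical team.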
Design a "new mythology" for our client
The client had been an established entity for many years. In the distant past, they'd done things that may not be acceptable today. While Google strives to provide the best possible search results, those results are often out of date and appear because of popularity rather than because they are true. To add to the problem, many results rank highly because of negativity bias or confirmation bias - in other words, people tend to click on sensational stories. Google takes that popularity as a signal that others are interested, so the stories become even more visible in search results. We needed to break this cycle of negativity.
We started with a reputation management process we developed called the Reverse Wikipedia Strategy. The Reverse Wikipedia Strategy is a framework to improve a brand's online content by imagining what an ideal Wikipedia page may look like in a year or more. It exists within a family of strategies used by Reputation X as an exercise for strategists and content creators to think about the best possible branded content. In it you imagine the sections a fictional Wikipedia page may have, what's in those sections, and most importantly, the references (existing or not) that would support the Wikipedia page if it were real. Google results are in many ways like the sections of a Wikipedia page in that they rely on notable content.
In short, we imagined our client's future search results, developed the references and content those imagined results would contain, and executed forward from the start date. In time, positive content began to rise in the SERPs, along with the newly optimized subsidiary sites, third-party charity sites, and eventually an improved Wikipedia page (remember the Wikipedia editors we contacted earlier?).
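The Reverse Wikipedia Strategy exercise can be represented as a simple data structure: imagined sections mapped to the references that would support them, each flagged by whether it exists yet. The missing references become the content plan. The section names and reference titles below are illustrative, not the client's.

```python
def reference_gaps(imagined_page):
    """Given an imagined future Wikipedia page -- sections mapped to
    (reference_title, exists) pairs -- list the references that still
    need to be earned or created to support each section."""
    todo = {}
    for section, refs in imagined_page.items():
        missing = [title for title, exists in refs if not exists]
        if missing:
            todo[section] = missing
    return todo

# Hypothetical imagined page for a generic manufacturing brand.
imagined = {
    "History": [("Trade press profile", True)],
    "Philanthropy": [("Charity partnership announcement", False),
                     ("Local news coverage of donation", False)],
    "Products": [("Industry analyst review", False)],
}
print(reference_gaps(imagined))
```

Everything returned by the gap report is a piece of notable, verifiable content to develop before the imagined page could ever be real.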
Optimizing charity sites to better support our client
Our client was generous to charities, but wasn't getting search and social visibility commensurate with its giving. We often advise clients to follow a set of best practices outlined in the Reputation X Playbook, which directs internal PR teams to ask a charity for a few small things when making a donation. These requests included press releases from the charity structured in a particular way, multiple timed social media posts by the charity (rather than just one), specific additional content added to the charity's website, and more.
The result of these relatively simple actions was that the charity pages often rose in search results and benefitted our client by making it clear to all stakeholders that our client was - in many ways - good.
Outcome of the reputation management campaign
- Our client began to see improvement in their branded search results within 90 days of initial strategy and content approval.
- Their Wikipedia page began to improve, then lost ground, and gained again after one month. Edits continue to happen to this day.
- Sentiment scores which began as primarily negative improved to only 10% negative after six months of execution (after the research and strategy phases).
- Base measurement (beginning): 60% negative, 20% positive, 20% neutral.
- Post-campaign measurement: 10% negative, 60% positive, 30% neutral.
- Our client's legal team used background research we conducted on paid negative authors to begin legal proceedings.
Get Your Free Analysis
Find a path to success. Estimate duration. Understand costs.