A/B Testing (also known as Split Testing) is the practice of creating two or more documents or websites that are almost identical except for one characteristic. Version A is then tested against version B, and the website owner watches to see which layout is more profitable.
The only part of the document or site that differs is the element being tested. An A/B Test provides insight into website traffic and conversion rate. Whichever variation shows better results is the one that should be used permanently, or at least until a better-performing variation comes along. This test is often used in PPC marketing and affiliate marketing, and it is occasionally used in organic SEO. The goal is to always have the highest-performing elements on your website so you are earning as much profit as possible. Although A/B Testing can be very successful, it is not as popular as web analytics, SEO, and usability testing because many people don't understand what it is, how it's done, or what the benefits are.
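The split itself can be sketched in a few lines of code. Below is a minimal, hypothetical example, assuming visitors are identified by an ID; the hash-based bucketing shown here is one common illustration, not a prescribed method:

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    """Deterministically assign a visitor to one test variant.

    Hashing the visitor ID keeps the assignment stable, so a
    returning visitor always sees the same version of the page.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# A visitor always lands in the same bucket across visits:
assert assign_variant("visitor-42") == assign_variant("visitor-42")
print(assign_variant("visitor-42"))  # "A" or "B", consistently
```

Conversions can then be tallied per bucket to decide which variation wins.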
Above the Fold refers to the section of a webpage or email that is visible without scrolling.
The term originated as a reference to the top half of a newspaper. Since a newspaper is folded in half so that only the top portion is visible, and it is this area that encourages people to buy the paper, it represents the place where the most important information of the day is placed. The information and images displayed in this space are often hand-picked by the editor, business owner, or another person in a senior position. The term Above the Fold is also used in graphic design, where it refers to a visually appealing image and/or important news story placed on the top half of the front page of a journal or magazine. Other terms sometimes used for this prime space include the hot zone, above the crease, and above the scroll.
Above-the-Line Advertising is an effort to promote a brand to a mass audience.
Above-the-Line Advertising is usually carried out on television, radio, newspapers, magazines, and billboards. The Internet has also grown to above-the-line advertising levels, with banner ads on popular online portals like MSN and Yahoo. The cost of this type of advertising is usually higher than below-the-line advertising, which involves more direct forms of marketing such as direct mail and telemarketing.
Above-the-Line Advertising is effective for branding and public relations, but not for direct response campaigns.
Accelerated Test Marketing is a real-time method of market testing that takes place in a controlled environment.
Real-time testing of products, services, brands, or ideas is subject to numerous factors that can drive focus away from the test subject. With Accelerated Test Marketing, testing takes place in a controlled environment where evaluators are in full control of influential factors.
In the case of retail consumer goods, Accelerated Test Marketing can take place in a simulated shop or store; these spaces are also called laboratory test markets or purchase labs. Certain aspects of online reputation management can be tested in a closed-off local area network (LAN) that simulates the Internet, complete with a substitute search engine results page (SERP) and a browser.
Online Ad Space refers to any place on a Web document where an advertising client can pay to place a promotional item.
Available Ad Spaces on Web pages are classified according to their size and position. Certain concepts from the print industry have carried over to online advertising with regard to ad space; for example, banner, skyscraper or rectangle ads are similar to those available on the pages of daily newspapers. Online Ad Space expands along with the Internet, but premium advertising space is getting more expensive.
Ad Tracking is a methodology used to monitor the performance of a brand or the effectiveness of a specific advertisement.
Online Ad Tracking features metrics, measurements and analytics obtained with Internet applications. Some of the factors evaluated are recognition, awareness, perception of price, lifestyle compatibility, and how well the brand fits the content of the Web page.
Most online advertisements these days have built-in Ad Tracking capabilities; in fact, some forms of Internet Marketing such as banner ads, pay-per-click (PPC) and Google AdWords essentially track themselves. Online Ad Tracking is mostly used to gauge return on investment (ROI).
Affiliates are individuals who have a special marketing relationship with an online enterprise in which visitors, prospects or subscribers are captured in exchange for a reward or commission. Affiliates are often referred to as publishers, since they own or manage online properties such as blogs or forums where they can attract prospects for merchants.
Affiliates are not as significant in terms of online marketing as paid search results or e-mail marketing campaigns, but they are very effective for memberships. Certain products offered by the financial services industry are also known to use Affiliates for their marketing efforts. Affiliate networks serve as middlemen or matchmakers between publishers and merchants.
AIDA is a marketing acronym for Attention, Interest, Desire and Action; the typical order of events intended to take place when advertisements and marketing efforts are pitched to consumers.
Content is first crafted in a way that gets the attention of Internet searchers and website visitors. Interest is created by highlighting positive qualities of an individual or a brand, desire to learn more is fed by pointing to related content on other websites, and an example of action would be urging visitors to sign up for an email newsletter.
AIO is a marketing acronym that stands for Attitude, Interests and Opinions. AIO Statements are obtained by performing psychometric research surveys in certain market segments.
AIO Statements are extracted from surveys that generally consist of presenting concepts or situations to consumers and obtaining their level of agreement. These statements are then used for further segmentation of a demographic group.
Agencies study AIO Statements to create and target content that will help boost the positive image of an individual, a business entity, an organization, or a brand. Since AIO Statements reflect the unique needs and wants of a market subset, it is important to understand how this information affects perception of the online reputation of a company or a business professional.
Alexa Rank is the measurement of the global popularity of a website as calculated by Alexa Internet, a Web traffic and reporting firm owned by Amazon.
The data used to calculate the Alexa Rank is collected by the Alexa toolbar, which is distributed worldwide among users of the Microsoft Internet Explorer and Mozilla Firefox Web browsers. For the Google Chrome browser, Alexa makes a browser extension. In this fashion, Alexa measures the popularity of websites around the world and ranks them on a list that is continuously updated.
According to Alexa, the two most popular sites in 2012 were Google and Facebook. This means that the right amount of SEO and ad placement on those two sites could greatly work to improve the visibility and positive image of a brand.
An Algorithm is a collection of logical rules that provide a solution to a problem.
Computer Algorithms are programs that depend on input before a logical sequence can be enacted to solve the problem. A search engine Algorithm solves a query by following a number of unambiguous steps to parse through the index and use dynamic scripting to create a search engine results page (SERP).
The Google search engine algorithm is believed to contain more than 200 factors to rank sites. Although the Google search Algorithm is a closely-guarded corporate secret, the company issues guidance used by search engine optimization (SEO) professionals with regard to increasing the SERP rank of websites.
AltaVista (AVT) was a leading search engine and Internet portal acquired by Yahoo in 2003.
In the years leading up to the collapse of the dot-com bubble on Wall Street, AltaVista Technology Inc. was a promising company that provided online search and indexing services. The success of the AVT search engine prompted Overture Services to purchase the company, which would later be transformed into a Web portal. The AVT search engine began returning Yahoo results and was permanently redirected in 2012.
Search engine optimization (SEO) and online reputation management companies no longer optimize content for AVT. The website redirect to Yahoo search means that AVT now returns Bing results.
AllTheWeb (ATW) was a search engine owned by Yahoo from 2004 until its closure in 2011. AllTheWeb originated from a doctoral thesis on a data search method based on the File Transfer Protocol (FTP), which preceded the Hypertext Transfer Protocol (HTTP) of the World Wide Web. The advanced technology used by ATW once made it a strong competitor to Google in the late 20th century. By 2003, ATW was acquired by Overture and later by Yahoo.
After Yahoo's takeover, ATW was diminished in scope and lost popularity. ATW currently redirects to Yahoo, which now features Bing results.
ALT (Alternative) Text is an aspect of accessibility used in the programming, coding and design of Web pages.
ALT Text uses electronic character encoding systems like ASCII to specify content on a Web page that can be interpreted by screen readers and other technologies that enable accessibility for visually impaired Internet users and other visitors who live with disabilities.
ALT Text can be embedded in a Web page in place of certain content such as images, applets and data input forms.
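As an illustration, ALT Text for an image is supplied through the alt attribute of the img tag; the filenames and descriptions below are hypothetical:

```html
<!-- A screen reader announces the alt text in place of the image -->
<img src="storefront.jpg" alt="Photograph of the shop's front entrance">

<!-- An empty alt marks a decorative image that screen readers may skip -->
<img src="divider.png" alt="">
```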
The Amazon Effect refers to the positive outcome a Web document obtains once it becomes associated with a trusted domain like Amazon.com.
The Amazon Effect was initially observed by Search Engine Optimization (SEO) professionals in the defunct AltaVista search engine. The AltaVista algorithm assigned significant trust and authority to results from online retailer Amazon in the late 20th century; as a result, several listings on its search engine results page (SERP) were Amazon product listings. This is still the case for many manufacturers or distributors who get their products listed on Amazon and begin to see positive SEO results.
Ambush Marketing consists of taking advantage of, or capitalizing upon, a sponsorship, endorsement or brand association instead of paying for it.
Marketers look for different methods to boost the visibility and image of their clients; this often requires sponsorship or other types of association with other companies, organizations, events, products, or situations. The frequently "banned" TV commercials from the People for the Ethical Treatment of Animals (PETA) and the Ashley Madison dating website are examples of ambush marketing during the Super Bowl. Their commercials may not air, but they get considerable online attention due to this rejection.
Anchor Text refers to a string of characters on a Web page that points to an online resource. It can consist of related words, phrases, keywords, or sentences.
For search engine optimization (SEO) purposes, Anchor Text can become just as important as the inbound links that point to a certain Web page. Content that reflects a positive image of an individual, an organization or a brand will appear more relevant to search engine algorithms if several inbound links with the right Anchor Text are placed across the Internet.
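For example, in the hypothetical link below, the Anchor Text is the visible phrase between the opening and closing tags; it tells both visitors and search engines what the target page is about:

```html
<a href="https://example.com/reviews">independent customer reviews</a>
```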
Ah, to be perfect without any online reputation issues haunting you, your business, or your clients. Even as companies and individuals try to be exemplary citizens of the Internet ecosystem, having an online presence protected by a glowing halo of 'goodness' is quickly becoming a thing of the past. A growing legion of faceless evil-doers is busy inventing malicious digital crimes that affect everyone, from major corporate targets to unknowing private citizens. Actively maintaining a halo of positive online behavior, cultivating mutually beneficial relationships, and inviting positive online feedback is still vital to any online reputation management plan, but by itself it is rarely enough to protect an entity for long.
AOL stands for America Online. It is a digital media company and Web portal that frequently ranks in the top 100 most popular sites on the Internet.
AOL was once known as America Online, a company that became synonymous with the burgeoning concept of cyberspace. It was a precursor to the Internet, online communities and social media. AOL was a paid service until the early 21st century, when it switched from a membership-only online service to a mostly free Web portal.
AOL today lacks the clout, influence and massive membership base it had in the 1990s. The company's search engine is powered by Google, but a lot of original and engaging content is still showcased by AOL. For these reasons, AOL is still on the radar of many online reputation management companies.
The Apache Web Server is the most popular Web server software application in the world.
Apache started as an open-source development project in 1995, just when the World Wide Web was new. The Apache HTTP Server is currently present in servers hosting more than half of all Web pages in the world. Apache runs on a variety of operating systems, including Linux, Apple OS X, Unix, and Microsoft Windows.
Some of the advantages of Apache Web Server include its low cost and support of several Web technologies. Secure socket and transport layer security protocols are easy to implement on Apache Servers, and the vibrant community of software developers dedicated to the improvement of Apache makes it a great choice for a Web server.
An Application Programming Interface (API) is a tool used by programmers to develop apps that communicate with databases, networks, operating systems, websites, and other programs.
APIs essentially provide an interface for third-party services and applications to enhance the user experience. The growth of some of the most popular online social networks these days owe a lot to the Web API strategies they follow. A notable example is the micro blogging and short messaging network Twitter. Thanks to its API strategy, outside developers have created several apps that enhance the functionality of Twitter. Some third-party Twitter apps use the network to allow users to send money, find jobs, track packages, share photos, and more.
An Applet is a program developed with the intent of being embedded in a Web page, loaded in a remote computer, and executed within a browser.
Java Applets are small programs that can be developed to provide specific functions to augment the visitor experience. Java Applets bring Web pages to life with a variety of truly interactive functions and content that can be generated or modified based on input from visitors.
Typical uses for Java Applets include games, news tickers, maps, virtual keyboards, and more. Java Applets can be used on Web pages to encourage interaction and foster a positive image through engagement.
Application Server refers to a computer or a software environment that enables Web programs to run. Application Servers can run anything from operating systems to enterprise software suites.
Application Servers have become more significant in the current cloud computing paradigm due to their ability to function as platforms for remote clients to access and operate software applications without downloading or installing them.
Instead of using the old client-server model of network computing, users of Application Servers can connect to them from just about any Internet-connected device equipped with a modern browser. Application Servers are typically three-tiered systems whereby a Local Area Network (LAN) serves as a middleman between the client and the back-end component.
An Article Directory is a website that collects, stores, categorizes, and serves articles from a variety of authors on different topics.
Articles can be acquired by purchasing them from authors or through voluntary submissions. The search relevance of these websites increases as they continue to amass diverse content, eventually becoming high-authority. The importance of article directories has diminished significantly with the advent of Google's Panda and Penguin updates.
ASP.Net is a technology that uses active scripting to dynamically generate Web pages that reside on a server before they are presented to the visitor.
Active Server Page technology was created by Microsoft in the mid-1990s. ASP.Net is an advanced framework that developers use to build content that is rich and interactive. This allows for a dynamic and customized user experience that goes beyond serving static pages.
Online reputation management firms sometimes make use of ASP.Net technology whenever their projects call for information that needs to be constantly updated and customized for different visitors. Reputation X relies primarily on PHP, rather than ASP.
Astroturfing is in the same vein as whitewashing, whereby organizations cover up wrongdoing or positions of advantage with rhetoric and good deeds. It takes advantage of social media and other online platforms to mask identities or to create avatars for purported grassroots groups. Some astroturfing campaigns may also involve concerted defamation attacks against real grassroots organizations whose members truly oppose a practice, political group or business concern.
Online reputation management companies are sometimes called upon to remedy the damaging effects of astroturfing gone wrong.
Authenticity is a highly-valued quality on the Internet. Online communities are constantly on the lookout for potential forgery, fakery, swindles, and assorted acts of flimflam. For this reason, personal branding and reputation management efforts must be conducted with a high degree of genuineness in order to build authenticity.
An example of fostering authenticity in reputation management strategies is to apply an asset-based marketing approach to content creation, underscoring positive aspects of a product, service, brand, organization, or a person. These positive aspects must be easily verifiable and the content that communicates them must not blatantly come across as sales copy.
Avatars are Internet representations of members of online communities.
Early avatars were simply nicknames that contained limited biographical information. Icons and photos were later added, and by the time online social networking grew in popularity avatars had also grown in complexity.
The origin of the word avatar can be found in the ancient religious teachings of Hinduism; it essentially describes the personification of a deity. In the Information Age, the term avatar is used to describe an online persona created and managed by a human, although artificial intelligence programs can also control avatars.
Backlinks are hyperlinks placed on external pages that point back to a webmaster's own site. They are important factors in terms of search engine optimization (SEO).
Before Google rolled out its Panda search algorithm update in 2011, backlinks were crucial to SEO professionals. Part of Google's algorithm before Panda consisted of assigning rank to pages that enjoyed a high number of backlinks from across the Internet.
The backlink strategy to boost the rank of a website on the Google search engine results page (SERP) led to some questionable SEO tactics that ended up creating lots of low-quality online properties. The Panda update changed the SEO landscape with regard to quality versus quantity, but backlinks are still significant to SEO and reputation management professionals.
Bad neighborhoods are websites labeled by Google and other search engines as purveyors of spam, malware, spyware, or malicious scripts.
Bad neighborhoods are either banned from Google or banished to the lowest rungs of the search engine results page (SERP), meaning that they are unlikely to be found by casual searchers.
Black-hat search engine optimization (SEO) tactics like cloaking, link farming and spamming in general can cause a site to be labeled as a bad neighborhood. Simply linking to a bad neighborhood could result in a heavy SERP rank penalty; for this reason, SEO and online reputation management firms are always on the lookout for bad neighborhoods.
Behavioral Tracking is an online technique of collecting and interpreting market research data from browsing sessions with the intent of customizing Internet advertising to specific visitors.
Marketing and advertising agencies use behavioral tracking to follow the activities of users as they surf the World Wide Web. This is accomplished through the use of online technologies such as cookies, local shared objects and user data persistence modules that can track users across multiple websites. Most of this tracking is carried out surreptitiously, without the knowledge of users.
The use of behavioral tracking through advanced technologies has become controversial, particularly with the growth of online social networking and the vast amounts of personal data shared therein. The data collected by behavioral tracking firms is often analyzed by online reputation management firms to learn more about browsing patterns and behaviors.
There are essentially three types of search engine optimization (SEO) practitioners: White Hat, Gray Hat, and Black Hat. White Hat SEOs follow the Google and Bing Webmaster Guidelines to the letter. Gray Hat SEOs (most SEOs) follow the spirit of the guidelines, while Black Hat SEOs toss the rulebook out the window.
The bad old days of black hat SEO are largely behind us. There were specific ways black hats operated, and search engines have seen through most of those tactics at this point. Today, black hat SEO is almost a science.
Well-executed black hat tactics are like high-risk stocks: they can produce massive rewards quickly, or bankruptcy. Even when black hat tactics work, one never knows for just how long. As search engines get smarter, black hat experts must continually adapt.
A botnet, also known as a zombie army, is a network of Internet-connected computers that have been positioned, unbeknownst to the owners, to forward malware and spam to other computers.
'Bots' on the Internet are software programs that emulate human behavior in an automated and often repetitive way. Often called 'web robots', Internet bots are used to copy Web pages, commit crimes, take part in online discussions, trade stocks, search for bargains on eBay and other sites, post spam (spambots), buy concert tickets, control computers ('zombify'), and more. Web bots are faster than humans at performing repetitive tasks, or simply tasks that need an instant response, like buying the best concert tickets the second they go on sale.
Whether used for beneficial or nefarious purposes, internet bots are here to stay.
Bounce Rate is the percentage of visitors who leave a website after viewing only the page they entered on. High bounce rates often indicate that the website isn't doing a good job of attracting the continued interest of visitors, but not always.
Bounce rates could be used to determine the effectiveness or performance of a given page at creating and maintaining the interest of website visitors. For example, an entry page with a low bounce rate means that the page effectively causes visitors to view more pages and proceed deeper into the website.
Interpretation of the bounce rate should be appropriate to a website's business and conversion goals, as a high bounce rate is not always a sign of poor performance. On sites where a goal can be met without viewing additional pages, for example on websites sharing specific knowledge on some subject, the bounce rate is not as meaningful. By comparison, the bounce rate of an e-commerce site can be interpreted in connection with the purchase conversion rate and the buyer's journey through the site, since bounces there reveal visits where no purchase was made.
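The underlying arithmetic is simple; a quick sketch with made-up numbers:

```python
def bounce_rate(single_page_sessions: int, entry_sessions: int) -> float:
    """Percentage of entry sessions that viewed only one page."""
    if entry_sessions == 0:
        return 0.0  # no traffic, no bounce rate
    return single_page_sessions * 100 / entry_sessions

# 40 of 160 sessions left after viewing a single page
print(bounce_rate(40, 160))  # 25.0
```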
A Broken Link is a hyperlink on a Web page that returns an error or an incorrect page instead of the intended content.
Broken links, also known as dead links, are the cause of much consternation to website visitors. They are also the bane of webmasters and search engine optimization (SEO) professionals, since they lower the ranking of a site on the search engine results page (SERP). Most broken links return the HTTP 404 – Page Not Found error.
Neglected websites eventually suffer link rot, an unfortunate condition that causes a Web page to lose its SERP rank. SEO and reputation management professionals use different tools to monitor the health of links and the targets they point to.
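A link-monitoring tool typically fetches each target URL and inspects the HTTP status code it receives. A minimal sketch of the classification step (the network fetch itself is omitted here):

```python
def is_broken(status_code: int) -> bool:
    """Classify an HTTP status code: 4xx and 5xx responses mean
    the link target is missing or unreachable."""
    return status_code >= 400

print(is_broken(404))  # True  - the classic Page Not Found
print(is_broken(200))  # False - OK
print(is_broken(301))  # False - a redirect, not a break
```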
A Business Model is a functional concept used by self-employed professionals and companies to define their operations and general goals.
Business models describe the relationships between intentions, objectives, strategies, and tactics used by companies and individuals when they deal with their clients. At the heart of every business model there must be a strategy for creating value.
Click-Through Rate (CTR) is a metric used in online marketing to measure the performance of a Web page element.
CTR is calculated by dividing the number of clicks an element gets by the number of page impressions, then multiplying by 100. For example, a banner ad located at the top of a Web page that is viewed 1,000 times in a day and clicked 11 times has a CTR of just 1.1 percent, a rather poor performance for a premium position. It is important for webmasters to understand that CTR only reflects the percentage of visitors who clicked on a specific advertisement, not the number of people who may have viewed the ad and later took action by directly visiting a website.
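The calculation can be expressed directly in code; the numbers reproduce the banner example above:

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """CTR as a percentage: clicks divided by impressions, times 100."""
    return clicks * 100 / impressions

# 11 clicks on 1,000 impressions
print(click_through_rate(11, 1000))  # 1.1
```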
In Pay Per Click advertising, Click-Through Rate can reveal the effectiveness of a Pay Per Click (PPC) ad. Examples of PPC advertising platforms include Bing Ads and Google AdWords. CTR measures the number of clicks advertisers get compared to the number of impressions the ad has received. Click-Through Rate affects the quality score of an ad, so it's an important metric in online advertising. A high CTR isn't always a good thing, though. Since advertisers normally pay for every click, a click that doesn't end in a sale or other preferred outcome may be money wasted.
When someone performs a search in Bing or Google, they are presented with a Search Engine Results Page (SERP). The searcher sees a selection of links with descriptions below them. Based on the placement of a link on the SERP (top, middle, bottom) and its description, the searcher clicks on a search result. The number of times searchers click on a given search result, compared with the number of times that website was returned for that particular search, is the website's Click-Through Rate. So if a search result is displayed 100 times and clicked on ten times, the CTR would be 10%.
Click-Through Rate is important for web reputation management. We try to get a high Click-Through Rate and a low bounce rate for the web properties and content we create or manage for customers. The combination of a high CTR and a low bounce rate helps search engines confirm the quality of the content we have created, which leads to higher search engine rankings.
Clicks are basic units of measurement in the fields of online advertising and marketing.
In the past, a click mostly referred to the process of placing a pointer or cursor on top of a Web page element and pressing the left mouse button. With the advent of touchscreens and operating systems like Android, iOS and Windows Phone, swipes and taps are replacing mouse clicks, but they are still counted as actions that activate elements on a page.
Not all clicks are designed to take a visitor to a different Web page. Social networking buttons that invite visitors to share, like and vote on content perform complex actions that do not require the visitor to leave the page they are browsing.
Cloaking refers to a deceitful search engine optimization (SEO) practice that involves hiding certain elements from visitors that only the search bots or crawlers get to see and index.
Among the many mischievous techniques used by black hat SEO practitioners, cloaking stands out as one of the most egregious. When cloaking is applied to a Web page or site, visitors are treated to content that may be completely unrelated to the description they saw on the search engine results page (SERP).
Cloaking is a dishonest method to boost the SERP rank of a specific Web page. Search engine giant Google has a strict policy against websites that feature cloaking, which includes labeling those sites as bad neighborhoods and banning them altogether from their index.
Comment Tags are part of the HTML source code that allow the input of text remarks and commentary on Web documents without being displayed in a Web browser.
To the average person, HTML source code may look incomprehensible except for the text that is contained within comment tags. These HTML tags are supported by all major Web browsers, but they are not normally scanned by search engine crawlers, at least not by Google.
Comment tags start with <!-- and end with -->. Any text placed between these markers is not displayed in a Web browser, but it is visible to visitors who activate the source code viewer component of their browsers or who open the page with a program that allows HTML editing.
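A minimal example of a comment tag in context:

```html
<p>This paragraph is rendered by the browser.</p>
<!-- This remark appears only in the page source,
     never in the rendered page. -->
```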
Conversion Ratio (CR) is a measurement of the effectiveness of an online advertising or marketing campaign.
The most basic conversion rate calculation involves setting a goal for visitors, like buying something or entering information on a form. The number of people who actually complete the action out of every 100 visitors is the conversion ratio. Affiliate networks, e-commerce websites and search engine optimization (SEO) professionals consider conversion ratios to be a true measurement of their marketing prowess.
In the case of a webmaster capturing leads for a client, if 10 out of 200 visitors enter their contact information on a form, the conversion ratio is five percent. Conversion Ratios can also be used as part of return on investment (ROI) calculations.
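The lead-capture example above can be checked in a line of code:

```python
def conversion_ratio(conversions: int, visitors: int) -> float:
    """Conversions per 100 visitors, expressed as a percentage."""
    return conversions * 100 / visitors

# 10 of 200 visitors filled in the contact form
print(conversion_ratio(10, 200))  # 5.0
```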
Cost-Per-Mille (CPM) is the most traditional payment model used in advertising; it measures how much a client will pay for a single advertisement to reach one thousand magazine readers, television viewers, radio listeners, website visitors, email readers, or other prospects.
CPM is calculated by dividing the cost of placing an advertisement by the number of impressions, then multiplying by one thousand. An impression is equivalent to a single page view. Online reputation management specialists often evaluate CPM quotes from different publishers in case they need to place advertisements for the content they create for their clients.
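In code, with hypothetical figures (a $250 placement served 50,000 times):

```python
def cpm(total_cost: float, impressions: int) -> float:
    """Cost per one thousand impressions (cost-per-mille)."""
    return total_cost * 1000 / impressions

print(cpm(250.0, 50_000))  # 5.0
```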
The Communications Decency Act of 1996 (CDA) is an act of the United States Congress that aims to regulate Internet content.
Legal scholars consider the CDA, which was signed by former President Bill Clinton, to be the first major legislative action to attempt placing limits on content publishing on the Internet. The CDA has been controversial since its enactment, and has been subject to numerous legal challenges at the Supreme Court level.
One of the most disputed sections of the CDA deals with the issue of defamation and the level of responsibility that webmasters and owners of online properties have over the content generated by third-parties. Section 230 of the CDA often leaves victims of defamation without recourse, which is one reason for the growth of online reputation management firms.
ConsumerReports.org is the flagship website for the Consumers Union, the group responsible for all Consumer Reports Media, which includes various print publications and websites.
Since 1936, Consumer Reports has published analysis and commentary on products and services, produced by an independent and mostly anonymous editorial group. The opinions presented by the Consumer Reports media group are generally very well received and highly respected.
For manufacturers and providers of goods and services, getting a good review or positive mention by Consumer Reports is extremely desirable. Negative mentions, on the other hand, can be particularly damaging. To this end, Consumer Reports has been the target of defamation lawsuits by known brands such as Suzuki and Sharper Image.
Content is information that can be produced, published and marketed to convey a message and provide value to audiences. Online content can be presented to website visitors in a variety of media formats.
Content creation is at the heart of all search engine optimization (SEO) and reputation management strategies. The phrase "Content is King" is often used these days in the context of producing quality websites that visitors can use and enjoy repeatedly.
In 1996, Microsoft co-founder and former CEO Bill Gates wrote about the importance of content to the future of the Internet. He mentioned that content, the underlying information that can be presented in a variety of digital media formats, presented the greatest opportunity for branding and making profits. Back then he was referring to the Microsoft and NBC news venture that became MSNBC.
A Contested Name Space is a situation in which two or more individuals or business entities share the same name and compete over it online.
When reputation management companies are retained to protect the online image of a person or a business entity, one of the most important issues they look at is the name space, which is defined by the number of listings that appear on the search engine results page (SERP) when a name query is submitted. Shared name spaces mean that common names like "ABC company" compete against each other to gain rank on the SERP.
Contested name spaces present a problem for reputation management and search engine optimization (SEO) professionals.
Browser cookies retrieve data that informs a website about the previous activity conducted on the site by a particular visitor. Web browsers have supported cookies since the mid-1990s. Cookies serve a variety of purposes, from remembering visitor preferences to keeping a session secure and encrypted. Third-party tracking cookies, however, are controversial due to their behavioral tracking functions; privacy advocates argue that they are highly intrusive since they are stored on the hard drives of visitors' computers, often without their knowledge. Add-on modules and components can be installed in Web browsers to block some of these tracking cookies.
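As an illustration of the mechanics, Python's standard library can both emit and parse the cookie headers exchanged between a server and a browser (the cookie names and values here are hypothetical):

```python
from http.cookies import SimpleCookie

# Server side: set a preference cookie for a returning visitor
outgoing = SimpleCookie()
outgoing["theme"] = "dark"
outgoing["theme"]["max-age"] = 60 * 60 * 24 * 30  # remember for 30 days
set_cookie_header = outgoing["theme"].OutputString()

# Server side: read back the Cookie header the browser sends on its next visit
incoming = SimpleCookie()
incoming.load("theme=dark; session_id=abc123")
print(incoming["theme"].value)  # dark
```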
Cost-Per-Action (CPA) is a methodology for setting online advertising rates that is specifically based on measurable actions such as registrations or sales.
CPA models usually put the burden of obtaining a sale or a lead on the publisher. For the client, this means that the publisher will be motivated to produce earnings from CPA and thus may put greater effort on getting visitors to click on ads and either make a purchase, register for service or to receive information. This payment model is typically seen in affiliate marketing arrangements.
Online advertising professionals use different payment models to compensate marketers and publishers. Other models, like cost-per-click and cost-per-mille, place less of a burden on the marketer or publisher.
Cost-Per-Lead (CPL) is a payment model for online advertising campaigns that sets the amount of revenue a publisher or marketer will receive from a client who expects to generate prospects and sales contacts.
CPL campaigns guarantee that the advertiser or client will get a certain number of leads, but the quality of the leads is often widely open to interpretation and negotiation.
The expected Return on Investment (ROI) of CPL campaigns is usually higher than that of Cost-Per-Click and Cost-Per-Mille strategies, but if prospects do not see much value in the offer they encounter, they may enter unreliable or even outright false information instead of solid leads.
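A back-of-the-envelope comparison shows why CPL can pencil out well (all figures here are hypothetical):

```python
def roi(revenue, cost):
    """Return on investment as a percentage of cost."""
    return 100.0 * (revenue - cost) / cost

# Hypothetical CPL campaign: 50 leads at $8 each, 5 of which
# convert into $120 sales
cost = 50 * 8       # $400 spent
revenue = 5 * 120   # $600 earned
print(roi(revenue, cost))  # 50.0 percent
```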
A Web Counter is a code module that collects general information and statistics about website usage.
Web counters measure data such as the number of pages visited by each visitor, the number of unique visitors who arrive at a specific site in a day, and other metrics. Counters have evolved into more sophisticated tracking systems like Google Webmaster Tools and Google Analytics.
Data from counters and analytics software is constantly reviewed and analyzed by search engine optimization (SEO) professionals and reputation management firms. Sophisticated counters can collect important information about visitors, the type of device they are using to browse sites, the time of day they prefer for browsing a site, etc. This information is also very useful to reputation management clients.
A Crawler is an automated browsing application that methodically scans Web documents across the Internet. Crawlers are typically used by major search engine providers, and they are also known as bots or spiders.
Major search engines like Google and Bing send out their crawlers across the Web to scan pages and their content. These crawlers are specifically programmed to look for certain characteristics of the sites that will be later evaluated by indexing programs.
Modern crawlers make copies of the Web sites and pages they visit; by doing so, search engines can show cached copies of pages they store on their servers. Crawlers can also be programmed to harvest data, such as email addresses, for unsolicited marketing purposes. Reputation management firms are well-aware of crawler mechanisms and behaviors; this allows them to better serve their clients in terms of search engine optimization (SEO).
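A toy sketch of one crawler step described above — extracting the addresses a bot would queue for later visits — using only Python's standard library (the page HTML and URLs are stand-ins for a real download):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href targets a crawler would schedule for later visits."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own address
                    self.links.append(urljoin(self.base_url, value))

page = '<p><a href="/about">About</a> <a href="http://example.org/">Elsewhere</a></p>'
extractor = LinkExtractor("http://example.com/index.html")
extractor.feed(page)
print(extractor.links)
```

A production crawler would also fetch each queued URL, respect robots.txt, and deduplicate addresses it has already seen.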
Crawling The Web is the activity that search engine bots, crawlers and spiders perform as they encounter Web pages and copy their content and information.
Humans browse the Web; the bots sent out by major search engines like Bing and Google do not browse so much as crawl or spider the Internet as they catalog the information they find. The crawling behavior and functions of these bots are closely guarded secrets that start with the bot architecture. A standard crawler is essentially a downloading program that analyzes URLs as well as the text and other data it finds on the Web pages it encounters.
Webmasters can instruct crawlers to skip their sites and pages, typically through a robots.txt file or noindex meta tags; doing so results in those pages and their content not being indexed as part of the search engine results.
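The conventional mechanism for declining crawls is a robots.txt file at the site root, and Python's standard library can evaluate one the way a well-behaved crawler would (the domain and paths here are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A robots.txt a webmaster might publish to keep crawlers
# out of a private section of the site
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rules = RobotFileParser()
rules.parse(robots_txt.splitlines())

print(rules.can_fetch("Googlebot", "http://example.com/private/report.html"))  # False
print(rules.can_fetch("Googlebot", "http://example.com/index.html"))           # True
```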
Cross-Linking is a search engine optimization (SEO) practice that involves placing hyperlinks on a Web page that point to another site owned by the same webmaster.
Cross-Linking can be used as a linking scheme to increase the ranking of websites on the search engine results page (SERP). Search engine algorithms take into account WHOIS and ownership when evaluating cross-linking practices, and sites can be penalized with lower SERP rank as a result.
The most intrinsic value of the Internet is the ability that webmasters and users have to share links and promote the process of information discovery. Based on this premise, the process of interlinking sites and domains together should be applauded since it encourages visitors to review other sites that have some affinity. Cross-linking, however, is not a practice welcomed by major search engines like Google when linked sites share the same owner.
Cross-Pollination is a search engine optimization (SEO) practice involving the distribution of content across multiple websites and online communities.
The content that is cross-pollinated can be distributed in a variety of file formats, from documents to video and more, and it can be adjusted and optimized to match the characteristics of different online platforms; what is important is that the message remains the same as it is distributed across the Internet.
Ever since Google rolled out its Panda search algorithm update (going on version 3.0 now), content marketing and cross-pollination have been a strong focus for the SEO and online reputation management communities. Promoting a positive image on the Internet is a matter of putting great content to work, and this can be accomplished with cross-pollination. Cross-pollination does, however, run the risk of spreading duplicate content rather than fresh, unique content.
Crowdsourcing in the context of online reputation management utilizes hundreds or thousands of people doing relatively simple computer tasks in order to accomplish an overall goal quickly and efficiently. An example of crowdsourcing would be to assign 100 people the task of identifying blogs that accept a specific type of guest post, then inputting that information in a form for later use by a small group of search engine optimization personnel.
Other examples of crowdsourcing include checking text for errors, writing travel reviews (think TripAdvisor), designing logos, and generating ideas.
Properly done, crowdsourcing can save time, money, and produce excellent results. Improperly done, crowdsourcing can be an expensive waste of time. The term crowdsourcing was coined by Jeff Howe in a Wired article.
A Cyber Stalker is an individual or entity that uses the Internet as a platform for harassment and intimidation.
In the physical realm, stalkers pose serious dangers to individuals and organizations. Cyber stalkers in the online world use similar methods of harassment, but they are often aware that their actions could be seen by millions. Their criminal objective on the Internet is the same as in the offline world; they want to make their victims feel threatened, endangered, intimidated, and thoroughly harassed.
Clients of online reputation management firms often see their good names torn to shreds on the Internet by cyber stalkers. Depending on the level of harassment employed and their apparent level of credible threat, law enforcement may become involved.
Cyber Stalking is a form of cyber crime that involves intimidation and harassment by a perpetrator who intends to make the victim feel hunted and endangered.
Stalking on the Internet is unfortunately increasing along with the growth of online and mobile communications. Cyber stalking is a criminal activity undertaken by individuals or groups who take advantage of the pervasiveness of Internet communications to make people or organizations feel threatened. Reputation management firms are frequently retained by clients who have been unfairly targeted in this regard.
Recurring false accusations and continuous defamation are forms of cyber stalking. In some cases, disparaging comments launched at a person or a business entity may be due to a strong difference in opinion, but when the harassment does not stop and begins to take on certain threat patterns, law enforcement may be called in.
A Cyberbully is a dangerous individual who uses Internet communications to deliberately harm others through slander, shaming, hostilities, and threats.
The malicious work of cyberbullies is not just a flash in the pan; these nefarious individuals often increase their harassment to dangerous levels for their victims. Cyberbullying is a phenomenon that has increased considerably among young people in recent years; adolescents are often the most vulnerable and deeply affected by this senseless crime.
Cyberbullies sometimes pick their victims at random, and they often use obfuscation techniques to remain anonymous and continue their vicious attacks. Law enforcement agencies are paying closer attention to reports of cyber stalkers and cyberbullies.
Cyberbullying is a law enforcement term used to describe the collective activities of cyberbullies who use Internet technologies with the intent of causing harm to individuals or groups of people.
Just like cyber stalking, cyberbullying is a malevolent and spiteful activity that is growing in tandem with the popularity of mobile personal computing. In some cases, online reputation management firms are retained to deal with the aftermath of a cyberbullying attack after law enforcement has intervened.
Instances of cyberbullying are growing at an alarming rate among young people, who unfortunately are more vulnerable to the online shaming and slandering typically brought on by cyberbullies. Even apps like Yik Yak, a kind of anonymous Twitter, are being used for cyberbullying.
Data Aggregators are companies that collect, process and compile public information and offer it in electronic formats that are easy to digest.
The collection, processing and sale of public information has been a booming business ever since the World Wide Web came of age in the mid-1990s. The biggest clients of data aggregators include private investigators and companies that repackage data and sell it to consumers.
Although the information collected and sold by data aggregators is available to the public, critics of these information brokers claim that their business model amounts to scraping up sensitive data that people would normally not want offered for sale. Critics also point out that the information is often sold irresponsibly to data brokers.
Data Brokers are clients and business partners of data aggregators. They market personal information compiled by aggregators and sell it on a retail basis to interested parties.
Data Brokers can operate as individuals or business entities. In the United States, the information collected by aggregators and sold by data brokers is regulated by the Fair Credit Reporting Act. In fact, consumer credit reporting agencies are major information sources for data aggregators and brokers.
The data brokerage industry has come under fire due to their blatant marketing of personal information. In some cases, information sold by data brokers has ended up in the hands of stalkers and identity thieves.
Data Mining, also known as Scraping or Scouring, is the process of using bots or crawlers that traverse the Web looking for personal information.
Data aggregators and other types of information brokers do not get their information exclusively from sources such as the post office, driver's license bureaus, credit card companies, and courthouses. They often use data mining.
The data mining process allows information brokers to put together digital dossiers on people, which include personal data and behavioral tracking information. These profiles are offered by data brokers to marketers, advertisers and the general public. Data mining critics point out that such activities diminish expectations of online privacy, and that stalkers and identity thieves benefit the most from this practice.
A Dead Link is synonymous with Broken Link. Such a link is a hyperlink on a Web page which results in an error or an incorrect page instead of the intended page.
Dead links, also known as broken links, are a source of much distress for website visitors because the server delivers a 404 error message instead of the page they are looking for. They can also lower a site's rank in search results.
In addition, websites which have not refreshed or repaired their links will result in what is known as link rot. Link rot is an unfortunate condition that causes a Web page to lose its SERP rank. SEO and reputation management professionals use different tools to monitor the health of links and the targets they point to.
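The link-monitoring tools mentioned above essentially request each link target and classify the HTTP response; a simplified sketch of that classification step:

```python
def link_status(http_code):
    """Classify an HTTP status code the way a simple link checker might."""
    if http_code == 404:
        return "dead"          # the classic broken-link response
    if 300 <= http_code < 400:
        return "redirect"      # the target may have moved
    if 200 <= http_code < 300:
        return "ok"
    return "error"             # server errors, forbidden pages, etc.

print(link_status(404))  # dead
print(link_status(200))  # ok
```

A real checker would issue an HTTP request per link and feed the returned status code into a function like this one.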
Deep Linking, also known as internal linking, is a technique used by search engine optimization (SEO) professionals that involves creating references to content that resides in the same website.
Deep Linking is at the heart of prominent online properties like Wikipedia, and it can be easily adopted by Web publishers. This publishing technique is encouraged among professional bloggers and reputation management specialists for various reasons. Deep linking not only boosts the overall SEO of a website but also encourages visitors to discover more content within the same site.
When creating interesting and persuasive content, deep linking is recommended to authors and Web designers. Deep linking can be accomplished through the use of anchor text, images or other types of media.
The Deep Web is the vast mass of information that is not normally discovered and indexed by search engine crawlers.
Information that makes up the Deep Web is contained in databases, social networks and subscription-only sites that can only be accessed by human queries. It is impossible to accurately determine just how much information on the Web lies beneath what can be crawled and indexed. Internet analysts believe that the Deep Web is likely bigger than what can be currently accessed and indexed by search engine crawlers.
The Deep Web is not completely impenetrable; to this end, some data mining companies are coming up with methods to extract personal information from these databases that can be packaged and sold to interested parties. The Deep Web is sometimes referred to as the Invisible Web, although this is a misnomer.
Defamation is an unethical activity that involves making deliberately false accusations against the good name or character of individuals or organizations.
Defamation is carried out in public, and it is also referred to as slander, disparagement, or character assassination. These wicked actions often take place online; the Internet gives perpetrators a false sense of isolation when they conduct these attacks. Defamation and libel are by no means forms of free speech, but this is one of the dubious defenses employed by online slanderers who sometimes step up their harassing attacks into cyber stalking and cyberbullying.
Description Meta Tags are HTML elements that tell search engine crawlers certain information about what visitors will find on a certain Web page.
Meta tags are not commonly seen by Web page visitors. The description meta tag, for example, is very important for both reputation management firms and search engine optimization (SEO) professionals since it serves to describe and summarize the content found in a page.
Description meta tags should be written in a concise and friendly way, using relevant keywords that will boost the rank of the page on the search engine results page (SERP). Aside from fulfilling keyword density requirements, description meta tags are also used by some search engines to display information right on the SERP about the website.
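A minimal sketch of building and sanity-checking such a tag (the 160-character ceiling used here is a common rule of thumb for SERP display, not an official standard):

```python
def description_tag(description, limit=160):
    """Build a description meta tag and flag likely SERP truncation."""
    tag = f'<meta name="description" content="{description}">'
    fits = len(description) <= limit
    return tag, fits

tag, fits = description_tag("Plain-language definitions of SEO and reputation management terms.")
print(tag)
print(fits)  # True
```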
The Digital Millennium Copyright Act (DMCA) is a law passed by the Congress of the United States of America on October 28, 1998. The law makes it a crime to sell devices intended to "crack" code or bypass anti-piracy software, and it requires Web content developers to pay license fees to use the products of music companies online.
A Web Directory lists, categorizes and organizes links to other destinations on the Internet.
Before discovery on the World Wide Web was handled by search engines, Internet users flocked to directories to look for content and information. Web directories were crucial to the Internet navigation experience, and they were constantly updated and maintained by human editors.
Today's Web directories are not as popular as they were in the 20th century, but they still serve a significant purpose. The social media concept of discovery, curation and sharing across social circles is similar to what Web directories accomplished in the early days of the Web. Blog rolls and social bookmarking services are two modern Internet concepts derived from Web directories.
Yahoo was founded on its hand-built Web directory, which was shut down at the end of 2014.
A Domain is a naming convention used by Internet resources that consists of a chosen name immediately followed by a Top Level Domain (TLD) designation like .com, .net or .org.
Websites reside at Internet Protocol (IP) addresses, which are numeric in nature; domain names are the memorable alphanumeric strings that stand in for them. An IP address contains important information such as the location of the server that hosts the domain and website. Domain Name Server (DNS) systems perform the actual translation of alphanumeric strings into IP addresses.
The registration of domains and the hosting of websites are part of an important global industry similar to real estate markets in terms of supply and demand. Choosing and registering appropriate domain names are crucial activities for search engine optimization (SEO) and reputation management firms.
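The name-to-address translation can be illustrated with a toy lookup table (real DNS queries a distributed hierarchy of servers, and the addresses below are purely illustrative):

```python
# Toy stand-in for the DNS lookup step; real resolution consults
# a hierarchy of name servers rather than a local dictionary.
FAKE_DNS = {
    "example.com": "93.184.216.34",
    "example.net": "203.0.113.7",
}

def resolve(domain):
    """Translate a memorable domain name into a numeric IP address."""
    return FAKE_DNS.get(domain.lower())

print(resolve("Example.com"))   # 93.184.216.34
print(resolve("no-such.test"))  # None
```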
Domain Name Server or DNS Propagation is the process of distributing a newly-registered domain name, its Internet Protocol (IP) address and registration details across the World Wide Web.
Along with Web servers, domain name servers (DNS) form the backbone of the Internet. During the DNS propagation process, millions of DNS servers around the world are updated with information about new domains.
It usually takes about 24 hours after registration for a domain to be fully accessible from just about any Internet-connected device with a modern browser. This does not mean that a site will be automatically indexed and ranked by Google at that time; search engine optimization (SEO) is a process that takes considerably more time.
A Domain Name Server or DNS is a special computer configured to store databases that communicate with root servers, which are the systems that handle top-level domains (TLD) like .com and .org.
Translating the alphanumeric strings of website names into Internet Protocol (IP) address formats is an invisible process handled by the DNS computers assigned to all Internet-connected devices. This process is handled millions of times a day by interconnected computers specially designed to perform this task.
The DNS system is hierarchical, and in some countries is handled by a central government office. DNS computers are typically found in the server farms of Internet Service Providers (ISPs) and companies that offer domain registration services.
A Domain Squatter is an individual or business entity that registers a domain name that may fall under the protection of trademark laws with the intent of turning future profits when someone expresses interest in purchasing it.
Domain squatting or cybersquatting is a questionable practice. Unscrupulous domain name squatters often look for names of business entities and register them with the intent of contacting the business owners and offering to sell the domain name at an inflated price.
In the United States, domain squatting is a violation of the Anticybersquatting Consumer Protection Act. Egregious domain squatters sometimes publish negative content about their targets in an effort to extort them; in this case, reputation management firms can be retained until the matter is resolved in court or by law enforcement action.
A Doorway Page is a Web page designed by unscrupulous search engine optimization (SEO) practitioners for the purpose of tricking search engine algorithms into giving it high rank and directing visitors to a different website. In the bad old days of black hat SEO, doorway pages were more common. Today, search engines have grown in sophistication and can usually identify a doorway page with little problem.
The mechanism behind doorway pages is called spamdexing, whereby content tricks a search engine into indexing a page and giving it high rank on its search engine results page (SERP).
Unlike landing pages, which feature rich and relevant content supported by an online marketing campaign designed to drive visitors to a particular site, doorway pages and spamdexing are frowned upon by Google and other major search engines. Doorway pages may result in penalties that could push websites way down the SERPs.
Duplicate Content is defined as media that is identical, or at least similar to a very noticeable degree, to other content on the World Wide Web. From a search engine optimization perspective, duplicate content is considered bad.
Duplicate Content across multiple Web pages has always been a problem for website visitors. When duplicate content is used as part of a search engine optimization (SEO) strategy, it could result in penalties that may diminish the rankings of a site with duplication of content.
Of all the issues that the Google Panda search algorithm update addressed in 2011, duplicate content was the most salient. For many SEO firms, Panda was an abrupt wake-up call with regard to content duplication across the Web. Publishing duplicate content these days may end up creating a negative SEO situation whereby copycats are pushed down the search engine results page (SERP).
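Duplicate-detection tools often approximate "identical or noticeably similar" by shingling: comparing the sets of overlapping word sequences two texts share. A minimal sketch (any threshold a real system applies to the resulting score would be tuned, not fixed):

```python
def shingles(text, k=3):
    """Set of overlapping k-word sequences in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard similarity between the shingle sets of two texts."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

print(similarity("the quick brown fox jumps", "the quick brown fox jumps"))  # 1.0
```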
Dynamic Content on the World Wide Web refers to content on pages created based on input from visitors or other external sources.
Dynamic Web pages feature special content that changes according to parameters set by internal or external events. One example is the search engine results page (SERP). The content on these pages is dynamic rather than static; this enhances the visitor experience by providing content that is interesting and engaging. Active server page (ASP) technology is often used in the creation of dynamic content.
Search engine optimization (SEO) professionals recommend dynamic content to webmasters whose pages offer products and services. For the purpose of ongoing reputation management campaigns, adding dynamic content to a page can encourage visitors to return and browse new content. Dynamic content is evolving in the sense that it is becoming smarter and more adaptive.
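A bare-bones illustration of content built from visitor input, the defining trait of a dynamic page (the query and result titles are hypothetical; user input is escaped, as any real implementation should do):

```python
import html

def render_results_page(query, titles):
    """Build page markup that varies with the visitor's search query."""
    safe_query = html.escape(query)
    items = "".join(f"<li>{html.escape(t)}</li>" for t in titles)
    return f"<h1>Results for {safe_query}</h1><ul>{items}</ul>"

page = render_results_page("widgets", ["Widget A", "Widget B"])
print(page)
```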
Dynamic Internet Protocol (IP) addresses are numeric strings assigned to websites, hardware devices and other resources on the World Wide Web.
Static IP addresses are used for resources that are continuously online, like DNS computers and Web servers. Dynamic IP addresses are assigned by Internet Service Providers (ISPs) from a pool of available addresses each time a user connects, and they may change between sessions. Your home Internet connection probably uses a dynamic IP address.
A website served from a dynamic IP address often resides on a shared host server. Dynamic IP addressing is fine for casual Web browsing, but other activities, like multiplayer online gaming and Voice over IP (VoIP) telephony, perform better with static IP addresses.
Electronic word of mouth (eWOM)
eWOM, or electronic word of mouth, is a branch of buzz marketing that involves creating a funny or persuasive message that goes viral. The intention is for consumers to share, like, retweet and repost the message while becoming more familiar and loyal to the brand.
An exact match domain, also called EMD, is a domain name that exactly matches the search phrase relevant to that domain, increasing the relevance of that website and likely improving its ranking as well.
The Google Farmer Update is a change to the company's secretive search engine algorithm. It is also known as Panda, after the last name of a Google engineer who was instrumental in its development and implementation.
The update is known as Farmer because it specifically targets link farms and content farms. These spam purveyors were demoted far down the search engine results page (SERP) soon after the roll-out of the Farmer update.
Ever since the Farmer/Panda update rolled across the World Wide Web in 2011, search engine optimization (SEO) and reputation management professionals have been paying close attention to the subsequent algorithm updates to ensure that the interests of their clients are not compromised.
Search engine Filters are subroutines programmed into the search crawlers to detect Internet spam and dubious search engine optimization (SEO) practices.
Major search engines like Google constantly send out their crawlers, robots or spiders across the World Wide Web to evaluate and index all available websites and the content published therein. As these programs execute their functions, they use filters to discern the information of the pages they copy.
Some of the questionable SEO practices that filters are programmed to look for include cross-linking, cloaking and spamdexing. If the filters detect such practices, the search engine algorithm may assign a low rank to that Web page or to an entire site. SEO professionals and online reputation management companies are familiar with these filters and make sure their clients' pages are free from any anomalies that may trigger the filters.
Flaming is the intentional online bashing of a person or business entity through insults or personal attacks online.
Flaming has been present on the Internet since its inception; it was particularly noticeable on interactive networks like message boards, online chat rooms and today on social media platforms. The World Wide Web has always been a place for lively discussion that reflects the diversity of users around the world. Discussion often gets heated, however, and turns into hostile ad hominem attacks that feature insults, harassment and profanity.
Flaming is sometimes used as a diversion or excuse for more elaborate attacks carried out against individuals and organizations.
A Forum is an online space that invites discussions on different topics.
Internet forums may also offer social networking features; in fact, they are often considered precursors to the social media paradigm. Web forums evolved from email lists, Usenet and chat rooms. In the early 21st century, software developers created powerful messaging board systems to host forums and thus create vibrant online communities.
Online reputation management firms constantly monitor Internet forum activity on behalf of their clients. Web forums are good examples of collective wisdom, but they are also ripe for abuse by unscrupulous members or people who wish to promote negative opinion or perception of an individual, a brand or an organization.
Fraudulent reviews are reviews falsely placed on review sites by culprits, shills, and competitors. They can be positive or negative in nature and are intended either to garner business for a website or to drive it away.
In the past few years, review sites have gained major prominence online and are recognized by Google as valuable, relevant, trustworthy content. That means content such as negative reviews on Yelp, Amazon, and Ripoff Report stays high in search results. Unfortunately, because no review site is able to fully verify the reviews that are posted (and the vast majority of them make only feeble attempts to do so), they breed overt anti-competitive behavior as adversaries post highly damaging negative reviews online. Likewise, a single disgruntled customer can wreak havoc on a business by posting a caustic review of a product or service, leaving very little recourse for the typical business owner with little or no knowledge of online reputation management.
A gap analysis, with respect to reputation management, is an investigation of the differences between the search results of a business engaged in online reputation improvement, and the search results of one or more competitors. This type of analysis is intended to find threats and opportunities that will dictate a reputation management strategy.
Google is a California-based multinational technology company that owns and operates the world's most popular search engine.
Google is one of the most ubiquitous names in online technology these days. It began as a computer science research project in 1996; the project was a data search tool called BackRub, and a year later was named Google. It only took a few years for Google to become a household name and a verb. The meteoric rise of the company has transformed the way people use the Internet and other online technologies.
The substantial market share of Google as a search engine around the world has made the company synonymous with search engine optimization (SEO).
Googlebot is the name of the Google search engine crawler.
Of all the search engine spiders that traverse the World Wide Web around the clock, Googlebot is the busiest and most recognizable. Googlebot is a powerful software application that browses, collects and copies pages, documents and objects. Googlebot begins its crawl with massive lists of websites to visit and then extracts even more website addresses for later visits.
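The crawl loop described above, starting from seed lists, extracting new addresses, and queuing them for later visits, can be sketched as a breadth-first traversal. The link graph and URLs below are hypothetical stand-ins for real Web pages:

```python
from collections import deque

# A toy link graph standing in for the Web: page -> pages it links to.
# (Hypothetical URLs, for illustration only.)
LINK_GRAPH = {
    "example.com/": ["example.com/about", "example.com/blog"],
    "example.com/about": ["example.com/"],
    "example.com/blog": ["example.com/blog/post-1"],
    "example.com/blog/post-1": ["example.com/"],
}

def crawl(seeds):
    """Breadth-first crawl: visit each page once, extracting new
    addresses for later visits, as the entry describes."""
    frontier = deque(seeds)
    seen = set(seeds)
    visited = []
    while frontier:
        page = frontier.popleft()
        visited.append(page)
        for link in LINK_GRAPH.get(page, []):
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return visited

print(crawl(["example.com/"]))
```

A real crawler adds politeness delays, robots.txt handling, and duplicate-content detection on top of this basic frontier loop.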
The Google Cache is the storage space where copies of Web pages crawled and copied by the Googlebot are stored.
When major search engines send out their indexing robots, also known as crawlers or spiders, they are actually downloading copies of the pages into their servers. These cached copies are stored in their servers and offered to search engine users who may wish to browse a page and see how it looked weeks before.
The Google Cache is useful in the sense that it provides users with the information they are looking for even if the website is down at the time they conduct their search. The Google Cache function is also controversial since it retains and displays copies of Web pages long after webmasters delete them.
Google Instant was a technology that provided instantaneous glimpses of information and suggestions as users entered their search queries. Google eliminated Instant on 26 July 2017. Where Google used to show search results as you typed the query, you must now submit a query in order to see search results.
Google Toolbar is a browser add-on module that enables searching without navigating to other pages.
The Google Toolbar can be downloaded and installed in various browsers to provide users with a rich search experience without having to leave the page they are presently browsing. It was a very popular download until the Chrome Web browser was released to the public.
To make the Google Toolbar more attractive and useful to users, certain enhancements like pop-up blockers, dictionaries and translation modules have been included.
A Googlebomb is a manipulation of the search engine results page (SERP) to make a certain Web page rank particularly high for some keywords.
For all its algorithmic measures of integrity and transparency, the Google search engine can be manipulated to a certain extent. A Googlebomb is the result of Googlewashing, an unscrupulous practice that can create an awkward situation for Web searchers. The most famous Googlebomb took place in the mid-2000s, when searchers who typed the keywords "miserable failure" were surprised to see the biography of President George W. Bush as a top result.
The "miserable failure" Googlebomb exposed the vulnerability of Google and other search engines with regard to their indexing practices and algorithms.
Most search engine optimization experts fall somewhere in the category of "gray hat" SEO. They follow the rules search engines provide as to best SEO practices, but they continually push the envelope and explore the "gray" areas of SEO. They bend the rules for the benefit of their clients without usually breaking them. Gray hat SEOs are not black hat, nor are they purely white hat either.
A typical gray hat SEO will earn some inbound links and buy others. Buying links is considered a no-no by search engines, but SEOs do it anyway - mainly because they must. Money changes hands, but all in the name of high-quality content and a beneficial search experience for users. This differs from black hat SEOs, who sacrifice the search experience in favor of the best rankings possible in the shortest period of time.
Guest posting is the writing and publishing of articles and posts by writers who are neither employees nor owners of a website or blog.
Guest posting is mutually beneficial to both writer and website: the writer gets an opportunity for publication, and the website has fresh and updated expert content keeping its ranking up in search results.
Hacking has made a public splash in recent years with scandals surfacing in large corporations all over the world. Hacking into emails and classified documents is one thing, but online enemies can be more subtle (and perhaps far more damaging) than that. Hackers can quietly gain access to a company website, surreptitiously and systematically alter reviews, change content, or divert vital traffic to other websites. More overt hacking can defile a site or take it down completely. It can result in public humiliation, huge losses in sales, major spending on IT and resources, and negative press.
Header Tags are descriptive HTML elements used by search engine optimization (SEO) professionals to communicate the relevance of certain elements to visitors and to Web crawlers like Googlebot.
Header Tags allow programmers to include descriptive elements that call attention to important aspects of the document. Keywords can be discreetly used inside header tags to boost the SEO rank of pages, and the font size can be adjusted as well.
SEO specialists and reputation management professionals use header tags in a hierarchical order for maximum effect. The first header tag, for example, indicates the theme of the page. The second and third header tags are used to point out the main sections and subsections of content on the page.
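The hierarchical order described above can also be read programmatically. As a sketch, the snippet below uses Python's standard html.parser to pull an outline of header tags from a sample page; the page content is invented for illustration:

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collects (level, text) pairs for h1-h6 tags, roughly the way
    a crawler reads a page's header hierarchy."""
    def __init__(self):
        super().__init__()
        self.outline = []
        self._level = None  # heading level we are currently inside

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self._level = int(tag[1])

    def handle_data(self, data):
        if self._level is not None:
            self.outline.append((self._level, data.strip()))
            self._level = None

page = """
<h1>Acme Widgets</h1>
<h2>Our Products</h2>
<h3>The Deluxe Widget</h3>
"""
parser = HeadingOutline()
parser.feed(page)
print(parser.outline)  # [(1, 'Acme Widgets'), (2, 'Our Products'), (3, 'The Deluxe Widget')]
```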
Hidden Text and Hidden Links
Hidden Text and Hidden Links are Web page elements that appear invisible to visitors but not to search engine crawlers.
Text and links can be hidden by using fonts that are the same color as the background or placing text and links behind pictures or other objects on a Web page.
In the days before major search engines launched algorithm updates like Google Panda, hiding text and links was a somewhat common activity practiced by search engine optimization (SEO) professionals. Back then the idea was to load pages with keywords and links in the hope that the search engine algorithm would assign them a high rank on the search engine results page (SERP).
Hiding text and links for SEO purposes is frowned upon by search engines and can result in Web pages being penalized and pushed down the SERP.
A Hit is a unit of measurement used by the online marketing and advertising industries; it counts the number of times a file contained within a website is accessed.
Website hits are not counted in the same fashion as clicks. The number of hits a website gets is a useful metric, but search engine optimization (SEO) professionals look beyond this simple metric to measure the effectiveness of their work.
Tools like Google Analytics and Webtrends allow SEO and reputation management professionals to understand how visitors are using the site, where they come from, the motivation behind their visit, and how the different pages perform in terms of browsing. This is a valid method to analyze return on investment (ROI), but it can also shed light on other important factors.
The Home Directory is the main index page of a website.
The Home Directory is crucial to the browsing experience for Web design and search engine optimization (SEO) purposes. Visitors who arrive at the home page from the search engine results page (SERP) need to be engaged and feel comfortable with the initial stage of their visit, otherwise they will click on their browser's back button. If they leave too soon, they will increase the bounce rate of the website.
Home Directories typically sit in the root of a Web server; this allows SEO professionals to work from the top down when it comes to placing certain keywords and links that will make the site rank higher on the SERP.
A Hostile Entity is an individual or a group who denigrate people or organizations using the Internet as a platform for their attacks.
Differences of opinion are expected on the Internet; for this reason, a single unfavorable comment or article against an individual or a business does not constitute a hostile entity. Repeated negative comments coming from the same person, group, or website, however, constitute the work of a hostile entity.
Hostile entities may have ulterior motives, and their methods are sometimes predictable. Disgruntled former employees can turn into hostile entities, and they may even recruit accomplices to assist them in their endeavors.
A Hostile Environment is a search engine results page (SERP) thoroughly populated with content created by hostile entities. When the work of hostile entities goes unchecked, their negative content may organically become top search results for certain keywords. A SERP that returns nothing but negative content aimed against individuals, brands or business entities is an extremely hostile environment that can turn into a full-fledged crisis for the victims. Concerted attacks may use black hat SEO techniques to accelerate harm.
Identity theft is a crime that involves stealing personal information for the purpose of supplanting someone's identity typically with financial gain as an intent.
Identity Theft has become a serious problem in the United States; even children are being targeted, and even people one might trust, like bank tellers, are getting into this lucrative business. According to surveys conducted by industry groups, more than two percent of American households have experienced identity theft cases involving minors. The growth of social media and mobile technologies is exacerbating the problem. From 2010 to 2012, there was a 67 percent increase in data breaches involving smartphones.
Identity Theft is one of the most common types of cybercrime. In some instances, online identity theft is used as part of a greater pattern of crime that may include cyber stalking or cyberbullying.
One way to mitigate identity theft before it happens is to engage in "identity management" by effectively having two or more online identities. This won't solve the problem of having your real identity stolen, but it can muddy your digital trail significantly, making it more difficult for would-be attackers to target you.
An Image Map is a single picture that contains hyperlinks placed across different parts of the image.
Image Maps are examples of creativity in Web design and development that are often visually pleasing, but they are not always a good idea in terms of search engine optimization (SEO).
The problem with image maps is that search engine crawlers do not see or understand them in the same fashion as humans do. There are, however, some methods for making image maps SEO friendly. Placing the image map's links at the bottom of the graphic or in the footer of the page is one such method.
Inbound Links or Backlinks are hyperlinks in HTML documents that point to a webmaster's own site. They are placed on external pages and are important factors in terms of search engine optimization (SEO).
Before Google rolled out its Panda search algorithm update in 2011, Inbound Links were crucial to SEO professionals. Part of Google's algorithm before Panda consisted of assigning rank to pages that enjoyed a high number of Inbound Links from across the Internet.
The Inbound Link strategy to boost the rank of a website on the Google search engine results page (SERP) led to some questionable SEO tactics that ended up creating lots of low-quality online properties. The Panda update changed the SEO landscape with regard to quality versus quantity, but Inbound Links are still significant to SEO and reputation management professionals.
A Search Engine Index is a list of Web pages contained in a database.
A Search Engine Index is usually sorted alphabetically. Each entry or record in the index corresponds to a list of documents. The websites listed on the search engine results page (SERP) indicate Web pages that contain words that have been indexed. At any given time, a search engine also stores documents that have not yet been indexed.
The size of a search engine index is mind-boggling. In 2012, Google kept more than 50 billion pages on its index, which was distributed among some 900,000 servers located around the world. Feeding pages to this index is the work of the Googlebot crawler, which passes copies of Web pages to an indexing program that sorts text, programming code and other media elements before storing them in a massive database.
Indexing is the function performed by a search engine indexer after a crawler copies the content of a Web page.
Crawlers essentially copy Web pages; the work of the bot is passed on to an indexer program. This is where the search engine algorithm is applied and Web pages are ranked.
Indexing is carried out by a software application that analyzes the full text of the Web pages captured by the search engine crawlers. Good search engine optimization (SEO) work is rewarded by the indexer with a high rank on the search engine results page (SERP) for certain keywords that match incoming queries. Some of the functions performed by the indexer include counting the number of times keywords appear in Web documents and recording their location in the page.
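The counting-and-locating work an indexer performs is essentially the construction of an inverted index. A minimal sketch, using invented URLs and text:

```python
from collections import defaultdict

# Tiny corpus standing in for crawled pages (hypothetical URLs and text).
pages = {
    "site-a.com": "reputation management for small business",
    "site-b.com": "search engine optimization and reputation repair",
}

def build_index(pages):
    """Inverted index: each keyword maps to (url, count, positions),
    mirroring the counting and locating work described above."""
    index = defaultdict(list)
    for url, text in pages.items():
        positions = defaultdict(list)
        for pos, word in enumerate(text.split()):
            positions[word].append(pos)
        for word, locs in positions.items():
            index[word].append((url, len(locs), locs))
    return dict(index)

index = build_index(pages)
print(index["reputation"])  # [('site-a.com', 1, [0]), ('site-b.com', 1, [4])]
```

A query for "reputation" can then be answered by a single dictionary lookup rather than a scan of every stored page, which is what makes a search over billions of documents feasible.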
Internet Harassment is the online tormenting of an individual or an organization; it is further classified under cybercrime offenses like cyber stalking and cyberbullying.
Internet Harassment fits the definition of a crime in most of the United States. Depending on the jurisdiction, Internet harassment may be considered a misdemeanor or a felony.
Internet Harassment is an unfortunate sign of the times. Malevolent individuals often test the limits of free speech online by publishing negative comments that often escalate into personal attacks and threats. Devious and cunning hostile entities often become familiar with Internet harassment laws so that they can carefully continue their vicious attacks without attracting the attention of law enforcement. Civil remedies, however, may still apply and reputation management firms may be called in to help remedy the damage.
Internet Marketing is the communication and promotion of products, services, brands, organizations, and individuals online.
Internet Marketing is a fast-moving industry that is increasingly dependent on real-time interaction. Brands and companies are now more visible than ever, and the advent of smartphone technologies demands that companies pay special attention to their online image, branding and social media engagement. The future of Internet marketing is being shaped by the expectations of the users; they want to feel engaged and empowered.
The continuous growth of the Internet calls for improved marketing, advertising and public relations. Internet marketing is a very competitive field made up of different specialties. Search engine optimization (SEO) and reputation management are some of the most prominent specialties within Internet marketing, followed by advertising and social media management.
Internet Privacy is the ability of individuals to control the flow of information and have reasonable access to data generated during a browsing session.
The Post-Privacy World in Which We Live
Privacy is a major concern for all Internet users, but it is becoming more difficult to maintain a reasonable expectation of privacy online. One of the problems with Internet privacy is that many users assume they have control over their information. This is often not the case, particularly when they engage in activities such as online social networking, which is essentially based upon the sharing of personal information. There are entire industries devoted to piercing the veil of privacy, with entire zombie armies at their disposal for both commercial and nefarious reasons. As practitioners of online reputation management, we frequently help people and companies pick up the pieces after an internet privacy snafu.
The issue of Facebook privacy is often raised. According to Facebook, users "own all of the content and information" they post on Facebook. Facebook goes on to say that "When you publish content or information using the Public setting, it means that you are allowing everyone, including people off of Facebook, to access and use that information, and to associate it with you." As hard as Facebook has worked, many people still don't understand, or use, its privacy functions adequately. To Facebook's credit, it has tried. But the fact is, we just don't have time. Learning the ins and outs of privacy in today's busy world is secondary until there is a problem.
When something is posted publicly on any website, everyone has access to it. Search engines make the information even more accessible, and anyone (including internet bots) can copy the information and store it indefinitely. The web has become so complex that knowing and controlling the privacy settings of all of the websites a person uses has become nearly impossible. Internet privacy settings are seemingly ever-changing.
Protecting Internet Privacy
The first rule for protecting privacy on the internet is "think before you post."
The second rule is "check your privacy settings." At Reputation X we advise clients to check all of their privacy settings on Facebook, LinkedIn, Twitter, and on their own websites. Our advice: learn about your rights and learn about your settings. The biggest issue is Facebook, simply because of its size. Facebook privacy is discussed here.
Rule three: ask friends to understand their privacy settings, and let them know you care about your privacy. Remember, if a friend hasn't set their privacy settings properly and they share a picture of you at that college party, all of the privacy measures you have taken won't matter. "Friends" are often the biggest privacy leaks out there.
The Problem with Friends and Privacy
To illustrate the problem with friends sharing, we often point clients to the problems others have had. Google will remove content that is sexually problematic (see this blog post about removing various kinds of information). If someone else owns the copyright to an image and posts it online, you cannot control it. But if you took the photo yourself, a selfie for example, you generally own the copyright (speak to a lawyer). Friends, especially intoxicated ones, are prolific sharers, and they may have set their Facebook or Instagram settings to "public". The downside for you is that no matter how hard you protect your own privacy, your "friends" can lay your plans to waste with a click. Take it from us: removing negative internet content is difficult and expensive.
Your privacy extends to your IP address. Every time you visit a website, your IP address is logged. Your IP address tells the website (and the people who run it) approximately where you are. Have you ever gone to a shopping site and noticed a big message at the top that says "We ship to (your city or state)!"? They know your city or state based on your IP address.
Almost anyone can find your IP address, and with it they will know where you are. This blog post outlines how we found professional thieves by using free tools online. If we can do it, pretty much anyone else can as well.
To protect your IP address you can use a proxy service like Tor, which is described to some extent in this post on how hackers are hired. Of course, the very good people at Tor will explain it much better than we can, so point your browser there and read all about it.
A web cookie is just a text file placed on your computer, usually by a website. Some cookies are 'session' based, meaning they only work while you are on a site; others are 'persistent', meaning they continue to exist long after you have left. Normally, a cookie tracks your visits so the website knows you are a returning visitor, or which ads to show you. Internet cookies are necessary for the web to function the way we've come to expect, but they are also viewable by third parties and have an impact on internet privacy. We've probably already placed a cookie on your browser, and we're probably tracking you now - but we're really nice, so don't worry about it. :-)
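The session-versus-persistent distinction comes down to whether a cookie carries an expiry. A small sketch using Python's standard http.cookies module; the cookie names are illustrative:

```python
from http.cookies import SimpleCookie

# Build the Set-Cookie headers a site might send (names are invented).
cookie = SimpleCookie()
cookie["session_id"] = "abc123"          # no expiry -> session cookie
cookie["returning_visitor"] = "yes"
cookie["returning_visitor"]["max-age"] = 60 * 60 * 24 * 365  # 1 year -> persistent

for morsel in cookie.values():
    # A Morsel with no max-age (or expires) vanishes when the browser closes.
    kind = "persistent" if morsel["max-age"] else "session"
    print(f"{morsel.key}={morsel.value} ({kind})")
```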
One of the easiest ways for people with dark purposes to begin to learn about your existence is chain email. Everyone has a friend who passes along jokes to an entire list of their friends; the Reply All emails often end up getting forwarded to hundreds of other people, the email gets longer and longer, and each iteration adds more clearly visible email addresses. Eventually a spammer, web robot, or other internet opportunist receives the email. They now have the email addresses of what is essentially your personal social network. They can hijack your email address, send emails to your friends pretending to be you, and work all kinds of mischief. This is one reason we like to use different online identities, which we call personas.
Reputation management firms often advise their clients about the adequate amount of information they should make public as part of projecting a positive image.
An Internet Protocol (IP) Address is a numerical string assigned to a resource that connects to the Internet.
The IP addressing system enables the discovery and exchange of information across the Internet. The resources that can be identified with IP Address include desktops, servers, mobile computers, smartphones, websites, virtual machines, printers, routers, and others.
There are two IP addressing systems in use these days: IPv4 and IPv6. IPv4 addresses have four groups of numbers that contain one to three decimal digits each. IPv6 addresses, which consist of eight groups of four hexadecimal digits, gained prominence in 2012 due to the inevitable depletion of IPv4 addresses.
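Python's standard ipaddress module illustrates the two formats. The addresses below come from standard example/documentation ranges:

```python
import ipaddress

# IPv4: four decimal groups; IPv6: eight hexadecimal groups.
v4 = ipaddress.ip_address("93.184.216.34")
v6 = ipaddress.ip_address("2001:db8:0:0:0:0:0:1")

print(v4.version, v4)           # 4 93.184.216.34
print(v6.version, v6.exploded)  # 6 2001:0db8:0000:0000:0000:0000:0000:0001
print(v6.compressed)            # 2001:db8::1 (runs of zero groups collapse to ::)
```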
IP Spoofing is the deliberate modification of Internet Protocol data headers for the purpose of forging an IP address.
IP Spoofing is a technique used by outlaw hackers or cybercriminals to distribute malicious code, gain unauthorized entry to private networks, launch denial-of-service attacks, and other nefarious activities. The basic intent is to hide the identity of the malefactor or to bypass a computer security system by taking on the identity of a trusted computer.
Some users use Web proxies to gain some level of anonymity by using another resource to browse the Internet. Others make use of virtual private networks (VPN) or virtual machines. These methods may conceal IP addresses to a certain extent, but they do not forge or spoof them.
Internet trolls are people who post abusive, inflammatory, or off-topic responses in online community forums or blogs in order to provoke emotional responses from readers.
Java Applets are small programs typically inserted into Web pages and delivered to browsers for further execution by Oracle's Java Virtual Machine.
Java Applets truly came of age in the 21st century thanks to the Web 2.0 paradigm of cloud computing, Internet-based applications, and dynamic websites. The code for Java applets resides on a server and waits to be called up by an HTTP request. The program can then be embedded into a Web page, displayed in a browser, and executed by the running Virtual Machine process.
Examples of useful Java applets include virtual keyboards, command interfaces, media players, games, and information tickers.
Junk Pages are HTML documents loaded with keywords, links or ads that do not serve any real informational purpose other than spamdexing.
Junk Pages are essentially the rascally work of unscrupulous search engine optimization (SEO) practitioners who are typically after hits, clicks and page views for financial gain. These pages involve the use of dubious techniques like cloaking, hidden text and duplicated content to rank high on the index of search engines. Junk websites were a thorn in Google's side for much of the early 21st century, at least until the Panda algorithm update was applied in 2011. A search engine that does not make efforts to remove junk pages from its index runs the risk of losing market share.
Keyword Phrase Research is a search engine optimization (SEO) process that evaluates terms used by Internet searchers in relation to their needs.
To identify the potential customers of their clients, SEO professionals evaluate business models, industries, brand awareness, and the online behaviors of searchers who fit certain consumer profiles. They start with broad sets of keywords and focus on the most popular ones. These terms become primary targets for SEO.
Online reputation management specialists look at negative keywords that might affect the Internet image of their clients. Technologies such as Google Suggest and Insights may also provide valuable keyword information about trends and sentiments of consumers towards specific brands.
Keywords and Keyword Phrases are terms used by Web searchers that tend to define the Internet content they are looking for.
Keywords are the most substantial elements of the search engine optimization (SEO) and online reputation management industries. Internet searchers have come to trust major search engines like Google to deliver relevant, high quality results in relation to their queries. To this effect, the data retrieved by search engine crawlers is compared and categorized by indexer applications according to keyword databases.
Good online reputation management companies monitor the keywords relevant to their clients and look for shared name spaces, conflicting name spaces, hostile environments, and other factors that can affect the positive Internet image of the individuals and organizations they serve.
Keywords Meta Tags are HTML elements in Web pages that list the keyword phrases relevant to the document.
A keyword meta tag is invisible to website visitors as it lives in the HTML of the web page. They were once frequently used by search engine optimization (SEO) professionals to inform crawlers and indexers about the primary keywords that the content on a Web page was created around. Keyword phrases in meta tags are separated by commas, and these meta tags are placed before the description tags.
Some major search engines have changed the behavior of their crawlers and the functions of their algorithms to ignore keyword meta tags, but many SEO and reputation management companies still include these tags for the rest of the search engines that still scan them.
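A keywords meta tag can be read with a few lines of code. This sketch uses Python's standard html.parser; the tag content is invented for illustration:

```python
from html.parser import HTMLParser

class MetaKeywords(HTMLParser):
    """Pulls the comma-separated phrases out of a keywords meta tag."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "keywords":
            content = attrs.get("content", "")
            self.keywords = [k.strip() for k in content.split(",")]

head = '<meta name="keywords" content="reputation management, SEO, online reviews">'
parser = MetaKeywords()
parser.feed(head)
print(parser.keywords)  # ['reputation management', 'SEO', 'online reviews']
```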
The Knowledge Graph is a feature in some search engines that displays informational snippets related to the query.
Google introduced its knowledge graph in 2012 as a right sidebar that pulls information and data from Wikipedia, Google Maps, the CIA World Factbook, and other trusted sources. Bing has its own version of a knowledge graph located just to the left of its social media sidebar.
The knowledge graph is reminiscent of question-and-answer (Q&A) sites like Quora and the Wolfram Alpha computational search engine; it is part of an emerging trend in search engine technology that favors semantics and knowledge discovery.
Latent Semantic Analysis is a computational method of analysis whereby documents are compared and checked for the natural occurrences of certain words and their relationships.
With latent semantic analysis (LSA), search engines can automate the process of establishing what a Web page is really about. Google has been investigating the use and integration of LSA into its algorithm to recognize certain text patterns and improve the quality of its massive indexing database. Search engine optimization (SEO) professionals have observed that Google's use of LSA in its search algorithm could mean that the company is looking to separate content words from functional words and assigning values to them, thus changing the search engine landscape.
Libel is an unlawful activity that consists of the communication or publication of defamatory falsehoods against an individual, a brand, or an organization.
Online libel and defamation on the Internet are malicious acts that unfortunately increase in occurrence as the World Wide Web reaches more households and businesses. The Internet has become a hotbed of defamation due to the ease of online publishing and the different methods used by perpetrators to conceal their identity.
Libel is one of the main reasons online reputation management firms are retained these days. The nefarious work of wicked slanderers tends to remain on the Internet even after law enforcement and the judicial system catches up to them. With the right reputation management strategy, a positive online image can be restored.
A Link is an element of information systems that serves as a reference to access a resource on a network.
Hyperlinks are intrinsic elements of the Internet. They are usually coded in HTML and typically refer to a target on the World Wide Web identified by a Uniform Resource Locator (URL) or an Internet Protocol (IP) address. The targets can be Web pages, sites, documents, or even a specific position within them.
Search engine optimization (SEO) professionals and reputation management specialists spend considerable time evaluating links. Inbound links contribute to the ranking of a Web page on a search engine results page (SERP); outbound links must be carefully chosen so that they do not point to dead links, doorway pages or link farms.
Link Anchor Text is the visible, clickable part of a hyperlink on a Web page; it describes what a visitor can expect to happen once they click on the link.
The anchor text of a hyperlink can be a string of characters, a single term, a phrase, or a full sentence. Anchor text composition is a special consideration of the search engine optimization (SEO) field, but there are different opinions on the matter. There is a consensus that supports the use of keywords as the anchor text of inbound links to increase SEO and search engine ranking, but there are arguments in favor of composing simple calls to action such as "Click Here" for maximum effect and effective conversion rates.
Link Equity is a measurement of the value of a Web page in terms of search engine optimization (SEO).
Link Equity is mostly used to refer to the Page Rank that Google assigns to a Web document on its search engine results page (SERP); this is determined by the longevity of a page, the number of inbound links, the quality and relevance of the websites that point to it, the popularity of the document, the relevance of the keyword phrases contained therein, the quality of the content, and other factors formulated by search engine algorithms.
Obtaining favorable link equity is part of a good reputation management strategy to ensure that positive content that boosts the Internet image of clients ranks high on the SERP.
Link Exchange is the practice of trading an outbound link for an inbound link in a reciprocal manner. Public link exchanges are largely obsolete at this point.
The term link exchange also refers to a website directory established for the purpose of improving search engine optimization (SEO) and helping overall marketing efforts. The idea is to drive traffic to all the websites that subscribe to the link exchange service by means of reciprocity. The premise of reciprocal links dates back to the era of web rings, whereby groups of sites sharing the same interest exchanged links and invited visitors to explore.
Reputation management firms once used specialized link exchange services for SEO purposes and to boost the search engine rank of the positive Web content they create on behalf of their clients, but today such exchanges are private due to changes in search engine algorithms.
A Link Farm is a website that serves no other purpose than to provide inbound links to other sites.
Link Farms are typically devoid of content and are sometimes created under a domain squatting scheme. Webmasters who maintain link farms often participate in dubious search engine optimization (SEO) practices such as cloaking and spamdexing to push their sites to the top of the search engine results page (SERP). Their main objective is to increase the SERP rank of the Web pages listed in the farm.
Ever since Google and other major search engines clamped down on link farms by updating their algorithms, SEO and reputation management practitioners have steered clear of them.
Link Popularity is a measurement of a Web document's relevance with regard to search queries.
Link Popularity is also known as Link Equity or Google Page Rank. It determines how high on the search engine results page (SERP) a Web page will appear based on the quantity and quality of the inbound links that direct visitors to it. For example, if the About Us page of a company is linked to as a reference in multiple Wikipedia articles, the link popularity for that website will climb and a higher SERP rank will be obtained.
Link Popularity is an essential goal of the reputation management industry when cultivating a positive Internet image for clients.
A Link Warning is a cautionary message issued by Google on its Webmaster Tools with regard to potentially harmful inbound hyperlinks.
The link warning is often phrased as a notice from Google Webmaster Tools announcing the detection of unnatural links pointing to a website. This can be interpreted in different ways: for example, a webmaster may be warned about questionable linking practices intended to boost the search ranking of a Web page. Another cause might be a rogue search engine optimization (SEO) campaign waged against a website for malicious purposes. Reputation management firms are sometimes retained to ameliorate the negative effects of link warnings.
Linking is the practice of inserting coded hyperlinks in a Web document for the benefit of visitors.
Good linking practices are the cornerstone of providing a positive user experience on the World Wide Web. Linking refers to outbound links in general, but pointing to internal resources in a website can also augment the user experience of visitors while aiding the search engine optimization (SEO) process.
A good top-level case study in proper linking practices is Wikipedia. A well-edited Wikipedia article contains the right blend of deep links to other articles alongside external references. The linking practices of SEO and reputation management professionals require careful considerations, from the relevance of the links to the anchor text utilized.
Links are elements of HTML coding that contain references to resources on the Internet.
A Log File is a document that displays a chronological list of events that take place in a server.
Log Files serve a range of purposes; they are created by operating systems, software applications and network systems. Log Files created by Web servers give network administrators and webmasters a record of visits. The information collected can be as simple as the Internet Protocol (IP) address of the visitor and the time a website was accessed. More sophisticated log files can reveal information about visitors and the resources utilized during each visit.
Log Files form the basis of analytics and deep traffic analysis, a practice frequently undertaken by search engine optimization (SEO) and online reputation management professionals.
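As a minimal sketch of how log files feed traffic analysis, the snippet below parses one line of the Apache/NCSA Common Log Format. The sample IP address and request are illustrative values, not taken from any real log.

```python
import re

# Sketch: parse one line of the Apache/NCSA Common Log Format.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)'
)

line = ('203.0.113.7 - - [10/Oct/2023:13:55:36 +0000] '
        '"GET /index.html HTTP/1.1" 200 2326')
entry = LOG_PATTERN.match(line).groupdict()
# entry["ip"] is the visitor's IP; entry["status"] is the HTTP response code
```

Aggregating fields like `ip`, `time` and `request` across millions of such lines is the raw material of the deep traffic analysis described above.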
Machine Learning is a field of computer science and artificial intelligence that aims to build systems that perform actions based on discovery and logic.
Machine Learning has greatly advanced in the 21st century thanks to progressive developments in information technology. Wall Street investment firms have used machine learning in recent decades to come up with algorithmic and high-frequency trading to give them an edge over human traders.
A common application of machine learning as it applies to the Internet is spam filtering in email programs. Search engine technology is increasingly relying on automated techniques like machine learning and latent semantic analysis for indexing and ranking purposes.
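The spam-filtering application mentioned above can be sketched with a toy word-frequency scorer in the spirit of naive Bayes classification. The tiny corpora and the `spam_score` function are invented for illustration; real filters are trained on large labeled datasets.

```python
import math
from collections import Counter

# Toy training data (illustrative only)
spam_corpus = ["win free money now", "free prize click now"]
ham_corpus = ["meeting agenda for monday", "lunch next week"]

def word_counts(corpus):
    counts = Counter(w for msg in corpus for w in msg.split())
    return counts, sum(counts.values())

spam_counts, spam_total = word_counts(spam_corpus)
ham_counts, ham_total = word_counts(ham_corpus)

def spam_score(message):
    """Log-likelihood ratio with add-one smoothing.
    Positive score -> the message looks more like spam."""
    score = 0.0
    for w in message.split():
        p_spam = (spam_counts[w] + 1) / (spam_total + 1)
        p_ham = (ham_counts[w] + 1) / (ham_total + 1)
        score += math.log(p_spam / p_ham)
    return score

spam_score("free money")  # positive, so the message is flagged
```

The "learning" here is simply counting which words appear in known spam versus known legitimate mail; production systems apply the same principle at far greater scale.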
Earned, owned and paid media are three different means by which a business can publicize its brand. Used together, they create a more comprehensive marketing strategy that improves the visibility and success potential of a company's brand.
- Earned media requires involvement from the public in the form of shares, mentions, reposts, and reviews.
- Owned media is the online property of the business and includes websites, blogs, and social media channels.
- Paid media utilizes advertising and paid promotion to broaden exposure of a brand.
A Meta Description consists of a phrase or small paragraph that tells site visitors about what they will see when they click on a link.
Meta Descriptions are often, but not always, used by search engines in the snippets they show visitors on the search engine results page. They give readers a good idea about what will happen once they click, tap or swipe a Web page element. Descriptions are most commonly composed as one or two concise sentences that summarize the page.
Proper composition of descriptions is crucial to the work of search engine optimization (SEO) and reputation management professionals. It is also important to ensure that the link is not incorrect or broken. A description makes a promise; if the site fails to deliver on this promise once the visitor clicks, that visitor may leave the Web page and choose not to come back.
A Meta Search Engine allows Internet users to enter queries and get results from multiple decision engines in a neatly formatted Web page.
Meta search engines are essentially data aggregators that base their business model on the premise that Internet searchers can improve upon their user experience by evaluating results from competing search engines. In most cases, meta search engines do not utilize their own crawlers, indexers or search algorithms. Since they usually return search results that rank high on the search engine result page (SERP) of competing decision engine providers, search engine optimization (SEO) and reputation management professionals aim to give their clients the best rank for specific keywords on different search providers.
Meta Tag is the element of HTML programming that contains information about a Web document but is not normally displayed in a browser.
Meta tags store metadata, which is information about the Web document they are coded in. Meta tags can define attributes such as keywords, locations, languages, and more.
Search engine optimization (SEO) professionals heavily focused on the use of meta tags in the late 20th century thanks to the algorithms of decision engines such as AltaVista and Infoseek. Misuse of meta tags by unscrupulous SEO practitioners, however, prompted Google and other major search engines to change their algorithms in this regard. Meta tags may still be used these days by SEO and reputation management specialists since not all search engines have chosen to ignore them.
Mirror Sites are websites that are identical in every sense except for the domain they occupy.
Mirror sites are essentially copies of entire websites that reside in separate Web servers. There are several reasons to justify mirroring a website. Businesses use them as part of their disaster management policies. Busy sites may also deploy mirrors as a way to efficiently manage major spikes in traffic. Other reasons include censorship and redundancy in case one server becomes subject to a denial-of-service attack.
Since site mirroring may be confused with duplicate content by search engine algorithms, search engine optimization (SEO) and reputation management professionals take special precautions when handling mirrors.
Name Space is the aggregate of all Internet search results that appear when the name of a brand, organization or individual is queried.
Name Spaces are rarely exclusive; in many cases, they are shared or contested. Examples of shared name spaces for individuals include "Jane Smith" and "Michael Johnson." In the case of business entities, a classic example would be "ABC company." Contested name spaces occur when same-named entities try to rank higher on the search engine results page (SERP). This creates a challenge for online reputation management firms that aim to gain SERP rank for clients who have common names, but there are methods that can be used to foster a favorable disambiguation.
Natural Search Engine Optimization (SEO), also known as organic SEO, is the process of building relevancy and authority for a Web page without using paid search advertising.
Modern search engines have different panes or sections in the Web document they use to display their results. This Web page is known as the search engine results page (SERP), and it may contain a mix of organic and sponsored search results. Not all sponsored search results are paid ads; some are placed by the search engine provider as part of a public service announcement.
The natural SEO process is dynamic due to the algorithmic changes of the search engine providers and the trends that emerge among users. Both SEO and reputation management specialists include natural SEO as part of the service they extend to their clients.
Negative SEO results when unscrupulous characters (typically a competitor) point unnatural links (links built quickly and in large numbers) to a particular site in the hope of harming its ranking or getting it penalized by Google.
The .NET website ending is often your second strongest directly controlled web property. Correctly designed and developed with excellent, highly relevant content, a website with your search phrase as the domain name and ending with .NET often ranks very well in search results. This article discusses the .NET TLD and its place in your reputation arsenal.
The ending of a web address is called a Top Level Domain (TLD). Originally the .net domain was for companies that engaged in networking technologies, but that has changed. Today anyone can get a .net domain name. Your website should have your key phrase in the domain. So if your key phrase (search phrase) is 'blue widgets', your website should be 'bluewidgets.net'. We suggest building a .com website, a .net and a .org. Each website should have your key phrase in the name without dashes.
A Niche Market is a subset or segment of a larger market or industry that deals with specific consumers, audiences, goods, and services.
Niche marketing on the Internet involves understanding the specific needs, motivations and behaviors of online consumers in relation to brands, organizations or people. Niche markets are particular in the sense that they are sometimes created by brands and emerging socioeconomic and cultural trends.
The search engine optimization (SEO) and reputation management industries constantly deal with niche marketing as opposed to mass marketing. The processes of keyword research, key outcome indicator (KOI) analysis, search ranking, and specific content creation are inherent to niche marketing.
NoFollow is an attribute that a webmaster or a programmer can assign to a hyperlink on a Web document with the intention of preventing search engine crawlers from following an outbound link.
The nofollow attribute is used by search engine optimization (SEO) professionals in different manners. Google advises webmasters to code nofollow into advertisement hyperlinks; in this case, nofollow will prevent Google from thinking that a Web page with a lot of links is a link farm or a spamdexing site.
Nofollow is often used by webmasters of blogs that have a vibrant community of readers who are active in the comments section. Spammers often target these high-traffic sites to promote their links; this practice can be limited with a plugin that automatically assigns nofollow to links left by scoundrels.
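The comment-section plugin described above can be sketched as a small rewriting step that adds `rel="nofollow"` to untrusted links. This regex-based version is illustrative only; a robust implementation would use a proper HTML parser rather than regular expressions.

```python
import re

def add_nofollow(html):
    """Add rel="nofollow" to anchor tags that lack a rel attribute.
    Simplified sketch; real plugins parse the HTML properly."""
    def fix(match):
        tag = match.group(0)
        if 'rel=' in tag:
            return tag          # leave existing rel attributes alone
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r'<a\s[^>]*>', fix, html)

comment = '<p>Nice post! <a href="http://spam.example">cheap pills</a></p>'
add_nofollow(comment)
# the anchor now carries rel="nofollow", so crawlers will not pass
# link equity to spam.example
```

Applying this to every visitor-submitted comment removes the incentive for spammers to flood a blog's comment section with links.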
NoIndex is an attribute in HTML coding that prevents a Web document from being indexed in a search engine.
NoIndex is a meta tag that speaks directly to the search engine crawlers or robots. The message conveyed by the noindex meta tag is for the crawler and indexer programs to stay away from the Web page or document. The noindex tag can be used by search engine optimization (SEO) and reputation management professionals to keep specific content from being indexed, meaning that it will be invisible to Internet searchers. Noindex can be used for testing purposes, or in certain cases when duplicate content may be a concern.
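For reference, the noindex directive is commonly expressed in one of two ways, shown here as Python string constants for illustration: as a robots meta tag placed in the page's head, or as an `X-Robots-Tag` HTTP response header sent by the server.

```python
# The two common ways to express a noindex directive (illustrative):
META_NOINDEX = '<meta name="robots" content="noindex">'  # in the page <head>
HTTP_HEADER = "X-Robots-Tag: noindex"                    # sent by the server
```

The header form is useful for non-HTML resources such as PDFs, where there is no head section in which to place a meta tag.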
On-page SEO factors are steps taken on the target site to improve search engine rankings. They include title tag, meta description, content, and linking best practices on the site whose rankings are being improved.
An Online Audit is a process of inquiry and study into the visible online perception of a brand, a company or a person.
The online audit is one of the initial steps taken by good reputation management firms when they are retained by clients. The online audit consists of an assessment of the client's Internet image. It begins with a series of search engine queries followed by analysis of the name space and the accuracy of the content found in relation to the client.
Other factors assessed in the online audit include the ratio of good mentions in comparison to negative commentary; this sense of a general perception gives reputation managers a reference point from which strategies can be formulated.
Online Communities are places on the Internet where individuals can gather, communicate and share electronic media.
The concept of online and virtual communities has been vital to the growth of the Internet since its infancy. Long before the use of online social networks such as Facebook and Twitter became a quotidian matter, Usenet groups, LISTSERV electronic mailing lists and the Internet Relay Chat (IRC) networks were perfect examples of virtual communities. Long before AOL became a digital media company on the World Wide Web, it was mostly known as an electronic community.
Reputation management specialists pay close attention to what members of online communities are saying about their clients through the process of online monitoring.
The Online Image of a brand, organization or a person is defined by the general sentiment brought about by the aggregate of available information on the Internet.
The preponderance of Google has caused its first search engine results page (SERP) to be almost synonymous with an online image. For this reason, search engine optimization (SEO) is a major aspect of the work done by online reputation management firms.
A positive online image is crucial to business and personal success. Just like in the real world, a solid Internet image is hard to cultivate but easy to lose. Reputation management firms protect and improve the Internet image of their clients, particularly on the major search engines. They accomplish this through a combination of content creation, SEO and constant online monitoring.
Online Monitoring is a component of the reputation management process whereby information related to brands, groups or individuals is frequently reviewed and evaluated.
Online Monitoring can be conducted on a daily, weekly or monthly basis. It is a listening process that targets Web pages, blogs, Internet forums, and online social networks. Online monitoring is paramount to reputation management due to the empowering nature of the Internet and the way it encourages people to go online and vent their frustrations instead of establishing direct communications.
Constant online monitoring allows reputation managers to act quickly and detect hostile entities, hostile environments or emerging issues related to contested name spaces.
Online privacy, also known as Internet privacy, involves the right of privacy with regard to the storage, use, dispensation, and display of information about a person on the web. In other words, what people can see about you online.
Online privacy is a controversial issue.
Online Reputation is the general opinion and social sentiment expressed about brands, individuals and organizations on the Internet.
The reputation of a business or a person can differ considerably in the online and offline realms. Cultivating a positive reputation online is a task that is often more delicate and intricate to accomplish than in the offline dimension. Due to the pseudonymous nature of interaction on the Internet, members of online communities depend on their good reputation to instill a sense of trust.
Online Reputation is subject to various dangers. The first threat comes from established online news media outlets, closely followed by websites specifically designed to collect grievances. Social media is emerging as one of the most influential components that can shape perception of online reputation.
Online Reputation Management is a professional practice that borrows elements from public relations and search engine optimization (SEO) to improve the Internet image of brands, organizations and individuals.
The Online Reputation Management process consists of image auditing and frequent monitoring of the client's Internet presence, along with content creation and favorable search engine placement. The strategy to follow will depend on the amount of negative sentiment revealed by the online image audit, the industry and the intended audience. The analysis, reporting and strategic execution is performed in cycles. Reputation management firms are also retained by victims of cyber stalking and other types of Internet harassment to improve their online image after vicious attacks.
Organic Listings are algorithmic search results resulting from queries that match the content of Web documents returned by a decision engine.
Organic Listings are also known as organic search results. These are Web pages that enjoy a high rank on the search engine results page (SERP) by virtue of meeting the criteria set forth by the search algorithm.
Paid search listings may appear alongside organic listings, and in many cases the paid results may enjoy higher click-through rates (CTR), but only organic search results have true link value. One of the objectives of the search engine optimization (SEO) industry these days is to cultivate organic listings through the creation of fresh, quality content that can serve as the basis for engagement on social media platforms.
A negative review is one that is either inaccurate, less than flattering, or does not comply with the terms of service for the review site the review was posted on.
PageRank is a score from 0 to 10 assigned by the Google search engine algorithm to a Web document in terms of how relevant it is to certain queries.
PageRank is a technology patented by Google named after Larry Page, co-founder and current CEO of the company. It is often written separately as "page rank," and it has become synonymous with the relevance of results presented by all search engines. Although the PageRank algorithm is a closely-guarded secret, one of its major factors is the quantity and quality of the inbound links that point to a page. This is often called "Google juice" by search engine optimization (SEO) professionals.
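The core inbound-link idea behind PageRank can be illustrated with a short power-iteration sketch. The graph, function name and parameters below are invented for this example; the real Google algorithm incorporates many additional, undisclosed signals.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Minimal power-iteration sketch of the original PageRank idea.
    `links` maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:            # dangling page: spread rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

graph = {"home": ["about"], "about": ["home"], "blog": ["home"]}
ranks = pagerank(graph)
# "home" ends up with the most rank: it has the most inbound links
```

Each page's score is distributed to the pages it links to, which is why the quantity and quality of inbound links ("Google juice") matter so much.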
Paid Inclusion is an advertising program offered by major search engines, particularly Google and Ask, whereby marketing clients pay for their Web pages to be included on the first search engine results page (SERP).
Paid Inclusion clients are shown different fee schedules for the placement of their websites on the SERP. These fees are mostly based on page views, but pay-per-click (PPC) arrangements may be added as well. Many search engine optimization (SEO) professionals complain that paid inclusion and PPC campaigns tend to make organic results slide way down the SERP, often below the fold. Good reputation management companies sometimes use paid inclusion as part of their strategy.
Panda is the name of a major update to Google's secretive search algorithm that initially rolled out in February 2011.
The Google Panda Update was named after one of the engineers who was instrumental in its design and deployment. It is also known as Farmer in search engine optimization (SEO) circles due to the deadly blow it delivered to websites it considered to be link farms.
Although the Panda Update uses a fair amount of artificial intelligence, the algorithmic changes were based on the work of human raters. Aside from targeting link farms, spam sites and scrapers, Google Panda boosted the search engine rankings of news media sites and social networks.
Pay-Per-Click Search Engines (PPC) are companies that offer advertisers the opportunity to place paid search results above the fold or alongside the top organic results.
The major PPC search engines are the major providers of Internet search: Google, Bing and Yahoo. PPC is a marketing option that can be as effective as search engine optimization (SEO) under some circumstances. PPC ads are displayed according to the algorithmic rules of the search engine they are placed on. PPC ad campaigns do not require advertisers to pay based on the number of times their search results are displayed; they only pay when the ad is clicked on and visitors are taken to the target page.
Penguin is the name of a major update to the Google search engine algorithm that rolled out in late April of 2012.
The Google Penguin Update widely targeted black hat search engine optimization (SEO) tactics. Google issued an official announcement with regard to Penguin on April 24, 2012. A couple of days later, the company posted a feedback form that sought the input of webmasters, SEO specialists and power users who wished to give their opinion about Penguin's performance.
The Penguin Update differed from the Panda update of 2011 in the sense that it was more aggressive when looking for Web spammers and pages tainted by unethical SEO. The Penguin update was notable within SEO circles due to the link warnings it delivered through Google Webmaster Tools.
Today, the Penguin Algorithm has evolved further and is about to enter version 3.0.
Personal Brand Management is the process of developing and maintaining a personal image based on marketing principles.
Personal Brand Management is practiced in terms of self-help and business management as it relates to career and personal development. The online reputation management industry is fully dedicated to personal brand management in the sense that by boosting the Internet image of an individual, a personal brand is improved upon.
Personal brands have become more prominent in the Information Age thanks to Internet technologies and online social networking. For corporate executives and self-employed professionals, cultivating a personal brand is essential in today's highly competitive business world. Personal brands can also be very effective when they are attached to goods or services, as is the case with high-fashion clothing designers.
Personal Branding is a self-improvement and professional development strategy for success that is based on marketing principles as they apply to individuals.
A personal brand distinguishes an individual based on the perception of value and assets a person possesses, manages, creates, or facilitates. The personal branding process is strategic; it can be molded in accordance to market requirements or for the purpose of calling attention to a person. The value of personal brands is very sensitive to public perception and opinion; this often calls for expert management by professionals.
Online reputation management specialists help individuals build and maintain their personal brands on the Internet.
A private blog network, or PBN, is a collection of sites owned and under the control of one entity. This is a grey/black hat tactic used to quickly and cheaply get links from inactive sites with no relevance except perhaps for age.
The reputation of a social entity is a viewpoint about that entity held by others, normally the outcome of social evaluation. It is very important in commerce, education, online communities (website reputation), and many other areas.
Sentiment analysis (also known as opinion mining) is the attempt to determine the attitude toward a document, writer, speaker or website using a combination of natural language processing, computational linguistics, and text analysis. Classifying sentiment can be simple, such as identifying polarity, and can be more granular, as in the cases of discerning subjectivity and objectivity, as well as determining feature or aspect-based sentiment.
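The simple polarity classification mentioned above can be sketched with a toy lexicon-based scorer. The word lists and `polarity` function are invented for illustration; production sentiment systems use trained NLP models rather than word counting.

```python
# Toy sentiment lexicons (illustrative only)
POSITIVE = {"great", "excellent", "love", "helpful", "recommend"}
NEGATIVE = {"terrible", "awful", "scam", "avoid", "worst"}

def polarity(text):
    """Classify text as positive, negative or neutral by counting
    lexicon hits; a deliberately minimal sketch."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

polarity("terrible scam avoid")  # classified as "negative"
```

Even this crude approach shows the shape of the task: map free text onto a polarity label, then aggregate labels to gauge overall sentiment toward a brand.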
"SERP" is an online marketing acronym meaning "search engine results page". When a search engine user enters a phrase either verbally or on a keyboard, and conducts a search, they are presented with a web page relevant to their query. The page on which the search results appear is called a search engine results page, or SERP.
Slander is defined as any untrue statement that is harmful to the reputation of the person or company at whom it's directed. Slander can result from either a lack of due diligence into the legitimacy of the statement or simply through intended malice.
Slander is different from libel in that it is spoken, as in an audio file, a video, or a podcast, rather than written.
A strawman, specifically with respect to online reputation management, is a fictitious identity used to dilute search results. Typically these fake entities are created in social media, blogs, and any other profiles that rank in search results.
User intent is what your website visitors are seeking to obtain during an interaction with your site or search results. Matching user intent to website experience results in greater conversion rates. Having a solid understanding of what an individual user is trying to accomplish, and making that goal easy to reach, is the basis for user intent as it applies to search engine optimization and online brand management.
Web analytics attempts to understand the behavior of web users. Often it is used to attract visitors, convert browsers into buyers, and to help developers and designers create a better user experience. When a website engages in activities such as A/B, or split, testing to discern which of two or more variations of a web page best meet their objectives, they are using web analytics.
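The A/B testing mentioned above is usually decided with a statistical check such as a two-proportion z-test. The function name and the sample conversion numbers below are invented for illustration.

```python
import math

def ab_z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B test (sketch).
    conv_* are conversion counts, n_* are visitor counts."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variation A: 200 conversions from 10,000 visitors (2.0%)
# Variation B: 260 conversions from 10,000 visitors (2.6%)
z = ab_z_score(200, 10000, 260, 10000)
# |z| > 1.96 suggests the difference is significant at the 95% level
```

Without such a test, a small difference in conversion rate may just be random noise rather than a genuinely better-performing page element.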
Web monitoring is the process of auditing user behavior on a website or in search results or social media. Web monitoring may track a number of different aspects such as online sentiment, website ranking, competitors and more. Web monitoring can be human or machine-based, but a combination of the two is most effective.
Web monitoring is different from "website" monitoring, which means analyzing the uptime of a website.
Web scraping, also web harvesting or web data extraction, is an automated technique for collecting data from websites using software. That data is then transformed into structured information ready to be stored and analyzed offline, and used for specific purposes such as pricing comparisons, detection of website changes, weather data, and more.
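A minimal scraping step can be sketched with Python's standard-library HTML parser, here extracting price values from a fragment of markup. The `PriceScraper` class and the sample HTML are invented for this example; a real scraper would also fetch pages over the network and should respect robots.txt and site terms of service.

```python
from html.parser import HTMLParser

class PriceScraper(HTMLParser):
    """Collects the text of elements tagged class="price" (sketch)."""
    def __init__(self):
        super().__init__()
        self.in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        if ("class", "price") in attrs:
            self.in_price = True

    def handle_data(self, data):
        if self.in_price:
            self.prices.append(data.strip())
            self.in_price = False

page = ('<div><span class="price">$19.99</span>'
        '<span class="price">$24.50</span></div>')
scraper = PriceScraper()
scraper.feed(page)
# scraper.prices now holds the extracted price strings
```

The extraction step shown here is the "transformed into structured information" half of the definition; the collected values can then be stored and compared offline, as in the pricing-comparison use case above.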
White Hat SEOs are search engine optimization experts who follow all the rules and best practices of search engines like Google and Bing. Most SEOs claim to be white hat but are in reality at least slightly gray. White hat SEO experts follow the Google Webmaster Guidelines to the letter.
White Hat SEO practitioners organically ‘push’ specific content to the top of search results by way of valuable content, inbound links, and other indicators recognized by Google.
“White Hat” search engine optimization practices are techniques generally accepted and positively rewarded by Google’s search algorithm. Google uses a number of indicators to rank websites and online content within results for a specific search term.
For example, typing in “Acme Widgets” will produce a page of 10 search results that Google has deemed relevant based on how many inbound links those sites have (meaning how many other sites link to them), the quality of the content within those sites (Google generally likes fresh content that is rich with images and media), how highly trafficked those sites are, and so on. While even more proactive than Halo Making and Traditional PR, White Hat SEO practices are still deemed acceptable and are baseline standards for most ORM plans.