Putting effort into the offline promotion of your company or site can also be rewarding. For example, if you have a business site, make sure its URL appears on your business cards, letterhead, posters and so on. You could also send recurring newsletters to clients by post, letting them know about new content on the company's website. Growth of this kind sometimes means stepping outside traditional marketing channels and roles. Paid search advertising can also raise brand awareness, which in turn tends to increase the organic searches and clicks your site receives. Check forums, online communities and Q&A sites to find the questions people in your niche are asking; providing clear answers with actionable points is a sure-fire way of creating content your target audience will love.

Meaty meta descriptions

If you are already running an Internet marketing campaign, or are about to embark on one, one concept you cannot afford to skimp on is search engine optimization (SEO). Low-quality sites rely heavily on catchy keywords and unoriginal content to boost their rankings, rather than creating new and substantial posts. Marketing-savvy businesses conduct paid and organic keyword research to inform their product catalog, enabling them to predict cost per acquisition (CPA) and build inventory on items that present an immediate opportunity for positive ROI. The clickable text of a link is another place to use keywords and to help guide the user to their destination, which is why Google rewards descriptive anchor text.
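To make the CPA arithmetic concrete, here is a minimal Python sketch using made-up keyword figures rather than real campaign data; in practice the average CPC and conversion rate would come from your paid search and analytics reports.

```python
# Minimal sketch of the CPA/ROI arithmetic. The keyword figures and the
# revenue-per-sale value are hypothetical placeholders, not real campaign data.

def estimate_cpa(avg_cpc: float, conversion_rate: float) -> float:
    """Cost per acquisition = average cost per click / conversion rate."""
    return avg_cpc / conversion_rate

def estimate_roi(revenue_per_sale: float, cpa: float) -> float:
    """Return on investment per sale, expressed as a fraction of the CPA."""
    return (revenue_per_sale - cpa) / cpa

keyword = {"term": "blue widgets", "avg_cpc": 1.20, "conversion_rate": 0.025}

cpa = estimate_cpa(keyword["avg_cpc"], keyword["conversion_rate"])
roi = estimate_roi(revenue_per_sale=90.0, cpa=cpa)
print(f"{keyword['term']}: CPA = {cpa:.2f}, ROI = {roi:.0%}")
```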

A focus on authority sites to benefit search engine placement

Targeted posts with original, useful content are your best way to avoid thin content penalties. But what exactly is thin content, and why is it bad for SEO?

In February 2011, Google rolled out an update to its search algorithm called Panda – the first in a series of algorithm updates aimed at penalising low quality websites in search and improving the quality of their search results.

Although Panda was first rolled out several years ago (and followed by Penguin, an update aimed at knocking out black-hat SEO techniques) it’s been updated several times since its initial launch, most recently in September of 2014.

The latest Panda update has much the same purpose as the original – giving better rankings to websites that have useful and relevant content, and penalising sites that have “thin” content that offers little or no value to searchers.

In this guide, we’ll look at what makes content “thin” and why having thin content on your site is a bad thing. We’ll also share some simple tactics that you can use to give your content more value to searchers and avoid having to deal with a penalty.

What is thin content? Thin content can be identified as low quality pages that add little to no value to the reader. Examples of thin content include duplicate pages, automatically generated content or doorway pages.
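As a rough illustration of how duplicate or very short pages might be flagged during a content audit, here is a hedged Python sketch; the word-count threshold, similarity ratio and sample pages are assumptions for the example, not criteria Google has published.

```python
# Rough sketch of one way to flag potentially thin pages during a content audit:
# very short body text, or body text that is a near-duplicate of another page.
import re
from difflib import SequenceMatcher

MIN_WORDS = 200        # assumed cut-off for "too short to be useful"
DUPLICATE_RATIO = 0.9  # assumed similarity ratio for "near-duplicate"

def normalise(text: str) -> str:
    return re.sub(r"\s+", " ", text.lower()).strip()

def flag_thin_pages(pages: dict[str, str]) -> list[str]:
    flagged = []
    items = [(url, normalise(body)) for url, body in pages.items()]
    for i, (url, body) in enumerate(items):
        if len(body.split()) < MIN_WORDS:
            flagged.append(f"{url}: fewer than {MIN_WORDS} words")
            continue
        for other_url, other_body in items[:i]:
            if SequenceMatcher(None, body, other_body).ratio() > DUPLICATE_RATIO:
                flagged.append(f"{url}: near-duplicate of {other_url}")
                break
    return flagged

# Toy input: both pages are placeholder copy and will be flagged as too short.
pages = {
    "/widgets/blue": "Blue widgets are great. " * 10,
    "/widgets/blue-copy": "Blue widgets are great. " * 10,
}
print(flag_thin_pages(pages))
```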

The best way to measure the quality of your content is through user satisfaction. If visitors quickly bounce from your page, it likely doesn’t provide the value they were looking for.
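If you want to put a number on that, a simple proxy is the bounce rate of each landing page. The sketch below uses invented session data rather than a real analytics export and shows one way to compute it.

```python
# Sketch of a per-page bounce rate, where each session is the ordered list of
# pages a visitor viewed. The session data below is invented for illustration;
# real figures would come from your analytics tool.
from collections import defaultdict

def bounce_rates(sessions: list[list[str]]) -> dict[str, float]:
    landings = defaultdict(int)  # sessions that started on this page
    bounces = defaultdict(int)   # sessions that started and ended on this page
    for pages in sessions:
        if not pages:
            continue
        landing = pages[0]
        landings[landing] += 1
        if len(pages) == 1:
            bounces[landing] += 1
    return {page: bounces[page] / landings[page] for page in landings}

sessions = [
    ["/thin-page"],                           # visitor left immediately
    ["/thin-page"],
    ["/guide", "/guide/part-2", "/contact"],  # visitor kept reading
]
print(bounce_rates(sessions))  # a high ratio suggests the page isn't satisfying visitors
```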

Google’s initial Panda update was targeted primarily at content farms – sites with a massive amount of content written purely for the purpose of ranking well in search and attracting as much traffic as possible.

You’ve probably clicked your way onto a content farm before – most of us have. The content is typically packed with keywords and light on factual information, making it look highly relevant to a search engine while offering little value to an actual reader.

The original Panda update also targeted scraper websites – sites that “scraped” text from other websites and reposted it as their own, lifting the work of other people to generate their own search traffic.

As Panda updates keep rolling out, the focus has switched from content farms and scraper sites to websites that offer “thin” content – content that’s full of keywords and copy, but light on any real information.

A great way to think of content is as search engine food. The more unique content your website offers search engines, the more satisfied they are and the higher you will likely rank for the keywords your on-page content mentions.

Offer little food and you’ll provide little for Google to use to understand the focus of your site’s content. As a result, you’ll be outranked for your target search keywords by other websites that offer more detailed, helpful and informative content.

How can Google tell if content is thin? Google’s index includes more than 30 trillion pages, making it impossible to check every page for thin content by hand. While some websites are occasionally subject to a manual review by Google, most content is judged for its value algorithmically.

The ultimate judge of a website’s content is its audience – the readers that visit the site and actually read its content. If the content is good, they’ll probably stay on the website and keep reading; if it’s bad, there’s a good chance they’ll leave.

The length of your content isn’t necessarily an indicator of its “thinness”. As Stephen Kenwright explains at Search Engine Watch, a 2,000 word article on EzineArticles is likely to offer less value to readers than a 500 word blog post by a real expert.

One way Google can algorithmically judge the value of a website’s content is using a metric called “time to long click”. A long click is when a user clicks on a search result and stays on the website for a long time before returning to Google’s search page.

Think about how you browse a website when you discover great quality content. If a blog post or article is particularly engaging, you don’t just read for a minute or two – you click around the website and view other content as well.

A short click, on the other hand, is when a user clicks on a search result and almost immediately returns to Google’s search results page. From here, they might click on another result, indicating to Google that the first result didn’t provide much value.
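As an illustration of the long-click idea, the sketch below classifies clicks from a hypothetical search click log by dwell time; the 60-second cut-off is an assumption for the example, since Google has never published the threshold it uses.

```python
# Illustrative sketch of the long-click idea: classify each search click by how
# long the visitor stayed before returning to the results page. The 60-second
# cut-off and the click log are assumptions; Google's actual signal is not public.
from dataclasses import dataclass
from typing import Optional

LONG_CLICK_SECONDS = 60  # assumed threshold for this example

@dataclass
class Click:
    url: str
    dwell_seconds: Optional[float]  # None = the user never returned to the results

def classify(click: Click) -> str:
    if click.dwell_seconds is None or click.dwell_seconds >= LONG_CLICK_SECONDS:
        return "long click"   # the result probably satisfied the searcher
    return "short click"      # a quick return hints at thin or irrelevant content

clicks = [Click("/guide", 240.0), Click("/thin-page", 8.0), Click("/guide", None)]
for c in clicks:
    print(c.url, "->", classify(c))
```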

Should you be worried about thin content? The best measure of your content’s value is user satisfaction. If users stay on your website for a long time after clicking onto it from Google’s search results pages, it probably has the high quality, “thick” content that Google likes. Effective marketing demands it: when search engines crawl and index a site, they are looking for content that answers questions and satisfies searcher intent.

This story about content will haunt you forever

Using advanced knowledge of current search engine penalties and algorithm changes to sink competitors is an aggressive move favoured by black-hat SEOs. On your own site, allow visitors to social-bookmark your pages for later; by providing this option you are helping create links that can raise you in the search engine rankings, and there are free widgets that make it easy to add. Remember: the higher you appear in search results, the easier you are to find and the more traffic you will get. To check your local business listing, all you need to do is type your business’s name and pin code into the tool and click on the ‘Check My Listing’ option. According to Gaz Hall, a UK SEO Consultant from SEO York: "Google uses over 200 ranking factors, and some of the most powerful ranking factors involve backlinks."

Establish your position regarding webmaster tools

One of the most common problems for webmasters who run both mobile and desktop versions of a site is that the mobile version appears for users on a desktop computer, or that the desktop version appears when someone accesses the site on a mobile device. Many new marketing specialists buy a few followers to give their social media presence a kick-start. The appearance of search engine results pages is constantly in flux due to changes in the algorithm.
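One common way to handle the desktop/mobile mix-up described above, assuming a separate-URL setup such as a hypothetical www.example.com and m.example.com pair, is to detect the device, redirect to the matching version and send a Vary: User-Agent header. The Flask sketch below outlines the idea; it is not a production-ready device detector.

```python
# Hedged sketch of device detection and redirects for a separate-URL setup.
# The hostnames are hypothetical and the user-agent check is deliberately crude;
# real sites use a maintained device-detection library.
from flask import Flask, request, redirect

app = Flask(__name__)
MOBILE_HOST = "m.example.com"     # hypothetical mobile subdomain
DESKTOP_HOST = "www.example.com"  # hypothetical desktop host

def is_mobile(user_agent: str) -> bool:
    return "Mobile" in user_agent or "Android" in user_agent

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def serve(path):
    wants_mobile = is_mobile(request.headers.get("User-Agent", ""))
    on_mobile_host = request.host.startswith("m.")
    if wants_mobile != on_mobile_host:
        target = MOBILE_HOST if wants_mobile else DESKTOP_HOST
        return redirect(f"https://{target}/{path}", code=302)
    response = app.make_response(f"content for /{path}")
    # Tell caches and crawlers that the response varies by device.
    response.headers["Vary"] = "User-Agent"
    return response
```

Separate-URL setups are also usually paired with a rel="alternate" link on the desktop page pointing to the mobile URL and a rel="canonical" link on the mobile page pointing back, so that search engines treat the two versions as one document.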