- Google Panda Update 1: Feb. 24, 2011 (11.8% of queries; announced; English in US only)
- Google Panda Update 2: April 11, 2011 (2% of queries; announced; rolled out in English internationally)
- Google Panda Update 3: May 10, 2011 (no change given; confirmed, not announced)
- Google Panda Update 4: June 16, 2011 (no change given; confirmed, not announced)
- Google Panda Update 5: July 23, 2011 (no change given; confirmed, not announced)
- Google Panda Update 6: Aug. 12, 2011 (6-9% of queries in many non-English languages; announced)
- Google Panda Update 7: Sept. 28, 2011 (no change given; confirmed, not announced)
- Google Panda Update 8: Oct. 19, 2011 (about 2% of queries; belatedly confirmed)
- Google Panda Update 9: Nov. 18, 2011 (less than 1% of queries; announced)
- Google Panda Update 10: Jan. 18, 2012 (no change given; confirmed, not announced)
- Google Panda Update 11: Feb. 27, 2012 (no change given; announced)
- Google Panda Update 12: March 23, 2012 (about 1.6% of queries impacted; announced)
- Google Panda Update 13: April 19, 2012 (no change given; belatedly revealed)
- Google Panda Update 14: April 27, 2012 (no change given; confirmed; first update within days of another)
- Google Panda Update 15: June 9, 2012 (1% of queries; belatedly announced)
- Google Panda Update 16: June 25, 2012 (about 1% of queries; announced)
- Google Panda Update 17: July 24, 2012 (about 1% of queries; announced)
- Google Panda Update 18: Aug. 20, 2012 (about 1% of queries; belatedly announced)
- Google Panda Update 19: Sept. 18, 2012 (less than 0.7% of queries; announced)
- Google Panda Update 20: Sept. 27, 2012 (2.4% of English queries impacted; belatedly announced)
- Google Panda Update 21: Nov. 5, 2012 (1.1% of English-language queries in US; 0.4% worldwide; confirmed, not announced)
- Google Panda Update 22: Nov. 21, 2012 (0.8% of English queries were affected; confirmed, not announced)
- Google Panda Update 23: Dec. 21, 2012 (1.3% of English queries were affected; confirmed, announced)
- Google Panda Update 24: Jan. 22, 2013 (1.2% of English queries were affected; confirmed, announced)
The idea of managing your business's social media profiles is to maximize interaction with potential clients and increase sales.
However, there comes a point when your social media activity can become overwhelming. Aside from keeping up with the different social media channels you have registered for, you need to post timely messages to your feed and reply to disgruntled customers to protect your online brand reputation. Managing all your online accounts on a regular basis can take up hours of your time at work, leaving you with little for your other tasks.
Ideally, you shouldn't need to spend more than 30 minutes replying and posting on your online profiles. To get there, you will need free social media tools and services to organize your workflow. Below are five highly recommended social media services to make your online work much more efficient.
Hootsuite : One reason people take so long messaging on social media is that they have to sift through different windows and interfaces just to send messages to their audience.
With Hootsuite—arguably one of the most useful social media tools out there—you can register up to five of your social media profiles and interact with your Twitter, Facebook, Google+, and LinkedIn followers with ease.
The interface is streamlined across all accounts, so you won't get confused by the different designs of each site. If you need to dispatch the same message to all your profiles, Hootsuite lets you do that in just one click after selecting the profiles where you want the message to appear.
Another advantage of this tool is the ability to schedule posts to appear at specific dates and times in the future. This proves very useful if you won't be able to tend to your online profiles due to other commitments. Once you've set up your messages, they will be posted to the social media sites even if you don't log in.
InboxQ : There are Twitter users who are simply waiting to become your followers, if not clients or brand advocates. The problem is that they just don't know it yet. These people ask questions about the products and services your business offers, such as "What's the best X?" or "What Y should I buy?" InboxQ, arguably the most helpful tool when it comes to generating leads for your business, lets you find them.
The app can be installed in your browser or on your Hootsuite dashboard. Once it's installed, enter the keywords relevant to your business and InboxQ will search for matching questions on Twitter. You can input as many keywords as you like, as long as they help return more questions. From there, you can answer the questions relevant to your business.
Commun.it : This social media service lets you build more Twitter leads to increase your sales by focusing on your list of high-value members. Users who mention your name will appear on the list, making it easier for you to engage with them and strengthen your relationships. The tool will also automatically sort your followers into groups for segmentation purposes. Similar to InboxQ, you can list keywords that will be used to discover new leads for your business.
A free account allows you to engage with a single user up to 30 times, so be judicious with the posts you make using this service.
Topsy and Social Mention : Social media is a breeding ground for customer feedback. This feedback is valuable to prospective clients of your business, since they research on the Internet before making any purchase. Positive reviews draw them in to your business, while negative ones repel them. What you want is to amplify positive feedback, which is why you should search for posts on Topsy and Social Mention and retweet, like, and comment on them.
For negative comments, make sure to provide answers to the issues raised in the posts. If possible, find a way to correct your mistakes. This way, even if you received negative feedback on social media, the manner in which you addressed the customer's concerns speaks volumes about your customer support.
But from a user standpoint, long stretches of uninterrupted text-based content aren't all that great. Nobody wants to click through to a new website—only to be assaulted by the sheer volume of required reading offered by some pages.
That said, you don’t need to choose between the text content that’s favored by the search engines and the graphic experience preferred by website visitors. Just keep “the fold” of your website’s pages in mind. Because the first few moments of interaction on your website are so critical to engaging and retaining new visitors, consider including graphic elements at the top of your pages (where they’ll be seen right away by users) and text blocks at the bottom (where they’ll still be accessible to the search engines).
And don’t even try to tell me that Flash is necessary for your user experience! We’re long since past the days when website visitors were impressed by every moving object and animated experience you could cram into your site. These days, people want information ASAP, which means that they don’t want to waste time sitting through the minute-long Flash splash page or presentation your Web team is so proud of.
So yes, while I know it’s possible to work around the SEO weaknesses that Flash presents, my point is that there’s no reason to. Users don’t want to deal with these animations, so do everybody a favor and eliminate them from your website entirely.
As a result, sitemaps and efficient navigation options are great for the search engines, but they’re also great for your users. Keep in mind that users like things simplified as well. Really, nobody has time these days to click through page after page, looking for a single piece of information that could have been made easily accessible with a better navigation system.
So, do both your users and the search engines a favor by following sitemap creation best practices and implementing an HTML-based navigation system that ensures that every page on your site can be found within three clicks or less!
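The sitemap advice above takes very little code to put into practice. Below is a minimal sketch, using only Python's standard library, of generating a sitemaps.org-style XML sitemap; the URLs are placeholders for your own pages.

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from a list of URLs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page_url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page_url
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs; swap in your site's actual pages.
sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/about",
])
```

Save the resulting XML as sitemap.xml at your site root and reference it from robots.txt so spiders can find it.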
Link Spamming : King of black hat techniques, link spamming is simply a way of getting links to the websites of your choice through the use of automated software which accesses unprotected blogs through anonymous web proxies and leaves links in their comments. Long, frequently updated lists of proxy IP addresses are necessary, as well as decent comment generation software. Blog software developers have fought back, however, with developments such as the Akismet comment filter for WordPress.
Splogs : Close cousins to scraping and spinning, splogs are simply blogs with worthless, automatically generated content. Many splogs read RSS feeds and create blog posts for themselves from them. Splogs are the framework into which scraped and spun content is laid out to create made-for-AdSense (MFA) sites. Splogs can also be used to get other sites indexed or their PageRank increased by including links to them. A large percentage (some say 20% or higher) of the blogs on the web are actually splogs.
Scraping and Spinning : Scraping and its cousin spinning are black hat techniques that use software to spider websites, grab the content, mix it up a bit, paraphrase, randomize, and generate “new” content from it. Often the result will contain links to sites the marketer is trying to promote, or it will contain AdSense or other ads which are used to monetize the content. Spinning content into duplicate-content-penalty-avoiding text is the holy grail of black hat techniques. Programmers who come up with methods for doing this on the fly have created true money machines for themselves.
302 Redirect Hijacking : This technique is a really nasty black hat trick where the evil webmaster creates a web page on a high-PageRank domain with a 302 redirect to the page he is trying to hijack. Googlebot (or another search engine spider) follows the redirect to the second page and indexes it, but on the SERP, the URL of the indexed page will be that of the page with the redirect. In other words, the black hat webmaster will own the SERP, and the page with the actual content will be de-indexed. A truly evil hijacker builds cloaking into the redirect so human visitors to the page go to his “money page”, while search engine spiders still see the 302 redirect.
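To make the mechanics concrete, here is a minimal sketch (in Python, with hypothetical URLs) of the conditional 302 response described above: spiders get redirected to the victim page, humans to the hijacker's money page. It illustrates the trick, not a recommendation.

```python
def hijack_response(user_agent: str, victim_url: str, money_url: str) -> str:
    """Return the raw HTTP response the hijacking page would serve.

    Spiders get a 302 to the victim page (so the hijacker's URL is
    credited with its content in the SERP); human visitors get a 302
    to the hijacker's "money page" instead.
    """
    is_spider = "googlebot" in user_agent.lower()
    target = victim_url if is_spider else money_url
    return f"HTTP/1.1 302 Found\r\nLocation: {target}\r\n\r\n"
```

Note that a plain user-agent check like this is trivially fooled (real bot detection also verifies spider IP ranges), which is one way such hijacks get exposed.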
XSS Injection : Cross-Site Scripting (XSS) is a technique that takes advantage of pages with a particular security flaw: they accept input from the HTTP GET request and display it on the page. Therefore, it is possible to construct a URL to one of these pages which will be displayed with a link to the site you specify, using the text you specify as the link text. The constructed URL can be placed as a link somewhere that a search engine spider will follow, getting the XSS-generated page indexed. Very sneaky.
Web Page Cloaking : This technique goes hand-in-hand with the doorway pages technique. The idea behind cloaking is to show a doorway page to search engine spiders but the “money page” to human visitors. Both pages are accessed using the same URL. Software is used to identify the search engine spiders and serve the doorway page to them. There is a dual purpose to web page cloaking: competitors are kept from scraping the content of the optimized doorways, and human visitors are kept from seeing the ugly doorway pages (the redirect is unnecessary in a properly executed cloaking solution).
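The spider-identification step is the heart of cloaking. Here is a stripped-down sketch of that logic in Python; the bot signatures and file names are illustrative, not a working cloaker.

```python
# Known spider substrings to match against the User-Agent header (illustrative list).
BOT_SIGNATURES = ("googlebot", "bingbot", "slurp")

def is_search_spider(user_agent: str) -> bool:
    """Crude User-Agent check; real cloaking software also verifies spider IP ranges."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def page_for(user_agent: str) -> str:
    """Serve the keyword-stuffed doorway to spiders, the money page to humans."""
    return "doorway.html" if is_search_spider(user_agent) else "money.html"
```

Both pages live at the same URL; only the User-Agent (or IP) of the requester decides which one is served.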
Doorway Pages : Doorway pages are web pages created solely for the purpose of being spidered by search engines and included in the search engine results pages (SERPs). They are usually optimized for placement in the SERPs by being stuffed with keywords and created in bulk. Often, you will see that the pages are named after the primary keyword being targeted. Doorway pages will also likely have some form of redirection sending visitors to the “money site”. The redirection can be a meta refresh tag or JavaScript. Most webmasters using this technique have software which cranks out doorway pages by the thousands.
Keyword Stuffing/Hidden Text : This technique involves picking a bunch of keywords for which a marketer wants a page to be optimized, and then placing them on the page in such a way that they will be read by search engine spiders, but not by human visitors. They can be located in a hidden div tag, colored so that they blend into the background, or even placed within HTML comment tags. This is truly an old school technique, and is not nearly as effective now as it was back in the day.
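Because the hiding methods are so mechanical, they are also mechanical to spot. Below is a crude Python sketch that flags two of the techniques mentioned above (inline display:none or visibility:hidden styles); catching text colored to match the background would additionally require parsing the page's CSS, which this sketch omits.

```python
import re

# Inline-style patterns that hide text from human visitors (illustrative, not exhaustive).
HIDDEN_STYLE_PATTERNS = [
    re.compile(r'style\s*=\s*"[^"]*display\s*:\s*none', re.IGNORECASE),
    re.compile(r'style\s*=\s*"[^"]*visibility\s*:\s*hidden', re.IGNORECASE),
]

def looks_like_hidden_text(html: str) -> bool:
    """Return True if the markup contains inline styles that hide text."""
    return any(pattern.search(html) for pattern in HIDDEN_STYLE_PATTERNS)
```

Search engines run far more sophisticated versions of this check, which is a big part of why the technique no longer pays off.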
Cyber Hoaxing : Hoaxing is a way of “creatively” making news. First, create a fake news website that looks real. Second, write a sensational but false news story; it helps if its veracity is difficult to check. Third, create multiple accounts on various social networking sites such as Digg, StumbleUpon, del.icio.us, etc., and submit your story there. Fourth, be ready for emails and phone calls from actual big-time media outlets with questions about your story. You will generate buzz and get links to your fake news story. Eventually, when the story is discovered to be false, try to capitalize on the outrage. How you monetize the whole situation is up to you, but it is commonly done with affiliate programs.
Buying Links : Is buying links a black hat technique? Of course it is. When a marketer pays for a link, they are essentially “buying a vote” for the page they are promoting. That link would not exist except that it was paid for. This gives extra weight to the promoted page in the search engine algorithms. The paid-for link does not itself add extra value to visitors, so the technique must be black hat.