Aside from wonderfully unseasonable weather, a jam-packed expo floor, and another triumphant SearchBash, WebmasterRadio.FM returned last week to its unbearably humid confines in Fort Lauderdale, Florida, stocked with interviews from sponsors and exhibitors who enlightened us with their expertise.
Here is the rundown…
-Why Local Search Leads to Higher ROI with Steve Yeich, CEO of Local Splash.
-Taking Stock With Your Customers with SearchMetrics CTO Marcus Tober.
-Google Venice Update and Impact on Content with Textbroker CEO-Americas Phillip Thune.
-SEM and Mobile Ad Networks in China with Zhaohui Tang, the CEO of adSage.
-Enterprise Press Release Tips with Peter Shankman, Founder of Help a Reporter Out and Vocus Small Business Evangelist in Residence.
-Paid Search Competitive Analysis; Metrics for SEO with Suren Ter, Team COO of SEOquake.
-Benefits from Participating in Awards with Web Marketing Association President William Rice.
-Content Strategy and User Behavior with SlingshotSEO Search Media Creative Consultant Rasheité Radcliff.
-Using Google AdWords Strategy in Microsoft AdCenter with Acquisio VP Marketing and Co-founder Marc Poirier.
-Setting Up A Paid Search Account with Zach Morrison, VP of Elite SEM.
-A showcase of Vivastream and their In-Event Professional Networking and Social Navigation with Vivastream Founder Kyle Morehouse.
-Search and Social Integration; SOPA Blackout Lessons Learned with Covario Vice President of Earned Media Services Jeff MacGurn.
-Google Analytics Integration with Steve Lock of AnalyticsSEO.
-Advanced Keyword Modeling; Paid and Organic Search with SES Advisory Board member and Back Azimuth Consulting, LLC President Bill Hunt.
-Usability, Duplicate Content and Lost Link Equity with Jenny Halasz from JLH Marketing, Inc.
-SEO, Social Media and PPC Tools with Raven Tools User Experience Manager Alison Groves.
-Cracking the Pinterest Code; Enterprise Link Building with cognitiveSEO.com Founder and Chief Architect Razvan Gavrilas.
-Local on Organic Search; Best Practices For Using Google Places with Milestone Internet Marketing President and Founder Benu Aggarwal.
-PPC Tools of the Trade; Getting the Most Out of Google AdWords with Fang Digital Marketing CEO/Lead Consultant Jeff Ferguson.
It’s that time of year again when Google, the world’s most prominent and oft-used search engine, revamps its algorithms in an effort to crack down even harder on spam sites. The trouble is, Google is such a monolithic force on the internet that anytime it makes even the most minor of shifts, everyone has to adjust accordingly. Here’s a breakdown of the changes that have gone into effect, based on Google’s announcement, and some adjustments you may have to make on your own website to ensure you’re not negatively impacted.
First up is a completely revamped spam detection program that looks at individual pages and filters out those it identifies as stuffed with repeated keywords and phrases. By pushing such sites down to the bottom of the search results, Google limits the likelihood that someone searching for information on European vacations, for example, will be directed to an ad-ridden site. As a consequence, sites with actual content about vacations to Europe will populate the top of the heap, giving the user a better overall web browsing experience. To make sure you’re not lumped in with these spammy sites, now is the time to reevaluate your content. If you’re not sure whether your site will be impacted by the new algorithm, review your copy and look for instances of repeated keywords; it may be necessary to restructure it to be less keyword-driven and more content-driven. A quick word-frequency audit, like the sketch below, can flag pages that deserve a closer look.
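Google has never published the thresholds its filters use, so any self-audit is approximate at best. Still, a rough word-frequency count is an easy first pass. The following is a minimal sketch in Python, assuming you have a saved copy of the page; the filename and the 5% density flag are illustrative assumptions, not figures Google has confirmed.

import re
from collections import Counter

# Common words to ignore; this list is illustrative, not exhaustive.
STOPWORDS = {"the", "and", "a", "an", "of", "to", "in", "for",
             "on", "with", "is", "that", "it", "you", "your"}

def keyword_density(html, top_n=10):
    """Return the most frequent words on a page and their share of all words."""
    text = re.sub(r"<[^>]+>", " ", html)  # crude tag stripping
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]
    counts = Counter(words)
    total = sum(counts.values()) or 1
    return [(word, n, n / total) for word, n in counts.most_common(top_n)]

page = open("european-vacations.html", encoding="utf-8").read()  # hypothetical file
for word, n, share in keyword_density(page):
    flag = "  <-- suspiciously dense" if share > 0.05 else ""
    print(f"{word:20s} {n:5d} {share:6.1%}{flag}")

If one or two phrases dominate the output, that page is a candidate for the rewrite described above.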
Google is also cracking down on duplicate content and content farms. Not quite driven by the desire to expunge plagiarism from the face of the earth, the ultimate goal here is to prevent the duplicate and triplicate re-posting of legitimate articles that originally appeared elsewhere as a way to attract visitors to an ad-riddled website, thus obtaining click-throughs and revenue. This type of duplication of existing articles is called “scraping” and has long been the bane of article authors, as well as of users interested in finding unique, original content who wind up running into the same article over and over again. Content farms use programs to aggregate content from various sources and re-post it in the hopes of attracting visitors. Under the new algorithm, someone conducting a Google search will be more likely to actually come across the website that originally published an article or blog post, instead of being directed to a site that has scraped or accumulated the information from other sources. Unless you’re operating a program that scrapes articles from other websites to bring traffic to your own, or unless your website is composed primarily of re-posts of other people’s work, this change shouldn’t affect you. If you ever want to gauge how close two pages really are, a simple comparison like the one sketched below gives a rough answer.
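Google has not disclosed how its duplicate detection works, but a classic textbook technique, offered here purely as an illustration, is to break each document into overlapping word “shingles” and measure their Jaccard overlap. The filenames below are hypothetical.

import re

def shingles(text, k=5):
    """Break text into overlapping k-word sequences ("shingles")."""
    words = re.findall(r"[a-z']+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity: 1.0 means identical shingle sets, 0.0 means disjoint."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

original = open("my-article.txt", encoding="utf-8").read()    # hypothetical files
suspect = open("their-article.txt", encoding="utf-8").read()
score = jaccard(shingles(original), shingles(suspect))
print(f"Shingle overlap: {score:.0%}")  # a high percentage suggests scraped content

A score near 100% means the suspect page is essentially a copy; the score drops quickly once the wording genuinely differs.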
Google has also said it’s greatly improved its ability to detect sites that have fallen prey to hackers and been taken over for purposes of spamming. These words should come as a great relief, especially to site owners just starting out who worry about hackers’ ability to do major reputational damage with minimal effort. As far as all the other changes go, the best advice remains what it has been for years: create original content frequently, and you’ll rise in the search engine rankings. Period.
Arguably the most important SEO initiative announced at SMX West last week came from Google’s chief quality czar, Matt Cutts, who outlined a new Canonical link element that tells search engines which version of a page a webmaster wants indexed. Designed to weed duplicate content out of search engine indexes, the new relational element lets search spiders determine which pages to index and which to gloss over, without adding duplicate pages to their indexes. All three major search engines have accepted the element and pledged support for it.
For example, imagine a content management system that automatically names pages with numbers or appends tracking or session IDs to a URL string. A URL leading to a page about something nice like Rupert the Bear might read: http://www.yoursite.com/specific-page.html?sid=yucky85448633572/. A good SEO would rather that URL be phrased in a far cleaner and clearer way: http://www.yoursite.com/rupert-bear.html/.
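Before the Canonical element existed, the usual fix was to normalize such URLs directly. As a purely illustrative sketch (the sid parameter comes from the example above; the other parameter names are assumptions), here is how session and tracking parameters can be stripped in Python:

from urllib.parse import urlparse, urlencode, parse_qsl, urlunparse

# Query parameters treated as noise; illustrative, not exhaustive.
NOISE_PARAMS = {"sid", "sessionid", "utm_source", "utm_medium", "utm_campaign"}

def clean_url(url):
    """Drop session/tracking parameters so equivalent URLs compare equal."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() not in NOISE_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(clean_url("http://www.yoursite.com/specific-page.html?sid=yucky85448633572"))
# -> http://www.yoursite.com/specific-page.html

The Canonical link element moves this correction into the page itself, so the search engines do the consolidating for you.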
The element looks like this:
<link rel="canonical" href="http://www.YOURSITE.com/correct-page.html" />, and is coded in place of the 301 redirect formerly used to steer spiders to the correct version of a specific page.
Google, Yahoo! and Microsoft have all published information on how to implement the relational link element in the HEAD section of your website. A note of caution: the Canonical link element is domain-specific and will not pass juice between different domains, though it will within a sub-domain structure. The sketch below shows one way to spot-check the canonical tags your pages actually emit.
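Because a canonical tag that points off-domain is wasted, it’s worth verifying what your pages serve. Here is a minimal sketch using only Python’s standard library, an illustration rather than a definitive tool; the sample HTML and URLs are hypothetical.

from html.parser import HTMLParser
from urllib.parse import urlparse

class CanonicalFinder(HTMLParser):
    """Collect the href of any <link rel="canonical"> tag in a document."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonical = a.get("href")

def check_canonical(page_url, html):
    finder = CanonicalFinder()
    finder.feed(html)
    if finder.canonical is None:
        return "no canonical tag found"
    # Strict same-host check; a gentler tool might allow sub-domains,
    # which the element does support within one domain.
    if urlparse(finder.canonical).netloc != urlparse(page_url).netloc:
        return "WARNING: canonical points off-domain: " + finder.canonical
    return "OK: " + finder.canonical

html = '<head><link rel="canonical" href="http://www.yoursite.com/rupert-bear.html"/></head>'
print(check_canonical("http://www.yoursite.com/specific-page.html?sid=yucky85448633572", html))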