Archive for the ‘SEO’ Category

Enabling Google Webmaster Tools Data within Analytics

Tuesday, February 26th, 2013

For some time now, Google has offered the ability to access Webmaster Tools search engine optimization data directly from Google Analytics. By linking your Webmaster Tools site to a Google Analytics web property, you gain additional insight into your site via Webmaster Tools data, including the Search Queries, Landing Pages, and Geographical Summary reports. At the same time, you can access Analytics reports from the Links to Your Site, Search Queries, and Sitelinks sections of Webmaster Tools.

Step by Step: Accessing and Merging the Data

  1. Log in to your Google Analytics account and select the web property you’d like to focus on. Ideally, this will be the same Google account you used for your Webmaster Tools account.
  2. Navigate to Traffic Sources>Search Engine Optimization and click on “Set up Webmaster Tools data sharing” when prompted.
  3. Under Property Settings, you’ll see “Webmaster Tools Settings”. Click on “Edit” and you’ll be able to select the verified site that you would like to merge with Analytics. If you don’t have a verified site in your Webmaster Tools account, you’ll need to verify your ownership using one of the options identified by Google.
  4. You’ll then be able to view and analyze the data in the Traffic Sources>Search Engine Optimization report.

How to Use Webmaster Tools data (Search Engine Optimization Reports) to Your Advantage

  • Identify landing pages on your site that have good clickthrough rates (CTR), but have poor average positions in search results. These could be pages that people want to see, but have trouble finding.
  • Identify search queries or keywords for which your site has good average positions, but poor clickthrough rates. These queries already get a good amount of visibility, and improving the content or calls to action on the pages they lead to could turn more of those impressions into visitors (a rough way to surface these queries is sketched below).
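If you export the Search Queries report as a CSV, a few lines of script can surface these opportunity queries. This is a minimal sketch only: the column names (query, impressions, clicks, avg_position) and the thresholds are assumptions for illustration, not Google’s exact export headers.

```python
# Minimal sketch: surface "opportunity" queries from a Search Queries export.
# Assumes a CSV with columns query, impressions, clicks, avg_position
# (illustrative names, not necessarily Google's exact export headers).
import csv

def find_opportunities(path, max_position=10.0, max_ctr=0.02):
    """Return queries that already rank on page one but get few clicks."""
    opportunities = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            impressions = int(row["impressions"])
            if impressions == 0:
                continue
            clicks = int(row["clicks"])
            position = float(row["avg_position"])
            ctr = clicks / impressions
            # Decent average position but weak CTR: title, snippet, or
            # call-to-action improvements may win more of these impressions.
            if position <= max_position and ctr <= max_ctr:
                opportunities.append((row["query"], position, ctr))
    return opportunities

if __name__ == "__main__":
    for query, position, ctr in find_opportunities("search_queries.csv"):
        print(f"{query}: avg. position {position:.1f}, CTR {ctr:.2%}")
```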

For additional information, please see Google’s resources: “About Search Engine Optimization” and Linking to Google Analytics.

Forward Thinking – Future SEO

Thursday, October 18th, 2012

Yesterday at PubCon 2012 was probably the most info-packed day of the week. From the early morning news that Internet Marketing Ninjas in New York acquired WebmasterWorld Online Forums to the jam-packed late afternoon Authorship session in a crowded Exhibit Hall A, my notebook was brimming and my brain was spinning. No, not from comped casino beverages, but from a virtual dump truck of information exchanged here at the second big day of SEO sessions. Standout tracks from the day were the Bing session and the Authorship discussions.

Future of SEO - PubCon 2012 Session

Duane Forrester from Bing delivered an amazing presentation that has so far been my favorite track of the conference series. Entitled “What’s Up at Bing?”, it was less of a “here’s what we’re doing” type of discussion and more of a layout of the future of search and the tools Bing has at the ready for all inquiring SEO minds. The main thing to remember when thinking about SEO in 2012 and beyond is to see the act of search less as a one-time action and more as a lengthy session. Forrester pointed out that for many things (such as a cup of coffee, a slice of pizza, or a recipe) the search may be a one-and-done type of deal, but for a substantial purchase, or to find a family doctor, the searcher is going to do considerable research. They are going to spend hours, even days, gathering and digesting information (good content), specs or credentials, where the business is located (local), and product or service testimonials (reviews and recommendations in their social circles). According to Duane Forrester’s data:

“44% of search sessions last a day or more”

and in order to capitalize on this and leverage your product and your brand, you need to be focusing on a different style of search and keyword data than the single-query view most current SEO takes.

A lot to think about and digest? Yes! But that was just the beginning of the knowledge-fest, because hot on the heels of the Bing talk was the Authorship session (otherwise known as rel=”author” tagging in G+). That’s been the rising focus of many SEO campaigns since August of 2011, and it was clear that the mechanics are still as hard to grasp for newcomers to the topic as they are for those of us who have been trying to keep up with Google’s changing preference for implementation methods over the last few months. There were furrowed brows, many questions, and some blank stares in the large crowd listening to Eric Enge, Stephan Spencer, and Jim Boykin shed light on the subject. At Page 1 Solutions we’ve been following all the changes and keeping up with the most current info regarding authorship, so there was not a lot of new info for us here, but it is good to know that we are solidly on the right track. Go Page 1 Ninjas!

The rest of today will be spent listening to thoughts on competitor “stalking” (aka – gathering intelligence about what your competitors are doing and analyzing the data). More about that and the final thoughts on the conference as PubCon wraps up later this evening.

by Tammy Smith, SEO Analyst, Page 1 Solutions, LLC

Day 1 at PubCon 2012 – G+ and Disavow Links

Wednesday, October 17th, 2012

PubCon 2012 Las Vegas, NV – Day 1

Yesterday was the first full day of conference sessions at the PubCon conference in Las Vegas. True to Vegas style, it was energetic, loud, and full of pumped-up music and flashing lights. The keynote, about the psychology of motives and persuasion by Dr. Robert Cialdini, was an informative introduction to the day ahead.

There were a lot of new people to meet, and an abundance of great information to be digested. I am proud to be a part of an industry that thrives on knowledge and has such passion for guessing the next piece of the puzzle in the game of SEO. As a whole, the SEO community is very exciting and forward thinking. Everyone I met on Tuesday was eager to share what makes them passionate and what gets them excited about working in this field. I was asked questions like: Do you find it difficult to traverse the Google landscape since the changes in local+? What types of A/B testing do you do with your websites and what has been the most influential revelation from that testing? How do you leverage Pinterest for a local small business? Do you Foursquare? Authorship and Search, how huge has that become, and where will it lead? It’s enough to make a nerdy girl like me giddy.

I am amazed at the number of topics covered in just a few hours on the first day at PubCon, and I’m grateful for the chance to be a part of such an inquisitive and passionate community.

Two big takeaways for me from the first day of the conference:

  1. G+ linked content gets indexed faster than content linked from Twitter and Facebook (more about this to come in a follow up blog, but for now, know that it is pretty important to get set up on G+ and utilize it to promote your website and blog content.)
  2. Matt Cutts announced the ability to Disavow Links in Google Webmaster Tools (this is HUGE; a sketch of the disavow file format appears below)

Matt Cutts announces the Disavow Tool
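For anyone who wants to act on the announcement right away, the disavow file Google accepts is just a plain text upload: one linking URL or domain per line, with “#” lines treated as comments. Here is a minimal sketch of generating one; the domains and URLs are placeholders, not real examples.

```python
# Minimal sketch: build a disavow.txt for upload to Google Webmaster Tools.
# Format: one linking URL per line, or "domain:" to disavow an entire domain;
# lines beginning with "#" are comments. All values below are placeholders.
bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["http://forum.example.net/profile/spun-signature-link"]

with open("disavow.txt", "w") as f:
    f.write("# Requested removal of these links but received no response\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")  # disavow every link from this domain
    for url in bad_urls:
        f.write(f"{url}\n")            # disavow a single linking page
```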

 Best quote of the day?

“Google is really making things complicated!” -Marcus Tober

I’m looking forward to Days 2, 3, and 4 – and sharing some of what I learn with you through this blog.

by Tammy Smith, SEO Analyst, Page 1 Solutions, LLC

Conversion Rate Optimization

Thursday, July 19th, 2012

SEOgadget recently posted on using conversion rate optimization (CRO) to increase leads.  They have developed a methodology to help you overcome the difficulties of getting visitors to convert into valuable leads.  The CRO methodology they have implemented is very simple: identify and target the core barriers to conversion, then test the changes.

Check out their full guide to conversion rate optimization:

The SEOGadget guide to Conversion Rate Optimisation - Infographic
A CRO infographic by SEOgadget.co.uk, read the full guide on SEOmoz

Conversion rate optimization is about removing the barriers to conversion.  You need to identify the weak points of your site’s conversion funnel and build on your site’s strengths.  The way SEOgadget puts it is that CRO is a scientific process of diagnosis, hypothesis, and testing.  You will need to research the barriers to conversion.  Why are people not converting?  Learn about your site…it may have usability issues, weak calls to action and persuasive techniques, or irrelevant page content.  Learn about your users…what are your users looking for?  Are you giving them what they want?

This isn’t a guessing game….do not guess as to why people aren’t converting.  You have the tools to figure out why people aren’t converting and what you can do to change it.

The first thing you need to do is to set up funnels. Analyze where your users are entering and exiting your site.  Identify where they are abandoning your site and where you can improve.  Here are some great tools for creating conversion funnels: Google Analytics, Omniture and Kissmetrics.
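Whichever tool you use, the underlying arithmetic is just step-to-step drop-off. The sketch below shows the calculation on made-up numbers; the step names and visit counts are placeholders standing in for whatever your analytics package reports.

```python
# Minimal sketch: compute step-to-step drop-off in a conversion funnel.
# The steps and visit counts are placeholders; in practice they would come
# from a funnel report in Google Analytics, Omniture, or Kissmetrics.
funnel = [
    ("Landing page", 10000),
    ("Service page", 4200),
    ("Contact form", 900),
    ("Form submitted", 240),
]

for (step, visits), (next_step, next_visits) in zip(funnel, funnel[1:]):
    drop_off = 1 - next_visits / visits
    print(f"{step} -> {next_step}: {drop_off:.1%} of visitors abandoned")

overall = funnel[-1][1] / funnel[0][1]
print(f"Overall conversion rate: {overall:.1%}")
```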

Also, research your analytics.  You can analyze what people are actually doing and what is happening on the site.  You can see what keywords they are searching for and what pages they go to.  Of course Google Analytics is an amazing tool, but there are also tools you can use for testing, including ClickTale, CrazyEgg, Ethnio, Usertesting and Whatusersdo.  These programs will show you exactly where people are clicking on your pages and go in-depth as to how many clicks, when they click, what they search, etc.  My favorite tool is CrazyEgg!

The next step is to identify the barriers that users may have.  Give them the option to ask open-ended questions and get the information they need on the spot.  Some good survey tools are Kissinsights, Kampyle and Pop-Survey.  Good instant help tools are Olark and LivePerson.  Try to limit the information you require them to give.  People are very hesitant to give personal information like their name or phone number.  They want answers and information, so give it to them.  The only thing that may be a good requirement is an email address.  This way you can follow up with the user, which can improve response rates and increase conversion rates.

Test your site yourself.  Pretend you are a user.  See if your contact form is working, secret shop your sales staff, call your customer service number, and test your employees.  There may be simple issues that you are missing.

Sell your site to your audience.  Give your users reasons why they should want to contact you.  Add reviews, accomplishments, achievements, awards, community service, and testimonials to your site.  Let them know why you stand out and why you are “better” than your competitors.

Once you have diagnosed your errors and barriers, set a plan of action for how you are going to improve your site.  After you have made improvements on and off of your site, begin testing and repeat the entire process.  SEOgadget says to review your test, analyze the analytics, and compare the results to what you had before.  Most likely you will have better results and more conversions!

 

Creating a Search Engine Friendly Title Tag

Friday, July 6th, 2012

So we have all been doing it…cramming as many keywords as we can into the 70 characters that Google “kindly requests” we limit our title tags to.  But it seems that things have changed a bit.  Previously, Google would only display 70 characters and cut off the SERP title after that.  Now, based on an experiment, Google seems to care more about how wide your title is: it is concerned with the pixel width of your title, regardless of how many characters it contains.

Google no longer puts a limit on the number of characters in your SERP title. Rather, it limits the title based on the pixel width.  To sum it up, the old rule of “70 characters or less” is no longer being used.

Read more here: http://www.seomofo.com/experiments/serp/google-snippet-07.html

Google has also been frowning upon long title tags.  The Google Inside Search blog released its monthly list of algorithmic tweaks for May, and three of them specifically had to do with how titles are displayed.

  • “Trigger alt title when HTML title is truncated.”  Google’s algorithms are designed to provide the best possible title; when the HTML title is too long to display, an alternative title can be triggered instead of a simple truncation.
  • “Efficiency improvements in alternative title generation.”  Google has improved the efficiency of its title generation systems and focused them more on the set of titles actually shown in search results.
  • “Better demotion of boilerplate anchors in alternate title generation.”  When displaying titles in search results, Google’s goal is to avoid anything that does not describe the page, so text that is not useful gets eliminated.

When your title tag is too long, Google may algorithmically determine a better title for the page rather than simply truncating yours.  This makes having a short, search-engine-friendly title tag more important than ever.  Without one, Google may replace your title with just about anything, and that replacement could cost you the opportunity to entice users to click on your page.

Final recommendation: make sure your title tags are short and keyword-rich, and be conscious of how wide they are.
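If you want a rough sense of how wide a title will render before you publish it, you can approximate it with per-character pixel widths. This is only a sketch: the character widths and the ~500px cutoff are assumptions for illustration, not Google’s actual font metrics or limit.

```python
# Rough sketch: estimate a title's rendered width in pixels instead of
# counting characters. The per-character widths and the 500px cutoff are
# assumptions for illustration, not Google's actual metrics.
NARROW = set("ijltf.,'!|[]() ")
WIDE = set("mwMW")

def estimate_width_px(title, narrow=5, normal=9, wide=14):
    return sum(narrow if ch in NARROW else wide if ch in WIDE else normal
               for ch in title)

title = "Denver Cosmetic Dentist | Porcelain Veneers & Teeth Whitening"
px = estimate_width_px(title)
verdict = "likely truncated or rewritten" if px > 500 else "likely displays in full"
print(f"~{px}px wide: {verdict}")
```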

Does being listed on a directory help my SEO results? SEO Basics XX

Friday, December 30th, 2011

Listing in directories is one way of building links and can bring value to the site, provided we make sure that the directories are credible and have authority in the eyes of the search engines. Google is said to prefer directories that are human edited and that have very specific categories for listings (e.g., the Yahoo Directory).

Here are some points to keep in mind:

1. Does the Directory Have a Good PageRank?
If the directory has a low PageRank, it probably does not hold much authority with Google.

2. Are Category Pages Indexed? To be more specific, is the category page you wish to be listed in indexed in Google?
If the category page that you wish to be listed in is not indexed in Google, then the page has no power to improve your site’s link popularity: because Google does not know about the page, it cannot recognize that it is linking to you.

3. How Many Sites Are You Sharing the Link Juice Wealth With?
In most cases, an interior directory category page will contain ten listings, maybe twenty at the most. If the directory has too many listings on the same page, it may not be worth getting listed there unless it is a proven source of traffic.

4. Does the Directory Add the NOFOLLOW Attribute To Its Listings?
If you are submitting to a directory to help improve your site’s link popularity and they are adding the nofollow attribute to outgoing links, it is not going to help you. Unless you are expecting to get some traffic from the directory, avoid it. (A quick way to check a category page for nofollow links is sketched after this list.)

5. Will the Directory Send You Any Traffic?
It is worthwhile to keep track of the traffic received from a directory to decide whether it is a reliable source of traffic for your website.
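Points 3 and 4 can be checked in a few lines of script. The sketch below counts outbound links on a directory category page and how many of them carry rel="nofollow"; it assumes the requests and beautifulsoup4 packages, and the category URL is a placeholder.

```python
# Minimal sketch: on a directory category page, count outbound links and how
# many carry rel="nofollow". Assumes the requests and beautifulsoup4 packages;
# the category URL below is a placeholder.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

def nofollow_report(category_url):
    directory_host = urlparse(category_url).netloc
    html = requests.get(category_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    outbound = [a for a in soup.find_all("a", href=True)
                if a["href"].startswith("http")
                and urlparse(a["href"]).netloc != directory_host]
    nofollowed = [a for a in outbound if "nofollow" in (a.get("rel") or [])]
    return len(outbound), len(nofollowed)

total, nofollowed = nofollow_report("http://directory.example/widgets/colorado/")
print(f"{total} outbound listings on the page, {nofollowed} are nofollow")
```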

How Is Mobile Search Different from a Desktop Search?

Tuesday, December 20th, 2011

It is estimated that more people will be accessing Internet information via a mobile device than via a personal computer by the year 2013 (that’s not very far away, folks). As we know, most standard websites do not render well on a smartphone due to the smaller screen size or the use of incompatible plug-ins (such as Flash). As many businesses opt for a mobile website to complement their standard website, one wonders what this means for keyword optimization and search trends. Do smartphone users search the same way desktop searchers do?

Interestingly enough, mobile search is used and rendered very differently from desktop search. Here are my Top 5 reasons why this is so:

  • Mobile search is highly geared toward local information. Statistically, 9 out of 10 smartphone searches result in an action. Chances are, if you are looking for something on your phone’s browser (like food, a museum, or a bike repair shop), it is because you want to purchase, find, or visit the searched item. Desktop searches tend to be less action oriented overall and more information oriented in nature. Because of this, pages like Google Local Listings are positioned to rank higher than pages that are not locally oriented. Domains with geo-targeted keywords will also rank well in this system.
  • Google has 97% of the mobile search market share, and its algorithm is different for mobile devices than for desktops.
  • Smartphone screens are much smaller than PC screens, so it will be even more important to snag the top few spots in a mobile search in order to be on the first page.
  • Site loading speed becomes critical in mobile search. A site that takes too long to load, when on-the-go information is needed quickly, will see higher bounce rates than it would in a standard desktop search.
  • Android users are always logged in to Google on their mobile devices.  This means that Android users will be served personalized results more often than folks searching on a PC who may not be logged in. This will obviously change as more and more people begin to use Google+ or search while logged into their Google accounts.  Most users aren’t aware of whether they are logged in or not, and personalized results definitely have an impact on what you will see in your search results.

These are key points to keep in mind when conducting a search on either platform, and even more important to keep in mind when positioning yourself and your business for the future, whether or not you have a mobile website. Keeping a claimed and optimized local listing has always been an important piece in your overall SEO strategy. But now, it seems as if listings such as Google Places and Bing Local could have even more impact on your search results in the future.

by Tammy Smith, SEO Analyst, Page 1 Solutions, LLC

Domain Optimization – SEO Basics XIX

Thursday, December 1st, 2011

Do domains with keywords included help with getting better results?

Generally, keywords in the domain can help you get better results; the domains that seem to perform best are those with an exact match to the targeted keyterm. Also, more often than not, .com domains seem to have an advantage over other extensions.

However, search engines are reported to be placing less and less emphasis on keywords in domain names, both because the tactic has been misused by spammers registering hundreds of “long-tail” domains to rank them for exact-match queries, and because it is now nearly impossible to obtain an “exact match” domain that isn’t too long or irrelevant.

However, if it is possible, getting an exact-match domain or a domain with relevant keywords in it is definitely helpful.

I have had my domain for many years, but I purchased a better one recently. Should I start optimizing the website for the new domain?

If your domain has been around for a while, it will surely have gained equity in the eyes of the search engines. This is very valuable, as it is a slow process to get the search engines to recognize your site as authoritative and credible.

It also depends on the age of the domain you have recently purchased. If it is an established domain with an exact match to your target terms, then it would be beneficial to work on making this domain the main one. If not, then it might be better to let the new domain gain credibility with Google and the other search engines.

Why Should I Re-Optimize My Website? – SEO Basics XVIII

Wednesday, October 26th, 2011

Some reasons to re-optimize your website:

  • Your site has enjoyed great rankings for several years, but you’ve now lost your top or first-page listings.
  • Your website statistics show a big drop in the number of people visiting your site.
  • Your site was optimized a long time ago, and the keywords you originally used to gain high rankings no longer work.
  • Your site has undergone a redesign or moved to a different platform.
  • Your competitors seem to be achieving top rankings, and you need to market your site to improve your own.
  • Your business has changed its focus and the priority keyterms have changed.
  • Your location has changed or you have new offices, and you need to update your optimization accordingly.

Strategies to keep in mind while re-optimizing your Website to Increase Rankings and Traffic

    1. Look at the keywords generating the most visits. Are they currently included in your title tags and content? Use keyword research tools to find related keywords that can be added to your web pages, and create extra web pages that target high-priority keywords. (A quick title-tag check is sketched below.)

    2. Do searches for your top-priority keywords and find the competing sites that are showing up at the top of the first page. Analyze these competitor websites for title tags, web copy, and the quantity and quality of their backlinks.

    3. Find related keywords and include these within your web pages to increase web traffic.

    4. Incoming links to your website help your web page rankings increase. Link building is a task that must be done regularly to stay ahead of your competitors. Boost the number of backlinks across your web pages so they can achieve better rankings.

    5. Look for high-quality local directories and review sites where you can register your website and increase visibility for your business.

    6. Optimize and verify your Google Places and Bing and Yahoo Local listings.
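For step 1, a short script can confirm that each priority keyterm still appears in the title tag of the page meant to target it. This is a minimal sketch; the URLs and keyterms are placeholders, and it assumes the requests and beautifulsoup4 packages.

```python
# Minimal sketch: check that each priority keyterm still appears in the title
# tag of the page that targets it. URLs and keyterms are placeholders; assumes
# the requests and beautifulsoup4 packages.
import requests
from bs4 import BeautifulSoup

targets = {
    "http://www.example.com/teeth-whitening/": "teeth whitening denver",
    "http://www.example.com/porcelain-veneers/": "porcelain veneers denver",
}

for url, keyterm in targets.items():
    html = requests.get(url, timeout=10).text
    title_tag = BeautifulSoup(html, "html.parser").title
    title = title_tag.get_text(strip=True) if title_tag else ""
    status = "OK" if keyterm.lower() in title.lower() else "MISSING"
    print(f'{status}: {url} -> "{title}"')
```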

Being #1 Only a Small Piece of the Internet Marketing Pie

Monday, October 24th, 2011

According to a new study, a #1 ranking in organic search accounts for an 18.2% click-through rate on Google and a 9.66% click-through rate on Bing.  The combined click-through rate of the top 10 results (page 1 of the SERPs) is 52% for Google and 26% for Bing. This is a dramatic decrease compared to similar studies done not too long ago, which showed the top ten organic search results garnering anywhere from 63% to almost 90% of the click-throughs. Overall, the data suggests the trend has been on the decline, but why? (more…)

