Thursday, October 6, 2011

Track Social Media Using Google Analytics & Webmaster Tools

Google's web analytics tool, Google Analytics, and Google Webmaster Tools are continuously becoming more advanced, adding more and more features to provide a better user experience.

Both have now been enriched with tools to track social media more effectively. You can use them to track the impact of tweets, Facebook Likes, +1s and a lot more.

If you log in to your Google Webmaster Tools account, you will see a new "+1 Metrics" section. It provides reports on the impact of the +1 button on search: the new reports track the number of +1s on a given page and show how +1s affect your website's clickthrough rate (CTR).

A new Activity report and an Audience report have also been added to Google Webmaster Tools. The Activity report shows the number of +1s your website or its pages have received, while the Audience report shows geographic and demographic data about the visitors who have clicked the +1 button on your pages.
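For reference, these +1 counts come from pages carrying Google's +1 button, which publishers added with a snippet along these lines (the standard form Google documented at the time):

    <!-- Load Google's +1 button script once per page -->
    <script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>

    <!-- Render a +1 button for the current page -->
    <g:plusone></g:plusone>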

Google has also introduced a very useful new Social Plugin Tracking tool for Google Analytics. This new tool can be used to compare the impact of various types of social media activity on your website. Along with +1s, it also tracks tweets, Facebook Likes, Facebook shares and more (a minimal tracking sketch follows the report list below).

The new Social Plugin Tracking Tool generates three types of reports:

1. Social Engagement - tracks changes in visitor behaviour (time spent on site, pageviews, bounce rate, etc.) for visits that involve social plugin activity.
2. Social Actions - tracks the number of social actions (+1s, Likes, tweets, etc.) visitors take on your site.
3. Social Pages - compares the pages of your website by the number of social actions they receive.
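To give a flavour of how this tracking works in Google Analytics' asynchronous (ga.js) syntax of the time, here is a minimal sketch. It assumes the standard ga.js tracking snippet and the Facebook JavaScript SDK are already loaded on the page; the handler below is an illustration, not Google's official integration code.

    <script type="text/javascript">
      // Assumes the asynchronous ga.js tracking snippet is already installed.
      var _gaq = _gaq || [];

      // Hypothetical example: when a visitor clicks a Facebook Like button,
      // the FB JS SDK fires 'edge.create'; record it as a social action
      // with the network name, the action taken and the liked URL.
      FB.Event.subscribe('edge.create', function (targetUrl) {
        _gaq.push(['_trackSocial', 'facebook', 'like', targetUrl]);
      });
    </script>

(At the time, Google tracked +1 clicks automatically in Analytics, so extra code was generally only needed for third-party networks.)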

It is worthwhile to track social media, as social media sites drive a lot of website traffic.

This social media tracking from Google came just after the launch of Google's +1, so you could also say it is a Google strategy to demonstrate the importance of +1 alongside other social media sites. Either way, these new features give website owners a great deal more detail on the impact of social media activity on their websites.

Source- Digital Web Trendz- Digital Marketing News & Updates

Wednesday, April 22, 2009

How To Optimize A Dynamic Website

Internet technologies and e-commerce have advanced a great deal and are still developing day by day. As a result, people prefer a dynamic website for their business or online presence. So, for webmasters and new search engine optimizers who only have experience doing SEO for simple static websites, it becomes necessary to learn how to optimize a dynamic website.

Successful search engine optimization (SEO) of a dynamic website requires methods that are substantially different from, and much more sophisticated than, the SEO techniques used for ordinary, more conventional "static" websites.

In this article you will find some useful and important tips on how to optimize a dynamic website, but first I would like to describe what dynamic websites are.

Introduction to Dynamic Websites:
Nowadays business websites are often dynamic, meaning the pages are built dynamically and allow user interaction; an online shopping cart is a typical example.

Dynamic websites are websites whose pages are generated on the fly, usually built with a programming language such as ASP, PHP or Java. Dynamic sites are often database-driven, meaning the site content is stored in a database and the dynamic code "pulls" the content from it.

Problems in indexing Dynamic URLs:

It is genuinely difficult to get dynamic websites properly indexed in the major search engines unless they are professionally optimized. Even though most search engines claim that they now index the majority of dynamic websites, in practice this still works only in some cases and is limited to a certain number of URLs.

One of the most important reasons dynamic sites have problems getting indexed is that search engines often treat a dynamic URL as a set of infinitely many links: every combination of query parameters can look like a different page.

Nowadays dynamic web pages are often created "on the fly" with technologies such as ASP (Active Server Pages), ColdFusion, JSP (Java Server Pages) and so on. These pages are user friendly and work very well for real users visiting the website, but they usually create a mess for most search engine spiders.

The main reason is that dynamic pages do not even exist until a user actually submits the query or variable that generates them. Search engine spiders are not programmed to enter queries or choose variables, so those dynamic pages never get generated and therefore never get indexed.

Another major difficulty is that search engine spiders cannot reliably read dynamic URLs that contain a query string delimited by a question mark, or other special characters (# & * ! %), which are sometimes referred to as "spider traps"; a URL like www.xyz.com/abcproduct.asp?id=586&sort=price is a typical example. Once a search engine spider falls into one of those traps, it usually spells bad news for that dynamic website.

As a direct consequence of these problems "reading" any level into a typical dynamic database, most search engine spiders have been programmed to detect and then ignore most dynamic URLs.

How to optimize a dynamic website to get it indexed by major search engines:

1. Using URL Rewriting Tools or Software - There are URL rewriting tools and software packages available on the web that convert dynamic URLs into static ones, so it is worth using one of them for your site's dynamic URLs.

For example, Exception Digital Enterprise Solutions offers software that helps change dynamic URLs into static ones.

Changing a dynamic URL into a static one in this way helps it get indexed easily by search engines.

2. Using CGI/Perl Scripts - Using CGI/Perl scripts is one of the easiest ways to get your dynamic site indexed by search engines. PATH_INFO (or SCRIPT_NAME) is a server variable available to a dynamic application that contains the URL path of the request.

To make use of it, you write a script that keeps the part of the URL before the query string and assigns the rest (the query part) to a variable. For example, in the dynamic URL

www.xyz.com/abcproduct.asp?id=586

the query part "?id=586" is assigned a variable, say "A", and the URL becomes

www.xyz.com/productname/A

which the search engines can easily index.
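To make the idea concrete, here is a minimal, hypothetical Perl CGI sketch (the alias table and output are invented for illustration) that reads PATH_INFO and maps the static-looking path back to the original product id:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;

    my $q    = CGI->new;
    my $path = $ENV{PATH_INFO} || '';    # e.g. "/productname/A"

    # Hypothetical lookup: map the static alias back to the real
    # query value ("?id=586" in the example above).
    my %alias_to_id = ( '/productname/A' => 586 );
    my $id = $alias_to_id{$path};

    print $q->header('text/html');
    print defined $id ? "Product page for id=$id\n" : "Unknown product\n";

In a real shop the lookup would of course come from the product database rather than a hard-coded table.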

3. Managing Web Servers-

Apache Server - Apache has a rewrite module (mod_rewrite) that enables you to turn URLs containing query strings into URLs that search engines can index. This module isn't always enabled by default, so you may need to check with your web hosting company.
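As a sketch, a rule of this kind in an .htaccess file might look like the following, using the example URL from this article (the /product/586.html pattern is invented for illustration):

    # Serve the dynamic page abcproduct.asp?id=586 under a
    # static-looking URL such as /product/586.html
    RewriteEngine On
    RewriteRule ^product/([0-9]+)\.html$ /abcproduct.asp?id=$1 [L]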

ColdFusion - You need to reconfigure ColdFusion on your server so that the "?" in a query string is replaced with a "/" and the value is passed along in the URL.

4. Static Page Linked to Dynamic Pages - Creating a static page that links to an array of dynamic pages is very effective, especially if you own a small online store. Simply create a static page linking to all your dynamic pages, and optimize this static page for higher search engine rankings.

Make sure to include a link title for all the product categories, place appropriate "alt" tags on the product images, and add product descriptions containing popular keywords relevant to your business. Submit this static page, along with all the dynamic pages, to the major search engines as per each engine's submission guidelines.
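As a rough, hypothetical sketch (product names, paths and keywords invented for illustration), such a static gateway page might contain entries like:

    <h1>Our Product Catalogue</h1>
    <ul>
      <li>
        <!-- Link title and keyword-rich alt text, as recommended above -->
        <a href="/abcproduct.asp?id=586" title="Blue widget at wholesale price">
          <img src="/images/blue-widget.jpg" alt="Blue widget for home use" />
          Blue Widget
        </a>
      </li>
      <!-- ...one entry per dynamic product page... -->
    </ul>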

In this way, if you are going to optimize a dynamic website, all of the SEO tips above can help you do it successfully: your site will get indexed by the major search engines without any problems, and you will have a good chance of placing it among the top ranks. For more of my articles, visit Ezinearticles - Neeraj Srivastava


Monday, May 5, 2008

Useful Tips to Counter Web Spam - Matt Cutts

It is no secret by now that spamming pollutes websites as well as the inboxes of email users. It is a hot topic on many SEO forums among website owners and webmasters, who have taken many steps and actions to combat it that sometimes work and sometimes do not.

According to a post at News.com, you can find Google's pointers on countering Web spam from Matt Cutts, head of Google's Webspam team and an engineer who has been working on the problem for eight years.

During a speech at the Web 2.0 Expo, Matt Cutts explained how to counter Web spam:
"Spammers are human," Cutts said. "You have the power to raise their blood pressure. Make them spend more time and effort...If spammer gets frustrated, he's more likely to look for someone easier."
How? Forthwith, some tips for those who manage their own or others' Web sites.

Use captcha systems to make sure real people, not bots, are commenting on your site. He uses a simple math puzzle--what's 2 + 2?--but he also likes KittenAuth, which makes people identify kitten photos.

One blogger merely requires people to type the word "orange" into a field. "The vast majority of bots will never do that," Cutts said.
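As an illustration (not code from the talk), a trivial human check of this kind is just an extra form field whose expected answer the server verifies before accepting the comment:

    <!-- Hypothetical sketch of a trivial human check on a comment form.
         The server should reject the submission unless human_check is "4". -->
    <form method="post" action="/comment">
      <textarea name="comment" rows="4" cols="40"></textarea><br />
      <label>What is 2 + 2? <input type="text" name="human_check" /></label><br />
      <input type="submit" value="Post comment" />
    </form>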

Reconfigure software settings after you've installed it. A little modification of various settings will throw bots off the scent. "If you can get off the beaten path, away from default software installations, you'll save yourself a ton of grief," he said.

Employ systems that rank people by trust and reputation. For example, eBay shows how long a person has been a member and how satisfied others are with transactions with that person.

Don't be afraid of legitimate purveyors of search-engine optimization services. "SEO is not spam. Google does not hate SEO," Cutts said. "There are plenty of white-hat SEO (companies) who can help you out."

Registering your Web site at Google's Webmaster Central can help you find bogus search-engine optimization tricks others may use on your site, such as keywords written in white text on white backgrounds, he added.

Webexcel Solutions (ISO 9001:2000 Certified) - A Software Development Company and an SEO Company

Saturday, November 17, 2007

Google Owns a Search Engine Optimisation Company

If you own or work with a search engine optimization company, or even if you're just hoping to better your search engine placement, then you are probably aware of the recent acquisition frenzy that took hold among the major search engines. Google paid $3.1 billion for DoubleClick, Microsoft paid $6 billion for Aquantive, and Yahoo paid $680 million for the 80 percent of Right Media that it did not already own and another $300 million for BlueLithium. The companies purchased are all intended to help widen the advertising range of each of the engines in question, and to take advantage of increasingly sophisticated behavioral-based ad-serving technologies that the acquired companies owned.

What many people failed to realize was that when Google purchased DoubleClick, it also became the owner of a very large search engine optimization company called Performics, a wholly owned subsidiary of DoubleClick.

This fact is of course raising some eyebrows in the industry. Google has consistently maintained that there is no way that people can pay for better search engine placement in the organic index, a stance that the company still claims applies despite this recent purchase. In fact, a portion of Google's published guidelines about SEO says, "While Google doesn't have relationships with any SEOs and doesn't offer recommendations..." In another portion, Google says "While Google never sells better ranking in our search results..." However, anyone who hires search engine optimization company Performics is of course now paying Google for better search engine placement. It seems like a pretty black and white issue, but Google would obviously prefer that it was kept delightfully blurry.

Read more in detail at- Google Owns a Search Engine Optimization Company

Software Development Company provides Search Engine Optimisation services.

Join SEO Forums

Tuesday, November 6, 2007

Matt Cutts Confirms Google PageRank Update

Most people had been waiting for a PageRank update from Google since the last week of July this year. This was natural, because a Google PR update generally takes place every 3-4 months, and the last PR update took place in April of this year.

In the first week of August, people saw a little fluctuation in the PR and backlink counts of their sites in some places. But this fluctuation happened on only a few sites, so people were confused about whether the PR update had been completed or was still to come.

In the last week of October, people again saw a big fluctuation in the PR and backlink counts of their sites, and hoped that the PR update was finished this time. And people were right, as Matt Cutts confirmed the PageRank update in an email to Search Engine Journal on 29th October.

Matt Cutts emailed Search Engine Journal last night to let us know that in fact, the partial Google ‘Toolbar’ PageRank update which happened last week was a result of Google’s campaign against paid linking and advertisement links which influence PageRank.

Read the details at Matt Cutts Confirms Google PageRank Update

Software Development Company provides Search Engine Optimisation services.

Thursday, October 25, 2007

TOP 10 COMMON SEO MISTAKES

Webmasters spend a lot of time and effort on optimizing a website. They try different strategies to get higher search engine rankings for their website, but some of them fail to get the desired results simply because of a few mistakes. In this article I have listed some common mistakes which should be avoided while optimizing a website, as they may have an adverse effect on your search engine rankings:

1. Keyword Stuffing: Putting the same keyword in again and again, or using a hundred different spellings or tenses of the same keyword in your keyword meta tag, is known as keyword stuffing and is considered spam by search engines. You must avoid it, as it may harm your search engine rankings.

2. Duplicate Content: Make sure to have unique and informative content for users on all web pages, related to your business. Having the same content on different pages of your website must be avoided, as it may have an adverse effect on your search engine rankings.

3. ALT Tags: Do not forget to add an ALT tag when using images on your website. You can also use your targeted keywords in ALT tags. This makes your website more accessible to search engine spiders (see the example after this list).

4. Framesets: Avoid using framesets, as frames-based pages behave differently from standard web pages, affecting essential functions like printing and navigation, and search engines cannot index framesets properly.

5. Navigation and internal linking: Proper navigation and internal linking also matter a lot. The navigation menu should be easily accessible to users. Make sure that the anchor text linking to pages within your own website is relevant to the target page.

6. Anchor Text of Inbound Links: Having a lot of inbound links is not enough; the anchor text of those links is also very important. The anchor text should be targeted at your main keywords, and the web page it points to should contain those keywords.

7. Redirects: A redirect re-routes users from one URL to another, and search engines may treat it as an attempt to trick them. Avoid redirects where you can; if one is necessary, use only a '301 redirect', which is the safest method (see the example after this list).

8. Cloaking: Cloaking is a technique used by some webmasters to show search engine spiders different pages from the ones normal visitors see. You should always avoid any form of cloaking, as it is strictly forbidden by most major search engines.

9. Over Optimisation: Over-optimization shows that your site has been designed for search engines and not for users. It may drop your search engine rankings, as search engines are now able to detect over-optimized sites, so you must avoid it.

10. Impatience: Search engine optimization requires a lot of patience. You must wait a few months for results after optimizing your website. Have a little patience and you will get the desired results if you have properly optimized your website using ethical SEO techniques.
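Two quick illustrations for tips 3 and 7 above. First, a descriptive, keyword-relevant ALT tag (file name and alt text invented for illustration):

    <img src="/images/red-running-shoes.jpg" alt="Red running shoes for men" />

And a safe 301 (permanent) redirect, sketched here as an Apache .htaccess rule with hypothetical paths:

    # Permanently redirect the old URL to the new one
    Redirect 301 /old-page.html http://www.example.com/new-page.html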

Software Development Company provides Search Engine Optimization and PPC Services.
Please visit SEO Forums
