Friday, December 31, 2010

The Benefits Of Web Analytics !!


First of all, what are web analytics?

The Official WAA Definition of Web Analytics: Web Analytics is the measurement, collection, analysis and reporting of Internet data for the purposes of understanding and optimizing Web usage.



Analysis

Web analytics has been gaining steady popularity in the online community and shows no signs of slowing. It is a great tool for examining your latest website trends and your visitors' or users' preferences in terms of site features. Here are some very general examples of the benefits of web analytics.

* It helps monitor your visitors and users

With web analytics, you know how long your visitors stayed on your site, who they are, and what source they came from. You can see their clickstream activity, the keywords they may have used to reach your site, and how they came to enter your site in the first place. You can also see the number of times a visitor returned to your site and which pages took preference over the others. All in all, this is information that is vital for making constructive website changes.

* It can help you optimize your website

Once you have carefully studied the actions of your visitors, you will be able to act on what you have learned. This puts you in a great position to write better-targeted ads, strengthen your marketing initiatives and create higher-converting websites. You can improve, streamline or reshape site navigation to better assist your visitors and improve their overall browsing experience.

* It can help you formulate a sales and e-marketing plan

Web analytics will be able to assist you in the preparation of an e-marketing plan. The plan will be more effective because it will be based on solid facts and not mere probabilities. You'll know what is popular on the site, and what your market likes and wants. By tracking highly viewed items, you will learn which features receive the most interest. You can even use analytics data to enhance other programs you already have in place, such as PPC. Then you can work on expanding your client base, as well as retaining your current customers.
Conclusion

An application like Google Analytics can help you achieve all of the above - but in order to turn your data into information - that is, to make sense of it and fully understand what it means - it is helpful to use a business intelligence solution that can open your eyes to the real potential of your website in just a few clicks.

Can you think of any more? We'd love to hear your thoughts.

Read more: The Benefits Of Web Analytics

Tuesday, December 28, 2010

Common Web Analytics Issues !!


Having frequently been involved with the web analytics process, I have noticed some consistent issues with web analytics from both an agency and an in-house perspective. I am not talking about data quality or even vendor selection; I am talking about how web analytics fits strategically within an organization.


Analytics is not a priority: In many cases web analytics is an afterthought and is not implemented during a site launch or during a sponsored/email campaign. Web analytics needs to be given more priority and should be thought about before any marketing campaign is implemented, so that you can actually quantify what you got for the dollars you budgeted and spent on the marketing.


The right stakeholders are not getting the right data: If the same dashboard is given to every person involved with your online strategy, then you're not allowing them to make informed business decisions about their part of the overall plan. Customized reporting is an absolute must - show the marketing manager leads (SEO vs. PPC), show the online marketing team keyword referrals/ROI by source, show the CEO/CFO sales and revenue numbers, show the IT team site errors/traffic spikes, and show the usability team barriers within conversion funnels.

Too much data and not enough resources: In both the in-house and agency worlds there comes a time when analysts are bombarded with so many requests that they simply can't keep up. Web analytics is an extremely important tool for showing the performance of a business and how best to tweak that performance, so WHY NOT add some more resources to it?


Tough and tedious to find good analysts: It is difficult to find analysts who have the technical ability to implement a tracking solution but also the marketing savvy to know what recommendations to offer once the data has been collected. However, there are a few good ways to train a new analytics analyst: get them involved with the SEO/PPC teams so they better understand the business, give them a mix of reading and scenario-based training, give them some work that is outside their comfort zone, work with them through an analysis or deliverable, send them to SEMPhonic for some analytics training, and finally see if they're still passionate after all of this.

Source: Common Web Analytics Issues

Monday, November 29, 2010

Google Analytics Data Skewed Because Of Instant Previews !!


There are confirmed reports that Google Instant Previews may be skewing your web analytics data.

It appears that in some cases Google will conduct an on-demand fetch of your page to dynamically create an Instant Preview. The on-demand fetch happens when a searcher places his mouse over the search result on Google and the image preview comes up. Some analytics tools, including Google Analytics, may consider that a visit, because Google Instant Preview is actually visiting the page in real-time to get that on-demand Instant Preview.

There are several complaints about this issue in the Google Help forums. Google's John Mueller replied, saying they are working on preventing this from showing up in Google Analytics. “We’re working on a solution for this, to prevent Google Instant Preview on-demand fetches from executing Analytics JavaScript,” John said.

There is no estimated time for when this may be resolved. So if you have seen a skew in your analytics data in the past two weeks, this may be the reason.
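Until Google's fix rolls out, one stopgap is to strip out hits that look like they came from the preview fetcher before counting visits. The Python sketch below is a minimal illustration of that idea, not an official workaround: the field names and the "Google Web Preview" user-agent substring are assumptions, so check the exact string in your own logs before relying on it.

```python
# Minimal sketch: filter out hits that appear to come from the Instant
# Previews fetcher before counting visits. Field names and the user-agent
# substring are assumptions; verify the real string in your own data.
hits = [
    {"page": "/home", "user_agent": "Mozilla/5.0 (Windows NT 6.1) Firefox/3.6"},
    {"page": "/home", "user_agent": "Mozilla/5.0 (en-us) AppleWebKit/525.13 "
                                    "(KHTML, like Gecko; Google Web Preview)"},
]

human_hits = [h for h in hits if "Google Web Preview" not in h["user_agent"]]
print(len(hits), "raw hits ->", len(human_hits), "after removing preview fetches")
```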

Read More: Google Analytics Data Skewed Because Of Instant Previews

Friday, November 19, 2010

Troubleshooting Google Analytics Goals and Funnels !!


Objective:-
In this module you'll learn some of the most common reasons why goals and funnels aren't functioning properly, and how to fix them.
Troubleshooting Goals That Aren't Being Tracked

One way to check if you have written your Goal URL correctly is to see if the page is being tracked. Search the Top Content report for the goal page to confirm the page is properly tracked and counted as a goal. Please see the examples below for further information.

I. Exact Match/Head Match:

Search the Top Content report (found underneath the 'Content' section) for the request URI of the goal URL.

For example, if your goal URL is www.example.com/cats/prettycatcheckout.html, then the request URI is everything after the domain name: '/cats/prettycatcheckout.html'.

If the request URI appears in the Top Content report, then the goal URL is written correctly. However, if it doesn't appear, please see troubleshooting tips here.
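As a quick illustration of where the request URI comes from, here is a minimal Python sketch (with a hypothetical full URL); the request URI is simply the path portion that follows the domain name:

```python
from urllib.parse import urlparse

# Minimal sketch: the request URI is everything after the domain name,
# which is what goal settings are matched against.
goal_url = "http://www.example.com/cats/prettycatcheckout.html"
request_uri = urlparse(goal_url).path
print(request_uri)   # -> /cats/prettycatcheckout.html
```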

II. Regular Expression Match

Search your Top Content report using your regular expression. The report filter allows regular expressions, so your goal page should appear if your regular expression is written correctly. If the goal page doesn't appear, please see the troubleshooting tips here.

Tip: If the goal page doesn't appear in your Top Content report the first time you search, try modifying your search until it does appear. Then use the search result from your modified search query as your Goal URL.
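Searching the report with your regular expression is a fair test because the report filter performs the same kind of pattern match against request URIs. The Python sketch below illustrates that matching logic with a hypothetical pattern and URIs; it is not Google Analytics' actual code.

```python
import re

# Minimal sketch: a regular-expression goal should match the request URIs
# of the pages you expect to count as conversions (pattern and URIs are
# hypothetical).
goal_pattern = re.compile(r"^/cats/.*checkout\.html$")

request_uris = [
    "/cats/prettycatcheckout.html",
    "/cats/fluffycatcheckout.html",
    "/dogs/checkout.html",
]

for uri in request_uris:
    result = "counts toward the goal" if goal_pattern.search(uri) else "no match"
    print(uri, "->", result)
```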
Troubleshooting Funnel Drop-Offs

If you have funnel steps with URLs to different domains or subdomains, and the tracking code isn't customized as described in the following articles, then all visitors that go from the website in the first step to any other domain or subdomain will appear to drop off in your reports. Learn how to track multiple domains or subdomains in Section 1: Installing Google Analytics on Complex Websites.

For example, suppose a funnel is set up as follows:

Goal URL: www.secondsite.com/jkl.html

Funnel Steps:
Step 1 (you've checked off 'Required step' for Step 1 on your Goal settings page): www.firstsite.com/abc.html
Step 2: www.firstsite.com/def.html
Step 3: www.secondsite.com/ghi.html

In this example, if the tracking code is not customized as needed for multiple domains, then users will appear to drop off at Step 2, because a new session is created when the user goes from www.firstsite.com to www.secondsite.com. In the new session, Google Analytics will not know that the user actually visited Step 1, because it happened in the previous session. Since the first step is required in this funnel, it will also appear as if the user did not convert toward the goal, and Step 3 and the goal URL will not be recorded in the funnel since they did not occur in the same session as the required step.
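The session split can be pictured with a minimal sketch using the hypothetical funnel above; this is an illustration of the reporting effect, not Google Analytics' actual processing. Because the cookie is not shared across domains, the same person's clicks arrive as two separate sessions, so the required first step is never seen in the same session as the goal.

```python
# Minimal sketch: one user's clicks split into two sessions because the
# tracking cookie is not shared across the two domains.
session_a = ["www.firstsite.com/abc.html", "www.firstsite.com/def.html"]
session_b = ["www.secondsite.com/ghi.html", "www.secondsite.com/jkl.html"]

FUNNEL = ["www.firstsite.com/abc.html",    # Step 1 (required)
          "www.firstsite.com/def.html",    # Step 2
          "www.secondsite.com/ghi.html"]   # Step 3
GOAL = "www.secondsite.com/jkl.html"

def funnel_progress(session):
    """Count how many funnel steps this session completed, in order."""
    steps = 0
    for page in session:
        if steps < len(FUNNEL) and page == FUNNEL[steps]:
            steps += 1
    return steps

for name, session in [("Session A", session_a), ("Session B", session_b)]:
    steps = funnel_progress(session)
    converted = GOAL in session and steps >= 1    # Step 1 is required
    print(name, "reached step", steps, "- converted:", converted)
# Session A appears to drop off after Step 2; Session B never counts,
# because the required Step 1 happened in a different session.
```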

Read more: Overview of Troubleshooting Google Analytics Goals and Funnels


Monday, November 15, 2010

In web analytics, everything is relative !!


What's a good bounce rate for my web site?

I get that kind of question a lot. What's a 'good' bounce rate? A 'good' time on site?

The answer, I'm afraid, is: Better than your current bounce rate. Better than your current time on site.

In web analytics, it's best to focus on your own data and on improving. Use yourself as the benchmark. This is your best strategy for two reasons:
Lack of accurate benchmarks

Accurate, internet- or industry-wide data on keyword searches, or competitors, or just about anything else, is scarce. Non-existent, really.

1. 'Panel'-based statistics like Compete.com (which I love) and Alexa (which I'm starting to like again) sweep in an incredibly wide range of web sites. The bounce rate on your online bike shop won't compare to, say, the bounce rate on the New York Times web site.
2. Statistics within your own industry will include outliers at both ends of the spectrum: At one end are the companies that have invested 100x your budget to become the shining pinnacle of conversion rate optimization. At the other, you'll be comparing yourself to sites designed according to 1992 best practices. Even if you can narrow down the data in #1, it'll be inaccurate.
3. Keyword data from Google is about as trustworthy as a credit default swap.
4. Keyword data from other sources may be more trustworthy, but shows you a tiny sliver of total search traffic.

Numbers lie

Even if you could get accurate benchmarks, they still lie. Your business isn't like your competitors', no matter how similar they seem. Competitor A just fired his head of sales, so conversion rates tanked for a month. Competitor B happened to get on Channel 5 News. Her traffic tripled, lowering her conversion rate, too - but her sales skyrocketed.

Unless you've got the whole story, the numbers will lie. And you can't get the whole story.
Focus on improvement

So, if you're trying to figure out how many visitors you should be getting for 'slobber knocker', the answer is? More than you get right now.

If you're trying to figure out where your conversion rate should be? Yep. Better than what you're getting right now.

That's what web analytics are for: Helping you improve. Which, as it happens, is also how you beat your competitors.


Read more: In web analytics, everything is relative

Saturday, November 6, 2010

Web Analytics Can also Track Offline Campaigns !!

Web analytics is not just a tool for measuring Web site traffic. Web analytics applications can also help companies measure the results of traditional print advertising campaigns.

That's what New York-based retailer BuiltNY discovered when it began running a four-issue print campaign in Dwell magazine in August. The company's "design-focused" neoprene tote bags for things like wine, lunch, or laptops are sold directly on its Web site and through resellers in 30 countries.

The campaign began in Dwell's September issue with a quirky expose of what various items look like in a BuiltNY bag through an airport x-ray machine. The ad itself featured a candid letter from BuiltNY, superimposed on one of the images, describing the events leading up to the creation of the ad.

To track the print campaign, BuiltNY put a unique, easy-to-remember URL in the ad, which was only in use for that campaign. That landing page shows colorful x-ray images of objects like wine bottles, lunches, seashells and beach gear, all inside the appropriate BuiltNY bag.
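One common way to wire up such a vanity URL is to have it redirect to the real landing page with campaign parameters appended, so the analytics tool attributes the resulting visits and sales to the print ad. The Python sketch below is a minimal, hypothetical illustration of building such a tagged destination URL; it is not a description of BuiltNY's actual setup.

```python
from urllib.parse import urlencode

# Minimal sketch: a print-ad vanity URL redirects to the landing page with
# campaign parameters attached. All names and values here are hypothetical.
landing_page = "http://www.example.com/xray"
campaign = {
    "utm_source": "dwell",      # the magazine the ad ran in
    "utm_medium": "print",
    "utm_campaign": "xray-sept",
}

redirect_target = landing_page + "?" + urlencode(campaign)
print(redirect_target)
# http://www.example.com/xray?utm_source=dwell&utm_medium=print&utm_campaign=xray-sept
```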

Through Google Analytics, BuiltNY was able to attribute an 800 percent boost in traffic when the ad hit newsstands, and a 40 percent increase in online sales from visitors that came through that URL, Steve Bowden, art director for BuiltNY, told ClickZ.

"We can read it like the Wall Street Journal for our own Web traffic," Bowden said. "Every morning we get an update on how our Web, print and e-mail campaigns are doing, correlated to sales."

"Instead of gathering around the table scratching our heads, we actually have data to show how the campaign is performing," added Aaron Lown, a principal at BuiltNY and its co-creative director.

The second ad in the series, in Dwell's October issue, follows the bare-bones letter approach, with an invitation to browse BuiltNY's new line of cases for cell phones, MP3 players, laptops and other electronics. That ad links to an online game BuiltNY developed with Justin Bakse of VolcanoKit.com, where users battle alien neoprene electronic accessories with a "ballistic champagne bottle."

BuiltNY ran a more traditional test ad in Dwell earlier this year, before it began using Google Analytics. "But I have no idea if it worked," Lown said, since he had no way to track its success. Prior to implementing Google Analytics about the same time the X-ray campaign began, BuiltNY hadn't used any Web analytics products for its first three years in business. "We just had too many other things to do, like design new products, run our business...," Lown said.

These print ads account for a majority of BuiltNY's marketing spend, which is balanced out primarily with public relations outreach, quarterly e-mail campaigns, and a small AdWords campaign, Lown said. "This is the first advertising we've done, and while it's small in the grand scheme of things, it's our largest ad effort," he said.

BuiltNY is now using Google Analytics to track all of its online and offline efforts. It's also using the data to optimize future campaigns, such as determining the best day and time to send out e-mail communications to existing customers. Because it can now accurately determine the value of its online and offline campaigns, BuiltNY can more confidently spend its limited marketing budget knowing what the return will be, Lown said.

Web analytics applications are often underutilized by small and mid-sized businesses like BuiltNY, which currently has about 30 employees. For many small businesses, the only Web analytics available are simple traffic counting applications available through their hosting provider, which provide little actionable value, Greg Dowling, senior analyst at JupiterResearch, told ClickZ. Those companies eschew more sophisticated analytics applications for reasons of cost, information overload, or confusion over how to utilize the data in their business, he said.

"I think the primary barrier is the lack of internal resources required to effectively establish, monitor, and maintain a Web analytics installation, as well as the cost-prohibitive nature of enterprise class Web analytics platforms," Dowling said. "Additionally, tool complexity prevents users who actually install these applications from getting any real value out of them if they don't have dedicated Web analysts supporting these installations."

Google tackled the first issue head-on by offering its analytics product for free, and is taking on the others with a robust help center and knowledge base of practical applications it calls Conversion University. The money saved by Google Analytics being free can also be put into hiring and training Web analysts or paying for consulting from existing partners, Dowling said.

While the tracking method BuiltNY used for its print ads is straightforward, there are hidden pitfalls that have prevented more implementations, Dowling said.

"While it is technically easy to track offline campaigns through the use of redirect or vanity URLs, the practice is often wrought with complexities and is prone to error, making the data collected highly suspect," he said. "Tighter integration with direct marketing systems that would allow for the tracking of campaign respondents across campaigns both on and offline is becoming available as Web analytics vendors enhance data integration capabilities, but widespread adoption and usage is limited."

Saturday, October 30, 2010

IBM acquires Coremetrics, adds Web analytics Domain

IBM said Tuesday that it will acquire Coremetrics, a privately held Web analytics company.

With the move, IBM enters the Web analytics fray. Coremetrics focuses on everything from social media to marketing optimization to cross-channel retail sales tracking. Coremetrics counts Bank of America, Enterprise, Kraft, Virgin Atlantic, Costco, QVC and others as customers.

According to IBM, Coremetrics will give the company the ability to better track consumer interactions via a software as a service model.

Coremetrics’ software portfolio includes Web and mobile analytics, targeted email and advertising tracking and other reporting and benchmark tools. IBM said it will add Coremetrics to its business analytics offerings. Overall, Coremetrics will ride shotgun with IBM’s existing WebSphere, information management and analytics software.

Coremetrics’ 230 employees will be integrated into IBM and the deal is expected to close in the third quarter.

Source: IBM statement.

Monday, October 11, 2010

Website Path Analysis: A Good Use of Time !!

Path Analysis: the process of determining the sequence of pages visited in a visitor session prior to some desired outcome (a purchase, a sign-up, visiting a certain part of the site, etc.). The end goal is to get sequences of pages, each of which forms a path that leads to a desired outcome. Usually these paths are ranked by frequency.

Is doing Path Analysis a good use of time? In my humble opinion the answer is a rather emphatic no, with one exception (which I’ll discuss below). Almost always, Path Analysis tends to be a suboptimal use of our time, our resources, and any money expended on buying tools that do “great” Path Analysis.

We usually strive to do Path Analysis in a quest to find the magic pill that will tell us exactly what “paths” our visitors are following on our website. If they “follow” the path we intended, we celebrate.

If, as usually happens, it turns out that our visitors don’t “follow” the path that WE want them to follow, then it’s back to the drawing board to redesign the site structure / architecture to get them to “follow” the path, or at times, worse, hours of “analysis” on what the heck they were thinking when they clicked on this button or went to that page (bad customers, bad customers!).

Challenges with Path Analysis are:

* Imagine a website with five pages: page one is Start, page five is Finish. With a simple visualization in your mind you can imagine the number of paths that a visitor could take. Now imagine a website with 100 pages, then one with 5,000 pages. The number of possible paths quickly becomes practically infinite (well, not really, but you get the point).

* Most of our tools do a terrible job of representing this path: click forward, back to home, click forward, reverse to three pages ago, hit buy. In a world of linear path representation at a page level this is really hard to compute, even harder to depict. Yet this is exactly how our customers browse our websites.

* On most websites the most common path is followed by less than five percent of visitors, often closer to 1%. As responsible analysts, could we make any decision based on what such a small fraction of site traffic is doing? (The sketch after this list shows how to check this on your own data.)

* Even if the most common path were followed by 90% of visitors, current Path Analysis has two fatal flaws:
o It can’t show / say which page in a series was most influential in convincing a customer to move on.
o Current tools aggregate traffic into one bucket, when in reality each segment of traffic behaves differently (say DM traffic vs SEM vs “bookmarks” vs Print Ads). Segmentation is always key.
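To see how little traffic the "top path" usually represents on your own site, a minimal sketch like the Python below (with hypothetical sessionized data) is enough: rank the paths by frequency and look at the share of sessions on the most common one.

```python
from collections import Counter

# Minimal sketch with hypothetical data: each session is the ordered list
# of pages viewed during one visit.
sessions = [
    ["/home", "/product", "/cart", "/checkout"],
    ["/home", "/about"],
    ["/home", "/product"],
    ["/product", "/home", "/product", "/cart"],
    ["/home", "/product"],
    ["/home", "/search", "/product"],
]

path_counts = Counter(tuple(pages) for pages in sessions)
top_path, count = path_counts.most_common(1)[0]
share = count / len(sessions)

print("Most common path:", " > ".join(top_path))
print(f"Followed by {share:.0%} of sessions ({count} of {len(sessions)})")
```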

All of the above combine to make Path Analysis a suboptimal way to glean actionable insights that will lead to making our websites more endearing to our customers.

There is one exception to this rule. For structured experiences such as a Checkout or a closed-off DM landing page experience (no navigation, just Next – Next – Next – Submit), Path Analysis can identify where the “fall off” occurs. Once that is identified we will still not know the why (see the Qualitative Metrics post), but Path Analysis is helpful.

Here is an example of a new way of thinking about “Path Analysis” that I think is heading in the right direction. (Please see Disclaimers – Disclosures first.) There are at least three more things I would like to see fixed in this version, but ClickTracks addresses some of the usual fatal flaws here.

* It is possible to break down a linear process into one in which we group a bunch of related pages (say all product pages) into “groups”. This helps fix the problem of linearity, because customers can go from A to B to C or C to A to B and it does not matter for related content.

* It is possible for Visitors to show up in any stage at any point (this is actual behavior now, with SEO influencing where people land). Google Analytics also has this feature (please correct me if others do as well).

* Perhaps the cutest thing is that it shows which page in the “Path” is most influential in moving people to the next stage. This is awesome because one can simply look at the “darker shaded” pages and know, for example, that no one cares about system requirements but rather the page on our 10 year no questions asked return policy is the most important one in convincing people to add to cart.

* It is also quite easy to view how different segments are influenced by different content; in my unreadable screenshot you can see All Visitors vs. Visitors from Google. Imagine this intelligence then turned around and applied to personalization (!).

This is not perfect but getting there and I think all the vendors will soon coalesce around this innovation and we will all be greatly empowered.

Path Analysis as it is practiced currently is ultimately like communism (with sincerest apologies to anyone in my audience who might be offended). There are overt/covert intentions to control things, to try to regulate, to say that we know better than you what you want, to push out a certain way of thinking. I know this sounds extreme, and it is, but it is meant simply for shock value and not to offend anyone.

The web, on the other hand, is the ultimate personal medium, one in which we all like different things, have specific preferences and opinions, and have a certain way we want to accomplish something. The beauty of the web is that all of that is possible, and cheaply, with easily accessible technology. So why do typical Path Analysis, and why try to “push” a certain way of navigating / browsing / buying? Why not get a deep and rich understanding of our customers and then provide them various options to browse our website the way they want to and get to the end goal the way they want to?


Read more: http://www.kaushik.net/avinash/2006/05/path-analysis-a-good-use-of-time.html#ixzz123g8T3Jr

Friday, October 8, 2010

What are the Key Performance Indicators?

Key performance indicators (KPIs) help organizations achieve organizational goals through the definition and measurement of progress. The key indicators are agreed upon by an organization and are measurable indicators that reflect its success factors. The KPIs selected must reflect the organization's goals, they must be key to its success, and they must be measurable. Key performance indicators are usually long-term considerations for an organization.

Some important points of this KPI definition:

* Organizational Goals: It is important to establish KPIs based on your own business goals rather than standard goals for your industry. Expanding on this, a company whose goal is "to be most profitable" will have different KPIs than a company that defines its goal as "to increase customer retention fifty percent." The first company will have KPIs related to finance and profit and loss, while the second will focus on customer satisfaction and response time.
* Measurement Purpose: It is important to analyze KPIs over time, allowing you to make changes to improve website performance – then periodically reevaluate performance to verify progress. For this reason, KPIs must be measurable. The goal "increase customer retention" is useless because there is no quantifiable goal, whereas the aforementioned goal, "to increase customer retention fifty percent" has a definite quantity that can be tracked.

* Goal Continuity: KPIs are long-term considerations designed to help with strategic planning. While it is important to have targeted goals, they should be incremental to an overall success. Simply because something is measurable does not mean that it is significant enough to be a key performance indicator. You must define your KPIs and keep their measure the same from year to year. Not that you can't adjust your goals, but you should use the same unit to measure those goals.

* Managerial Consensus: It is important to have all managers on the same page because personnel from different functions within your company will help create the KPIs. If your KPIs truly reflect your organizational goals then it is necessary for all levels of your company to get with the program. Encourage company unity and enthusiasm for the project and make sure that everyone knows what the KPIs are.

For various websites or online marketing campaigns, KPIs may include the following (a sketch computing a few of these appears after the list):

* Order conversion rate
* Buyer conversion rate
* Cart conversion rate
* Checkout start rate
* Revenue per visit
* Revenue per visitor
* Average order value
* Visits per visitor
* Page views per visit
* Percent committed visitors
* Lead conversion rate
* Home page bailout rate (a personal favorite of mine)
* Average number of items per purchase
* Average time spent on site
* etc
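As referenced above, here is a minimal Python sketch of how a few of these KPIs are computed from aggregate numbers; the figures are made up purely for illustration.

```python
# Minimal sketch with hypothetical aggregates for one reporting period.
visits = 12_000
visitors = 9_500
orders = 240
revenue = 18_600.00

order_conversion_rate = orders / visits      # orders per visit
revenue_per_visit = revenue / visits
average_order_value = revenue / orders
visits_per_visitor = visits / visitors

print(f"Order conversion rate: {order_conversion_rate:.2%}")
print(f"Revenue per visit:     ${revenue_per_visit:.2f}")
print(f"Average order value:   ${average_order_value:.2f}")
print(f"Visits per visitor:    {visits_per_visitor:.2f}")
```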

Wednesday, September 29, 2010

Omniture Products

These are some Omniture web analytics products:

* SiteCatalyst, Omniture's software as a service application, offers Web analytics (client-side analytics).
* SearchCenter+ assists with paid search and content network optimization in systems such as Google's AdWords, Yahoo! Search Marketing, Microsoft Ad Center, and Facebook Ads.
* DataWarehouse, data warehousing of SiteCatalyst data.
* Test&Target, A/B and MVT (multi-variate testing), derived in part from Offermatica and Touch Clarity.
* Test&Target 1:1, Omniture's main behavioural targeting solution, drills down to the individual level of testing.
* Discover, an advanced segmentation tool.
* Insight, a multichannel segmentation tool (both client-side and server-side analytics). Formerly called Discover on Premise, it was derived from Omniture's Visual Sciences acquisition in 2007.
* Insight for Retail, an Insight offering geared toward multiple online and offline retail channels.
* Genesis, a third-party data integration tool (the majority of integrations work with SiteCatalyst).
* Recommendations offers automated product and content recommendations.
* SiteSearch, an on-demand enterprise search product.
* Merchandising, a search and navigation offering for online stores.
* Publish, for web content management.
* Survey, to gather visitor sentiment.
* DigitalPulse, a Web analytics code configuration monitoring tool.
* VISTA, server-side analytics.

Friday, September 17, 2010

Web Visitor Identification Methods !!

Urchin has five different methods for identifying visitors and sessions, depending on available information. Of these, the patent-pending Urchin Traffic Monitor (UTM) is a highly accurate system that was specifically designed to identify unique visitors, sessions, exact paths, and return frequency behavior. There are a number of visitor loyalty and client reports that are only available when using the UTM System. The UTM System is easy to install and is highly recommended for all businesses.

In addition to the UTM System, Urchin can use IP addresses, User-Agents, Usernames, and Session-IDs to identify sessions. Each of these five identification techniques is described in more detail below.

Data Model

The underlying model within Urchin for handling unique visitors is based on a hierarchical notion of a unique set of visitors interacting with the website through one or more sessions. Each session can contain one or more hits and pageviews. Pageviews are kept in order so that a path through the website for each session is understood. In this model, the Visitor represents an individual's interaction with the website over time. Each unique visitor will have one or more sessions, and within each session are zero or more pageviews that comprise the path the visitor took for that session.

Proxying and Caching

In attempting to identify and track unique visitors and sessions, we are basically going against the nature of the web, which is anonymous interaction. Particularly troublesome to tracking visitors are the increasingly common proxying and caching techniques used by service providers and the browsers, themselves. Proxying hides the actual IP address of the visitor and can use one IP address to represent more than one web user. A user's IP address can change between sessions and in some cases multiple IP addresses will be used to represent a cluster of users. Thus, it is possible that one visitor will have different IP addresses for each hit and/or different IP addresses between sessions.

Caching of pages can occur at several locations. Large providers look to decrease the load on their network by caching or remembering commonly viewed pages and images. For example, if thousands of users from a particular provider are viewing the CNN website, the provider may benefit from caching the static pages and images of the website and delivering those pieces to the users from within the provider's network. This has the effect of pages being delivered without the knowledge of the actual website.

Browser caching adds to the problem. Most browsers are configured to check content only once per session. If a visitor lands on the home page of a particular website, clicks to a subpage, and then uses the back button to go back to the home page, the second request for the home page is most likely never sent to the website server but pulled from the browser's memory. An analysis of paths may therefore produce an incomplete path that is missing the cached pages.

As an example, suppose the client's actual path through the website is Page-1, Page-2, back to Page-1, then Page-3, while the apparent path from the server's point of view is simply Page-1, Page-2, Page-3. Before proceeding to Page-3 the user goes back to Page-1; the server never sees this request, so from its point of view it appears the user went directly from Page-2 to Page-3. There may not even be a link from Page-2 to Page-3.

Visitor Identification Methods

As mentioned previously, Urchin has five different methods for identifying visitors, sessions and paths. The more sophisticated methods, which can address the above issues, may require special configuration of your website. The following sections describe the workings of each method in more detail.

1. IP-Only: The IP-Only method is provided for backward compatibility with Urchin 3, and for basic IT reporting where uniquely identifying sessions is not needed. This method uses only the IP address to identify visitor sessions. Thirty minutes of inactivity will constitute a new session. The only data requirements for using this method are a timestamp and the IP address of the visitor.

2. IP-Agent: The default method, which requires no additional configuration, uses the IP address and user-agent (browser) information to determine unique sessions. A configurable thirty-minute timeout is used to identify the beginning of a new session for a visitor. While this method is still susceptible to proxying and caching, the addition of the user-agent information can help detect multiple users behind one IP address. In addition, this method includes a special AOL filter, which attempts to reduce the impact of their round-robin proxying techniques. (A sketch of this sessionization logic appears at the end of this section.)

3. Usernames: This method is provided for secure sites that require logins such as Intranets and Extranets. Websites that are only partially protected should not use this method. The Username identification is taken directly from the username field in the log file. This information is generally logged if the website is configured to require authentication. This method uses a thirty-minute period of inactivity to separate sessions from the same username.

4. Session ID: The fourth visitor identification method available in Urchin is the Session ID method, which can use pre-existing unique session identifiers to uniquely identify each session. Many content delivery applications and web servers will provide session IDs to manage user interaction with the webserver. These session IDs are typically located in the URI query or stored in a cookie. As long as this information is available in the log data, Urchin can be configured to take advantage of these identifiers. Using session IDs provides a much more accurate measurement of unique sessions, but still does not identify returning unique visitors. This method is also susceptible to some forms of caching, including the above example.

In many cases, the ability to use session ids may already be available, and thus, the time required to configure this feature may be short. For dynamically generated sites, taking advantage of this feature should be straightforward. The result is more accurate visitor session and path analysis.

5. Urchin Traffic Monitor (UTM): The last method for visitor identification available in Urchin is the Urchin Tracking Module. This system was specifically designed to negate the effects of caching and proxying and allow the server to see every unique click from every visitor without significantly increasing the load on the server. The UTM system tracks return visitor behavior, loyalty and frequency of use. The client-side data collection also provides information on browser capabilities.

The UTM is installed by including a small amount of JavaScript code in each of your webpages. This can be done manually or automatically via server side includes and other template systems. Complete details on installing UTM are covered in the articles later in this section.

Once installed, the Urchin Traffic Monitor is triggered each time someone views a page from the website. The UTM Sensor uniquely identifies each visitor and sends one extra hit for each pageview. This additional hit is very lightweight, and most systems will not see any additional load. The Urchin engine identifies these extra hits in the normal log file and uses this additional data to create an exact picture of every step taken by the users. This method also identifies visitors and sessions uniquely so that return visitation behavior can be properly analyzed. While this method takes a little extra time to configure, it is highly recommended for comprehensive, detailed analytics.
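As noted under the IP-Agent method above, the core of that technique is simple to sketch. The Python below is a minimal illustration of IP + user-agent sessionization with a thirty-minute inactivity timeout; the hits and field names are hypothetical, and this is not Urchin's actual implementation.

```python
from datetime import datetime, timedelta

# Minimal sketch: a new session starts whenever an (IP, user-agent) pair
# has been inactive for more than thirty minutes. Hits are hypothetical.
hits = [
    {"ip": "1.2.3.4", "ua": "Firefox", "time": datetime(2010, 9, 17, 10, 0)},
    {"ip": "1.2.3.4", "ua": "MSIE",    "time": datetime(2010, 9, 17, 10, 1)},   # same IP, other browser
    {"ip": "1.2.3.4", "ua": "Firefox", "time": datetime(2010, 9, 17, 10, 20)},  # within 30 minutes
    {"ip": "1.2.3.4", "ua": "Firefox", "time": datetime(2010, 9, 17, 11, 5)},   # 45-minute gap: new session
]

TIMEOUT = timedelta(minutes=30)
last_seen = {}   # (ip, user-agent) -> time of the previous hit
sessions = 0

for hit in sorted(hits, key=lambda h: h["time"]):
    key = (hit["ip"], hit["ua"])
    if key not in last_seen or hit["time"] - last_seen[key] > TIMEOUT:
        sessions += 1
    last_seen[key] = hit["time"]

print(sessions, "sessions from", len(hits), "hits")   # 3 sessions from 4 hits
```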

Sunday, September 12, 2010

Measure What Matters: Defining Key Performance Indicators !!


The beauty of Web analytics—and the promise of the Internet—is the ability to capture nearly unlimited amounts of data about your Web site. Without a clear strategy to "measure what matters," your Web analytics initiatives will quickly drown in a sea of data. So how can you turn these incredible data resources into clear and actionable insights? A good place to start is by defining key performance indicators, or KPIs.

This paper is designed to help you understand KPIs and define metrics that support your organizational goals. What data should you focus on? Who needs this information? How different are the data needs and delivery mechanisms between various teams and job functions? And what is the most effective and impactful way to share key metrics across the enterprise?

Sunday, September 5, 2010

What is Hitbox ??



Hitbox is a popular web analytics product created by WebSideStory (now part of Adobe), originally for adult entertainment websites. Some of the services have been declared spyware by several anti-spyware organizations, such as Lavasoft and the makers of Spybot - Search & Destroy. It is now widely used by commercial and other organizations across a variety of industrial sectors as a complete and integrated metrics solution for monitoring web traffic and driving marketing.

If a hitbox program is unknowingly downloaded to your computer, you will generally not be aware of its presence; however, it could slow down your computer. An unseen "hitbox" program may also cause anti-virus or anti-spyware programs to run or take defensive action, which could also slow down your computer. For instance, you may see "TeaTimer" - Spybot's anti-spyware component - running in your task manager.

Many major corporations install hitbox applications on your computer without your knowledge, to track various aspects of your internet activity. For example Lexmark remote technical support installs ehg-lexmark.hitbox.com on your computer when implementing a remote repair.

Use of these unseen "hitbox" tracking programs is considered unethical by many people in the web community. However, because most computer users are unaware of them (they run invisibly in the background), very little is done to stop this practice.

Running a reputable spy-ware program is usually the only way to identify and remove a "hitbox" application from your computer.

Saturday, September 4, 2010

Social Media facts and figures !!


Over the past couple of weeks I have collected a number of interesting facts and figures on social media. As I love this kind of information and lists, I thought you might be interested too. My first conclusion based on this list: social media has taken over the internet in the last year. If you have some interesting additions to this list please feel free to add them; I am very interested in more details on social media and networking.
  • There are currently 350 million Facebook users.
  • 25% of all search results for the top 20 brands are links from social media related websites.
  • 34% of all online bloggers blog about their opinions and views on products or services they use.
  • Google is the number 1 search engine, followed by YouTube.
  • In 2008, 12.5% of all US married couples had initially met each other through a social network.
  • The total online marketing spend on social media was $350 million in 2006; this figure is expected to increase to over $2.5 billion in 2011.
  • 63% of all Twitter users are male.
  • Global users spend a total of 2,600,000,000 minutes on Facebook daily.
  • Flickr currently hosts over 3.6 billion user pictures and photos.
  • Wikipedia contains over 13 million articles in around 260 languages.
  • It is expected that YouTube will host approx. 75 billion video streams and receive 375 million unique visitors in 2009.
  • Within just 9 months, over 1 billion iPhone apps were downloaded or purchased.
  • If Facebook was a country, it would be the fourth largest country in the world. (This while Facebook is ‘banned’ in China)
  • An American study in 2009 found that, on average, online students learn more easily than students who gather information through face-to-face contact.
  • 1 in 6 college students has an online resume
  • 110 million US citizens, or 60% of all internet users, are using social networks. The average user of a social network visits social networks around five days a week and logs in around four times each day, with a total login time of 1 hour. A social network addict (approx. 9% of users) stays logged in the whole day and is "constantly checking out user generated content".
  • The fastest growing group on Facebook consists of females aged around 55-65.
  • Generations Y and Z think that e-mail is outdated.
  • Wikipedia receives more than 60 million unique visitors each month, and it is said that content on Wikipedia is more reliable than any known printed encyclopaedia.
  • Social media has overtaken pornography as the number 1 activity on the internet.
  • Facebook users translated the entire Facebook website from English to Spanish within 4 weeks using a wiki. The cost to Facebook was $0.
  • More than 1.5 million pieces of web content (web links, news stories, blog posts, notes, photos, etc.) are shared on Facebook daily. Over 6.7 billion "tweets" have been sent into the world. Watch the live score: http://popacular.com/gigatweet.
  • Commonly used enterprise social media strategies: Discussion Boards 76%, RSS 57%, Ratings and Reviews of articles or site content 47%, Profiles of Social Networking 45%, Photo Albums 39%, Chat 35%, Personal blogs 33%, Video-user submitted 35%, Podcasts 33%, Social Bookmarking 29%, Video Blogs 29%, Widgets 22%, Mobile Video/image text submission 16%, Wikis 16%, Citizen Journalism 12%, Micro-blogging 6%, Virtual Worlds 4%
  • 52% of all social networkers have friended or become a fan of at least one brand through social networking.
  • 95% of business decision makers worldwide use social networks (source: Forrester Research)
  • 17% of all university and college students have found a suitable job with only their online resume
  • Around 64% of marketers are using social media for 5 hours or more each week during campaigns, with 39% using it for 10 or more hours per week.
  • The online bookmarking service, Delicious, has more than five million users and over 150 million unique bookmarked URLs.
  • The most popular people on Twitter (Ashton Kutcher, Ellen DeGeneres and Britney Spears) have more combined followers than the entire population of Austria.
source: http://www.visitorintelligence.org/social-media-facts-and-figures/

Thursday, September 2, 2010

What is Clickstream ??


A clickstream is the recording of what a computer user clicks on while Web browsing or using another software application. As the user clicks anywhere in the webpage or application, the action is logged on the client or inside the Web server, as well as possibly by the Web browser, routers, proxy servers, and ad servers. Clickstream analysis is useful for Web activity analysis, software testing, market research, and for analyzing employee productivity.

A small observation on the evolution of clickstream tracking: initial clickstream or click path data had to be gleaned from server log files. Because human and machine traffic were not differentiated, the study of human clicks took a substantial effort. Subsequently, JavaScript technologies were developed that use a tracking cookie to generate a series of signals from browsers. In other words, information was only collected from "real humans" clicking on sites through browsers.

A clickstream is a series of page requests; every page requested generates a signal. These signals can be graphically represented for clickstream reporting. The main point of clickstream tracking is to give webmasters insight into what visitors on their site are doing. This data itself is "neutral" in the sense that any dataset is neutral. The data can be used in various scenarios, one of which is marketing. Additionally, any webmaster, researcher, blogger or person with a website can learn how to improve their site.
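As a minimal Python sketch (with hypothetical request records and field layout), turning logged page requests into one ordered clickstream per visitor can look like this:

```python
from collections import defaultdict

# Minimal sketch: group raw page requests into one ordered clickstream per
# visitor. Records and identifiers are hypothetical.
requests = [
    ("visitor-1", "2010-09-02T10:00:00", "/home"),
    ("visitor-2", "2010-09-02T10:00:05", "/pricing"),
    ("visitor-1", "2010-09-02T10:01:00", "/products"),
    ("visitor-1", "2010-09-02T10:03:00", "/cart"),
]

clickstreams = defaultdict(list)
for visitor, timestamp, page in sorted(requests, key=lambda r: r[1]):
    clickstreams[visitor].append(page)

for visitor, pages in clickstreams.items():
    print(visitor, "->", " > ".join(pages))
```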

Use of clickstream data can raise privacy concerns, especially since some Internet service providers have resorted to selling users' clickstream data as a way to enhance revenue. There are 10-12 companies that purchase this data, typically for about $0.40/month per user. While this practice may not directly identify individual users, it is often possible to indirectly identify specific users, an example being the AOL search data scandal. Most consumers are unaware of this practice, and its potential for compromising their privacy. In addition, few ISPs publicly admit to this practice.

Since the business world is quickly evolving into a state of e-commerce, analyzing the data of clients who visit a company website is becoming a necessity in order to remain competitive. This analysis can be used to generate two findings for the company. The first is an analysis of a user’s clickstream while using a website to reveal usage patterns, which in turn gives a heightened understanding of customer behaviour. This use of the analysis creates a user profile that aids in understanding the types of people who visit a company’s website.

As discussed in Van den Poel & Buckinx (2005), clickstream analysis can be used to predict whether a customer is likely to purchase from an e-commerce website. Clickstream analysis can also be used to improve customer satisfaction with the website and with the company itself. Both of these uses generate a huge business advantage. It can also be used to assess the effectiveness of advertising on a web page or site.
With the growing corporate knowledge of the importance of clickstreams, the way that they are being monitored and used to build Business Intelligence is evolving. Data mining, column-oriented DBMS, and integrated OLAP systems are being used in conjunction with clickstreams to better record and analyze this data.

Clickstreams can also be used to allow the user to see where they have been and allow them to easily return to a page they have already visited, a function that is already incorporated in most browsers. Unauthorized clickstream data collection is considered to be spyware. However, authorized clickstream data collection comes from organizations that use opt-in panels to generate market research using panelists who agree to share their clickstream data with other companies by downloading and installing specialized clickstream collection agents.

Source: http://en.wikipedia.org/wiki/Clickstream

Wednesday, September 1, 2010

Logfile Analysis vs Page Tagging

Both logfile analysis programs and page tagging solutions are readily available to companies that wish to perform web analytics. In some cases, the same web analytics company will offer both approaches. The question then arises of which method a company should choose. There are advantages and disadvantages to each approach.

Advantages of logfile analysis

The main advantages of logfile analysis over page tagging are as follows:

• The web server normally already produces logfiles, so the raw data is already available (a sketch of parsing one such log line appears after this list). Collecting data via page tagging requires changes to the website.

• The data is on the company's own servers, and is in a standard, rather than a proprietary, format. This makes it easy for a company to switch programs later, use several different programs, and analyze historical data with a new program. Page tagging solutions involve vendor lock-in.

• Logfiles contain information on visits from search engine spiders. Although these should not be reported as part of the human activity, it is useful information for search engine optimization.

• Logfiles require no additional DNS Lookups. Thus there are no external server calls which can slow page load speeds, or result in uncounted page views.

• The web server reliably records every transaction it makes. Page tagging may not be able to record all transactions. Reasons include:
o Page tagging relies on the visitors' browsers co-operating, which a certain proportion may not do (for example, if JavaScript is disabled, or a hosts file prohibits requests to certain servers).
o Tags may be omitted from pages either by oversight or between bouts of additional page tagging.

o It may not be possible to include tags in all pages. Examples include static content such as PDFs or application-generated dynamic pages where re-engineering the application to include tags is not an option.
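As mentioned in the first point above, the raw material for logfile analysis is the web server's own log. The Python sketch below is a minimal, hypothetical example of pulling the basic fields out of one Apache/NGINX "combined"-format line; it is not any vendor's parser and the sample line is made up.

```python
import re

# Minimal sketch: extract the basic fields from one "combined" log line
# (the sample line below is fabricated for illustration).
LINE = (
    '203.0.113.7 - - [01/Sep/2010:12:34:56 +0000] "GET /products/index.html HTTP/1.1" '
    '200 5123 "http://www.google.com/search?q=neoprene+tote" "Mozilla/5.0 (Windows NT 6.1)"'
)

PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]+" '
    r'(?P<status>\d{3}) (?P<bytes>\S+) "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

hit = PATTERN.match(LINE).groupdict()
print(hit["ip"], hit["path"], hit["status"])
print("Referrer:", hit["referrer"])
```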


Advantages of page tagging

The main advantages of page tagging over logfile analysis are as follows.

• Counting is activated by opening the page, not requesting it from the server. If a page is cached, it will not be counted by the server. Cached pages can account for up to one-third of all pageviews. Not counting cached pages seriously skews many site metrics. It is for this reason server-based log analysis is not considered suitable for analysis of human activity on websites.
• Data is gathered via a component ("tag") in the page, usually written in JavaScript, though Java can be used, and increasingly Flash is used.

• It is easier to add additional information to the tag, which can then be collected by the remote server. For example, information about the visitors' screen sizes, or the price of the goods they purchased, can be added in this way. With logfile analysis, information not normally collected by the web server can only be recorded by modifying the URL.

• Page tagging can report on events which do not involve a request to the web server, such as interactions within Flash movies, partial form completion, mouse events such as onClick, onMouseOver, onFocus, onBlur etc.

• The page tagging service manages the process of assigning cookies to visitors; with logfile analysis, the server has to be configured to do this.

• Page tagging is available to companies who do not have access to their own web servers.

• Lately, page tagging has become a standard in web analytics. The sketch below illustrates the kind of extra data a page tag's beacon request can carry.
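The Python sketch below illustrates the "additional information" point in the list above: the kind of client-side data a page tag can append to its beacon request, which a plain server logfile would never see. The parameter names and the collection URL are hypothetical, not any vendor's actual format.

```python
from urllib.parse import urlencode

# Minimal sketch: extra client-side data a page tag might attach to its
# beacon request. All parameter names and values are hypothetical.
beacon_params = {
    "page": "/checkout/step2.html",
    "screen": "1280x800",              # visible only to client-side code
    "event": "onClick:add-to-cart",
    "order_value": "49.95",
    "visitor_id": "a1b2c3d4",          # read from a first-party cookie
}

beacon_url = "http://stats.example.com/collect?" + urlencode(beacon_params)
print(beacon_url)
```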

Tuesday, August 31, 2010

Misconceptions in Web Analytics

1) The Hotel problem

The hotel problem is generally the first problem encountered by a user of web analytics. The term was first coined by Rufus Evison, who explained the problem at one of the Emetrics Summits, and it has since gained popularity as a simple expression of the problem and its resolution.

The problem is that the unique visitors for each day in a month do not add up to the same total as the unique visitors for that month. This appears to an inexperienced user to be a problem in whatever analytics software they are using. In fact it is a simple property of the metric definitions.

The way to picture the situation is by imagining a hotel. The hotel has two rooms (Room A and Room B).

         Day 1   Day 2   Day 3   Total
Room A   John    John    Jane    2 Unique Users
Room B   Mark    Jane    Mark    2 Unique Users
Total    2       2       2       ?

As the table shows, the hotel has two unique users each day over three days. The sum of the totals with respect to the days is therefore six.
During the period each room has had two unique users. The sum of the totals with respect to the rooms is therefore four.

Actually only three visitors have been in the hotel over this period. The problem is that a person who stays in a room for two nights will get counted twice if you count them once on each day, but is only counted once if you are looking at the total for the period. Any software for web analytics will sum these correctly for whatever time period, thus leading to the problem when a user tries to compare the totals.
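The same arithmetic in code form: a minimal Python sketch with hypothetical visitor IDs, showing that the daily unique counts sum to six while the period really only saw three unique visitors.

```python
# Minimal sketch of the hotel problem with hypothetical visitor IDs.
visits_by_day = {
    "Day 1": {"John", "Mark"},
    "Day 2": {"John", "Jane"},
    "Day 3": {"Jane", "Mark"},
}

daily_uniques = {day: len(people) for day, people in visits_by_day.items()}
sum_of_daily = sum(daily_uniques.values())                   # 2 + 2 + 2 = 6
period_uniques = len(set().union(*visits_by_day.values()))   # {John, Jane, Mark} = 3

print(daily_uniques)
print("Sum of daily uniques:", sum_of_daily, "vs period uniques:", period_uniques)
```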

2) New visitors + repeat visitors do not equal total visitors

Another common misconception in web analytics is that the sum of the new visitors and the repeat visitors ought to be the total number of visitors. Again, this becomes clear if the visitors are viewed as individuals on a small scale, but a failure to understand the metrics still causes a large number of complaints that the analytics software cannot be working.
Here the culprit is the metric of a new visitor.

There is really no such thing as a new visitor when you are considering a web site from an ongoing perspective. If a visitor makes their first visit on a given day and then returns to the web site on the same day they are both a new visitor and a repeat visitor for that day. So if we look at them as an individual which are they? The answer has to be both, so the definition of the metric is at fault.

A new visitor is not an individual; it is a fact of the web measurement. For this reason it is easiest to conceptualize the same facet as a first visit (or first session). This resolves the conflict and so removes the confusion. Nobody expects the number of first visits to add to the number of repeat visitors to give the total number of visitors. The metric will have the same number as the new visitors, but it is clearer that it will not add in this fashion.

On the day in question there was a first visit made by our chosen individual. There was also a repeat visit made by the same individual. The number of first visits and the number of repeat visits will add up to the total number of visits for that day.
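Here is a minimal Python sketch of that same point with three hypothetical visits: first visits plus repeat visits add up to total visits, while "new visitors" plus "repeat visitors" can exceed the number of distinct visitors, because one person can be both on the same day.

```python
# Minimal sketch with hypothetical visits for a single day.
visits = [
    {"visitor": "A", "first_visit": True},    # A's first-ever visit
    {"visitor": "A", "first_visit": False},   # A returns later the same day
    {"visitor": "B", "first_visit": False},   # B is a returning visitor
]

total_visits = len(visits)
first_visits = sum(v["first_visit"] for v in visits)
repeat_visits = total_visits - first_visits

visitors = {v["visitor"] for v in visits}
new_visitors = {v["visitor"] for v in visits if v["first_visit"]}
repeat_visitors = {v["visitor"] for v in visits if not v["first_visit"]}

print(first_visits + repeat_visits == total_visits)                   # True: 1 + 2 == 3
print(len(new_visitors) + len(repeat_visitors), "vs", len(visitors))  # 3 vs 2: A counted twice
```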

What is Web Click Analytics

Web click analytics is a special type of web intelligence that pays particular attention to clicks (point-and-click).

Generally in internet marketing, click analytics focuses on on-site analytics. An editor of a web site uses click analytics to determine the performance of his or her particular site with regard to where the users of the site are clicking, i.e., the various patterns of clicking.
Also, click analytics may happen in real time or "unreal" time, depending on the type of information sought.

Typically, front-page editors on high-traffic news media sites will want to monitor their pages in real time to optimize the content. Editors, designers or other stakeholders may analyze clicks over a wider time frame to help them assess the performance of writers, design elements, advertisements, etc.

Data about clicks may be gathered in at least two ways. Ideally, a click is "logged" when it occurs; this method requires some functionality that picks up the relevant information when the event occurs.

Alternatively, one may assume that a page view is the result of a click, and therefore log a simulated click that led to that page view.

Wednesday, June 16, 2010

Web Analytics Tools

Web analytics, or web intelligence, is the process of collecting and analyzing a website's data in order to create meaningful information about how your site is being utilized by your users. There are plenty of web analytics applications out there, and we probably already know the big guns such as Google Analytics (GA), Webtrends, Omniture, Hitbox, Crazy Egg, and remote-site services such as Alexa and Compete. Some web analytics tools are popular only in particular geographical regions.

Sunday, June 13, 2010

Web Analytics Glossary

These are some important and popular web analytics glossary terms:

Abandonment - The number of customers who drop off during the process of conversion, like a half filled form or incomplete purchase.

Acquisition - The process of attracting visitors to a website, or the number of visitors acquired.

Affiliate Marketing - A method of marketing where other websites can sign up to sell your products for a commission.

Bounce Rate - The instances of visitors entering and leaving from the same page without viewing any other page.

Click Through Rate - Usually used as a measure of banner ad success. It is the number of clicks divided by the number of impressions.

Click Through - This is an instance of a click on a link leading to another section of the site or page, or another website.

Conversion – An activity which fulfills the intended purpose of a website like buying a product, filling up a form or subscribing to a newsletter. Conversion rate is the percentage of visitors who successfully convert.

Conversion funnel – The defined path, such as a series of steps or pages, that a visitor follows to reach the final objective, like filling in a form or purchasing a product.

Cookie - A text file placed on the visitor’s computer while browsing a website. Cookies contain information to track returning visitors.

Crawler – An automated program used primarily by search engines and other services to gather information from the World Wide Web.

Entry Page – The first page viewed by a visitor while browsing through a website.

Exit Page – The last page viewed, rather the page from which the visitor exited.

Filters – A set of rules to extricate information from a large amount of data.

First Party Cookie – These cookies are placed by the websites unlike third party cookies placed by vendors. First party cookies are understood to be secure and reliable.

Hit – An often-confused term: a hit is any request made by the browser to the web server. A web page is a collection of different components such as HTML, images and CSS, each registering as a separate hit with every single request for the page.

Impressions - Each view of an online advertisement is counted as an impression.

Key Performance Indicators – The crucial parameters showing the health of the website and success of marketing strategies.

Keywords – Words and phrases entered in a search engine to reach a result page. Keywords help position websites well to attract potential customers.

Log Files – Text files created on the server capturing all activity on the website. These files are a primary source of data for analysis.

Organic Search – Search results that users reach through unpaid (natural) listings, as opposed to paid placements such as PPC.

Page Duration – The time a visitor spends on a web page.

Page Tags – JavaScript code embedded in the web page and executed by the browser. The data captured by tags is the input for tag-based web analytics tools.

Page Views – Each serving and rendering of a web page for a visitor is counted as a page view.

Path Analysis – Analysis of how visitors traverse the website. It provides valuable information on whether they follow the intended site navigation.

PPC – Pay per click, also called paid search, where the advertiser pays based on the number of clicks on the advertisement. Google and Overture are two popular paid-search providers.

Referrer – The website, search engine, directory or other source identifiable as the origin of a visit.

Return Visitor – A visitor who can be identified with multiple visits, either through cookies or authentication.

Search Analytics – Analyzing search terms and behavior of visitors using the website search engine.

Session – The record of a single visitor browsing through the website. It includes an entry page, navigation and exit pages.

Stickiness – A website’s capability to retain visitors, measured as number of pages visited per session and time spent on website.

Visitor - Also called a unique visitor: an individual visiting the website over a specified period of time. Consecutive actions by the same visitor with no more than 30 minutes between them are counted as a single visit.

Visitor Segmentation – The process of segregating and studying visitors based on various behavior patterns.

Web Analytics – The process of collection, measurement and analysis of user activity on a website to understand and help achieve the intended objective of the website.
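
To make a few of these definitions concrete, here is a minimal Python sketch (with toy data and hypothetical field names, not any vendor's export format) that groups page views into visits using the 30-minute rule and then computes bounce rate and conversion rate.

from datetime import datetime, timedelta

SESSION_TIMEOUT = timedelta(minutes=30)

# Toy page-view log: (visitor_id, timestamp, url). In practice this data
# would come from server log files or from page tags.
pageviews = [
    ("v1", datetime(2010, 6, 1, 10, 0), "/home"),
    ("v1", datetime(2010, 6, 1, 10, 5), "/product"),
    ("v1", datetime(2010, 6, 1, 10, 7), "/checkout/thank-you"),
    ("v2", datetime(2010, 6, 1, 11, 0), "/home"),   # single-page visit: a bounce
    ("v1", datetime(2010, 6, 1, 14, 0), "/home"),   # gap > 30 minutes: a new session
]

def build_sessions(pageviews):
    """Group page views into sessions: same visitor, gaps of at most 30 minutes."""
    sessions = []
    for visitor, ts, url in sorted(pageviews, key=lambda p: (p[0], p[1])):
        if (sessions and sessions[-1]["visitor"] == visitor
                and ts - sessions[-1]["last_hit"] <= SESSION_TIMEOUT):
            sessions[-1]["pages"].append(url)
            sessions[-1]["last_hit"] = ts
        else:
            sessions.append({"visitor": visitor, "last_hit": ts, "pages": [url]})
    return sessions

sessions = build_sessions(pageviews)
bounces = sum(1 for s in sessions if len(s["pages"]) == 1)
conversions = sum(1 for s in sessions if "/checkout/thank-you" in s["pages"])

print("Sessions:", len(sessions))                                        # 3
print("Bounce rate: %.0f%%" % (100.0 * bounces / len(sessions)))          # 67%
print("Conversion rate: %.0f%%" % (100.0 * conversions / len(sessions)))  # 33%

Here the "thank-you" page stands in for the goal that defines a conversion; a real tool would let you configure that goal explicitly.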

Monday, May 31, 2010

Omniture, Google Analytics and WebTrends

There are various web traffic measurement tools such as Google Analytics, WebTrends and Hitbox. In India, Google Analytics is very popular, while companies that have to handle data at a large scale often prefer WebTrends because of its performance, new features and scalability.

Saturday, May 29, 2010

Webmaster, web traffic and web visitors

Web traffic comes from web visitors who are searching for something specific on the internet. A webmaster tries to reach those potential visitors, who may take up or purchase the website's services or products. Webmasters, web traffic and web visitors are therefore closely inter-related.

Facebook, LinkedIn, Orkut and social media

Networking sites such as Facebook, LinkedIn and Orkut are very popular globally these days. These sites provide various facilities and tools to spread information around the world within seconds, and people meet online to share their views. Marketers are adopting these social networking sites as a potential digital marketing channel, since through them they can reach targeted, potential web visitors and end users.

Thursday, May 13, 2010

Social Media Optimization and Digital Marketing

These days social media is a way for people to interact in a digital environment, and marketers increasingly prefer digital marketing through social media networks.

Popular social media marketing sites these days include:


  • Facebook

  • LinkedIn

  • Orkut

  • hi5

SEO Analytics

SEO analytics is closely related to web marketing and to tools such as WebTrends.

Tuesday, April 20, 2010

Web analytics marketing scope

Today every company wants a web presence because of marketing demand; these days, the potential market is reachable through the internet.

Sunday, March 28, 2010

Web Analytics and Accenture Consulting

Various IT companies around the world work on web analytics, but Accenture has delivered genuinely high performance in this area. The company has set many benchmarks for digital marketing and web analytics.

Sunday, March 7, 2010

Web Analytics and Digital Marketing

Web analytics and digital marketing are closely inter-related: marketing strategists can take marketing decisions based on web analytics.

Wednesday, March 3, 2010

What is Mobile Web Analytics?

Mobile web analytics studies and analyzes the behaviour of mobile website visitors, in a similar way to traditional (desktop) web analytics. In a commercial context, mobile web analytics (or mobile intelligence) refers to the use of data collected as visitors access a website from a mobile phone. It helps determine which aspects of the website work best for mobile traffic and which mobile marketing campaigns work best for the business, including mobile advertising, mobile search marketing, text campaigns and desktop promotion of mobile sites and services. There is considerable scope in mobile web analytics these days.
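
One common starting point for separating mobile from desktop traffic is a simple user-agent check over the request log. The Python sketch below is only a rough illustration with an assumed log format and a deliberately naive keyword list; real mobile detection relies on much more complete device databases.

# Naive mobile vs. desktop classification over a toy request log.
MOBILE_HINTS = ("iphone", "android", "blackberry", "symbian", "windows ce", "mobile")

requests = [
    {"url": "/home",   "user_agent": "Mozilla/5.0 (iPhone; U; CPU iPhone OS 3_0 like Mac OS X)"},
    {"url": "/home",   "user_agent": "Mozilla/5.0 (Windows NT 6.1; WOW64)"},
    {"url": "/offers", "user_agent": "Mozilla/5.0 (Linux; U; Android 2.2)"},
]

def is_mobile(user_agent):
    ua = user_agent.lower()
    return any(hint in ua for hint in MOBILE_HINTS)

mobile_hits = sum(1 for r in requests if is_mobile(r["user_agent"]))
print("Mobile share of traffic: %.0f%%" % (100.0 * mobile_hits / len(requests)))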

Friday, February 19, 2010

Global Marketing Based on Web Analytics

Global marketing and web analytics:

In this digital age, marketing strategies are based on web analytics. That is because web analytics tools tell the whole story about web visitors; through web analytics, one can identify and target the potential visitors who may take up a product or service online.

Analytics tools offer many features that track a web visitor's location, such as country, state and city, and this data is very useful for the marketing managers who make the decisions. These are the reasons global marketing now leans on web analytics.
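
As a rough illustration of how location data might be summarised for those managers, here is a minimal Python sketch over a toy visit list; the field names are assumptions for the example, since real tools usually resolve location from the visitor's IP address.

from collections import Counter

visits = [
    {"visitor": "v1", "country": "India", "city": "Gurgaon"},
    {"visitor": "v2", "country": "India", "city": "Bangalore"},
    {"visitor": "v3", "country": "USA",   "city": "New York"},
    {"visitor": "v4", "country": "India", "city": "Gurgaon"},
]

visits_by_country = Counter(v["country"] for v in visits)
visits_by_city = Counter((v["country"], v["city"]) for v in visits)

print("Visits by country:", dict(visits_by_country))
print("Top city:", visits_by_city.most_common(1)[0])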

Web analytics and KPIs

In today's competitive digital marketing, web analytics and key performance indicators (KPIs) are closely interlinked: web analytics reporting is built around KPIs, and different kinds of marketing call for different KPIs. KPIs are very important for market analysis and strategic decision making; today, web marketing managers take decisions based on them, so KPIs play a vital role in marketing decision making.

Wednesday, February 17, 2010

Web Analytics Software

The following are some well-known web analytics software packages:

1) Google Analytics
2) WebTrends
3) Omniture
4) Yahoo Web Analytics
5) AWStats
6) CrawlTrack
7) Webalizer
8) Bango Mobile Web Analytics
9) Hitbox
10) VisitLab
and many more...

Saturday, February 13, 2010

Marketing Decisions Based on Web Analytics

These days many strategic marketing decisions are taken on the basis of web analytics data, and this will only increase day by day.

Scope of Web Analytics

In today's internet age, the scope of web analytics is very broad. People increasingly like to purchase things on the internet; in advanced markets such as the UK, USA, Canada and Australia, buying and selling online has become a trend and a habit. Product makers and sellers now know how much the internet can increase a company's sales, so product and service companies are making strategic analyses based on web data.

The scope of web analytics is broad and will keep increasing, because people are going digital day by day.

Friday, February 12, 2010

What is a Key Performance Indicator in web analytics?

Key performance indicators are very important in web intelligence. A key performance indicator (KPI) is a measure of performance. Such measures are commonly used to help an organization define and evaluate how successful it is, typically in terms of progress towards its short- and long-term organizational goals. KPIs can be specified by answering the question, "What is really important to different stakeholders?" KPIs may be monitored using business intelligence techniques to assess the present state of the business and to assist in prescribing a course of action.
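
As a small illustration of how KPIs might be monitored in practice, here is a hedged Python sketch that compares a handful of measured KPIs against target values and flags the ones needing attention; the KPI names, targets and measurements are invented for the example.

# Hypothetical KPI targets and current measurements (illustrative numbers only).
kpi_targets = {
    "conversion_rate": 0.03,       # at least 3% of sessions should convert
    "bounce_rate": 0.50,           # at most 50% of sessions should bounce
    "avg_pages_per_session": 4.0,  # stickiness target
}

kpi_measured = {
    "conversion_rate": 0.021,
    "bounce_rate": 0.62,
    "avg_pages_per_session": 4.3,
}

# For bounce rate, lower is better; for the other KPIs, higher is better.
lower_is_better = {"bounce_rate"}

for name, target in kpi_targets.items():
    value = kpi_measured[name]
    on_track = value <= target if name in lower_is_better else value >= target
    status = "OK" if on_track else "NEEDS ATTENTION"
    print("%s: measured %s, target %s -> %s" % (name, value, target, status))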

Top Web Analytics IT Companies in India

These are a few of the companies working in web analytics:

1) Accenture, Bangalore, Gurgaon
2) Sapient, Gurgaon
3) Wipro, Pune
4) Birlasoft, Noida
5) IBM, Bangalore
6) Genpact, Gurgaon
7) WNS, Gurgaon
8) TCS, Pune, Gurgaon
9) Patni, Pune
10) GroupM, Gurgaon
11) McKinsey, Gurgaon
12) HP, Bangalore, Chennai, Pune
13) Nagarro
14) Fidelity, Bangalore, Gurgaon
15) Ibibio, Gurgaon
16) American Express
17) Yahoo
18) Google, Hyderabad
19) Microsoft, Bangalore
20) KPMG, Gurgaon
21) MakeMyTrip, Gurgaon
22) Yatra.com, Gurgaon
23) corematrix
24) Omniture
25) E2solutions
26) AOL Time Warner
27) Tech Mahindra
28) Cognizant
29) OPI Global
30) Target Inc
31) ZS Associates, Pune
32) NetApp, Bangalore
33) Cisco, Bangalore
34) Amazon, Bangalore
35) eBay, Chennai
36) Citibank, Chennai (Card Analytics)
37) Dell Analytics, Bangalore
38) HSBC Analytics, Bangalore
39) JP Morgan, Mumbai
40) Standard Chartered Bank, Chennai
41) UBS, Hyderabad
42) Adobe, Bangalore, Noida
43) Deloitte, Hyderabad
44) HLL, Bangalore
45) Boston Consulting, Mumbai
46) Target, Bangalore
47) Tesco, Bangalore
48) Citianalytics, Bangalore, Mumbai
49) Hewitt Associates, Gurgaon
50) Reserve Bank of India, Mumbai
51) Nokia Networks, Gurgaon
and many more

Web Analytics FAQ

1) What do you mean by web analytics?

2) Why web analytics?

3) Why is web analytics so important for today's business?

4) Name five of the most popular web analytics tools other than Google Analytics.

5) What is the difference between unique visitors, visitors and repeat visitors?

6) Explain five of the most important KPIs in web analytics.

7) Have you ever configured and set up goals in Google Analytics?