Sunday, July 31, 2011
How Do You Spell Website Success?
Lick it or not, spelling errors on commercial websites are a turnoff for many people. A recent BBC News article highlighted bad spelling as a potential cause of lost online revenue. In other words, typos could hurt your conversion rate and "cost you deep in the purse" or "deop in the pursa" as it might have been written 500 years ago.
That ancient phrase dates back to a time when very few people could read and write, and there was very little writing for most people to read. The idea that we should have a standard way of spelling only gained traction after printing technology drastically increased the number of words being put on paper (and even then, it took several centuries for the plural of egg to settle down as eggs, rather than egges or eggyes).
Some people still aren’t sure that standardized spelling is a good idea, a view reflected among the more than 600 comments sparked by that BBC News article in just 24 hours. However, many of those comments missed the point of the article: Bad spelling can undermine website conversion rates.
As one randomly selected web shopper put it to me: "If an online store is too stupid to get their spelling right, why should I think they will get my order right?"
Why Typos Kill Conversion Rates
The fact is, commercial websites rely on text — written copy — to conduct business, from describing the product to explaining the purchase process. Even sites full of fancy graphics have to use words, and whether accurate spelling matters to you personally is irrelevant when it comes to conversion rate optimization.
The relevant opinions, the ones that rule in CRO, are those of your website's visitors. (Note: The use of Lick instead of Like as the first word of this article was intentional.)
Why would a typo cause visitors to your website not to convert, even when those people may themselves be terrible spellers? Without performing a formal survey of website visitors, any answer to that question must be based on supposition, but here are some suggestions:
Accurate spelling and good grammar are equated with legitimacy, if not consciously then subconsciously. Some of us may be more aware of this sentiment when it is expressed in the negative: Bad spelling and bad grammar are cause for suspicion. For example, what's the first clue a piece of email from a stranger is a scam? Many people would say it's the bad spelling and grammar.
Virtual transactions lack familiar clues about integrity, sincerity, and trustworthiness, such as facial expression, tone of voice, and body language. We may be looking to website copy for those clues instead, and good spelling and grammar signify respect for the website visitor because the site owner has made the effort to copy edit the content. Obvious failure to do so may undermine consumer trust, a valuable commodity when competing for online dollars.
Size and location matter. If you are a big brand name like Target or Walmart you might not lose too many sales due to a typo in the details of a product description. But a glaring typo on the home page of a smaller brand may cause new visitors who don't know much about it to bounce.
Tips Two A Void Spell In Miss Teaks
Typos may be acceptable in some places, like tweets and Facebook comments, but commercial website copy needs to be clean and accurate.
Don't rely on spell checkers. The above heading passes a spell check with flying colors. Spell checkers can be a big help, especially those that flag errors as you type, but they just don't have the human intelligence required to know which words you should be using.
Use multiple human editors. I don't know any serious writers who believe they can reliably copy edit their own work. As the writer you tend to see what you think you wrote, not what characters ended up on the page. In a pinch, "multiple human editors" can mean the person writing the copy and one other person, but three sets of eyes are better than two.
Make sure your graphics people use the spellchecker in Photoshop for any images that include words. They need to use it before rasterizing the text layer. Editing typos in flattened image files is a real pain so check before you save to JPEG, GIF, or PNG.
Read more: http://goo.gl/Z0PGj
Sunday, July 24, 2011
3 Common Mistakes in Digital Media Data Analysis!!
With multiple campaigns in full swing and multitudes of data pouring in, it can be easy to misinterpret details and jump to conclusions about results without sufficient evidence. For example, media buyers may frequently be marketing to consumers who would have searched and/or bought anyway, without being hit with display impressions.
Buying behavioral, retargeting, search, and other types of targeting data can make it even more likely that you are preaching to the choir. The trick is to determine whether your ads reached consumers who truly needed to be persuaded, or whether they reached people who were already closer to the conversion tipping point and either already were customers or were likely to become customers anyway.
Certainly we want to avoid overkill and wasting money and impressions on consumers who didn’t need it. Before making a hasty assumption that may prove to be unfounded upon a deeper inspection, consider these common mistakes when examining digital media data.
Assigning a Causal Relationship Where There Was None
It can be quick and easy to assign causality when much of your data seems to point in the assumed direction. However, thorough testing of the hypothesis is required before jumping to conclusions.
For example, perhaps we have a lot of display impressions correlated with high search volume in one geographic area. Don’t assume that your display impressions caused the increased search volume. Perhaps instead there has been a general overall spike in brand interest in this market. Could offline tactics be the driver? Perhaps there was local news coverage related to your products.
To test the hypothesis that higher display impressions are driving search, increase or decrease display impressions and isolate other potential factors to see what kind of measurable impact—if any—this has on search.
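To make that concrete, here is a minimal sketch (in Python) of how such a geo test might be read: markets where display impressions were increased are compared against untouched control markets, and only the difference in search lift is credited to display. The market names and figures are invented for illustration.

# Minimal sketch of a geo holdout comparison: did markets where we raised
# display impressions see a bigger lift in branded search than markets we
# left alone? All figures below are made up for illustration.

test_markets = {      # display impressions increased here
    "Denver": {"search_before": 1200, "search_after": 1500},
    "Austin": {"search_before":  900, "search_after": 1150},
}
control_markets = {   # display impressions left unchanged
    "Tucson": {"search_before": 1100, "search_after": 1180},
    "Omaha":  {"search_before":  800, "search_after":  840},
}

def avg_lift(markets):
    """Average percentage change in branded search volume across markets."""
    lifts = [(m["search_after"] - m["search_before"]) / m["search_before"]
             for m in markets.values()]
    return sum(lifts) / len(lifts)

test_lift = avg_lift(test_markets)
control_lift = avg_lift(control_markets)

# The control lift approximates the background trend (seasonality, PR, offline
# media). Only the difference is evidence that display drove search.
print(f"Test markets lift:    {test_lift:.1%}")
print(f"Control markets lift: {control_lift:.1%}")
print(f"Incremental lift attributable to display: {test_lift - control_lift:.1%}")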
Assigning Attribution for Sales Incorrectly
Particularly in markets where there’s a high likelihood that you’ll be targeting customers who are already buyers, attributing the sale can be complicated. This is especially true with site and search retargeting tactics.
Dropping a cookie on a user who visits your site, or delivering ads across multiple networks to anyone who searches for your keywords, can be very effective. However, in a typical purchase cycle, consumers shop around quite a bit. Absent a direct click-through-to-conversion path, it's difficult to say that people who came to your site and viewed a banner made a purchase because of that banner. And we don't know what got them to the site the first time.
Are you showing ads to an audience who would have bought anyway and then attributing their buy to the fact that you showed them an ad? It’s a slippery slope that requires testing to measure the real impact.
To test your attribution theory and be aware of how retargeting might influence your results, adjust the number of impressions, frequency caps and other parameters and closely monitor and/or control for external impacts on search. When you have an overall picture of the pre-purchase drivers, you can more clearly begin to see what’s sparking the tipping point of conversion.
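One common way to run such a test, sketched below with made-up counts, is a holdout comparison: conversion rates for users exposed to retargeting versus a randomly suppressed group, so that only conversions above the holdout baseline are treated as incremental. This is an illustration of the general idea, not a prescription from the article.

# Rough sketch of a retargeting holdout test: compare conversion rates for
# users who saw the retargeting ads against a randomly held-out group who
# did not. The counts are invented for illustration.

exposed = {"users": 50_000, "conversions": 1_250}   # saw retargeting ads
holdout = {"users": 50_000, "conversions": 1_050}   # suppressed from ads

def conversion_rate(group):
    return group["conversions"] / group["users"]

cr_exposed = conversion_rate(exposed)
cr_holdout = conversion_rate(holdout)

# The holdout group tells us how many people "would have bought anyway";
# only conversions above that baseline are plausibly incremental.
incremental = (cr_exposed - cr_holdout) * exposed["users"]

print(f"Exposed CR:  {cr_exposed:.2%}")
print(f"Holdout CR:  {cr_holdout:.2%}")
print(f"Estimated incremental conversions: {incremental:.0f}")
print(f"Conversions a view-through report would credit: {exposed['conversions']}")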
Failure to Consider the Big Picture
Digital media marketing through search and display doesn't exist in a vacuum. Therefore, we must take a more holistic approach to determining results. Don't just look at click-through or search rates; consider conversion rates, basket size, and other KPIs in relation to those metrics.
It can be easy to say that display isn't driving conversion if there's no direct click-through to attribute, but how many consumers might convert with a higher basket size because of display impressions? If we look at the total number of impressions but don't see an increase in clicks, we might think it didn't work, yet we may actually be making more money because consumers trust the familiarity of the brand enough to make larger purchases. And, ultimately, isn't that what we're after?
Had we just looked at clicks or just at impressions, these results may have been obscured. To get a more accurate picture of results, we must look at all metrics from a holistic perspective to arrive at a bottom line.
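As a rough, hypothetical illustration of that holistic view, the sketch below compares a campaign period against a baseline on click-through rate, average order value, and revenue per thousand impressions. All numbers are invented.

# Hypothetical before/after comparison for a display campaign that produced
# almost no extra clicks but coincided with larger basket sizes. Numbers are
# invented to illustrate why clicks alone can hide the result.

before = {"impressions": 1_000_000, "clicks": 2_000, "orders": 400, "revenue": 20_000.0}
during = {"impressions": 1_000_000, "clicks": 2_050, "orders": 410, "revenue": 26_650.0}

def summarize(label, period):
    ctr = period["clicks"] / period["impressions"]
    aov = period["revenue"] / period["orders"]               # average order value (basket size)
    rpm = period["revenue"] / period["impressions"] * 1000   # revenue per 1,000 impressions
    print(f"{label}: CTR {ctr:.2%}, AOV ${aov:,.2f}, revenue per mille ${rpm:,.2f}")

summarize("Before campaign", before)
summarize("During campaign", during)

# CTR barely moves, but average order value and revenue per thousand
# impressions rise noticeably, which is the "bigger basket" effect a
# click-only view would miss.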
Digital Media Analysis: A Double-Edged Sword
We definitely have access to hordes of data, infinitely more detailed than anything we could have dreamed of in the offline world. However, without careful critical analysis of this avalanche of information, we run the risk of jumping to conclusions without hard evidence or misinterpreting the data we collect.
Inspired by article: http://goo.gl/Taiqj
Saturday, June 18, 2011
Google Pilots an Analytics-Webmaster Integration
If you're serious about running a website and you want to get your search-related data straight from Google, chances are you've registered for both Webmaster Tools and Analytics; anyone who signs up for Analytics almost certainly benefits from Webmaster Tools as well. In an effort to start cross-breeding the two services, Google has launched a limited pilot program that integrates data and tools from both.
What the Experimental Service Includes
The cross-service tools in the pilot program will start out fairly limited. More specifically, they will include a set of reports that pull information from your Webmaster Tools account and display it in your Analytics interface. Beyond giving you the chance to see more data in one place, the reports will allow Webmaster Tools data (such as clicks, queries, impressions, average position, etc.) to be viewed in the dynamic Analytics charts. Many other Analytics features, such as filtering and visualizations, will also be available.
For example, organic search impressions, clicks, average organic position, and clickthrough rate (CTR) are part of the main interface in this Google Analytics pilot.
Google Analytics with Webmaster Tools Integration
This is just the beginning for cross-integration; for now, it seems that Google is mostly focused on rolling Webmaster Tools data into Analytics.
The official Google statement announcing the program indicated that, "We hope this will be the first of many ways to surface Webmaster Tools data in Google Analytics and give you a more thorough picture of your site's performance." It's not yet clear whether those accepted to the pilot program will also have earlier access to additional programs.
Signing Up for the Pilot
First, a couple of caveats. Keep in mind that, as a limited pilot program, you may simply not get in. Further, if you do get in, it may not be for several weeks; as Google puts it, "If you're chosen for the pilot, you'll receive an email in a few weeks with more details." That said, signing up is fairly easy.
Go to the Google pilot sign-up page and fill out the form (you'll need to provide your contact email address, first and last name, Google Analytics ID, login email address, and the domain you intend to use the reports for). Then hit submit and you're done! All you have to do now is wait patiently and hope you get selected.
Source: http://goo.gl/jxfX1
Tuesday, May 31, 2011
IBM Social Media Jam Aims to Build Social Business
IBM has published a paper on social media and where it believes it is going. While IBM might not normally be associated with social media, the paper is the result of what is described as a web jam involving more than 2,700 participants from over 80 countries over three days.
A web jam, IBM says, is an online conversation with the purpose of discussing a particular issue — in this case social media — and drawing conclusions that can be brought into the future with a particular goal.
While the goals of this jam are not completely clear, beyond the fact that it falls under Big Blue's wider "Smarter Planet" strategy, there does appear to be an implied suggestion that participants should look at their social media strategies and see where IBM technologies fit into it all.
Though it does not provide a clear roadmap of IBM's social media strategy per se, the paper does point in the direction IBM will take to address the concerns raised over the course of the online conversation.
Friday, May 13, 2011
Why an Adaptive Social Business Model is Needed?
The trouble with many social business models today is that they don't allow room for adaptation and manipulation; they put organizations into a box and expect them to move in a linear way toward their goal, unable to move forward until each preceding step is completed.
The problem is that every organization has different needs in different areas. The point of developing a framework like this is to address and highlight the common elements among organizations investing in social business, while allowing each organization the flexibility to focus on the areas it needs to.
How This Can Help Your Organization
Some organizations might have a rock-solid organizational culture, a solid process, and a great technology stack, but might be weak in the goals-and-objectives and governance areas. This framework is designed to let organizations look at and understand the key components that make up each sphere.
Your organization might be great at one of the five areas, whereas another organization might be solid at three of the five.
Organizations can maneuver through this framework to improve on areas where they are weak or perhaps not as strong as they would like to be. It's adaptive because it doesn't force organizations down a single path – yet addresses the key areas for social business.
Every organization can determine which areas they need to work on and which ones are solid.
While the framework uses the term "social," the reality is that many of the concepts are built around traditional approaches to business, in this case slightly adapted for organizations interested in emergent social software. As Gil Yehuda has said, "it's very healthy to view a social initiative as a business initiative."
source: http://goo.gl/sVGnx
Sunday, April 24, 2011
Benchmark – Bounce Rate and Conversion Rate Survey!!
Want to know about industry benchmark data for bounce rate and conversion rate? Participate in the survey to access the free report.
Bounce Rate: Bounce rate (sometimes confused with exit rate) is a term used in website traffic analysis. It essentially represents the percentage of initial visitors to a site who "bounce" away to a different site, rather than continuing on to other pages within the same site.
Conversion Rate: The percentage of visitors to a website who complete a process designed by the web marketer. For example, if 100 visitors respond to an ad by clicking on it and coming to a landing page, and two of them go on to complete and submit a form, the conversion rate for that program is 2%.
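For anyone who prefers to see the arithmetic, here is a minimal sketch of both definitions as code; the example counts are either taken from the definition above or invented.

# Minimal sketch of the two definitions above, using example counts.

def bounce_rate(single_page_visits: int, total_entries: int) -> float:
    """Share of visits that left from the entry page without viewing another page."""
    return single_page_visits / total_entries

def conversion_rate(completions: int, visitors: int) -> float:
    """Share of visitors who completed the process defined by the marketer."""
    return completions / visitors

# Example from the definition above: 100 visitors land on the page, 2 submit the form.
print(f"Conversion rate: {conversion_rate(2, 100):.0%}")   # 2%

# Hypothetical: 450 of 1,000 entry visits leave without a second page view.
print(f"Bounce rate: {bounce_rate(450, 1_000):.0%}")        # 45%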
Wednesday, February 16, 2011
Web Analytics – Refining and Redefining The Current Techniques!!
In the coming year we will see tremendous refinements to current web analytics tools to meet the new and very specific analytics needs of growing organizations. Here is our list of the top trends we are likely to observe in 2011:
1. Tackling the challenge of analyzing multiple web access points:
The ways in which people can access the web have grown significantly, so analysts need to stand up and face the challenge of understanding the behavior of users across these multiple platforms to take full advantage of them. In 2011, connected devices are set to make more use of your car, workstation, TV, and other larger-screen devices to keep you fully connected. With a multitude of handheld devices like smartphones and the iPad offering an enhanced viewing and browsing experience, the number of mobile web users is increasing exponentially. Mobile analytics will emerge as a distinct component of reporting user information to clients.
2. In-app analytics will come up significantly in 2011:
Today's mobile phones are flooded with options to run thousands of applications, which are steadily becoming significant for communication, interaction, navigation, search, entertainment, information, gaming, and more. The list is growing at an alarming pace and is capable of keeping users glued to their gadgets. The iPhone alone runs over 350,000 applications, while Android has over 50,000 apps. The reason for these growing numbers is the usability and utility of these applications on smartphones. Web analysts will have to burn the midnight oil to extract valuable data from the usage of these applications.
3. Proactive companies will focus on analyzing the voice of the customer:
With millions of conversations happening online, companies have little control over what is being said about their brands, but by proactively listening and following their brand name across all online channels, companies can influence those conversations and give them a positive direction. Social media analytics will be extremely useful here. Another challenge is cutting down huge amounts of unnecessary data, which may become a specialized task for analysts. Twitter is also coming up with its own analytics tool, which could yield some truly valuable data for analysts to track and use to drive marketing objectives.
4. Better framework for social media analytics would be developed:
Currently, there are thousands of free and paid tools on the market, each addressing a very specific analytics need, and none of them is totally accurate. There is a need for some level of tool consolidation and for a social media analytics solution that answers critical questions holistically when it comes to tracking a brand online. Convergence of the best web analytics practices may shape new and standardized frameworks. Recently concluded global events and conferences such as Blogwell and the FutureGov Summit, among others, focused on the hot issue of developing a strategic social media framework for better and more effective analysis.
5. Social media campaign analytics is coming into the picture:
With major companies targeting social media for online marketing, campaign analytics on social media is becoming a necessity. The metrics for measurement are totally different from those of standard campaign analytics. Tracking Facebook fan pages, Twitter hashtags, various brand-specific applications, and promotional campaigns on social media is becoming the need of the hour. With the increased use of online marketing, tracking social media campaigns will now grow on par with regular web analysis.
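As a toy illustration of what campaign tracking can look like at its simplest, the sketch below counts mentions of hypothetical campaign hashtags in a handful of invented tweets; real tracking would, of course, pull data from the platforms' own exports or tools.

from collections import Counter
import re

# Toy sketch of campaign tracking on exported tweet text: count how often the
# campaign hashtags appear. The tweets and hashtags are invented examples.

campaign_tags = {"#springsale", "#mybrand"}

tweets = [
    "Loving the new range from #mybrand! #springsale",
    "Is the #springsale worth it?",
    "Just ordinary Tuesday thoughts, no tags here.",
]

tag_pattern = re.compile(r"#\w+")
counts = Counter(
    tag.lower()
    for tweet in tweets
    for tag in tag_pattern.findall(tweet)
    if tag.lower() in campaign_tags
)

for tag, n in counts.most_common():
    print(f"{tag}: {n} mentions")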
6. Privacy concerns might change the web analytics landscape totally:
With increased concern over privacy issues, tracking data that is private in nature might get banned altogether, and analysts might end up looking for new jobs. In 2011, we will see what the Federal Trade Commission's Do-Not-Track initiatives include and how they will impact web analytics. It may be something similar to the Do-Not-Call registry for phone users.
7. The need for web analysis will shoot up, and so will the jobs:
With online marketing approaching the status of a primary marketing platform, businesses are rapidly joining the web analysis bandwagon. Major web analytics vendors have seen a steep rise in the adoption of their solutions, which is indicative of this trend. And with more analysis needed, we will surely see a surge in demand for quality web analysts too. As per an Econsultancy report, companies are investing more in people as well as technology in order to get 'actionable' information from their web data. Forrester Research has reported that web analytics spending will more than double to $1 billion by 2014. This implies a pressing need for analysis and, of course, more jobs for analysts.
Source: Web Analytics – Refining and Redefining The Current Techniques
Sunday, January 23, 2011
Why Different Web Analytics Tools Report Different Numbers!!
The main reasons different web analytics tools report different numbers are:
a) Raw logs are useless
Most servers store raw logs, which are lists of all the accesses and page requests on your website. It's possible to interpret those raw logs with special programs, creating graphical reports that contain the number of visitors, page views, and so on.
Those numbers are grossly overestimated, though, because all kinds of search bots and automated queries are counted together.
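As a rough illustration of the problem, the sketch below counts requests in a tiny sample of log lines with and without a crude bot filter. It assumes the common Apache/Nginx "combined" log format; the sample lines and the keyword list are illustrative only, and a real filter would need a far more complete list of crawler signatures.

import re

# Minimal sketch of why raw-log counts run high: the same log counted with and
# without a crude, heuristic bot filter.

BOT_HINTS = ("bot", "crawler", "spider", "slurp")  # rough heuristic, not exhaustive

sample_log = [
    '1.2.3.4 - - [23/Jan/2011:10:00:00 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (Windows NT 6.1)"',
    '5.6.7.8 - - [23/Jan/2011:10:00:05 +0000] "GET /robots.txt HTTP/1.1" 200 120 "-" "Googlebot/2.1"',
    '9.9.9.9 - - [23/Jan/2011:10:00:09 +0000] "GET /page HTTP/1.1" 200 4096 "-" "Yahoo! Slurp"',
]

ua_pattern = re.compile(r'"([^"]*)"$')  # the user agent is the last quoted field

def is_bot(line: str) -> bool:
    match = ua_pattern.search(line)
    agent = match.group(1).lower() if match else ""
    return any(hint in agent for hint in BOT_HINTS)

total_requests = len(sample_log)
human_requests = sum(1 for line in sample_log if not is_bot(line))

print(f"Raw request count:          {total_requests}")
print(f"After naive bot filtering:  {human_requests}")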
b) Webalizer and AWStats overestimate
Webalizer and AWStats are very popular web analytics programs, largely because they are usually installed by default with cPanel (the control panel software used by most hosting companies). Both of them tend to overestimate the number of visitors and page views your website receives, however, so their data should always be taken with a grain of salt.
c) Ad networks underestimate a bit, but there is nothing you can do about it
The number of impressions you'll see in most ad networks' control panels is usually an underestimate of your total traffic, because the networks won't track people who can't see ads or who block them on purpose.
There is nothing you can do about it, though; if you want to make money using ad networks, you need to play by their rules. The alternative is to serve your own banners embedded with HTML code, in which case they would be seen by 100% of your visitors.
d) Google Analytics underestimates a bit, but it’s the industry standard
The numbers reported by Google Analytics also underestimate your traffic slightly, because the software has very strict rules regarding what counts as a visitor and a page view.
GA’s tracking is very reliable, though, and that is why it’s used as the industry standard. If you want to sell a website, for example, most serious buyers will ask for Google Analytics data before they make an offer.