Pages where we have presented your company to an audience who may or may not be searching for products or services that you offer. Impressions are a good indication of how active your sector is on the directory.
An industry-agreed measurement that we use to describe the number of times a human has visited a particular page. This statistic is one of the most important, as little interaction can occur without first receiving a visitor to your advert.
Your advert contains a small hotspot that reveals your telephone number when activated. The action of revealing your phone number will record a telephone request. Trials have confirmed that this measurement is 85% accurate, taking into account accidental requests.
Times have moved on and our advertising now offers a very concise, feature-rich overview of your website and business. Unlike a blind click from a conventional PPC ad, our referrals are made by pre-qualified customers who already like your proposition.
Analysing website traffic can be far trickier than it would seem. Long gone are the days when the internet was considered 'worth a try'. It is now a far more serious business, driving commerce on an unprecedented scale. This new way of the world has become part of our daily lives, generating large portions of many businesses' income.
This has given rise to advertising competition beyond anything that could have been imagined at the turn of the century. We all want to make the most of our marketing investments, and if any analysis can give an insight into the behaviour of our visitors, we welcome the opportunity.
There are numerous ways to record website traffic, and each is open to interpretation and its own set of rules. This causes what we call discrepancies (others may call them inaccuracies). Here we will provide information on various measurement methods and describe why differences are commonly seen between them.
Businessmagnet provide the raw data of each activity after it has been cleansed of robotic activity and traffic from our extensive database of known nuisance IP addresses has been removed. Whilst we do not record demographic information, we do provide an accurate measure of what has happened on your advert.
We have in place our own iTraka software, developed and improved over the last decade to provide the most accurate data that we can supply to our advertisers.
The most common issues arise when we try to compare results from two or more different analytical sources.
There are various methods used to record internet activity on any given page or website. Some use instant code that records to a database as the action happens; others use code which triggers behind-the-scenes scripts to send data to an account associated with the website. Servers where websites are hosted also commonly record website activity into a text file called a log file; this information can then be analysed using an analytical program.
No matter what the method, each will be open to discrepancies when compared to one another. This is very common and we hardly ever see any two methods that match when comparing reports.
By far the most widely recognised website analytical software used today is Google Analytics. The program is free to use and is a very good method of understanding trends, traffic flow and other useful data. We find that most inconsistencies reported occur between our data and this tool due to its widespread use. We find that analytics most often under-reports our data.
There are various reasons that these differences occur, which we will describe; however, it is important to understand that Businessmagnet records a statistic as the result of a direct action: in this case, a click of a mouse. This action is recorded to our database as it occurs, after passing through several checks to ensure it was not robotically generated. For analytics to capture this same event, the website must receive the visit and fire a script which in turn records the click to an analytics account. The referral information must also be intact to identify that the visit arrived from Businessmagnet.
When an analytics account is created, a snippet of code is generated that needs to be placed in the head section of your website's code, between the two tags <head></head>. Placing the snippet just before the closing </head> tag has been identified as the most reliable position. If the snippet is placed anywhere else within the code, there may be inconsistencies in data. Before 2009 it was common to place the analytics snippet at the foot of the web page code. This is not considered the best method for more recent analytics scripts, as the code will not fire until all other scripts on the web page have completed.
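As a rough illustration, the placement check described above can be automated. The sketch below (in Python; the marker string and sample page are our own assumptions, not part of any official tool) simply tests whether the snippet appears before the closing </head> tag:

```python
# Rough check that an analytics snippet sits inside the <head> section.
# The marker string is an assumption; adjust it to match your own snippet.

def snippet_in_head(html: str, marker: str = "googletagmanager.com/gtag/js") -> bool:
    """Return True if the marker appears before the closing </head> tag."""
    head_close = html.lower().find("</head>")
    pos = html.find(marker)
    return pos != -1 and head_close != -1 and pos < head_close

page = """<html><head>
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXX"></script>
</head><body>...</body></html>"""

print(snippet_in_head(page))  # True: snippet placed before </head>
```

A real page should of course be checked with a proper HTML parser or a tool such as Tag Assistant; this is only a quick string test of the placement rule.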
Businessmagnet use a temporary (302) redirect to refer a visitor to your website. This is the most common method of referring website traffic, and we use a specific script of our own to record the action of the website click. Passing a visitor through a temporary redirect, whilst necessary, often causes the loss of the referral information contained within the visitor's HTTP header. This means Businessmagnet will not be identified as the referrer, and the visitor will in effect appear from nowhere. These visits are classed by analytics as direct visits.
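For illustration, a temporary redirect of the kind described can be sketched in a few lines. The example below is a minimal, hypothetical WSGI handler (the destination URL is a placeholder and the click-recording step is stubbed out), not our production script:

```python
# Minimal sketch of a 302 (temporary) redirect: record the click, then
# send the visitor on to the advertiser's website.

def redirect_app(environ, start_response):
    """WSGI app that records the click (stubbed) and issues a 302 Found."""
    destination = "https://www.example.com/"  # placeholder advertiser URL
    # ... a real script would record the click to a database here ...
    start_response("302 Found", [("Location", destination)])
    return [b""]
```

Whether the visitor's browser then forwards a Referer header to the destination depends on the browser and the sites involved, which is exactly why redirected visits often show up as direct traffic.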
Most websites these days are template-driven, so adding analytics code is quite easy. However, we commonly find that discrepancies between data have been caused by analytics code that is not present on one or more pages, or that has been overwritten by other code managed in a content management system. The analytics snippet has to be present to record activity.
Websites are becoming more complex, and the need for external scripts is now commonplace. We commonly see websites with ten or more scripts, each responsible for an aspect of the web page. These scripts all need to run and complete successfully for the web page to load. As analytics also uses a script, on heavily scripted websites the snippet sometimes does not finish loading before a visitor leaves your website. This is why it is important to place the analytics snippet within the head tags of your code, so it is given priority over any other external or on-page scripts.
Having duplicate analytics snippets within the same web page can cause large data discrepancies. We have seen this issue many times.
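A quick way to spot this is simply to count how many times the snippet appears in a page's source. The sketch below uses a fabricated page and an assumed marker string; more than one occurrence suggests duplicate tracking code:

```python
# Count occurrences of an analytics snippet marker in a page's source.
# Fabricated example page with the snippet accidentally included twice.
page_source = """<head>
<script src="https://www.googletagmanager.com/gtag/js?id=G-XXXX"></script>
</head><body>
<script src="https://www.googletagmanager.com/gtag/js?id=G-XXXX"></script>
</body>"""

count = page_source.count("googletagmanager.com/gtag/js")
print(count)  # 2
if count > 1:
    print("Warning: duplicate analytics snippets found")
```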
It stands to reason that writing data to an old analytics account, or to an incorrect account number, will not record any activity in the appropriate analytics account. Check that the account number that appears in your analytics login is identical to the one present in the snippet.
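To make that comparison easier, the account identifier can be pulled out of the snippet programmatically. The helper below is hypothetical (the snippet text is a made-up example) and simply matches the two common identifier formats:

```python
import re
from typing import Optional

def extract_ga_id(snippet: str) -> Optional[str]:
    """Pull a Google Analytics identifier out of a snippet, if present.

    Matches the classic "UA-XXXXXX-Y" format and the newer "G-XXXXXXXX"
    measurement IDs; returns None when no identifier is found.
    """
    match = re.search(r"\b(UA-\d+-\d+|G-[A-Z0-9]+)\b", snippet)
    return match.group(1) if match else None

snippet = "gtag('config', 'G-ABC123XYZ');"  # made-up example snippet
print(extract_ga_id(snippet))  # G-ABC123XYZ
```

The extracted value can then be compared directly with the identifier shown in your analytics login.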
If the referral from Businessmagnet was made via a mobile device, visiting a website that has a mobile version sometimes triggers a script that loads the mobile version and stops the referring information being passed. In most circumstances this results in a referral apparently from your own website domain.
Websites under a security protocol such as Secure Sockets Layer (SSL), normally accessed via HTTPS, will almost always prevent referral information being passed from third parties. If your website is served over HTTPS, you can use a landing page outside of this protocol to collect data.
Analytics reports direct traffic, which is usually people accessing your website via your domain. However, most websites receive their traffic via links, referrals, ads and organic search. If direct traffic accounts for more than 40% of your website's traffic, it is highly likely that referrals and visits are being received from a third-party source but the referring information is being lost. There are various reasons for this, including some of those already mentioned above.
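The direct-traffic share is simple to check from your own channel figures. The numbers below are made up purely to illustrate the 40% rule of thumb mentioned above:

```python
# Made-up channel figures to illustrate the direct-traffic check.
visits = {"direct": 500, "organic": 300, "referral": 150, "ads": 50}

direct_share = visits["direct"] / sum(visits.values()) * 100
print(f"Direct traffic: {direct_share:.0f}%")  # Direct traffic: 50%

if direct_share > 40:
    print("High direct share: referrer information may be getting lost")
```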
Analytics is an extremely useful tool, but it relies on a number of factors that are outside of its control. We use an event script which fires on click to record data; the variables are very few and accuracy is high. The job of analytics is far more difficult and open to many external variables.
A useful tool for checking your analytics code is Tag Assistant, an official Google program used in conjunction with Chrome. It will report on issues such as duplicate, broken, incorrectly placed or outdated code.
A log file, or web log, in the website world is a text document written to by the hosting server, giving details of actions that have occurred on a website. The information retained by a log file commonly includes the client IP address, time of request, page requested, HTTP code, bytes served, referrer and user agent. Log files are one of the most accurate ways to measure website activity but, as with analytics, they too are open to inaccuracies.
In its raw format, a log file can be extremely difficult to extract useful information from. Typical log files extend to thousands of lines. To make understanding website log files a simpler task, many companies offer dedicated software that reads and interprets the individual lines into a more understandable format. Commonly used programs include AWStats, Webalizer, ClickTraks, Web Log Expert and Web Log Storming.
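To show what these programs are doing under the hood, a single line in the widely used "combined" log format can be parsed as below. The field names follow the list given earlier (client IP, time, request, status, bytes, referrer, user agent); the sample line itself is fabricated:

```python
import re

# Regular expression for one line of the common "combined" log format.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

# Fabricated sample line for illustration.
line = ('203.0.113.7 - - [10/Oct/2023:13:55:36 +0000] '
        '"GET /products.html HTTP/1.1" 200 2326 '
        '"https://www.businessmagnet.co.uk/" "Mozilla/5.0"')

entry = LOG_PATTERN.match(line).groupdict()
print(entry["ip"], entry["status"], entry["referrer"])
```

Dedicated log-analysis software repeats this kind of parsing over every line and then aggregates the results into reports.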
The most comprehensive of these will analyse similar information to Google Analytics; however, as log files reside directly on the server where the website is hosted, they will normally provide a more accurate set of statistics. This is due to the way in which the activity is recorded: the request for a web page by a visitor is made to the server, and it is this request that records the activity. The margin for error is small, but there are still a few problems that cause inconsistent results.
Internet Service Providers use caching to store website pages so that they appear to load faster the next time you browse to the same page. This is common practice and is becoming more popular with ISPs as competition increases. Website developers face various problems with ISP caching. Unlike browser caching, which takes place on your computer, ISP caching offers no control over, or ability to refresh, a web page. An ISP cache is usually refreshed after a pre-determined amount of time, which can be as long as 24 hours.
The caching issue also extends to log files. When the request for a web page is served from a stored copy, the hosting server isn't contacted to retrieve the page, and so the activity is not recorded. It is estimated that this results in a 20-30% reduction in the traffic read from log files.
Almost all browsers cache copies of website pages that you have visited. Revisiting a page that has already been cached will serve it from a temporary store on your hard drive. In most cases this will not send a request to the hosting server, and so no action is recorded in the server's log files.
Networks also cache web pages from time to time. As with both caching methods described previously, network caching has its own set of rules. Popular websites visited from within a corporation are the most likely candidates to be cached in a network store. As with browser and ISP caching, any page served directly from a network cache will not send a request to the hosting server.