Remote Hosted Sites and ISP Policies

[Editor’s note: This post is part 5 of a series of posts discussing Log File Management. For more on this topic, be sure to read Tyler’s other posts.]

For small, medium, and even large organizations, it is still relatively common to outsource the hosting of one or more Web sites. While outsourcing offers many advantages and can be cost effective, it also poses challenges that are often overlooked. For the purposes of this blog series, the core challenge is access to log files.

Depending on the ISP’s policies, log files may always be accessible, may require special permissions, or may not be available at all. This is a concern that must be weighed when choosing an ISP for your sites. Even when an ISP does provide access to logs, do not assume it warehouses days, months, or years of log files. In fact, we’ve helped many clients and potential clients discover that their ISPs retain logs for only 30 to 45 days as an internal policy and provide no means of accessing files older than that.

As a case in point, PublicInsite worked with a client with a substantial Web site who had outsourced a portion of it to an ISP for load-balancing reasons. Anticipating an extremely large volume of traffic in a very short time as visitors downloaded a particular document, the client chose to outsource rather than invest in hardware and bandwidth internally (not a bad decision at the time). A few months after launching the new document, the client wanted to understand the traffic (i.e., the number of downloads of the PDF document), and we were told that this portion of the site had been outsourced to a local ISP that does not warehouse logs for more than 45 days. What did this mean? By the time we were contracted to analyze the data, the first few weeks of logs were long lost. We were unable to identify the true initial demand during days 1 through 15 of the document’s release, which clearly would have been the period of largest traffic volume (we’ve seen this pattern historically each year, so we’re confident of this!).

Regardless of how many days your ISP warehouses your logs, it is an extremely simple process to create a script that runs daily to download them. Once you’ve established this process, all you need to do is confirm, about once per month, that the script is still running reliably.
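As a minimal sketch of what such a script might look like, here is a short Python example that pulls the previous day’s log file over FTP and stores it in a local archive. The host name, credentials, directory layout, and file-naming convention below are all placeholder assumptions; adapt them to however your ISP actually exposes its logs (many offer SFTP or a control-panel export instead of plain FTP).

#!/usr/bin/env python3
# Daily log-retrieval sketch. All connection details are assumptions;
# substitute the access method and paths your ISP provides.

import ftplib
from datetime import date, timedelta
from pathlib import Path

FTP_HOST = "ftp.example-isp.com"          # assumed host name
FTP_USER = "your-account"                 # assumed credentials
FTP_PASS = "your-password"
REMOTE_LOG_DIR = "/logs"                  # where the ISP exposes raw logs
LOCAL_ARCHIVE = Path("/var/log-archive")  # your long-term warehouse

def fetch_yesterdays_log():
    # Assumes the ISP names files like access-YYYY-MM-DD.log
    day = date.today() - timedelta(days=1)
    filename = f"access-{day.isoformat()}.log"
    local_path = LOCAL_ARCHIVE / filename

    LOCAL_ARCHIVE.mkdir(parents=True, exist_ok=True)
    with ftplib.FTP(FTP_HOST, FTP_USER, FTP_PASS) as ftp:
        ftp.cwd(REMOTE_LOG_DIR)
        with open(local_path, "wb") as fh:
            ftp.retrbinary(f"RETR {filename}", fh.write)
    print(f"Archived {filename} ({local_path.stat().st_size} bytes)")

if __name__ == "__main__":
    fetch_yesterdays_log()

Scheduled through cron (for example, 0 4 * * * /usr/bin/python3 fetch_logs.py), a script like this runs unattended each morning, and a quick monthly glance at the archive directory is enough to confirm the files are still arriving.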

It’s simple: don’t let your ISP’s policies interfere with your ability to do proper historical analysis of your log data. By asking your ISP a few simple questions and downloading your logs regularly, you will never find yourself in a position like the example above.


[Editor’s note: For more information on log file management, be sure to read Tyler’s ongoing series of blog posts on the topic starting with Best Practices for Log File Management.]

Tyler Gibbs
