SiteCatalyst FAIL! ;-)

March 10, 2009

They should have saved this for April 1. I was trying to compare February 08 data to February 09 data. Everything was going along swimmingly until the calendar in Omniture Site Catalyst failed, big time. I have tried and tried to recreate it so I can let Client Care know how I achieved this spectacular bug, but to no avail. I cannot replicate the magic combination of clicks that freaked the calendar out.

Site Catalyst Calendar Fail



We (Tony, my boss, and another colleague of mine) returned from Omniture Summit last week all aglow with excitement from things we learned, ideas we plan to pursue, etc. I still have remnants of that excited tingle as I cruise back over my notes.

The absolute biggest takeaway for me is the use of the Excel Client and the power it gives me to combine web analysis with other business intelligence data (for various reasons we aren’t currently utilizing the API to load offline data into Site Catalyst). I had played with the Excel Client previously, but I never built any actual reports with it. This week I jumped headlong into it, and I have three immediate thoughts:

  1. I really miss the report builder that was in SC 13; the interface in the Excel Client is similar, and building complex reports is so much quicker than in v14
  2. Oh I wish there was an Excel Client for Discover (there isn’t, is there?)
  3. Can I get just a refresh button that anyone can use, without having to give them an account as a licensed Excel Client user?

I’ve been practicing by building a report/dashboard for a specific campaign we’re running about Dave’s advice to keep investing despite the current economic turmoil. Specifically, the campaign funnels visitors to a page with a video, a download, a newsletter signup, and progression to a lead form. The measures currently tracked (this is still a work in progress) are:

  • page visits
  • visits from the various entry points
  • video start and completions
  • downloads
  • newsletter signups
  • lead form starts and completions from this particular page

…and then all the various ways of conversion ratios etc.
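As a sketch of what those conversion ratios look like once the counts are in the Excel Client, here is the arithmetic in Python. Every count below is made up for illustration; the real numbers would come out of Site Catalyst.

```python
# Hypothetical counts for the campaign landing page. None of these figures
# are real -- they only illustrate the ratio calculations.
measures = {
    "page_visits": 5000,
    "video_starts": 1400,
    "video_completions": 600,
    "downloads": 450,
    "newsletter_signups": 300,
    "lead_form_starts": 250,
    "lead_form_completions": 90,
}

def conversion_ratios(measures, base="page_visits"):
    """Express each tracked measure as a percentage of the base measure."""
    total = measures[base]
    return {name: round(100.0 * count / total, 1)
            for name, count in measures.items() if name != base}

ratios = conversion_ratios(measures)
print(ratios["lead_form_completions"])  # 1.8 (% of page visits)
```

The same function with `base="video_starts"` would give completion rates per video start instead, which is the kind of pivot the dashboard needs to support.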

I haven’t segmented any of the data yet, by say entry point or visit number (despite spending much time on Occam’s Razor this week). But the reason is a good one, I swear: I’m exploring visualizations and how to represent the data compellingly. Unfortunately I haven’t had a big breakthrough yet, but I am digging through Instant Cognition voraciously. It could be that my breakthrough won’t happen until I’ve segmented my data.

We’ve made changes to our cross-selling, and the time has come to determine the effectiveness of those changes. The changes were made through common sense: we looked at the organic cross-sales that occurred, compared those to the cross-sell recommendations in our store, and replaced ineffective cross-sell recommendations with strong organically occurring cross-sales. Simple really.

The difficulty I’m finding is in determining the attributable growth in store sales overall to cross-sell. Does anyone have any tips/tricks to get the numbers to show me something, anything useful at all?

We are growing very fast, in reach, in traffic, in brand awareness, and all of those result in sales. Adding to the difficulty, when the changes were made we only looked at a handful of products, rather than the entire catalog.

So far my process looks like this: I’ve pulled comparison cross-sell reports and highlighted the products changed. The data has been normalized and I’ve looked specifically at two things:

  1. The ratio of cross-sell sales for the changed items to the sales of the originating item
  2. The ratio of cross-sell sales to total sales for the changed item

So I have increases in individual item cross-sales, but how can I know how much of an item’s total growth is due to cross-sell growth, especially if total growth outpaced cross-sell growth? And how do I take that one step further and extrapolate the growth of store sales overall to cross-sell growth?
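One way to frame the first question is as a simple decomposition: an item's total growth splits into the part that came from cross-sell sales and the part that came from everything else. This is only the arithmetic of that framing, with hypothetical numbers; it doesn't solve the harder store-wide attribution question.

```python
# Decompose an item's period-over-period growth into a cross-sell component
# and an "everything else" component. All inputs are hypothetical units.
def growth_share(total_before, total_after, xsell_before, xsell_after):
    total_delta = total_after - total_before
    xsell_delta = xsell_after - xsell_before
    other_delta = total_delta - xsell_delta
    return {
        "total_growth": total_delta,
        "xsell_share_of_growth": xsell_delta / total_delta,
        "other_share_of_growth": other_delta / total_delta,
    }

# Example: the item grew by 200 units overall, 50 of which were cross-sells.
result = growth_share(1000, 1200, 100, 150)
print(result["xsell_share_of_growth"])  # 0.25
```

Even when total growth outpaces cross-sell growth, the cross-sell share of the growth (here a quarter) is still a well-defined number; the catch is that it treats cross-sell and other sales as independent, which they probably aren't.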

Campaign Tracking

October 24, 2007

I’ve spent a few days thinking over how to track all our various campaigns both internal and external. We set up Site Catalyst, beyond the standard campaign tracking technique, to give us perpetual tracking based on the first internal campaign responded to, and separate reporting based on the most recent campaign response.

I wish Omniture would make a campaign value assignment similar to the page value that already exists. We have so many various email campaigns, AdWords, etc. that it is quite possible a visitor may respond to multiple campaigns sequentially leading up to a tracked event. It would be nice to see a breakdown of campaign value for when a campaign doesn’t lead directly to a conversion, but leads to future campaigns (or is at least within the string) that result in conversions.
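To make the wished-for report concrete, here is a sketch of one way it could work: split each conversion's value evenly across every campaign in the string that preceded it. The even split and all the campaign names are my assumptions; Omniture offered no such report at the time.

```python
from collections import defaultdict

def campaign_value(conversions):
    """Assign each campaign in a response string an even share of revenue.

    conversions: list of (campaign_path, revenue) tuples, where campaign_path
    is the ordered list of campaigns the visitor responded to before converting.
    """
    credit = defaultdict(float)
    for path, revenue in conversions:
        share = revenue / len(path)
        for campaign in path:
            credit[campaign] += share
    return dict(credit)

# Hypothetical response strings: two touches split $100, one direct $80.
paths = [
    (["email:promo", "adwords:brand"], 100.0),
    (["adwords:brand"], 80.0),
]
print(campaign_value(paths))  # {'email:promo': 50.0, 'adwords:brand': 130.0}
```

An even split is the simplest choice; first-touch or last-touch weighting would just change how `share` is computed per position in the path.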

The most difficult part (which probably wasn’t really that difficult) was determining how to manage ongoing email campaigns. We have a number of standard emails that run based on events, and then of course the weekly and monthly promotions, newsletters etc. How to set up campaigns for those? As individual campaigns for each email? As campaigns based around events that trigger emails?

Blogged with Flock

WAW Nashville

July 18, 2007

The first Web Analytics Wednesday in Nashville (in a while at least) will be August 8 at Logan’s Roadhouse in Cool Spring (15 minutes south of Nashville on I-65) at 12:45pm. We’re going to locate ourselves in a back corner of the patio (past the bar area), ask the host for the Web folks, and hopefully I’ll have gotten a copy of the big red book to display by then.

Breaking from the WAW norm, this will be over lunch, which is why you won’t find it on the WAW event list (Eric specifically asks, “Please do not ask to schedule events for other days or times.”). It will be at lunch this time, against regulations, because a couple of other individuals recommended it for our first time out of the gate (and Wednesday evenings in the Buckle of the Bible Belt are a tough sell). So we’ll try lunch and then hopefully move to the officially sanctioned time of 6-8pm.

Expected in attendance are Dave Ramsey (that’d be me and Brent) and one, maybe more. Hopefully someone from Nissan will get wind of it, we’re going to try twisting arms at Gibson, and I hear that Genesco just implemented (or is in the process of implementing) a major analytics vendor.

Any questions? Post a comment.


Lessons in Segmenting

July 9, 2007

We have now been live with Site Catalyst for exactly 7 days (minus a few hours). We get enough traffic to actually have some data to look at, which is nice, but we aren’t fully implemented yet, which isn’t so nice. It leaves me in a slight state of limbo: able to see things and come away with interesting observations, but unable to segment much of the data while we are still implementing. So much of my time is spent watching and thinking, “Wow, I wonder if that means anything.”

For my own edification, a brief story in segmenting and the “so what.” One of our web properties is a subscription site. In the past week, visitors to the home page (which also doubles as the login page) have navigated away in under 15 seconds at a rate of 37%. Gee, that seems high.

So I dig a little further: where are people going? Well, it turns out that 32% are actually logging in. It would be my guess that most of those are on the home page for under 15 seconds, but here is where segmenting would be nice (does anyone know how to correlate time on page with next page in SC? I’ve been unable to figure it out). Potentially, that <15-second crowd is made up completely of members logging in, but I really don’t know that yet. And if that’s the case, why are so many non-members (68%, and yes, I know I can’t really say that, but for argument’s sake) sticking around the home page but not signing up?
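Without a segment that crosses time-on-page with next page, the quoted percentages still put hard bounds on the question. This little sketch just formalizes that overlap arithmetic using the 37% and 32% figures from above; it proves nothing about actual behavior.

```python
# Bound the overlap between the "<15 seconds on page" group and the
# "logged in" group, given only their sizes as percentages of visitors.
def quick_exit_bounds(pct_under_15s, pct_login):
    # Upper bound: every login could have happened inside the quick group.
    high = min(pct_under_15s, pct_login)
    # Lower bound: overlap forced when the two groups can't fit disjointly.
    low = max(0.0, pct_under_15s + pct_login - 100.0)
    return low, high

low, high = quick_exit_bounds(37.0, 32.0)
print(low, high)  # 0.0 32.0
```

So at most 32 of the 37 percentage points of quick exits can be members logging in, meaning at least 5% of all visitors are leaving in under 15 seconds without logging in, no matter how the overlap falls.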

Man I’ve got bunches to learn yet.

Correlating KPImportance

June 27, 2007

Brent and I sat down with Site Catalyst today to begin building what we want for our traffic correlations. We made a critical error that I only see now, looking back over our meeting: we didn’t bring our KPIs with us and nail them to the wall in front of us.

In the moment it just seemed like maybe the heat (the air conditioning is out on our floor and as I type this it is a balmy 85 degrees in the office, but it’s getting fixed tomorrow) was making the thinking harder as we asked each other what we really care about correlating, what’s important, and what we may be overlooking.

But now, with a cool glass of water and a bit over an hour gone by, I can see so clearly how much more easily that process would have gone had we had our KPIs right there, not just to reference, but to build directly from. Rookie mistake?