Shopping Cart Redesign Boosted Software Sales 94% in A/B Test (bingocardcreator.com)
75 points by patio11 on Feb 17, 2009 | hide | past | favorite | 20 comments


Thought you guys would like to hear an update. Short version: the new cart appears to be working spectacularly well. Long version: see submission link.

Original HN thread: http://news.ycombinator.com/item?id=477233


I know I did. Thanks so much for writing these great "What I did and how it worked for me" posts with such detail, I really enjoy seeing how other entrepreneurs have gone about their businesses.


Me, too. And congratulations -- finding a 100% increase lurking on your site is pretty huge. I hope you find more. And make sure you let us know about 'em!


You may be interested by my blog (see profile), which is essentially 2.5 years of "I did X and got a 2%, 5%, 25% lift" or "I did Y -- sucked, reverted". Granted, most of them are not in nearly so much detail and when I was starting out my approach lacked rigor. Heck, it still does in a lot of ways.

Some random notes:

1) Linking non-technical users to an image file is like asking them to please leave your site. Since I had a prominent image on the front page, replacing the direct link with a lightbox resulted in about a 10% decrease in bounce rate.

2) I used to use quiet, understated buttons in my sidebars and on my download page. My friends derided them as "big pancake buttons" when I started, and it has sort of become my design trademark: they're now MUCH bigger. (I think the ideal size is about 800x600 but I've never been quite brave enough to test.) +25% conversions.

3) The first textual link in copy gets clicked on more than any other textual link in the page. Make sure it leads to a high value page (in my case, download free trial) rather than a low value page (in my case, screenshots -- I think, it's been a while).

4) I just learned this two weeks ago: if you have copy which takes up impressive vertical space, with a button at the top and a button at the bottom, for God's sake put a button at appropriate points in the middle. +10%



well done!

Would be great if you could post an update at the end of this month, so we can see whether you really had to rescale the Y-axis! :)


There's a link in there to my automatically updated sales data. It rescales automatically. Feel free to submit that page if you want; I try to keep bragging to a dull roar. ;)


I am so used to losing my place when clicking links that I routinely middle-click them to open new tabs - this includes add-to-cart links.

So I would never discover this new box on my own on your page. (Since middle clicking just takes you to the old cart.) Ideally you'd want middle click to take you right back where you came from, but with the ajax cart pre-opened.

Am I so unusual? Every person I show the "middle click to open a tab in the background" trick to gets hooked, and now that's the only way they browse.


You may or may not be usual, but you are certainly not a usual elementary school English teacher.

I could show you an old heatmap of my front page which had a non-interactive screenshot of the program. There is a BIG RED DOT on the print button in the screenshot.

My customers are not middle clickers.


I do this too.

One possibility would be to send the product id in the link's url, then display the product in a lightbox on the new tab.


No, I introduce people to it and they all get hooked too. No more time wasted waiting for pages to load.

Ctrl+click does the same thing, and Ctrl+Shift+T reopens the last tab you closed.


Interesting... congrats on the increased sales.

One thing I was a bit confused about: You say you want people to easily be able to continue browsing your marketing copy before committing to buy. But how do they get back to the cart once they've closed the lightbox? Clicking the buy button again reveals just one item in the cart. So in essence closing the lightbox means "discard this cart" rather than "continue shopping", right? That means your findings only work for pages that have one single item for sale (or items similar enough that buyers only want to purchase one of them, as in your case).

Your sample size is also quite low (94% increase = from 8 conversions to 15), and you're using absolute values in the graphs. Did traffic to the page stay constant over all that time?


Thank you.

In my entire existence as a business I have never seen someone buy X licenses plus Y [license plus CDs], and indeed even explaining the difference to my customers would be tricky. So I opted to make it impossible -- if you ask for 5 copies of the software via download you get five copies, if you then ask for the CD instead it will set you up for 5 copies and 5 CDs.

The cart actually remembers everything that you've put into it within the current page, even if you close the lightbox. If I wanted to save it there are options but a) dirty hack and b) most of my customers have no need for it.

Your sample size is also quite low (94% increase = from 8 conversions to 15), and you're using absolute values in the graphs. Did traffic to the page stay constant over all that time?

That last question appears to demonstrate a misconception about A/B tests. I did not test the old cart serially with the new cart -- I've done that sort of thing before, but the results are automatically suspect because factors other than the variable you're testing are constantly changing. An A/B test tests the old cart and the new cart at the same time -- when you open purchasing.htm Google flips a coin and cookies you up with the results. Heads you get the old cart. Tails you get the new cart. No matter how many times you go back you get the same cart (until I terminate the test, obviously).

This means that I'm able to have confidence in the results despite this week having traffic far above my typical values, due to Valentine's Day. (Certain holidays are almost always good to me. Why is outside the scope of this post.)
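The coin-flip-and-cookie assignment described above can be sketched roughly like this (hypothetical names; GWO's actual implementation differs, but the sticky-assignment idea is the same):

```python
import random

def assign_variant(cookies, rng=random.random):
    """Sticky A/B assignment: flip a coin on the first visit,
    then read the stored result on every later visit."""
    if "cart_variant" not in cookies:
        # Heads you get the old cart, tails you get the new cart.
        cookies["cart_variant"] = "old" if rng() < 0.5 else "new"
    return cookies["cart_variant"]

# A returning visitor keeps the same cart no matter how often they reload.
visitor = {}
first = assign_variant(visitor)
assert all(assign_variant(visitor) == first for _ in range(100))
```

Because both carts run concurrently, a traffic spike like Valentine's Day hits both arms of the test equally and washes out of the comparison.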

The sample size was not 8 or 15, incidentally. It was two groups of over a hundred (prospects, not customers). While I'd prefer groups of over a thousand for the obvious reason that it implies I'd sell ten times more software, in stats terms that doesn't make the experiment more valid, it would just decrease the size of the confidence intervals by roughly a factor of sqrt(10), and it might also increase the confidence in the significance test (that was the second 94% value, see the writeup).
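The sqrt(10) claim falls straight out of the standard-error formula for a proportion; a quick check, assuming an illustrative 10% conversion rate (not a figure from the thread):

```python
import math

def se(p, n):
    # Standard error of an observed proportion p over n samples.
    return math.sqrt(p * (1 - p) / n)

# Ten times the traffic narrows the confidence interval by ~sqrt(10), not 10x.
ratio = se(0.10, 100) / se(0.10, 1000)
assert abs(ratio - math.sqrt(10)) < 1e-9
```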


The image clearly shows that the result is statistically insignificant. This is very important, as you could be misled. The general rule of thumb for running a full factorial A/B test (like the ones GWO does) is that you need a baseline of about a million page views with about a 5% conversion rate to get usable results in one week (including weekends).


My understanding of the term "full factorial" is that you have multiple design elements under test at the same time and you're testing all possible combinations. For example, you are testing two alternative images and two alternative headlines. This gives you 2 x 2 = 4 possibilities to show to any given user. As you increase the alternatives for each factor and increase the number of factors, the total number of alternatives grows in a combinatorially explosive fashion and you might indeed need a million page views and 5% conversion rates. (Heck, with six factors of 6 options each, even with a million viewers you'd have fewer people seeing each combination than I did, unless you started pruning them early.)
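The arithmetic in that parenthetical is easy to verify:

```python
# Two images x two headlines: four combinations to split traffic across.
assert 2 * 2 == 4

# Six factors with six options each blows up combinatorially:
combinations = 6 ** 6                        # 46,656 distinct pages
viewers_per_combo = 1_000_000 // combinations
print(combinations, viewers_per_combo)       # 46656 21
```

Twenty-one viewers per combination is far below the "two groups of over a hundred" in the simple two-way test, which is why the million-page-view rule of thumb doesn't transfer.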

But I'm still only testing two alternatives of one factor. I mean, yes, that is included in the definition of "full factorial" but it makes an absolute hash out of that rule of thumb. Two choices total means the stats test is simple and does what it says on the tin: 94% chance that new cart outperforms old cart, exact magnitude of outperformance bounded by calculable confidence intervals.

You can consider 94% insignificant or significant -- your call really. If you choose p = .05, it's insignificant. If you choose p = .1, it's significant. It costs me very little (except opportunity costs) to keep the experiment going, but 94% is good enough for me personally to claim a win out of it.
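For the curious, a minimal sketch of the test being discussed -- a one-sided two-proportion z-test, using the thread's conversion counts (8 vs. 15) and an assumed ~100 visitors per arm, since the exact per-arm traffic isn't given:

```python
import math

def one_sided_confidence(conv_a, n_a, conv_b, n_b):
    """Confidence that variant B's true conversion rate beats A's,
    via a pooled two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF expressed via the error function.
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

conf = one_sided_confidence(8, 100, 15, 100)
print(round(conf, 2))   # 0.94 -- significant at p = .1, not at p = .05
```

With those assumed numbers the confidence lands right around 94%, consistent with the figure quoted in the thread.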


Although I couldn't see a p-value anywhere in the image, I verified for myself that there is roughly 95% significance. And I agree with you that what counts as significant depends on your perspective.

BTW, those wanting to know the math can head to http://20bits.com/articles/statistical-analysis-and-ab-testi...


Sorry, should have been more specific: I meant that you use absolute $ values in your sales-by-month graph -- that must also be dependent on traffic. Obviously the A/B test is not.


Executive summary:

Old cart: Opens new page, standard cart design.

New cart: Lightbox style (gray out the background, box in the middle), ajax shopping cart.


Actually:

Really old way I did things: No cart at all, used matrix of buy-it-now buttons.

Old cart: Lightbox style (gray out the background, box in the middle), ajax shopping cart which takes two seconds to open and is a little cluttered in some use cases. Beats above option by lots.

New cart: Lightbox, feels-like-AJAX (everything is pre-loaded so that opening a new DIV or recalculating prices is essentially instantaneous -- no HTTP request involved), instantaneous response and no clutter. Beats old cart by lots.


This was great reading. I've put off A/B testing with Google Website Optimizer long enough. Looking forward to boosting my revenue! ;-)



