Hooked and booked

At Booking.com, they do a lot of A/B testing.

At Booking.com, they’ve got a lot of dark patterns.

I think there might be a connection.

A/B testing is a great way of finding out what happens when you introduce a change. But it can’t tell you why.
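
To make that concrete, here's a minimal sketch (in Python, with invented figures) of what an A/B test actually gives you: a difference in conversion rates and a significance number, and nothing else. The variant descriptions and counts below are hypothetical, not anything from Booking.com.

    # A minimal sketch of how an A/B test result is read.
    # The figures are invented; the point is that the output says
    # which variant "works", never why it works.
    from math import sqrt

    def z_score(conv_a, n_a, conv_b, n_b):
        # Two-proportion z-test: is the difference bigger than chance?
        p_a = conv_a / n_a
        p_b = conv_b / n_b
        pooled = (conv_a + conv_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        return (p_b - p_a) / se

    # Variant A: the original page.
    # Variant B: adds "Only 1 room left at this price!" messaging.
    z = z_score(conv_a=480, n_a=10000, conv_b=560, n_b=10000)
    print(f"z = {z:.2f}")  # roughly 2.5; above ~1.96 counts as significant at 95%
    # The test says variant B "works". It can't say whether that's because
    # the message is genuinely useful or because it pressures people into booking.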

The problem is that, in a data-driven environment, decisions ultimately come down to whether something works or not. But just because something works, doesn’t mean it’s a good thing.

If I were trying to convince you to buy a product, or use a service, one way I could accomplish that would be to literally put a gun to your head. It would work. Except it’s not exactly a good solution, is it? But if we were to judge by the numbers (100% of people threatened with a gun did what we wanted), it would appear to be the right solution.

When speaking about A/B testing at Booking.com, Stuart Frisby emphasised why it’s so central to their way of working:

One of the core principles of our organisation is that we want to be very customer-focused. And A/B testing is really a way for us to institutionalise that customer focus.

I’m not so sure. I think A/B testing is a way to institutionalise a focus on business goals—increasing sales, growth, conversion, and all of that. Now, ideally, those goals would align completely with the customer’s goals; happy customers should mean more sales …but more sales doesn’t necessarily mean happy customers.

Using business metrics (sales, growth, conversion) as a proxy for customer satisfaction might not always work …and clearly isn’t working on many of these kinds of sites. Whatever the company values might say, a company’s true focus is whatever it measures as its success criteria. If that’s customer satisfaction, then the company is indeed customer-focused. But if the measurements are entirely about what works for sales and conversions, then that’s the real focus of the company.

I’m not saying A/B testing is bad—far from it! (Although it can sometimes be taken to extremes.) I feel it’s best wielded in combination with usability testing with real users—seeing their faces, feeling their frustration, sharing their joy.

In short, I think that A/B testing needs to be counterbalanced. There should be some kind of mechanism for getting the answer to “why?” whenever A/B testing provides the answer to “what?” In-person testing could be one way of providing that balance. Or it could be somebody’s job to always ask “why?” and determine whether a solution is qualitatively—and not just quantitatively—good. (And if you look around at your company and don’t see anyone doing that, maybe that’s a role for you.)

If there really is a connection between having a data-driven culture of A/B testing, and a product that’s filled with dark patterns, then the disturbing conclusion is that dark patterns work …at least in the short term.

Responses

Josh Hughes

“The problem is that, in a data-driven environment, decisions ultimately come down to whether something works or not. But just because something works, doesn’t mean it’s a good thing.” adactio.com/journal/13109

# Posted by Josh Hughes on Saturday, November 18th, 2017 at 8:57pm

Bartosz Borowski

A/B Testing on point 🎯 “The problem is that, in a data-driven environment, decisions ultimately come down to whether something works or not. But just because something works, doesn’t mean it’s a good thing.” adactio.com/journal/13109

Uxfam Ltd

I’m not saying A/B testing is bad—far from it! I feel it’s best wielded in combination with usability testing with real users—seeing their faces, feeling their frustration, sharing their joy. adactio.com/journal/13109

# Posted by Uxfam Ltd on Monday, November 20th, 2017 at 11:15pm

Damon vV

No amount of A/B testing justifies misleading and tricking your customers (creating a false sense of urgency, burying negative reviews, etc.) adactio.com/journal/13109

# Posted by Damon vV on Tuesday, November 21st, 2017 at 12:30am

CSS-Tricks

Does A/B testing lead to dark patterns? adactio.com/journal/13109 “… in a data-driven environment, decisions ultimately come down to whether something works or not. But just because something works, doesn’t mean it’s a good thing.”

# Posted by CSS-Tricks on Wednesday, November 22nd, 2017 at 1:00am

Indi Young

“The problem is that, in a data-driven environment, decisions ultimately come down to whether something works or not. But just because something works, doesn’t mean it’s a good thing.” @adactio bit.ly/2AoYCBo

# Posted by Indi Young on Wednesday, November 29th, 2017 at 9:57pm

ruymanfm

“If I were trying to convince you to buy a product one way I could accomplish that would be to put a gun to your head. If we were to judge by the numbers (100% of people threatened with a gun did what we wanted), it would appear to be the right solution” adactio.com/journal/13109

# Posted by ruymanfm on Monday, December 4th, 2017 at 11:42am

Suzanne Hillman

A/B testing is problematic for a different reason than I realized: “in a data-driven environment, decisions ultimately come down to whether something works or not. But just because something works, doesn’t mean it’s a good thing.” adactio.com/journal/13109

Barry 🎮👻

The problem is that, in a data-driven environment, decisions ultimately come down to whether something works or not. BUT JUST BECAUSE SOMETHING WORKS, DOESN’T MEAN IT’S A GOOD THING.

Zack Argyle

I wholeheartedly agree with your conclusion: “In short, I think that A/B testing needs to be counterbalanced.” Absolutely yes.

# Posted by Zack Argyle on Saturday, January 20th, 2018 at 7:31pm

Zack Argyle

But the tests clearly show that people look at my💩more than my 🍰, so clearly users prefer my 💩 over my 🍰!

# Posted by Zack Argyle on Saturday, January 20th, 2018 at 7:50pm

Carolyn Lyden

“The problem is that, in a data-driven environment, decisions ultimately come down to whether something works or not. But just because something works, doesn’t mean it’s a good thing.” adactio.com/journal/13109

Paula

Is A/B testing a gateway to dark patterns?

# Posted by Paula on Wednesday, October 3rd, 2018 at 11:56am
