How Socialcam Got Millions of Users With Mobile A/B Testing

Apptimize


We recently sat down with Ammon Bartram, co-founder of Socialcam, a popular mobile social video app with tens of millions of downloads. Ammon tells us how Socialcam optimized its core flows and improved signup and engagement through mobile A/B testing.

It’s extremely hard to tell what people are actually going to use, so we decided to narrow things down and build a system where we could see exactly what people were using and try to start at the beginning and incrementally make it better… We were able to get that to a 2.5x increase in the number of people who complete the flow.

Apptimize: At what point did you realize that you should start A/B testing in order to make your app better?

Ammon: Socialcam started as a project at Justin.tv. For almost a year, we were developing inside the company, but we were not seeing the growth we needed, and we knew we had to do something different. Before that, we had been throwing spaghetti at the wall, just building features. It’s easy to get stuck in that trap of falling in love with your most recent idea, building it, putting it out there, and then seeing no one use it. It’s extremely hard to tell what people are actually going to use. So we decided to narrow things down and build a system where we could see exactly what people were using, start at the beginning, and incrementally make it better. The very first thing we did was put a whole bunch of analytics in the account creation flow and tweak things. We were able to get a 2.5x increase in the number of people who complete the flow once they arrive on the first page.

Apptimize: What kind of metrics were you tracking and how did you get a 2.5x return on just looking at analytics alone?

Ammon: I believe that was a change from a Facebook Connect-only login flow to offering email as an additional option. But that doesn’t explain the entire change. Things like changing a color or some text usually produce only a small improvement. We weren’t so much testing slight design changes as testing structurally different flows: only Facebook Connect, only email, or Facebook Connect and email together. One week we would ask for additional information; the next we would ask for only an email and password. We would throw those out there and see how many people provide information on the first page, how many people enter an email, how many people tap the content creation button, how many people watch their first video, things like that.
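
To make that concrete, here is a minimal sketch (our illustration, not Socialcam’s actual code) of how you might bucket users into signup-flow variants and count how many distinct users reach each funnel step. The variant names, step names, and event format are assumptions for the example.

```python
import hashlib

# Hypothetical variants of the signup flow described above.
VARIANTS = ["facebook_only", "email_only", "facebook_and_email"]

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a user by hashing their ID,
    so the same user always sees the same flow."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

def funnel_counts(events: list[dict], steps: list[str]) -> dict:
    """Given a flat event log, count how many distinct users
    reached each funnel step."""
    reached = {step: set() for step in steps}
    for e in events:
        if e["step"] in reached:
            reached[e["step"]].add(e["user_id"])
    return {step: len(users) for step, users in reached.items()}

# Example: three users see the first page, one completes the flow.
log = [
    {"user_id": "u1", "step": "first_page"},
    {"user_id": "u2", "step": "first_page"},
    {"user_id": "u3", "step": "first_page"},
    {"user_id": "u1", "step": "entered_email"},
    {"user_id": "u1", "step": "watched_first_video"},
]
print(assign_variant("u1"))
print(funnel_counts(log, ["first_page", "entered_email", "watched_first_video"]))
```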

Apptimize: Was it difficult to A/B test when you didn’t have a lot of users?

Ammon: Yes, it is. The real thing you have to do there is limit your variants. Once you have tens of thousands of daily or weekly users, you can come up with twenty variants, throw them out there, and see what works. When you’re first starting, you have tens or hundreds of users, so you have to limit yourself. Say you have two buckets of ideas about how content creation or the signup flow should work: maybe signup should require Facebook Connect in order to get more data, or maybe it should be a simple email-and-password login. You can build those two systems, and maybe they’ll take you two or three weeks to get a meaningful result. But you’re still trending toward an actual answer, toward truth, as opposed to just guessing and throwing something out there.
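
A rough sample-size calculation shows why small apps have to limit variants. This sketch uses the standard normal-approximation formula for comparing two conversion rates; the baseline and target rates are made-up numbers, not Socialcam’s.

```python
from statistics import NormalDist

def sample_size_per_variant(p_base: float, p_target: float,
                            alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate users needed per variant to detect a change from
    p_base to p_target with a two-proportion test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    effect = abs(p_target - p_base)
    n = (z_alpha + z_beta) ** 2 * variance / effect ** 2
    return int(n) + 1

# Detecting a jump from 20% to 30% signup completion needs ~290 users
# per variant; at ~100 new users a week split across two variants,
# that's several weeks per test — hence two big ideas, not twenty.
print(sample_size_per_variant(0.20, 0.30))
```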

Apptimize: So you’ve been gradually improving your app for a number of years now. Do you have any advice or things you wish you had known before you started A/B testing?

Ammon: Probably that you have to evaluate the statistical significance of results, and that even when you do, you can fall into traps. If you’re putting analytics into some kind of complicated flow, there will be bugs that make something slightly wrong; you never conceive of every way a user can pass through a sequence of events in your app. It’s very easy to totally bias a result with a mistake like that. At one point, I ran lots of tests on how many videos, and which videos, we put in each user’s feed. The metric I was using was video likes. When users you follow take videos with Socialcam, we put a little red badge (icon) on the iOS app. I introduced a bug in my A/B test where a broader class of videos would trigger that badge. So for a long time, I thought we had these really good results, this huge increase in engagement. But all it was actually doing was showing users a larger volume of badges. We didn’t find that out for a while, not until we deployed the winning algorithm. Then many more people would see the badge and open the app half expecting to see content they would want to see, but they would open it and see content that didn’t mean anything to them. So the results were totally thrown off by a bug, and there was no way to catch that bug with the test itself; it was in the feature. You just have to be very critical of results that look too good to be true.

The same thing is true with random variation. The human brain is very bad at identifying a meaningful trend, so if you don’t actually do the math, your conclusions are arguable and sometimes simply false. It’s easy to think that button B is pulling ahead of button A: those numbers look plenty big, B is 50% ahead, that must be the winner. But in many cases that’s not true. If you leave the test running for another day or week, you may see a totally different answer. That can lead to a process where you think you’re climbing a ladder by making all of these little changes, but it’s actually totally random.
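
“Doing the math” here typically means something like a two-proportion z-test. The sketch below (our example, with made-up numbers) shows how the same 50% lift can be statistically meaningless on a small sample and decisive on a larger one.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_p_value(conv_a: int, n_a: int,
                           conv_b: int, n_b: int) -> float:
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# "B is 50% ahead" on a tiny sample: 8/100 vs 12/100 clicks.
print(two_proportion_p_value(8, 100, 12, 100))      # ~0.35 — not significant
# The same 50% lift with 10x the traffic:
print(two_proportion_p_value(80, 1000, 120, 1000))  # ~0.003 — significant
```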

Apptimize: How does it work when there’s a person creating a lot of A/B test ideas who isn’t actually able to implement the code themselves?

Ammon: Just coming up with A/B testing ideas is hard; that’s probably as hard as implementing the code. If at all possible, try to test everything. In reality, probably less than half of the things we pushed were actually tested, because there are tons of things that fall into the category of bug fixes, or our designer deciding he wants something. Once you’re Google or Facebook, you probably have enough people that you can, in fact, test every design change. But we limited testing to the key features, the core flows of the app: account creation, content creation, and content consumption, and we tried to drag those numbers up. That’s where we think about things like design. Does making the like button this much bigger or a little bit prettier increase the rate at which it is clicked? The answer is yes, but by a maximum of 5% or 10%, not 400%.

Apptimize: What were tests that made a big difference?

Ammon: Content Creation.
Filters made a big difference. We did live filters (video effects), so you could pull up the app and look at the world with an effect applied to it, and we got a 2x increase in video creation after releasing filters. After that, we tried pushing filters even harder. Because we saw such a huge increase, we had this idea that we should put them totally forward. So we actually had a version of the app where, when you launched it, the content creation screen would be superimposed over the live video image with filters.

Content Consumption.
We found that to be mostly a matter of the content itself. We had a measure of the impression-to-view ratio and the impression-to-like ratio per item of content. For each video, we keep track of everyone who sees it in their feed, everyone who clicks the play button, and everyone who clicks the like button, and we rank content based on those ratios. We had more content than we could display, because if people have too much content, they don’t see it all. So say we decide we want to give people ten videos a day and we have 100 videos that we could show. We would rank those 100 by the impression-to-like ratio and show the top 10. It turned out that a like, a conscious indication that content is good, was a safer signal for us to use than views.
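
Here is a minimal sketch of the ranking scheme Ammon describes: sort candidate videos by their like-per-impression ratio and keep the top ten. The smoothing prior is our addition (to keep a barely-seen video with one lucky like from dominating), not something Socialcam confirmed.

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    impressions: int  # users who saw it in their feed
    likes: int        # users who clicked the like button

def rank_feed(candidates: list[Video], limit: int = 10) -> list[Video]:
    """Rank candidates by impression-to-like ratio and keep the top N.
    The +1/+20 smoothing is an assumed prior, not Socialcam's."""
    def score(v: Video) -> float:
        return (v.likes + 1) / (v.impressions + 20)
    return sorted(candidates, key=score, reverse=True)[:limit]

# 100 candidate videos; show the 10 with the best like ratio.
pool = [Video(f"v{i}", impressions=50 + i, likes=i % 17) for i in range(100)]
for v in rank_feed(pool):
    print(v.video_id, f"{v.likes}/{v.impressions}")
```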

About Apptimize

Apptimize is an innovation engine that provides A/B testing and feature release management for native mobile, web, mobile web, hybrid mobile, OTT, and server. Industry leaders like HotelTonight, The Wall Street Journal, and Glassdoor have created amazing user experiences with Apptimize.

Thanks for reading!