5 tips on Google Play A/B experiments to boost your app installs


29/09/2023

Google Play experiments is a COOL new feature that enables A/B testing of both text and visuals in your Play Store listing, and it was warmly welcomed by the app developer community. It looks like, going forward, optimizing your app’s page with Google Play experiments will become an everyday task. Here are some tips to get you going.

What are Google Play Experiments?

Well, it’s a feature that gives you the option to A/B test different Play Store listing elements in order to optimize the listing and improve the conversion rate – views to installs. Google provides exact statistics on the test results.
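To make ‘conversion rate’ concrete: it’s simply installs divided by store listing visitors. Here’s a minimal sketch (in Python, with made-up numbers purely for illustration) of how the rate and the relative uplift between two variants are computed:

    # Conversion rate = installs / listing visitors.
    # The numbers below are invented, just to show the arithmetic.
    def conversion_rate(installs: int, visitors: int) -> float:
        return installs / visitors

    current = conversion_rate(installs=400, visitors=10_000)   # 4.0%
    variant = conversion_rate(installs=460, visitors=10_000)   # 4.6%

    uplift = (variant - current) / current
    print(f"current: {current:.1%}, variant: {variant:.1%}, uplift: {uplift:+.1%}")
    # -> current: 4.0%, variant: 4.6%, uplift: +15.0%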

Here’s how it’s done:

Launching a Google Play experiment

As a Google Play publisher, visit the Google Play Developer Console and navigate to the ‘Experiments’ tab on the left sidebar.

As long as you’re testing the global market, you’re free to optimize many elements, as presented in the image below. Otherwise, you’ll be limited to testing graphics only, since text changes are naturally segmented by language.

You can also decide on the size of the target audience that will be exposed to the experiment (as long as it’s no more than 50% of the visits).

Sadly, you can’t test the app name or title (which makes sense, though).

Not sure why Google even allows testing several elements all at once, but this is not recommended, for obvious statistical reasons: if you change several things at the same time, you can’t tell which of them actually moved the needle. Also, you can’t create more than one experiment at a time, nor can you edit an experiment once it’s started (the option to add variations once the experiment has started is there, but the checkbox is disabled; it’s also possible that this is a bug that will be fixed in the future).

To learn more, visit Google’s guidelines.

Please note that the process can be quite slow. Here is why:

  1. You can run only one test at a time, which can be frustrating.
  2. You can only send up to 50% of your traffic to the tested versions. This means you need to wait until enough data has been gathered before you can see meaningful results and draw a conclusion (one simple way to judge that is sketched below).
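When is the gathered data ‘enough’? Google’s console reports the statistics for you, but if you want to sanity-check a result yourself, here’s a minimal sketch of a standard two-proportion z-test – a common way to compare two conversion rates, not necessarily Google’s exact methodology, and with invented numbers:

    # A two-proportion z-test: is the difference between two variants'
    # conversion rates likely to be real, or just noise?
    from math import sqrt, erf

    def z_test(installs_a, visitors_a, installs_b, visitors_b):
        """Return the z statistic and two-sided p-value for the difference
        in conversion rates between variants A and B."""
        p_a = installs_a / visitors_a
        p_b = installs_b / visitors_b
        # Pooled rate under the null hypothesis of "no difference"
        p_pool = (installs_a + installs_b) / (visitors_a + visitors_b)
        se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
        z = (p_b - p_a) / se
        # Two-sided p-value from the standard normal CDF
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return z, p_value

    # Invented example: current listing vs. a new icon variant
    z, p = z_test(installs_a=400, visitors_a=10_000, installs_b=460, visitors_b=10_000)
    print(f"z = {z:.2f}, p = {p:.3f}")  # p below ~0.05 suggests a real difference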

BUT, on the plus side, the UX is very friendly and intuitive.

So here are the 5 tips I promised:

  1. Don’t expect magic right from the start! Yes, we’ve all read the recent Cinderella stories, but if you have some experience with A/B testing, you surely know that it’s a long road: you move the needle just a tiny bit with each iteration (which can still be very meaningful, of course). So don’t be surprised if Google experiments don’t exactly rock your world (or should I say your install graph). Disclaimer: DO expect major improvements if you’ve never really paid attention to your store listing (I’m just assuming that’s not the case for most of us).
  2. Test DIFFERENT variants – Use common sense and give the system something to work with. Each variant should be top quality, but genuinely different from the others. You WILL sweat over this, but there are no shortcuts here. Be creative, and spy on competitors or other apps in your category to get inspired.
  3. Streamline your A/B testing – This new Google experiments feature brings real change to mobile app marketing. I believe an experiment should always be running, so the optimization process never really stops. But that also means we must keep new material coming – mainly graphics. That’s challenging. You’re gonna need some help, and you may consider somewhat relaxing your strict brand guidelines for the sake of conversion. Search for ways to get fresh, updated graphics on a minimal budget. One option is to download from our templates library (available for our members). As an ‘appetizer’, we now offer a FREE collection of designed screenshot templates.
  4. Prioritization: visuals over words – Judging from all the industry buzz, it seems the app icon has the biggest potential to make a difference. Makes sense. I mean, think about it… after all, the app icon is the only element that’s visible right there on the results list, before the user makes a tap/no-tap decision.
  5. Keep at it – Tiny improvements, or no improvements at all, can be rather frustrating. But hey, it also means that your initial copy and visuals weren’t so bad, right? That’s what I tell myself when I gaze at the utterly similar trend lines of the different variants I’m testing. As already mentioned, optimization and A/B testing take time, so keep at it and don’t give up. A color change didn’t work? Try a different visual language for your icon, add bullet points to your feature image, or test seasonal graphics (Christmas celebration, a New Year feature image, the Olympics…). Something will work eventually. That’s what testing is for.

These tips summarize our experience with Google Experiments, and we keep at it (as advised) all the time. We would love to hear about your experience with this new feature and the results you’ve managed to generate. Please go ahead and share in the comments section of this post.