Experiment Results: Changing Keywords to Increase Downloads

Earlier this year I wrote about changing the keywords for Reminder+ (App Store link) – an experiment to see if I could increase downloads via App Store search.

This is what I wrote back in January:

Whilst I’m confident I can make an improvement, I don’t know how big the improvement will be. I’m not expecting anything dramatic, but I do think double digits is a possibility.

It’s taken a while but I now have some results…

Before and after download numbers


  • Period: 05/01/2016 – 17/03/2016, 72 days (before keyword change)
  • App version: 2.0
  • App price: free
  • Downloads: 467
  • Downloads / day: 6.5
  • In-App Purchase: Send via Messages
  • IAP price: $1.99 (Tier 2)
  • IAPs: 37
  • IAPs / day: 0.51
  • IAP conversion rate: 7.9%
  • App Store views: 1411
  • Views per day: 19.6
  • Conversion rate: 33.1%


  • Period: 01/04/2016 – 03/05/2016, 33 days (after keyword change)
  • App version: 2.0.1
  • App price: free
  • Downloads: 280
  • Downloads / day: 8.5
  • In-App Purchase: “Upgrade Pack”
  • IAP price: $1.99 (Tier 2)
  • IAPs: 7
  • IAPs / day: 0.21
  • IAP conversion rate: 2.5%
  • App Store views: 720
  • Views per day: 21.8
  • Conversion rate: 39%
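The per-day and conversion figures in both lists are simple ratios of the raw totals. As a sanity check, a short Python sketch (with the totals above hard-coded) reproduces them:

```python
# Raw totals from the two periods above.
periods = {
    "before (v2.0, 72 days)":  {"days": 72, "views": 1411, "downloads": 467, "iaps": 37},
    "after (v2.0.1, 33 days)": {"days": 33, "views": 720,  "downloads": 280, "iaps": 7},
}

for label, p in periods.items():
    downloads_per_day = p["downloads"] / p["days"]
    views_per_day = p["views"] / p["days"]
    conversion = p["downloads"] / p["views"] * 100      # views -> downloads
    iap_conversion = p["iaps"] / p["downloads"] * 100   # downloads -> IAPs
    print(f"{label}: {downloads_per_day:.1f} dl/day, {views_per_day:.1f} views/day, "
          f"{conversion:.1f}% conversion, {iap_conversion:.1f}% IAP conversion")
```

Running this gives 6.5 vs 8.5 downloads per day, 33.1% vs 38.9% conversion, and 7.9% vs 2.5% IAP conversion, matching the lists (the 39% above is just the rounded figure).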

Not quite double figures, but I’d say the change from 6.5 downloads per day to 8.5 downloads per day is a positive result. (And, though the data only spans a month, the trend is rising slowly.)

The download increase comes from a slightly higher number of views per day and a better conversion rate. The new keywords have higher traffic, but the improved conversion rate suggests that the new keywords are more relevant to the application. I’m pleased about that.

Whilst this experiment was based on an app that has (relatively) a tiny number of downloads¹, I still think the process of reviewing, researching, and iterating on keywords² is worth the time investment.

Negative results: In-App Purchases

You may have noticed in the numbers above that the IAP conversion rate has dropped off a cliff, from 7.9% to 2.5%.

To facilitate the keyword change for Reminder+ I made a small bug-fix update to the app so that I could submit v2.0.1…

Unfortunately, the app was rejected. The rejection required a couple of innocuous changes along with one that became bothersome: the IAP Send via Messages was deemed unsuitable because it provides access to built-in iOS capabilities³.

To cut a long story short, I had to change the name, presentation and content of the IAP. Send via Messages became Upgrade Pack.

I anticipated that a less direct IAP would lead to a lower conversion rate – and that’s exactly what happened 😉

For more download and IAP numbers for Reminder+, look at the Downloads and Sales page.

  1. This app has a tiny number of downloads, which will factor into the changes to App Store views and, ultimately, downloads.
  2. Researching keywords: if you’re curious about how I go about changing keywords, look at Researching (iOS) App Store Keywords with Sensor Tower.
  3. Rule 11.8 – Apps that use IAP to purchase access to built-in capabilities provided by iOS, watchOS, and tvOS, such as the camera or the gyroscope, or Apple-branded peripherals, such as Apple Pencil or Apple Keyboard, or Apple services, such as Apple Music access or iCloud storage, will be rejected.

Results: An App Store Experiment Based on Keyword Research

As I mentioned in a previous post, the initial release of ReminderBin resulted in almost no downloads; consequently App Analytics data was not available.

On the 19th March I made the app free; once it was free, the app started to get some downloads, enough to draw some conclusions for this experiment…

Reminder of experiment goals

From the original post about this experiment – this is what I was trying to find out:

  1. Can the keyword research be trusted? Will ranking in the identified search phrases generate App Store views (traffic)?
  2. Is it possible for a new app that matches the search terms more closely than existing, ranking apps, to rank highly for those terms?
  3. For a particularly narrow niche, can you rely on search terms as the only input when judging market need and conceiving a product to fulfil that need?


App Store views and downloads over the last 4 weeks:

  • 28 days: 26 Mar 2016 – 22 Apr 2016
  • 168 App Store views
  • 6 views per day
  • 1.5 downloads per day
  • 25% view to download conversion rate

Experiment findings

Despite the extremely low downloads, the experiment wasn’t a complete failure.

1. Can the keyword research be trusted? Above, I noted the number of downloads and conversion rate, but it was App Store views that were the primary focus of this experiment.

The app does get views.

SensorTower suggested traffic and I got traffic; to an extent, the keyword research can be trusted.

That said, if I were to repeat this process I’d want to start calibrating the information shown in SensorTower (or a similar tool) so that it could be used with greater confidence. E.g.

  • Would a different app with similar numbers produce similar results?
  • Are the traffic values in SensorTower linear?
  • Is the traffic contribution from closely related terms like “delete my reminder” and “delete my reminders” completely separate?
  • Etc.

2. Can a new app rank highly? This will differ from app to app and, crucially, with the amount of competition for the keywords in question. In this case, with low competition, the results show that a new app can rank highly. Take a look at some data from SensorTower for some of the primary keywords for ReminderBin:

3. Can you rely on search term data only to judge market need and conceive a product? Generally, the answer is clearly “no”. Market research and product development are way too complex to be replaced by some keyword numbers in SensorTower.

However, scaling that down a little, I think the numbers above show that keyword research can reveal aspects of market demand. Such research may serve as the input for further investigation, interviews with users, trial features, maybe even a small MVP, etc.

Whilst it’s difficult to conclude much from such small numbers, the (only) two reviews that the app does have support search term / keyword research being one avenue to judge market need…

Reviewing App Store Page Conversion Rates

I was reminded recently (thanks, Daniel Alm) of how important it is to review your app’s App Store page conversion rate…

What is the App Store page conversion rate?

It is the percentage of customers who view your app’s page in the store and then go on to download it.

Note: that means actually selecting to view your app’s individual page; showing up in the search results list doesn’t count as a view of your app.
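In code, the definition is just a ratio of product-page views to downloads. A minimal sketch (the function name and example figures are my own, for illustration):

```python
def page_conversion_rate(page_views: int, downloads: int) -> float:
    """Percentage of product-page views that become downloads.

    Note: page_views counts visits to the app's individual page,
    not impressions in a search results list.
    """
    if page_views == 0:
        return 0.0
    return downloads / page_views * 100

# Hypothetical example: 1000 page views, 300 downloads.
print(page_conversion_rate(1000, 300))  # 30.0
```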

At a first glance this topic might seem simple: you want a high conversion rate and, by extension, if your conversion rate is too low you’ve got a problem.

Whilst that’s true, I think there are aspects surrounding this conversion rate that are worth consideration…

This is a sales process. The purpose of your app’s page is to convert a potential customer into a download. Whilst we are restricted in what we can do in the App Store, compared to, say, a regular web page, it’s still useful to approach it in this way.

Whilst I’m no expert, the Internet is full of information on this subject. That said, here are some of the basic things to think about:

  • What will potential customers see first? Visually, the icon, app name and 1st screen shot/preview video attract attention – do they communicate a reason to download your app?
  • Is that reason clear? Are you communicating benefits or features? People don’t buy features…
  • How does that look from a search results page?
  • What do you know about the search terms your app ranks for in the store? Does your page communicate solutions that match these problems?

Monitor the conversion rate between versions. This is the obvious thing to do if you’re actively trying to improve your conversion rate by changing the App Store page, but what if you aren’t?

Did you add a preview video or change your screen shots? A slight change to the name shown in the store? Or the icon?

Be sure to line up App Analytics queries with changing product versions to check if anything has changed.

How might we go about improving the conversion rate? Whilst we can’t do A/B testing in the App Store, as mentioned above, we can still monitor changes between versions.

Additionally, we can do regular usability testing on our App Store assets before making a change. For example, if you’re thinking about changing the first two screen shots along with the title, mock up two or three different versions and then test them on people.

Show them a version of the assets and then ask: what problem does this app solve? Or, why should you download this app? This type of testing can be very revealing.

Is this high priority work? It’s good to know whether or not you have a conversion rate problem. Addressing a conversion rate problem could be the highest priority work for your app, with respect to improved downloads/sales.

What’s more, the conversion rate is especially important if the majority of your traffic comes from search, as potential users are considering your app for the first time and so you only have the App Store page to convert them.

Conversion rate numbers for Reminder+

The origin of this post was a reminder to look at the conversion rate for Reminder+ (App Store link).

The following numbers are for the lifetime of v2.0 of the app (26th August 2015 – 17th March 2016):

  • Version 2.0
  • App Store views: 3735
  • Downloads: 1300
  • Conversion rate: 34.8%

I’ve heard that between 20% and 50% is a reasonable range for free apps, so 34.8% is not too bad…

However, consider the numbers for the previous version, between 1st April 2015 (when App Analytics started) and 25th August 2015:

  • Version 1.2.2
  • App Store views: 2192
  • Downloads: 1042
  • Conversion rate: 47.5%

That’s a significant drop in performance, worth consideration.

Note: I’ve updated the Reminder+ numbers page to include the App Store conversion data.