Results: An App Store Experiment Based on Keyword Research

As I mentioned in a previous post, the initial release of ReminderBin resulted in almost no downloads; consequently, App Analytics data was not available.

On the 19th March I made the app free; once it was free, it started to get some downloads, enough to draw some conclusions for this experiment…

Reminder of experiment goals

From the original post about this experiment, this is what I was trying to find out:

  1. Can the keyword research be trusted? Will ranking in the identified search phrases generate App Store views (traffic)?
  2. Is it possible for a new app that matches the search terms more closely than the existing ranking apps to rank highly for those terms?
  3. For a particularly narrow niche, can you rely on search terms as the only input when judging market need and conceiving a product to fulfil that need?

Numbers

App Store views and downloads over the last 4 weeks:

  • 28 days: 26 Mar 2016 – 22 Apr 2016
  • 168 App Store views
  • 6 views per day
  • 42 downloads
  • 1.5 downloads per day
  • 25% view to download conversion rate
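
For anyone checking the arithmetic, the derived figures above are straight ratios of the raw numbers. A minimal sketch (Swift; purely illustrative):

```swift
import Foundation

// The raw numbers reported above for 26 Mar – 22 Apr 2016.
let days = 28.0
let views = 168.0
let downloads = 42.0

let viewsPerDay = views / days                 // 6.0
let downloadsPerDay = downloads / days         // 1.5
let conversionRate = downloads / views * 100   // 25.0

print(String(format: "%.0f views/day, %.1f downloads/day, %.0f%% conversion",
             viewsPerDay, downloadsPerDay, conversionRate))
```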

Experiment findings

Despite the extremely low downloads, the experiment wasn’t a complete failure.

1. Can the keyword research be trusted? Above, I noted the number of downloads and the conversion rate, but App Store views were the primary focus of this experiment.

The app does get views.

SensorTower suggested traffic and I got traffic; to an extent, the keyword research can be trusted.

That said, if I were to repeat this process I’d want to start calibrating against the information shown in SensorTower (or a similar tool) so that it could be used with greater confidence. For example (a rough calibration sketch follows this list):

  • Would a different app with similar numbers produce similar results?
  • Are the traffic values in SensorTower linear?
  • Is the traffic contribution from closely related terms like “delete my reminder” and “delete my reminders” completely separate?
  • Etc.
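
To make the calibration idea concrete, here is a hypothetical sketch (Swift). The `Observation` type and every traffic score below are made-up placeholders, not real SensorTower data; the point is simply to check whether observed views scale linearly with the tool’s traffic score:

```swift
// Hypothetical calibration sketch. All traffic scores below are invented
// placeholders, NOT real SensorTower data.
struct Observation {
    let trafficScore: Double  // traffic score reported by the keyword tool
    let viewsPerDay: Double   // views/day actually measured in App Analytics
}

// Pairs collected across several apps/keywords (placeholder values).
let observations = [
    Observation(trafficScore: 2.1, viewsPerDay: 6.0),
    Observation(trafficScore: 4.0, viewsPerDay: 11.5),
    Observation(trafficScore: 6.3, viewsPerDay: 19.0)
]

// If the tool's traffic values are linear, views per unit of traffic score
// should be roughly constant across observations.
let ratios = observations.map { $0.viewsPerDay / $0.trafficScore }
let mean = ratios.reduce(0, +) / Double(ratios.count)
print("views per unit of traffic score:", ratios, "mean:", mean)
```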

2. Can a new app rank highly? This will differ from app to app and, crucially, with the amount of competition for the keywords in question. In this case, with low competition, the results show that a new app can rank highly. Take a look at some data from SensorTower for some of the primary keywords for ReminderBin:

3. Can you rely on search term data only to judge market need and conceive a product? Generally, the answer is clearly “no”. Market research and product development are way too complex to be replaced by some keyword numbers in SensorTower.

However, scaling that down a little, I think the numbers above show that keyword research can reveal aspects of market demand. Such research may serve as the input for further investigation, interviews with users, trial features, maybe even a small MVP, etc.

Whilst it’s difficult to conclude much from such small numbers, the (only) two reviews that the app does have support search term / keyword research being one avenue for judging market need…

Results: Price Increase Experiment

At the beginning of the year I decided to try an experiment and changed the price of the In-App Purchase for Reminder+ (App Store link) from Tier 1 ($0.99) to Tier 2 ($1.99). The price change has been in place for eight weeks.

At the time, I made the following prediction about the change:

Changing from $0.99 to $1.99 won’t have a significant effect (decrease) on the IAP conversion rate.

Well, I was wrong about that, but not too wrong.

The conversion rate dropped from 8.9% to 6.9%; looking at that figure alone, I’d say that is a significant difference. But things are more interesting when the price increase is factored in.

Here are the numbers comparing these two periods. Before the price increase:

  • Period: 26/08/2015 – 04/01/2016, 132 days
  • App version: 2.0
  • App price: free
  • Downloads: 833
  • Downloads / day: 6.31
  • In-App Purchase: Send via Messages
  • IAP price: $0.99 (Tier 1)
  • IAPs: 74
  • IAPs / day: 0.56
  • IAP conversion rate: 8.9%

After the price increase (no other changes; only the IAP price):

  • Period: 05/01/2016 – 29/02/2016, 56 days
  • App version: 2.0
  • App price: free
  • Downloads: 360
  • Downloads / day: 6.4
  • In-App Purchase: Send via Messages
  • IAP price: $1.99 (Tier 2)
  • IAPs: 25
  • IAPs / day: 0.45
  • IAP conversion rate: 6.9%

So, whilst the conversion rate did drop, and my prediction was wrong, the revenue per day still increased:

  • At $0.99: $0.55 / day
  • At $1.99: $0.90 / day
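
As a sanity check, here is a minimal sketch (Swift; purely illustrative) that reproduces those figures from the raw numbers above, along with the break-even conversion rate they imply:

```swift
import Foundation

// Revenue per day from IAP sales alone (ignoring Apple's cut, as above).
func revenuePerDay(purchases: Double, price: Double, days: Double) -> Double {
    purchases * price / days
}

let before = revenuePerDay(purchases: 74, price: 0.99, days: 132)  // ≈ $0.55/day
let after  = revenuePerDay(purchases: 25, price: 1.99, days: 56)   // ≈ $0.9/day

// Break-even conversion rate at $1.99: the old revenue per download
// (8.9% × $0.99) divided by the new price. Anything above this keeps
// daily revenue ahead of the Tier 1 baseline.
let breakEven = (74.0 / 833.0) * 0.99 / 1.99 * 100  // ≈ 4.4%

print(String(format: "before: $%.2f/day, after: $%.2f/day, break-even: %.1f%%",
             before, after, breakEven))
```

Put another way: at Tier 2 the conversion rate could have dropped to roughly 4.4% before daily revenue fell below the Tier 1 baseline, and the observed 6.9% is comfortably above that.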

What am I going to do with these results?

Overall the daily revenue is higher with the IAP set at Tier 2 (almost two-thirds higher), so it makes sense to leave it as is.

Additional notes:

  • When users send a reminder using the IAP, there is a small server cost involved. Charging more per user for this capability is advantageous.
  • Whilst I am biased, I value software, and I think that there is value in this IAP. I prefer the higher price.
  • Although this is untested, at this point I do not think it is worth trying to increase the price beyond Tier 2.
  • I’ve updated the Reminder+ downloads and sales page. Now that I’ve got additional data for v2.0, comparing the IAP conversion rate with v1.2.2 is more interesting. The older version was better at converting users…

Small numbers

I know that the numbers involved with my app are particularly low. That said, I still think this price experimentation may be of interest to others; in other circumstances the impact of a change could be more meaningful.

ReminderBin: An App Store Experiment Based on Keyword Research

In the last post I mentioned that I’d been working on, and had submitted, a new app as part of an experiment. This post will explain more about how the project started and the framework I put in place for this work.

Born from market research

Disclaimer: calling this ‘market research’ might be a bit of a stretch, but given the theme and scope of the project, there’s some sense in calling it this…

Whilst researching keywords for Reminder+ (App Store link), I noticed some interesting data around the words ‘delete’ and ‘remove’, when paired with ‘reminders’:

  • Searches for these phrases had some amount of traffic, not too far from some of the keywords targeted by Reminder+.
  • These phrases appeared to be underserved by apps in the App Store; for example, only between 1 and 5 apps ranked for some of the search phrases.
  • The keywords/phrases indicated a clear ‘problem’ for which people are searching for a solution.

I started to see a gap, which gave me an idea for a quick experiment…

Experiment

Headline:

If research, in the form of App Store search terms, shows demand for a solution to a given problem, and competition for the keywords involved is low, can a simple, targeted app generate revenue?

Specifically, I’m interested to find out:

  1. Can the keyword research be trusted? Will ranking in the identified search phrases generate App Store views (traffic)?
  2. Is it possible for a new app that matches the search terms more closely than the existing ranking apps to rank highly for those terms?
  3. For a particularly narrow niche, can you rely on search terms as the only input when judging market need and conceiving a product to fulfil that need?

Desired results

  1. Answer the questions above.
  2. Generate consistent revenue from an app for which App Store search is the only means of generating leads/traffic. Achieve a target of 2 downloads per day at Tier 1.
  3. Complete the experiment with minimal investment. Make the app and submit it for app review in the equivalent of 2 days of work (16 hours).

More on that last point: this was/is an experiment, and working on the project was an interruption. I wanted to answer some questions without investing too much time.

Approach

  1. Make an app targeted directly at the problem of deleting reminders.
  2. Use a combination of the target search phrases and my own use/ideas to flesh out requirements for the app.
  3. If needed, cut features/scope to finish (submit) on time.
  4. Plan to release without telling anyone: no press emails, no website, no Facebook, etc.

Note: I was comfortable with aggressively cutting features/scope in order to release as soon as possible… Because this is going to be a ‘silent launch’, it’s less important that everything be perfect for making a splash. Iterating quickly after the initial release, if warranted, is a viable approach.

Motivation for this experiment?

This experiment was/is appealing for a few reasons. I’m particularly interested in the keywords aspect:

  • Ranking in search generates traffic. We can measure how much of that traffic is converted to downloads. Is the product a good fit for what people are searching for? Are my assumptions around product-market fit correct?
  • I’m interested to see if I can take control of a small number of niche keywords. If successful, this could be a strategy to re-use in the future.

It’s also a good fit for my general approach to development:

  • The experiment is based on reaching potential customers via App Store search; this is a funnel that can be tapped into without the need for more expensive marketing activities.
  • It’s a small project with which I’ve been able to make meaningful progress within my schedule. I’ve also used existing knowledge and re-used some code.
  • If further development work is justified, small incremental changes will work well.

Pre ‘Ready for Sale’ results

I submitted the app last week, Wednesday 24th Feb.

  • I spent a total of 20.5 hours from start to clicking ‘submit’ in iTunes Connect.
  • The work was done in little chunks, ranging from 15 minute slots up to 2 hours.
  • Whilst I went beyond the target of 16 hours, the work was close enough to my goal of being a small investment. I’m happy with the progress leading up to submitting.

Update: the app has literally just been approved and made Ready for Sale (around 1 hour ago). I’ll post some numbers after a week or so, when we can see which keywords the app ranks for, what traffic that leads to and how many (if any) downloads result.