
Intelligent app stores using artificial neural networks

Modern app stores have evolved very quickly, but their pricing models remain outdated and static, in the sense that there is no quantitative way to intelligently estimate the price of an app depending on its demand and usability. In this article I have tried to answer the question 'at what price should I sell my app?' I have then extended this idea to a dynamic, adaptive model in which prices can change depending on demand. In both cases I have estimated the total revenue returns. I have used Lagrange multipliers to carry out the optimization over the neural networks.
Before getting into how this all works, I will explain a few terms that are going to be useful.
Demand (m): The average number of installs per day of an app. This is the maximum-likelihood estimate of the rate of a Poisson distribution.
Latent Demand (M): This is the maximum average demand an app can have at any price; we can arrive at it easily from the number of installs the app's free version gets. In some sense this represents the usefulness or marketability of an app.
Price (p): That's a no-brainer.

How does it work?

Ever heard of the Duckworth-Lewis method in cricket? It's an estimate of the score in a game of cricket in case the match has to be abandoned or otherwise cut short. It arrives at a function of score and overs using historical data of the game, meaning we can estimate the score at the end of the match given some initial conditions. We are going to do similar stuff here. We arrive at a function of demand vs. price using historical data, and using the latent demand that the app currently has we estimate the optimal price, the one at which profit is maximized. I have used Lagrange multipliers to differentiate the neural network and arrive at the profit-maximizing price. A back propagation algorithm has been used to train the neural network, and a feed forward algorithm, modified to implement the Lagrange multipliers, has been used to predict the prices.

Back Propagation

What is a neural network?

It's a universal approximation function/algorithm: it approximates any function (so long as such a function exists; it is meaningless to approximate a non-existent function) to any degree of accuracy given the input data or parameters. It fits the curve that best suits the given input data, just like the Newton forward method or a Taylor series approximation. In fact, it is a linear regression approximation of a curve over multiple parallel iterations. Now, did I use the word neuron in defining it? No, I didn't!

Why is there a neuron in its name?

Simply, someone felt like naming it that way (it has nothing to do with neurons or the way the human brain functions).
What does it look like in the case of two input parameters ($M$ and $m$ in our case)?
Let

$$p = f(M, m)$$

where $p$, $M$ and $m$ stand as defined above and $f$ represents the neural net function with one layer of $k$ nodes. This function can be written as below (tanh has been used as the activation function for its mathematical ease of representation, in contrast to the sigmoid):

$$f(M, m) = \sum_{i=1}^{k} W_i \tanh(U_i M + V_i m)$$
Any function can be approximated using the above setup. The problem of back propagation is to find the values of the $U$, $V$, $W$ vectors that make the above function best fit the historical data and predict the value of $p$.
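As a concrete illustration, here is a minimal back propagation sketch in Python (the language promised for the follow-up post). Everything in it is an assumption made for illustration: the historical data is a synthetic toy demand curve in normalised units, training is plain gradient descent on the mean squared error, and the weight vectors are named U, V, W to match the formula above.

```python
import numpy as np

rng = np.random.default_rng(0)
k = 8      # hidden nodes
N = 200    # number of historical observations

# Hypothetical historical data in normalised units: latent demand M,
# observed demand m, and a toy downward-sloping price curve.
M_hist = rng.uniform(0.2, 1.0, N)
m_hist = M_hist * rng.uniform(0.05, 0.95, N)
p_hist = 2.0 * (1.0 - m_hist / M_hist)

# Weight vectors as in the formula above: p = sum_i W_i tanh(U_i M + V_i m)
U = rng.normal(0.0, 1.0, k)
V = rng.normal(0.0, 1.0, k)
W = rng.normal(0.0, 0.1, k)

lr = 0.1
for step in range(20000):
    Z = np.outer(M_hist, U) + np.outer(m_hist, V)  # pre-activations, N x k
    H = np.tanh(Z)                                 # hidden-layer outputs
    err = H @ W - p_hist                           # prediction error
    # Back propagation: gradients of the mean squared error w.r.t. W, U, V
    dW = H.T @ err / N
    dH = np.outer(err, W) * (1.0 - H**2)           # chain rule through tanh
    dU = dH.T @ M_hist / N
    dV = dH.T @ m_hist / N
    W -= lr * dW
    U -= lr * dU
    V -= lr * dV

print("mean squared fit error:", float(np.mean(err ** 2)))
```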

Maximizing profits with forward feed

Let $p = f(M, m)$ be the price prediction neural network for a given app whose latent demand $M$ is already known. The problem of predicting the right price to maximize profit is to ensure that the value of total sales $\times$ price is maximized, i.e. that $R = m \cdot p$ is maximum.
This converts into a Lagrange multipliers problem in the two unknowns $p$ and $m$: maximize $mp$ subject to the constraint $p = f(M, m)$, with Lagrangian $\mathcal{L} = mp - \lambda\,(p - f(M, m))$. The multiplier equations can be written as follows.
$$p + \lambda \frac{\partial f(M, m)}{\partial m} = 0 \qquad (1)$$

$$m - \lambda = 0 \qquad (2)$$

$$p - f(M, m) = 0 \qquad (3)$$
Solving the above three equations and eliminating $\lambda$ we get

$$f(M, m) + m\,\frac{\partial f(M, m)}{\partial m} = 0 \qquad (4)$$
This can be expanded in terms of the network weights as

$$\sum_{i=1}^{k} W_i \tanh(U_i M + V_i m) \;+\; m \sum_{i=1}^{k} W_i V_i \,\mathrm{sech}^2(U_i M + V_i m) \;=\; 0$$
where the weights are already known from the back propagation algorithm. To solve this we can use the Newton-Raphson method or Muller's method, though even the bisection method should be quite efficient.
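A minimal sketch of this root search in Python, assuming the $U$, $V$, $W$ vectors trained as in the earlier sketch (random placeholders stand in here only so the snippet runs on its own). It uses bisection and simply reports failure when no sign change is bracketed on $(0, M]$, which connects to the point made in the next section:

```python
import numpy as np

# U, V, W would come from the back propagation sketch above; random
# placeholders are used here only so that this snippet runs stand-alone.
rng = np.random.default_rng(1)
U, V, W = rng.normal(size=8), rng.normal(size=8), rng.normal(size=8)

def g(m, M):
    """Left-hand side of equation 4: f(M, m) + m * df/dm."""
    z = U * M + V * m
    h = np.tanh(z)
    f_val = W @ h                          # f(M, m)
    df_dm = W @ (V * (1.0 - h ** 2))       # sum_i W_i V_i sech^2(U_i M + V_i m)
    return f_val + m * df_dm

def solve_demand(M, lo=1e-9, tol=1e-10):
    """Bisection on (0, M]; returns None when g does not change sign there."""
    hi = M
    if g(lo, M) * g(hi, M) > 0:
        return None                        # no bracketed root on this interval
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(lo, M) * g(mid, M) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

M = 0.8                                    # the app's known latent demand
m_star = solve_demand(M)
if m_star is not None:
    p_star = float(W @ np.tanh(U * M + V * m_star))  # optimal price f(M, m*)
    print(f"optimal demand {m_star:.4f} at price {p_star:.4f}")
else:
    print("no root bracketed: no profit-maximizing price on (0, M]")
```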

Proof of no guarantee of a root, and the nature of equation 4

We already know that $m = M$ is the maximum demand, attained at price 0, and that $m = 0$ is the other extreme, attained as the price tends to infinity. The intuition, then, is that there must be a root for $m$ somewhere between $m = 0$ and $m = M$.
$f(M, m)$ is a monotonically decreasing function of $m$: as the price decreases the number of installs increases, which is trivial from basic economics. Hence $\partial f / \partial m < 0$ and $f(M, m) > 0$. We can quickly verify that equation 4 above has a positive value at $m = M$, and a positive value at $m = 0$ as well. Since the sign is the same at both ends, there is no guarantee that a root will always exist; when roots do exist there will be an even number of them, which is understandable as the curve might look like a parabola. There may also be no roots at all. This usually happens when an app's latent demand is already lower than the market average, in which case no price you put on such an app will guarantee any maximization of profits; in all other cases you should find an even number of roots.

Expected Revenue for static and dynamic pricing

In the case of a static price the expected revenue is very straightforward: it is $R = p \cdot m$ (where $m$ is the average number of requests, i.e. the demand, for the app).
In the case of dynamic price changes it is slightly trickier: we need to evaluate the revenue over the whole Poisson distribution of demand and arrive at something like this:

$$E[R] = \sum_{n=0}^{\infty} \frac{e^{-m}\, m^{n}}{n!}\; n\; p(n)$$

where $p(n)$ is the (re-optimized) price charged when the daily demand is $n$.
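For illustration, here is a minimal sketch of both estimates in Python. The helper price_at is hypothetical: it stands in for whatever rule (for example, re-solving equation 4) produces the price when the observed daily demand is n.

```python
from math import exp

def static_revenue(p, m):
    """Static pricing: expected daily revenue = fixed price * average demand."""
    return p * m

def dynamic_revenue(price_at, m, n_max=1000):
    """Dynamic pricing: E[R] = sum over n of Poisson(n; m) * n * price_at(n)."""
    total = 0.0
    prob = exp(-m)                   # P(N = 0) for a Poisson with mean m
    for n in range(n_max + 1):
        total += prob * n * price_at(n)
        prob *= m / (n + 1)          # recurrence: P(n+1) = P(n) * m / (n+1)
    return total

# Example: 30 installs/day on average; price eases off as demand grows.
print(static_revenue(p=2.0, m=30.0))
print(dynamic_revenue(lambda n: max(0.5, 3.0 - 0.02 * n), m=30.0))
```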
I will explain all the algorithms needed to implement these in Python or R in a separate blog post.
