Problem 4. Markov Chains (Initial State Multiplication)

This document discusses Markov chains and their application to modeling consumer brand loyalty over multiple time periods. It provides an example of a Markov chain model with 4 soft drink brands and their transition probabilities. It then presents a 6 brand jeans example, showing the initial market shares and transition matrix. It calculates the probability of consumers remaining with or switching brands over 4 and 5 periods. The summary concludes the document analyzes how Markov chains can model stochastic processes and examines customer brand loyalty over time.

Problem 4.

Markov chains (Initial state multiplication):

Suppose four soft-drink brands are available in the market: Colombiana, Pepsi Cola, Fanta and Coca Cola. A person who last bought Colombiana buys it again with probability 40%, switches to Pepsi Cola with probability 20%, to Fanta with 10%, and to Coca Cola with 30%. A current Pepsi Cola buyer stays with probability 30%, and switches to Colombiana with 20%, to Fanta with 20%, and to Coca Cola with 30%. A current Fanta buyer stays with probability 20%, and switches to Colombiana with 40%, to Pepsi Cola with 20%, and to Coca Cola with 20%. A current Coca Cola buyer stays with probability 50%, and switches to Colombiana with 20%, to Pepsi Cola with 20%, and to Fanta with 10%.

At present (week 3), the brands Colombiana, Pepsi Cola, Fanta and Coca Cola hold the following market shares, respectively: 30%, 25%, 15% and 30%.

             COLOMBIANA  PEPSI COLA  FANTA  COCA COLA
COLOMBIANA      0.40        0.20      0.10     0.30
PEPSI COLA      0.20        0.30      0.20     0.30
FANTA           0.40        0.20      0.20     0.20
COCA COLA       0.20        0.20      0.10     0.50

d. Find the two-step transition matrix (P², the square of the one-step matrix above).

             COLOMBIANA  PEPSI COLA  FANTA  COCA COLA
COLOMBIANA      0.30        0.22      0.13     0.35
PEPSI COLA      0.28        0.23      0.15     0.34
FANTA           0.32        0.22      0.14     0.32
COCA COLA       0.26        0.22      0.13     0.39

e. Find the probability that each user stays with the brand or switches to another for period 4 (Problem 4) and period 5 (Problem 5).

      COLOMBIANA  PEPSI COLA  FANTA   COCA COLA
S0      0.30        0.25      0.15      0.30
S1      0.29        0.225     0.14      0.345
S2      0.286       0.2225    0.137     0.355
S3      0.2845      0.22225   0.136     0.357
S4      0.284       0.222     0.136     0.358
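The state vectors above come from repeated initial-state multiplication, S_n = S_{n-1} · P. A minimal NumPy sketch of that iteration (using the one-step matrix from the problem statement):

```python
import numpy as np

# One-step transition matrix of the soft-drink problem.
P = np.array([
    [0.40, 0.20, 0.10, 0.30],  # Colombiana
    [0.20, 0.30, 0.20, 0.30],  # Pepsi Cola
    [0.40, 0.20, 0.20, 0.20],  # Fanta
    [0.20, 0.20, 0.10, 0.50],  # Coca Cola
])

s = np.array([0.30, 0.25, 0.15, 0.30])  # S0: market shares in week 3
for n in range(1, 5):
    s = s @ P                            # S_n = S_{n-1} · P
    print(f"S{n}:", np.round(s, 4))
```

Rounded to three decimals, S4 agrees with the table above: (0.284, 0.222, 0.136, 0.358).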

[WinQSB output omitted]
Problem 5. Markov chains (initial state multiplication):
Suppose six brands of jeans are available in the Colombian market: Brand 1, Brand 2, Brand 3, Brand 4, Brand 5 and Brand 6. The following table shows the probabilities that a consumer keeps the same brand or switches to another.

STATE     BRAND 1  BRAND 2  BRAND 3  BRAND 4  BRAND 5  BRAND 6
BRAND 1     0.20     0.16     0.15     0.21     0.18     0.10
BRAND 2     0.14     0.18     0.20     0.19     0.15     0.14
BRAND 3     0.13     0.16     0.15     0.21     0.20     0.15
BRAND 4     0.21     0.20     0.15     0.20     0.18     0.06
BRAND 5     0.15     0.15     0.15     0.19     0.15     0.21
BRAND 6     0.17     0.16     0.17     0.18     0.19     0.13

At present (week 4), the brands hold the following market shares, respectively: 20%, 15%, 17%, 15%, 13% and 20%.
TRANSITION MATRIX (two-step, P²)

STATE     BRAND 1  BRAND 2  BRAND 3  BRAND 4  BRAND 5  BRAND 6   SUM
BRAND 1   0.17     0.1698   0.16     0.1981   0.1738   0.1283    1.000
BRAND 2   0.1654   0.1697   0.1618   0.1973   0.1755   0.1303    1.000
BRAND 3   0.1675   0.1696   0.161    0.1962   0.1737   0.132     1.000
BRAND 4   0.1687   0.1702   0.1612   0.1986   0.1722   0.1291    1.000
BRAND 5   0.1686   0.1691   0.1617   0.1958   0.1761   0.1287    1.000
BRAND 6   0.1669   0.1685   0.1606   0.1973   0.1742   0.1325    1.000

INITIAL STATE
      BRAND 1  BRAND 2  BRAND 3  BRAND 4  BRAND 5  BRAND 6      SUM
P0    0.20     0.15     0.17     0.15     0.13     0.20         1.00
P1    0.168    0.169    0.161    0.197    0.17     0.130241     1.00
P2    0.1679   0.1695   0.1611   0.1973   0.17     0.130008977  1.00
P3    0.16791  0.16954  0.16108  0.19725  0.17     0.130008585  1.00
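The rows P1–P3 barely change, which suggests the market shares are settling toward a steady-state distribution. A small NumPy sketch (not part of the original solution) that iterates the one-step matrix until the distribution stops changing:

```python
import numpy as np

# One-step transition matrix of the jeans problem.
P = np.array([
    [0.20, 0.16, 0.15, 0.21, 0.18, 0.10],  # Brand 1
    [0.14, 0.18, 0.20, 0.19, 0.15, 0.14],  # Brand 2
    [0.13, 0.16, 0.15, 0.21, 0.20, 0.15],  # Brand 3
    [0.21, 0.20, 0.15, 0.20, 0.18, 0.06],  # Brand 4
    [0.15, 0.15, 0.15, 0.19, 0.15, 0.21],  # Brand 5
    [0.17, 0.16, 0.17, 0.18, 0.19, 0.13],  # Brand 6
])

pi = np.array([0.20, 0.15, 0.17, 0.15, 0.13, 0.20])  # week-4 shares
for _ in range(50):   # iterate pi <- pi @ P until it converges
    pi = pi @ P
print(np.round(pi, 4))
```

After convergence, pi satisfies pi = pi · P, the defining property of a steady-state distribution.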

e. Find the probability that each user stays with the brand or switches to another for period 4 and period 5.

      BRAND 1  BRAND 2  BRAND 3  BRAND 4  BRAND 5  BRAND 6
P4    0.20     0.15     0.17     0.15     0.13     0.20
P5    0.1681   0.1677   0.1615   0.1969   0.1770   0.1288
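Here P4 is the current (week-4) share vector and P5 is one multiplication by the one-step matrix. A minimal NumPy check of that step:

```python
import numpy as np

# One-step transition matrix of the jeans problem.
P = np.array([
    [0.20, 0.16, 0.15, 0.21, 0.18, 0.10],  # Brand 1
    [0.14, 0.18, 0.20, 0.19, 0.15, 0.14],  # Brand 2
    [0.13, 0.16, 0.15, 0.21, 0.20, 0.15],  # Brand 3
    [0.21, 0.20, 0.15, 0.20, 0.18, 0.06],  # Brand 4
    [0.15, 0.15, 0.15, 0.19, 0.15, 0.21],  # Brand 5
    [0.17, 0.16, 0.17, 0.18, 0.19, 0.13],  # Brand 6
])

p4 = np.array([0.20, 0.15, 0.17, 0.15, 0.13, 0.20])  # week-4 shares (P4)
p5 = p4 @ P                                           # week-5 shares (P5)
print(np.round(p5, 4))
```

Rounded to four decimals, the result reproduces the P5 row above.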

SOLVER

      BRAND 1    BRAND 2     BRAND 3     BRAND 4     BRAND 5     BRAND 6     SUM
P4    0.1445278  0.14592622  0.13864305  0.16977995  0.14994618  0.11190153  1.00
CONCLUSIONS

From this work we can conclude that Markov chains are a tool for analyzing the behavior of certain stochastic processes, that is, processes that evolve over a set of states in a non-deterministic way over time.
Building a Markov chain model requires several elements, chiefly the state vector and the transition matrix.
These elements were introduced by Andrei Markov, who studied sequences of chained experiments out of the need to describe physical phenomena mathematically.
The method is important because in recent years it has come into use as a marketing-research instrument, to examine and forecast customer behavior in terms of loyalty to a brand and switching to other brands. Its application is not limited to marketing; the technique has been applied in many other fields.
