Real Mathematics – What are the chances?! #4

Coffee of Serkan

On certain days of the week (okay, at least six days a week) I visit a specific coffee shop. Almost all of the baristas know what I drink because of my frequent visits… Or do they?

My preferences change every six months. In the October–March period I drink only latte or filter coffee, while in the April–September period I prefer iced latte or berry.

October–March: If I drink latte today, there is an 80% chance that I will drink latte again the next day. If I drink filter coffee today, the chance that I drink filter coffee tomorrow is 60%.

April–September: If I drink iced latte today, I will drink iced latte tomorrow with 80% probability. For berry, that probability is 90%.

Diagram of my coffee selections in the October–March period.

Question: If I drank filter coffee this morning, what are the chances that I will drink a latte two days later? (We are in February.)

This question carries within it one of the most important findings of mathematics: the Markov Chain.

A Markov Chain example. I will explain what it is in detail later in the article.

It is easy to see that there are two different ways to end up drinking latte two days from now. The sum of their probabilities gives us the answer:

Probability of drinking filter coffee (0.6) tomorrow and latte (0.4) two days later: 0.6 × 0.4 = 0.24.

Probability of drinking latte (0.4) tomorrow and latte again (0.8) two days later: 0.4 × 0.8 = 0.32.

Probability of drinking latte two days from now: 0.24 + 0.32 = 0.56.

This means the chance is 56%.
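The calculation above can be sketched in a few lines of code. The transition matrix below encodes the October–March probabilities from the article (the state numbering is my own assumption):

```python
# States: 0 = latte, 1 = filter coffee.
# Row i gives tomorrow's probabilities when today's drink is state i.
P = [[0.8, 0.2],   # after latte: 80% latte, 20% filter coffee
     [0.4, 0.6]]   # after filter coffee: 40% latte, 60% filter coffee

def two_step(P, start, end):
    """Probability of being in state `end` exactly two days after `start`,
    summing over every possible drink on the day in between."""
    return sum(P[start][mid] * P[mid][end] for mid in range(len(P)))

# Starting from filter coffee (1), probability of latte (0) two days later:
print(round(two_step(P, 1, 0), 2))  # 0.56
```

Summing over the intermediate day is exactly what we did by hand: the `mid = 1` term is the 0.24 path and the `mid = 0` term is the 0.32 path.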

One wonders…

  1. Does it matter which day of February it is today?
  2. Would the answer change if you learned that I had drunk filter coffee yesterday as well? Please elaborate on your answer.
  3. If I drink iced latte on June 11th, what is the probability of me drinking berry on June 14th?

Driverless Cars

If you do a simple web search, you will find thousands of articles asking where the flying cars are. A few generations, including mine, dreamed about flying cars when we were kids. “Back to the Future” was one of the main reasons we had such dreams. And it is not as if we expect time travel. We just want flying cars!


It is 2019 and there are still no flying cars around. Technology has developed only as far as driverless cars. (Only?!)

Decision-making systems are among the key technologies needed to build driverless cars, because a self-driving car makes hundreds of decisions even on a short trip.

At the heart of these decision-making systems is the Markov Chain I mentioned in Serkan’s Coffee. A concept known as the Markov Decision Process is the powerful tool used for driverless cars.

Markov Decision Process (MDP): a mathematical formulation for decision and control problems with uncertain behavior.
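To make the definition concrete, here is a toy MDP of my own invention: a car deciding whether to go or wait at a traffic light. The states, actions, probabilities, and rewards are all assumed numbers, and the solution method shown (value iteration) is one standard way to solve an MDP, not necessarily what any particular car uses:

```python
# Hypothetical states and actions for a car at a traffic light.
states = ["green_light", "red_light"]
actions = ["go", "wait"]

# T[(state, action)] = list of (probability, next_state, reward).
# All numbers below are made-up assumptions for illustration.
T = {
    ("green_light", "go"):   [(1.0, "red_light",  +1.0)],   # proceed safely
    ("green_light", "wait"): [(1.0, "green_light", -0.1)],  # wastes time
    ("red_light",   "go"):   [(0.9, "green_light", -10.0),  # likely accident or fine
                              (0.1, "green_light",  +1.0)],
    ("red_light",   "wait"): [(1.0, "green_light", -0.1)],  # light turns green
}

gamma = 0.9  # discount factor: future rewards count slightly less
V = {s: 0.0 for s in states}

# Value iteration: repeatedly back up the best expected long-run reward.
for _ in range(100):
    V = {s: max(sum(p * (r + gamma * V[s2]) for p, s2, r in T[(s, a)])
                for a in actions)
         for s in states}

# The optimal policy picks, in each state, the action with the best backup.
policy = {s: max(actions,
                 key=lambda a: sum(p * (r + gamma * V[s2])
                                   for p, s2, r in T[(s, a)]))
          for s in states}
print(policy)  # the car learns to go on green and wait on red
```

Even in this tiny example the MDP structure is visible: uncertain outcomes (running a red light only sometimes ends badly) are weighed by probability, and the policy that comes out handles that uncertainty rationally.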

Memory-less Probability

Markov Chain: If an event or system forms a Markov Chain, the future of that system depends only on its current state, not on its past. This property makes it possible to predict the future behavior of the system.

One of the classic examples of a Markov Chain is the Drunkard’s Walk. Reminder: a drunkard makes random decisions while trying to find his/her way home. Assume that the drunkard has made these moves:


The drunkard’s next move does not depend on the previous moves he/she has made. It depends only on his/her current position and the probabilities of the possible moves.

If the drunkard moves from point F, there are four possibilities, and none of them depend on the steps taken before.
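The memorylessness is easy to see in a simulation. Below is a minimal sketch of a Drunkard’s Walk on a grid (an assumed setup, not the article’s exact diagram): notice that the function keeps no history at all, only the current position.

```python
import random

def drunkards_walk(steps, seed=None):
    """Simulate a random walk on a grid; each step depends only on the
    current position, never on the path taken so far."""
    rng = random.Random(seed)
    x, y = 0, 0
    for _ in range(steps):
        # Four equally likely moves, chosen with no memory of the past.
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x, y = x + dx, y + dy
    return x, y

print(drunkards_walk(100, seed=42))
```

Because no past state is stored, the walk is a Markov Chain by construction: from point F, the four next positions have the same probabilities no matter how the drunkard arrived at F.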

The same goes for driverless cars: decisions do not depend on previous ones. For example, if a driverless car is approaching a traffic light, its decision depends on the color of the light, not on the left turn it made 200 meters back.

The Markov Chain was discovered more than 100 years ago, and it is used in economics, meteorology, biology, game theory, and even in modern technologies such as driverless cars and voice recognition systems.

Mathematician Family

The person who gave the Markov Chain its name was a Russian mathematician called Andrei Markov. His younger brother, Vladimir Markov, was also an internationally recognized mathematician. Vladimir died of tuberculosis at the age of 25. Andrei’s son, Andrei Markov Jr., was also a mathematician.

Politics and Andrei

Andrei Markov was involved in politics too. He was not in favor of the Romanov dynasty, which ruled Russia between 1613 and 1917. He showed his opposition by not participating in the 300th anniversary celebration of the Romanov dynasty in 1913. Instead, he celebrated the 200th anniversary of the Law of Large Numbers! (I’ll get back to the Law of Large Numbers later.)

M. Serkan Kalaycıoğlu

