Application of Markov Decision Processes (MDPs) in the Petroleum Industry

Wanying Shi, Jian Guo

Abstract


Gasoline and diesel fuel are the lifeblood that keeps our
daily life moving forward. Inefficient fuel supply operations
lead to unsatisfactory service, wasted time, and low
economic benefits, so determining the optimal timing for gas
stations to replenish gasoline and diesel is important. We propose
applying infinite-horizon Markov Decision Processes (MDPs) to this
dynamic problem. Compared with traditional methods for
determining the optimal replenishment timing, such as IB,
EOQ, and EB, MDPs more accurately model situations that
require sequential decision making under uncertainty. In the
MDP model of the gas station replenishment problem, the states are
the remaining gasoline and diesel inventory levels in the station's
storage tanks, and the reward for any action taken in a state
reflects keeping both the stockout duration and the tanker
trucks' waiting time as low as possible; the optimal policy
maximizes this reward. A real-world case study is presented, and
a revised infinite-horizon MDP model is constructed to
optimize the replenishment timing. Managerial insights are gained
that guide the actions gas stations should take to optimize their
replenishment strategies.
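
To make the modelling idea concrete, the following is a minimal sketch of an infinite-horizon, discounted MDP for a single-fuel replenishment decision, solved by value iteration. It is not the authors' actual model: the state space, action set, demand distribution, penalty weights, and discount factor are all hypothetical values chosen purely for illustration.

```python
# Illustrative sketch of an infinite-horizon discounted MDP for fuel
# replenishment, solved by value iteration. All quantities (tank capacity,
# demand distribution, penalty weights, discount factor) are assumed,
# not taken from the paper.
import numpy as np

CAPACITY = 10            # tank holds 0..10 inventory units (assumed)
ACTIONS = [0, 1]         # 0 = wait, 1 = order a full refill (assumed action set)
DEMAND = {0: 0.2, 1: 0.5, 2: 0.3}   # assumed daily demand distribution
GAMMA = 0.95             # discount factor (assumed)
STOCKOUT_PENALTY = 10.0  # cost per unit of unmet demand (assumed)
WAIT_PENALTY = 2.0       # cost of a tanker truck waiting at a near-full tank (assumed)

def step(state, action, demand):
    """Return (next_state, reward) for one transition."""
    inventory = state
    reward = 0.0
    if action == 1:
        # Refilling a nearly full tank forces the tanker truck to wait.
        if inventory > CAPACITY - 2:
            reward -= WAIT_PENALTY
        inventory = CAPACITY
    unmet = max(demand - inventory, 0)
    reward -= STOCKOUT_PENALTY * unmet        # penalize stockout
    next_state = max(inventory - demand, 0)
    return next_state, reward

def value_iteration(tol=1e-6):
    """Compute the optimal value function and policy by value iteration."""
    V = np.zeros(CAPACITY + 1)
    while True:
        Q = np.zeros((CAPACITY + 1, len(ACTIONS)))
        for s in range(CAPACITY + 1):
            for a in ACTIONS:
                for d, p in DEMAND.items():
                    s_next, r = step(s, a, d)
                    Q[s, a] += p * (r + GAMMA * V[s_next])
        V_new = Q.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=1)
        V = V_new

if __name__ == "__main__":
    V, policy = value_iteration()
    for s, a in enumerate(policy):
        print(f"inventory={s:2d}  best action: {'replenish' if a else 'wait'}")
```

Under parameters like these, the optimal policy typically takes a threshold form, replenishing when inventory drops below some level and waiting otherwise, which is the kind of managerial guidance the abstract refers to.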


Keywords


MDPs, optimization, petroleum industry
