Schedule

The 6th Student Conference on Operational Research

last update: 04 April 2018 / 15:33

Overview

Friday 6th April

12:00-13:00  Registration & Lunch
13:00-13:30  Welcome from SCOR18 Committee
13:30-14:30  Sophie Parker (OR Society)
14:45-15:45  Linear Programming
15:45-16:15  Coffee Break
16:15-17:15  Simulation & Decision Theory
17:30-18:30  Simulation
19:00-23:00  Dinner

Saturday 7th April

09:30-10:30  Scheduling
10:30-11:00  Coffee Break
11:00-12:00  Statistics & Forecasting 2
12:00-13:00  Lunch
13:00-14:00  Dr Jonathan Thompson (Cardiff University)
14:15-15:45  WORHP Workshop
15:45-16:15  Coffee Break
16:15-17:45  WORHP Workshop
19:00-23:00  Conference Dinner

Sunday 8th April

09:30-10:30  Heuristics
10:30-11:00  Coffee Break
11:00-12:00  Statistics & Forecasting 1
12:00-13:00  Lunch
13:00-14:00  Youri Moskovic (Satalia)
14:15-14:45  Closing

Linear Programming (Friday / 14:45 – 15:45)

Lorenzo Peirano

Università degli studi di Bergamo

Air intermodal freight transportation: the freight forwarder service problem

Despite being among the most prevalent actors in international multimodal transportation, freight forwarding companies and their optimization problems have received little attention in the literature. In this work we try to fill this gap by presenting the general features of air transportation seen from the freight forwarder's perspective and introducing the air transportation freight forwarders service problem (ATFFSP). A MILP formulation of the problem is proposed and results are studied based on real-life data from an Italian freight forwarding company.
Keywords: Air Transportation, MILP, Freight Forwarders, time-space network

Saranthorn Phusingha

The University of Edinburgh

Benders Decomposition for the Multi-Period Sales Districting Problem

In the sales districting problem, we are given a set of customers and a set of salesmen in some area. The customers are given as points distributed across the area and the salesmen have to provide services at the customers’ locations to satisfy their requirements. The task is to allocate each customer to one salesman. This partitions the set of customers into subsets, called districts. Each district is expected to have approximately equal workload and travelling time for each salesman to promote fairness among them, and the overall travelling distance should be minimal for economic reasons. We now extend this problem to be more realistic by considering that each customer requires recurring services with different visiting frequencies like every week or two weeks during the planning horizon. This problem is called the ‘Multi-Period Sales Districting Problem’. In addition to determining the sales districts, we also want to get the weekly visiting schedule for the salesmen such that the weekly travelling distances are minimal and the workload and travelling time are balanced each week. Although the problem is very practical, it has been studied just recently.

In this presentation, we focus on the scheduling problem for one salesman in a specific district, which is already an NP-hard problem. We start by proposing a mixed integer linear programming formulation. Afterwards, we develop and implement Benders Decomposition to solve the problem, exploiting the structure of the formulation. We also consider several modifications to enhance the performance of the algorithm.
Keywords: Mixed integer linear programming, sales districting problem, multi-period, Benders Decomposition

Abdul Salam Khan

Arts Et Metiers, ParisTech

An Optimized Approach of Cost, Time and Quality towards Reconfigurable Manufacturing System

A Reconfigurable Manufacturing System (RMS) provides an enterprise with best-in-class production systems. Mass production and product diversity are the key components of the configurations offered by an RMS. In the recent past, RMS practice has faced changing dynamics, driven broadly by changes in demand patterns and in product families. High throughput and quick changeover with minimum process cost and optimal quality give an enterprise and its production system a competitive advantage in contemporary markets. The RMS literature distinguishes three configuration aspects for generating systems; soft configuration is the aspect that focuses on the process-planning side of configuration. In this study, we identify common ground between the Total Quality Management and RMS literatures. We discuss changes in part-family production and their relation to the level of configuration. Considering a generic production set-up consisting of n rows and m columns, we provide a general formulation for both cost and process time using Mixed-Integer Linear Programming (MILP). The objective function optimizes the processing, carrying and set-up costs as well as the processing and carrying times. We also formulate the quality parameters using a Taguchi experiment; the parameters for quality optimization are the process capability index, precision, accuracy, probability of a good part and signal-to-noise ratio. The study concludes with an industrial example of process optimization in a production system.
Keywords: process cost, throughput, Mixed Integer Linear Programming, Taguchi Experiment

Simulation & Decision Theory (Friday / 16:15 – 17:15)

Matthias Deceuninck

Ghent University

Outpatient appointment scheduling with unpunctuality: an efficient simulation approach

We consider the problem of appointment scheduling in the presence of unpunctual patients. To address this problem, many studies combine simulation with some heuristic search method. Such optimization techniques involve a huge number of repeated schedule evaluations which makes the simulation time an important factor. In this paper, we propose a variance reduction technique to reduce the time to evaluate the performance of a schedule. The proposed method improves the precision of the simulation estimates through the judicious use of control variates and represents considerable improvement compared to traditional Monte Carlo methods. The evaluation approach is then included in a simulation optimization framework to provide insights into scheduling with unpunctual patients. We examine various unpunctuality distributions and show that the optimal scheduling policy changes when unpunctuality is considered. In many cases, scheduling patients in blocks of two or three patients better mitigates the negative effects of patient unpunctuality.
Keywords: Appointment scheduling, Unpunctuality, Simulation optimization, Control Variates

Youssef Hoceini

Tahri Mohammed University of Bechar

Solving the Problem of the Drinking Water Leak in the Cities by a Multicriteria Decision Analysis Approach: The Inspection of Pipe Sections Case of Bechar City

The town of Bechar, located in the south-west (Sahara) of Algeria, faces, like all Algerian cities, a keen demand for water that grows in proportion to its population: at a growth rate of 3.6%, the population of 192,909 inhabitants in 2007 is estimated to reach 348,415 by 2027, while the water deficit reached -51,954 m3/day in 2013. Among the sources of shortage are climatic vagaries, but also deficiencies in risk management and in the exploitation of resources. To remedy this, we propose in the present paper the design and implementation of a solution for the patrimonial management of the city's water networks, based on data from a series of inspections of pipeline sections and a multicriteria decision analysis approach. The latter consists in grouping sections into homogeneous classes according to their state. In consultation with the network manager, three classes are defined: "good", "medium" and "bad". We treat the problem as a "Beta" assignment (sorting) problem, as it is named in the partial aggregation (outranking) approach of multicriteria analysis theory, and we use the ELECTRE Tri method to assign sections to the predefined categories. The results of this work contribute to solving the problems related to drinking water leaks in cities and open perspectives for applying this kind of approach to patrimonial management in general.
Keywords: Multicriteria decision analysis, patrimonial management, drinking water leak, partial aggregation, assignment problematic, ELECTRE Tri
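The pessimistic ELECTRE Tri assignment used in this kind of study can be sketched as follows. This is a simplified illustration using only a weighted concordance test (no discordance or veto thresholds, which full ELECTRE Tri includes); the profiles, weights and cutting level below are made-up values, not those agreed with the network manager.

```python
def concordance(section, profile, weights):
    """Weighted share of criteria on which the section scores at
    least as well as the profile (higher score = better condition)."""
    total = sum(weights)
    return sum(w for s, p, w in zip(section, profile, weights) if s >= p) / total

def electre_tri_pessimistic(section, profiles, weights, cut=0.6):
    """Assign a pipe section to 'good', 'medium' or 'bad'.
    `profiles` lists the lower profile of 'good' then of 'medium',
    ordered from best to worst; a section falls into the first
    category whose lower profile it outranks."""
    categories = ["good", "medium", "bad"]
    for cat, profile in zip(categories, profiles):
        if concordance(section, profile, weights) >= cut:
            return cat
    return categories[-1]
```

With profiles `[[8, 8, 8], [5, 5, 5]]` and equal weights, a section scoring `[9, 9, 7]` outranks the upper profile on two of three criteria (concordance 2/3 >= 0.6) and is assigned to "good", while `[2, 3, 4]` falls through to "bad".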

Yagmur Gok

The University of Edinburgh

Simulation-Optimization Methods for Resilient Airport Resource Management

This doctoral research looks at airport resource allocation problems from the perspective of the airport operator, whilst considering the needs of other parties (e.g. airlines, ground handlers), by developing and testing novel Simulation-Optimization (SO) methodologies. The OR/MS community has long studied aviation problems, including those related to airport operations. Efficient use of airport resources is one such problem: it is computationally difficult to solve to optimality for real-world instances, and it naturally involves trading off the interests of multiple stakeholders. Virtually all airport resource allocation problems manifest themselves in two main versions. First, tactical allocation plans covering a day/month/season ahead of the day of operations have to be developed and agreed between parties. In this case, robust schedules that can better withstand the inevitable operational disruptions are preferred. Second, operational (or online) reallocation plans have to be developed on the day of operation, to counter disruptions as they happen. For both versions, SO methodologies are proposed. These combine metaheuristics and Constraint Programming methods, to keep solution times within reasonable limits while catering for the most prominent uncertain aspects of these problems.
Keywords: airport operations, simulation, optimization, metaheuristics, Constraint Programming, Multi-Criteria Decision Making

Simulation (Friday / 17:30 – 18:30)

Usame Yakutcan

University of Hertfordshire

Constructing Patient Flow of COPD Patients in an Outpatient Department for Discrete Event Simulation

Chronic Obstructive Pulmonary Disease (COPD), a progressive lung disease, is the world's third deadliest disease and can frequently lead to emergency admissions. In the UK, about 1 million people have COPD.

COPD is mainly modelled by researchers using Markov modelling and Monte-Carlo simulation to assess the effectiveness of new drugs, therapies, and treatments. However, there is no work in the literature on COPD patient flow modelling at the operational level. First of all, conceptualising the COPD patient pathway is needed to better understand COPD services, to give healthcare decision makers and model developers a common language, and, more importantly, to capture reality within the modelling environment.

In this paper, therefore, a COPD patient flow diagram for the outpatient department is developed using healthcare guidelines and input from COPD service professionals at a well-known hospital in the UK. The pathway, including outpatient clinics and services, is modelled using discrete-event simulation, and a set of possible scenarios is tested. Lastly, an initial set of model results is presented and analysed. In conclusion, the study will be helpful for conceptualisation studies of other diseases as well as for COPD services in the UK, supporting better planning and management.
Keywords: Chronic Obstructive Pulmonary Disease, COPD, Discrete-event Simulation, Patient Flow, Conceptualisation, Outpatient Department

Muhammed Ordu

University of Hertfordshire

Better understanding of clinic utilization for projection in a trauma & orthopaedics outpatient clinic

Population growth is one of the most important factors increasing hospital demand. Limited hospital resources may be inadequate to meet this increasing demand, causing an intensive workload for healthcare providers. Trauma & orthopaedics outpatient clinics involve some of the highest patient activity and follow-up treatment volumes in hospitals. This study develops a discrete event simulation model for projecting clinic utilization rates of a trauma & orthopaedics outpatient clinic in an English hospital. The model uses a large dataset covering the 2009/10 – 2012/13 financial years, and we address the hospital's demand over the next three years by considering growth projections for the region the hospital serves. An experimental design with three parameters affecting the clinic utilization rate (demand, clinic slots and number of follow-ups) was used for the scenario analysis, which comprises a total of 8 experiments for each projected year. This study provides insight for a better understanding of clinic utilization rates in outpatient clinics.
Keywords: Discrete event simulation, Trauma & orthopaedics, Clinic utilization, Growth projection
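The utilization calculation at the heart of such a model can be illustrated with a toy discrete-event simulation of one clinician's session. This is a sketch under simplified assumptions (punctual arrivals at booked slots, exponential consultation times); the slot structure and parameters are illustrative, not the hospital's.

```python
import random

def clinic_utilization(n_slots, slot_len, mean_consult, rng):
    """One session: patients arrive punctually at their booked slots,
    consultations are exponentially distributed (toy assumption).
    Utilization = total consulting time / session length."""
    free_at = 0.0   # time at which the clinician next becomes free
    busy = 0.0
    for i in range(n_slots):
        arrival = i * slot_len
        start = max(arrival, free_at)
        consult = rng.expovariate(1.0 / mean_consult)
        free_at = start + consult
        busy += consult
    session_end = max(free_at, n_slots * slot_len)
    return busy / session_end

rng = random.Random(42)
rates = [clinic_utilization(12, 15, 12, rng) for _ in range(500)]
avg = sum(rates) / len(rates)
```

Scenario analysis then amounts to re-running this with different demand (`n_slots`), slot lengths and consultation-time assumptions and comparing the resulting rates.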

Emmanuel Musa

Aston University, Birmingham West Midlands

Simulation modelling of product-service systems: a comparative study

To increase revenue and competitive advantage, Original Equipment Manufacturers (OEMs) have increased the level and sophistication of the service components in their offerings, in an integrated format known as product-service systems (servitization). Though servitization promises immense benefits for OEMs, managers in servitizing firms face the challenge of determining the cost of achieving outcomes or performance, and what to charge. This paper presents discrete event simulation models comparing two variants of product-service systems (a traditional product-plus-maintenance contract and an availability-based contract) under the same operating parameters for a precision machine manufacturer. A cost-price analysis is used to quantify trade-offs between the two models.
Keywords: Discrete-event Simulation, Availability-based contract, Cost, Price

Scheduling (Saturday / 09:30 – 10:30)

Florian Hauck

Freie Universität Berlin

Data-driven Strategic Railway Timetable Optimization

Delayed passenger trains are still a major issue for railway companies and one of the main reasons for displeased customers. In 2017, only about 80% of all ICE and IC trains in Germany arrived on time. In many other countries as well, railway companies struggle to achieve high punctuality rates. One reason for this problem is the complexity of the network infrastructure and the high capacity utilization of rails, combined with an extensive overall planning process on different levels.
The focus of this ongoing research project is to improve the punctuality of passenger trains by developing a data-driven optimization approach. Many research projects have already tackled this problem, and different approaches have been developed to optimize infrastructure, timetables, or connections. We want to enhance existing approaches by developing a data-driven add-on for subsequently tuning a given timetable based on historic delay data. To this end, we analyze train delay data from recent years from a German railway company and identify systematic delays. Next, we perform small timetable adaptations in order to avoid such systematic delays. The original timetable should be retained as far as possible so that existing connections and frequencies remain preserved; therefore, we only adjust buffer times or arrival and departure times within a minor time window. The simultaneous consideration of adaptations for the whole network makes this optimization problem a highly complex combinatorial task.
In the talk, we focus on problem definition, data-analytics results, and a concept for the adjustment of buffer times.
Keywords: Railway Timetable Optimization, Data Analytics, Data-driven Optimization
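The buffer-tuning step described above can be sketched, for a single stop, as choosing an empirical quantile of the observed delays capped by the allowed timetable shift. This is a simplified stand-in for the network-wide optimization; the quantile level and window size are made-up parameters.

```python
def added_buffer(delays, q=0.8, window=3):
    """Extra buffer (minutes) for one timetable point: the q-quantile
    of historically observed delays, capped by the maximum shift
    `window` allowed without disturbing connections."""
    s = sorted(delays)
    idx = min(int(q * len(s)), len(s) - 1)   # simple empirical quantile
    return min(s[idx], window)
```

For example, with observed delays `[0, 0, 1, 2, 5, 10]` the 0.8-quantile is 5 minutes, so a 3-minute window caps the added buffer at 3.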

Xia-Jie Yi

Ghent University

Robust approaches for scheduling round-robin soccer leagues

This paper tries to improve the robustness of soccer schedules via combined proactive and reactive approaches. Despite efforts to compute high-quality initial schedules, they are rarely played fully as planned because of uncertainties such as bad weather conditions, conflicts with other events, etc. Consequently, some games are postponed or cancelled, inducing deviations from the initial schedules, which we call disruptions. Fixtures as they were actually played are called realized schedules. We assume that disrupted games can only be rescheduled to so-called fictive rounds, which are not present in initial schedules.
For evaluating the quality of initial and realized schedules, measures involving breaks, ranking and failures are used. A break corresponds to two consecutive games of the same team with identical home-away advantage. We develop a ranking difference (RD) measure based on the difference between the number of games that a team should have played and actually played after each round. We call a disruption a failure if it cannot be rescheduled.
We investigate several combined proactive and reactive scheduling approaches. Proactive strategies decide when to schedule fictive rounds. Reactive strategies decide to which fictive round a disruption should be rescheduled under one of two assumptions: (i) disruptions are rescheduled immediately and permanently, and (ii) disruptions are rescheduled to fictive rounds temporarily and can be changed later if more information becomes available. In computational experiments, we generate disruptions based on probability distributions resulting from our empirical study, and we combine proactive strategies with reactive strategies to compare the resulting schedules on breaks, RD, and failures.
Keywords: Breaks, Ranking Difference, Robustness, Proactive and Reactive Strategies
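The two schedule-quality measures defined above are easy to state in code. This is a sketch for a single team; the paper's exact definitions may differ in detail.

```python
def count_breaks(pattern):
    """A break = two consecutive games of a team with the same
    home/away status; `pattern` is e.g. 'HAHH' (H = home, A = away)."""
    return sum(1 for a, b in zip(pattern, pattern[1:]) if a == b)

def ranking_difference(planned, played):
    """Sum over rounds of |games the team should have played so far
    minus games it actually played|; `planned`/`played` give the
    number of games per round."""
    rd = should = actual = 0
    for p, a in zip(planned, played):
        should += p
        actual += a
        rd += abs(should - actual)
    return rd
```

So a perfectly alternating pattern such as 'HAHAH' has no breaks, and a team whose round-2 game was disrupted and made up in round 3 accumulates a ranking difference of 1 for the round it was behind.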

Mathieu Vandenberghe

Ghent University

Protecting a Sensitive Queue from Arrival Variability

Many industrial systems can be viewed as a chain of interlinked subsystems. A chief focus of operations research is to optimize these links, particularly the transfer of items from one stage to the next. In some sectors (notably the food industry) it is furthermore critical to minimize the total (waiting) time of items between production stages; for example due to regulations that limit exposure time to ambient temperature. In that sense, queues in these production lines can be called sensitive.
We envision a two-stage production system: stage 1 comprises multiple parallel machines whose processed items must all be served by stage 2, a machine with a single sensitive queue. We assume that the processing times (stage 1) have distinct distributions while all service time distributions (stage 2) are identical. If our objective is to minimize the waiting time of items in the system, we should avoid ‘burstiness’ at the sensitive queue’s entrance. Various strategies can be used for this, but we focus on spreading the expected completion times of stage 1 as evenly as possible. This objective has strong similarities to the Stochastic Break-In-Moment (SBIM) problem in operating room scheduling; with the major difference that the assignment of tasks to machines can be freely chosen in the current problem.
We investigate the improvement that this free assignment enables, and under what conditions it justifies the extra computational cost. We will present the potential of the assignment problem (as compared to SBIM) under various instance sizes, levels of stochasticity and restrictions.
Keywords: Combinatorial Optimization, Stochastic Programming, Production, Assignment

Heuristics (Sunday / 09:30 – 10:30)

Khaled Alrajhi

King Khaled Military Academy

A Hyper Heuristic Algorithm for the Frequency Assignment Problem

This paper studies a hyper heuristic (HH) algorithm for the minimum-order frequency assignment problem (MO-FAP). The goal of the MO-FAP is to assign a frequency to each request while satisfying a set of constraints and minimizing the number of used frequencies. HH can be thought of as an algorithm that combines multiple heuristics to solve hard combinatorial optimization problems. Such heuristics are called low level heuristics (LLHs) and are managed by HH. Different mechanisms for selecting the LLHs are investigated. Several techniques are also used to make the HH algorithm in this study more efficient. One of these is using a lower bound on the number of frequencies that are required for a feasible solution to exist. Moreover, each LLH is associated with an independent tabu list. Overall, the HH algorithm showed competitive performance compared with other algorithms in the literature.
Keywords: Frequency Assignment, Hyper Heuristic, LLH Selection
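The selection mechanism of such a hyper-heuristic can be sketched on a toy problem: randomly choose a low-level heuristic, apply it, and keep the move only if it improves. The bit-flip heuristics and the target function below are illustrative stand-ins, not the paper's MO-FAP components or its tabu-list mechanism.

```python
import random

def hyper_heuristic(obj, init, llhs, iters, rng):
    """Generic selection hyper-heuristic with improve-only acceptance:
    each iteration picks one low-level heuristic (LLH) at random."""
    best, best_val = init, obj(init)
    for _ in range(iters):
        llh = rng.choice(llhs)
        cand = llh(best, rng)
        val = obj(cand)
        if val < best_val:            # accept only improving moves
            best, best_val = cand, val
    return best, best_val

# toy target: minimize the number of 1-bits (a stand-in objective)
def flip_one(s, rng):
    i = rng.randrange(len(s))
    return s[:i] + str(1 - int(s[i])) + s[i + 1:]

def flip_two(s, rng):
    return flip_one(flip_one(s, rng), rng)

rng = random.Random(0)
sol, val = hyper_heuristic(lambda s: s.count("1"), "1" * 16,
                           [flip_one, flip_two], 500, rng)
```

A real implementation would replace the random choice with the selection mechanisms the paper investigates and attach a tabu list to each LLH.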

Sezgi Tekil

Istanbul Gedik University

An Algorithm for the Large Scale Bin Packing Problem

The bin packing problem is an NP-hard problem in which items of different volumes must be packed into a finite number of bins so as to minimize the number of bins used. Despite its computational complexity, it can be solved with sophisticated algorithms. The bin packing problem can also be seen as a special case of the cutting stock problem. Many variations of the problem have been studied in the literature, such as one-, two- and three-dimensional packing, linear packing, packing by weight and packing by cost. Likewise, bin packing algorithms have many applications, such as container loading, loading trucks with weight capacity constraints, creating file backups on media, technology mapping and cloud computing.
International trade has grown in significance with the evolution of technology. To keep logistic operations low-cost, accurate and rapid, mathematical optimization tools and models are widely used and needed. Moreover, the container loading problem frequently occurs in logistic operations, and appropriate algorithms must be applied to obtain optimal results. In some cases, however, existing algorithms are not sufficient to solve these bin packing problems. In this study, the logistic operations and container loading plan of a filter factory were investigated. To improve the loading speed, decrease the number of containers used and support optimal decisions, the 2D Combinatorial Area Fitting Algorithm (2D-CAMFA) was developed. The 2D-CAMFA model was compared with the Shelf Best Fit Algorithm, an exact algorithm for the two-dimensional bin packing problem, and the optimum result was obtained from 2D-CAMFA. After examination, the model was applied to real filter-package data and again produced the optimum result.
Keywords: Bin packing, Heuristic Algorithms, Container Loading Problems, 2D-CAMFA, Shelf Best Fit Algorithm
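Shelf-style 2D packing heuristics of the kind used as a comparison baseline can be sketched as follows. This is a simplified shelf next-fit variant, not the paper's 2D-CAMFA or its exact Shelf Best Fit comparator; bin sizes are illustrative and each rectangle is assumed to fit in an empty bin.

```python
def shelf_pack(rects, bin_w, bin_h):
    """Pack (w, h) rectangles into bin_w x bin_h bins: sort by
    decreasing height, fill each shelf left to right, stack shelves
    bottom to top, and open a new bin when the next shelf doesn't fit.
    Returns the number of bins used."""
    bins, shelf_x, shelf_y, shelf_h = 1, 0, 0, None
    for w, h in sorted(rects, key=lambda r: -r[1]):
        if shelf_h is None:              # first shelf of the first bin
            shelf_h = h
        if shelf_x + w > bin_w:          # shelf full: open a new shelf
            shelf_y += shelf_h
            shelf_x, shelf_h = 0, h
        if shelf_y + h > bin_h:          # bin full: open a new bin
            bins += 1
            shelf_x, shelf_y, shelf_h = 0, 0, h
        shelf_x += w
    return bins
```

Sorting by decreasing height guarantees every item on a shelf fits under the shelf height set by its first item; four 5x5 squares fit in one 10x10 bin, a fifth forces a second bin.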

David Van Bulck

Ghent University

Scheduling time-relaxed double round-robin tournaments with availability constraints

In the time-relaxed double round-robin problem with availability constraints, we are given a set of slots and a set of teams that meet one another a fixed number of times. Furthermore, each team provides slots in which it can host a game, and slots in which it cannot play at all. Since time-relaxed schedules contain (many) more slots than there are matches per team, the number of rest days between two consecutive matches of the same team can vary considerably. Ideally, a team has at least R days rest between two consecutive matches in order to avoid injuries. Hence, we apply a penalty each time a team has less than R days between two consecutive matches; this penalty increases as the number of rest days decreases. The goal is to find a schedule that minimizes the total penalty.

To solve this problem, we first propose fix-and-relax decomposition methods based on the teams as well as on time-intervals. Basically, fix-and-relax methods initially solve a relaxed IP-formulation and then gradually reoptimize the model with the integrality constraints enabled for a small subset of variables. The method then fixes the variable values of the selected group and repeats the procedure until all variables have an integral value. Second, we propose a genetic algorithm enhanced with an improvement heuristic that sequentially solves a transportation problem which (re)schedules all home games of a team. For numerous artificial and real-life instances, these heuristics generate near-optimal schedules using considerably less computational resources compared to integer programming (Gurobi).
Keywords: time-relaxed sports scheduling, fix-and-relax, evolutionary algorithms
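The rest-day penalty described above can be sketched for one team as follows. The linear penalty shape here is a simple illustrative choice; the paper only requires that the penalty increase as the number of rest days decreases.

```python
def rest_penalty(match_days, R=3):
    """Total penalty for one team's schedule: each pair of consecutive
    matches with fewer than R rest days in between is penalized by
    (R - rest_days), so tighter gaps cost more."""
    days = sorted(match_days)
    total = 0
    for d1, d2 in zip(days, days[1:]):
        rest = d2 - d1 - 1
        if rest < R:
            total += R - rest
    return total
```

For instance, matches on days 1, 3 and 8 with R = 3 incur a penalty of 2 for the single rest day between days 1 and 3, and nothing for the four rest days before day 8.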

Statistics & Forecasting 1 (Sunday / 11:00 – 12:00)

Leonie Tabea Goldmann

The University of Edinburgh

Use of Social Media Big Data for Enhancing the Effectiveness of Credit Allocation to Companies

A large number of studies focuses on predicting the bankruptcy of companies using various data sources; however, current models can still be improved to achieve even better predictions. This research introduces a new method to more accurately predict the probability that a company will suffer financial distress. More specifically, it exploits the vast increase in social media data put out by companies, which should enable the development of more predictive models than those developed in the past. This is done by relating a type of information not previously considered in the literature, the content of company tweets on Twitter, to the incidence of distress. It is expected that, prior to distress, the content and other aspects of tweets will differ between companies that subsequently suffer distress and those that do not. A large sample of NYSE-listed companies is used, and specialist tools are applied to download and analyse the content of tweets alongside standard control variables. The performance of models with and without tweets is compared, as well as the performance of models that use different types of classifiers.
This research will make novel contributions to the literature. Furthermore, it can have impact of considerable significance and reach, in that banks, building societies, data companies and industrial corporations can potentially make use of the models it will develop.
Keywords: Credit Risk Modeling, Sentiment Analysis, Bankruptcy Prediction

Andrea Mor

University of Brescia

Dynamic traveling salesman problem with uncertain release dates

The dynamic traveling salesman problem with uncertain release dates (DTSP-urd) is a problem in which a dispatcher receives goods from its suppliers while the distribution takes place. The goods to be delivered are available for distribution only after they arrive at the dispatcher's depot; the arrival time of goods at the depot is called the release date of a parcel. In the DTSP-urd, release dates are uncertain and are dynamically updated as the distribution takes place. The objective is to minimize the total time needed to serve all customers, given by the sum of the traveling time and the waiting time at the depot. The waiting time arises because the vehicle has to wait at the depot for all the parcels of the customers it is going to serve on the next route. A reoptimization technique is proposed to tackle the dynamic aspect of the problem, and three dynamic policies with increasing reoptimization frequency are introduced. Two models for the solution of the problem at each decision epoch are discussed: the first is an adaptive deterministic model using a point estimate of the release dates; the second is a stochastic model exploiting the entire probabilistic information available for the release dates.
Keywords: Traveling salesman problem with release dates, dynamicity, uncertainty, reoptimization
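The waiting-time trade-off at the heart of the problem can be made concrete with a tiny evaluation function: each route departs only once every parcel for its customers has been released. The release dates and travel times below are made-up numbers for illustration.

```python
def completion_time(routes, release, travel, start=0.0):
    """Serve the given routes in order. `routes`: list of routes
    (lists of customer ids); `release`: customer -> release date;
    `travel`: route tuple -> driving time. The vehicle waits at the
    depot until all parcels of the next route have arrived."""
    t = start
    for route in routes:
        depart = max([t] + [release[c] for c in route])
        t = depart + travel[tuple(route)]
    return t

release = {"a": 2, "b": 5}
travel = {("a",): 4, ("b",): 3, ("a", "b"): 6}
one_route = completion_time([["a", "b"]], release, travel)    # wait until 5, then drive 6 -> 11
split = completion_time([["a"], ["b"]], release, travel)      # depart at 2 and 6 -> 9
```

Here serving both customers in one route means idling until the later parcel arrives, so splitting into two routes finishes earlier (9 vs 11), which is exactly the kind of decision the dynamic policies revisit at each epoch.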

Kelvin Onyibor

European University of Lefke

Aid-Growth Relationship: Evidence from A Triangular Co-Integration Analysis For The 5 Poorest Countries Of The World

This paper focuses on the impact of foreign aid on economic growth for the five poorest economies of the world (Niger, Congo, Burundi, Malawi, and the Central African Republic). Employing an endogenous growth model and annual time series data from 1986-2015, we examine this nexus using a triangulation of approaches and a robust model specification: we test the long-run relationship between foreign aid and economic growth through the ARDL bound test of co-integration, and we incorporate an Error Correction Model to test the short-run relationship alongside the speed of adjustment back to the long-run equilibrium after a short-run shock. The results show a positive and significant long-run relationship for all countries except Burundi, which displayed no long-run relationship; interestingly, no short-run relationship was found for any country. The conclusion drawn from this paper is that governments have to re-evaluate the sectoral allocation of the foreign aid they receive to ensure higher efficiency and a positive effect on the economic growth of these countries.
Keywords: foreign aid, economic growth, Autoregressive Distributed Lag Model, Error Correction Model
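The long-run/short-run logic behind an error correction model can be sketched with a simple two-step Engle-Granger-style regression on synthetic data. This is plain OLS via numpy for illustration only; the paper's ARDL bounds-testing procedure is more involved than this sketch.

```python
import numpy as np

def ecm(y, x):
    """Two-step sketch: (1) long-run OLS y_t = a + b x_t;
    (2) regress dy_t on dx_t and the lagged long-run residual.
    The coefficient on the lagged residual is the speed of
    adjustment back to the long-run equilibrium (expected < 0)."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    X = np.column_stack([np.ones_like(x), x])
    a, b = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - (a + b * x)                      # deviation from equilibrium
    dy, dx = np.diff(y), np.diff(x)
    Z = np.column_stack([np.ones_like(dx), dx, resid[:-1]])
    const, beta, gamma = np.linalg.lstsq(Z, dy, rcond=None)[0]
    return b, beta, gamma   # long-run slope, short-run slope, adjustment speed
```

On synthetic cointegrated series (a random walk x and y = 2x + stationary noise), the long-run slope recovers 2 and the adjustment coefficient comes out negative, as the theory requires.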

Statistics & Forecasting 2 (Saturday / 11:00 – 12:00)

Ezgi Aslan

Yildiz Technical University

Prediction of Particulate Matter Concentrations: A Literature Review

Taking precautions against air pollution is essential due to its significant adverse effects on human health; forecasting air pollutant concentrations has therefore attracted special attention in recent times. Particulate matter (PM) is one of the most critical air pollutants, given its link to cardiovascular and respiratory diseases and lung cancer. In this study, we aim to present a literature review of PM concentration prediction methods.
Keywords: Air Pollution, Air Quality Forecasting, Machine Learning, Particulate Matter, PM10, PM2.5

Stefanos Leonardos

National and Kapodistrian University of Athens, Greece

Comparative Statics via Stochastic Orderings in a Two-Echelon Supply Chain

We revisit the classic Cournot model and extend it to a two-echelon supply chain with an upstream supplier who operates under demand uncertainty and multiple downstream retailers who compete over quantity. The supplier’s belief about retail demand is modeled via a continuous probability distribution function $F$. If $F$ has the decreasing generalized mean residual life (DGMRL) property, then the supplier’s optimal pricing policy exists and is the unique fixed point of the mean residual life (MRL) function. This closed form representation of the supplier’s equilibrium strategy facilitates a transparent comparative statics and sensitivity analysis. We utilize the theory of stochastic orderings to study the response of the equilibrium fundamentals – wholesale price, retail price and quantity – to varying demand distribution parameters. We examine supply chain performance, in terms of the distribution of profits, supply chain efficiency, in terms of the Price of Anarchy, and complement our findings with numerical results.
Keywords: Continuous Distributions, Demand Uncertainty, Generalized Mean Residual Life, Comparative Statics, Sensitivity Analysis, Stochastic Orderings
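For reference, the mean residual life function of the demand distribution $F$ (with survival function $\bar F = 1 - F$) and the fixed-point characterization stated in the abstract can be written as follows; this is the standard MRL definition, with the precise model assumptions left to the paper:

```latex
m(x) \;=\; \mathbb{E}\left[\, D - x \mid D > x \,\right]
     \;=\; \frac{\int_x^{\infty} \bar F(u)\, du}{\bar F(x)},
\qquad
w^{*} \;=\; m(w^{*}),
```

where $D \sim F$ is the uncertain retail demand and $w^{*}$ is the supplier's optimal wholesale price.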

Ehsan Yadollahi

Ghent University

The Impact of Forecasting Accuracy on Stochastic Periodic Inventory Routing Problem

The Inventory Routing Problem (IRP) is the underlying optimization problem for a warehouse replenishing multiple retailers so as to satisfy their demands while minimizing transportation and inventory costs within the system. Forecasting accuracy plays an important role in the IRP, because demands are stochastic and the planning covers a longer horizon. The level of uncertainty over the planning horizon makes it challenging for decision makers to find the optimal quantities to deliver to the retailers as well as the best routes to reach them. In this paper we consider the impact of forecasting accuracy on inventory and service levels for a set of randomly dispersed retailers around a central warehouse in a distribution centre. We optimize the inventory levels and expected costs for each retailer in an illustrative example, and simulate the results to compare the actual and expected outcomes of the model.
Keywords: Inventory Routing Problem, Forecasting, Simulation, Statistical Distribution, Safety Stock
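The link between forecasting accuracy and service level can be illustrated with the textbook safety-stock rule, where less accurate forecasts (a larger forecast-error standard deviation) directly inflate the stock needed for a given service level. This is a standard formula used for illustration, not the paper's model.

```python
from math import sqrt

def safety_stock(z, sigma_error, lead_time):
    """Safety stock = z * sigma_error * sqrt(lead_time), where z is
    the standard-normal quantile of the target service level and
    sigma_error is the per-period forecast-error std. deviation."""
    return z * sigma_error * sqrt(lead_time)
```

For example, a ~95% service level (z = 1.65), a forecast-error standard deviation of 10 units and a 4-period lead time give a safety stock of 33 units; halving the forecast error halves it.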