Tuesday, March 5, 11 AM
Online lunch talks
- February 2017, Tuesday 7 (12:30 pm): by Maria-Isabel Restrepo (INRIA Centre de recherche Lille-Nord Europe)
Integrated Shift Scheduling and Load Assignment Optimization for Attended Home Delivery
In this work, we study an integrated shift scheduling and load assignment optimization problem for attended home delivery. The proposed approach is divided into two phases, each corresponding to a different planning level: tactical and operational. At the tactical level, a daily master plan is generated for each courier; this master plan defines the working shifts, the origin-destination pairs to visit, and the number of packages to deliver. At the operational level, delivery orders are allocated to couriers in real time. The stochastic and dynamic nature of customer orders is included in the tactical and operational decision levels, respectively. Results on real-world-based instances from a delivery company demonstrate that our approach provides robust tactical solutions that easily adapt to fluctuations in customer orders, preventing additional costs related to the underutilization of couriers and the use of external couriers to satisfy all delivery requests.
- January 2017, Tuesday 10 (12:30 pm): by Trivikram Dokka (Lancaster University Management School)
Robust Toll Pricing
We study the toll pricing problem when the non-toll costs on the network are not fixed and can vary over time. We assume that users, who take their decisions after the tolls are fixed, have full information of all costs before making their decision. Toll-setters, on the other hand, have no information about future costs on the network; the only information available to them is historical information (samples) of the network costs. In this work, we study this problem on parallel networks, and on networks with a small number of paths, in the single origin-destination setting. We formulate the toll pricing problem in this setting as a distributionally robust optimization problem and propose a method to solve it. We illustrate the usefulness of our approach with numerical experiments on a parallel network.
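As a toy illustration of the sample-based robust idea (not from the talk; all numbers hypothetical), consider a two-path parallel network where one unit of demand takes the cheaper path and the toll-setter maximizes worst-case revenue over historical cost samples:

```python
# Sample-based robust toll on a two-path parallel network (hypothetical data).
# A single unit of demand takes the cheaper path; the toll-setter earns the
# toll only if the tolled path is chosen. The robust toll maximizes the
# worst-case revenue over the historical cost samples.

# historical samples of (non-toll cost on tolled path, cost on free path)
samples = [(2.0, 5.0), (3.0, 5.5), (2.5, 4.0)]

def worst_case_revenue(toll):
    # in each scenario the user picks the cheaper path (ties go to the toll road)
    return min(toll if c_tolled + toll <= c_free else 0.0
               for c_tolled, c_free in samples)

# grid search over candidate tolls
candidates = [i / 100 for i in range(0, 501)]
best_toll = max(candidates, key=worst_case_revenue)
```

Any toll above the smallest sampled cost gap loses the user in some scenario, so the robust toll here equals that gap (1.5).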
- November 2016, Wednesday 23 (02:30 pm): by Jorge Amaya (Universidad de Chile)
Applied Research in OR at the Center for Mathematical Modeling, University of Chile
We address three optimization problems arising from the industry:
- Production Scheduling for the Mining Industry (sequence of ore extraction);
- Crew Scheduling for Train Transportation (drivers assignment); and
- Optimal Resource Allocation in Public Education Systems (school location).
In this talk we will briefly describe the main concepts associated with these problems and the corresponding mathematical models. These projects are being developed in collaboration with industrial companies.
- September 2016, Wednesday 14 (01:00 pm): by Johan Barthelemy (University of Wollongong)
A stochastic activity-based model and a strategic agent framework for traffic simulation
The work presented in this seminar focuses on the simulation of the mobility behaviour of (large) populations, i.e. the travel demand and its assignment on the road network.
The travel demand is simulated using a stochastic activity-based approach in which demand is derived from the activities performed by individuals. The proposed model is distribution-based and requires only minimal information, but is designed to easily take advantage of any additional network-related data available. The approach has been applied to the Belgian synthetic population. The quality of the agent behaviour is discussed using statistical criteria, and the results show that the model produces satisfactory results.
The demand can then be assigned to the network. For that purpose, a novel agent-based framework is being developed. The proposal relies on the assumption that travellers follow routing policies rather than fixed paths, leading us to allow each simulated agent to apply, in real time, a strategy that can re-route it depending on the perceived local traffic conditions. This re-routing process allows the agents to react directly to any change in the road network. For the sake of simplicity, the strategy is modelled with a simple neural network whose parameters are determined in a preliminary training stage. The inputs of this neural network read local information about the road network, and the output gives the action to undertake: stay on the same path or modify it. As the agents use only local information, the overall network topology does not really matter, so the strategy is able to cope with large networks. Numerical experiments are performed to test the robustness and adaptability to new environments. The methodology is also compared against MATSim and real-world data. The outcome of the experiments suggests that this work in progress already produces encouraging results.
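As a minimal sketch of such a strategy (illustrative only; the actual model, inputs and trained weights of this work are not public), a single neuron mapping local congestion readings to a stay/re-route decision could look like:

```python
# Minimal sketch of an agent's re-routing strategy as a tiny neural network.
# The weights below are hypothetical; in the framework described, they would
# be learned during a preliminary training stage.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def reroute_decision(local_inputs, weights, bias, threshold=0.5):
    """local_inputs: local traffic readings (e.g. congestion on the current
    edge and on the best alternative). Returns True if the agent should
    modify its path, False to stay on it."""
    activation = sigmoid(sum(w * x for w, x in zip(weights, local_inputs)) + bias)
    return activation > threshold

# hypothetical trained weights: congested current edge, free alternative -> reroute
weights, bias = [4.0, -4.0], 0.0
reroute_decision([0.9, 0.1], weights, bias)  # heavy congestion ahead -> True
reroute_decision([0.1, 0.9], weights, bias)  # current path clear -> False
```

Because the decision depends only on local readings, the same strategy applies unchanged however large the network is, which is the scalability argument made above.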
Where: HEC-University of Liege – 14, rue Louvrex (N1)- 4000 Liège Etilux Room
- May 2016, Wednesday 25 (10:30 am): Greet Vanden Berghe (KU Leuven)
Theoretical and practical challenges in nurse rostering
Health care is under high pressure to improve efficiency, given increasing requirements and decreasing resources. Among the activities to optimise, nurse rostering is one of the most relevant to address. The problem is computationally complex and has potential for improvement through automated decision support. Personnel rosters also have a considerable socio-economic impact.
This optimisation problem has yielded ample practice-oriented operational research approaches. Despite the vast amount of academic research results, it remains hard for novice developers to profit from general insights or re-usable approaches. This 'cold start' issue can be partially explained by the complicated regulations typical of personnel environments with 24/7 duties, which differ in almost every organisation. The issue also persists due to the lack of a theoretical framework for nurse rostering.
From an academic point of view, interesting models have been developed for varying nurse rostering problems. The approaches range from self-rostering and manual problem decompositions to different levels of automated decision support.
The seminar will focus on the challenging interplay between practical and theoretical nurse rostering contributions.
Where: HEC-University of Liege – 14, rue Louvrex (N1)- 4000 Liège Room 1715
- May 2016, Friday 20 (01:30 pm): Gilbert Laporte (HEC Montreal)
The Fascinating History of the Vehicle Routing Problem
The Vehicle Routing Problem (VRP), introduced in 1959 by Dantzig and Ramser, plays a central role in distribution management. It consists of designing a set of least-cost delivery or collection routes for a set of vehicles based at a depot and visiting a set of geographically scattered customers, subject to a variety of constraints. The most common constraints are capacity constraints, duration constraints and time windows. This talk will concentrate on the so-called classical VRP with capacity constraints only. The VRP is ubiquitous and highly important from an economic point of view. From a research perspective, it occupies a central place in operations research. Its study by the scientific community has fueled the development and growth of several families of exact and approximate algorithms. Exact algorithms such as branch-and-cut, column generation and branch-and-cut-and-price owe part of their evolution to the study of the VRP. Similarly, the most common classical heuristics and most of the more recent metaheuristics have been developed through the study of the VRP. In this talk I will highlight several of these developments. In spite of all the attention the VRP has received over the past 55 years, it can still only be solved exactly for relatively small instances (with slightly more than 100 customers), and the corresponding algorithms are rather intricate. Over the past 10 years or so, several powerful metaheuristics have been put forward for the approximate solution of the VRP. The best ones combine concepts borrowed from local search and genetic search. Nowadays, the best metaheuristics can rather quickly generate solutions whose value lies within 1% of the best known solution values on a set of benchmark instances. This talk will also review these developments. It will close with some research outlooks.
Where: HEC-University of Liege – 14, rue Louvrex (N1)- 4000 Liège Room 030
- April 2016, Friday 29 (02:00 pm): Laurie Garrow (Georgia Institute of Technology)
Estimation of discrete choice modeling parameters for revenue management models
We develop a new parameter estimation routine for estimating multinomial logit models for which one alternative is completely censored. Our method is based on decomposing the log likelihood function into particular marginal and conditional components. Simulations based on industry hotel data clearly demonstrate the superior computational performance of our method over existing methods that are capable of estimating price effects. We consider extensions of our methodology to Generalized Extreme Value (GEV) discrete choice models that allow for flexible product substitution patterns. We show how the GEV choice-based sampling estimator can be applied to estimate parameters associated with censored alternatives, and derive identification rules that can be used to determine when these parameters are unique. Empirical examples based on simulated datasets demonstrate the large-sample consistency of estimators and provide insights into data requirements needed to estimate these models for finite samples.
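For context, the basic multinomial logit choice probabilities underlying such models can be sketched as follows (a textbook formula, not the censored-alternative estimator of the talk):

```python
# Multinomial logit (MNL) choice probabilities: P(i) = exp(V_i) / sum_j exp(V_j),
# where V_i is the deterministic utility of alternative i. This is the standard
# textbook model, not the decomposed log-likelihood estimator of the talk.
import math

def mnl_probabilities(utilities):
    m = max(utilities)                      # subtract max for numerical stability
    exps = [math.exp(v - m) for v in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# e.g. three fare products with hypothetical deterministic utilities
probs = mnl_probabilities([1.0, 1.0, 0.0])
```

The censoring issue the talk addresses arises when one alternative (e.g. "no purchase") is never observed, so its contribution to the likelihood above cannot be computed directly.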
Where: HEC-University of Liege – 14, rue Louvrex (N1)- 4000 Liège Room 1715
- April 2016, Wednesday 27 (10:30 am): Philippe Chevalier (UCL)
Models for horizontal supply chain collaboration
Horizontal collaboration can bring significant benefits to firms even in a competitive environment. In this talk we will present models for different types of collaboration ranging from Joint Ventures to joint planning of production equipment. Depending on the nature of the collaboration agreement, the incentive problems to be solved to reach an agreement that is both efficient and stable are different. In this talk we present different models to study collaboration agreements between firms. We analyse the investment decision, the capacity allocation decision and the short term planning decision.
Where: HEC-University of Liege – 14, rue Louvrex (N1)- 4000 Liège Room Etilux
- March 2016, Tuesday 8 (01:30 pm): Dr. Tom Van Lier (Vrije Universiteit Brussel)
An external cost calculator framework for evaluating the sustainability of transport solutions
Not taking external transport costs into account distorts market price signals, results in excessively high (road) traffic levels and prevents arriving at a societal optimum. The economic theory is sound, but practical applications are difficult due to the complexity of monetizing the different externalities in a sufficiently accurate way. The presentation will consist of three parts. First, a short introduction to the different transport externalities will be given, together with the current state-of-the-art methodologies for determining the external costs. In the second part, some cases of external transport cost calculations for assessing the sustainability of transport options will be discussed, e.g. in the field of horizontal logistics collaboration and teleworking. The third part will discuss the challenges for particular external cost categories in more detail, and will explore some potential ways forward to arrive at more accurate external cost assessments.
- February 2016, Wednesday 17 (10:30 am): Thierry Vanelslander (University of Antwerp)
Port-Related Innovation: Is There a Recipe for Success?
Innovation in general seems to happen very rapidly these days. However, the poor innovative strength displayed by the transport sector in the broad sense often contrasts strongly with that evidenced elsewhere. At the same time, it can be concluded from existing literature and studies that quite a lot of innovative concepts in transportation have been studied in detail. The main focus hitherto, however, has always been on inventing or introducing new concepts and procedures.
First, a typology of different types of innovation is developed, illustrated with the actual occurrence in practice of each innovation type.
This paper assesses which innovations generate which chain impacts, what conditions induce actors to innovate or prevent them from doing so, and finally what governments can do to stimulate innovation.
The present paper applies this framework to innovation by seaport-related operators through a number of cases. For each case, a standard set of information is collected, drawing on existing documents about the case, on which a scientific review is performed; sector and project contacts are also used to verify and complete the information. In order to derive meaningful conclusions from the set of cases, the fuzzy-set qualitative comparative analysis (fsQCA) technique is used.
The analysis shows that no unique recipe for innovation success exists, but that sub-groups of cases can be discerned with common characteristics that consistently lead to innovation success, relating to terminal operators, shipping lines, infrastructure and the innovation champion.
The findings are particularly relevant for those in business and in policy-making who are involved in implementing or promoting innovation.
Where: HEC-University of Liege – 14, rue Louvrex (N1)- 4000 Liège Room Etilux
- February 2016, Tuesday 1 (01:30 pm): by Diego Cattaruzza (Ecole Centrale de Lille)
Vehicle Routing Problems for City Logistics
City Logistics has been introduced to study and understand transport activities in urban areas. Optimisation of urban delivery planning needs to take into account the dynamics of the city in order to provide efficient services. In today's delivery systems it is common to forbid heavy trucks from entering city centers. Trucks are forced to unload at logistic platforms located on the outskirts of the city. Merchandise is then loaded into smaller (and possibly eco-friendly) vehicles in charge of the final delivery to customers.
First we present a survey of the movements of goods that occur in cities. This motivates the study we conducted and the development of optimisation tools for delivery activities in urban areas.
Second, we introduce the Multi-Trip Vehicle Routing Problem with Time Windows and Release Dates.
Routing problems that take into account time windows and multiple delivery trips are commonly studied by the scientific community, especially when applied to the urban context. On the other hand, routing problems that consider release dates on the products have received little attention. Here the release date models the instant the merchandise becomes available at the logistic platform, ready for final dispatching. A genetic procedure is presented to tackle the problem.
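The timing logic that release dates add to multi-trip routing can be sketched as follows (hypothetical data; a simple schedule evaluation, not the genetic procedure itself):

```python
# Sketch: start times of a vehicle's successive trips when orders carry
# release dates at the logistic platform. A trip cannot start before the
# vehicle is back from its previous trip, nor before all its orders are
# released. Data is hypothetical.

def trip_start_times(trips):
    """trips: list of (latest release date among the trip's orders, duration)."""
    starts, vehicle_ready = [], 0.0
    for release, duration in trips:
        start = max(vehicle_ready, release)   # wait for vehicle and merchandise
        starts.append(start)
        vehicle_ready = start + duration
    return starts

trip_start_times([(0, 4), (3, 2), (10, 3)])  # -> [0, 4, 10]
```

The third trip illustrates the effect: the vehicle returns at time 6 but must idle until its merchandise is released at time 10, which is exactly the waiting that good trip composition tries to avoid.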
This work was conducted in the context of the MODUM project (http://www-lipn.univ-paris13.fr/modum), which was funded by the French National Research Agency (ANR).
Where: HEC-University of Liege – 14, rue Louvrex (N1)- 4000 Liège Room Etilux
- December 2015, Thursday 3 (11:00 am): by Pieter Vansteenwegen (KU Leuven)
Advanced methods for robust timetabling for the Belgian railway network
In nearly saturated station areas, the limited capacity is one of the main causes of delay propagation. In the case of frequently occurring small delays, spreading the trains well in these areas has, in practice, a big impact on the total travel time of all passengers in the railway network. We focus on improving performance in the bottleneck of the network in order to improve the performance of the whole railway network. We propose a heuristic method to improve an existing timetable, as well as a method that builds, from scratch, a routing plan and a cyclic timetable that optimally spread the trains in space and time. These methods are applied to a case study of the Brussels railway station area, the bottleneck of the Belgian railway system. Both methods are able to improve the timetable significantly.
Where: HEC-University of Liege – 14, rue Louvrex (N1)- 4000 Liège Room 138
- November 2015, Friday 6 (01:30 pm): by Roel Leus (KU Leuven)
An exact algorithm for parallel machine scheduling with conflicts
We consider an extension of classic parallel machine scheduling, where an undirected conflict graph is part of the input. Each node in the graph represents a job, and an edge implies that its two jobs cannot be scheduled on the same machine. The goal is to find an assignment of the jobs to the machines such that the maximum completion time is minimized. We present an exact algorithm based on branch-and-price.
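A minimal sketch of the problem's feasibility condition and objective (hypothetical data; the branch-and-price algorithm itself is not shown):

```python
# Sketch: verify a machine assignment against the conflict graph and compute
# the makespan (maximum machine completion time). Data is hypothetical.

def makespan(assignment, durations, machines):
    loads = [0.0] * machines
    for job, machine in assignment.items():
        loads[machine] += durations[job]
    return max(loads)

def feasible(assignment, conflicts):
    # no edge of the conflict graph may have both endpoints on one machine
    return all(assignment[u] != assignment[v] for u, v in conflicts)

durations = {0: 3, 1: 2, 2: 2, 3: 1}
conflicts = [(0, 1), (1, 2)]           # jobs 0-1 and 1-2 must be separated
assignment = {0: 0, 1: 1, 2: 0, 3: 1}  # candidate schedule on 2 machines
feasible(assignment, conflicts)        # -> True
makespan(assignment, durations, 2)     # -> 5
```

With an empty conflict graph this reduces to classic makespan minimization; the conflict edges add a coloring-like restriction on top of the load balancing.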
Where: HEC-University of Liege – 14, rue Louvrex (N1)- 4000 Liège Room 034
- October 2015, Friday 9 (01:30 pm): by Professor Christoph Buchheim (TU Dortmund)
Robust Preprocessing for Real-time Combinatorial Optimization
We investigate a new concept in robust combinatorial optimization that aims at alleviating the so-called price of robustness: instead of a single solution as in the strictly robust setting, we allow the computation of k solutions for a given number k, assuming that we may choose the best of these solutions once the actual scenario is revealed. The aim is thus to solve a min-max-min problem: find k feasible solutions such that the best of them is optimal in the worst case. In a typical application of such an approach, an uncertain combinatorial optimization problem needs to be solved frequently and quickly, e.g., a parcel service needs to solve the same vehicle routing problem every day under changing traffic situations. After a potentially expensive preprocessing phase for computing the k solutions, the best of them can then be chosen very quickly every day once the actual traffic situation is known. We discuss the complexity of the resulting min-max-min problem for convex and (time permitting) discrete uncertainty. In the case of a convex uncertainty set, we show that the problem is as easy as the underlying certain problem if k is greater than the number of variables and if we can optimize a linear function over the uncertainty set in polynomial time. In the case of discrete uncertainty with a fixed number of scenarios, we show, among other results, that the problem is weakly NP-hard and admits an FPTAS for the shortest path problem, the minimum spanning tree problem, and the knapsack problem, while it is tractable for the min-cut problem.
(joint work with Jannis Kurtz)
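The min-max-min objective can be illustrated by brute force on toy data (all numbers hypothetical; real instances require the algorithms discussed in the talk):

```python
# Brute-force sketch of the min-max-min idea over a discrete uncertainty set:
# choose k solutions so that the best of them is optimal in the worst scenario.
# Toy data: rows = candidate solutions, columns = scenario costs.
from itertools import combinations

def min_max_min(costs, k):
    """costs[s][w] = cost of solution s under scenario w."""
    scenarios = range(len(costs[0]))
    best_set, best_value = None, float("inf")
    for subset in combinations(range(len(costs)), k):
        # worst case over scenarios of the best solution in the subset
        value = max(min(costs[s][w] for s in subset) for w in scenarios)
        if value < best_value:
            best_set, best_value = subset, value
    return best_set, best_value

costs = [[1, 9],   # solution 0: good in scenario 0, bad in scenario 1
         [9, 1],   # solution 1: the mirror image
         [5, 5]]   # solution 2: mediocre everywhere
min_max_min(costs, 1)   # single robust solution -> solution 2, value 5
min_max_min(costs, 2)   # with k = 2 we can hedge -> {0, 1}, value 1
```

The k = 1 case is the strictly robust problem and pays the full price of robustness; allowing k = 2 precomputed solutions recovers the best possible cost in every scenario here.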
Where: HEC-University of Liege – 14, rue Louvrex (N1)- 4000 Liège Room Etilux
- June 2015, Tuesday 16 (10:45 am): by Luce Brotcorne (Inria Lille – Nord Europe)
A bilevel mathematical programming approach to pricing problems
I will first give an introduction to bilevel mathematical programming, focusing in particular on its potential for solving pricing problems. Bilevel models make it possible to represent hierarchical decision processes in which one decision agent (the leader) intrinsically takes into account the reaction of another agent (the follower) in order to reach its objective.
I will then describe in more detail an application in the energy context, in which the leader is an electricity supplier and the follower is a set of customers connected to a Smart Grid operator. I will conclude after discussing the results obtained.
- March 2015, Friday 20 (02:00 pm): by Bernard Fortz (ULB)
Computational strategies for a multi-period network design and routing problem
The conventional multicommodity capacitated network design problem deals with the simultaneous optimization of capacity installation and traffic flow routing, where a fixed cost is incurred for opening a link and a linear routing cost is paid for sending traffic flow over a link. The routing decision must be performed such that traffic flows remain bounded by the installed capacities. In this talk, we generalize this problem over multiple time periods using an increasing convex cost function which takes into account congestion (number of routing paths per edge) and delay (routing path length).
We propose a compact Mixed Integer Linear Program (MILP) formulation for this problem, based on the aggregation of traffic flows by destination, following the per-destination routing decision process underlying packet networks. We observe that the resolution with realistic topologies and traffic demands rapidly becomes intractable with state-of-the-art solvers, due to the weak linear programming bound of the proposed MILP formulation. We also introduce an extended formulation in which traffic flows are disaggregated by source-destination pairs, while keeping the requirement of destination-based routing decisions. For all evaluated topologies, this extended formulation provides stronger linear programming lower bounds than the base formulation. However, it still suffers from the large size of the resulting variable and constraint sets; hence, solving the linear relaxation of the problem becomes intractable as the network size increases.
In this talk, we investigate different computational strategies to overcome the computational limits of the formulations. We propose different branch-and-cut strategies and a Lagrangian relaxation approach.
Joint work with Enrico Gorgone (ULB) and Dimitri Papadimitriou (Alcatel-Lucent Bell Labs)
- March 2015, Tuesday 03 (10:00 am): by Serguei Iassinovski (Project Manager at Multitel)
The Intelligent RAO Simulator
The Intelligent RAO Simulator is a hybrid tool combining the advantages of a discrete-event simulator and an expert system based on production rules. The presentation includes the RAO basics, complex system representation (elements and process), an overview of the main features using a simple example, and demonstrative and real applications. In addition, a simulation model of an electricity distribution grid controlled by SCADA under a cyber attack is presented in more detail (the model was developed in the framework of an FP7 research project).
- January 2015, Thursday 22 (10:30 am): Kris Coolen (HEC-ULg)
Sequential diagnosis of k-out-of-n systems with imperfect tests
A k-out-of-n system configuration requires that, for the overall system to be functional, at least k of the total of n components be working. We consider the problem of sequentially testing the components of a k-out-of-n system in order to learn the state of the system, when tests are costly and the individual component tests are imperfect: a test can identify a component as working when in reality it is down, and vice versa. Each component is tested at most once. Since tests are imperfect, even when all components are tested the state of the system is not necessarily known with certainty, so we impose a threshold on the probability of correctness of the system state as a stopping criterion for the inspection (or diagnosis).
We define different classes of inspection policies and examine the global optimality of each class. We find that a globally optimal policy for diagnosing k-out-of-n systems with imperfect tests can be found in polynomial time; this result holds under certain restrictions on the parameter values. Of the three policy classes studied, the dominant policies always contain a global optimum, while elementary policies are compact in representation. The newly introduced class of so-called 'interrupted block-walking' policies combines these merits of global optimality and compactness.
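The effect of a single imperfect test can be sketched with a Bayesian update (illustrative only; the talk's policies additionally decide which component to test next and when to stop):

```python
# Bayesian update sketch for one imperfect component test: how the belief that
# a component works changes after an observed outcome, with false-positive
# rate fp (a down component reads as working) and false-negative rate fn
# (a working component reads as down). Numbers are hypothetical.

def posterior_up(prior_up, outcome_working, fp, fn):
    if outcome_working:
        num = prior_up * (1 - fn)           # truly up, correctly observed
        den = num + (1 - prior_up) * fp     # or down, misread as up
    else:
        num = prior_up * fn                 # up but misread as down
        den = num + (1 - prior_up) * (1 - fp)
    return num / den

# a "working" outcome raises the belief but does not settle it,
# which is why a probability threshold is needed as a stopping criterion
p = posterior_up(prior_up=0.5, outcome_working=True, fp=0.1, fn=0.1)  # 0.9
```

Chaining such updates over the tested components yields the probability that the system state (at least k components up) has been identified correctly, which is compared against the stopping threshold.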
- November 2014, Friday 28 (02:00 pm): Prof. Maurice Queyranne (Université Catholique de Louvain)
Modeling convex subsets of points
A subset S of a given set of points in a convexity structure is convex if every given point that is in the convex hull of S is itself in S. We are interested in modeling these convexity restrictions when the given set of points is finite. Such restrictions arise, usually in a low-dimensional space (and subject to additional constraints), in many applications, such as mining, forestry, location, data mining, political districting, and police quadrant design. Modeling convex subset restrictions is well understood for the standard (vector space) convexity in the one-dimensional case: optimization and separation are well solved (in linear time), and a polyhedral description in the natural variables and a linear-sized ideal extended formulation are known. On the other hand, we show that the optimization problem (finding a maximum-weight convex subset of given points with weights of arbitrary signs) is NP-hard for the standard convexity in dimensions three and higher, and inapproximable when the dimension is part of the input. For the two-dimensional (planar) case, by Carathéodory's Theorem, convexity can be enforced by a polynomial (quartic) number of linear inequalities in the natural binary variables, but the resulting formulation is very weak. We present a compact (i.e., polynomial-size) ideal extended formulation, which is related to the cubic-time dynamic programming optimization algorithm of Bautista-Santiago et al. (2011). We seek more compact or tighter formulations and faster separation algorithms that could be used for more complex optimization problems with convex subsets of given points. We also consider these questions in related convexity structures.
(This talk will report on past and current work with numerous co-authors.)
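The well-solved one-dimensional case has a neat concrete form: on a line, a convex subset of the given points is a consecutive block of the sorted points, so a maximum-weight convex subset is a maximum-sum contiguous run, computable in linear time (a sketch, with hypothetical weights):

```python
# In one dimension, a convex subset of given points is a consecutive block of
# the sorted points, so the maximum-weight convex subset reduces to a
# maximum-sum contiguous run, solvable in linear time with Kadane's algorithm.

def max_weight_convex_subset_1d(weights):
    """weights listed in sorted order of the points; the empty subset is allowed."""
    best = current = 0
    for w in weights:
        current = max(0, current + w)   # extend the run or restart after it
        best = max(best, current)
    return best

max_weight_convex_subset_1d([2, -1, 3, -5, 4])  # -> 4
```

In two dimensions and higher no such simple ordering of the points exists, which is where the extended formulations and dynamic programming mentioned above come in.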
- October 2014, Thursday 16 (01:00 pm): Kris Braekers (Hasselt University)
A bi-objective homecare scheduling problem: analyzing the trade-off between costs and patient convenience
We study a homecare scheduling problem in which a set of nurses has to visit a set of patients to perform home- and healthcare activities at the patients' home locations. The problem may be considered a vehicle routing problem with many side constraints, such as time windows, nurse working times, nurse-patient incompatibilities, multiple transportation modes, and patient preferences regarding visit times and nurses. Our goal is to analyze the trade-off between the operating costs of the company offering these services and the level of patient convenience offered. For this purpose, a bi-objective optimization problem has been defined, minimizing travel and overtime costs on the one hand and deviation from patient preferences on the other. Small problem instances have been solved exactly using a MIP formulation. To solve larger instances, a metaheuristic approach based on the Multi-Directional Local Search framework, using Large Neighborhood Search as a subheuristic, has been developed. This metaheuristic method will be discussed and experimental results will be presented.
This is joint work with Sophie Parragh and Fabien Tricoire from the University of Vienna.
- September 2014, Monday 15 (10:30 am): Thibaut Lust (Université Pierre et Marie Curie)
Multiobjective combinatorial optimization: current and future challenges
Many real optimization problems are multiobjective by nature, involving conflicting objectives. In a multiobjective formulation, a solution simultaneously minimizing each objective does not exist; instead, we have a set of solutions called Pareto optimal solutions. A Pareto optimal solution is a solution for which it is impossible to find another solution that dominates it, that is, one at least as good on all objectives and better on at least one objective. In this talk, we will present the pros and cons of multiobjective formulations through the difficulties of generating all Pareto optimal solutions of multiobjective combinatorial optimization problems (multiobjective knapsack problems, multiobjective traveling salesman problems, etc.). Formulations other than Pareto optimization will also be presented (using Lorenz dominance and the Choquet integral), along with new research challenges around multiobjective optimization.
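The dominance relation described above can be sketched directly (minimization, toy objective vectors):

```python
# Sketch of Pareto dominance for minimization: filter a list of objective
# vectors down to the non-dominated (Pareto optimal) ones. Data is hypothetical.

def dominates(a, b):
    """a dominates b: at least as good on all objectives, strictly better on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points)]

pareto_front([(1, 5), (2, 2), (4, 1), (3, 3)])  # (3, 3) is dominated by (2, 2)
```

The combinatorial difficulty mentioned in the abstract is not this filtering step but the fact that the number of Pareto optimal solutions can grow exponentially with the instance size.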
- May 2014, Friday 9 (02:30 pm): W. H. WOODALL (Virginia Tech, Blacksburg, VA 24061-0439, USA)
The Difficulty in Designing Control Charts with Estimated Parameters
The performance of control charts with estimated in-control parameters, such as the Shewhart X̄ control chart, has been widely discussed in the literature. Previous studies showed, for example, that at least 400/(n-1) Phase I samples, where n > 1 is the sample size, are required for the X̄ chart to perform on average as if the in-control process parameter values were known. This recommendation was based on the in-control expected average run length (ARL) performance. The reliance on the expected ARL metric, however, neglects the practitioner-to-practitioner variability. This variability occurs because practitioners use different historical data sets, which results in varying parameter estimates, control limits, and in-control ARL values. In this presentation, it is shown that taking this additional type of variability into consideration requires much larger Phase I samples, far beyond what many previous researchers have recommended, in order to achieve low levels of variation of in-control performance among practitioners. The standard deviation of the ARL (SDARL) metric is used to evaluate performance for various amounts of Phase I data. Surprisingly, we show that for a variety of charts no realistic Phase I sample size is sufficient to have confidence that the attained in-control performance is close to that desired. These results have significant implications for the relationship between process monitoring theory and practice. An alternative approach for designing control charts is presented.
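The practitioner-to-practitioner effect can be illustrated with a small Monte-Carlo sketch (a simplified setting, not the talk's study: individual observations, known sigma = 1, 3-sigma limits, only the mean estimated):

```python
# Monte-Carlo sketch of practitioner-to-practitioner ARL variability for a
# two-sided Shewhart chart on individual N(0, 1) observations with 3-sigma
# limits and known sigma. Each simulated "practitioner" estimates the mean
# from m Phase I observations; the attained in-control ARL is 1/p, where p
# is the true alarm probability under the estimated limits.
import random
import statistics
from math import erf, sqrt

def norm_cdf(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

def attained_arls(m, practitioners, seed=0):
    rng = random.Random(seed)
    arls = []
    for _ in range(practitioners):
        mu_hat = statistics.fmean(rng.gauss(0, 1) for _ in range(m))
        # true alarm probability with limits mu_hat +/- 3 when X ~ N(0, 1)
        p_alarm = norm_cdf(mu_hat - 3) + 1 - norm_cdf(mu_hat + 3)
        arls.append(1 / p_alarm)
    return arls

# the spread across practitioners shrinks as the Phase I sample grows
spread_small = statistics.stdev(attained_arls(m=25, practitioners=200))
spread_large = statistics.stdev(attained_arls(m=1000, practitioners=200))
```

With few Phase I observations the attained in-control ARL varies widely across simulated practitioners even though its expectation may look acceptable; this between-practitioner spread is what the SDARL metric quantifies.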
- May 2014, Tuesday 6 (11:00 am): Franco Mascia & Manuel López-Ibáñez (IRIDIA, ULB)
Automatic algorithm configuration methods and automatic design of metaheuristics
Automatic algorithm configuration methods have shown that automatically tuning the parameters of optimization algorithms may lead to much better results, while at the same time saving significant effort to the human designer. Moreover, the potential of these methods to handle a large number of numerical, categorical and conditional parameters opens the door to more powerful applications. One of these applications is the automatic design of metaheuristics. In particular, our recent work has shown that it is possible to use an automatic configuration tool, such as irace, to generate hybrid local search algorithms. The design space of the hybrid local search algorithms is given as a grammar, from which particular algorithms may be instantiated. We convert this grammar description to a parameter space, which can be tuned for a particular problem by means of an automatic configuration tool. The resulting system allows a human designer to automatically find the best hybrid local search algorithm for a particular problem among thousands of potential algorithm designs just by implementing a few problem-specific components.
- April 2014, Thursday 3 (9:00 am): Pierre SCHAUS (Université Catholique de Louvain)
Constraint programming: an overview
In this talk I will introduce constraint programming (strengths and weaknesses). I will explain some details about the internals of a CP Solver and the facilities offered to extend the solver. I will also illustrate that many CP constraints embed well-known ideas and algorithms from OR (flows, assignments, relaxations, etc) for solving optimization problems with CP.
- March 2014, Tuesday 25 (3:00 pm): Prof. Dr. Herbert Meyr (Universität Hohenheim, Stuttgart)
Supply Chain Planning: software support and promising fields of future research
The seminar will describe the history of computer support in supply chain planning (SCP). On this basis, the common structure of today's commercial SCP software – so-called Advanced Planning Systems (APS) such as, for example, the SAP Advanced Planner and Optimizer or Oracle Value Chain Planning – will be discussed. Deficiencies of current APS give reason to briefly sketch some promising fields of further scientific research, such as the design of industry-specific hierarchical planning systems, simultaneous lot-sizing and scheduling in consumer goods industries, the transfer of revenue management ideas to manufacturing industries, the use of advanced demand information for forecasting and for measuring decoupling points, and robustness issues of strategic network design.
- December 2013, Friday 6 (2:00 pm): Ashwin Ittoo (HEC-University of Liège)
Harnessing Big (Text) Data with Natural Language Processing
The ubiquity of ICT, and especially, the pervasiveness of Social Media networks have resulted in the generation of a massive amount of data. This phenomenon is commonly referred to as Big Data. It is considered to be of such great relevance that the Economist magazine devoted one issue to it, entitled the Data Deluge. According to estimates by IBM, the volume of data by 2020 is expected to be of the order of 3500 exabytes.
An increasing number of organizations have realized the value that can be extracted from Big Data, and subsequently exploited for corporate activities, such as marketing. In fact, the most successful organizations thrive and achieve their competitive edge based on their ability to leverage upon and monetize huge amounts of data.
In my presentation, I will start by introducing and defining Big Data, since we often hear the term “Big Data” without having a clear indication of what it stands for. I will also provide statistics from well-known data sources (e.g. Facebook, Twitter) to illustrate the astounding rate of data creation.
A significant fraction of Big Data is in unstructured format, including text (e.g. Facebook messages, tweets, customer opinions on blogs), video (e.g. on YouTube) and photos (e.g. Instagram, Flickr). I will focus on text data. I will delve into the fascinating topic of Natural Language Processing (known as Text Analytics in business), which is my area of expertise. I will give an overview of this topic, and illustrate a sentiment analysis application developed in collaboration with Philips Consumer Lifestyle. This application automatically classifies opinions of consumers from blogs and forums (e.g. Amazon reviews) as either positive or negative depending on their polarity.
Finally, I will discuss more general issues associated with Big Data and Text Analytics, including the emergence of Data Science as a field of study in its own right, and key research questions/issues that are expected to be crucial in research associated with Big Data in the future.
- November 2013, Friday 8 (10:00 am): Joseph B. (Joe) MAZZOLA (Cleveland State University)
Business Analytics and Big Data: The emerging field of Data Science
Big data and analytics have rocketed to the top of the corporate agenda. The most successful companies in the digital age have thrived and achieved their competitive edge based on their ability to leverage and to monetize data.
- October 2013, Thursday 16 (11:00 am): Tom Van Woensel (Eindhoven University of Technology)
Combining Passenger and Freight Transportation with Public Scheduled Line Services
The Pickup and Delivery Problem (PDP) with public scheduled line services concerns scheduling a set of vehicles to serve two types of requests (passengers and freight). Part of the freight journey can be carried on scheduled public transportation. We propose an arc-based mixed integer programming formulation which is solved to optimality using CPLEX. Computational results provide a clear understanding of the benefits of combining passenger and freight transportation in current networks.
This is joint work with Vaeceslav Slavic and Emrah Demir.
- September 2013, Friday 20 (10:00 am): Prof. Martine Labbé (Université Libre de Bruxelles)
A bilevel programming approach for network pricing optimization problems
Consider a general pricing model involving two levels of decision-making. The upper level (leader) imposes prices on a specified set of goods or services while the lower level (follower) optimizes its own objective function, taking into account the pricing scheme of the leader. This model belongs to the class of bilevel optimization problems where both objective functions are bilinear.
In this talk, we review this class of hierarchical problems from both theoretical and algorithmic points of view.
We first briefly introduce a general taxation model and present some of its properties. Then, we focus on the problem of setting tolls on a specified subset of arcs of a multicommodity transportation network. In this context the leader corresponds to the profit-maximizing owner of the network, and the follower to users traveling between nodes of the network. The users are assigned to shortest paths with respect to a generalized cost equal to the sum of the actual prices for using specific arcs plus routing costs.
Among others, we present complexity results, identify some polynomial cases and propose mixed integer linear formulations for these pricing problems.
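In the simplest polynomial case, a parallel network with a single tolled arc and one user, the leader's problem collapses to a closed form, which a minimal sketch can illustrate. The function and data are hypothetical, and ties are broken in the leader's favor (the usual optimistic bilevel convention).

```python
def best_toll(tolled_cost, free_costs, demand=1.0):
    """Leader's optimal toll on one arc of a parallel network.
    The follower takes the cheapest arc (fixed cost + toll), so the
    highest toll that keeps the follower on the tolled arc is the gap
    to the cheapest toll-free alternative; charging more diverts all
    demand and earns nothing."""
    margin = min(free_costs) - tolled_cost
    toll = max(margin, 0.0)
    return toll, toll * demand  # (toll, leader revenue)

# Tolled arc costs 2.0 to traverse; toll-free alternatives cost 5.0 and 6.0.
toll, revenue = best_toll(tolled_cost=2.0, free_costs=[5.0, 6.0])
```

With several tolled arcs and many commodities, this simple argument no longer applies, which is where the mixed integer linear formulations mentioned in the abstract come in.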
- June 2013, Monday 24 (02:00 pm): Gilbert Laporte (Canada Research Chair in Distribution Management, HEC Montréal)
Minimizing Greenhouse Emissions in Vehicle Routing
The Vehicle Routing Problem holds a central place in distribution management. In the classical version of the problem the aim is to distribute goods to a set of geographically dispersed customers under a cost minimization objective and various side constraints. In recent years a number of researchers have started investigating the joint minimization of cost and greenhouse emissions in vehicle routing, giving rise to the so-called “Pollution-Routing Problem”. Whereas cost depends primarily on distance travelled, emissions are a function of distance, vehicle load, speed and a variety of other factors. The aim of this talk is to describe recent research results in this context.
About the speaker: Gilbert Laporte, Ph.D., MSRC
CIRRELT and Canada Research Chair in Distribution Management -HEC Montréal
Canada Research Chair: http://www.hec.ca/chairedistributique
Where: HEC-Management School of the University of Liege – 14, rue Louvrex (N1)- 4000 Liège
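As a purely illustrative sketch of why emissions depend on more than distance, one can write a stylized emission function with an engine term falling in speed, an aerodynamic term growing with speed squared, and a mass term growing with total vehicle weight. All names and coefficients below are invented; they are not the model used in the Pollution-Routing literature.

```python
def emissions(distance_km, speed_kmh, load_kg,
              curb_kg=6000.0, alpha=0.5, beta=1e-6, gamma=1.2e-3):
    """Stylized per-trip emission estimate (invented coefficients):
    engine/idling term ~ 1/speed, aerodynamic term ~ speed^2,
    mass term ~ total vehicle weight."""
    per_km = (alpha / speed_kmh            # slow driving burns fuel per km
              + beta * speed_kmh ** 2      # drag grows with speed squared
              + gamma * (curb_kg + load_kg) / 1e6)  # heavier vehicle, more fuel
    return per_km * distance_km

trip = emissions(distance_km=100, speed_kmh=60, load_kg=1000)
```

Because the per-kilometre rate is U-shaped in speed, an emission-minimizing speed exists, and emissions also rise with the load carried; this is precisely why speed and load enter the Pollution-Routing Problem as decision-relevant quantities rather than distance alone.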
- June 2013, Thursday 13 (09:30 am): Thierry Pironet (University of Liège)
Stochastic Optimization in Multi-Period Transportation Problems
Transportation problems are usually formalized as independent single-period deterministic models (VRP, PDP), while they are mostly stochastic, sequential and multi-period in nature, even though the decision process is performed daily. Some on-line, time-dependent, periodic or dynamic settings are now being addressed, which requires distinguishing among them. On the other hand, stochastic optimization techniques such as stochastic programming allow us to tackle certain ranges of problems (two-stage or multistage problems, continuous variables, convex recourse functions, continuous distribution laws); yet if we consider realistic settings, such as a rolling-horizon problem with integer or binary variables and a highly discrete distribution law, there is still a gap to fill in optimization techniques. Transportation problems often belong to this family of here-or-there, yes-or-no, now-tomorrow-or-never decision problems. Our first aim is therefore to show the economic benefit for a company of dealing with the approximate multi-period stochastic model instead of its deterministic single-period caricature. The second aim is to provide a methodological procedure and a toolbox of algorithms to deal with the hard stochastic optimization and its numerical statistical validation. Finally, we discuss the robustness of the procedures when the actual distribution law differs from the expected one.
- March 2013, Monday 11 (09:30 am): Prof. José F. Oliveira (University of Porto, Portugal)
2D Rectangular Cutting and Packing Problems: all the same or all different?
There are many optimization problems dealing with the cutting of rectangles from larger rectangular pieces. These problems appear in literature, but under a very diverse set of names. Designations as, for instance, cutting stock, bin packing, rectangular packing, rectangular cutting, pallet loading, are common. Sometimes the same designation is even used for obviously different problems. But do these designations really correspond to different problems, in the sense that they must be solved with different solution approaches? In this talk we will go through the different 2D rectangular cutting and packing problems, highlighting the differences among several classes of problems and exemplifying some solution approaches, both (meta)heuristic and exact. Some conclusions about the computational experiments and test instances used in those experiments will also be drawn.
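As one concrete example of the heuristic side of this problem family, here is a shelf next-fit sketch for strip packing (rectangles packed left to right onto horizontal shelves of a fixed-width strip). The function name and instance are illustrative; real solution approaches for these problems are considerably more sophisticated.

```python
def shelf_next_fit(rects, bin_width):
    """Pack (width, height) rectangles onto horizontal shelves of a strip
    of fixed width; returns the (x, y) placements and the strip height used."""
    placements, x, y, shelf_h = [], 0.0, 0.0, 0.0
    for w, h in rects:
        if w > bin_width:
            raise ValueError("rectangle wider than the strip")
        if x + w > bin_width:      # current shelf is full: open a new one
            y += shelf_h
            x, shelf_h = 0.0, 0.0
        placements.append((x, y))  # bottom-left corner of this rectangle
        x += w
        shelf_h = max(shelf_h, h)  # shelf height = tallest rectangle on it
    return placements, y + shelf_h

pos, height = shelf_next_fit([(4, 2), (3, 3), (4, 1), (2, 2)], bin_width=8)
```

The same one-pass idea works whether the application calls itself cutting stock, strip packing, or pallet loading, which hints at the talk's question: the names differ far more than the core structure does.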
- February 2013, Monday 25 (11:30 am): Julien Hambuckers (HEC-ULg).
New issues for the goodness-of-fit tests of the error distribution in finance: a comparison between sinh-arcsinh and generalized hyperbolic distributions
Since 2008, the financial crisis has shown the limits of the current risk models, in particular for Value-at-Risk (VaR) predictions. In this article, we consider a multiplicative heteroskedastic structure of financial returns and we propose a non-conventional methodology to study new issues concerning the goodness-of-fit tests of the error distribution in finance. More precisely, we use different estimation and model selection procedures (Berk-Jones (1978) tests, Sarno and Valente (2004) hypothesis testing, Diks et al. (2011) weighting method), based on the local adaptive volatility estimator (LAVE) of Mercurio and Spokoiny (2004) and the bootstrap methodology, to solve the problem of heteroskedasticity and to compare the fit performances of candidate density functions, with a focus on the tail. In particular, we introduce the sinh-arcsinh (SHASH) distributions (Jones and Pewsey, 2009) and we show that this family of density functions provides better bootstrap versions of the integrated mean squared error (IMSE) and better weighted Kullback-Leibler distances than the HYP and NIG density functions on five stock index datasets, indicating that the SHASH functions are suitable candidates for VaR models.
Joint work with C. Heuchenne.
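The SHASH family is obtained by a monotone transformation of a standard normal variable, which makes it easy to sample from. A small sketch follows; the parameter names mirror Jones and Pewsey's skewness (epsilon) and tailweight (delta) parameters, but the helper itself is hypothetical.

```python
import math
import random

def shash_sample(n, epsilon=0.0, delta=1.0, seed=42):
    """Draw from a sinh-arcsinh (SHASH) distribution by transforming
    standard normals: X = sinh((asinh(Z) + epsilon) / delta).
    epsilon skews the distribution; delta controls tail weight
    (delta < 1 gives heavier-than-normal tails)."""
    rng = random.Random(seed)
    return [math.sinh((math.asinh(rng.gauss(0.0, 1.0)) + epsilon) / delta)
            for _ in range(n)]

baseline = shash_sample(5000)                 # epsilon=0, delta=1: exactly N(0,1)
heavy_tailed = shash_sample(5000, delta=0.5)  # same draws, fatter tails
```

With epsilon = 0 and delta = 1 the transform is the identity, so the family nests the normal distribution and deviations from it are interpretable, which is part of its appeal for tail-focused goodness-of-fit work such as VaR modeling.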
- January 2013, Monday 21 (02:00 pm): Prof. Jean-Sébastien Tancrez (Louvain School of Management, Université catholique de Louvain)
A location-inventory model for large three-level supply chains
We study the location-inventory problem in three-level supply networks. Our model integrates three decisions: the location of distribution centers, the allocation of flows, and the shipment sizes. We propose a nonlinear continuous formulation, including transportation, fixed, handling and holding costs, which decomposes into a closed-form equation and a linear program when the DC flows are fixed. We thus develop an iterative heuristic that estimates the DC flows a priori, solves the linear program, and then improves the DC flow estimates. Extensive numerical experiments show that the approach can design large supply networks both effectively and efficiently, and a case study is discussed.
Joint work with Jean-Charles Lange and Pierre Semal.
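The estimate-flows / solve-subproblem / update-flows loop can be sketched as a fixed-point iteration. Everything below is a toy stand-in, not the paper's model: two customers, two DCs, a sqrt-shaped holding cost, and a greedy per-customer assignment in place of the linear program.

```python
import math

# Toy data (illustrative): per-unit transport costs, demands, candidate DCs.
transport = {("c1", "d1"): 1.0, ("c1", "d2"): 4.0,
             ("c2", "d1"): 5.0, ("c2", "d2"): 1.0}
demand = {"c1": 10.0, "c2": 10.0}
dcs = ["d1", "d2"]

def iterate_flows(rounds=10):
    # Step 1: a priori estimate of the flow through each DC.
    flows = {d: sum(demand.values()) / len(dcs) for d in dcs}
    assign = {}
    for _ in range(rounds):
        # Step 2: with flows fixed, the concave EOQ-style holding cost
        # sqrt(flow) linearizes to a per-unit rate, so assigning customers
        # becomes easy (here: greedy, standing in for the linear program).
        rate = {d: 0.5 / math.sqrt(flows[d]) if flows[d] > 0 else 1.0
                for d in dcs}
        assign = {c: min(dcs, key=lambda d: transport[c, d] + rate[d])
                  for c in demand}
        # Step 3: recompute DC flows from the assignment; stop at a fixed point.
        new_flows = {d: 0.0 for d in dcs}
        for c, d in assign.items():
            new_flows[d] += demand[c]
        if new_flows == flows:
            break
        flows = new_flows
    return assign, flows
```

The design choice the sketch mirrors is that fixing the flows removes the nonlinearity, so each iteration solves only an easy subproblem; the loop then refines the flow estimates until they stabilize.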
- December 2012, Tuesday 11 (10:30 am): Prof. dr. Kenneth Sörensen (University of Antwerp)
Metaheuristics – the metaphor exposed
In recent years, the field of combinatorial optimization has witnessed a true tsunami of “novel” metaheuristic methods, most of them based on a metaphor of some natural or man-made process. The behavior of virtually any species of insects, the flow of water, musicians playing together — it seems that no idea is too far-fetched to serve as inspiration to launch yet another metaheuristic. In this talk we will argue that this line of research is threatening to lead the area of metaheuristics away from scientific rigour. We will examine the historical context that gave rise to the increasing use of metaphors as inspiration and justification for the development of new methods, discuss the reasons for the vulnerability of the metaheuristics field to this line of research, and point out its fallacies. At the same time, truly innovative research of high quality is being performed as well. We conclude the talk by discussing some of the properties of this research and by pointing out some of the most promising research avenues for the field of metaheuristics.
- November 2012, Tuesday 27 (10:30am): Dr. Alper Döyen (HEC-ULg)
Disaster Mitigation and Humanitarian Relief Logistics
We are interested in two distinct problems within the disaster management context. These problems are modeled as two-stage stochastic integer programs, since stochasticity is inherent in natural disasters. First, we consider a humanitarian relief logistics model which can be used as a pre-disaster planning tool that also considers post-disaster decisions, in order to give an effective response in the aftermath of an earthquake. In this model, decisions are made on the location of pre- and post-disaster rescue centers, the amount of relief items to be stocked at the pre-disaster rescue centers, the amount of relief item flows at each echelon, and the amount of relief item shortage. The objective is to minimize the total cost of facility location, inventory holding, transportation, and shortage. Since building and transportation network retrofitting decisions influence the pre-disaster planning of post-disaster response decisions, we propose an integrated model that includes these retrofitting decisions as well. The model seeks to minimize the total cost of retrofitting, transportation and relief item shortage under a limited mitigation budget, which is allocated efficiently among the retrofitting alternatives. The amount of relief item demand is a decision variable that is determined according to the post-disaster damage of buildings. The deterministic equivalents of both models are mixed-integer linear programming models and they are solved by Lagrangean heuristic methods. Results on randomly generated test instances show that the proposed solution methods for both models exhibit good performance under different parameter settings. In addition, various analyses are carried out to clearly understand the model behavior and to validate the incorporation of uncertainty.
- November 2012, Tuesday 20 (10:30 am): Prof. Mario Cools (University of Liège)
Calibration opportunities for activity-based travel demand models predicting passenger transport
In the field of passenger demand modeling, activity-based travel demand models are often cited as state-of-the-art. While they have many advantages over classical methods, one of the difficulties of this “new” type of model is matching model outcomes with traffic counts observed on the network. This presentation will focus on some of the calibration opportunities in this regard, and will also formulate a critical note on these state-of-the-art models.
- November 2012, Tuesday 13 (10:30am): Bruno F. Santos (University of Coimbra, Portugal)
Optimization approaches applied to the strategic planning of transportation infrastructures
Strategic decisions regarding investments in transportation infrastructure usually involve large amounts of money. Therefore, such decisions should be carefully analyzed. Cost-benefit analysis, usually based on trial-and-error approaches, does not allow the full exploration of the solution space; this can only be done by using optimization techniques. This talk will give an overview of the author's work on the use of optimization approaches to solve strategic transportation investment planning problems. The different approaches will be illustrated with their application to several case studies, including the definition of national and regional road network plans and the location of freight intermodal terminals in Belgium.
- Seminar 24/10/2011, 10am (in French): Catherine Mancel
Ecole Nationale de l’Aviation Civile, Toulouse
Title: Optimizing the management of ground handling operations at an airport (original title: Optimisation de la gestion des opérations d’escale sur une plateforme aéroportuaire)
We consider the problem of managing ground handling activities at an airport. Ground handling operations are carried out by different service providers, using vehicles specific to each type of operation. These various actors must coordinate so as to complete the turnaround of each aircraft within the allotted time, while respecting the task-scheduling constraints for each aircraft and the constraints on the use of the service vehicles. Since a significant share of flight delays is generated while aircraft are being handled at their parking stand, it is worthwhile to study this problem in search of strategies that minimize these delays. The underlying combinatorial optimization problem is a hybrid task-scheduling and vehicle-routing problem which, to our knowledge, has received very little attention in the literature. We propose a mathematical model and then explore several solution avenues, based on a comparative study of our problem against academic problems. We will also discuss the generation of relevant data sets, which are needed to evaluate the solution methods under consideration.
Where: HEC-Management School of the University of Liege – 14, rue Louvrex – 4000 Liège
Contacts: M.Schyns at ulg.ac.be
- Seminar: Gilbert Laporte (May 17, 2011 2:00 pm):
Chaire de recherche du Canada en distributique
Canada Research Chair in Distribution Management
HEC Montréal, Canada
“Optimizing cost and quality of service in dial-a-ride operations”
In dial-a-ride problems, the aim is to design pickup and delivery routes for users wishing to be transported between given origins and destinations. These routes are subject to several operational constraints and must also take user inconvenience into account. We will present a tabu search multicriteria algorithm capable of incorporating operational costs and quality of service for users.
This work is derived from Julie Paquette’s Ph.D. thesis, codirected by Gilbert Laporte and Jean-François Cordeau. François Bellavance and Marta M. B. Pascoal also took part in this research.
- Working group UGR-GreatRoad (April 2011):
ULg WebTV video report (in French)
- “Introduction to the workshop, Research teams and research activities at the University of Liège”, Yves Crama, Director, QuantOM
- “Generating cutting planes for mixed-integer programs from multi-row relaxations”, Quentin Louveaux
- “Multiple vehicle loading optimization with stochastic supply”, Thierry Pironet
- “A capacity game in transportation planning”, Guillaume Amand
- “Inventory and vehicle routing for perishable items”, Reza Sadrabadi
- “Automatic aircraft cargo load planning”, Sabine Limbourg and Michael Schyns (+ video)
- “Collective axiom of revealed preference: Complexity result and heuristic tests”, Fabrice Talla-Nobibon
- “Delivery chains and control charts”, Alireza Faraz
- “Working Group UGR-GreatRoad: Progress and perspectives”, Anass Nagih, UGR-GreatRoad project coordinator
- ORBEL 24 (January 2010)