Seminar "Combinatorial Optimization, Algorithms"

The seminars are usually scheduled on Mondays at 2pm. If you are interested in presenting your research at our regular seminar or in receiving information about upcoming talks, please contact the organizers, Eunjung Kim (eun-jung.kim@dauphine.fr) and Michail Lampis (michail.lampis@lamsade.dauphine.fr).

Pôle 2 seminars: "Combinatorial Optimization, Algorithms"

Organizers.

Eun Jung Kim and Michail Lampis

Mailing list.

You can subscribe to the mailing list by sending an email to: eun-jung.kimping@lamsade.dauphinepong.fr or michail.lampisping@lamsade.dauphinepong.fr

Upcoming seminars:

Friday, May 18, 2018, 1:45pm, D 102: Olivier Spanjaard (Université Pierre et Marie Curie): Some contributions to optimization under uncertainty and robust optimization

After briefly recalling the main issues in algorithmic decision theory, we will present in the first part of the talk some recent results on optimizing criteria that are alternatives to expected utility in sequential decision making under uncertainty. More precisely, we will show that determining an optimal policy with respect to a skew-symmetric bilinear utility function is NP-hard, but that the problem becomes polynomial-time solvable if we restrict ourselves to the weighted expected utility model. The second part of the talk will deal with robust optimization with interval data. More precisely, we will present a generic procedure for computing a lower bound on the minimax regret. This lower bound is computed with a double oracle algorithm (which can be seen as a column-and-constraint generation method in mathematical programming). This method can be implemented efficiently for any robust combinatorial optimization problem whose classical variant is "easy".
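For reference, the interval-data minimax regret criterion mentioned above is usually defined as follows (a standard formulation, stated here for context; the precise setting of the talk may differ). With $X$ the set of feasible solutions and $S$ the set of scenarios obtained by fixing each uncertain coefficient within its interval,

$$\mathrm{Reg}(x) = \max_{s \in S} \Big( c^s(x) - \min_{y \in X} c^s(y) \Big), \qquad x^* \in \arg\min_{x \in X} \mathrm{Reg}(x).$$

Roughly, the double oracle procedure bounds $\min_{x \in X} \mathrm{Reg}(x)$ from below by alternately growing restricted subsets of $X$ and $S$ through best-response computations, each of which reduces to an instance of the "easy" classical problem.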

Wednesday, May 16, 2018, 1:45pm, P 510: Cédric Bentz (Conservatoire National des Arts et Métiers (CNAM)): Some covering problems in graphs

In this talk, I will discuss some representative results obtained on four families of covering problems in graphs: multicuts (multicriteria multicuts, planar multicuts), colorings (generalizations of equitable colorings), Steiner trees and networks (Steiner trees with capacities on the edges/arcs, Steiner networks with capacities on the edges/arcs in the presence of failures), as well as d-blockers and some of their variants (complexity of the most vital edges problem in bipartite matchings, special cases of bipartite and complete graphs).


Previous seminars:

Tuesday, May 15, 2018, 10:15am, room P 303: André Rossi (Université d'Angers): Applications of exact hybrid methods for optimization under uncertainty

Through two applications, this talk presents the development of hybrid algorithms combining mathematical programming and heuristics to solve optimization problems under uncertainty. The first application concerns the tracking of mobile targets with a wireless sensor network. The problem is to schedule sensor activity so as to keep all targets under surveillance at all times, while maximizing the network's ability to monitor predefined areas of interest once the current surveillance mission is completed, and minimizing the energy consumed by the sensors. A lexicographic multiobjective approach is used, and three successive subproblems are solved by column generation. The spatial uncertainty of the targets is taken into account using uncertainty disks, whose entire surface must be covered by the sensors. Heuristics are used both in the pricing problems of the column generation and in the computation of upper bounds, without compromising the optimality of the solutions produced. The second application is a scheduling problem in assembly lines, where the durations of the operations are uncertain. Two robustness measures based on the stability radius are used and compared, and two ILP models for maximizing these measures are proposed. Solving these problems is made easier by reducing the assignment intervals of the operations to the workstations. This reduction is performed by an algorithm that exploits knowledge of a good-quality feasible solution, which can be returned by a heuristic or by an ILP solver. After pointing out that the two robustness measures based on the stability radius can be unrealistic and restrictive, a third indicator that overcomes these drawbacks is proposed and illustrated.
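For reference, a generic form of the stability radius used in this line of work (the talk's exact definitions may differ) is, for a feasible assignment $x$ and nominal processing times $\tilde{p}$,

$$\rho(x, \tilde{p}) = \sup \{\, \rho \ge 0 : x \text{ remains feasible for all } p \text{ with } \|p - \tilde{p}\|_\infty \le \rho \,\},$$

and the robust problem is then to maximize $\rho(x, \tilde{p})$ over the feasible assignments $x$.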

Friday, May 11, 2018, 2pm, C 108: Clemens Thielen (University of Kaiserslautern): Approximation of Multiobjective Optimization Problems, Scheduling Games, and Decision Support in Physician Scheduling

This talk presents the results of three recently published articles from the areas of multiobjective optimization, algorithmic game theory, and personnel scheduling.

In the first part of the talk, we present a general, polynomial-time approximation method for biobjective minimization problems. This method yields approximations of the Pareto set as well as of the associated budget-constrained problem for every biobjective minimization problem for which the corresponding weighted sum scalarization can be approximated in polynomial time.
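For concreteness, for a biobjective minimization problem with objective functions $f_1, f_2$ over a feasible set $X$, the weighted sum scalarization referred to above is the single-objective problem

$$\min_{x \in X} \; w_1 f_1(x) + w_2 f_2(x), \qquad w_1, w_2 \ge 0,$$

so the method applies whenever this single-objective problem can be approximated in polynomial time.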

In the second part of the talk, we consider a game theoretical model of multistage interval scheduling problems in which each job consists of exactly one task (interval) for each of t stages (machines). In the game theoretical model, the machine of each stage is controlled by a different selfish player who wants to maximize her total profit, where the profit for scheduling the task of a job is a fraction of the weight of the job that is determined by the set of players that also schedule their corresponding task of this job. We provide criteria for the existence of pure Nash equilibria and prove bounds on the Price of Anarchy and the Price of Stability for different social welfare functions.
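For reference, with a social welfare function $SW$ to be maximized and $\mathrm{NE}$ the set of pure Nash equilibria, the two quantities are defined in the usual way:

$$\mathrm{PoA} = \frac{\max_{\sigma} SW(\sigma)}{\min_{\sigma \in \mathrm{NE}} SW(\sigma)}, \qquad \mathrm{PoS} = \frac{\max_{\sigma} SW(\sigma)}{\max_{\sigma \in \mathrm{NE}} SW(\sigma)}.$$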

The third part of the talk presents an integer programming model for the generation of duty rosters for physicians at a large department of orthopedics and trauma surgery that has been used in practice since January 2016. Besides the specific constraints that need to be respected and the resulting model, we additionally present a decision support component for handling unpredictable disruptions (caused, e.g., by the absence of physicians due to illness) and evaluate the generated duty rosters from a practical perspective.
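The model itself contains many department-specific constraints; purely as an illustration of the kind of integer program involved (not the authors' actual formulation), a minimal core with binary variables $x_{p,d,s}$ indicating that physician $p$ works shift $s$ on day $d$ could read

$$\sum_{p} x_{p,d,s} \ge r_{d,s} \;\; \forall d,s \;\;\text{(coverage)}, \qquad \sum_{s} x_{p,d,s} \le 1 \;\; \forall p,d \;\;\text{(at most one shift per day)}, \qquad x_{p,d,s} \in \{0,1\},$$

with an objective that balances workloads and penalizes undesired shift patterns.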

Monday, April 9, 2018, 2pm: Vadim V. Lozin (University of Warwick): From structure to algorithms: breaking the boundaries

Many algorithmic problems that are generally intractable may become
easy when restricted to instances of particular structure. When and why
does a difficult problem become easy? To answer these questions, we employ
the notion of boundary classes of graphs. In this talk, we focus on the maximum
independent set problem and shed some light on the structure of the boundary
separating difficult instances of this problem from polynomially solvable ones.
We also analyze algorithmic tools to break the boundary and discuss
similar questions with respect to some other algorithmic problems, including
Satisfiability, the central problem of Theoretical Computer Science.

Monday, March 26, 2018, 2pm, P301: Magnus Wahlström (Royal Holloway, University of London): FPT-algorithms via LP-relaxations

LP-relaxations are traditionally (within theoretical computer science)
used for computing approximate solutions to NP-hard problems, but
across the last few years there have been several examples where
LP-relaxations have been used to guide FPT-algorithms — that is,
exact (non-approximate) algorithms whose running time is bounded in
terms of a "parameter" of the input instance. This approach has given
algorithms that are simultaneously simpler and faster for a range of
central problems in parameterized complexity. At the same time, this
is applicable only to specific problems and relaxations; an arbitrary
LP-relaxation, even one that has good properties in an approximation
sense, will in general give no useful guidance with respect to exact
solutions.

In this talk, we give an overview of these FPT applications, and the
conditions they impose on the LP-relaxation. Our main focus is on an
approach of "discrete relaxation" referred to as Valued CSPs, but we
also briefly survey more immediately combinatorial conditions, as well
as related algorithms that solve these problems more efficiently
(e.g., so-called linear-time FPT algorithms) by bypassing the
LP-solver.
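A canonical example of an LP-relaxation with the kind of good properties alluded to above (not necessarily one of those treated in the talk) is the Vertex Cover relaxation

$$\min \sum_{v \in V} x_v \quad \text{s.t.} \quad x_u + x_v \ge 1 \;\; \forall\, uv \in E, \qquad 0 \le x_v \le 1,$$

whose basic optimal solutions are half-integral ($x_v \in \{0, 1/2, 1\}$) and persistent by the Nemhauser-Trotter theorem; these properties are what make LP-guided branching for Vertex Cover parameterized above the LP value work.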

Monday, February 26, 2018, 2pm, A407: Henning Fernau (Theoretical Computer Science, University of Trier): Self-monitoring approximation algorithms

Reduction rules are one of the key techniques for the design of parameterized algorithms.
They can be seen as formalizing a certain kind of heuristic approach to solving hard combinatorial problems.
We propose to use such a strategy in the area of approximation algorithms.
One of the features that we may gain is a self-monitoring property.
This means that the algorithm that is run on a given instance $I$ can predict an approximation factor of the solution produced on $I$ that
is often (according to experiments) far better than the theoretical estimate that is based on a worst-case analysis.

Bibliography:

[1] F. N. Abu-Khzam, C. Bazgan, M. Chopin, and H. Fernau.
Data reductions and combinatorial bounds for improved approximation
algorithms. Journal of Computer and System Sciences, 82:503-520, 2016.

[2] L. Brankovic and H. Fernau.
A novel parameterised approximation algorithm for minimum vertex cover.
Theoretical Computer Science, 511:85-108, 2013.
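To illustrate the self-monitoring idea on a familiar problem, here is a minimal hypothetical sketch (not the algorithms of [1] or [2]): a Vertex Cover heuristic that computes a lower bound on the optimum alongside its solution and reports their ratio as a per-instance approximation guarantee, which on many instances is far below the worst-case factor of 2.

# Hypothetical illustration of a self-monitoring approximation algorithm
# for Vertex Cover (not the algorithms of [1] or [2]): alongside the
# cover, it returns |cover| / |maximal matching|, a certified
# per-instance upper bound on its approximation factor (always <= 2).

from collections import defaultdict


def self_monitoring_vertex_cover(edges):
    """Return (cover, ratio_bound) for an undirected graph given as an edge list."""
    # Lower bound: a greedy maximal matching; any vertex cover must
    # contain at least one endpoint of every matching edge.
    matched, matching_size = set(), 0
    for u, v in edges:
        if u not in matched and v not in matched:
            matched.update((u, v))
            matching_size += 1

    # Initial solution: both endpoints of the matching form a vertex cover.
    cover = set(matched)

    # Reduction-style cleanup: a vertex may leave the cover if all of its
    # neighbours are (still) in the cover; feasibility is preserved.
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    for v in sorted(cover, key=lambda w: len(adj[w])):
        if all(u in cover for u in adj[v]):
            cover.discard(v)

    ratio_bound = len(cover) / max(1, matching_size)
    return cover, ratio_bound


if __name__ == "__main__":
    path = [(0, 1), (1, 2), (2, 3)]  # a path on four vertices
    cover, bound = self_monitoring_vertex_cover(path)
    # Prints a cover of size 2 with bound 1.0: certified optimal on this
    # instance, although the worst-case guarantee of the method is only 2.
    print(cover, bound)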

Monday, December 11, 2017, 2pm, A 304: Ararat Harutyunyan (LAMSADE): A disproof of the normal graphs conjecture

Abstract: A graph G is called normal if there exist two coverings, C and S, of its vertex set such that every member of C induces a clique in G, every member of S induces an independent set in G, and any clique in C and independent set in S have a non-empty intersection. Normal graphs derive their motivation from information theory, where they are related to the Shannon capacity of a graph. In particular, they form a family which extends the class of perfect graphs. It was conjectured by De Simone and Körner [DAM ’99] that a graph G is normal if G does not contain C_5, C_7, or the complement of C_7 as an induced subgraph. Using random graphs and rather routine probabilistic methods, we give a disproof of this conjecture.

Joint work with Lucas Pastor (G-SCOP, Grenoble) and Stéphan Thomassé (ENS Lyon).

Monday, November 20, 2017, 2pm, P 301: Julien Baste (LIP6): F-M-DELETION parameterized by treewidth

For a fixed collection of graphs F, the F-M-DELETION problem consists in, given a graph G and an integer k, deciding whether there exists a set S of at most k vertices of G such that G without the vertices of S does not contain any graph of F as a minor. This problem generalizes several well-known problems such as VERTEX COVER (F={K_2}), FEEDBACK VERTEX SET (F={K_3}), and VERTEX PLANARIZATION (F={K_5, K_{3,3}}). We are interested in the parameterized complexity of F-M-DELETION when the parameter is the treewidth of the input graph, denoted by tw. Our objective is to determine, for a fixed F, the smallest function f such that F-M-DELETION can be solved in time f(tw)*poly(n) on n-vertex graphs.

Monday, November 13, 2017, 2pm, room A (C206, 2nd floor): Tomáš Toufar (Charles University, Czech Republic): Parameterized Approximation Schemes for Steiner Trees with Small Number of Steiner Vertices

Slides

We study the Steiner Tree problem, in which a set of terminal vertices
needs to be connected in the cheapest possible way in an edge-weighted
graph. This problem has been extensively studied from the viewpoints of
approximation and parameterization. In particular, Steiner Tree is known
to be APX-hard on the one hand, and W[2]-hard on the other when
parameterized by the number of non-terminals (Steiner vertices) in the
optimum solution. In contrast to this, we
give an efficient parameterized approximation scheme (EPAS), which
circumvents both hardness results. Moreover, our methods imply the
existence of a polynomial size approximate kernelization scheme
(PSAKS) for the assumed parameter.
We further study the parameterized approximability of other variants
of Steiner Tree, such as Directed Steiner Tree and Steiner Forest. For
neither of these is an EPAS likely to exist for the studied parameter:
for Steiner Forest an easy observation shows that the problem is
APX-hard, even if the input graph contains no Steiner vertices. For
Directed Steiner Tree we prove that computing a constant approximation
for this parameter is W[1]-hard. Nevertheless, we
show that an EPAS exists for Unweighted Directed Steiner Tree. We also
prove that there is an EPAS and a PSAKS for Steiner Forest if, in
addition to the number of Steiner vertices, the number of connected
components of an optimal solution is considered to be a parameter.

Monday, October 16, 2017, 2pm, room A (2nd floor): Eunjung Kim (LAMSADE): Erdős-Pósa Property of Chordless Cycles and its Applications

A chordless cycle is a cycle of length at least 4 that has no chord. We prove that the class of all chordless cycles has the Erdős-Pósa property, which resolves a major open question concerning the Erdős-Pósa property. We complement our main result by showing that the class of all chordless cycles of length at least l, for any fixed l ≥ 5, does not have the Erdős-Pósa property.

Our proof for chordless cycles is constructive: in polynomial time, one can either find k + 1 vertex-disjoint chordless cycles, or c k^2 log k vertices hitting every chordless cycle, for some constant c. This immediately implies an approximation algorithm with factor O(opt log opt) for Chordal Vertex Deletion, which improves the best known approximation by Agrawal et al. The improved approximation algorithm in turn yields an improvement over the known kernelization for Chordal Vertex Deletion.

As a corollary, for a non-negative integral function w defined on the vertex set of a graph G, the minimum value of $\sum_{x \in S} w(x)$ over all vertex sets S such that G − S is a forest is at most O(k^2 log k), where k is the maximum number of cycles (not necessarily vertex-disjoint) in G such that each vertex v is used at most w(v) times.

Monday, October 2, 2017, 2pm, P303: Giuseppe F. Italiano (Università di Roma "Tor Vergata"): 2-Connectivity in Directed Graphs

We survey some recent results on 2-edge and 2-vertex
connectivity in directed graphs. Despite being complete analogs of the
corresponding notions on undirected graphs, 2-connectivity in digraphs
has a much richer and more complicated structure. For undirected
graphs it has been known for over 40 years how to compute all bridges,
articulation points, 2-edge- and 2-vertex-connected components in
linear time, by simply using depth first search. In the case of
digraphs, however, the very same problems have been much more
challenging and have been tackled only very recently.

Talks from 2016-2017.
Talks from 2015-2016.
Talks from 2014-2015.

For earlier talks, see this page.