Algorithmics and Computational Complexity

Data Reduction and Problem Kernels (DARE)

Synopsis

Preprocessing by means of data reduction rules is a common approach to combinatorially hard, compute-intensive problems. So far, however, data reduction has mostly been applied in an ad hoc fashion, without systematic development or an underlying theory for this universally applicable method. The concept of problem kernels, originating from parameterized complexity, makes it possible for the first time to study the effectiveness of data reduction techniques in a systematic, theoretically grounded way. Moreover, as first case studies show, these theoretical investigations can very effectively guide the search for new and better data reductions. Through algorithm engineering, practical tools can be developed and the actual power of data reduction rules can be examined across a wide range of applications. Eventually, this opens up new application potential for this fundamental algorithmic technique.
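As an illustration of the kind of data reduction and kernelization the project studies, the following sketch implements the classic Buss reduction rules for k-Vertex Cover (not code from the project itself; function and variable names are chosen for this example): a vertex of degree greater than k must belong to every vertex cover of size at most k, and after exhaustive application a yes-instance can retain at most k² edges.

```python
def buss_kernel(edges, k):
    """Buss's data reduction for k-Vertex Cover (illustrative sketch).

    edges: iterable of 2-element vertex pairs (isolated vertices are
    simply absent from this representation, so the isolated-vertex
    rule is implicit).
    Returns (kernel_edges, remaining_k, forced_vertices), where
    forced_vertices must be in every vertex cover of size <= k,
    or None if no such cover can exist.
    """
    edges = {frozenset(e) for e in edges}
    forced = set()
    changed = True
    while changed:
        changed = False
        # Count vertex degrees in the current (reduced) graph.
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        # Rule: a vertex of degree > k is in every size-<=k cover,
        # since otherwise all of its > k neighbors would be needed.
        high = [v for v, d in deg.items() if d > k]
        if high:
            v = high[0]
            forced.add(v)
            edges = {e for e in edges if v not in e}
            k -= 1
            if k < 0:
                return None  # budget exhausted: no solution
            changed = True
    # Kernel-size bound: k vertices of degree <= k cover <= k^2 edges.
    if len(edges) > k * k:
        return None
    return edges, k, forced
```

For example, on a star with center `c` and three leaves with k = 1, the high-degree rule forces `c` into the cover and the instance shrinks to an empty kernel, whereas a triangle with k = 1 is correctly rejected.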

Funded by

Deutsche Forschungsgemeinschaft (Project DARE, NI 369/11)

Time

March 2010 - January 2014

Head: Rolf Niedermeier

People

Organization: Algorithmics and Computational Complexity
Room: TEL 509A
Office hour: Thursday 10 - 11

Dr. Mathias Weller

former member