
Automated Dynamic Algorithm Configuration

Publication: Contribution to journal › Article › Research › Peer-reviewed

Authorship

External organisations

  • Albert-Ludwigs-Universität Freiburg
Research metrics

  • Citations (citation indexes): 27
  • Captures (readers): 24

Details

Original language: English
Pages (from–to): 1633-1699
Number of pages: 67
Journal: Journal of Artificial Intelligence Research
Volume: 75
Publication status: Published - 30 Dec 2022

Abstract

The performance of an algorithm often critically depends on its parameter configuration. While a variety of automated algorithm configuration methods have been proposed to relieve users from the tedious and error-prone task of manually tuning parameters, there is still a lot of untapped potential as the learned configuration is static, i.e., parameter settings remain fixed throughout the run. However, it has been shown that some algorithm parameters are best adjusted dynamically during execution, e.g., to adapt to the current part of the optimization landscape. Thus far, this is most commonly achieved through hand-crafted heuristics. A promising recent alternative is to automatically learn such dynamic parameter adaptation policies from data. In this article, we give the first comprehensive account of this new field of automated dynamic algorithm configuration (DAC), present a series of recent advances, and provide a solid foundation for future research in this field. Specifically, we (i) situate DAC in the broader historical context of AI research; (ii) formalize DAC as a computational problem; (iii) identify the methods used in prior art to tackle this problem; and (iv) conduct empirical case studies for using DAC in evolutionary optimization, AI planning, and machine learning.
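The static-versus-dynamic distinction the abstract draws can be made concrete with a toy sketch. This is not the paper's method or benchmark; it is a hand-rolled illustration in which gradient descent on f(x) = x² is run once with a fixed step size (a static configuration) and once with a simple state-dependent policy that picks the step size at every iteration (the kind of per-step parameter adaptation DAC would learn from data rather than hard-code):

```python
def run(policy, x0=10.0, steps=30):
    """Gradient descent on f(x) = x**2, where `policy(t, loss)` returns
    the step size for iteration t given the current loss (the 'state')."""
    x = x0
    loss = x ** 2
    for t in range(steps):
        lr = policy(t, loss)       # the configurator picks a parameter value
        x = x - lr * 2 * x         # gradient of x**2 is 2x
        loss = x ** 2
    return loss

# Static configuration: one fixed step size for the whole run.
static_loss = run(lambda t, loss: 0.05)

# Dynamic configuration: a (here hand-crafted) policy that takes large
# steps while the loss is high and small steps once it is low.
dynamic_loss = run(lambda t, loss: 0.4 if loss > 1.0 else 0.05)
```

On this toy problem the dynamic policy reaches a lower final loss than any single fixed step size used throughout; DAC, as formalized in the article, is about learning such policies automatically instead of writing them by hand.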

ASJC Scopus subject areas

Cite this

Automated Dynamic Algorithm Configuration. / Adriaensen, Steven; Biedenkapp, André; Shala, Gresa et al.
In: Journal of Artificial Intelligence Research, Vol. 75, 30.12.2022, p. 1633-1699.


Adriaensen S, Biedenkapp A, Shala G, Awad N, Eimer T, Lindauer M et al. Automated Dynamic Algorithm Configuration. Journal of Artificial Intelligence Research. 2022 Dec 30;75:1633-1699. doi: 10.1613/jair.1.13922, 10.48550/arXiv.2205.13881
Adriaensen, Steven ; Biedenkapp, André ; Shala, Gresa et al. / Automated Dynamic Algorithm Configuration. In: Journal of Artificial Intelligence Research. 2022 ; Vol. 75. pp. 1633-1699.
Download (BibTeX)
@article{2872944e4e864ceeb0fb4ba913b231b8,
title = "Automated Dynamic Algorithm Configuration",
abstract = "The performance of an algorithm often critically depends on its parameter configuration. While a variety of automated algorithm configuration methods have been proposed to relieve users from the tedious and error-prone task of manually tuning parameters, there is still a lot of untapped potential as the learned configuration is static, i.e., parameter settings remain fixed throughout the run. However, it has been shown that some algorithm parameters are best adjusted dynamically during execution. Thus far, this is most commonly achieved through hand-crafted heuristics. A promising recent alternative is to automatically learn such dynamic parameter adaptation policies from data. In this article, we give the first comprehensive account of this new field of automated dynamic algorithm configuration (DAC), present a series of recent advances, and provide a solid foundation for future research in this field. Specifically, we (i) situate DAC in the broader historical context of AI research; (ii) formalize DAC as a computational problem; (iii) identify the methods used in prior art to tackle this problem; and (iv) conduct empirical case studies for using DAC in evolutionary optimization, AI planning, and machine learning.",
keywords = "cs.AI, cs.LG, cs.NE",
author = "Steven Adriaensen and Andr{\'e} Biedenkapp and Gresa Shala and Noor Awad and Theresa Eimer and Marius Lindauer and Frank Hutter",
note = "Publisher Copyright: {\textcopyright} 2022 AI Access Foundation. All rights reserved.",
year = "2022",
month = dec,
day = "30",
doi = "10.1613/jair.1.13922",
language = "English",
volume = "75",
pages = "1633--1699",
journal = "Journal of Artificial Intelligence Research",
issn = "1076-9757",
publisher = "Morgan Kaufmann Publishers, Inc.",

}

Download (RIS)

TY - JOUR

T1 - Automated Dynamic Algorithm Configuration

AU - Adriaensen, Steven

AU - Biedenkapp, André

AU - Shala, Gresa

AU - Awad, Noor

AU - Eimer, Theresa

AU - Lindauer, Marius

AU - Hutter, Frank

N1 - Publisher Copyright: © 2022 AI Access Foundation. All rights reserved.

PY - 2022/12/30

Y1 - 2022/12/30

N2 - The performance of an algorithm often critically depends on its parameter configuration. While a variety of automated algorithm configuration methods have been proposed to relieve users from the tedious and error-prone task of manually tuning parameters, there is still a lot of untapped potential as the learned configuration is static, i.e., parameter settings remain fixed throughout the run. However, it has been shown that some algorithm parameters are best adjusted dynamically during execution. Thus far, this is most commonly achieved through hand-crafted heuristics. A promising recent alternative is to automatically learn such dynamic parameter adaptation policies from data. In this article, we give the first comprehensive account of this new field of automated dynamic algorithm configuration (DAC), present a series of recent advances, and provide a solid foundation for future research in this field. Specifically, we (i) situate DAC in the broader historical context of AI research; (ii) formalize DAC as a computational problem; (iii) identify the methods used in prior art to tackle this problem; and (iv) conduct empirical case studies for using DAC in evolutionary optimization, AI planning, and machine learning.

AB - The performance of an algorithm often critically depends on its parameter configuration. While a variety of automated algorithm configuration methods have been proposed to relieve users from the tedious and error-prone task of manually tuning parameters, there is still a lot of untapped potential as the learned configuration is static, i.e., parameter settings remain fixed throughout the run. However, it has been shown that some algorithm parameters are best adjusted dynamically during execution. Thus far, this is most commonly achieved through hand-crafted heuristics. A promising recent alternative is to automatically learn such dynamic parameter adaptation policies from data. In this article, we give the first comprehensive account of this new field of automated dynamic algorithm configuration (DAC), present a series of recent advances, and provide a solid foundation for future research in this field. Specifically, we (i) situate DAC in the broader historical context of AI research; (ii) formalize DAC as a computational problem; (iii) identify the methods used in prior art to tackle this problem; and (iv) conduct empirical case studies for using DAC in evolutionary optimization, AI planning, and machine learning.

KW - cs.AI

KW - cs.LG

KW - cs.NE

UR - http://www.scopus.com/inward/record.url?scp=85148436940&partnerID=8YFLogxK

U2 - 10.1613/jair.1.13922

DO - 10.1613/jair.1.13922

M3 - Article

VL - 75

SP - 1633

EP - 1699

JO - Journal of Artificial Intelligence Research

JF - Journal of Artificial Intelligence Research

SN - 1076-9757

ER -

By the same authors