Publication

Selecting the tuning parameter in penalized Gaussian graphical models

Abbruzzo, A., Vujacic, I., Mineo, A. M. & Wit, E. C., May 2019, In: Statistics and Computing. 29, 3, p. 559-569, 11 p.

Research output: Contribution to journal › Article › Academic › peer-review

APA

Abbruzzo, A., Vujacic, I., Mineo, A. M., & Wit, E. C. (2019). Selecting the tuning parameter in penalized Gaussian graphical models. Statistics and Computing, 29(3), 559-569. https://doi.org/10.1007/s11222-018-9823-5

Author

Abbruzzo, Antonino ; Vujacic, Ivan ; Mineo, Angelo M. ; Wit, Ernst C. / Selecting the tuning parameter in penalized Gaussian graphical models. In: Statistics and Computing. 2019 ; Vol. 29, No. 3. pp. 559-569.

Harvard

Abbruzzo, A, Vujacic, I, Mineo, AM & Wit, EC 2019, 'Selecting the tuning parameter in penalized Gaussian graphical models', Statistics and Computing, vol. 29, no. 3, pp. 559-569. https://doi.org/10.1007/s11222-018-9823-5

Standard

Selecting the tuning parameter in penalized Gaussian graphical models. / Abbruzzo, Antonino; Vujacic, Ivan; Mineo, Angelo M.; Wit, Ernst C.

In: Statistics and Computing, Vol. 29, No. 3, 05.2019, p. 559-569.

Research output: Contribution to journal › Article › Academic › peer-review

Vancouver

Abbruzzo A, Vujacic I, Mineo AM, Wit EC. Selecting the tuning parameter in penalized Gaussian graphical models. Statistics and Computing. 2019 May;29(3):559-569. https://doi.org/10.1007/s11222-018-9823-5


BibTeX

@article{16692e214e05418c8e541942f8130c9e,
title = "Selecting the tuning parameter in penalized Gaussian graphical models",
abstract = "Penalized inference of Gaussian graphical models is a way to assess the conditional independence structure in multivariate problems. In this setting, the conditional independence structure, corresponding to a graph, is related to the choice of the tuning parameter, which determines the model complexity or degrees of freedom. There has been little research on the degrees of freedom for penalized Gaussian graphical models. In this paper, we propose an estimator of the degrees of freedom in $\ell_1$-penalized Gaussian graphical models. Specifically, we derive an estimator inspired by the generalized information criterion and propose to use this estimator as the bias term for two information criteria. We call these tuning parameter selectors GAIC and GBIC. These selectors can be used to choose the tuning parameter, i.e., the optimal tuning parameter is the minimizer of GAIC or GBIC. A simulation study shows that GAIC tends to improve the performance of both AIC-type and CV-type model selectors, in terms of estimation quality (entropy loss function), in the high-dimensional setting. Moreover, the GBIC model selector improves the performance of both BIC-type and CV-type model selectors, in terms of support recovery (F-score). A data analysis shows that GBIC selects a tuning parameter that produces a sparser graph than BIC and a CV-type model selector (KLCV).",
keywords = "Penalized likelihood, Kullback-Leibler divergence, Model complexity, Model selection, Generalized information criterion",
author = "Antonino Abbruzzo and Ivan Vujacic and Mineo, {Angelo M.} and Wit, {Ernst C.}",
year = "2019",
month = may,
doi = "10.1007/s11222-018-9823-5",
language = "English",
volume = "29",
pages = "559--569",
journal = "Statistics and Computing",
issn = "0960-3174",
publisher = "Springer",
number = "3",

}
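The abstract describes selecting the tuning parameter as the minimizer of an information criterion built from a degrees-of-freedom estimator. The following is a minimal illustrative sketch, not the paper's method: the GAIC/GBIC bias terms rest on the authors' generalized-information-criterion estimator, whereas this sketch substitutes the common edge-count proxy for the degrees of freedom and uses scikit-learn's `GraphicalLasso` for the ℓ1-penalized fit.

```python
# Hedged sketch: pick the graphical-lasso tuning parameter by minimizing
# an AIC-style criterion over a grid. The degrees-of-freedom term here is
# the crude edge-count proxy (nonzero off-diagonal pairs), NOT the
# GIC-based estimator proposed in the paper.
import numpy as np
from sklearn.covariance import GraphicalLasso

def neg_loglik(S, Theta, n):
    """Gaussian negative log-likelihood of precision Theta (constants dropped)."""
    _, logdet = np.linalg.slogdet(Theta)
    return -0.5 * n * (logdet - np.trace(S @ Theta))

def select_lambda(X, grid):
    """Return the grid value minimizing 2 * neg_loglik + 2 * df."""
    n = X.shape[0]
    S = np.cov(X, rowvar=False)          # sample covariance
    scores = []
    for lam in grid:
        Theta = GraphicalLasso(alpha=lam, max_iter=200).fit(X).precision_
        df = np.count_nonzero(np.triu(Theta, k=1))  # one df per estimated edge
        scores.append(2.0 * neg_loglik(S, Theta, n) + 2.0 * df)
    return grid[int(np.argmin(scores))]

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))       # toy data with no true edges
selected = select_lambda(X, [0.05, 0.1, 0.2, 0.4])
print("selected lambda:", selected)
```

Swapping the `df` line for the paper's GIC-based estimator (and `2 * df` for `log(n) * df`) would turn this AIC-style selector into the GBIC-style one.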

RIS

TY - JOUR

T1 - Selecting the tuning parameter in penalized Gaussian graphical models

AU - Abbruzzo, Antonino

AU - Vujacic, Ivan

AU - Mineo, Angelo M.

AU - Wit, Ernst C.

PY - 2019/5

Y1 - 2019/5

N2 - Penalized inference of Gaussian graphical models is a way to assess the conditional independence structure in multivariate problems. In this setting, the conditional independence structure, corresponding to a graph, is related to the choice of the tuning parameter, which determines the model complexity or degrees of freedom. There has been little research on the degrees of freedom for penalized Gaussian graphical models. In this paper, we propose an estimator of the degrees of freedom in ℓ1-penalized Gaussian graphical models. Specifically, we derive an estimator inspired by the generalized information criterion and propose to use this estimator as the bias term for two information criteria. We call these tuning parameter selectors GAIC and GBIC. These selectors can be used to choose the tuning parameter, i.e., the optimal tuning parameter is the minimizer of GAIC or GBIC. A simulation study shows that GAIC tends to improve the performance of both AIC-type and CV-type model selectors, in terms of estimation quality (entropy loss function), in the high-dimensional setting. Moreover, the GBIC model selector improves the performance of both BIC-type and CV-type model selectors, in terms of support recovery (F-score). A data analysis shows that GBIC selects a tuning parameter that produces a sparser graph than BIC and a CV-type model selector (KLCV).

AB - Penalized inference of Gaussian graphical models is a way to assess the conditional independence structure in multivariate problems. In this setting, the conditional independence structure, corresponding to a graph, is related to the choice of the tuning parameter, which determines the model complexity or degrees of freedom. There has been little research on the degrees of freedom for penalized Gaussian graphical models. In this paper, we propose an estimator of the degrees of freedom in ℓ1-penalized Gaussian graphical models. Specifically, we derive an estimator inspired by the generalized information criterion and propose to use this estimator as the bias term for two information criteria. We call these tuning parameter selectors GAIC and GBIC. These selectors can be used to choose the tuning parameter, i.e., the optimal tuning parameter is the minimizer of GAIC or GBIC. A simulation study shows that GAIC tends to improve the performance of both AIC-type and CV-type model selectors, in terms of estimation quality (entropy loss function), in the high-dimensional setting. Moreover, the GBIC model selector improves the performance of both BIC-type and CV-type model selectors, in terms of support recovery (F-score). A data analysis shows that GBIC selects a tuning parameter that produces a sparser graph than BIC and a CV-type model selector (KLCV).

KW - Penalized likelihood

KW - Kullback-Leibler divergence

KW - Model complexity

KW - Model selection

KW - Generalized information criterion

U2 - 10.1007/s11222-018-9823-5

DO - 10.1007/s11222-018-9823-5

M3 - Article

VL - 29

SP - 559

EP - 569

JO - Statistics and Computing

JF - Statistics and Computing

SN - 0960-3174

IS - 3

ER -
