# BLiP: Bayesian Linear Programming

##### Posted on Feb 09, 2024

In genetic fine-mapping, high correlations between nearby genetic variants make it hard to identify the exact locations of causal variants.

The statistical task is to output as many disjoint regions containing a signal as possible, each as small as possible, while controlling false positives.

Similar problems arise when locating stars in astronomical surveys and in changepoint detection.

Common Bayesian approaches to these problems involve computing a posterior over signal locations.

However, existing procedures for translating these posteriors into credible regions for the signals fail to capture all the information in the posterior, leading to lower power and (sometimes) inflated false discoveries.

The paper introduces Bayesian Linear Programming (BLiP), which can efficiently convert any posterior distribution over signals into credible regions for the signals.

## Introduction

When variables are highly correlated, it can be nearly impossible to certify that any individual variable is important.

### Problem Statement

Because it can be very difficult to perfectly localize signals, we allow ourselves to discover any group or region $G\subset \cL$, which asserts that at least one signal exists in $G$.

Suppose we seek to discover signals in a set of locations $\cL$. Let $R\ge 0$ be the number of discoveries and let $G_1,\ldots, G_R$ denote the discovered regions. Given a notion of statistical power for a set of discovered regions, denoted by $Power(G_1,\ldots, G_R)$, we seek to maximize

$$\begin{aligned} \max \quad & \bbE[Power(G_1,\ldots, G_R)] \\ \text{s.t.} \quad & FDR \le q \\ & G_1, \ldots, G_R \subset \cL \text{ are disjoint} \end{aligned}$$

As an input, BLiP takes an approximate posterior distribution over the locations of the signals. For example, if $Y\mid X$ follows a generalized linear model (GLM) where nonzero coefficients are signals, one can use an MCMC algorithm to sample from the posterior distribution of the model coefficients and use the MCMC samples as the input for BLiP.
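As a toy illustration of this input format, group-level posterior inclusion probabilities can be estimated directly from MCMC samples by checking, in each draw, whether any location in the group is a signal. This is a minimal sketch; the simulated sample matrix and the `group_pip` helper are invented for illustration and are not BLiP's actual API.

```python
import numpy as np

# Hypothetical posterior samples of which locations contain signals.
# Row i is one MCMC draw; entry [i, j] = 1 if location j is a signal in draw i.
rng = np.random.default_rng(0)
marginal_probs = np.array([0.05, 0.6, 0.55, 0.05, 0.05])
samples = (rng.random((1000, 5)) < marginal_probs).astype(int)

def group_pip(samples, group):
    """Estimate P(at least one signal in `group` | data) from posterior samples."""
    return samples[:, list(group)].max(axis=1).mean()

# Correlated locations 1 and 2 may be hard to distinguish individually,
# but the group {1, 2} can still have a high inclusion probability.
print(group_pip(samples, {1}))     # individual PIP
print(group_pip(samples, {1, 2}))  # group PIP (always >= any individual PIP in the group)
```

The group PIP is always at least as large as any member's individual PIP, which is exactly why coarser regions can be discovered with confidence even when no single location can.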

BLiP will output a set of disjoint regions, each containing a signal, which nearly maximizes expected power while controlling false positives.

In variable selection problems, the goal is to:

• ensure nearly all discovered groups contain at least one signal variable
• discover as many groups as possible
• keep the groups as small as possible

SuSiE: uses a novel variational approximation (accurate when the number of signals is small) to approximate the posterior, then processes that approximate posterior to perform resolution-adaptive inference.

BLiP: performs only the latter task, but it can do so on any posterior.

## Bayesian Linear Programming for Resolution-Adaptive Signal Detection

Given a set of locations $\cL$ and data $\cD$, suppose a method discovers a disjoint set of groups $G_1,\ldots, G_R\subset\cL$. Let $w:2^{\cL}\rightarrow\IR$ be a weighting function on $2^{\cL}$, the set of all subsets of $\cL$. Define the resolution-adjusted power as follows: $$Power(G_1,\ldots, G_R) = \sum_{r=1}^R I_{G_r}w(G_r)\,,$$ where $I_{G_r}$ is the indicator of whether a signal truly exists in $G_r\subset \cL$.
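A minimal sketch of this metric, assuming the common inverse-size weighting $w(G)=1/|G|$ (the function and variable names here are illustrative):

```python
def resolution_adjusted_power(discovered_groups, true_signals,
                              w=lambda G: 1.0 / len(G)):
    """Sum of w(G_r) over discovered groups that truly contain a signal.

    The default weight w(G) = 1/|G| rewards smaller (finer-resolution) regions.
    """
    return sum(w(G) for G in discovered_groups if G & true_signals)

true_signals = {2, 7}
# Two true discoveries: a tight region around one signal, a wider one around the other.
groups = [{2}, {6, 7, 8}]
print(resolution_adjusted_power(groups, true_signals))  # 1/1 + 1/3 ≈ 1.333
```

Under this weighting, discovering a singleton region is worth three times as much as discovering a three-location region, formalizing the "as small as possible" goal above.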

### Bayesian Linear Programming for FDR Control

Given candidate groupings $\cG$, the first key observation is that maximizing power corresponds to maximizing a linear function.

Let $p_G=\bbE[I_G\mid\cD]$ denote the posterior inclusion probability of group $G$.

Let $x_G\in\{0, 1\}$ be the indicator of whether the procedure discovers group $G$; here $\{x_G\}_{G\in\cG}$ are the optimization variables, which completely determine our discoveries $G_1,\ldots, G_R$:

$\bbE[Power(G_1,\ldots, G_R)\mid \cD] = \sum_{G\in\cG} p_Gw(G)x_G$
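To make the optimization concrete, here is a small sketch of the LP relaxation using `scipy.optimize.linprog`. The candidate groups, PIPs, and the particular linear FDR constraint $\sum_G (1-p_G)x_G \le q\sum_G x_G$ are illustrative assumptions about how the program can be posed; BLiP's full procedure additionally handles rounding the relaxed solution back to integer discoveries.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical candidate groups over locations {0, 1, 2} with their PIPs.
groups = [{0}, {1}, {0, 1}, {2}]
pips = np.array([0.5, 0.45, 0.95, 0.2])
w = np.array([1.0 / len(G) for G in groups])  # inverse-size weights
q = 0.1  # FDR target

# Objective: maximize sum_G p_G * w(G) * x_G  (linprog minimizes, so negate).
c = -(pips * w)

# One linear form of the FDR constraint:
#   sum (1 - p_G) x_G <= q * sum x_G   <=>   sum (1 - p_G - q) x_G <= 0
A_fdr = (1.0 - pips - q).reshape(1, -1)

# Disjointness: each location appears in at most one discovered group.
locations = sorted(set().union(*groups))
A_disj = np.array([[1.0 if loc in G else 0.0 for G in groups]
                   for loc in locations])

A_ub = np.vstack([A_fdr, A_disj])
b_ub = np.concatenate([[0.0], np.ones(len(locations))])

# Relax x_G from {0, 1} to [0, 1] to obtain a linear program.
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * len(groups))
x = res.x  # relaxed solution; non-integer entries must be rounded separately
print(np.round(x, 3))
```

Note how the high-PIP group $\{0,1\}$ lets the procedure spend less of its error budget than either singleton alone; the LP trades resolution against confidence automatically through the objective weights.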

Published in categories Note