FastScanner

class glimix_core.lmm.FastScanner(y, X, QS, v)

Fast approximate inference over several covariates.

Specifically, it maximizes the marginal likelihood

p(𝐲)β±Ό = 𝓝(𝐲 | πš‡πœ·β±Ό + π™Όβ±ΌπœΆβ±Ό, 𝑠ⱼ(𝙺 + 𝑣𝙸)),

over 𝜷ⱼ, 𝜢ⱼ, and 𝑠ⱼ. Matrix 𝙼ⱼ is the candidate set defined by the user. Variance 𝑣 is not optimised, for performance reasons; the method assumes the user has provided a reasonable value for it.

Parameters
  • y – Real-valued outcome.

  • X – Matrix of covariates.

  • QS – Economic eigendecomposition ((Q0, Q1), S0) of K.

  • v – Variance due to iid effect.

Notes

The implementation deserves further explanation, as it is somewhat obscure. Let πš€πš‚πš€α΅€ = 𝙺, where πš€πš‚πš€α΅€ is the eigendecomposition of 𝙺. Let 𝙳 = (πš‚ + 𝑣𝙸) and 𝙳₀ = (πš‚β‚€ + 𝑣𝙸₀), where πš‚β‚€ is the part of πš‚ with positive values. Therefore, solving

(𝙺 + 𝑣𝙸)𝐱 = 𝐲

for 𝐱 is equivalent to solving

πš€β‚€π™³β‚€πš€β‚€α΅€π± + π‘£πš€β‚πš€β‚α΅€π± = πš€β‚€π™³β‚€πš€β‚€α΅€π± + 𝑣(𝙸 - πš€β‚€πš€β‚€α΅€)𝐱 = 𝐲.

for 𝐱. Let

𝙱 = πš€β‚€π™³β‚€β»ΒΉπš€β‚€α΅€ if 𝑣=0, and 𝙱 = πš€β‚€π™³β‚€β»ΒΉπš€β‚€α΅€ + 𝑣⁻¹(𝙸 - πš€β‚€πš€β‚€α΅€) if 𝑣>0.

We therefore have

𝐱 = 𝙱𝐲

as the solution of (𝙺 + 𝑣𝙸)𝐱 = 𝐲.
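The identity above can be checked numerically. The following is a plain-NumPy sketch mirroring the notation of these Notes (it is not the library's internal code): build 𝙱 from the economic eigendecomposition and verify that 𝐱 = 𝙱𝐲 solves (𝙺 + 𝑣𝙸)𝐱 = 𝐲.

```python
# Numerical check that x = B y solves (K + vI) x = y, with
# B = Q0 D0^-1 Q0^T + v^-1 (I - Q0 Q0^T) for v > 0.
# Illustrative sketch only; variable names follow the Notes.
import numpy as np

rng = np.random.default_rng(1)
n, v = 5, 0.5

G = rng.standard_normal((n, 3))
K = G @ G.T  # rank-deficient covariance

S, Q = np.linalg.eigh(K)
ok = S > 1e-9
Q0, S0 = Q[:, ok], S[ok]

D0 = S0 + v        # diagonal of D0 = S0 + v I0
I = np.eye(n)

# B as defined above (v > 0 case).
B = Q0 @ np.diag(1 / D0) @ Q0.T + (I - Q0 @ Q0.T) / v

y = rng.standard_normal(n)
x = B @ y  # (K + vI) x should recover y
```

This works because 𝙺 + 𝑣𝙸 = πš€β‚€π™³β‚€πš€β‚€α΅€ + 𝑣(𝙸 - πš€β‚€πš€β‚€α΅€) acts independently on the two orthogonal subspaces, so each part can be inverted separately.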

Let 𝐛ⱼ = [πœ·β±Όα΅€ πœΆβ±Όα΅€]α΅€ and 𝙴ⱼ = [πš‡ 𝙼ⱼ]. The optimal parameters according to the marginal likelihood are given by

(𝙴ⱼᡀ𝙱𝙴ⱼ)𝐛ⱼ = 𝙴ⱼᡀ𝙱𝐲

and

𝑠ⱼ = 𝑛⁻¹𝐲ᡀ𝙱(𝐲 - 𝙴ⱼ𝐛ⱼ).

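The closed-form solve above can be sketched in plain NumPy (again an illustration of the formulas, not the library's internals; X, M, and E follow the notation of the Notes): solve the normal equations (𝙴ⱼᡀ𝙱𝙴ⱼ)𝐛ⱼ = 𝙴ⱼᡀ𝙱𝐲 for the effect sizes, then plug 𝐛ⱼ back in for the scale.

```python
# Sketch of the optimal-parameter solve: b stacks beta_j (covariate
# effects) on top of alpha_j (candidate effects), and s is the scale.
# Illustrative names only; not glimix_core's implementation.
import numpy as np

rng = np.random.default_rng(2)
n, v = 6, 1.0

G = rng.standard_normal((n, 4))
K = G @ G.T
S, Q = np.linalg.eigh(K)
ok = S > 1e-9
Q0 = Q[:, ok]
D0 = S[ok] + v
I = np.eye(n)
B = Q0 @ np.diag(1 / D0) @ Q0.T + (I - Q0 @ Q0.T) / v

X = np.ones((n, 1))              # covariates (intercept only)
M = rng.standard_normal((n, 1))  # one candidate marker
E = np.hstack([X, M])            # E_j = [X M_j]
y = rng.standard_normal(n)

# (E^T B E) b = E^T B y  and  s = n^-1 y^T B (y - E b)
b = np.linalg.solve(E.T @ B @ E, E.T @ B @ y)
s = (y @ B @ (y - E @ b)) / n

beta, alpha = b[: X.shape[1]], b[X.shape[1]:]
```

At the optimum the generalized residual is 𝙱-orthogonal to the columns of 𝙴ⱼ, which is what the normal equations encode.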
__init__(y, X, QS, v)

Initialize self. See help(type(self)) for accurate signature.

Methods

__init__(y, X, QS, v)

Initialize self.

fast_scan(M[,Β verbose])

LMLs, fixed-effect sizes, and scales for single-marker scan.

null_lml()

Log of the marginal likelihood under the null model.

scan(M)

LML, fixed-effect sizes, and scale of the candidate set.

Attributes

null_beta

Optimal 𝜷 according to the marginal likelihood.

null_beta_covariance

Covariance of the optimal 𝜷 according to the marginal likelihood.

null_beta_se

Standard errors of the optimal 𝜷.

null_scale

Optimal 𝑠 according to the marginal likelihood.