Research

Northwestern × LBNL JGI

GenomeOcean MoE

Research on Mixture of Experts architectures for genomic foundation models. Abstract submitted to ISMB 2026; paper in preparation.

NeMo · Megatron-Core · Mixture of Experts · Genomics

Context

Mixture of Experts has become a standard way to scale NLP foundation models with sub-linear compute growth — more total parameters without a proportional increase in per-token cost. This project investigates whether that scaling direction transfers to genomic foundation models. The work builds on GenomeOcean, an open-source prokaryotic genomic foundation model, and is a collaboration with Zhong Wang’s group at Lawrence Berkeley National Laboratory’s Joint Genome Institute.
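The sub-linear scaling comes from conditional computation: a router sends each token to only a few experts, so total parameters grow with the number of experts while per-token FLOPs stay roughly constant. The sketch below illustrates the idea with a generic top-k routed MoE layer in NumPy; it is not GenomeOcean's architecture, and all names (`W_router`, `W_experts`, `moe_layer`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 8, 4, 2

# Router and expert weights (each "expert" is a single linear map here;
# real MoE layers use small feed-forward networks per expert).
W_router = rng.normal(size=(d_model, n_experts))
W_experts = rng.normal(size=(n_experts, d_model, d_model))

def moe_layer(x):
    """Route each token to its top-k experts; mix outputs by router weight."""
    logits = x @ W_router                               # (tokens, n_experts)
    probs = np.exp(logits - logits.max(-1, keepdims=True))
    probs /= probs.sum(-1, keepdims=True)               # softmax over experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        top = np.argsort(probs[t])[-top_k:]             # k largest gates
        gates = probs[t, top] / probs[t, top].sum()     # renormalize over chosen
        for g, e in zip(gates, top):
            # Only top_k of n_experts experts run per token: compute per token
            # is fixed even as n_experts (total parameters) grows.
            out[t] += g * (x[t] @ W_experts[e])
    return out

tokens = rng.normal(size=(5, d_model))
y = moe_layer(tokens)
print(y.shape)  # (5, 8)
```

With `top_k = 2` and `n_experts = 4`, each token touches half the experts; doubling `n_experts` doubles total parameters but leaves per-token cost unchanged, which is the scaling property the project probes for genomic models.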

Status

An abstract describing the work has been submitted to ISMB 2026 (Intelligent Systems for Molecular Biology, the ISCB’s flagship conference). Decisions are pending. Methodology and results will be posted on this page after the paper is public; until then, architecture choices, experiments, and findings are held back to respect the review process and academic priority.