All Tags
Browse through all available tags to find articles on topics that interest you.
Showing 1 result for this tag.
Prediction-powered Inference by Mixture of Experts
This paper introduces a Mixture of Experts (MoE)-powered semi-supervised inference framework that enhances Prediction-Powered Inference (PPI) by leveraging multiple predictors. The framework adapts to unknown predictor performance, combines their collective power, and offers a best-expert guarantee, improving inferential efficiency when unlabeled data are abundant.
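To give a feel for the idea, here is a minimal sketch of combining several predictors in a PPI-style mean estimate. The synthetic data, the two toy "experts", and the inverse-variance weighting of their rectifiers are all illustrative assumptions, not the paper's actual MoE scheme; each classic PPI estimate is unbiased, so any convex combination of them remains unbiased.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup (assumption): Y = X + noise, target is E[Y] = 0.
N, n = 10000, 200                      # unlabeled / labeled sample sizes
x_unlab = rng.normal(size=N)           # abundant unlabeled covariates
x_lab = rng.normal(size=n)             # scarce labeled covariates
y_lab = x_lab + rng.normal(scale=0.5, size=n)

# Two hypothetical experts of unknown quality: one good, one badly biased.
experts = [lambda x: x, lambda x: 0.2 * x]

def ppi_mean(f):
    # Classic PPI point estimate of E[Y]: mean prediction on the
    # unlabeled data plus a rectifier computed on the labeled data.
    return f(x_unlab).mean() + (y_lab - f(x_lab)).mean()

# Illustrative stand-in for the MoE weighting: weight each expert
# inversely to the variance of its rectifier on the labeled data,
# so better predictors get more weight.
rects = [y_lab - f(x_lab) for f in experts]
weights = np.array([1.0 / r.var() for r in rects])
weights /= weights.sum()

theta = sum(w * ppi_mean(f) for w, f in zip(weights, experts))
print(theta)  # close to the true mean E[Y] = 0
```

Because the weights sum to one and each expert's PPI estimate is unbiased, the combined estimate stays unbiased while favoring the lower-variance expert.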