All Tags
Browse through all available tags to find articles on topics that interest you.
Showing 2 results for this tag.
Invariant Transformation and Resampling based Epistemic-Uncertainty Reduction
This paper introduces a resampling technique for trained AI models that leverages invariant transformations of input data to reduce epistemic uncertainty and improve inference accuracy. By aggregating inferences from multiple transformed samples, the method offers a way to enhance performance without re-training, potentially balancing model size and effectiveness.
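The core idea resembles test-time aggregation over label-preserving transforms. Below is a minimal sketch of that idea, not the paper's exact method: the model, the transform set, and the averaging rule are all illustrative assumptions.

```python
import torch
import torch.nn as nn

def invariant_resample_predict(model: nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Average class probabilities over transformed copies of `x` (an NCHW batch)."""
    # Transforms assumed to leave the true label unchanged for the task at hand,
    # e.g. horizontal/vertical flips in many image-classification settings.
    transforms = [
        lambda t: t,                          # identity
        lambda t: torch.flip(t, dims=[-1]),   # horizontal flip
        lambda t: torch.flip(t, dims=[-2]),   # vertical flip
    ]
    model.eval()
    with torch.no_grad():
        probs = [torch.softmax(model(f(x)), dim=-1) for f in transforms]
    # Aggregating over transformed samples reduces variance in the estimate
    # without any re-training of the underlying model.
    return torch.stack(probs).mean(dim=0)

# Usage with a stand-in classifier (any trained model works the same way):
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
x = torch.randn(4, 3, 32, 32)
print(invariant_resample_predict(model, x).shape)  # torch.Size([4, 10])
```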
Large Language Models for Limited Noisy Data: A Gravitational Wave Identification Study
This paper investigates the effectiveness of large language models (LLMs) for identifying gravitational wave signals in complex, noisy observational data. The approach outperforms traditional neural networks while using significantly fewer labeled samples and no large simulated datasets.