Customer-obsessed science


- April 11, 2024: This year’s papers address topics such as speech enhancement, spoken-language understanding, dialogue, paralinguistics, and pitch estimation.
- April 03, 2024: Solution method uses new infrastructure that reduces proof-checking overhead by more than 90%.
- April 01, 2024: Caltech professor and Amazon Scholar John Preskill wins Bell Prize for applying both classical and quantum computing to the problem of learning from quantum experiments.
- March 25, 2024: Automated method that uses gradients to identify salient layers prevents regression on previously seen data.
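
The March 25 item is light on detail, but the idea it names, ranking layers by gradient magnitude on previously seen data and shielding the most salient ones from further updates, is simple to illustrate. Below is a minimal PyTorch sketch under that assumption; the function names and the top-k freezing policy are illustrative, not taken from the article.

```python
# Hedged sketch: rank parameter groups by accumulated gradient norm on old
# data, then freeze the most salient ones before fine-tuning on new data,
# so updates avoid parameters important to prior behavior. All names here
# (layer_saliency, freeze_salient, top_k) are illustrative assumptions.
import torch

def layer_saliency(model, data_loader, loss_fn, device="cpu"):
    """Accumulate per-parameter-group gradient norms over a pass of old data."""
    saliency = {name: 0.0 for name, _ in model.named_parameters()}
    model.to(device).train()
    for inputs, targets in data_loader:
        model.zero_grad()
        loss = loss_fn(model(inputs.to(device)), targets.to(device))
        loss.backward()
        for name, p in model.named_parameters():
            if p.grad is not None:
                saliency[name] += p.grad.detach().norm().item()
    return saliency

def freeze_salient(model, saliency, top_k=2):
    """Freeze the top_k highest-saliency parameter groups."""
    params = dict(model.named_parameters())
    for name, _ in sorted(saliency.items(), key=lambda kv: -kv[1])[:top_k]:
        params[name].requires_grad = False
```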
- April 11, 2024: An animation that projects traffic fluctuations onto the U.S. map offers an example of how the Supply Chain Optimization Technologies team uses data visualization to glean insights.
- NAACL 2024: In contemporary machine learning approaches to bilingual lexicon induction (BLI), a model learns a mapping between the embedding spaces of a language pair. Recently, a retrieve-and-rank approach to BLI has achieved state-of-the-art results on the task. However, the problem remains challenging in low-resource settings, due to the paucity of data. The task is complicated by factors such as lexical variation…
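
For readers unfamiliar with the setup this abstract assumes, a common BLI baseline learns an orthogonal (Procrustes) mapping between the two embedding spaces from a seed dictionary and retrieves translations by nearest neighbor in the mapped space. The sketch below shows only that baseline; the paper's retrieve-and-rank method and its low-resource techniques are not reproduced here, and all names are illustrative.

```python
# Hedged sketch of the classic BLI baseline: a closed-form orthogonal map
# between embedding spaces fit on seed-dictionary pairs, followed by
# cosine nearest-neighbor retrieval of translation candidates.
import numpy as np

def procrustes_map(X_src, Y_tgt):
    """Orthogonal W minimizing ||X_src @ W - Y_tgt||_F (rows are word pairs)."""
    U, _, Vt = np.linalg.svd(X_src.T @ Y_tgt)
    return U @ Vt

def retrieve(query_vecs, W, tgt_matrix, k=5):
    """Top-k target-word indices by cosine similarity in the mapped space."""
    mapped = query_vecs @ W
    mapped /= np.linalg.norm(mapped, axis=1, keepdims=True)
    tgt = tgt_matrix / np.linalg.norm(tgt_matrix, axis=1, keepdims=True)
    sims = mapped @ tgt.T
    return np.argsort(-sims, axis=1)[:, :k]
```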
- NAACL 2024: Large language models (LLMs) tend to integrate input context inadequately during text generation, relying excessively on prior knowledge encoded in the model parameters, which can result in generated text with factual inconsistencies or contextually unfaithful content. LLMs draw on two primary knowledge sources: 1) prior (parametric) knowledge from pretraining, and 2) contextual (non-parametric) knowledge…
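
The abstract stops before describing the paper's method, but one published remedy in this line of work is contrastive, context-aware decoding: compute next-token logits with and without the input context and amplify their difference, so tokens the context supports outweigh prior-only tokens. The sketch below illustrates that general idea, not this paper's specific approach; `logits_fn` is a hypothetical stand-in for any model call that returns next-token logits.

```python
# Hedged sketch of contrastive, context-aware decoding (illustrative only;
# not claimed to be the method of the paper above). `logits_fn` is a
# hypothetical callable: prompt string -> numpy array of next-token logits.
import numpy as np

def context_aware_logits(logits_fn, context, question, alpha=1.0):
    with_ctx = logits_fn(context + "\n" + question)   # contextual knowledge
    without_ctx = logits_fn(question)                 # parametric knowledge only
    # Upweight tokens the context supports, downweight prior-only tokens.
    return (1 + alpha) * with_ctx - alpha * without_ctx

def greedy_next_token(logits_fn, context, question, alpha=1.0):
    return int(np.argmax(context_aware_logits(logits_fn, context, question, alpha)))
```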
- NAACL 2024: Dictionary example sentences play an important role in illustrating word definitions and usage, but manually creating quality sentences is challenging. Prior works have demonstrated that language models can be trained to generate example sentences. However, they relied on costly customized models and word-sense datasets for generation and evaluation. Rapid advancements in foundational models…
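
The practical upshot of "rapid advancements in foundational models" here is that example-sentence generation can be attempted by prompting a general-purpose model rather than training a custom one. A minimal sketch of that framing follows; `complete` is a hypothetical text-completion callable, and the prompt wording is illustrative, not from the paper.

```python
# Hedged sketch: generating a dictionary example sentence for a specific
# word sense by prompting a foundation model. `complete` stands in for any
# text-completion API (hypothetical, not a real library call).
def example_sentence_prompt(word: str, definition: str) -> str:
    return (
        f"Write one natural dictionary example sentence for the word "
        f"'{word}' used in this sense: {definition}\n"
        f"Sentence:"
    )

def generate_example(complete, word: str, definition: str) -> str:
    """complete(prompt) -> str is a hypothetical LLM completion call."""
    return complete(example_sentence_prompt(word, definition)).strip()
```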
- April 09, 2024: How the team behind Echo Frames delivered longer battery life and improved sound quality inside the slim form factor of a pair of eyeglasses.
- March 27, 2024: The submission period opens March 27 and closes on May 7.
- March 21, 2024: The principal economist and his team address unique challenges using techniques at the intersection of microeconomics, statistics, and machine learning.