Thursday Posters
Industry Poster
Scale-Invariant Learning-to-Rank
Christian Sommeregger (Expedia Group), Alessio Petrozziello (Expedia Group), Xiaoke Liu (Expedia Group) and Ye-Sheen Lim (Expedia Group).
Abstract
At Expedia, learning-to-rank (LTR) models play a key role on our website in sorting and presenting the information most relevant to users, such as search filters, property rooms, amenities, and images. A major challenge in deploying these models is ensuring consistent feature scaling between training and production data, as discrepancies can lead to unreliable rankings once deployed. Normalization techniques such as feature standardization and batch normalization could address these issues, but they are impractical in production due to their latency impact and the difficulty of distributed real-time inference. To address the feature scaling consistency issue, we introduce a scale-invariant LTR framework that combines a deep and a wide neural network to mathematically guarantee scale-invariance in the model at both training and prediction time.
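As a rough illustration of the scale-invariance property (not the authors' exact construction), the sketch below shows one way a deep-and-wide scorer can be made invariant to per-feature rescaling: normalizing each feature by its list-wise mean before scoring, so any positive scale factor applied to a feature in production cancels out. The class name, hidden sizes, and normalization choice here are illustrative assumptions.

```python
# Minimal sketch of a scale-invariant deep & wide ranker (illustrative, not the
# paper's exact architecture). All names and sizes are assumptions.
import torch
import torch.nn as nn

class ListwiseScaleInvariantRanker(nn.Module):
    def __init__(self, num_features: int, hidden: int = 32):
        super().__init__()
        # "Deep" part: small MLP over the list-normalized features.
        self.deep = nn.Sequential(
            nn.Linear(num_features, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )
        # "Wide" part: a linear layer over the same normalized inputs.
        self.wide = nn.Linear(num_features, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (list_size, num_features) -- all candidates for one query.
        # Dividing by the per-list mean removes any positive per-feature
        # scale c, since (c * x) / mean(c * x) == x / mean(x).
        norm = x / (x.mean(dim=0, keepdim=True) + 1e-12)
        return (self.deep(norm) + self.wide(norm)).squeeze(-1)

# Invariance check: rescaling features by positive constants leaves scores unchanged.
torch.manual_seed(0)
model = ListwiseScaleInvariantRanker(num_features=4)
x = torch.rand(10, 4) + 0.1                              # 10 candidates, 4 features
x_scaled = x * torch.tensor([1000.0, 0.01, 7.0, 1.0])    # per-feature rescaling
assert torch.allclose(model(x), model(x_scaled), atol=1e-5)
```

The assertion passes because the normalization makes the network's input, and hence its scores, identical for the original and rescaled feature matrices, which is the kind of training/production consistency the abstract targets.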