Breaking the Memory Barrier

Oct 27, 2024 · 15m 32s
Description

🧠 Breaking the Memory Barrier: Near Infinite Batch Size Scaling for Contrastive Loss

This research paper introduces Inf-CL, a novel approach for contrastive learning that dramatically reduces GPU memory usage during training, allowing near-infinite batch sizes. The authors address the quadratic growth of the similarity matrix in traditional methods by implementing a tile-based computation strategy that partitions the contrastive loss calculation into smaller, sequentially computed blocks. To further enhance efficiency, they propose a multi-level tiling strategy that leverages ring-based communication at the GPU level and fused kernels at the CUDA-core level, minimizing I/O overhead. The experiments demonstrate that Inf-CL significantly outperforms previous methods, achieving unprecedented batch sizes while maintaining accuracy and comparable training speed. This breakthrough opens new possibilities for large-scale contrastive learning, paving the way for advances in areas such as self-supervised learning and dense text retrieval.

📎 Link to paper
Information
Author Shahriar Shariati
Organization Shahriar Shariati
