PODCAST

AI Post Transformers

AI-generated podcast where hosts Hal Turing and Dr. Ada Shannon discuss the latest research papers and reports in machine learning, AI systems, and optimization. Featuring honest critical analysis, proper citations, and nerdy humor.

All Episodes

Dario Amodei: Machines of Loving Grace · 2026/02/11 · en
LongCat: Scaling Embeddings Outperforms Scaling... · 2026/02/11 · en
Reinforced Attention Learning · 2026/02/11 · en
Sapient Intelligence: Hierarchical Reasoning Model · 2026/02/11 · en
A Comprehensive Survey of Mixture-of-Experts:... · 2026/02/11 · en
Advances in Attention Distillation for Efficient... · 2026/02/11 · en
ChunkKV: Semantic-Preserving KV Cache Compression for... · 2026/02/11 · en
DR. KERNEL: Reinforcement Learning for Optimized... · 2026/02/11 · en · 00:15:17
Towards a Science of Scaling Agent Systems · 2026/02/09 · en · 00:15:29
Moloch’s Bargain: Market Incentives and the Rise of... · 2026/02/06 · en · 00:15:38
Claude Opus 4.6 Technical Report and Agent Capabilities · 2026/02/06 · en · 00:15:50
Advancing regulatory variant effect prediction with... · 2026/02/06 · en · 00:16:57
Uncertainty-aware genomic deep learning with... · 2026/02/06 · en · 00:16:15
Distilling GNN Knowledge into Non-Neural Cell Graph... · 2026/02/06 · en · 00:19:01
DeepSearchQA: Bridging the Comprehensiveness Gap for... · 2026/02/06 · en · 00:17:35
Reinforcement Learning via Self-Distillation · 2026/02/06 · en · 00:37:19
On-Policy Self-Distillation for Advanced LLM Reasoning · 2026/02/06 · en · 00:37:00
Knowledge distillation to context distillation · 2026/02/06 · en · 00:18:29
2015: Distilling the Knowledge in a Neural Network · 2026/02/06 · en
990 results