Machine Learning 2.0 with Hugging Face Transformers - Julien Simon
Conference: Conference Talks
According to the latest State of AI report, "transformers have emerged as a general-purpose architecture for ML. Not just for Natural Language Processing, but also Speech, Computer Vision or even protein structure prediction." Indeed, the Transformer architecture has proven highly effective across a wide variety of Machine Learning tasks. But how can we keep up with the frantic pace of innovation? Do we really need expert skills to leverage these state-of-the-art models? Or is there a shorter path to creating business value in less time? In this code-level talk, we'll gradually build and deploy a Machine Learning application based on Transformer models. Along the way, you'll learn about the portfolio of open-source and commercial Hugging Face solutions, and how they can help you become hyper-productive and deliver high-quality Machine Learning solutions faster than ever before.

Julien is currently Chief Evangelist at Hugging Face. He recently spent 6 years at Amazon Web Services, where he was the Global Technical Evangelist for AI & Machine Learning. Prior to joining AWS, Julien served for 10 years as CTO/VP of Engineering at large-scale startups.

✅ Connect with Julien: https://www.linkedin.com/in/juliensimon/
✅ Connect with Optimized AI Conference on LinkedIn: https://www.linkedin.com/company/oaiconference/
✅ Connect with Optimized AI Conference on Twitter: https://x.com/southerndsc
✅ Visit the Optimized AI Conference website: https://www.oaiconference.com/
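To give a flavor of the "shorter path" the abstract alludes to, here is a minimal sketch (not from the talk itself) using the Hugging Face `transformers` pipeline API, which lets you run a state-of-the-art Transformer model in a few lines; the example task and input text are illustrative choices, not taken from the session.

```python
# Minimal sketch: running a Transformer model with the Hugging Face pipeline API.
# Assumes the `transformers` library is installed (pip install transformers).
from transformers import pipeline

# Build a text-classification pipeline; with no model specified,
# a default pretrained sentiment-analysis model is downloaded.
classifier = pipeline("sentiment-analysis")

# Run inference on an example sentence; the result is a list of
# dicts, each with a predicted "label" and a confidence "score".
result = classifier("Transformers make state-of-the-art ML much easier to use.")
print(result)
```

The same one-liner pattern extends to other tasks (e.g. `"translation"`, `"image-classification"`, `"automatic-speech-recognition"`), which is the productivity argument the talk develops in more depth.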