172: Transformers and Large Language Models

11 Mar 2024 • 86 min • EN

Intro topic: Is WFH actually WFC?

News/Links:
- Falsehoods Junior Developers Believe about Becoming Senior
  https://vadimkravcenko.com/shorts/falsehoods-junior-developers-believe-about-becoming-senior/
- Pure Pursuit
  Tutorial with Python code: https://wiki.purduesigbots.com/software/control-algorithms/basic-pure-pursuit
  Video example: https://www.youtube.com/watch?v=qYR7mmcwT2w
- PID Without a PhD
  https://www.wescottdesign.com/articles/pid/pidWithoutAPhd.pdf
- Google releases Gemma
  https://blog.google/technology/developers/gemma-open-models/

Book of the Show:
- Patrick: The Eye of the World by Robert Jordan (Wheel of Time)
  https://amzn.to/3uEhg6v
- Jason: How to Make a Video Game All By Yourself
  https://amzn.to/3UZtP7b

Patreon Plug: https://www.patreon.com/programmingthrowdown?ty=h

Tool of the Show:
- Patrick: Stadia Controller Wifi to Bluetooth Unlock
  https://stadia.google.com/controller/index_en_US.html
- Jason: FUSE and SSHFS
  https://www.digitalocean.com/community/tutorials/how-to-use-sshfs-to-mount-remote-file-systems-over-ssh

Topic: Transformers and Large Language Models
- How neural networks store information
- Latent variables
- Transformers
  - Encoders & Decoders
  - Attention Layers
- History
  - RNN
    - Vanishing Gradient Problem
  - LSTM
    - Short term (gradient explodes), long term (gradient vanishes)
- Differentiable algebra
- Key-Query-Value
- Self Attention (a minimal code sketch follows below)
- Self-Supervised Learning & Forward Models
- Human Feedback
  - Reinforcement Learning from Human Feedback
  - Direct Preference Optimization (Pairwise Ranking)

★ Support this podcast on Patreon ★
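The topic outline above walks through key-query-value projections and self-attention. As a rough companion (not something presented in the episode), here is a minimal single-head scaled dot-product self-attention sketch in plain NumPy; the projection matrices Wq/Wk/Wv, the toy dimensions, and the function names are illustrative assumptions.

```python
# Minimal single-head self-attention sketch (illustrative, NumPy only).
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over one sequence.

    X          : (seq_len, d_model) token embeddings
    Wq, Wk, Wv : (d_model, d_head) learned projection matrices (random here)
    Returns    : (seq_len, d_head) contextualized token representations
    """
    Q = X @ Wq                            # queries: what each token is looking for
    K = X @ Wk                            # keys: what each token offers for matching
    V = X @ Wv                            # values: the content that gets mixed together
    d_head = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_head)    # (seq_len, seq_len) pairwise relevance
    weights = softmax(scores, axis=-1)    # each row sums to 1
    return weights @ V                    # weighted blend of values per token

# Toy usage: 4 tokens, 8-dim embeddings, 4-dim attention head.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 4)
```

Each row of the attention weights says how strongly a given token attends to every other token; stacking several such heads and adding feed-forward layers, residual connections, and normalization is essentially what a transformer encoder block does.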

From "Programming Throwdown"

