ClipPhrase

Search: “models can”
825 clips found in 3 videos

Examples in video

YouTube · Dazed · 288K views · 2025-09-10
Alex Consani on how to walk, viral TikToks, Leo power and more | The dA-Zed Guide to Being
“models can be like whatever gender they”
>> I'd like to accomplish a goal where models can be like whatever gender they want.
Play at 5:12
YouTube · The Diary Of A CEO · 679K views · 2026-02-23
Uber CEO: I Have To Be Honest, AI Will Replace 9.4 Million Jobs At Uber!
“beings and when the models can learn”
but we're learning real time as human beings and when the models can learn real time
Play at 83:30
YouTube · Lex Fridman · 758K views · 2026-01-31
State of AI in 2026: LLMs, Coding, Scaling Laws, China, Agents, GPUs, AGI | Lex Fridman Podcast #490
“kickstarted a major change in what the models can do and how people use them.”
let the optimization go on to much bigger scales, which kind of kickstarted a major change in what the models can do and how people use them. - What kind of domains is RLVR amenable to?
Play at 100:20
YouTube · Lex Fridman · 758K views · 2026-01-31
State of AI in 2026: LLMs, Coding, Scaling Laws, China, Agents, GPUs, AGI | Lex Fridman Podcast #490
“these kind of more open-ended domains so the models can learn a lot more.”
the attention is, where they're trying to push this set of methods into these kind of more open-ended domains so the models can learn a lot more. - I think that's called reinforcement learning with AI feedback, right?
Play at 101:07
YouTube · Lex Fridman · 758K views · 2026-01-31
State of AI in 2026: LLMs, Coding, Scaling Laws, China, Agents, GPUs, AGI | Lex Fridman Podcast #490
“social and largely in the data in ways that AI models can't”
singularity idea, whereas I think research is messy, social and largely in the data in ways that AI models can't process. But what we do have today is really powerful and these
Play at 184:30
YouTube · Lex Fridman · 758K views · 2026-01-31
State of AI in 2026: LLMs, Coding, Scaling Laws, China, Agents, GPUs, AGI | Lex Fridman Podcast #490
“we talked about knowledge. Essentially, there's a ratio where big models can absorb”
models, you need more compute, because we talked about having more parameters and we talked about knowledge. Essentially, there's a ratio where big models can absorb more from data, and then you get more benefit out of this.
Play at 70:14
YouTube · Lex Fridman · 758K views · 2026-01-31
State of AI in 2026: LLMs, Coding, Scaling Laws, China, Agents, GPUs, AGI | Lex Fridman Podcast #490
“at some point you reach a limit, because small models can only do so much.”
attention mechanism. You get a solid understanding of how things work, but at some point you reach a limit, because small models can only do so much. The problem with learning about LLMs at scale is that it's
Play at 119:06
YouTube · Lex Fridman · 758K views · 2026-01-31
State of AI in 2026: LLMs, Coding, Scaling Laws, China, Agents, GPUs, AGI | Lex Fridman Podcast #490
“have today capable of doing? I think the language models can solve a lot”
ASI, Artificial Superintelligence, and what are the language models that we have today capable of doing? I think the language models can solve a lot of tasks, but a key milestone among the AI community is
Play at 159:13
YouTube · Lex Fridman · 758K views · 2026-01-31
State of AI in 2026: LLMs, Coding, Scaling Laws, China, Agents, GPUs, AGI | Lex Fridman Podcast #490
“we have foundation models we can specialize. That's like the”
is there a threshold for AGI? I think the real cool thing here is that we have foundation models we can specialize. That's like the breakthrough. Right now, I think we are not there yet because, first,
Play at 195:29
YouTube · Lex Fridman · 758K views · 2026-01-31
State of AI in 2026: LLMs, Coding, Scaling Laws, China, Agents, GPUs, AGI | Lex Fridman Podcast #490
“Smaller models from researchers can be something like five to 10 trillion.”
Pre-training dataset size is measured in trillions of tokens. Smaller models from researchers can be something like five to 10 trillion. Um,
Play at 69:08