Quote:
Originally Posted by shovelquest
Can I not just run that on my regular computer? Do I have to have those specs or something?
it will unironically be like the AI from Hitchhiker's Guide if it works at all. instead of the response just printing out instantly like you're normally used to seeing on these websites, imagine waiting like a minute per letter to appear on the screen.
it's probably even longer than a minute per character depending on hardware. DeepSeek R1 is 671 billion parameters; my computer was struggling to run a local LLM that was only 7 billion parameters.
my pc is also old as fuck though
but yeah, this is literally some "if your PC can't run Crysis, don't bother" type shit. basically just give your data to whoever wants to host an LLM for you
Quote:
Another way to think about this is that, accounting for Moore’s Law + inflation (2% per year), you’ll be able to buy enough compute to run R1 locally for $1000 in 5-7 years, so 2030-2032
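The quoted projection roughly checks out with back-of-the-envelope math. A minimal sketch, assuming compute cost per dollar halves every ~2 years (the Moore's-Law-ish part), the budget of $1000 inflates at 2% per year, and today's rig costs the $6k-$13k figures mentioned below (all assumed inputs, not the poster's actual calculation):

```python
import math

def years_until_affordable(cost_now, budget=1000.0,
                           halving_years=2.0, inflation=0.02):
    """Solve cost_now * 2**(-t / halving_years) == budget * (1 + inflation)**t
    for t: the hardware price decays while the fixed dollar budget inflates."""
    return math.log(cost_now / budget) / (
        math.log(2) / halving_years + math.log(1 + inflation)
    )

low = years_until_affordable(6000)    # cheap end of today's rig estimates
high = years_until_affordable(13000)  # expensive end
print(f"{low:.1f} to {high:.1f} years")  # roughly 5 to 7 years
```

Starting from 2025, that puts the crossover around 2030-2032, matching the quote.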
6 grand or even 13 is still insanely cheap for what it is, especially when you remember what PCs cost in the 1980s and adjust for inflation
i just googled what my family's first computer cost in 1985, and this LLM setup is only 2 grand more adjusted for inflation
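For anyone who wants to redo that comparison with their own numbers, here's a minimal sketch. The 1985 sticker price and the ~2.9% average US inflation rate over 1985-2025 are assumed illustrative figures, not the poster's actual ones:

```python
def inflation_adjust(price, years, rate=0.029):
    """Compound a historical price forward at an assumed average
    annual inflation rate to get its equivalent in today's dollars."""
    return price * (1 + rate) ** years

# e.g. a hypothetical $1,500 home PC bought in 1985, brought forward 40 years:
adjusted = inflation_adjust(1500, 40)
print(f"${adjusted:,.0f}")  # roughly $4,700 in 2025 dollars
```

So a mid-range 1985 machine already lands in the same ballpark as the cheap end of an R1-capable rig once you adjust for inflation.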