Quote:
Originally Posted by Dulu
So the question is, how far are we from making a game like P99 in AI?
If we accept the dubious premise that LLMs can already spit out a Doom-sized game with minimal intervention (human intervention cost probably scales more like exponentially with project size), and oversimplify a lot -
Compute cost as a function of codebase size scales somewhere between O(n) and O(n^2): quadratic for traditional attention, closer to linear with various linearity hacks and Chinchilla-style efficiency.
Doom was ~50k LOC
EverQuest through Velious probably had between 500k and 1m LOC (EQMacEMU's server is ~2m, for example).
Compute scales at something like 2x per 2-3 years, per Gelsinger's and Huang's competing updates of Moore's law.
So maybe between log2(500k/50k) * 2 years (linear scaling, fast doubling) and 2 * log2(1m/50k) * 3 years (quadratic scaling, slow doubling) - roughly 7 to 26 years for the compute power.
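For anyone who wants to check the napkin math, here's the same estimate as a tiny script (the function name and structure are mine; the numbers are straight from the post):

```python
import math

def years_for_compute(target_loc, base_loc=50_000, attn_exponent=1,
                      years_per_doubling=2):
    """Years of hardware scaling needed before compute that handles a
    Doom-sized (base_loc) codebase could handle target_loc.

    attn_exponent: 1 for linear-ish attention, 2 for traditional
    quadratic attention.
    years_per_doubling: 2-3 per the updated Moore's-law readings.
    """
    doublings = attn_exponent * math.log2(target_loc / base_loc)
    return doublings * years_per_doubling

# Optimistic: 500k LOC, linear scaling, doubling every 2 years.
low = years_for_compute(500_000, attn_exponent=1, years_per_doubling=2)
# Pessimistic: 1m LOC, quadratic scaling, doubling every 3 years.
high = years_for_compute(1_000_000, attn_exponent=2, years_per_doubling=3)
print(round(low), "to", round(high), "years")
```

Which lands on the same ~7 to ~26 year range.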
You will almost certainly run out of training data long before that. We're basically out now.
I hope that helps. I need to get back to debugging the COBOL Cursor keeps inserting into my secure networking code.
Edit: for an interesting, albeit bad, comparison: Anthropic claims their C-compiler experiment ran basically autonomously (it still required expert-developer-designed test cases and a CI pipeline that would be impossible for a game), taking 2 weeks and $20k to produce 100k LOC. It sometimes works.