2 Comments
Andrew

How do you think about the costs of LLM inference, and how does that shape the incentive and monetization direction of a game?

David Kaye

Well, it naturally inclines the design toward games-as-a-service models, as those are the only ones that can support the ongoing cost of inference (unless and until it becomes efficient enough to run locally).