
How do you think about the costs of LLM inference, and what impact does that have on the incentive / monetization direction of a game?


Well, it naturally inclines the design towards games-as-a-service models, since those are the only ones that can support the ongoing cost of inference (unless and until it becomes efficient enough to run locally).
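For a rough sense of the economics, here's a minimal back-of-envelope sketch in Python; every number in it (playtime, tokens per hour, $ per million tokens) is a hypothetical placeholder, not real usage data or real pricing.

```python
# Back-of-envelope sketch (illustrative numbers only) of why per-player
# inference cost pushes game design toward recurring-revenue models.

def monthly_inference_cost(
    hours_played_per_month: float = 20.0,     # assumed average playtime
    tokens_per_hour: float = 50_000.0,        # assumed LLM tokens used per hour of play
    cost_per_million_tokens: float = 1.0,     # assumed blended $/1M tokens (hypothetical)
) -> float:
    """Rough per-player monthly inference bill under the stated assumptions."""
    tokens = hours_played_per_month * tokens_per_hour
    return tokens / 1_000_000 * cost_per_million_tokens

if __name__ == "__main__":
    monthly = monthly_inference_cost()
    print(f"~${monthly:.2f} per player per month")
    # At these numbers a one-time $60 purchase covers roughly five years of
    # inference, but heavier usage or pricier models shrink that runway fast,
    # which is why a subscription / live-service model is the safer fit.
```

The point of the sketch is only that the cost scales with hours played, so a one-time purchase price has to be amortized against an open-ended bill, whereas a subscription or service model keeps revenue aligned with usage.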
