5 Comments
EM:

Model providers will need to increase their context windows 💀 This is impressive!

mogwai.:

loool. most model providers are at a 1M-token context length as we speak.

Brown:

_Ah, all this English no too much for…_

Just kidding! This is excellent stuff; I will try it. But I'm curious: do you send this for every new session, along with the notes from previous conversations? If so, I presume each new session is going to hit the context limit and end quickly.

mogwai.:

your mental model for how commercial LLM products work is severely outdated. first of all, you can set this prompt globally (which i have), so i don't have to copy/paste it after the first submission.

secondly, context is managed expertly by LLM products now; sessions can effectively run forever.

Brown:

Oh wow! Thank you, I didn't know this.
