ChatGPT creates its own language to extend conversations, and it’s getting weirder
It’s not Skynet yet, but ChatGPT just wrote its own language to extend its 8K-limited conversations. In a curious turn of events, ChatGPT users have discovered that GPT-4 can compress long conversations into a condensed language that you can later use as a new prompt.
Entering that compressed text as a new prompt essentially recreates the same conversation. In fact, it not only lets you pick up where you left off, but also extends ChatGPT conversations beyond the word limit.
As you know, ChatGPT has a word limit, although reports vary as to what that number is. Some say GPT-4 gives you about 25,000 words, while Jeremy Nguyen says it’s closer to 8,000. Most users never hit that cutoff, but it does mean ChatGPT may cut you off outright, even mid-response, if your conversation runs too long. You’ll then have to start a whole new chat, which can be very frustrating if you aren’t finished and need more information.
Dubbed Shogtongue by Fodor on Twitter, this new condensed language lets you bypass the word count so you can continue your conversation. That’s very useful, especially when your query turns into a complicated rabbit hole, when your chat stretches into a month-long conversation, or when you just need a good friend who remembers everything you’ve told them.
GPT-4 still needs context
According to Nguyen, ChatGPT doesn’t simply create compressed messages for you when you run out of words. You still have to ask it to compress the current conversation with very specific instructions.
In his example, he specifies that the compression should be “lossless, but produce the minimum number of tokens that can be fed into the LLM as-is and produce the same output.” He also instructs it to use a different language and notation, among other tricks.
You also need to provide this context when entering the condensed message in a new chat. Some GPT-4 users don’t appear to be doing this, but adding context helps minimize errors.
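The workflow above (ask GPT-4 for a lossless, token-minimal compression, then re-establish context in a fresh chat) can be sketched as two prompt templates in Python. The helper names and the restore wording here are assumptions for illustration; only the quoted compression criteria come from Nguyen’s example.

```python
# A minimal sketch of the two-step workflow, assuming you paste prompts
# into ChatGPT by hand. The instruction wording beyond Nguyen's quoted
# phrase, and both helper names, are hypothetical.

COMPRESS_INSTRUCTION = (
    "Compress the conversation above so the result is lossless, but "
    "produces the minimum number of tokens that can be fed into the LLM "
    "as-is and produce the same output. You may use a different "
    "language or notation."
)

RESTORE_INSTRUCTION = (
    "The text below is a compressed version of an earlier conversation. "
    "Decompress it and continue from where we left off."
)


def build_compress_prompt(conversation: str) -> str:
    """Append the compression request to the running conversation."""
    return f"{conversation}\n\n{COMPRESS_INSTRUCTION}"


def build_restore_prompt(compressed: str) -> str:
    """Give a fresh chat the context it needs to decode the blob."""
    return f"{RESTORE_INSTRUCTION}\n\n{compressed}"
```

Pasting the output of `build_compress_prompt` into your current chat yields the condensed blob; pasting `build_restore_prompt(blob)` into a new chat supplies the context that, per Nguyen, helps minimize errors.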
It’s worth noting that the language isn’t foolproof yet. According to Nguyen, GPT-3.5 can’t read the compressed language, while “GPT-4 through the API is difficult.” It works best with ChatGPT running GPT-4, an improved version of its predecessor. So, if you think this is going to be a long, complicated conversation that would make Proust proud, you’d better stick with that chatbot.
If you want to learn more about ChatGPT, check out everything you need to know about AI chatbots. Not a fan? Maybe you’ll like Google Bard better.