OpenAI API: How do I count tokens before(!) I send an API request?

OpenAI’s text models have a limited context length; for example, Curie has a context length of 2049 tokens.
The API provides the max_tokens and stop parameters to control the length of the generated sequence, so generation stops either when a stop token is produced or when max_tokens is reached.

The issue is that when generating text, I don’t know how many tokens my prompt contains, so I cannot set max_tokens = 2049 − number_tokens_in_prompt.

This prevents me from generating text dynamically for prompts of widely varying length. What I need is to keep generating until the stop token is reached.


My questions are:

  • How can I count the number of tokens in the prompt with the Python API, so that I can set the max_tokens parameter accordingly?
  • Is there a way to set max_tokens to the maximum allowed, so that I don’t need to count the prompt tokens myself?

> Solution:

As stated in the official OpenAI article:

To further explore tokenization, you can use our interactive Tokenizer tool, which allows you to calculate the number of tokens and see how text is broken into tokens. Alternatively, if you’d like to tokenize text programmatically, use Tiktoken as a fast BPE tokenizer specifically used for OpenAI models. Other such libraries you can explore as well include the transformers package for Python or the gpt-3-encoder package for NodeJS.
