OpenAI #4068
Environment
I am trying to use the OpenAI feature in TeXstudio. I believe I have it configured correctly; however, I keep receiving this error.
Traceback (most recent call last):
File "C:\Users\gerar\Downloads\openai_python_script.py", line 34, in response = send_message(message_log) File "C:\Users\gerar\Downloads\openai_python_script.py", line 22, in send_message return client.chat.completions.create( ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^ model="GPT-4o mini", ^^^^^^^^^^^^^^^^^^^^ ...<6 lines>... temperature=0.7, ^^^^^^^^^^^^^^^^ stream=True) ^^^^^^^^^^^^ File "C:\Users\gerar\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.13_qbz5n2kfra8p0\LocalCache\local-packages\Python313\site-packages\openai_utils_utils.py", line 287, in wrapper return func(args, *kwargs) File "C:\Users\gerar\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.13_qbz5n2kfra8p0\LocalCache\local-packages\Python313\site-packages\openai\resources\chat\completions\completions.py", line 925, in create return self._post( ~~~~~~~~~~^ "/chat/completions", ^^^^^^^^^^^^^^^^^^^^ ...<43 lines>... stream_cls=Stream[ChatCompletionChunk], ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ) ^ File "C:\Users\gerar\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.13_qbz5n2kfra8p0\LocalCache\local-packages\Python313\site-packages\openai_base_client.py", line 1239, in post return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)) ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "C:\Users\gerar\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.13_qbz5n2kfra8p0\LocalCache\local-packages\Python313\site-packages\openai_base_client.py", line 1034, in request raise self._make_status_error_from_response(err.response) from None openai.BadRequestError: Error code: 400 - {'error': {'message': 'invalid model ID', 'type': 'invalid_request_error', 'param': None, 'code': None}}
Process exited with error(s)
I am not sure where to go from here.
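The 400 "invalid model ID" points at the model argument rather than the TeXstudio configuration: "GPT-4o mini" is the product's display name, while the API expects the identifier gpt-4o-mini. Below is a minimal sketch of the corrected call, not the actual script: it assumes the OpenAI Python SDK v1+, an OPENAI_API_KEY environment variable, and that message_log is a list of chat messages; only the send_message frame is taken from the traceback above, the rest is illustrative.

# A minimal sketch, assuming the v1+ OpenAI Python SDK and that
# OPENAI_API_KEY is set in the environment. message_log is assumed to be
# a list of {"role": ..., "content": ...} dicts, which is what
# chat.completions expects; send_message mirrors the name in the traceback.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment


def send_message(message_log):
    # "GPT-4o mini" is the marketing name; the API wants "gpt-4o-mini".
    return client.chat.completions.create(
        model="gpt-4o-mini",
        messages=message_log,
        temperature=0.7,
        stream=True,
    )


message_log = [{"role": "user", "content": "Say hello."}]
for chunk in send_message(message_log):
    # With stream=True, the call yields ChatCompletionChunk objects whose
    # delta carries the next piece of the reply (None once the stream ends).
    piece = chunk.choices[0].delta.content
    if piece:
        print(piece, end="")
print()

If in doubt about which identifiers a key accepts, client.models.list() returns the model IDs available to that account.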
Comments

I used the macro you described. If you want, I can send you a copy to check, along with a screenshot of the Txs configuration.

The macro was developed by another user; any questions about it should be posted on his GitHub page.

I consider this answered.