OpenAI #4068

Closed
BeerStud opened this issue May 9, 2025 · 4 comments

Comments

@BeerStud commented May 9, 2025

Environment

  • TeXstudio: 4.8.7 (git 4.8.7)
  • Qt: 6.9.0 (compiled with Qt 6.9.0)
  • OS: Windows 10, 64-bit
  • TeX distribution: MiKTeX

I am trying to use the OpenAI feature in TeXstudio. I believe I have it configured correctly; however, I keep receiving this error:

Traceback (most recent call last):
  File "C:\Users\gerar\Downloads\openai_python_script.py", line 34, in <module>
    response = send_message(message_log)
  File "C:\Users\gerar\Downloads\openai_python_script.py", line 22, in send_message
    return client.chat.completions.create(
        model="GPT-4o mini",
        ...<6 lines>...
        temperature=0.7,
        stream=True)
  File "C:\Users\gerar\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.13_qbz5n2kfra8p0\LocalCache\local-packages\Python313\site-packages\openai\_utils\_utils.py", line 287, in wrapper
    return func(*args, **kwargs)
  File "C:\Users\gerar\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.13_qbz5n2kfra8p0\LocalCache\local-packages\Python313\site-packages\openai\resources\chat\completions\completions.py", line 925, in create
    return self._post(
        "/chat/completions",
        ...<43 lines>...
        stream_cls=Stream[ChatCompletionChunk],
    )
  File "C:\Users\gerar\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.13_qbz5n2kfra8p0\LocalCache\local-packages\Python313\site-packages\openai\_base_client.py", line 1239, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "C:\Users\gerar\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.13_qbz5n2kfra8p0\LocalCache\local-packages\Python313\site-packages\openai\_base_client.py", line 1034, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': 'invalid model ID', 'type': 'invalid_request_error', 'param': None, 'code': None}}
Process exited with error(s)

I am not sure where to go from here.
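
For reference, the 400 "invalid model ID" in the traceback points at the model argument: the API expects the model identifier gpt-4o-mini (lowercase, hyphenated), not the display name "GPT-4o mini". Below is a minimal sketch of a corrected call, assuming the openai Python package v1+ and an API key exported as OPENAI_API_KEY:

    from openai import OpenAI

    # The client reads OPENAI_API_KEY from the environment by default.
    client = OpenAI()

    # The 400 "invalid model ID" comes from passing the display name
    # "GPT-4o mini"; the API wants the lowercase, hyphenated identifier.
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Hello"}],
        temperature=0.7,
    )
    print(response.choices[0].message.content)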

@sunderme (Member) commented

Which OpenAI feature did you use? The macro described here?

TXS supports OpenAI directly; see the manual.

@BeerStud (Author) commented

I used the macro you described. If you want, I can send you a copy to check, along with a screenshot of the TXS configuration.

@sunderme (Member) commented

The macro was developed by another user; any questions about it should be posted on his GitHub page.
That said, I would still recommend using TXS's own feature instead.

@sunderme (Member) commented

I consider this answered.
