FR: A more lenient JSON parsing (allow markdown parsing) in inference structured calls · Issue #1267 · karakeep-app/karakeep · GitHub
FR: A more lenient JSON parsing (allow markdown parsing) in inference structured calls #1267
Open
@Nixellion

Description


Describe the Bug

Can't get Ollama to work.

Ollama itself runs on a separate server that's accessible on LAN. Other services can reach it and are using it fine.

However, in KaraKeep I get this:

2025-04-15T07:27:17.160Z error: [inference][4] inference job failed: ResponseError: json: cannot unmarshal object into Go struct field ChatRequest.format of type string
ResponseError: json: cannot unmarshal object into Go struct field ChatRequest.format of type string
    at checkOk (/app/apps/workers/node_modules/.pnpm/ollama@0.5.14/node_modules/ollama/dist/browser.cjs:77:9)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async post (/app/apps/workers/node_modules/.pnpm/ollama@0.5.14/node_modules/ollama/dist/browser.cjs:141:3)
    at async Ollama.processStreamableRequest (/app/apps/workers/node_modules/.pnpm/ollama@0.5.14/node_modules/ollama/dist/browser.cjs:252:25)
    at async OllamaInferenceClient.runModel (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:4271)
    at async OllamaInferenceClient.inferFromText (/app/apps/workers/node_modules/.pnpm/@hoarder+shared@file+packages+shared_better-sqlite3@11.3.0/node_modules/@hoarder/shared/inference.ts:2:5305)
    at async inferTagsFromText (/app/apps/workers/openaiWorker.ts:6:3133)
    at async inferTags (/app/apps/workers/openaiWorker.ts:6:3410)
    at async Object.runOpenAI [as run] (/app/apps/workers/openaiWorker.ts:6:6868)
    at async Runner.runOnce (/app/apps/workers/node_modules/.pnpm/liteque@0.3.2_better-sqlite3@11.3.0/node_modules/liteque/dist/runner.js:2:2656)
2025-04-15T07:27:17.171Z info: [inference][4] Starting an inference job for bookmark with id "d9608fk7q3tigncah1osfn7w"
2025-04-15T07:27:17.176Z error: [inference][4] inference job failed: ResponseError: json: cannot unmarshal object into Go struct field ChatRequest.format of type string
    [identical stack trace omitted; the job retries with the same error at 07:27:17.193 and 07:27:17.210]
2025-04-15T07:27:17.441Z info: [search][5] Completed successfully
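For context on the error itself: before structured outputs shipped in Ollama 0.5.0, the `ChatRequest.format` field was a plain string (`"json"`), while newer servers also accept a JSON-schema object. KaraKeep's structured calls send the object form, which an older server's Go decoder rejects with exactly this unmarshal error. A sketch of the two request shapes (the model and schema values below are illustrative, not KaraKeep's actual payload):

```typescript
// Older Ollama servers only parse `format` as a string.
const legacyRequest = {
  model: "qwen2.5-coder:7b-instruct-q4_0",
  messages: [{ role: "user", content: "Suggest tags as JSON." }],
  format: "json", // string form: accepted by all Ollama versions
};

// Servers >= 0.5.0 also accept a JSON-schema object; older ones fail with
// `json: cannot unmarshal object into Go struct field ChatRequest.format of type string`.
const structuredRequest = {
  model: "qwen2.5-coder:7b-instruct-q4_0",
  messages: [{ role: "user", content: "Suggest tags as JSON." }],
  format: {
    type: "object",
    properties: { tags: { type: "array", items: { type: "string" } } },
    required: ["tags"],
  },
};
```

If that reading is right, the fix on the user's side is upgrading the Ollama server rather than changing anything in KaraKeep's config.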

Here's my compose:

version: "3.8"
services:
  web:
    image: ghcr.io/karakeep-app/karakeep:${KARAKEEP_VERSION:-release}
    restart: unless-stopped
    volumes:
      # By default, the data is stored in a docker volume called "data".
      # If you want to mount a custom directory, change the volume mapping to:
      # - /path/to/your/directory:/data
      - data:/data
    ports:
      - 3355:3000
    environment:
      MEILI_ADDR: http://meilisearch:7700
      BROWSER_WEB_URL: http://chrome:9222
      KARAKEEP_VERSION: release
      NEXTAUTH_URL: http://192.168.17.19:3355
      NEXTAUTH_SECRET: xxxxxxx
      MEILI_MASTER_KEY: xxxxxxx
      OLLAMA_BASE_URL: http://192.168.17.192:11434
      INFERENCE_TEXT_MODEL: qwen2.5-coder:7b-instruct-q4_0
      INFERENCE_IMAGE_MODEL: minicpm-v:latest
      INFERENCE_CONTEXT_LENGTH: 2048
      # OPENAI_API_KEY: ...

      # You almost never want to change the value of the DATA_DIR variable.
      # If you want to mount a custom directory, change the volume mapping above instead.
      DATA_DIR: /data # DON'T CHANGE THIS
  chrome:
    image: gcr.io/zenika-hub/alpine-chrome:123
    restart: unless-stopped
    command:
      - --no-sandbox
      - --disable-gpu
      - --disable-dev-shm-usage
      - --remote-debugging-address=0.0.0.0
      - --remote-debugging-port=9222
      - --hide-scrollbars
  meilisearch:
    image: getmeili/meilisearch:v1.13.3
    restart: unless-stopped
    environment:
      MEILI_NO_ANALYTICS: "true"
    volumes:
      - meilisearch:/meili_data

volumes:
  meilisearch:
  data:
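The server version is easy to confirm: Ollama exposes it at `GET {OLLAMA_BASE_URL}/api/version`, which returns e.g. `{"version":"0.3.12"}`. A hypothetical helper (the `supportsStructuredFormat` name is mine) for interpreting that value, assuming the structured `format` object requires Ollama >= 0.5.0:

```typescript
// Hypothetical check: JSON-schema `format` objects need Ollama >= 0.5.0,
// the release that introduced structured outputs.
function supportsStructuredFormat(version: string): boolean {
  const [major, minor] = version.split(".").map(Number);
  return major > 0 || minor >= 5;
}

console.log(supportsStructuredFormat("0.3.12")); // false: predates structured outputs
console.log(supportsStructuredFormat("0.5.14")); // true
```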

Steps to Reproduce

  1. Set Ollama up as per the documentation
  2. Every inference job fails with the ChatRequest.format error above

Expected Behaviour

  1. Ollama-backed inference (tagging) works
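Separately, the feature the issue title asks for (lenient JSON parsing that tolerates markdown fences in structured responses) could look like this sketch. The `parseLenientJson` helper is hypothetical, not existing KaraKeep code:

```typescript
// Hypothetical lenient parser for structured inference responses:
// tolerate a markdown code fence (with or without a "json" tag) around
// the payload before handing it to JSON.parse.
function parseLenientJson(raw: string): unknown {
  let text = raw.trim();
  const fenced = text.match(/^```(?:json)?\s*([\s\S]*?)\s*```$/);
  if (fenced) {
    text = fenced[1]; // keep only the fence's contents
  }
  return JSON.parse(text);
}
```

Plain JSON passes through unchanged, so the lenient path only changes behavior for models that wrap their output in markdown.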

Screenshots or Additional Context

No response

Device Details

No response

Exact Karakeep Version

0.23.2

Have you checked the troubleshooting guide?

  • I have checked the troubleshooting guide and I haven't found a solution to my problem

Metadata

Assignees

No one assigned

    Labels

    bug (Something isn't working) · pri/medium (Med priority issue) · status/approved (This issue is ready to be implemented)
