Insights: containers/ramalama
Overview
1 Release published by 1 person

- v0.9.2, published Jun 16, 2025
35 Pull requests merged by 10 people

- Red Hat Konflux kflux-prd-rh03 update ramalama (#1542, merged Jun 17, 2025)
- Create tempdir when run as non-root user (#1551, merged Jun 17, 2025)
- Add GGML_VK_VISIBLE_DEVICES env var (#1547, merged Jun 17, 2025)
- Tabs to spaces (#1549, merged Jun 17, 2025)
- Run bats test with TMPDIR pointing at /mnt/tmp (#1548, merged Jun 17, 2025)
- model: always pass in GPU offloading parameters (#1502, merged Jun 17, 2025)
- Add dnf update -y to Fedora ROCm build (#1544, merged Jun 17, 2025)
- Deduplicate code (#1539, merged Jun 16, 2025)
- Downgrade whisper (#1543, merged Jun 16, 2025)
- Bump to v0.9.2 (#1537, merged Jun 16, 2025)
- Upgrade podman (#1540, merged Jun 16, 2025)
- Make minimum version of Python consistent (#1512, merged Jun 16, 2025)
- Convert tabs to spaces (#1538, merged Jun 16, 2025)
- honor the user specifying the image (#1527, merged Jun 16, 2025)
- chore: bump ramalama-stack to 0.2.1 (#1536, merged Jun 16, 2025)
- Not sure this is supposed to be here (#1535, merged Jun 16, 2025)
- Suggest using uv pip install to get missing module (#1532, merged Jun 16, 2025)
- Add ramalama chat command (#1531, merged Jun 15, 2025)
- Refactor config and arg typing (#1488, merged Jun 15, 2025)
- Change the FROM for asahi container image (#1523, merged Jun 15, 2025)
- Add colors to "ramalama serve" if we can (#1529, merged Jun 15, 2025)
- Add --all option to ramalama ls (#1528, merged Jun 14, 2025)
- Update to add multi-modal (#1522, merged Jun 13, 2025)
- Do not run with --tty when not in interactive mode (#1506, merged Jun 13, 2025)
- chore(common/intel_gpus): detect arc a770, a750 (#1517, merged Jun 13, 2025)
- fix(deps): update dependency huggingface-hub to ~=0.33.0 (#1505, merged Jun 13, 2025)
- This installs ramalama via uv if python3 version is too old (#1497, merged Jun 13, 2025)
- Wait for upto 16 seconds for model to load (#1510, merged Jun 13, 2025)
- Update black target version (#1513, merged Jun 13, 2025)
- For ramalama ls shorten huggingface lines (#1516, merged Jun 13, 2025)
- Add Python shebang files to linting (#1514, merged Jun 12, 2025)
- Ignore errors when removing snapshot directory (#1511, merged Jun 12, 2025)
- Increase retry attempts to attempt to connect to server (#1507, merged Jun 12, 2025)
- fix: remove unneeded dependency from Llama Stack container (#1503, merged Jun 11, 2025)
- This is not a multi-model model (#1499, merged Jun 11, 2025)
6 Pull requests opened by 4 people

- Remove libexec files (#1504, opened Jun 11, 2025)
- :latest tag should not be assumed for non-OCI artefacts (#1534, opened Jun 16, 2025)
- Trying to save space (#1541, opened Jun 16, 2025)
- Replace ramalama-client-code with ramalama chat (#1550, opened Jun 17, 2025)
- chore(deps): update registry.access.redhat.com/ubi9/ubi docker tag to v9.6-1749542372 (#1554, opened Jun 17, 2025)
- chore(deps): update registry.access.redhat.com/ubi9/ubi docker tag to v9.6-1749542372 (#1555, opened Jun 18, 2025)
7 Issues closed by 3 people

- ramalama-cli image for 0.7.5 reports 0.7.4 version (#1259, closed Jun 17, 2025)
- Fallback to Vulkan on unsupported AMD GPUs (#1482, closed Jun 17, 2025)
- Invalid argument to --image silently ignored (#1525, closed Jun 16, 2025)
- Distributed inferencing (#1115, closed Jun 13, 2025)
- Detects Intel Arc Graphics A770, A750 (#1515, closed Jun 13, 2025)
- `ramalama serve smolvlm` bailing out (#1508, closed Jun 12, 2025)
- `isort` not working on `make validate` after `make install-requirements` (#1178, closed Jun 11, 2025)
8 Issues opened by 6 people

- Suspect code in Model.garbage_collection (#1553, opened Jun 17, 2025)
- Exception starting RamaLama after a fresh install on macOS (#1552, opened Jun 17, 2025)
- keeping shortnames up to date (#1546, opened Jun 17, 2025)
- AI models in microVMs (#1533, opened Jun 16, 2025)
- No user-directed error if model fails to load (#1524, opened Jun 13, 2025)
- quay.io/ramalama/ramalama-rag:0.8.5 is broken (#1521, opened Jun 13, 2025)
- s3 pulling support (#1519, opened Jun 12, 2025)
- Reasoning flag (#1509, opened Jun 12, 2025)
15 Unresolved conversations

Sometimes conversations happen on old items that aren't yet closed. Here is a list of all the Issues and Pull Requests with unresolved conversations.

- cpu type rpc worker (#1485, commented on Jun 12, 2025; 2 new comments)
- Start process of moving python-ramalama to ramalama (#1498, commented on Jun 11, 2025; 1 new comment)
- Explore replacing python3 ollama puller with "podman artifact pull" (#1112, commented on Jun 11, 2025)
- "ramalama lightspeed" command (#1432, commented on Jun 11, 2025)
- Skip huggingface-cli fallback for llama-server style huggingface uris (#1493, commented on Jun 12, 2025)
- Using ramalama server or run with --rag induces a core dump (#1306, commented on Jun 13, 2025)
- RamaLama won't recognize RX5700XT (#804, commented on Jun 15, 2025)
- README example failing with `ModuleNotFoundError: No module named 'ramalama'` (#1368, commented on Jun 15, 2025)
- unusual podman error with CUDA CDI, but nvidia-smi works (#1487, commented on Jun 16, 2025)
- Radeon RX 6700 XT not utilized (#1129, commented on Jun 17, 2025)
- Dependency Dashboard (#136, commented on Jun 17, 2025)
- TMT: run tests with GPUs (#1101, commented on Jun 18, 2025)
- python package fixes (#1411, commented on Jun 13, 2025)
- Switch python3-ramalama to ramalama (#1433, commented on Jun 11, 2025)
- Do not attempt huggingface-cli fallback for llama-server style uris (#1494, commented on Jun 16, 2025)