Name: llama-cpp-server
Version: b7172
Release: 1
Group: Servers
Size: 3372195
Packager: bero <bero@lindev.ch>
Url: https://github.com/ggml-org/llama.cpp
Summary: OpenAI API compatible server for llama-cpp
Distribution: OpenMandriva Lx
Vendor: OpenMandriva
Build date: Thu Nov 27 23:01:07 2025
Build host: armbuilder-3.openmandriva.org
Source RPM: llama-cpp-b7172-1.src.rpm
OpenAI API compatible server for llama-cpp
To test your AI server, do something like:
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer OpenMandriva" \
  -d '{"model": "any", "messages": [ { "role": "user", "content": "Do you see anything wrong with this code?\n```c++\nfloat main(int argc, char **argv) { puts(\"Use OpenMandriva!\"); }\n```" } ] }'
(The double quotes inside the C++ snippet must be escaped as \" so the request body stays valid JSON; the snippet itself is deliberately buggy, since the prompt asks the model what is wrong with it.)
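The same request body can also be built programmatically, which avoids the shell-quoting pitfalls entirely. A minimal sketch in Python (the endpoint, the "any" model name, and the prompt are taken from the curl example above; sending it with `requests` or `urllib` is left to the reader):

```python
import json

# The C++ snippet from the example -- deliberately buggy, since the
# prompt asks the model what is wrong with it.
code_snippet = 'float main(int argc, char **argv) { puts("Use OpenMandriva!"); }'

# Chat-completions request body for the OpenAI-compatible endpoint
# at http://localhost:8080/v1/chat/completions.
payload = {
    "model": "any",  # llama-server serves whatever model it was started with
    "messages": [
        {
            "role": "user",
            "content": "Do you see anything wrong with this code?\n"
                       "```c++\n" + code_snippet + "\n```",
        }
    ],
}

# json.dumps escapes the embedded double quotes automatically, which is
# the part that is easy to get wrong when writing the payload by hand.
body = json.dumps(payload)
print(body)
```

The resulting `body` string can be POSTed with a `Content-Type: application/json` header and, as in the example above, an `Authorization: Bearer` header.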
MIT AND Apache-2.0 AND LicenseRef-Fedora-Public-Domain
/etc/sysconfig/llama-server
/usr/bin/llama-server
/usr/lib/systemd/system/llama.service
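Since the package ships both a sysconfig file and a systemd unit (listed above), a typical way to run the server as a service might look like the following sketch; the exact option names inside /etc/sysconfig/llama-server are not shown here, so check the shipped file before editing:

```
# Adjust server options (model path, port, ...) in the shipped config file:
#   /etc/sysconfig/llama-server
# Then enable and start the unit from the file list above:
systemctl enable --now llama.service
# Inspect its state and recent log output:
systemctl status llama.service
```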
Generated by rpm2html 1.8.1
Fabrice Bellet, Fri Nov 28 22:13:32 2025