AI on demand: MinerU
Latest revision as of 13:37, 5 March 2026
stepping stone AG is proud to serve customers with current Language Models and OCR solutions. All of our LLM services are reachable from https://llm.stoney-cloud.com/.
Calling a model
We assume you've received your API key from us in the usual manner. The route above is OpenAI-compatible. For instance, you can retrieve the list of available models from https://llm.stoney-cloud.com/v1/models, which requires you to provide your key. As an initial service we provide access to several OCR models.
STONEY_KEY=<your-key-here>
MODEL_NAME=<hosted-model-name-here>
curl -s https://llm.stoney-cloud.com/v1/chat/completions \
-H "Authorization: Bearer $STONEY_KEY" \
-H "Content-Type: application/json" \
-d "{
\"model\":\"${MODEL_NAME}\",
\"messages\":[
{\"role\":\"user\",\"content\":\"Describe an imaginary document.\"}
],
\"max_tokens\":2000}" | jq .
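The model-listing route mentioned above can be queried the same way. A minimal sketch, assuming STONEY_KEY is set as shown and the response follows the usual OpenAI list format ({"object":"list","data":[{"id":...}, ...]}):

```shell
# List the models your key can access; jq extracts just the model IDs.
curl -s https://llm.stoney-cloud.com/v1/models \
  -H "Authorization: Bearer $STONEY_KEY" | jq -r '.data[].id'
```

Use one of the returned IDs as MODEL_NAME in the chat completion request above.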
Inspecting your usage
curl -s https://llm.stoney-cloud.com/v1/usage -H "Authorization: Bearer $STONEY_KEY" | jq .
We currently issue one key per model, so usage is tracked per model and per key.