AI on Demand: MinerU
stepping stone AG is proud to serve customers with current Language Models and OCR solutions.
All of our LLM services are reachable from https://llm.stoney-cloud.com/

== Calling a model ==

We assume you have received your API key from us through our usual channel. The route above is available in the usual OpenAI-compatible manner. For instance, you can retrieve the list of available models via https://llm.stoney-cloud.com/v1/models, which requires you to provide your key. As an initial service, we provide access to some OCR models.
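Listing the models behind your key can be done directly with curl; a minimal sketch (the `.data[].id` jq filter assumes the standard OpenAI-style list response):

<syntaxhighlight lang="bash">
# List the model IDs available to your key (OpenAI-compatible /v1/models endpoint)
curl -s https://llm.stoney-cloud.com/v1/models \
  -H "Authorization: Bearer $STONEY_KEY" | jq -r '.data[].id'
</syntaxhighlight>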
<syntaxhighlight lang="bash">
STONEY_KEY=<your-key-here>
MODEL_NAME=<hosted-model-name-here>

curl -s https://llm.stoney-cloud.com/v1/chat/completions \
  -H "Authorization: Bearer $STONEY_KEY" \
  -H "Content-Type: application/json" \
  -d "{
        \"model\":\"${MODEL_NAME}\",
        \"messages\":[
          {\"role\":\"user\",\"content\":\"Describe an imaginary document.\"}
        ],
        \"max_tokens\":2000}" | jq .
</syntaxhighlight>
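Responses follow the usual OpenAI chat-completions shape, so the assistant's text sits at `.choices[0].message.content`. A sketch that prints only the reply, assuming `$STONEY_KEY` and `$MODEL_NAME` are set as above:

<syntaxhighlight lang="bash">
# Same call as above, but print only the assistant's reply text
curl -s https://llm.stoney-cloud.com/v1/chat/completions \
  -H "Authorization: Bearer $STONEY_KEY" \
  -H "Content-Type: application/json" \
  -d "{\"model\":\"${MODEL_NAME}\",\"messages\":[{\"role\":\"user\",\"content\":\"Say hello.\"}],\"max_tokens\":50}" \
  | jq -r '.choices[0].message.content'
</syntaxhighlight>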
== Inspecting your usage ==

<syntaxhighlight lang="bash">
curl -s https://llm.stoney-cloud.com/v1/usage \
  -H "Authorization: Bearer $STONEY_KEY" | jq .
</syntaxhighlight>
We currently issue one key per model, so usage is tracked per model and per key.

[[Category:AI on Demand]]
Latest revision as of 10:56, 3 March 2026