AI on demand: MinerU

Latest revision as of 10:34, 27 March 2026

stepping stone AG is proud to serve customers with current language models and OCR solutions. All of our LLM services are reachable at https://llm.stoney-cloud.com/.

Calling a model

We assume you've received your API key from us in the usual manner. The endpoint above is OpenAI-compatible: for instance, the list of available models can be retrieved via https://llm.stoney-cloud.com/v1/models, which requires you to provide your key. As an initial service, we provide access to some OCR models.
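The models endpoint can be queried with the same bearer-token header as the other routes. The response follows the OpenAI /v1/models list schema; the sample payload and model IDs below are invented purely to illustrate the jq filter:

```shell
# The real query (requires your key) would be:
#   curl -s https://llm.stoney-cloud.com/v1/models -H "Authorization: Bearer $STONEY_KEY"
# An invented sample response in the OpenAI list shape:
MODELS='{"object":"list","data":[{"id":"ocr-model-a","object":"model"},{"id":"ocr-model-b","object":"model"}]}'

# Extract just the model IDs:
echo "$MODELS" | jq -r '.data[].id'
```

With the sample payload this prints the two made-up IDs, one per line.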

STONEY_KEY=<your-key-here>
MODEL_ID=<hosted-model-name-here>

curl -s https://llm.stoney-cloud.com/v1/chat/completions \
  -H "Authorization: Bearer $STONEY_KEY" \
  -H "Content-Type: application/json" \
  -d "{
    \"model\":\"${MODEL_ID}\",
    \"messages\":[
      {\"role\":\"user\",\"content\":\"Describe an imaginary document.\"}
    ],
    \"max_tokens\":2000}" | jq .

Inspecting your usage

curl -s https://llm.stoney-cloud.com/v1/usage \
  -H "Authorization: Bearer $STONEY_KEY" | jq .
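The exact shape of the usage payload is not documented here. Assuming it reports token counts per entry under a field such as total_tokens (an assumption, not a confirmed schema), totals can be summed with jq:

```shell
# Invented usage payload; the real field names may differ:
USAGE='{"data":[{"total_tokens":120},{"total_tokens":80}]}'

# Sum token usage across entries (hypothetical schema):
echo "$USAGE" | jq '[.data[].total_tokens] | add'
```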

We currently issue one key per model, so usage is tracked per key and therefore per model.