<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://wiki.stoney-cloud.com/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Eju-kha</id>
	<title>MediaWiki - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://wiki.stoney-cloud.com/w/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Eju-kha"/>
	<link rel="alternate" type="text/html" href="https://wiki.stoney-cloud.com/wiki/Special:Contributions/Eju-kha"/>
	<updated>2026-04-09T10:34:28Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.39.6</generator>
	<entry>
		<id>https://wiki.stoney-cloud.com/w/index.php?title=AI_on_demand:_MinerU&amp;diff=935</id>
		<title>AI on demand: MinerU</title>
		<link rel="alternate" type="text/html" href="https://wiki.stoney-cloud.com/w/index.php?title=AI_on_demand:_MinerU&amp;diff=935"/>
		<updated>2026-02-17T12:52:13Z</updated>

		<summary type="html">&lt;p&gt;Eju-kha: /* Calling a model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;stepping stone AG is proud to serve customers with up-to-date Language Models and OCR solutions.&lt;br /&gt;
All of our LLM services are reachable at https://llm.stoney-cloud.com/&lt;br /&gt;
&lt;br /&gt;
== Calling a model ==&lt;br /&gt;
We assume you&#039;ve already received your API key from us through the usual channel. The endpoint above exposes an OpenAI-compatible API.&lt;br /&gt;
For instance, retrieving the list of available models is done via https://llm.stoney-cloud.com/v1/models, which requires you to provide your key.&lt;br /&gt;
As an initial service we provide access to some OCR models.&lt;br /&gt;
&lt;br /&gt;
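As a quick sanity check, the model listing mentioned above can be queried like this (a sketch; it assumes your key is stored in STONEY_KEY as in the example below):&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
STONEY_KEY=&amp;lt;your-key-here&amp;gt;&lt;br /&gt;
&lt;br /&gt;
curl -s https://llm.stoney-cloud.com/v1/models \&lt;br /&gt;
  -H &amp;quot;Authorization: Bearer $STONEY_KEY&amp;quot; | jq .&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;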
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
STONEY_KEY=&amp;lt;your-key-here&amp;gt;&lt;br /&gt;
MODEL_NAME=&amp;lt;hosted-model-name-here&amp;gt;&lt;br /&gt;
&lt;br /&gt;
curl -s https://llm.stoney-cloud.com/v1/chat/completions \&lt;br /&gt;
  -H &amp;quot;Authorization: Bearer $STONEY_KEY&amp;quot; \&lt;br /&gt;
  -H &amp;quot;Content-Type: application/json&amp;quot; \&lt;br /&gt;
  -d &amp;quot;{&lt;br /&gt;
    \&amp;quot;model\&amp;quot;:\&amp;quot;${MODEL_NAME}\&amp;quot;,&lt;br /&gt;
    \&amp;quot;messages\&amp;quot;:[&lt;br /&gt;
      {\&amp;quot;role\&amp;quot;:\&amp;quot;user\&amp;quot;,\&amp;quot;content\&amp;quot;:\&amp;quot;Describe an imaginary document.\&amp;quot;}&lt;br /&gt;
    ],&lt;br /&gt;
    \&amp;quot;max_tokens\&amp;quot;:2000}&amp;quot; | jq .&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Inspecting your usage ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
curl -s https://llm.stoney-cloud.com/v1/usage   -H &amp;quot;Authorization: Bearer $STONEY_KEY&amp;quot; | jq .&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now we issue one key per model, so usage is tracked per model and per key.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category: LLM as a Service]]&lt;/div&gt;</summary>
		<author><name>Eju-kha</name></author>
	</entry>
	<entry>
		<id>https://wiki.stoney-cloud.com/w/index.php?title=AI_on_demand:_MinerU&amp;diff=934</id>
		<title>AI on demand: MinerU</title>
		<link rel="alternate" type="text/html" href="https://wiki.stoney-cloud.com/w/index.php?title=AI_on_demand:_MinerU&amp;diff=934"/>
		<updated>2026-02-11T14:32:53Z</updated>

		<summary type="html">&lt;p&gt;Eju-kha: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;stepping stone AG is proud to serve customers with up-to-date Language Models and OCR solutions.&lt;br /&gt;
All of our LLM services are reachable at https://llm.stoney-cloud.com/&lt;br /&gt;
&lt;br /&gt;
== Calling a model ==&lt;br /&gt;
We assume you&#039;ve already received your API key from us through the usual channel. The endpoint above exposes an OpenAI-compatible API.&lt;br /&gt;
For instance, retrieving the list of available models is done via https://llm.stoney-cloud.com/v1/models, which requires you to provide your key.&lt;br /&gt;
As an initial service we provide access to [https://huggingface.co/opendatalab/MinerU2.5-2509-1.2B MinerU2.5-2509-1.2B], referred to as &amp;quot;mineru&amp;quot; below.&lt;br /&gt;
&lt;br /&gt;
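As a quick sanity check, the model listing mentioned above can be queried like this (a sketch; it assumes your key is stored in STONEY_KEY as in the example below):&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
STONEY_KEY=&amp;lt;your-key-here&amp;gt;&lt;br /&gt;
&lt;br /&gt;
curl -s https://llm.stoney-cloud.com/v1/models \&lt;br /&gt;
  -H &amp;quot;Authorization: Bearer $STONEY_KEY&amp;quot; | jq .&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;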
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
STONEY_KEY=&amp;lt;your-key-here&amp;gt;&lt;br /&gt;
MODEL_NAME=&amp;quot;mineru&amp;quot;&lt;br /&gt;
&lt;br /&gt;
curl -s https://llm.stoney-cloud.com/v1/chat/completions \&lt;br /&gt;
  -H &amp;quot;Authorization: Bearer $STONEY_KEY&amp;quot; \&lt;br /&gt;
  -H &amp;quot;Content-Type: application/json&amp;quot; \&lt;br /&gt;
  -d &amp;quot;{&lt;br /&gt;
    \&amp;quot;model\&amp;quot;:\&amp;quot;${MODEL_NAME}\&amp;quot;,&lt;br /&gt;
    \&amp;quot;messages\&amp;quot;:[&lt;br /&gt;
      {\&amp;quot;role\&amp;quot;:\&amp;quot;user\&amp;quot;,\&amp;quot;content\&amp;quot;:\&amp;quot;Describe an imaginary document.\&amp;quot;}&lt;br /&gt;
    ],&lt;br /&gt;
    \&amp;quot;max_tokens\&amp;quot;:2000}&amp;quot; | jq .&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Inspecting your usage ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;syntaxhighlight lang=&amp;quot;bash&amp;quot;&amp;gt;&lt;br /&gt;
curl -s https://llm.stoney-cloud.com/v1/usage   -H &amp;quot;Authorization: Bearer $STONEY_KEY&amp;quot; | jq .&lt;br /&gt;
&amp;lt;/syntaxhighlight&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For now we issue one key per model, so usage is tracked per model and per key.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
[[Category: LLM as a Service]]&lt;/div&gt;</summary>
		<author><name>Eju-kha</name></author>
	</entry>
</feed>