Main Page
Latest revision as of 11:11, 8 May 2026
Introduction
Welcome to the wiki of our OpenStack-based stoney cloud infrastructure. This collection is a comprehensive resource for all aspects of our cloud environment; here you will find guides for managing and using the stoney cloud.
Getting started
Overview
- AI on demand: Usage (list models, show usage, ...)
- AI on demand: BAAI/bge-m3
- AI on demand: BAAI/bge-reranker-v2-m3
- AI on demand: MiniMaxAI/MiniMax-M2.5
- AI on demand: NVIDIA/NVIDIA-Nemotron-3-Super-120B-A12B
- AI on demand: Qwen/Qwen3-Coder-Next
- AI on demand: Qwen/Qwen3.5-35B-A3B-FP8
- AI on demand: allenai/olmOCR-2-7B
- AI on demand: mistralai/Voxtral-Mini-3B-2507
- AI on demand: openai/whisper-large-v3
- AI on demand: opendatalab/MinerU2.5-2509-1.2B
- AI on demand: swiss-ai/Apertus-70B-Instruct-2509 (discontinued on 7 May 2026)
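The "AI on demand: Usage" page above covers listing models and checking usage. As a minimal sketch, assuming the service exposes an OpenAI-compatible `/v1/models` endpoint (an assumption; the real base URL and token come from the usage page):

```python
import json
from urllib import request

# Placeholder values -- replace with the base URL and token documented
# on the "AI on demand: Usage" page. The OpenAI-compatible API shape
# is an assumption, not something this wiki page states.
API_BASE = "https://ai.example.com/v1"
API_TOKEN = "YOUR_TOKEN"


def model_ids(payload: dict) -> list[str]:
    """Extract model ids from an OpenAI-style /v1/models response."""
    return [entry["id"] for entry in payload.get("data", [])]


def list_models() -> list[str]:
    """Fetch the ids of all models currently deployed on demand."""
    req = request.Request(
        f"{API_BASE}/models",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
    )
    with request.urlopen(req) as resp:
        return model_ids(json.load(resp))
```

Calling `list_models()` should then return ids matching the pages above, e.g. `BAAI/bge-m3` or `openai/whisper-large-v3`.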
Quick links