Main Page
Revision as of 17:56, 7 May 2026
Introduction
Welcome to the wiki of our OpenStack-based stoney cloud infrastructure. This collection serves as a comprehensive resource for all aspects of our cloud environment. Here you will find guides for managing and using our stoney cloud.
Getting started
Overview
- AI on demand: Usage (list models, show usage, ...)
- AI on demand: BAAI/bge-m3
- AI on demand: BAAI/bge-reranker-v2-m3
- AI on demand: MiniMaxAI/MiniMax-M2.5
- AI on demand: NVIDIA/NVIDIA-Nemotron-3-Super-120B-A12B
- AI on demand: Qwen/Qwen3-Coder-Next
- AI on demand: Qwen/Qwen3.5-35B-A3B-FP8
- AI on demand: allenai/olmOCR-2-7B
- AI on demand: mistralai/Voxtral-Mini-3B-2507
- AI on demand: openai/whisper-large-v3
- AI on demand: opendatalab/MinerU2.5-2509-1.2B
- AI on demand: swiss-ai/Apertus-70B-Instruct-2509 (discontinued)
Quick links