Production-grade security, compliance controls, and governance frameworks designed for enterprise deployment from day one.
Deploy on your terms — cloud-hosted for convenience or self-hosted within your enterprise perimeter for maximum control and data sovereignty.
Extend capabilities through Actions, MCP integrations, and custom connectors that plug seamlessly into your existing tools and workflows (see the MCP sketch below).
Connect leading LLMs to your organization's knowledge and tools. Deliver permission-aware enterprise search, chat with files/URLs, deep research, configurable agents, and automation via actions.
Model selector, projects, file/URL chats, and multi-source synthesis for comprehensive research.
Instruction-driven agents grounded with knowledge and powered by custom actions.
Unified, permission-aware search across all connected sources with advanced filters.
Connect wikis, cloud storage, ticketing, messaging, code repos, and custom systems.
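Several of the capabilities above hinge on the Actions and MCP extension points. AI Desk handles the MCP client side internally, so the following is only a rough sketch of what a Model Context Protocol exchange over SSE involves, using the public @modelcontextprotocol/sdk TypeScript package; the server URL and the tool name are illustrative placeholders, not AI Desk's internal code. In the product itself this would typically amount to registering the MCP SSE server URL rather than writing client code.

```typescript
// Minimal MCP-over-SSE sketch using the public @modelcontextprotocol/sdk package.
// The server URL and the "search_tickets" tool are illustrative placeholders.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

async function probeMcpServer(): Promise<void> {
  const client = new Client(
    { name: "mcp-probe", version: "0.1.0" },
    { capabilities: {} }
  );

  // SSE transport: the client connects to a remote MCP server over Server-Sent Events.
  const transport = new SSEClientTransport(new URL("https://mcp.example.com/sse"));
  await client.connect(transport);

  // Discover the tools the server exposes...
  const { tools } = await client.listTools();
  console.log("available tools:", tools.map((t) => t.name));

  // ...and invoke one with structured arguments.
  const result = await client.callTool({
    name: "search_tickets",
    arguments: { query: "VPN outage" },
  });
  console.log(result.content);

  await client.close();
}

probeMcpServer().catch(console.error);
```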
AI Desk provides a comprehensive suite of features for enterprise AI deployment, from core chat functionality to advanced integrations and governance controls.
| Category | Feature | Included | Notes |
|---|---|---|---|
| Chat | Rich chat UI, model selector | ✓ | Projects, files/URLs, deep research |
| Agents | Instruction-driven agents | ✓ | Ground with knowledge; attach actions |
| Enterprise Search | Permission-aware unified search | ✓ | Filters, source ACLs respected |
| Web Search | Pluggable providers + scraper | ✓ | Exa AI supported |
| Actions | Built-in + custom actions | ✓ | OAuth actions for delegated access |
| MCP | Model Context Protocol (SSE) | ✓ | SSE servers supported |
| Code Interpreter | In-app code execution | ✓ | Alpha per documentation |
| Image Generation | Text-to-image | ✓ | Configurable providers |
| Connectors | Wikis, storage, tickets, etc. | ✓ | Broad ecosystem |
| Permission Sync | Document-level access | ✓ | Respects source permissions |
| Slack Federated | Scale Slack ingestion | ✓ | Auth + indexing controls |
| LLM Controls | Access policies, visibility | ✓ | Per provider/model controls |
| Analytics | Usage, history, exports | ✓ | Custom analytics supported |
| Identity | Users & groups | ✓ | SSO options described in docs |
| Observability | Logging & telemetry | ✓ | Multilingual settings available |
| Architecture | Layered design | ✓ | App, data, infrastructure layers |
| Data Flows | Indexing & query design | ✓ | External services configurable |
| Data Handling | 3rd-party LLM policies | ✓ | Storage & training data posture |
| Security FAQ | Encryption & incident response | ✓ | Compliance-focused answers |
| Deployment | Cloud or self-hosted | ✓ | Docker, Kubernetes |
| Scaling | KEDA autoscaling | ✓ | Affinity & tolerations |
| Environments | Air-gapped support | ✓ | Offline-friendly posture |
| Config | Env vars & settings | ✓ | Domain, SSL, logging/observability |
| APIs | REST + playground | ✓ | Chat (streaming), user mgmt, search |
| Streaming | LD-JSON packets | ✓ | Message IDs + final search docs (parsing sketch below) |
| Extensibility | Actions & MCP | ✓ | Secure integrations |
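The APIs and Streaming rows above describe a REST chat endpoint that streams LD-JSON (line-delimited JSON) packets carrying message IDs and, at the end, the search documents used. As a rough sketch of how a client could consume such a stream, assuming a placeholder endpoint path, request body, and packet field names (none of which are AI Desk's documented schema):

```typescript
// Hedged sketch (Node 18+ or browser fetch): endpoint path, request body, and
// packet field names below are illustrative placeholders, not AI Desk's schema.
async function streamChat(baseUrl: string, token: string, prompt: string): Promise<void> {
  const res = await fetch(`${baseUrl}/api/chat`, { // placeholder path
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify({ message: prompt }), // placeholder body shape
  });
  if (!res.ok || !res.body) throw new Error(`HTTP ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // LD-JSON: one JSON object per line; keep any trailing partial line in the buffer.
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const packet = JSON.parse(line) as Record<string, unknown>;
      if (packet.messageId) console.log("message id:", packet.messageId); // placeholder field
      if (typeof packet.text === "string") process.stdout.write(packet.text); // placeholder field
      if (packet.searchDocuments) console.log("\nsources:", packet.searchDocuments); // placeholder field
    }
  }
}
```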
Get started with AI Desk today. Our team provides deployment support, custom integrations, and ongoing assistance to help you scale confidently.
Configure LLM providers (OpenRouter, Ollama), horizontal scaling with KEDA, custom domains with SSL/TLS, and air-gapped deployment options.
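Both providers named above expose OpenAI-compatible endpoints, which is typically what an LLM provider configuration points at: a base URL, a key, and a model id. Purely as an illustration (not AI Desk's own configuration format), the sketch below reaches OpenRouter and a local Ollama instance through the openai TypeScript client; the model names are examples.

```typescript
// Illustration only: OpenRouter and Ollama both speak the OpenAI-compatible API,
// so a provider configuration ultimately resolves to base URL + key + model id.
// This is not AI Desk's configuration format; model names are examples.
import OpenAI from "openai";

// OpenRouter: hosted gateway, key issued at https://openrouter.ai
const openrouter = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY,
});

// Ollama: local server on its default port; the key is unused but must be non-empty.
const ollama = new OpenAI({
  baseURL: "http://localhost:11434/v1",
  apiKey: "ollama",
});

async function smokeTest(): Promise<void> {
  const hosted = await openrouter.chat.completions.create({
    model: "openai/gpt-4o-mini", // example OpenRouter model id
    messages: [{ role: "user", content: "Reply with OK." }],
  });
  console.log("openrouter:", hosted.choices[0].message.content);

  const local = await ollama.chat.completions.create({
    model: "llama3.1", // example local Ollama model
    messages: [{ role: "user", content: "Reply with OK." }],
  });
  console.log("ollama:", local.choices[0].message.content);
}

smokeTest().catch(console.error);
```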