How it works

Architecture

Cysic OpenClaw combines hosted runtime infrastructure with native Cysic inference support:

  • Inference layer: OpenClaw uses Cysic Inference as its native inference service.
  • Hosting layer: Each user's OpenClaw instance runs on Cysic-managed servers.
  • User access layer: Users interact with their instance through supported channels such as Telegram, Discord, and Feishu.

This gives users a managed OpenClaw experience without requiring local deployment.
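The three layers above can be sketched as a minimal routing model. All class and function names here are illustrative stand-ins, not the actual OpenClaw API:

```python
from dataclasses import dataclass, field

# Hypothetical stand-in for the Cysic Inference service (inference layer).
class InferenceClient:
    def complete(self, prompt: str) -> str:
        return f"[inference reply to: {prompt}]"

# Hosting layer: one Cysic-managed instance per user (illustrative model).
@dataclass
class HostedInstance:
    user_id: str
    inference: InferenceClient = field(default_factory=InferenceClient)

    def handle(self, message: str) -> str:
        # The instance forwards the user's message to the inference layer.
        return self.inference.complete(message)

# User access layer: a channel (Telegram / Discord / Feishu) delivers the
# message to the user's hosted instance and relays the reply back.
def channel_dispatch(instance: HostedInstance, message: str) -> str:
    return instance.handle(message)

reply = channel_dispatch(HostedInstance(user_id="alice"), "hello")
```

The point of the sketch is the direction of the dependencies: the user only ever talks to a channel, the channel only talks to the hosted instance, and only the instance talks to inference.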

Security model

Security is a key part of the design:

  • Cysic-managed API access: external API calls are mediated by Cysic, so the OpenClaw instance itself never has to expose raw external access.
  • Instance isolation: Each instance runs in an isolated environment, similar to a long-lived sandbox.
  • Reduced local risk: Users do not need to grant high local machine permissions just to keep OpenClaw running.
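A rough model of the isolation properties above, assuming each instance is provisioned with its own sandbox directory and a Cysic-scoped token rather than a raw provider key (all names and the token format are hypothetical):

```python
import tempfile
from dataclasses import dataclass

# Hypothetical sketch of the isolation model: per-instance state lives in
# its own sandbox, and credentials are Cysic-managed scoped tokens.
@dataclass
class IsolatedInstance:
    user_id: str
    sandbox_dir: str
    scoped_token: str  # short-lived, Cysic-managed; never a raw provider key

def provision(user_id: str) -> IsolatedInstance:
    # Each instance gets its own directory, so one instance cannot read
    # another's state (a stand-in for real sandboxing on Cysic servers).
    sandbox = tempfile.mkdtemp(prefix=f"openclaw-{user_id}-")
    token = f"scoped-{user_id}"  # illustrative placeholder
    return IsolatedInstance(user_id, sandbox, token)

alice = provision("alice")
bob = provision("bob")
```

Nothing here runs on the user's machine, which is the "reduced local risk" property: the user's device holds neither the sandbox nor the credentials.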

Service flow

```mermaid
flowchart LR
  User[User] --> Channel[Telegram / Discord / Feishu]
  Channel --> Instance[Hosted OpenClaw Instance]
  Instance --> Inference[Cysic Inference]
  Instance --> Skills[Installed Skills]
  Instance --> Billing[Cysic Billing]
```
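The flow in the diagram can be traced end to end as a sketch. The stage names and the shape of the result are assumptions for illustration, not real OpenClaw output:

```python
# Hypothetical trace of the service flow: user -> channel -> hosted
# instance, which fans out to skills, inference, and billing.
def run_flow(user: str, channel: str, message: str) -> dict:
    trace = [f"user:{user}", f"channel:{channel}", "instance:hosted-openclaw"]
    # The instance may consult installed skills while handling the message.
    trace.append("skills:installed")
    # The reply itself comes from the inference layer.
    reply = f"[reply to: {message}]"
    trace.append("inference:cysic")
    # Usage is metered against Cysic Billing (names are illustrative).
    trace.append("billing:metered")
    return {"reply": reply, "trace": trace}

result = run_flow("alice", "telegram", "hello")
```

Note that Skills, Inference, and Billing all hang off the instance, matching the three outgoing edges in the flowchart.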