Are different models, such as Anthropic’s Claude, OpenAI’s GPT models, and Google’s Gemini, supported in io.Assist? Can users switch between them according to their preference?
Absolutely. io.Assist is designed to support multiple LLM providers and models; it is LLM-agnostic and not tied to any single provider. Control over which model is used is typically governed by platform admins rather than individual user preference, and the same applies even more strictly to generation parameters such as temperature, top_p, and token limits. That said, a model-selection setting is available.
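As a rough illustration of the kind of admin-governed settings described above, a provider/model configuration might look something like the sketch below. This is purely hypothetical — the key names, values, and structure are assumptions for illustration, not io.Assist’s actual configuration format:

```yaml
# Hypothetical admin-level model configuration (illustrative only,
# not io.Assist's real config schema).
llm:
  provider: anthropic        # e.g. anthropic | openai | google
  model: claude-sonnet       # model name as exposed by the provider
  # Generation parameters locked down by admins rather than end users:
  temperature: 0.2
  top_p: 0.9
  max_tokens: 2048
  allow_user_model_selection: false
```

The point of such a layout is that swapping providers is a configuration change rather than a code change, which is what being LLM-agnostic implies in practice.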