Smarter Onboarding, New Embedding Capabilities, Org-Wide Governance & 400+ New Models

This November, FloTorch.AI delivered two major releases focused on improving user experience, strengthening governance, and expanding model coverage for developers and enterprise teams. Here’s everything new across the platform this month.
🚀 Instant Setup & Seamless Model Access
Default FloTorch Models Added (AWS Bedrock Powered)
To speed up onboarding, FloTorch now includes five pre-configured global models available to all users without any setup:
- flotorch/nova-micro
- flotorch/sonnet-4-5
- flotorch/nova-pro
- flotorch/haiku-4-5
- flotorch/nova-lite
This ensures every new user can immediately start building and evaluating workflows.
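To show what getting started can look like, here is a minimal sketch of calling one of these default models. It assumes the FloTorch gateway exposes an OpenAI-compatible chat completions endpoint; the base URL and API key environment variables are placeholders for your own values, not official names.

```python
# Minimal sketch: calling a default FloTorch model through an
# OpenAI-compatible gateway endpoint (the endpoint URL and env var
# names are placeholders, not official FloTorch values).
import os
from openai import OpenAI

client = OpenAI(
    base_url=os.environ["FLOTORCH_GATEWAY_URL"],  # your gateway's /v1 endpoint
    api_key=os.environ["FLOTORCH_API_KEY"],
)

response = client.chat.completions.create(
    model="flotorch/nova-micro",  # one of the pre-configured default models
    messages=[{"role": "user", "content": "Summarize RAG in one sentence."}],
)
print(response.choices[0].message.content)
```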
📊 Clearer Usage Insights with Token-Based Tracking
FloTorch has transitioned from a dollar-based credit display to a more intuitive token-based usage system.
Key improvements include:
- A percentage-based usage view for clarity across model combinations
- Cleaned-up transaction logs showing only model name + token counts
- A helpful daily allowance note:
“You get token credits worth approximately 65.48 million Nova Micro model tokens daily.” (See the Terms & Conditions for details.)
This gives users a predictable and transparent view of platform consumption.
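To make the percentage view concrete, the small sketch below derives usage from raw token counts against the daily allowance quoted above. The helper function and the 1.2M-token example are illustrative only, not FloTorch code.

```python
# Illustrative only: deriving a percentage-based usage view from raw
# token counts against the daily allowance.
DAILY_ALLOWANCE_TOKENS = 65_480_000  # ~65.48 million Nova Micro-equivalent tokens

def usage_percent(tokens_used: int, allowance: int = DAILY_ALLOWANCE_TOKENS) -> float:
    """Return consumption as a percentage of the daily token allowance."""
    return 100.0 * tokens_used / allowance

# e.g. a day with 1.2M tokens consumed across all models
print(f"{usage_percent(1_200_000):.2f}% of today's allowance used")  # ~1.83%
```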
🧠 New Embedding Capabilities
Full Embedding Model Management in Console
Users can now create, configure, and manage embedding models directly inside the FloTorch Console.
This update improves:
- Vector generation workflows
- RAG system configuration
- Integration with downstream applications
The experience is now unified, simplified, and ready for production-grade use cases.
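Once an embedding model is configured, vector generation for a RAG pipeline might look like the sketch below. It assumes an OpenAI-compatible embeddings endpoint; the model name and environment variables are placeholders, not official FloTorch identifiers.

```python
# Sketch: generating embeddings for a RAG pipeline through an
# OpenAI-compatible embeddings endpoint (model name and env vars are
# placeholders for whatever you configure in the console).
import os
from openai import OpenAI

client = OpenAI(
    base_url=os.environ["FLOTORCH_GATEWAY_URL"],
    api_key=os.environ["FLOTORCH_API_KEY"],
)

docs = ["FloTorch supports embedding models.", "Vectors power RAG retrieval."]
result = client.embeddings.create(model="my-embedding-model", input=docs)
vectors = [item.embedding for item in result.data]
print(len(vectors), "vectors of dimension", len(vectors[0]))
```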
🧩 Built-in Local Tools for Testing & Debugging
FloTorch introduced new built-in tools that support experimentation in local or on-prem environments, ideal for testing prompt chains, model routing, and agent logic.
New tools include:
- CoinGecko MCP — Market data access without API keys
- GitHub MCP — Repository and issue interactions
- Supabase MCP — Simple database operations
These tools help teams iterate quickly without relying on external setup.
Beyond these built-in options, you can also register your own MCP tools for agents to use, as sketched below.
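For illustration, a custom MCP tool registration typically boils down to a server definition like the one below. This follows the common MCP command/args/env convention and uses a hypothetical server package; it is not FloTorch's exact registration schema.

```python
# Sketch: the common MCP server-definition shape (command + args + env),
# expressed as a Python dict. This reflects the general MCP convention,
# not FloTorch's exact schema; the package name is hypothetical.
custom_mcp_tool = {
    "name": "my-weather-mcp",                  # hypothetical custom tool
    "command": "npx",
    "args": ["-y", "my-weather-mcp-server"],   # hypothetical server package
    "env": {"WEATHER_API_KEY": "<your-key>"},
}
```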
🏛️ Organization-Wide Governance & Policy Controls
Admins can now enforce workspace-wide and org-wide provider configurations.
This brings:
- Centralized governance
- Uniform provider setups
- Better compliance for regulated environments
- Reduced configuration overhead across teams
- Provider configurations that cover LLMs, Vector DBs, Guardrails, memory, and GPU resources
Enterprise teams can now scale safely with consistent controls in place.
🌐 400+ New Models via OpenRouter
FloTorch added OpenRouter as a new provider, unlocking access to more than 400 AI models that can now be used directly in FloTorch workflows.
This greatly expands model diversity for testing, experimentation, and production deployments.
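As a quick way to explore the expanded catalog, the sketch below lists the models visible to your workspace. It assumes an OpenAI-compatible model-listing endpoint on the gateway; the URL and key variables are placeholders.

```python
# Sketch: browsing the expanded model catalog once OpenRouter is
# configured as a provider (assumes an OpenAI-compatible /models
# listing; endpoint and env vars are placeholders).
import os
from openai import OpenAI

client = OpenAI(
    base_url=os.environ["FLOTORCH_GATEWAY_URL"],
    api_key=os.environ["FLOTORCH_API_KEY"],
)

models = client.models.list()
print(sum(1 for _ in models), "models available")
```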
✨ Additional Enhancements & Fixes
Multi-Choice Response Support
Models can now return structured multi-choice outputs—useful for evaluation, forms, decision scoring, and workflow logic.
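To illustrate, a structured multi-choice output for evaluation or decision scoring might take a shape like the one below; this is an illustrative sketch, not FloTorch's exact response schema.

```python
# Hypothetical shape of a structured multi-choice response, useful for
# evaluation and decision-scoring logic (illustrative, not FloTorch's schema).
from dataclasses import dataclass

@dataclass
class Choice:
    label: str      # e.g. "A", "B", "C"
    text: str       # the option the model considered
    score: float    # model-assigned confidence or rubric score

@dataclass
class MultiChoiceResponse:
    question: str
    choices: list[Choice]
    selected: str   # label of the chosen option

resp = MultiChoiceResponse(
    question="Which answer best cites the retrieved context?",
    choices=[Choice("A", "Answer 1", 0.72), Choice("B", "Answer 2", 0.21)],
    selected="A",
)
best = max(resp.choices, key=lambda c: c.score)
print(f"Selected {resp.selected}; top-scoring option is {best.label}")
```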
Fix: Embedding Models Failing with “Provider Not Initialized”
A bug affecting OpenAI-compatible embedding models has been resolved, ensuring smooth execution across all embedding-dependent tasks.
Fix: Missing Experiment Data for N-Shot Prompts
Experiment dashboard visibility issues involving N-shot prompts and KNN retrieval have been fixed.
📚 Documentation Updates
All of the latest guides remain available in the FloTorch documentation.
💬 Support & Community
Join the FloTorch community, or reach out for help at support@flotorch.ai.
Looking Ahead
December will bring more improvements across Blueprints, model routing, evaluation tooling, and enterprise orchestration. Stay tuned!


