Discourse AI Configuration Tutorial: Enable Free AI Features Using Silicon-Based Flow API

This article walks you through enabling and configuring Discourse's AI search, AI translation, and AI related-topics features, all part of the Discourse AI plugin. It is based on the official documentation and on hands-on experience running a Chinese-language site.

Demo Site: https://bbs.eeclub.top/

This is a Discourse forum I built that uses the Discourse AI plugin. It supports multilingual content, automatically translates posts into multiple languages after publication, and employs AI moderation for new user posts to reduce spam content.

I will publish a Discourse forum setup tutorial in the near future.

Webmaster Discussion Group: 767557452


Prerequisites

The AI API used in this article comes from SiliconFlow. Register through my link to receive 20 million free tokens (approximately $14).

After registering a SiliconFlow account, create an API Key: Go to the left-side console menu “API Keys” → “Create New API Key” → Copy the sk-xxxxxxxxxx key.

Record these two general endpoints; you will need both later:

  • LLM Chat: https://api.siliconflow.cn/v1/chat/completions
  • Embedding: https://api.siliconflow.cn/v1/embeddings

SiliconFlow is compatible with the OpenAI API format, so you can reuse the Discourse AI plugin's built-in OpenAI provider configuration directly.
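Because the endpoint follows the OpenAI chat-completions contract, you can sanity-check your key and model ID before touching Discourse. The sketch below builds the request only; the key and the "Hello!" prompt are placeholders, and the commented lines perform the actual network call:

```python
import json

# Placeholders: substitute your own SiliconFlow API key and model ID.
API_KEY = "sk-xxxxxxxxxx"
CHAT_URL = "https://api.siliconflow.cn/v1/chat/completions"

def build_chat_request(model, prompt):
    """Build an OpenAI-format chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

chat_payload = build_chat_request("Pro/deepseek-ai/DeepSeek-V3.2-Exp", "Hello!")
chat_body = json.dumps(chat_payload)
chat_headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

# To actually send the request (requires a valid key and network access):
# import urllib.request
# req = urllib.request.Request(CHAT_URL, data=chat_body.encode(), headers=chat_headers)
# print(urllib.request.urlopen(req).read().decode())
```

If the raw request succeeds here but fails inside Discourse, the problem is in the plugin configuration rather than your account.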


For access to additional models like ChatGPT, Claude, Gemini, Doubao, ERNIE Bot, 360, Grok, etc., consider the DMXAPI platform.


Discourse AI Plugin Overview

Discourse AI is a community-management-focused AI assistant: it saves operational time, helps keep the community safe and orderly, and improves user engagement and management insight.

1. Moderation & Management

  • Automated Moderation: Precisely detects toxic content, flags NSFW posts, and filters spam with roughly 99% accuracy; one-click activation with flexible fine-tuning.
  • Custom AI Assistant: Supports custom system prompts and parameters, can search forums, access web pages, and retrieve uploaded documents, providing services through chat/private messages.
  • Practical Toolkit: Built-in proofreading, translation, and content optimization functions, capable of generating summaries, titles, and smart dates.

2. Engagement & Discovery

  • Semantic Search: Breaks keyword limitations, accurately matches contextual content, improves search efficiency.
  • Related Topics Recommendation: Based on deep semantic similarity analysis, pushes related discussions at the end of topics to promote continuous interaction.
  • Quick Summarization: Condenses core information from long conversations, helping users quickly catch up and reduce information lag.

3. Insights & Analysis

  • Community Sentiment Monitoring: Performs sentiment and emotion scoring on discussion content, capturing user attitude trends.
  • Automated Reporting: Generates data reports on forum activity, hot discussions, and user behavior to assist management decisions.
  • AI Usage Monitoring: Tracks token consumption and request volumes across different models and functions, clearly showing cost and usage patterns.

4. Data Security

  • Data Ownership: AI data is stored alongside your community content, and you permanently own your data.
  • Privacy Protection: Uses open-weight LLMs without training models on user data, ensuring content security and control.
  • Multi-Provider Support: Choose from OpenAI, Anthropic, Microsoft Azure, and 10+ AI service providers, compatible with custom models.

Configuring LLM (Large Language Model)

What is an LLM: LLM (Large Language Model) serves as the “brain” for AI functions, responsible for understanding natural language and generating responses (e.g., translation results, search summaries). SiliconFlow offers multiple LLM models compatible with OpenAI interfaces.

In the Discourse Admin Panel, you must first enable the Discourse AI plugin before LLM model settings become visible.

Navigate to the AI plugin’s Settings page, click LLM, scroll down to Unconfigured LLM Templates, and click Settings on the Custom - Manual Configuration template.

  • Provider: Select OpenAI
  • Service URL: Enter the LLM API endpoint https://api.siliconflow.cn/v1/chat/completions (check SiliconFlow’s developer documentation for updates)
  • API Key: Use the key copied earlier
  • Model Name: Set a custom name
  • Model ID: Copy from SiliconFlow’s Model Square (note some models cannot use free credits). I selected Pro/deepseek-ai/DeepSeek-V3.2-Exp

Set the Tokenizer to OpenAiTokenizer. Enter the Context Window as a plain number: my model’s 160K window becomes 160000. Click Submit, then Run Test to verify.

You can repeat these steps to add multiple models or AI providers.

Function settings allow assigning different models to specific features. Simple functions can use free models.


Configuring Embedding Model

What is an Embedding Model: Embedding models convert text into computer-understandable “semantic vectors,” crucial for AI search and related topic recommendations (e.g., recognizing “Discourse email configuration” and “How to set Discourse email notifications” as semantically identical).
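To make the “semantic vector” idea concrete, here is a toy sketch of the Negative Inner Product distance function you will select below. The 3-dimensional vectors are made up for illustration; a real model such as BAAI/bge-m3 produces 1024-dimensional vectors:

```python
def dot(a, b):
    """Inner product of two equal-length vectors."""
    return sum(x * y for x, y in zip(a, b))

def negative_inner_product(a, b):
    """Distance used for embeddings: smaller means more similar."""
    return -dot(a, b)

# Made-up 3-dimensional embeddings for three posts:
email_config = [0.9, 0.1, 0.2]  # "Discourse email configuration"
email_howto  = [0.8, 0.2, 0.3]  # "How to set Discourse email notifications"
unrelated    = [0.1, 0.9, 0.1]  # a semantically unrelated post

d_related = negative_inner_product(email_config, email_howto)
d_unrelated = negative_inner_product(email_config, unrelated)

# The two email topics end up "closer" (smaller distance) than the unrelated one,
# which is how AI search and related-topics recommendations rank candidates.
assert d_related < d_unrelated
```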

Click Embeddings → New Embedding:

  • Provider: Select OpenAI
  • Embedding Service URL: https://api.siliconflow.cn/v1/embeddings (check documentation for updates)
  • Embedding API Key: Use the key copied earlier
  • Model Name: Set a custom name
  • Tokenizer: Select BgeM3Tokenizer
  • Model ID: BAAI/bge-m3 (free on SiliconFlow)
  • Distance Function: Select Negative Inner Product
  • Sequence Length: 8000

Click Save, then Run Test to verify.
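If Run Test fails, you can check the embedding endpoint outside Discourse. This is a minimal sketch of an OpenAI-format embeddings request, assuming the BAAI/bge-m3 model ID above and a placeholder key; the commented lines perform the actual send:

```python
import json

# Placeholder: substitute your own SiliconFlow API key.
API_KEY = "sk-xxxxxxxxxx"
EMBED_URL = "https://api.siliconflow.cn/v1/embeddings"

def build_embedding_request(texts):
    """Build an OpenAI-format embedding payload for BAAI/bge-m3."""
    return {"model": "BAAI/bge-m3", "input": texts}

embed_payload = build_embedding_request(["Discourse email configuration"])
embed_body = json.dumps(embed_payload)
embed_headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

# To actually send the request (requires a valid key and network access):
# import urllib.request
# req = urllib.request.Request(EMBED_URL, data=embed_body.encode(), headers=embed_headers)
# print(urllib.request.urlopen(req).read().decode())
```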


AI Feature Settings

In the AI plugin settings, select a default model under AI default LLM model.

Scroll down to AI helper enabled to activate the AI assistant. Configure user groups with access permissions. This assistant aids in post creation (translation, proofreading, Markdown tables, titles, etc.).

Enable embeddings via AI embeddings enabled and select BAAI/bge-m3 under AI embeddings selected model.

Enable summarization via AI summarization enabled to generate topic summaries.

Other AI settings are available for exploration. The AI translation feature automatically translates content to users’ preferred languages.


Recommended Reading

English Version: https://blog.zeruns.top/archives/78.html