Results 11 - 20 of 94 for LLM (0.04 seconds)
The search processing time has exceeded the limit. The displayed results may be partial.
scripts/general-llm-prompt.md
Motov Yurii <******@****.***> 1773831336 +0100
Created: Sun Apr 05 07:19:11 GMT 2026 - Last Modified: Wed Mar 18 10:55:36 GMT 2026 - 14.6K bytes - Click Count (0)
src/main/java/org/codelibs/fess/llm/AbstractLlmClient.java
/** * Gets the configured LLM type. * * @return the LLM type from configuration */ protected abstract String getLlmType(); /** * Gets the configuration prefix for this provider. * Used to look up per-prompt-type parameters from FessConfig. * * @return the config prefix (e.g. "rag.llm.openai") */
Created: Tue Mar 31 13:07:34 GMT 2026 - Last Modified: Sat Mar 21 06:04:58 GMT 2026 - 72K bytes - Click Count (0)
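The snippet above shows the two abstract accessors and mentions that the config prefix (e.g. `"rag.llm.openai"`) is used to look up per-prompt-type parameters. A minimal sketch of how such a base class could resolve those parameters from a flat configuration map follows; the key layout (`prefix.promptType.name`) and all class/method names besides `getLlmType` and the prefix example are assumptions, not Fess's actual implementation:

```java
import java.util.Map;

// Sketch only: a base class resolving per-prompt-type parameters by prefix.
abstract class AbstractLlmClientSketch {
    /** The configured LLM type, e.g. "openai". */
    protected abstract String getLlmType();

    /** The configuration prefix for this provider, e.g. "rag.llm.openai". */
    protected abstract String getConfigPrefix();

    /** Builds a lookup key such as "rag.llm.openai.summary.temperature" (assumed layout). */
    public String paramKey(String promptType, String name) {
        return getConfigPrefix() + "." + promptType + "." + name;
    }

    /** Resolves a per-prompt-type parameter from a flat configuration map. */
    public String resolveParam(Map<String, String> config, String promptType, String name) {
        return config.get(paramKey(promptType, name));
    }
}

/** Concrete sketch for an OpenAI-style provider. */
class OpenAiClientSketch extends AbstractLlmClientSketch {
    @Override protected String getLlmType() { return "openai"; }
    @Override protected String getConfigPrefix() { return "rag.llm.openai"; }
}

class ConfigPrefixDemo {
    public static void main(String[] args) {
        OpenAiClientSketch client = new OpenAiClientSketch();
        Map<String, String> config = Map.of("rag.llm.openai.summary.temperature", "0.2");
        System.out.println(client.resolveParam(config, "summary", "temperature")); // prints 0.2
    }
}
```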
src/main/java/org/codelibs/fess/llm/LlmClientManager.java
* It manages registered LLM clients and provides access to the configured * LLM provider based on the current configuration. */ public class LlmClientManager { private static final Logger logger = LogManager.getLogger(LlmClientManager.class); /** The list of registered LLM clients. */ protected final List<LlmClient> clientList = new CopyOnWriteArrayList<>(); /**
Created: Tue Mar 31 13:07:34 GMT 2026 - Last Modified: Thu Mar 19 11:10:51 GMT 2026 - 17.4K bytes - Click Count (0)
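The snippet above shows the pattern: clients are held in a `CopyOnWriteArrayList` and the manager hands back the one matching the configured provider. A hedged sketch, assuming a minimal client interface with a type id (the real `LlmClientManager` resolves the provider from `FessConfig` and its method names may differ):

```java
import java.util.List;
import java.util.Optional;
import java.util.concurrent.CopyOnWriteArrayList;

// Assumed minimal shape; stands in for Fess's LlmClient here.
interface LlmClientStub {
    String getType(); // e.g. "ollama", "openai", "gemini"
}

class LlmClientManagerSketch {
    // CopyOnWriteArrayList lets threads register clients safely while others iterate.
    private final List<LlmClientStub> clientList = new CopyOnWriteArrayList<>();

    public void register(LlmClientStub client) {
        clientList.add(client);
    }

    /** Returns the client matching the configured provider type, if any. */
    public Optional<LlmClientStub> getClient(String configuredType) {
        return clientList.stream()
                .filter(c -> c.getType().equals(configuredType))
                .findFirst();
    }
}

class ManagerDemo {
    public static void main(String[] args) {
        LlmClientManagerSketch manager = new LlmClientManagerSketch();
        manager.register(() -> "ollama");
        manager.register(() -> "openai");
        System.out.println(manager.getClient("openai")
                .map(LlmClientStub::getType).orElse("none")); // prints openai
    }
}
```

`CopyOnWriteArrayList` fits a registry that is read on every request but mutated only at startup, which is the usual trade-off for this container.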
src/main/java/org/codelibs/fess/llm/LlmClient.java
* governing permissions and limitations under the License. */ package org.codelibs.fess.llm; import java.util.List; import java.util.Map; /** * Interface for LLM (Large Language Model) clients. * Implementations provide integration with different LLM providers * such as Ollama, OpenAI, and Google Gemini. * * In addition to low-level chat operations, this interface defines
Created: Tue Mar 31 13:07:34 GMT 2026 - Last Modified: Thu Mar 19 07:04:54 GMT 2026 - 7.3K bytes - Click Count (0)
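The `LlmClient` snippet names the intent (one interface, multiple providers such as Ollama, OpenAI, and Gemini, with low-level chat operations) and imports `List` and `Map`. A sketch of what an interface along those lines might look like; the method names and message shape below are assumptions, not Fess's actual API:

```java
import java.util.List;
import java.util.Map;

// Assumed shape of a provider-agnostic LLM client interface.
interface LlmClientSketch {
    /** Provider id, e.g. "ollama", "openai", or "gemini". */
    String getType();

    /** Low-level chat: messages are role/content pairs; returns the reply text. */
    String chat(List<Map<String, String>> messages);
}

/** A canned-response implementation, handy for wiring and tests. */
class EchoLlmClient implements LlmClientSketch {
    @Override public String getType() { return "echo"; }

    @Override public String chat(List<Map<String, String>> messages) {
        // Echo the content of the last message back to the caller.
        Map<String, String> last = messages.get(messages.size() - 1);
        return last.get("content");
    }
}

class ChatDemo {
    public static void main(String[] args) {
        LlmClientSketch client = new EchoLlmClient();
        String reply = client.chat(List.of(Map.of("role", "user", "content", "hello")));
        System.out.println(reply); // prints hello
    }
}
```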
scripts/translate.py
if lang == "en": continue lang_prompt_path = Path(f"docs/{lang}/llm-prompt.md") if lang_prompt_path.exists(): translatable_langs.append(lang) return translatable_langs @app.command() def list_llm_translatable() -> list[str]: translatable_langs = get_llm_translatable() print("LLM translatable languages:", translatable_langs) return translatable_langs
Created: Sun Apr 05 07:19:11 GMT 2026 - Last Modified: Thu Mar 19 17:37:41 GMT 2026 - 15.8K bytes - Click Count (0)
src/main/java/org/codelibs/fess/llm/LlmStreamCallback.java
*/ package org.codelibs.fess.llm; /** * Callback interface for receiving streaming chat responses from LLM. */ @FunctionalInterface public interface LlmStreamCallback { /** * Called for each chunk of the streaming response. * * @param chunk the text chunk from the LLM response * @param done true if this is the final chunk */
Created: Tue Mar 31 13:07:34 GMT 2026 - Last Modified: Mon Jan 12 10:32:40 GMT 2026 - 1.2K bytes - Click Count (0)
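The snippet above fixes the callback's shape: a `@FunctionalInterface` invoked once per chunk with the text and a `done` flag. The method name in this sketch is an assumption (only the parameters come from the snippet), and the driver simulating a streaming response is entirely hypothetical:

```java
// Assumed method name; the parameter list (chunk, done) matches the Javadoc above.
@FunctionalInterface
interface LlmStreamCallbackSketch {
    void onChunk(String chunk, boolean done);
}

class StreamDemo {
    /** Simulates a streaming LLM response by feeding chunks to the callback. */
    static void streamChat(String[] chunks, LlmStreamCallbackSketch callback) {
        for (int i = 0; i < chunks.length; i++) {
            callback.onChunk(chunks[i], i == chunks.length - 1);
        }
    }

    public static void main(String[] args) {
        StringBuilder answer = new StringBuilder();
        // Being a functional interface, the callback can be passed as a lambda.
        streamChat(new String[] { "Hel", "lo" }, (chunk, done) -> {
            answer.append(chunk);
            if (done) {
                System.out.println(answer); // prints Hello
            }
        });
    }
}
```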
docs/en/docs/advanced/vibe.md
The body should be annotated with `Any`, because the request and the response would be... well... **anything**. 🤷 The idea is that you would receive the payload and send it **directly** to an LLM provider, using a `prompt` to tell the LLM what to do, and return the response **as is**. No questions asked. You don't even need to write the body of the function. The `@app.vibe()` decorator does everything for you based on AI vibes:
Created: Sun Apr 05 07:19:11 GMT 2026 - Last Modified: Wed Apr 01 16:16:24 GMT 2026 - 2K bytes - Click Count (0)
src/main/java/org/codelibs/fess/llm/LlmException.java
* either express or implied. See the License for the specific language * governing permissions and limitations under the License. */ package org.codelibs.fess.llm; import org.codelibs.fess.exception.FessSystemException; /** * Exception thrown when an error occurs during LLM operations. * * @author FessProject */ public class LlmException extends FessSystemException { private static final long serialVersionUID = 1L;
Created: Tue Mar 31 13:07:34 GMT 2026 - Last Modified: Sat Mar 07 01:53:06 GMT 2026 - 3.5K bytes - Click Count (0)
docs/en/docs/_llm-test.md
# LLM test file { #llm-test-file } This document tests whether the <abbr title="Large Language Model">LLM</abbr>, which translates the documentation, understands the `general_prompt` in `scripts/translate.py` and the language specific prompt in `docs/{language code}/llm-prompt.md`. The language specific prompt is appended to `general_prompt`. Tests added here will be seen by all designers of language specific prompts. Use as follows:
Created: Sun Apr 05 07:19:11 GMT 2026 - Last Modified: Thu Mar 05 18:13:19 GMT 2026 - 11K bytes - Click Count (0)
docs/ja/docs/_llm-test.md
# LLM test file { #llm-test-file } This document tests whether the <abbr title="Large Language Model">LLM</abbr> that translates the documentation understands the `general_prompt` in `scripts/translate.py` and the language-specific prompt in `docs/{language code}/llm-prompt.md`. The language-specific prompt is appended to the end of `general_prompt`. Tests added here are seen by all designers of language-specific prompts. Usage: * Prepare a language-specific prompt - `docs/{language code}/llm-prompt.md`.
Created: Sun Apr 05 07:19:11 GMT 2026 - Last Modified: Fri Mar 20 14:07:17 GMT 2026 - 13.5K bytes - Click Count (0)