| Module | Abstract | Version |
| --- | --- | --- |
| Langertha | The clan of fierce vikings with 🪓 and 🛡️ to AId your rAId | 0.201 |
| Langertha::Engine::AKI | AKI.IO native API | 0.201 |
| Langertha::Engine::AKIOpenAI | AKI.IO via OpenAI-compatible API | 0.201 |
| Langertha::Engine::Anthropic | Anthropic API | 0.201 |
| Langertha::Engine::DeepSeek | DeepSeek API | 0.201 |
| Langertha::Engine::Gemini | Google Gemini API | 0.201 |
| Langertha::Engine::Groq | GroqCloud API | 0.201 |
| Langertha::Engine::MiniMax | MiniMax API | 0.201 |
| Langertha::Engine::Mistral | Mistral API | 0.201 |
| Langertha::Engine::NousResearch | Nous Research Inference API | 0.201 |
| Langertha::Engine::Ollama | Ollama API | 0.201 |
| Langertha::Engine::OllamaOpenAI | Ollama via OpenAI-compatible API | 0.201 |
| Langertha::Engine::OpenAI | OpenAI API | 0.201 |
| Langertha::Engine::Perplexity | Perplexity Sonar API | 0.201 |
| Langertha::Engine::Whisper | Whisper-compatible transcription server | 0.201 |
| Langertha::Engine::vLLM | vLLM inference server | 0.201 |
| Langertha::Raider | Autonomous agent with conversation history and MCP tools | 0.201 |
| Langertha::Request::HTTP | An HTTP request inside of Langertha | 0.201 |
| Langertha::Response | LLM response with metadata | 0.201 |
| Langertha::Role::Chat | Role for APIs with normal chat functionality | 0.201 |
| Langertha::Role::ContextSize | Role for an engine where you can specify the context size (in tokens) | 0.201 |
| Langertha::Role::Embedding | Role for APIs with embedding functionality | 0.201 |
| Langertha::Role::HTTP | Role for HTTP APIs | 0.201 |
| Langertha::Role::JSON | Role for JSON | 0.201 |
| Langertha::Role::Langfuse | Langfuse observability integration | 0.201 |
| Langertha::Role::Models | Role for APIs with several models | 0.201 |
| Langertha::Role::OpenAICompatible | Role for OpenAI-compatible API format | 0.201 |
| Langertha::Role::OpenAPI | Role for APIs with OpenAPI definition | 0.201 |
| Langertha::Role::ResponseFormat | Role for an engine where you can specify structured output | 0.201 |
| Langertha::Role::ResponseSize | Role for an engine where you can specify the response size (in tokens) | 0.201 |
| Langertha::Role::Seed | Role for an engine that can set a seed | 0.201 |
| Langertha::Role::Streaming | Role for streaming support | 0.201 |
| Langertha::Role::SystemPrompt | Role for APIs with system prompt | 0.201 |
| Langertha::Role::Temperature | Role for an engine that can have a temperature setting | 0.201 |
| Langertha::Role::ThinkTag | Configurable think-tag filtering for reasoning models | 0.201 |
| Langertha::Role::Tools | Role for MCP tool-calling support | 0.201 |
| Langertha::Role::Transcription | Role for APIs with transcription functionality | 0.201 |
| Langertha::Stream | Iterator for streaming responses | 0.201 |
| Langertha::Stream::Chunk | Represents a single chunk from a streaming response | 0.201 |
| LangerthaX | Bring your own viking! | 0.201 |
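As a quick orientation to the modules above, a minimal chat call through one of the engines might look like the following. This is a hedged sketch, not an excerpt from the distribution's own documentation: the `simple_chat` method name, the constructor attributes, and the model name `gpt-4o-mini` are assumptions, and running it would require a valid `OPENAI_API_KEY` in the environment.

```perl
#!/usr/bin/env perl
use strict;
use warnings;

# Assumed engine class from the index above; every engine is expected
# to expose the same chat interface via Langertha::Role::Chat.
use Langertha::Engine::OpenAI;

my $openai = Langertha::Engine::OpenAI->new(
  api_key       => $ENV{OPENAI_API_KEY},   # required; no default
  model         => 'gpt-4o-mini',          # assumed model name
  system_prompt => 'You are a helpful assistant.',
);

# Hypothetical one-shot helper: send a single user message, print the reply.
print $openai->simple_chat('Say something nice about vikings!'), "\n";
```

Swapping `Langertha::Engine::OpenAI` for another engine from the table (for example `Langertha::Engine::Ollama` with a local `base_url`) should leave the rest of the call unchanged, since the chat behaviour lives in the shared roles rather than in each engine.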