Articles tagged with "Open Source"
Infisical: Open-Source Secrets Management Platform Guide
Infisical is an open-source secrets manager with E2E encryption, a CLI, SDKs, a Kubernetes Operator, and secret rotation — a self-hosted alternative to HashiCorp Vault.
Valkey: The Open-Source Redis Fork Migration Guide
Valkey is the Linux Foundation's open-source Redis fork. Install it, migrate from Redis with zero data loss, and configure clusters and Sentinel HA.
BookStack: Self-Hosted Wiki and Documentation Platform
Deploy BookStack for a self-hosted wiki and documentation. Content is organized into Shelves, Books, Chapters, and Pages, with WYSIWYG editing, LDAP auth, diagrams, and a full API.
DocuSeal: Self-Hosted Document Signing — Free DocuSign Alternative
Deploy DocuSeal for self-hosted document signing and e-signatures. Create templates, send for signing, track status, and store documents on your server.
Listmonk: Self-Hosted Newsletter and Mailing List Manager — Free Mailchimp Alternative
Deploy Listmonk for self-hosted newsletters via Docker. High-performance Go app with templating, analytics, multi-list support, and no subscriber limits.
Mattermost: Self-Hosted Team Messaging — Open Source Slack Alternative
Deploy Mattermost for self-hosted team messaging. Channels, threads, file sharing, and integrations — an open-source Slack alternative you fully control.
NocoDB: Self-Hosted Airtable Alternative — Turn Any Database into a Spreadsheet
Deploy NocoDB, the open-source Airtable alternative. Turn MySQL, PostgreSQL, or SQLite into smart spreadsheets with forms, views, REST APIs, and Docker.
Ollama: Run AI Language Models Locally — Setup, GPU Acceleration, and API Guide
Run LLMs like Llama 3, Mistral, Gemma, and Phi locally with Ollama. Covers installation, GPU acceleration, Docker, REST API, and Open WebUI integration.
Plausible Analytics: Self-Hosted, Privacy-First Google Analytics Alternative
Deploy Plausible Analytics for privacy-friendly web analytics. No cookies, GDPR-compliant. Covers Docker setup, goals, custom events, and GA4 comparison.
Stable Diffusion WebUI: Self-Hosted AI Image Generation — Free, Private, GPU-Accelerated
Run Stable Diffusion locally for AI image generation with AUTOMATIC1111 WebUI, ComfyUI, SDXL models, LoRA fine-tuning, ControlNet, and GPU tuning.
Whisper: Self-Hosted Speech-to-Text with OpenAI's Model — Local, Private, Free
Run OpenAI's Whisper speech-to-text locally for private, free transcription. Covers the CLI, Docker, GPU setup, whisper.cpp for CPU-only use, faster-whisper, and a web UI.
Nextcloud: Self-Hosted Cloud Storage, Calendar, and Collaboration Platform
Deploy Nextcloud with Docker for self-hosted storage, calendar, and contacts. Covers reverse proxy, Redis, OnlyOffice, Talk video calls, and storage tuning.