Extract personal data with a self-hosted LLM Mistral NeMo

Trigger: Manual
Complexity: High
Nodes: 13
Added: 7/22/2025

Workflow Overview

Total Nodes: 13
Node Types: 8

Node Types

- chatTrigger: When chat message received (1 node)
- lmChatOllama: Ollama Chat Model (1 node)
- outputParserAutofixing: Auto-fixing Output Parser (1 node)
- outputParserStructured: Structured Output Parser (1 node)
- chainLlm: Basic LLM Chain (1 node)
- noOp: On Error (1 node)
- stickyNote: Sticky Note, Sticky Note1, Sticky Note2, Sticky Note3, Sticky Note6, Sticky Note7 (6 nodes)
- set: Extract JSON Output (1 node)
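The Structured Output Parser and Auto-fixing Output Parser pair is the heart of the extraction step: the chain asks the model for JSON, and if the reply fails to parse, the auto-fixing parser sends it back to the model with the error and asks for a repaired version. A minimal Python sketch of that logic (function names and the fence-stripping heuristic are illustrative, not taken from n8n's implementation):

```python
import json
import re

def parse_structured_output(raw: str) -> dict:
    """Parse a model reply as JSON, stripping markdown code fences if present."""
    # Models often wrap JSON in ```json ... ``` fences; strip them first.
    match = re.search(r"```(?:json)?\s*(.*?)\s*```", raw, re.DOTALL)
    cleaned = match.group(1) if match else raw.strip()
    return json.loads(cleaned)

def parse_with_autofix(raw: str, reask) -> dict:
    """Try to parse; on failure, ask the model once to repair its own output.

    `reask` is any callable (prompt: str) -> str; in the workflow it would be
    a second call to the same Ollama model.
    """
    try:
        return parse_structured_output(raw)
    except (json.JSONDecodeError, ValueError) as err:
        fixed = reask(
            f"The following output was not valid JSON ({err}). "
            f"Return only the corrected JSON:\n{raw}"
        )
        return parse_structured_output(fixed)
```

If the repaired reply still fails to parse, the exception propagates, which is where the workflow's `noOp` "On Error" branch takes over.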

Workflow JSON (7.99 KB)
{
  "id": "HMoUOg8J7RzEcslH",
  "meta": {
    "instanceId": "3f91626b10fcfa8a3d3ab8655534ff3e94151838fd2709ecd2dcb14afb3d061a",
    "templateCredsSetupCompleted": true
  },
  "name": "Extract personal data with a self-hosted LLM Mistral NeMo",
  "tags": [],
  "nodes": [
    {
      "id": "7e67ae65-88aa-4e48-aa63-2d3a4208cf4b",
      "name": "When chat message received",
      "type": "@n8n/n8n-nodes-langchain.chatTrigger",
      "position": [
        -500,
        20
      ],
      "webhookId": "3a7b0ea1-47f3-4a94-8ff2-f5e1f3d9dc32",
      "parameters": {
        "options": {}
      },
      "typeVersion": 1.1
    },
    {
      "id": "e064921c-69e6-4cfe-a86e-4e3aa3a5314a",
      "name": "Ollama Chat Model",
      "type": "@n8n/n8n-nodes-langchain.lmChatOllama",
      "position": [
        -280,
        420
      ],
      "parameters": {
        "model": "mistral-nemo:latest",
        "options": {
          "useMLock": true,
          "keepAlive": "2h",
          "temperature": 0.1
   ...

Showing first 1000 characters. Click "Expand" to view the full JSON.
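Outside n8n, the Ollama Chat Model node's parameters map onto a plain request to Ollama's `/api/chat` endpoint. A hedged sketch of the equivalent request body (the prompt text is illustrative; the workflow's actual prompt lives in the truncated JSON above, and the field names follow Ollama's documented API, with `useMLock`/`keepAlive` becoming `use_mlock`/`keep_alive`):

```python
import json

# Request body mirroring the node's parameters.
payload = {
    "model": "mistral-nemo:latest",
    "messages": [
        # Illustrative prompt; the real one is set in the Basic LLM Chain node.
        {"role": "system", "content": "Extract personal data as JSON."},
        {"role": "user", "content": "Hi, I'm Jane Doe, jane@example.com"},
    ],
    "stream": False,
    "keep_alive": "2h",          # keepAlive: keep the model loaded for 2 hours
    "options": {
        "use_mlock": True,       # useMLock: lock model weights in RAM
        "temperature": 0.1,      # low temperature for deterministic extraction
    },
}

# To actually send it (requires a running Ollama server on the default port):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/chat",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# reply = json.loads(urllib.request.urlopen(req).read())["message"]["content"]
```

A temperature of 0.1 is a sensible choice here: extraction wants the same input to yield the same JSON, not creative variation.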