|
[Screen capture residue; recoverable items: macOS menu bar (Edit Selection View Terminal Window Help); browser tabs: Pull requests · screenpipe/screenpipe · GitHub, DNS / Nameservers, Nginx Proxy Manager, Screenpipe — Archive, SQLite Web: archive.db, AFFiNE, Payments Logger, Location Logger, Finance Hub; Finance Hub UI: transactions total, filters, search, dd/mm/yyyy, All Sources, Upload CSV, Refresh, All Statuses, All Types, All Tags, Sign In; VS Code: Connected to remote; menu bar clock Sat 9 May 9:41:47, battery 100%.]
|
Code
|
Visual Studio Code
|
NULL
|
|
Explorer (⇧⌘E)
Search (⇧⌘F)
Source Control (⌃⇧G)
Run and Debug (⇧⌘D)
Remote Explorer
Extensions (⇧⌘X) - 2 require update, 1 requires restart
3
Claude Code
Containers
EXPLORER
NO FOLDER OPENED
Connected to remote.
Open Folder
OUTLINE
TIMELINE
Open Chat (⌃⌘I)
Show All Commands (⇧⌘P)
Open Recent (⌃R)
Open File or Folder (⌘O)
New Untitled Text File (⌘N)
Problems (⇧⌘M)
PROBLEMS
Output (⇧⌘U)
OUTPUT
Debug Console (⇧⌘Y)
DEBUG CONSOLE
Terminal (⌃`)
TERMINAL
Ports
PORTS
remote SSH: nas
SSH: nas
No Problems
0
0
No Ports Forwarded
0
Notifications
Sign In
Info: Downloading VS Code Server......
Open Folder
/home/Adm1n/
22 Results
OK
Show Local
..
.bun
.cache
.claude
.claude.lock
.codeium
.config
.copilot
.docker
.dotnet
.local
.npm
.nvm
.rest-client
.ssh
.th-client
.vscode-server
.windsurf-server
#recycle
Music
Online Document
Photos...
Open Folder
/volume2/docker/finance/
5 Results
OK
Show Local
..
auth
dsk-uploader
finance-hub
payments-logger...
|
Code
|
Visual Studio Code
|
NULL
|
|
Explorer (⇧⌘E)
Search (⇧⌘F)
Source Control (⌃⇧G)
Run and Debug (⇧⌘D)
Remote Explorer
Extensions (⇧⌘X)
Containers
EXPLORER
FINANCE [SSH: NAS]
OUTLINE
TIMELINE
docker-compose.yml
.env
Opening Remote...
Notifications
Sign In
iTerm2 ssh session (Sat 9 May 9:42:06) — docker compose build/up output:
=> [backend 8/8] COPY src/ src
=> [backend] exporting to image
=> => exporting layers
=> => writing image sha256:905d8807ca787f954b365481088c05dcfb9ced02966b3d01f322c10cea27e564
=> => naming to docker.io/library/finance-hub-backend
=> [frontend internal] load build definition from Dockerfile
=> => transferring dockerfile: 204B
=> [frontend internal] load .dockerignore
=> => transferring context: 2B
=> [frontend internal] load build context
=> => transferring context: 49.54kB
=> CACHED [frontend 2/5] WORKDIR /app
=> [frontend 3/5] COPY package.json package-lock.json*
=> [frontend 4/5] RUN npm install
=> [frontend 5/5] COPY
=> [frontend] exporting to image
=> => exporting layers
=> => writing image sha256:6b5413a30bcfd3f9d27df726ce52b3b1d9a0b6a177566122117f258ce7b958dd
=> => naming to docker.io/library/finance-hub-frontend
[+] Running 5/5
 ✔ Network finance-hub_default        Created
 ✔ Volume "finance-hub_pgdata"        Created
 ✔ Container finance-hub-db-1         Healthy
 ✔ Container finance-hub-backend-1    Started
 ✔ Container finance-hub-frontend-1   Started
Adm1n@DXP4800PLUS-B5F8:/volume2/docker/finance/finance-hub$
Connection to [IP_ADDRESS] closed by remote host.
Connection to [IP_ADDRESS] closed.
lukas@Lukas-Kovaliks-MacBook-Pro-Jiminny ~ $ nas
Adm1n@DXP4800PLUS-B5F8:~$
|
Code
|
finance [SSH: nas]
|
NULL
|
|
Explorer (⇧⌘E)
Search (⇧⌘F)
Source Control (⌃⇧G)
Run and Debug (⇧⌘D)
Remote Explorer
Extensions (⇧⌘X)
Containers
EXPLORER
FINANCE [SSH: NAS]
auth
dsk-uploader
finance-hub
payments-logger
OUTLINE
TIMELINE
docker-compose.yml, Editor Group 1
.env, Editor Group 1
remote SSH: nas
SSH: nas
No Problems
0
0
No Ports Forwarded
0
Notifications
Sign In
Info: Setting up SSH Host nas: Setting up SSH tunnel...
|
Code
|
finance [SSH: nas]
|
NULL
|
|
Explorer (⇧⌘E)
Search (⇧⌘F)
Source Control (⌃⇧G)
Run and Debug (⇧⌘D)
Remote Explorer
Extensions (⇧⌘X) - 2 require update
2
Claude Code
Containers
EXPLORER
FINANCE [SSH: NAS]
auth
dsk-uploader
finance-hub
backend
frontend
.env
.env.example
.gitignore
docker-compose.yml
payments-logger
OUTLINE
TIMELINE
docker-compose.yml, Editor Group 1
.env, Editor Group 1
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
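The `${VAR:-default}` references in the compose file above are resolved by Docker Compose from the shell environment or an `.env` file next to `docker-compose.yml` (an `.env` is visible in the explorer). A minimal sketch of such a file, using the variable names from the compose file with illustrative placeholder values (not the real configuration):

```shell
# .env — illustrative placeholders only, not the real configuration
BACKEND_PORT=3001
FRONTEND_PORT=5175
NOTIFIER_URL=http://notifier.example:8080
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=+359000000000
TZ=Europe/Sofia
```

Unset variables simply fall back to their `:-` defaults, so the file only needs the values that differ.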
remote SSH: nas
SSH: nas
No Problems
0
0
No Ports Forwarded
0
Notifications
Sign In
Compose
Editor Language Status: $(copilot) No inline suggestion available, Inline suggestions
LF
UTF-8
Spaces: 2
Ln 11, Col 13
Info: Setting up SSH Host nas: Setting up SSH tunnel...
|
Code
|
docker-compose.yml — finance [SSH: nas]
|
NULL
|
|
Explorer (⇧⌘E)
Search (⇧⌘F)
Source Control (⌃⇧G)
Run and Debug (⇧⌘D)
Remote Explorer
Extensions (⇧⌘X) - 2 require update
2
Claude Code
Containers
EXPLORER
FINANCE [SSH: NAS]
auth
dsk-uploader
finance-hub
backend
frontend
.env
.env.example
.gitignore
docker-compose.yml
payments-logger
OUTLINE
TIMELINE
docker-compose.yml, Editor Group 1
.env, Editor Group 1
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Claude Code, Editor Group 2
remote SSH: nas
SSH: nas
No Problems
0
0
No Ports Forwarded
0
Activating Extensions...
Notifications
Sign In
Info: Setting up SSH Host nas: Setting up SSH tunnel...
|
Code
|
Claude Code — finance [SSH: nas]
|
NULL
|
|
Explorer (⇧⌘E)
Search (⇧⌘F)
Source Control (⌃⇧G)
Run and Debug (⇧⌘D)
Remote Explorer
Extensions (⇧⌘X) - 2 require update
2
Claude Code
Containers
EXPLORER
FINANCE [SSH: NAS]
auth
dsk-uploader
finance-hub
backend
frontend
.env
.env.example
.gitignore
docker-compose.yml
payments-logger
OUTLINE
TIMELINE
docker-compose.yml, Editor Group 1
.env, Editor Group 1
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Claude Code, Editor Group 2
remote SSH: nas
SSH: nas
No Problems
0
0
No Ports Forwarded
0
Notifications
Sign In
Info: Setting up SSH Host nas: Setting up SSH tunnel
Untitled
Session history
New session
Tired of repeating yourself? Tell Claude to remember what you’ve told it using CLAUDE.md.
Prefer the Terminal experience?
Switch back in Settings.
Close banner
⌘ Esc to focus or unfocus Claude
Add
Show command menu (/)
docker-compose.yml
Edit automatically...
Local
Web
Untitled now
Design new payment-logger and dsk-uploader hybrid app 8h...
Design new payment-logger and dsk-uploader hybrid app
Rename session
Delete session...
Design new payment-logger and dsk-uploader hybrid app
Session history
New session
Message actions
payments.js
Lets create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate, it lives on its own. First review them and see how these should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db, and the uploader should store data the same way the /ingest does. It should be properly marked in the UI if it is upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
Message actions
payments.js
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
  provider = "prisma-client-js"
}

datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

model Payment {
  id              Int       @id @default(autoincrement())
  rawMessage      String    @map("raw_message")
  date            DateTime?
  type            String?
  card            String?
  recipient       String?
  amount          Float?
  currency        String?   @default("EUR")
  balance         Float?
  source          Source    @default(INGEST)
  status          Status    @default(UNPROCESSED)
  tags            Tag[]
  notifiedAt      DateTime? @map("notified_at")
  notifyPhone     String?   @map("notify_phone")
  debitBgn        Float?    @map("debit_bgn")
  creditBgn       Float?    @map("credit_bgn")
  transactionType String?   @map("transaction_type")
  payerAccount    String?   @map("payer_account")
  createdAt       DateTime  @default(now()) @map("created_at")
  updatedAt       DateTime  @updatedAt @map("updated_at")

  @@map("payments")
}

model Tag {
  id       Int       @id @default(autoincrement())
  name     String    @unique
  color    String    @default("#6b7280")
  payments Payment[]

  @@map("tags")
}

enum Status {
  UNPROCESSED
  SENT
  SKIPPED
}

enum Source {
  INGEST
  UPLOAD
}
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);

function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '')
      .split(',')
      .map(g => g.trim())
      .filter(Boolean),
  };
  next();
}

module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
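The normalization core of this port can be sketched in plain JavaScript. Decoding and tokenization are assumed to happen upstream (iconv-lite + `csv-parse` with `columns: true`, as above), so the sketch operates on already-parsed row objects keyed by the DSK column headers. The helper names are illustrative, and the tag table is only a subset of the Python `generate_tags()` rules:

```javascript
// Sketch of the csvParser.js row normalization (names and sign convention
// are choices for this sketch, not the final implementation).
const CARD_RE = /^\d{6}x{6}\d{4}$/;            // card mask, e.g. 402950xxxxxx1234
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'; // internal transfers are dropped

// Illustrative subset of the keyword → tag rules
const TAG_RULES = [
  { pattern: /ЗАПЛАТА/i, tag: 'Salary' },
  { pattern: /LIDL/i, tag: 'Groceries' },
  { pattern: /NETFLIX/i, tag: 'Subscriptions' },
];

function parseBgDate(s) {
  // DD.MM.YYYY → Date at local midnight
  const [d, m, y] = s.split('.').map(Number);
  return new Date(y, m - 1, d);
}

function normalizeRow(row) {
  if (row['Вид на трансакцията'] === SKIP_TYPE) return null; // counted as skipped
  const reason = (row['Основание'] || '').trim();
  const firstToken = reason.split(/\s+/)[0] || '';
  const debit = parseFloat(row['Дебит BGN']) || 0;
  const credit = parseFloat(row['Кредит BGN']) || 0;
  return {
    rawMessage: reason,
    date: parseBgDate(row['Дата']),
    card: CARD_RE.test(firstToken) ? firstToken : null,
    amount: credit > 0 ? credit : -debit, // signed: credits positive, debits negative
    currency: 'BGN',
    source: 'UPLOAD',
    recipient: row['Наредител/Получател'] || null,
    transactionType: row['Вид на трансакцията'] || null,
    debitBgn: debit || null,
    creditBgn: credit || null,
    tags: TAG_RULES.filter(r => r.pattern.test(reason)).map(r => r.tag),
  };
}
```

The signed `amount` convention here is one possible choice; since the schema also keeps `debitBgn`/`creditBgn` raw, nothing is lost either way.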
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
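Change 2 boils down to one extra key in the where-clause construction for the list route. A minimal sketch of such a builder (the non-`source` filter keys are assumptions about the existing payments-logger route, shown only for context):

```javascript
// Builds the Prisma `where` object for GET /api/payments from query params.
// Only the `source` key is new; the rest mirrors the assumed existing filters.
function buildWhere(query) {
  const where = {};
  if (query.status) where.status = query.status;             // UNPROCESSED | SENT | SKIPPED
  if (query.type) where.type = query.type;                   // POS | ATM | ...
  if (query.source) where.source = query.source;             // NEW: INGEST | UPLOAD
  if (query.tag) where.tags = { some: { name: query.tag } }; // M2M tag filter
  if (query.search) {
    where.OR = [
      { recipient: { contains: query.search, mode: 'insensitive' } },
      { rawMessage: { contains: query.search, mode: 'insensitive' } },
    ];
  }
  return where;
}
```

The object feeds straight into `prisma.payment.findMany({ where, ... })`, so an absent `source` param keeps today's behavior (both sources listed).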
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
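The per-file parser results can be folded into that response shape with a small aggregator. A sketch, assuming each `parseDskCsv(buffer)` call returns the `{ rows, skipped, errors }` shape described in the csvParser.js section (`summarizeImport` is a hypothetical helper name):

```javascript
// Folds per-file parse results into the upload.js response body.
// Each element of fileResults is { rows: [...], skipped: number, errors: [...] }.
function summarizeImport(fileResults) {
  const summary = { imported: 0, skipped: 0, errors: [], payments: [] };
  for (const r of fileResults) {
    summary.imported += r.rows.length;   // rows that will be inserted
    summary.skipped += r.skipped;        // internal transfers etc.
    summary.errors.push(...r.errors);    // per-row parse failures, kept flat
    summary.payments.push(...r.rows);
  }
  return summary;
}
```

In the route itself this would run after mapping over `req.files` (multer's `upload.array('files', 10)` with memory storage puts each CSV in `file.buffer`).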
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
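On the UploadPanel side, the `FormData` request can be sketched as below (the `files` field name is an assumption and must match the multer field on the server; the endpoint path comes from the routes table):

```javascript
// Builds the multipart body UploadPanel sends to POST /api/upload/csv.
// `files` is an array of File/Blob objects from the drag-and-drop zone.
function buildUploadForm(files) {
  const form = new FormData();
  for (const file of files) {
    form.append('files', file, file.name || 'export.csv'); // field name must match multer
  }
  return form;
}

// Usage in the browser. Note: no explicit Content-Type header — the browser
// sets the multipart boundary itself, and Authentik's session cookie rides along.
async function uploadCsv(files) {
  const res = await fetch('/api/upload/csv', {
    method: 'POST',
    body: buildUploadForm(files),
  });
  if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
  return res.json(); // expected shape: { imported, skipped, errors, payments }
}
```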
---
## Docker Compose
```yaml
services:
  db:
    image: postgres:16-alpine
    environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
    volumes: [pgdata:/var/lib/postgresql/data]
    healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
  backend:
    build: ./backend
    environment:
      DATABASE_URL: [CONNECTION_STRING]
      PORT: "3001"
      NOTIFIER_URL: ${NOTIFIER_URL}
      NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
      NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
      TZ: ${TZ:-Europe/Sofia}
    ports: ["${BACKEND_PORT:-3001}:3001"]
    depends_on: { db: { condition: service_healthy } }
  frontend:
    build: ./frontend
    ports: ["${FRONTEND_PORT:-5175}:5173"]
    depends_on: [backend]

volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=https://notifier.lakylak.xyz
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=[PHONE]
TZ=Europe/Sofia
BACKEND_PORT=3001
FRONTEND_PORT=5175
```
Removed vs payments-logger: `JWT_SECRET`, `JWT_EXPIRE_MINUTES`.
---
## Implementation Order
1. Create folder structure and write `docker-compose.yml`, `.env.example`, `.gitignore`
2. Write `backend/prisma/schema.prisma` and migration SQL
3. Write `backend/package.json` (add `csv-parse`, `iconv-lite`, `multer`; remove `bcryptjs`, `jose`)
4. Write `backend/Dockerfile`
5. Write `backend/src/auth.js` (Authentik middleware)
6. Copy `backend/src/parser.js` verbatim from payments-logger
7. Write `backend/src/csvParser.js` (Python port)
8. Write `backend/src/routes/upload.js`
9. Write `backend/src/routes/payments.js` (from payments-logger + 5 changes listed above)
10. Write `backend/src/index.js`
11. Copy frontend base from payments-logger; delete `auth.js` + `AuthProvider.jsx`
12. Update `main.jsx`, `App.jsx`, `FilterBar.jsx`, `PaymentTable.jsx`
13. Write `frontend/src/components/UploadPanel.jsx`
14. Update `frontend/vite.config.js`
15. `docker compose build && docker compose up -d`
16. Run verification checklist
---
## Verification
**Schema**
- `payments` table has `source`, `currency`, `debit_bgn`, `credit_bgn`, `transaction_type`, `payer_account`
- No `users` table; `Source` enum exists
**Auth**
- `GET /api/payments` → 401 without `x-authentik-username` header
- `POST /api/payments/ingest` → 201 without any header
- `curl -H "x-authentik-username: test" localhost:3001/api/payments` → 200
**SMS Ingest**
- DSK POS SMS → `source=INGEST`, `currency=EUR`, correct amount/card/recipient
- Apple Wallet structured body → `type=WALLET`, `source=INGEST`
- Rate limiter → 429 after 200 req/min
**CSV Upload**
- Single DSK CSV → rows with `source=UPLOAD`, `currency=BGN`
- Multiple CSVs in one request → all merged
- `ТРАНСФЕР СОБСТВЕНИ СМЕТКИ` rows counted in `skipped`, not imported
- Card regex match from `Основание` column
- Auto-tags applied (ЗАПЛАТА→Salary, LIDL→Groceries)
- cp1251 file imports correctly
**UI**
- Source badge visible in payment table (SMS indigo, CSV emerald)
- Source dropdown in FilterBar filters correctly
- Amount shows correct currency per row
- Upload Panel: drag-and-drop works, result summary shows counts
- Logout redirects to Authentik sign_out
- No login form shown (Authentik handles it at NPM level)
- Tags, send, skip, delete all still work
## Reference Files
- `/volume2/docker/finance/payments-logger/backend/src/routes/payments.js`
- `/volume2/docker/finance/payments-logger/backend/src/parser.js`
- `/volume2/docker/finance/payments-logger/backend/src/index.js`
- `/volume2/docker/finance/payments-logger/backend/src/auth.js`
- `/volume2/docker/finance/payments-logger/backend/prisma/schema.prisma`
- `/volume2/docker/finance/payments-logger/frontend/src/App.jsx`
- `/volume2/docker/finance/payments-logger/frontend/src/components/PaymentTable.jsx`
- `/volume2/docker/finance/payments-logger/frontend/src/components/FilterBar.jsx`
- `/volume2/docker/finance/dsk-uploader/dskuploader.py`
- `/volume2/docker/finance/auth/docker-compose.yml`
Claude’s Plan
ets-create-a-new-generic-sun.md
Stayed in plan mode
Thinking
Good question. Here's the full breakdown:
Backend Tech Stack — Options

**Option A: Node.js + Express + Prisma (my recommendation)**

Why: payments-logger is already Node.js/Express/Prisma, and the new app is 80% payments-logger with CSV upload added on top. Keeping the same stack means:
- Copy `parser.js` verbatim, zero rewrite
- Copy the entire `routes/payments.js` with minimal edits
- Same Prisma schema workflow, same migrations, same Docker image
- The CSV parsing logic (the only Python-specific thing) ports cleanly to `csv-parse` — it's just regex + string manipulation, nothing pandas-specific

Trade-off: the Python CSV parsing has to be ported to JS. It's ~150 lines and mostly string manipulation — straightforward.

**Option B: Python + FastAPI + SQLAlchemy**

Rewrite everything in Python. dsk-uploader is Python, so CSV parsing is already done.
Pro: keeps the dsk-uploader CSV logic as-is (no porting). Python has better data science libraries if you later add analytics.
Con: loses the entire payments-logger backend (routes, parser, Prisma schema) — a full rewrite. FastAPI + SQLAlchemy is a different mental model than Express + Prisma. The frontend stays React anyway, so the stack split doesn't simplify anything.
Verdict: more work overall, gains almost nothing.

**Option C: Hybrid microservices (Node.js backend + Python CSV service)**

Keep Node.js as the main API, add a separate Python FastAPI service just for CSV parsing. They share the DB.
Pro: zero porting — the Python CSV code is used as-is.
Con: two backend services to maintain, deploy, and keep in sync. Inter-service HTTP calls. Two Dockerfiles. For ~150 lines of CSV logic, this is massively over-engineered.
Verdict: not worth it.

**Option D: Python + Flask (exactly like dsk-uploader)**

Extend dsk-uploader into a full payments app.
Pro: Pandas for CSV, no porting.
Con: Jinja2 templates instead of React (or add React separately). Would need to reimplement the entire payments-logger (SMS parser, notification system, tag management, pagination/filtering). Flask's Jinja templating is far less capable than the existing React UI, which is the stronger foundation.
Verdict: worst of all options — rebuilds the harder parts (UI, SMS parser, notifications) for no gain.
**Summary**

| Option | CSV porting effort | App rewrite effort | Result quality |
|--------|--------------------|--------------------|----------------|
| A (Node.js) | Medium (~150 lines) | Minimal | Best |
| B (Python FastAPI) | None | High (full backend) | Worse |
| C (Hybrid) | None | Low + infra overhead | Overkill |
| D (Python Flask) | None | Very high | Worst |
The CSV parser port (the only cost of Option A) is a one-time ~150 line exercise. Everything else in the plan reuses payments-logger code directly. That's why Node.js is the clear winner here.
Want to adjust the stack, or shall I proceed with Option A?
Message actions
ok sounds good. Implement all you suggested
Thinking
Starting implementation. Let me first read all the reference files I'll need to copy from.
Read
parser.js
Read
payments.js, index.js
Read
App.jsx
Read
FilterBar.jsx
Read
PaymentTable.jsx
Thinking
Now let me read the remaining frontend files and the Dockerfile from payments-logger.
Read
main.jsx
Read
index.css
Read
package.json
Read
Dockerfile
Read
Dockerfile
Read
vite.config.js
Read
tailwind.config.js
Read
postcss.config.js
Thinking
Read
index.html
Read
dskuploader.py
Thinking
Let me also grab the remaining frontend components.
Read
PaymentCard.jsx
Read
PaymentList.jsx
Read
schema.prisma
Thinking
Good, I have everything I need. Let me set up the todo list and start building.
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Bash
Create finance-hub directory structure
IN
mkdir -p /volume2/docker/finance/finance-hub/backend/src/routes \
/volume2/docker/finance/finance-hub/backend/prisma/migrations/20260508_init \
/volume2/docker/finance/finance-hub/frontend/src/components && echo "done"
OUT
done
Write
docker-compose.yml
45 lines
version: '3.8'

services:
  db:
    image: postgres:16-alpine
    restart: unless-stopped
    environment:
      POSTGRES_USER: finance
      POSTGRES_PASSWORD: [PASSWORD]
      POSTGRES_DB: finance_hub
    volumes:
      - pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
      interval: 5s
      timeout: 5s
      retries: 5

  backend:
    build: ./backend
    restart: unless-stopped
    environment:
      DATABASE_URL: [CONNECTION_STRING]
      PORT: "3001"
      NOTIFIER_URL: ${NOTIFIER_URL}
      NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
      NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
      TZ: ${TZ:-Europe/Sofia}
    ports:
      - "${BACKEND_PORT:-3001}:3001"
    depends_on:
      db:
        condition: service_healthy

  frontend:
    build: ./frontend
    restart: unless-stopped
    ports:
      - "${FRONTEND_PORT:-5175}:5173"
    depends_on:
      - backend

volumes:
  pgdata:
Write
.env.example
15 lines
# Database
[ENV_SECRET]
# Notifier service (for SMS payment notifications)
NOTIFIER_URL=https://notifier.lakylak.xyz
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=[PHONE]
# Timezone (used by SMS parser for Bulgarian bank timestamps)
TZ=Europe/Sofia
# Ports (optional — defaults shown)
BACKEND_PORT=3001
FRONTEND_PORT=5175
Write
.gitignore
5 lines
.env
node_modules/
dist/
*.log
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile...
Design new payment-logger and dsk-uploader hybrid app
payments.js
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate — it lives on its own. First review them and see how they should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way the /ingest does. It should be properly marked in the UI whether it is upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
Message actions
payments.js
payments.js
continue
Thinking
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
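The SMS shape and the core extractions can be sketched as follows. The sample message is assembled from the format examples quoted elsewhere in this plan, not a real transaction; the regexes mirror the parser.js design described here:

```javascript
// Illustrative only: a DSK-style SMS and the core field extractions.
// Sample values are taken from this plan's parser comments, not real data.
const sms = 'DSK Bank. Na 09/05/2026 v 09:41 sa plateni 7.78 EUR s karta 400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

const amount = parseFloat(sms.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i)[1]);
const card = sms.match(/s karta\s+([\d*]+)/i)[1];
const type = sms.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i)[1].toUpperCase();
const recipient = sms.match(/s adres:\s*([^.]+)\./i)[1].trim();

console.log(amount, card, type, recipient); // prints: 7.78 400915***4447 POS LIDL SOFIA
```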
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
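The card-extraction step above can be sketched as a small pure function. The regex is the one given in the column description; the masked card value below is invented for illustration:

```javascript
// Hypothetical sketch: extract a masked card number from the "Основание"
// column, assuming the mask is the first whitespace-separated token.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

function extractCard(reason) {
  const first = (reason || '').trim().split(/\s+/)[0] || '';
  return CARD_REGEX.test(first) ? first : null;
}

console.log(extractCard('415482xxxxxx1234 ПЛАЩАНЕ НА ПОС 08.05.2026 12:30'));
// prints: 415482xxxxxx1234
console.log(extractCard('ЗАПЛАТА МАЙ')); // prints: null
```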
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
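Duplicate detection is left optional in item 5. One possible heuristic, not implemented in either reference app, is to treat an INGEST row and an UPLOAD row as the same transaction when they share a calendar date and a normalized card mask (SMS masks look like `400915***4447`, CSV masks like `400915xxxxxx4447`):

```javascript
// Hypothetical duplicate heuristic for the optional "BOTH" badge.
// Normalizes both mask styles (*** vs xxxxxx) to first-six + last-four digits.
function cardKey(mask) {
  const m = (mask || '').match(/^(\d{6}).*(\d{4})$/);
  return m ? m[1] + m[2] : null;
}

function isLikelyDuplicate(a, b) {
  if (!a.date || !b.date) return false;
  const sameDay = a.date.toISOString().slice(0, 10) === b.date.toISOString().slice(0, 10);
  const sameCard = cardKey(a.card) !== null && cardKey(a.card) === cardKey(b.card);
  return sameDay && sameCard;
}

const smsRow = { date: new Date('2026-05-08T10:30:00Z'), card: '400915***4447' };
const csvRow = { date: new Date('2026-05-08T00:00:00Z'), card: '400915xxxxxx4447' };
console.log(isLikelyDuplicate(smsRow, csvRow)); // prints: true
```

Matching on amount alone would not work here, since ingested rows are stored in EUR and uploaded rows in BGN.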
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
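Item 4 above might look like this minimal sketch; the function name and the notification field names are assumed from the schema earlier in the plan:

```javascript
// Hypothetical currency-aware formatting for the notification message.
// SMS-ingested rows default to EUR, CSV uploads to BGN (per the schema above).
function formatAmount(payment) {
  if (payment.amount == null) return 'unknown amount';
  return `${payment.amount.toFixed(2)} ${payment.currency || 'EUR'}`;
}

console.log(formatAmount({ amount: 7.78, currency: 'EUR' }));  // prints: 7.78 EUR
console.log(formatAmount({ amount: 120.5, currency: 'BGN' })); // prints: 120.50 BGN
```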
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=
TZ=Europe/Sofia
BACKEND_PORT=3001
FRONTEND_PORT=5175
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
docker-compose.yml
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Session history
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (also a new folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. It should be properly marked in the UI whether a record came from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
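The card-extraction step above can be sketched standalone. This is a minimal illustration, not the actual uploader code: the regex is the one quoted in the plan, and the sample row value is invented.

```javascript
// Split a DSK "Основание" (reason) cell into an optional card mask and the remaining text.
// The card mask format is 6 digits, 6 lowercase 'x', 4 digits, always the first token.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

function splitReason(reason) {
  const parts = reason.trim().split(' ');
  if (CARD_REGEX.test(parts[0])) {
    return { card: parts[0], reason: parts.slice(1).join(' ') };
  }
  return { card: null, reason: reason.trim() };
}

console.log(splitReason('516875xxxxxx1234 ПЛАЩАНЕ НА ПОС'));
// → { card: '516875xxxxxx1234', reason: 'ПЛАЩАНЕ НА ПОС' }
```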
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
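Because auth is header-based, the app-side logic reduces to reading three request headers. A minimal sketch (header names taken from the list above; the plain-object request shape is simplified for illustration):

```javascript
// Build a user object from the Authentik forward-auth headers injected by NPM.
// Returns null when the username header is absent (the request did not pass auth).
function userFromHeaders(headers) {
  const username = headers['x-authentik-username'];
  if (!username) return null;
  return {
    username,
    email: headers['x-authentik-email'] || null,
    groups: (headers['x-authentik-groups'] || '')
      .split(',')
      .map((g) => g.trim())
      .filter(Boolean),
  };
}

console.log(userFromHeaders({
  'x-authentik-username': 'alice',
  'x-authentik-groups': 'admins, finance',
}));
// → { username: 'alice', email: null, groups: [ 'admins', 'finance' ] }
```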
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
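The route body is not shown in this session; the per-file aggregation it describes might look like the sketch below, with parsing and persistence stubbed out (`parseCsv` and `saveRow` stand in for `parseDskCsv` and the `prisma.payment.create` + tag upsert calls):

```javascript
// Aggregate multiple uploaded CSV files into the { imported, skipped, errors, payments } response.
async function handleUpload(files, parseCsv, saveRow) {
  const result = { imported: 0, skipped: 0, errors: [], payments: [] };
  for (const file of files) {
    const { rows, skipped, errors } = await parseCsv(file.buffer);
    result.skipped += skipped;
    result.errors.push(...errors);
    for (const row of rows) {
      result.payments.push(await saveRow(row)); // real route: prisma.payment.create with tag connects
      result.imported += 1;
    }
  }
  return result;
}

// Stubbed run: one file yielding one row plus one skipped internal transfer.
handleUpload(
  [{ buffer: null }],
  async () => ({ rows: [{ amount: 10, currency: 'BGN', source: 'UPLOAD' }], skipped: 1, errors: [] }),
  async (row) => ({ id: 1, ...row }),
).then((r) => console.log(r.imported, r.skipped, r.payments[0].id));
// → 1 1 1
```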
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=
TZ=Europe/Sofia
BACKEND_PORT=3001
FRONTEND_PORT=5175
```
Write
migration.sql
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
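The DST handling in `localToUtc` is the subtle part of the parser above. A standalone reproduction (with the timezone made a parameter purely for illustration) shows the offset resolving differently on either side of the Sofia DST boundary:

```javascript
// Convert a wall-clock date/time in `tz` to a UTC Date, DST-aware, using Intl to find the offset.
function localToUtc(tz, year, month, day, hour, minute) {
  const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
  const formatter = new Intl.DateTimeFormat('en-US', {
    timeZone: tz,
    year: 'numeric', month: '2-digit', day: '2-digit',
    hour: '2-digit', minute: '2-digit', second: '2-digit',
    hour12: false,
  });
  const parts = {};
  formatter.formatToParts(naive).forEach((p) => { parts[p.type] = p.value; });
  const localAtNaive = new Date(Date.UTC(
    parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
    parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
  ));
  const offsetMs = localAtNaive.getTime() - naive.getTime();
  return new Date(naive.getTime() - offsetMs);
}

console.log(localToUtc('Europe/Sofia', 2026, 1, 15, 12, 0).toISOString()); // → 2026-01-15T10:00:00.000Z (EET, UTC+2)
console.log(localToUtc('Europe/Sofia', 2026, 7, 15, 12, 0).toISOString()); // → 2026-07-15T09:00:00.000Z (EEST, UTC+3)
```

This is why the parser stores noon "12:00" from a January SMS as 10:00Z but the same wall-clock time from a July SMS as 09:00Z.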
Write
csvParser.js
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write
index.js
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write...
Explorer Section: finance [SSH: nas]
FINANCE [SSH: NAS]
auth
dsk-uploader
finance-hub
backend
frontend
.env
.env.example
.gitignore
docker-compose.yml
payments-logger
docker-compose.yml, Editor Group 1
.env, Editor Group 1
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
payments.js
Lets create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (also a new folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db, and the uploader should store data the same way /ingest does. It should be properly marked in the UI whether a record came from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
payments.js
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
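The source filter added to `GET /api/payments` boils down to extending the Prisma `where` clause. A minimal sketch, assuming a hypothetical `buildWhere` helper (not from the reference apps; field names follow the schema above):

```javascript
// Hypothetical helper: translate query-string params into a Prisma-style
// `where` object for the unified list view, including the new source filter.
function buildWhere(query) {
  const where = {};
  // Only accept known enum values for source
  if (query.source && ['INGEST', 'UPLOAD'].includes(query.source)) {
    where.source = query.source;
  }
  if (query.status) where.status = query.status;
  if (query.type) where.type = query.type;
  if (query.search) {
    // Match recipient or raw message text, case-insensitively
    where.OR = [
      { recipient: { contains: query.search, mode: 'insensitive' } },
      { rawMessage: { contains: query.search, mode: 'insensitive' } },
    ];
  }
  return where;
}
```

For example, `buildWhere({ source: 'UPLOAD', status: 'SENT' })` yields a clause that lists only sent CSV-imported rows, while an unknown source value is ignored rather than passed through to the database.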
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
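The tag-upsert-then-connect step can also be expressed as a single nested write. A sketch of the create payload for one imported row, using Prisma's `connectOrCreate` as an alternative to the explicit `tag.upsert` loop the plan describes (helper name and row shape are illustrative):

```javascript
// Hypothetical: build the Prisma `create` data for one parsed CSV row,
// attaching auto-generated tags via a nested connectOrCreate write.
function toCreateData(row) {
  const { autoTags = [], ...fields } = row; // strip the non-column field
  return {
    ...fields,
    tags: {
      connectOrCreate: autoTags.map(name => ({
        where: { name },   // reuse an existing tag by unique name
        create: { name },  // or create it on first sight
      })),
    },
  };
}
```

This keeps each import insert atomic per row; the explicit upsert loop in upload.js achieves the same result with more control over tag colors.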
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
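The header-trust behaviour above can be smoke-tested without Express. A self-contained sketch with minimal mock req/res objects (stand-ins for Express's, invented for this example):

```javascript
// Same middleware logic as auth.js, exercised standalone with mock objects.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);

function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '')
      .split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

// Minimal mock response recording status and body
function makeRes() {
  const res = { statusCode: 200, body: null };
  res.status = (c) => { res.statusCode = c; return res; };
  res.json = (b) => { res.body = b; return res; };
  return res;
}

// Protected path without the Authentik header → rejected
const resA = makeRes();
authentikMiddleware({ path: '/api/payments', headers: {} }, resA, () => {});

// Protected path with headers → req.user populated, next() called
const reqB = {
  path: '/api/payments',
  headers: { 'x-authentik-username': 'alice', 'x-authentik-groups': 'admins, users' },
};
let nextCalled = false;
authentikMiddleware(reqB, makeRes(), () => { nextCalled = true; });
```

Without the header the mock records a 401; with it, `reqB.user.groups` comes back trimmed as `['admins', 'users']`.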
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
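To illustrate the field regexes above in isolation, here is a standalone run against a made-up DSK-style SMS (the message text and all amounts are invented for this example; date conversion is skipped to keep the sketch dependency-free):

```javascript
// Apply the same field regexes as parser.js to an invented sample message.
const msg = 'DSK Bank. Na 09/05/2026 v 09:41 sa plateni 7.78 EUR s karta '
  + '400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

const card = (msg.match(/s karta\s+([\d*]+)/i) || [])[1];
const type = (msg.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i) || [])[1];
const recipient = ((msg.match(/s adres:\s*([^.]+)\./i) || [])[1] || '').trim();
const amount = parseFloat(
  (msg.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i) || [])[1].replace(',', '.'));
const balance = parseFloat(
  (msg.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i) || [])[1].replace(',', '.'));
```

Note that the leading "Na 09/05/2026" does not trip the type regex: `(?:na|ot)\s+` must be followed by one of the known type keywords, so only "na POS" matches.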
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
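A quick standalone sanity check of the numeric and date helpers in csvParser.js (a sketch, not part of the session; `parseNum` and `parseDate` are copied verbatim from the file above, sample values are illustrative):

```javascript
// parseNum / parseDate copied from csvParser.js, exercised on DSK-style
// values: comma decimal separator, space thousands separator, DD.MM.YYYY dates.
function parseNum(val) {
  if (val == null || val === '') return null;
  if (typeof val === 'number') return isNaN(val) ? null : val;
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return isNaN(n) ? null : n;
}
function parseDate(val) {
  if (!val) return null;
  const m = String(val).trim().match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
  return m ? new Date(Date.UTC(+m[3], +m[2] - 1, +m[1])) : null;
}

console.log(parseNum('1 234,56'));                  // 1234.56
console.log(parseNum(''));                          // null
console.log(parseDate('08.05.2026').toISOString()); // 2026-05-08T00:00:00.000Z
console.log(parseDate('2026-05-08'));               // null (only DD.MM.YYYY accepted)
```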
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
Explorer Section: finance [SSH: nas]
FINANCE [SSH: NAS]
auth
dsk-uploader
finance-hub
backend
frontend
.env
.env.example
.gitignore
docker-compose.yml
payments-logger
docker-compose.yml, Editor Group 1
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that is a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; they will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (with a new folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. The UI should clearly mark whether a record came via upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
  UNPROCESSED
  SENT
  SKIPPED
}
enum Source {
  INGEST
  UPLOAD
}
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
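The single-table decision can be illustrated with a minimal sketch (plain objects standing in for Prisma rows; the values are made up):

```javascript
// One row shape for both sources; CSV-only columns stay null on INGEST rows,
// so the unified list view is a plain filter rather than a UNION of two tables.
const rows = [
  { source: 'INGEST', amount: 7.78, currency: 'EUR', debitBgn: null, transactionType: null },
  { source: 'UPLOAD', amount: 45.2, currency: 'BGN', debitBgn: 45.2, transactionType: 'КОМУНАЛНИ УСЛУГИ' },
];

const uploadsOnly = rows.filter(r => r.source === 'UPLOAD');
console.log(uploadsOnly.length);                              // 1
console.log(rows.every(r => 'amount' in r && 'currency' in r)); // true
```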
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
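The two regexes named above can be checked in isolation (token formats taken from the design notes; the sample strings are illustrative):

```javascript
// Masked card token at the start of Основание: 6 digits, 6 lowercase 'x', 4 digits.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
// POS rows that get normalized to "POS PAYMENT <date time>".
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;

console.log(CARD_REGEX.test('123456xxxxxx7890'));                    // true
console.log(CARD_REGEX.test('1234xxxxxx7890'));                      // false (short prefix)
console.log(POS_REGEX.test('ПЛАЩАНЕ НА ПОС 08.05.2026 12:30 LIDL')); // true
```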
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
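Change 2 amounts to a small addition to the `where` builder. A hedged sketch (the `buildWhere` helper is hypothetical, not the session's actual payments.js code):

```javascript
// Fold an optional ?source= query param into a Prisma-style where object,
// ignoring values outside the Source enum.
function buildWhere(query) {
  const where = {};
  if (query.status) where.status = query.status;
  if (query.source === 'INGEST' || query.source === 'UPLOAD') where.source = query.source;
  return where;
}

console.log(buildWhere({ source: 'UPLOAD' }));                 // { source: 'UPLOAD' }
console.log(buildWhere({ source: 'bogus' }));                  // {} (unknown values ignored)
console.log(buildWhere({ status: 'SENT', source: 'INGEST' })); // { status: 'SENT', source: 'INGEST' }
```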
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
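The middleware's behavior can be exercised standalone with mock req/res objects (the middleware body is repeated here so the snippet is self-contained):

```javascript
// Copied from auth.js above.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

const log = [];
const res = { status(c) { log.push(c); return this; }, json() { return this; } };
// Protected path without headers: 401 recorded, next never runs.
authentikMiddleware({ path: '/api/payments', headers: {} }, res, () => log.push('next'));
// Public path: next runs untouched.
authentikMiddleware({ path: '/api/health', headers: {} }, res, () => log.push('next'));
// Protected path with Authentik headers: req.user populated, groups trimmed.
const req = { path: '/api/payments', headers: { 'x-authentik-username': 'alice', 'x-authentik-groups': 'admins, finance' } };
authentikMiddleware(req, res, () => log.push('next'));

console.log(log);             // [ 401, 'next', 'next' ]
console.log(req.user.groups); // [ 'admins', 'finance' ]
```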
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
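A standalone sanity check of the two helpers above can be run outside the module. This is a condensed re-statement, not the module itself: the `localToUtc` body and the capture groups mirror parser.js, while the SMS text, card mask, merchant, and amounts are synthetic values invented for illustration.

```javascript
// Condensed copies of the parser.js helpers, exercised on synthetic data.
const LOCAL_TZ = 'Europe/Sofia';

function localToUtc(year, month, day, hour, minute) {
  // Guess: treat the local wall-clock time as if it were UTC...
  const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
  const fmt = new Intl.DateTimeFormat('en-US', {
    timeZone: LOCAL_TZ,
    year: 'numeric', month: '2-digit', day: '2-digit',
    hour: '2-digit', minute: '2-digit', second: '2-digit',
    hour12: false,
  });
  const p = {};
  fmt.formatToParts(naive).forEach(x => { p[x.type] = x.value; });
  // ...then measure how far the target timezone shifted that instant.
  const shifted = new Date(Date.UTC(
    +p.year, +p.month - 1, +p.day, +p.hour % 24, +p.minute, +p.second,
  ));
  return new Date(naive.getTime() - (shifted.getTime() - naive.getTime()));
}

// Same capture groups as parsePaymentSms, minus the date handling.
function parseLite(message) {
  const card = (message.match(/s karta\s+([\d*]+)/i) || [])[1] || null;
  const type = (message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i) || [])[1] || null;
  const recipient = ((message.match(/s adres:\s*([^.]+)\./i) || [])[1] || '').trim() || null;
  const amt = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
  const bal = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
  return {
    card,
    type: type && type.toUpperCase(),
    recipient,
    amount: amt ? parseFloat(amt[1].replace(',', '.')) : null,
    balance: bal ? parseFloat(bal[1].replace(',', '.')) : null,
  };
}

// DST check: Sofia is UTC+2 in January, UTC+3 in July.
console.log(localToUtc(2026, 1, 15, 12, 0).toISOString()); // 2026-01-15T10:00:00.000Z
console.log(localToUtc(2026, 7, 15, 12, 0).toISOString()); // 2026-07-15T09:00:00.000Z

// Synthetic POS message in the documented format.
const sms = 'DSK Bank. Na 15/01/2026 v 12:00 sa plateni 7.78 EUR s karta ' +
  '400915***4447 na POS s adres: EXAMPLE MARKET. Nalichni: 2583.07 EUR.';
console.log(parseLite(sms));
// { card: '400915***4447', type: 'POS', recipient: 'EXAMPLE MARKET',
//   amount: 7.78, balance: 2583.07 }
```

Note that the `(?:na|ot)` type regex does not falsely match the leading "Na DD/MM/YYYY" because the alternation requires one of the known type keywords after the preposition.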
Write
csvParser.js
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
// Guard against a missing "ПОС " separator: String.split never throws,
// so an undefined [1] would otherwise yield "POS PAYMENT undefined".
const dateTime = (posParts[0].split('ПОС ')[1] || '').trim();
if (dateTime) cleanReason = `POS PAYMENT ${dateTime}`;
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
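The row-normalization rules above can be checked in isolation. The snippet below is a condensed re-statement of `parseNum`, the card split, and a subset of the tag heuristics from csvParser.js; the row values are synthetic, invented for illustration, and not a real statement row.

```javascript
// Condensed copies of the csvParser.js helpers, run on synthetic values.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const TAG_RULES = [
  ['reason', 'ЗАПЛАТА', 'Salary'],
  ['payee', 'LIDL', 'Groceries'],
  ['payee', 'NETFLIX', 'Subscriptions'],
];

// Bulgarian exports use comma decimals and may contain NBSP thousands separators.
function parseNum(val) {
  if (val == null || val === '') return null;
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return isNaN(n) ? null : n;
}

// First token of "Основание" may be a masked card number (6 digits, 6 x's, 4 digits).
function splitReasonAndCard(reason) {
  const parts = reason.trim().split(' ');
  return CARD_REGEX.test(parts[0])
    ? { card: parts[0], reason: parts.slice(1).join(' ').trim() }
    : { card: null, reason: reason.trim() };
}

function generateTags(fields) {
  const tags = new Set();
  for (const [field, keyword, tag] of TAG_RULES) {
    if ((fields[field] || '').includes(keyword)) tags.add(tag);
  }
  return [...tags];
}

const { card, reason } = splitReasonAndCard('123456xxxxxx7890 ПОКУПКА LIDL SOFIA');
console.log(card);                                             // 123456xxxxxx7890
console.log(generateTags({ reason, payee: 'LIDL BULGARIA' })); // [ 'Groceries' ]
console.log(parseNum('1 234,56'));                             // 1234.56
```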
Write
index.js
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js...
FINANCE [SSH: NAS]
auth
dsk-uploader
finance-hub
backend
frontend
.env
.env.example
.gitignore
docker-compose.yml
payments-logger
docker-compose.yml, Editor Group 1
.env, Editor Group 1
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that is a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (also a new folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db, and the uploader should store data the same way the /ingest endpoint does. The UI should clearly mark whether a record came from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
schema.prisma
schema.prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
UNPROCESSED
SENT
SKIPPED
}
enum Source {
INGEST
UPLOAD
}
Write
migration.sql
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
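For reference, a minimal sketch of how the middleware above behaves when exercised with stub `req`/`res` objects (the middleware body is reproduced so the snippet is self-contained; the header values and stub shapes are made up for illustration):

```javascript
// Reproduce the middleware so this sketch runs standalone.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '')
      .split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

// Authenticated request: NPM has already injected the X-authentik-* headers.
const req = {
  path: '/api/payments',
  headers: { 'x-authentik-username': 'alice', 'x-authentik-groups': 'admins, finance' },
};
const res = { status: () => ({ json: () => {} }) }; // stub, never reached here
authentikMiddleware(req, res, () => {});
console.log(req.user); // { username: 'alice', email: null, groups: ['admins', 'finance'] }
```

A request without the username header would instead hit the 401 branch, and any path in `PUBLIC_PATHS` bypasses the check entirely.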
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
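A quick sketch of the core regexes above applied to a sample message (the SMS text is illustrative, not a real bank notification):

```javascript
// Apply the parser's regexes to a made-up DSK-style SMS.
const msg = 'DSK Bank. Na 09/05/2025 v 09:41 sa plateni 7.78 EUR s karta 400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

// Amount: matches "sa plateni <amount> <CCY>", normalizing comma decimals.
const amount = parseFloat(msg.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i)[1].replace(',', '.'));
// Card mask: digits and asterisks after "s karta".
const card = msg.match(/s karta\s+([\d*]+)/i)[1];
// Transaction type: "na POS" / "ot ATM" etc.
const type = msg.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i)[1].toUpperCase();
// Recipient: everything after "s adres:" up to the next dot.
const recipient = msg.match(/s adres:\s*([^.]+)\./i)[1].trim();

console.log({ amount, card, type, recipient });
// { amount: 7.78, card: '400915***4447', type: 'POS', recipient: 'LIDL SOFIA' }
```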
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
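The card-extraction step of `processReasonAndCard` above can be sketched in isolation (the `Основание` value below is made up):

```javascript
// If the first whitespace-separated token of the reason matches the DSK card
// mask (6 digits, 6 literal "x", 4 digits), it is lifted out as the card number.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

function extractCard(reason) {
  const parts = reason.trim().split(' ');
  return CARD_REGEX.test(parts[0])
    ? { card: parts[0], reason: parts.slice(1).join(' ').trim() }
    : { card: null, reason: reason.trim() };
}

console.log(extractCard('516875xxxxxx1234 ПЛАЩАНЕ НА ПОС'));
// { card: '516875xxxxxx1234', reason: 'ПЛАЩАНЕ НА ПОС' }
console.log(extractCard('ЗАПЛАТА МАЙ'));
// { card: null, reason: 'ЗАПЛАТА МАЙ' }
```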
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that is a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate, it lives on its own. First review them and see how they should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. It should be properly marked in the UI if a record came from upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
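The optional source filter implied by these decisions can be sketched as a small `where`-builder for the unified list query (the function name and the `'ALL'` sentinel are assumptions, not taken from the reference apps):

```javascript
// Build a Prisma-style `where` object from the UI's filter selections.
// Field names (`source`, `status`) come from the schema above.
function buildWhere({ source, status }) {
  const where = {};
  if (source && source !== 'ALL') where.source = source; // 'INGEST' | 'UPLOAD'
  if (status && status !== 'ALL') where.status = status; // 'UNPROCESSED' | 'SENT' | 'SKIPPED'
  return where;
}

console.log(buildWhere({ source: 'UPLOAD', status: 'ALL' })); // { source: 'UPLOAD' }
console.log(buildWhere({ source: 'ALL', status: 'SENT' }));   // { status: 'SENT' }
```

Because INGEST rows simply carry nulls in the CSV-only columns, no union query is needed; one `findMany({ where })` serves both sources.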
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
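The numeric normalization these bullets rely on (mirroring `parseNum` in csvParser.js) handles non-breaking-space thousands separators and comma decimals; the sample values are illustrative:

```javascript
// Normalize a DSK CSV amount cell: strip NBSP and spaces, comma → dot.
function parseNum(val) {
  if (val == null || val === '') return null;
  if (typeof val === 'number') return isNaN(val) ? null : val;
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return isNaN(n) ? null : n;
}

console.log(parseNum('1\u00a0234,56')); // 1234.56
console.log(parseNum(''));              // null
```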
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
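The badge mapping described for PaymentTable.jsx might look like the following (the function name and Tailwind class strings are assumptions sketching the indigo/emerald badges above):

```javascript
// Map a payment's source to its UI badge (labels per the plan: SMS / CSV).
function sourceBadge(source) {
  return source === 'INGEST'
    ? { label: 'SMS', className: 'bg-indigo-100 text-indigo-700' }
    : { label: 'CSV', className: 'bg-emerald-100 text-emerald-700' };
}

console.log(sourceBadge('INGEST').label); // SMS
console.log(sourceBadge('UPLOAD').label); // CSV
```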
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
schema.prisma
generator client {
  provider = "prisma-client-js"
}
datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}
model Payment {
  id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
UNPROCESSED
SENT
SKIPPED
}
enum Source {
INGEST
UPLOAD
}
Write
migration.sql (55 lines)
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml (4 lines)
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json (26 lines)
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile (18 lines)
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Write
auth.js (27 lines)
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
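The middleware can be smoke-tested without Express by calling it with plain objects; a minimal sketch (the IIFE below inlines a trimmed copy of the module purely so the demo is self-contained):

```javascript
// Exercise the Authentik header middleware with mock req/res objects:
// a public path passes through, a missing header yields 401, a header yields next().
const { authentikMiddleware } = (() => {
  const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
  function authentikMiddleware(req, res, next) {
    if (PUBLIC_PATHS.has(req.path)) return next();
    const username = req.headers['x-authentik-username'];
    if (!username) return res.status(401).json({ error: 'Unauthorized' });
    req.user = { username };
    next();
  }
  return { authentikMiddleware };
})();

function run(path, headers) {
  let out = 'next'; // stays 'next' unless res.status() is called
  const res = { status(code) { out = code; return this; }, json() { return this; } };
  authentikMiddleware({ path, headers }, res, () => {});
  return out;
}

console.log(run('/api/health', {}));                                    // next
console.log(run('/api/payments', {}));                                  // 401
console.log(run('/api/payments', { 'x-authentik-username': 'alice' })); // next
```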
Write
parser.js (104 lines)
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
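For reference, the parser's individual regexes can be exercised against a sample message. The message below is invented, following the POS format documented in the file's header comment:

```javascript
// Invented sample following the documented POS format; runs the same regexes
// parser.js uses for card, type, recipient, amount, and balance.
const msg = 'DSK Bank. Na 08/05/2026 v 09:41 sa plateni 7.78 EUR s karta ' +
  '400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

const card = msg.match(/s karta\s+([\d*]+)/i)[1];
const type = msg.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i)[1];
const recipient = msg.match(/s adres:\s*([^.]+)\./i)[1].trim();
const amount = parseFloat(msg.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i)[1]);
const balance = parseFloat(msg.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i)[1]);

console.log(card, type, recipient, amount, balance);
// 400915***4447 POS LIDL SOFIA 7.78 2583.07
```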
Write
csvParser.js (175 lines)
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
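The card-extraction and tag heuristics can be demonstrated in isolation; a sketch on an invented `Основание` value (format per `CARD_REGEX`: six digits, six `x`'s, four digits):

```javascript
// Extract the card token from the first word of the reason field, then
// apply keyword rules in the same [field, keyword, tag] shape as TAG_RULES.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const reason = '415482xxxxxx1234 ПЛАЩАНЕ НА ПОС LIDL SOFIA';
const parts = reason.split(' ');
const card = CARD_REGEX.test(parts[0]) ? parts[0] : null;
const cleanReason = card ? parts.slice(1).join(' ') : reason;

const rules = [['payee', 'LIDL', 'Groceries'], ['reason', 'ЗАПЛАТА', 'Salary']];
const fields = { reason: cleanReason, payee: 'LIDL BULGARIA' };
const tags = rules
  .filter(([field, keyword]) => (fields[field] || '').includes(keyword))
  .map(rule => rule[2]);

console.log(card, tags); // 415482xxxxxx1234 [ 'Groceries' ]
```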
docker-compose.yml
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
payments.js
Let's create a new app that combines payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; they will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (with a new folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db, and the uploader should store data the same way /ingest does. It should be properly marked in the UI whether a record is from upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
payments.js
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Write
ets-create-a-new-generic-sun.md (326 lines)
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
  UNPROCESSED
  SENT
  SKIPPED
}
enum Source {
  INGEST
  UPLOAD
}
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
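The new source filter on the list route amounts to one extra condition in the Prisma where clause; a minimal sketch, with assumed query parameter names:

```javascript
// Sketch of the where clause GET /api/payments might build once `source`
// is filterable alongside the existing status and tag filters.
function buildWhere(query) {
  const where = {};
  if (query.status) where.status = query.status;
  if (query.source) where.source = query.source; // 'INGEST' | 'UPLOAD'
  if (query.tag) where.tags = { some: { name: query.tag } };
  return where;
}

console.log(buildWhere({ source: 'UPLOAD', tag: 'Groceries' }));
// { source: 'UPLOAD', tags: { some: { name: 'Groceries' } } }
```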
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
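The card-mask and skip rules above can be checked in isolation (the row values below are invented):

```javascript
// Card mask: 6 digits + 6 literal 'x' + 4 digits at the start of Основание.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
// Internal transfers between own accounts are dropped entirely.
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';

const reason = '516875xxxxxx1234 ПЛАЩАНЕ НА ПОС';
const first = reason.split(' ')[0];
const card = CARD_REGEX.test(first) ? first : null; // '516875xxxxxx1234'

const shouldSkip = (type) => type.trim() === SKIP_TYPE;
```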
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
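Item 4 could look like the following hypothetical helper (the real notification template lives in payments.js); it falls back to EUR, matching the schema default:

```javascript
// Hypothetical: format an amount with its row currency for the notification text.
function formatAmount(amount, currency) {
  if (amount == null) return 'n/a';
  return `${amount.toFixed(2)} ${currency || 'EUR'}`;
}

// formatAmount(7.78, 'EUR') → '7.78 EUR'; formatAmount(42.5, 'BGN') → '42.50 BGN'
```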
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
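A hypothetical shape for the per-row create payload; it uses Prisma's `connectOrCreate` as an alternative to the `tag.upsert`-then-connect flow described above (same end result, one fewer round-trip per tag):

```javascript
// Hypothetical: map a parsed CSV row to a prisma.payment.create() data object,
// upserting tags by unique name inline via connectOrCreate.
function toCreateInput(row) {
  const { autoTags = [], ...data } = row;
  return {
    ...data,
    tags: {
      connectOrCreate: autoTags.map((name) => ({
        where: { name },
        create: { name },
      })),
    },
  };
}
```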
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM (Nginx Proxy Manager) setup:** Frontend port proxied through Authentik forward auth. The backend port is exposed separately (unprotected) so the public `/api/payments/ingest` stays reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write schema.prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
UNPROCESSED
SENT
SKIPPED
}
enum Source {
INGEST
UPLOAD
}
Write migration.sql (55 lines)
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write migration_lock.toml (4 lines)
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write package.json (26 lines)
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write Dockerfile (18 lines)
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write auth.js (27 lines)
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
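The middleware can be exercised without Express by stubbing `req`/`res` (the public-path check is omitted for brevity; the body otherwise mirrors auth.js above):

```javascript
// Sketch: the header-trust pattern from auth.js, run against a stubbed request.
function authentikMiddleware(req, res, next) {
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '')
      .split(',').map((g) => g.trim()).filter(Boolean),
  };
  next();
}

const req = { headers: { 'x-authentik-username': 'alice', 'x-authentik-groups': 'admins, finance' } };
authentikMiddleware(req, { status: () => ({ json: () => {} }) }, () => {});
// req.user.groups → ['admins', 'finance']
```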
Write parser.js (104 lines)
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
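Running the parser's key regexes against a message in the documented POS format shows what each capture yields (the date, amount, card mask, and merchant below are invented):

```javascript
// Sample message in the POS format documented above; all values are made up.
const msg = 'DSK Bank. Na 09/05/2026 v 09:41 sa plateni 7.78 EUR s karta 400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

const amount = parseFloat(msg.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i)[1].replace(',', '.'));
const card = msg.match(/s karta\s+([\d*]+)/i)[1];
const type = msg.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i)[1].toUpperCase();
const recipient = msg.match(/s adres:\s*([^.]+)\./i)[1].trim();
// amount === 7.78, card === '400915***4447', type === 'POS', recipient === 'LIDL SOFIA'
```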
Write csvParser.js (175 lines)
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write index.js (39 lines)
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write payments.js
docker-compose.yml
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate, it lives on its own. First review them and see how these should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db and the uploader should store data the same way the /ingest does. It should be properly marked in the UI if it is upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write ets-create-a-new-generic-sun.md (326 lines)
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
  provider = "prisma-client-js"
}

datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

model Payment {
  id              Int       @id @default(autoincrement())
  rawMessage      String    @map("raw_message")
  date            DateTime?
  type            String?
  card            String?
  recipient       String?
  amount          Float?
  currency        String?   @default("EUR")
  balance         Float?
  source          Source    @default(INGEST)
  status          Status    @default(UNPROCESSED)
  tags            Tag[]
  notifiedAt      DateTime? @map("notified_at")
  notifyPhone     String?   @map("notify_phone")
  debitBgn        Float?    @map("debit_bgn")
  creditBgn       Float?    @map("credit_bgn")
  transactionType String?   @map("transaction_type")
  payerAccount    String?   @map("payer_account")
  createdAt       DateTime  @default(now()) @map("created_at")
  updatedAt       DateTime  @updatedAt @map("updated_at")

  @@map("payments")
}

model Tag {
  id       Int       @id @default(autoincrement())
  name     String    @unique
  color    String    @default("#6b7280")
  payments Payment[]

  @@map("tags")
}

enum Status {
  UNPROCESSED
  SENT
  SKIPPED
}

enum Source {
  INGEST
  UPLOAD
}
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);

function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

module.exports = { authentikMiddleware };
```
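The middleware is plain enough to exercise without Express. A minimal harness with stub `req`/`res` objects (the `run` helper is hypothetical, for demonstration only) shows the three behaviours: public paths pass through, a missing header yields 401, and a present header populates `req.user` with trimmed groups:

```javascript
// Middleware inlined verbatim from above so the harness is self-contained.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

// Hypothetical test helper: stub req/res, record whether next() was called.
function run(path, headers) {
  const req = { path, headers };
  const res = {
    status(code) { this.code = code; return this; },
    json(body) { this.body = body; return this; },
  };
  let called = false;
  authentikMiddleware(req, res, () => { called = true; });
  return { req, res, called };
}

const ok = run('/api/payments', { 'x-authentik-username': 'alice', 'x-authentik-groups': 'finance, admin' });
console.log(ok.req.user.groups); // ['finance', 'admin']

const anon = run('/api/payments', {});
console.log(anon.res.code); // 401
```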
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
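Two of those rules can be sketched in a few lines — the card-mask extraction from the first token of `Основание`, and the keyword-based auto-tagging (only a subset of the rule table is shown here; the field names mirror the plan):

```javascript
// Card extraction: DSK masks cards as 6 digits, 6 "x", 4 digits.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

function extractCard(reason) {
  const first = reason.trim().split(' ')[0];
  return CARD_REGEX.test(first) ? first : null;
}

// Subset of the keyword rules: [field, keyword, tag].
const TAG_RULES = [
  ['reason', 'ЗАПЛАТА', 'Salary'],
  ['payee', 'LIDL', 'Groceries'],
  ['payee', 'NETFLIX', 'Subscriptions'],
];

function generateTags(fields) {
  const tags = new Set();
  for (const [field, keyword, tag] of TAG_RULES) {
    if ((fields[field] || '').includes(keyword)) tags.add(tag);
  }
  return [...tags];
}

console.log(extractCard('516875xxxxxx1234 ПЛАЩАНЕ НА ПОС')); // '516875xxxxxx1234'
console.log(generateTags({ reason: 'ЗАПЛАТА МАЙ', payee: 'LIDL SOFIA' })); // ['Salary', 'Groceries']
```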
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
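Change 2 amounts to folding an optional `source` query parameter into the Prisma `where` object. A hedged sketch (`buildWhere` is a hypothetical helper; the exact filter fields in payments-logger may differ):

```javascript
// Hypothetical helper: translate query-string filters into a Prisma where object.
function buildWhere(query) {
  const where = {};
  if (query.status) where.status = query.status;
  if (query.source) where.source = query.source; // 'INGEST' | 'UPLOAD'
  if (query.search) {
    where.OR = [
      { recipient: { contains: query.search, mode: 'insensitive' } },
      { rawMessage: { contains: query.search, mode: 'insensitive' } },
    ];
  }
  return where;
}

console.log(buildWhere({ source: 'UPLOAD', status: 'UNPROCESSED' }));
// { status: 'UNPROCESSED', source: 'UPLOAD' }
```

Omitting `source` leaves the clause out entirely, so the unified list view stays the default.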
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
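Since `parseDskCsv` returns `{ rows, skipped, errors }` per file, the route mostly needs to fold the per-file results into the response shape. A sketch of that aggregation step (the `summarize` helper is hypothetical):

```javascript
// Hypothetical helper: aggregate per-file parser results into the
// { imported, skipped, errors } summary the route returns.
function summarize(fileResults) {
  return fileResults.reduce(
    (acc, r) => ({
      imported: acc.imported + r.rows.length,
      skipped: acc.skipped + r.skipped,
      errors: acc.errors.concat(r.errors),
    }),
    { imported: 0, skipped: 0, errors: [] }
  );
}

const summary = summarize([
  { rows: [{}, {}], skipped: 1, errors: [] },
  { rows: [{}], skipped: 0, errors: ['bad row 7'] },
]);
console.log(summary); // { imported: 3, skipped: 1, errors: ['bad row 7'] }
```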
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
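The request body UploadPanel sends is a multipart `FormData` with one part per selected CSV. A sketch of that shape (`FormData` and `Blob` are browser globals, also available in Node 18+; the field name `files` matches the multer setup above):

```javascript
// One "files" part per selected CSV; filenames are illustrative.
const form = new FormData();
const selected = [new Blob(['col1,col2\n'], { type: 'text/csv' })];
selected.forEach((f, i) => form.append('files', f, `export-${i}.csv`));

// In the component this becomes:
//   fetch('/api/upload/csv', { method: 'POST', body: form })
console.log(form.getAll('files').length); // 1
```

Note that no `Content-Type` header is set manually — the runtime adds the multipart boundary itself.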
---
## Docker Compose
```yaml
services:
  db:
    image: postgres:16-alpine
    environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
    volumes: [pgdata:/var/lib/postgresql/data]
    healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
  backend:
    build: ./backend
    environment:
      DATABASE_URL: [CONNECTION_STRING]
      PORT: "3001"
      NOTIFIER_URL: ${NOTIFIER_URL}
      NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
      NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
      TZ: ${TZ:-Europe/Sofia}
    ports: ["${BACKEND_PORT:-3001}:3001"]
    depends_on: { db: { condition: service_healthy } }
  frontend:
    build: ./frontend
    ports: ["${FRONTEND_PORT:-5175}:5173"]
    depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=[PHONE]
TZ=Europe/Sofia
BACKEND_PORT=3001
FRONTEND_PORT=5175
```
Write migration.sql
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
  "id" SERIAL PRIMARY KEY,
  "raw_message" TEXT NOT NULL,
  "date" TIMESTAMP(3),
  "type" TEXT,
  "card" TEXT,
  "recipient" TEXT,
  "amount" DOUBLE PRECISION,
  "currency" TEXT DEFAULT 'EUR',
  "balance" DOUBLE PRECISION,
  "source" "Source" NOT NULL DEFAULT 'INGEST',
  "status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
  "notified_at" TIMESTAMP(3),
  "notify_phone" TEXT,
  "debit_bgn" DOUBLE PRECISION,
  "credit_bgn" DOUBLE PRECISION,
  "transaction_type" TEXT,
  "payer_account" TEXT,
  "created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
  "updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);

-- CreateTable
CREATE TABLE "tags" (
  "id" SERIAL PRIMARY KEY,
  "name" TEXT NOT NULL,
  "color" TEXT NOT NULL DEFAULT '#6b7280'
);

-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");

-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
  "A" INTEGER NOT NULL,
  "B" INTEGER NOT NULL,
  CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);

CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");

-- AddForeignKey
ALTER TABLE "_PaymentToTag"
  ADD CONSTRAINT "_PaymentToTag_A_fkey"
  FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;

ALTER TABLE "_PaymentToTag"
  ADD CONSTRAINT "_PaymentToTag_B_fkey"
  FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write migration_lock.toml
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write package.json
{
  "name": "finance-hub-backend",
  "version": "1.0.0",
  "main": "src/index.js",
  "scripts": {
    "start": "node src/index.js",
    "dev": "nodemon src/index.js",
    "prisma:generate": "prisma generate",
    "prisma:migrate": "prisma migrate deploy"
  },
  "dependencies": {
    "@prisma/client": "^5.22.0",
    "cors": "^2.8.5",
    "csv-parse": "^5.5.6",
    "express": "^4.21.1",
    "express-rate-limit": "^7.4.0",
    "iconv-lite": "^0.6.3",
    "morgan": "^1.10.0",
    "multer": "^1.4.5-lts.1"
  },
  "devDependencies": {
    "nodemon": "^3.1.7",
    "prisma": "^5.22.0"
  }
}
Write Dockerfile
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Write auth.js
const PUBLIC_PATHS = new Set([
  '/api/health',
  '/api/payments/ingest',
]);

function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) {
    return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
  }
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '')
      .split(',')
      .map(g => g.trim())
      .filter(Boolean),
  };
  next();
}
module.exports = { authentikMiddleware };
Write parser.js
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
  const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
  const formatter = new Intl.DateTimeFormat('en-US', {
    timeZone: LOCAL_TZ,
    year: 'numeric', month: '2-digit', day: '2-digit',
    hour: '2-digit', minute: '2-digit', second: '2-digit',
    hour12: false,
  });
  const parts = {};
  formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
  const localAtNaive = new Date(Date.UTC(
    parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
    parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
  ));
  const offsetMs = localAtNaive.getTime() - naive.getTime();
  return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
  const result = {
    rawMessage: message,
    date: null,
    type: null,
    card: null,
    recipient: null,
    amount: null,
    balance: null,
  };

  // Date and time: "Na DD/MM/YYYY v HH:MM"
  const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
  if (dateMatch) {
    const [, day, month, year, hour, minute] = dateMatch;
    result.date = localToUtc(
      parseInt(year), parseInt(month), parseInt(day),
      parseInt(hour), parseInt(minute),
    );
  }

  // Card mask: "s karta 400915***4447" or "s karta 483890***7162"
  const cardMatch = message.match(/s karta\s+([\d*]+)/i);
  if (cardMatch) {
    result.card = cardMatch[1];
  }

  // Transaction type: supports both prepositions
  //   "na POS" / "na ATM" / "na INTERNET" etc. (payment)
  //   "ot ATM" (withdrawal)
  const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
  if (typeMatch) {
    result.type = typeMatch[1].toUpperCase();
  }

  // Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
  const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
  if (recipientMatch) {
    result.recipient = recipientMatch[1].trim();
  }

  // Amount: handles both verbs and the AMOUNT/FEE suffix format
  //   "sa plateni 7.78 EUR"
  //   "sa iztegleni 400.00 EUR"
  //   "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
  const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
  if (amountMatch) {
    result.amount = parseFloat(amountMatch[1].replace(',', '.'));
  }

  // Balance: "Nalichni: 2583.07 EUR."
  const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
  if (balanceMatch) {
    result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
  }

  return result;
}
module.exports = { parsePaymentSms };
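A quick sanity check of the regexes above against a fabricated SMS in the documented DSK format (the amounts, card mask, and merchant are illustrative, not real data):

```javascript
// Fabricated sample in the documented DSK SMS format.
const sms = 'DSK Bank. Na 08/05/2026 v 14:32 sa plateni 7.78 EUR s karta 400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

// The same patterns parsePaymentSms uses, applied directly:
const amount = parseFloat(sms.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i)[1].replace(',', '.'));
const card = sms.match(/s karta\s+([\d*]+)/i)[1];
const type = sms.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i)[1].toUpperCase();
const recipient = sms.match(/s adres:\s*([^.]+)\./i)[1].trim();

console.log({ amount, card, type, recipient });
// { amount: 7.78, card: '400915***4447', type: 'POS', recipient: 'LIDL SOFIA' }
```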
Write csvParser.js
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
  DATE: 'Дата',
  TYPE: 'Вид на трансакцията',
  REASON: 'Основание',
  DEBIT: 'Дебит BGN',
  CREDIT: 'Кредит BGN',
  PAYEE: 'Наредител/Получател',
  ACCT: 'Номер сметка на наредителя / получателя',
};

const TAG_RULES = [
  ['reason', 'ЗАПЛАТА', 'Salary'],
  ['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
  ['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
  ['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
  ['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
  ['payee', 'VIVACOM', 'Subscriptions'],
  ['payee', 'Google', 'Subscriptions'],
  ['payee', 'SkyShowtime', 'Subscriptions'],
  ['payee', 'NETFLIX', 'Subscriptions'],
  ['payee', 'LUKOIL', 'Bills'],
  ['payee', 'CityGate', 'Bills'],
  ['payee', 'CBA', 'Groceries'],
  ['payee', 'FANTASTICO', 'Groceries'],
  ['payee', 'LIDL', 'Groceries'],
];

function parseNum(val) {
  if (val == null || val === '') return null;
  if (typeof val === 'number') return isNaN(val) ? null : val;
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return isNaN(n) ? null : n;
}

function parseDate(val) {
  if (!val) return null;
  const s = String(val).trim();
  const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
  if (m) {
    return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
  }
  return null;
}

function processReasonAndCard(reason) {
  if (!reason || typeof reason !== 'string') return { reason: '', card: null };
  const parts = reason.trim().split(' ');
  let card = null;
  let cleanReason = reason.trim();
  if (parts[0] && CARD_REGEX.test(parts[0])) {
    card = parts[0];
    cleanReason = parts.slice(1).join(' ').trim();
  }
  if (POS_REGEX.test(cleanReason)) {
    const posParts = cleanReason.split('<br/>');
    try {
      const dateTime = posParts[0].split('ПОС ')[1];
      cleanReason = `POS PAYMENT ${dateTime}`;
    } catch (_) { /* keep original */ }
  }
  return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}

function generateTags(fields) {
  const tags = new Set();
  for (const [field, keyword, tagName] of TAG_RULES) {
    if ((fields[field] || '').includes(keyword)) {
      tags.add(tagName);
    }
  }
  return Array.from(tags);
}

function processRow(row) {
  const transactionType = (row[COL.TYPE] || '').trim();
  if (transactionType === SKIP_TYPE) return null;
  const { reason, card } = processReasonAndCard(row[COL.REASON]);
  const payee = (row[COL.PAYEE] || '').trim();
  const payerAccount = (row[COL.ACCT] || '').trim();
  const debitBgn = parseNum(row[COL.DEBIT]);
  const creditBgn = parseNum(row[COL.CREDIT]);
  const date = parseDate(row[COL.DATE]);
  const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
  const amount = debitBgn ?? creditBgn ?? null;
  const rawMessage = [
    row[COL.DATE] && `Date: ${row[COL.DATE]}`,
    transactionType && `Type: ${transactionType}`,
    payee && `Payee: ${payee}`,
    debitBgn != null && `Debit: ${debitBgn} BGN`,
    creditBgn != null && `Credit: ${creditBgn} BGN`,
  ].filter(Boolean).join(' | ');
  return {
    rawMessage,
    date,
    type: null,
    card,
    recipient: payee || null,
    amount,
    currency: 'BGN',
    balance: null,
    source: 'UPLOAD',
    debitBgn,
    creditBgn,
    transactionType: transactionType || null,
    payerAccount: payerAccount || null,
    autoTags,
  };
}

/**
 * Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
 * Returns { rows, skipped, errors }.
 */
async function parseDskCsv(buffer) {
  // Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
  let text = iconv.decode(buffer, 'cp1251');
  if (!text.includes(COL.DATE)) {
    text = buffer.toString('utf-8');
  }
  return new Promise((resolve, reject) => {
    const rows = [];
    const errors = [];
    let skipped = 0;
    const parser = parse(text, {
      columns: true,
      skip_empty_lines: true,
      trim: true,
      relax_column_count: true,
    });
    parser.on('readable', () => {
      let record;
      while ((record = parser.read()) !== null) {
        try {
          const row = processRow(record);
          if (row === null) { skipped++; } else { rows.push(row); }
        } catch (err) {
          errors.push(err.message);
        }
      }
    });
    parser.on('error', reject);
    parser.on('end', () => resolve({ rows, skipped, errors }));
  });
}
module.exports = { parseDskCsv };
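The two normalizers in this file are worth exercising on typical DSK CSV cell values — non-breaking spaces as thousands separators, comma decimals, and DD.MM.YYYY dates (both functions are copied verbatim from above so the snippet stands alone):

```javascript
// Numeric normalizer: strips NBSP/space separators, comma decimal → dot.
function parseNum(val) {
  if (val == null || val === '') return null;
  if (typeof val === 'number') return isNaN(val) ? null : val;
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return isNaN(n) ? null : n;
}

// Date normalizer: strict DD.MM.YYYY → UTC midnight, else null.
function parseDate(val) {
  if (!val) return null;
  const m = String(val).trim().match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
  return m ? new Date(Date.UTC(+m[3], +m[2] - 1, +m[1])) : null;
}

console.log(parseNum('1\xa0234,56')); // 1234.56
console.log(parseDate('08.05.2026').toISOString()); // '2026-05-08T00:00:00.000Z'
```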
Write index.js
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
  windowMs: 60 * 1000,
  max: 200,
  standardHeaders: true,
  legacyHeaders: false,
  message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);

// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);

app.get('/api/health', (_req, res) => {
  res.json({ status: 'ok', timestamp: new Date().toISOString() });
});

app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);

app.listen(PORT, '[IP_ADDRESS]', () => {
  console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Write payments.js
docker-compose.yml, Editor Group 1
version: '3.8'
services:
  db:
    image: postgres:16-alpine
    restart: unless-stopped
    environment:
      POSTGRES_USER: finance
      POSTGRES_PASSWORD: [PASSWORD]
      POSTGRES_DB: finance_hub
    volumes:
      - pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
      interval: 5s
      timeout: 5s
      retries: 5
  backend:
    build: ./backend
    restart: unless-stopped
    environment:
      DATABASE_URL: [CONNECTION_STRING]
      PORT: "3001"
      NOTIFIER_URL: ${NOTIFIER_URL}
      NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
      NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
      TZ: ${TZ:-Europe/Sofia}
    ports:
      - "${BACKEND_PORT:-3001}:3001"
    depends_on:
      db:
        condition: service_healthy
  frontend:
    build: ./frontend
    restart: unless-stopped
    ports:
      - "${FRONTEND_PORT:-5175}:5173"
    depends_on:
      - backend
volumes:
  pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate; it lives on its own. First review them and see how they should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. It should be properly marked in the UI whether it is upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
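Requirement 5's "BOTH" badge presupposes some way to match an SMS-ingested row against the same transaction from a CSV upload. A purely illustrative matching key, assuming date and amount are the join fields (real matching would also need EUR/BGN conversion and a tolerance window — none of that is attempted here):

```javascript
// Illustrative only: pair INGEST and UPLOAD rows that share a calendar date and amount.
function dupKey(payment) {
  const day = payment.date ? payment.date.toISOString().slice(0, 10) : 'unknown';
  const amount = payment.amount != null ? payment.amount.toFixed(2) : 'unknown';
  return `${day}|${amount}`;
}

const a = { date: new Date(Date.UTC(2026, 4, 9)), amount: 7.78, source: 'INGEST' };
const b = { date: new Date(Date.UTC(2026, 4, 9)), amount: 7.78, source: 'UPLOAD' };
console.log(dupKey(a) === dupKey(b)); // true → candidate for a "BOTH" badge
```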
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
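As a sketch of how the `source` field plugs into the unified list query, a where-clause builder for the Prisma `findMany` call (field names follow the schema above; the query-parameter names are assumptions, not confirmed route code):

```javascript
// Builds a Prisma `where` object for GET /api/payments from query params.
// Param names (source, status, tag) are illustrative assumptions.
function buildWhere(query) {
  const where = {};
  if (query.source && query.source !== 'ALL') where.source = query.source; // 'INGEST' | 'UPLOAD'
  if (query.status && query.status !== 'ALL') where.status = query.status;
  if (query.tag) where.tags = { some: { name: query.tag } }; // M2M relation filter
  return where;
}

console.log(buildWhere({ source: 'UPLOAD', status: 'ALL', tag: 'Groceries' }));
// { source: 'UPLOAD', tags: { some: { name: 'Groceries' } } }
```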
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
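For point 4, a minimal currency-aware formatter; this is a sketch under the assumption that the notification text embeds "amount currency" verbatim — the real template is not shown in this plan:

```javascript
// Sketch only: format a payment amount with its stored currency for notifications.
// Falls back to EUR when currency is null (the schema default for INGEST rows).
function formatAmount(amount, currency) {
  if (amount == null) return 'unknown amount';
  return `${amount.toFixed(2)} ${currency || 'EUR'}`;
}

console.log(formatAmount(7.78, 'EUR'));  // "7.78 EUR"
console.log(formatAmount(120.5, 'BGN')); // "120.50 BGN"
console.log(formatAmount(3.14, null));   // "3.14 EUR"
```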
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=
TZ=Europe/Sofia
BACKEND_PORT=3001
FRONTEND_PORT=5175
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
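A quick standalone smoke check of the amount and balance regexes used in the parser above. The sample SMS is invented, following the documented message format:

```javascript
// Invented sample following the documented DSK SMS format; not a real message.
const sms = 'DSK Bank. Na 09/05/2026 v 09:41 sa plateni 7.78 EUR s karta 400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

// Same patterns as parsePaymentSms uses for amount and balance
const amountMatch = sms.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
const balanceMatch = sms.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);

console.log(parseFloat(amountMatch[1].replace(',', '.')));  // 7.78
console.log(parseFloat(balanceMatch[1].replace(',', '.'))); // 2583.07
```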
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files...
docker-compose.yml
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
payments.js
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader, and auth) are just references; these will be removed later. The auth project is separate; it lives on its own. First review them and see how they should be combined. It will be a whole new app (with its own folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. It should be properly marked in the UI whether a payment is upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
Message actions
payments.js
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
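To make the "no union query" point concrete, the source filter can feed straight into a Prisma-style `where` object. This is an illustrative sketch, not the actual route code; `buildWhere` and its parameter names are invented for the example:

```javascript
// Illustrative helper (not from the repo): translate query params from
// GET /api/payments into a Prisma-style `where` object, including the
// new `source` filter. INGEST/UPLOAD are the enum values from the schema.
function buildWhere({ status, source, search } = {}) {
  const where = {};
  if (status) where.status = status;   // UNPROCESSED | SENT | SKIPPED
  if (source) where.source = source;   // INGEST | UPLOAD
  if (search) where.recipient = { contains: search, mode: 'insensitive' };
  return where;
}

console.log(buildWhere({ source: 'UPLOAD', status: 'UNPROCESSED' }));
```

Because INGEST and UPLOAD rows live in one table, switching the source filter is just one extra key in the where clause.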
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
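The card-extraction and auto-tag rules above can be sketched with plain regex logic, independent of `csv-parse`. The sample strings below are invented; the regex and keyword rules mirror the plan:

```javascript
// Sketch of the csvParser.js rules above. Sample values are made up.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/; // e.g. "415482xxxxxx1234"

// Extract a leading card mask from the "Основание" (reason) field, if present.
function extractCard(reason) {
  const [first, ...rest] = reason.trim().split(' ');
  return CARD_REGEX.test(first)
    ? { card: first, reason: rest.join(' ') }
    : { card: null, reason: reason.trim() };
}

// Keyword → tag heuristics (subset of the full rule table).
const TAG_RULES = [
  ['ЗАПЛАТА', 'Salary'],
  ['NETFLIX', 'Subscriptions'],
  ['LIDL', 'Groceries'],
];

function autoTags(text) {
  return TAG_RULES.filter(([kw]) => text.includes(kw)).map(([, tag]) => tag);
}

console.log(extractCard('415482xxxxxx1234 ПЛАЩАНЕ НА ПОС'));
console.log(autoTags('ЗАПЛАТА ЗА МЕСЕЦ 04')); // → ['Salary']
```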
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
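The UploadPanel request could look roughly like the sketch below. The endpoint path comes from the plan; the multipart field name `files` and the function name are assumptions the multer config would have to match:

```javascript
// Sketch of the UploadPanel → POST /api/upload/csv call. The field name
// "files" is an assumption; the server's multer setup must use the same.
async function uploadCsvFiles(fileList) {
  const form = new FormData();
  for (const file of fileList) form.append('files', file, file.name);
  const res = await fetch('/api/upload/csv', { method: 'POST', body: form });
  if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
  return res.json(); // expected shape: { imported, skipped, errors, payments }
}
```

No auth header is set: with Authentik in proxy mode, the session cookie travels with the request and NPM injects the identity headers.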
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
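A quick way to sanity-check this middleware without NPM in front is to call it with stubbed req/res objects. The middleware body is copied (condensed) from auth.js above so the sketch runs standalone; the stubs are test scaffolding, not Express internals:

```javascript
// Standalone check of the Authentik header middleware. Body condensed
// from auth.js above; req/res stubs are scaffolding, not Express.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = { username };
  next();
}

function run(path, headers = {}) {
  const req = { path, headers };
  let status = 200, nextCalled = false;
  const res = { status: (c) => { status = c; return res; }, json: () => res };
  authentikMiddleware(req, res, () => { nextCalled = true; });
  return { status, nextCalled, user: req.user };
}

console.log(run('/api/payments/ingest'));                               // public → next()
console.log(run('/api/payments'));                                      // no header → 401
console.log(run('/api/payments', { 'x-authentik-username': 'alice' })); // header → next()
```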
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
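The extraction regexes above can be exercised against a sample message. The SMS text below is a made-up example in the documented DSK format; the patterns are copied from parser.js:

```javascript
// Quick regex check against a sample DSK SMS (invented text, documented format).
const sms = 'DSK Bank. Na 08/05/2026 v 14:32 sa plateni 7.78 EUR s karta 400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

const amount    = sms.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i)?.[1];
const card      = sms.match(/s karta\s+([\d*]+)/i)?.[1];
const type      = sms.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i)?.[1];
const recipient = sms.match(/s adres:\s*([^.]+)\./i)?.[1].trim();

console.log({ amount, card, type, recipient });
// → { amount: '7.78', card: '400915***4447', type: 'POS', recipient: 'LIDL SOFIA' }
```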
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
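The normalization helpers above handle two DSK quirks: comma decimal separators (optionally padded with regular or non-breaking spaces) and `DD.MM.YYYY` dates. A condensed standalone check, with the function bodies copied from csvParser.js so it runs on its own:

```javascript
// Standalone check of the number/date normalization above (condensed
// copies of parseNum/parseDate from csvParser.js).
function parseNum(val) {
  if (val == null || val === '') return null;
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return isNaN(n) ? null : n;
}
function parseDate(val) {
  const m = String(val ?? '').trim().match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
  return m ? new Date(Date.UTC(+m[3], +m[2] - 1, +m[1])) : null;
}

console.log(parseNum('1 234,56'));                   // → 1234.56
console.log(parseDate('08.05.2026').toISOString()); // → 2026-05-08T00:00:00.000Z
```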
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
docker-compose.yml, Editor Group 1
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
payments.js
Lets create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate, it lives on its own. First review them and see how they should be combined. It will be a whole new app (also a new folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db, and the uploader should store data the same way /ingest does. It should be properly marked in the UI whether a record came from upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
Message actions
payments.js
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
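The "BOTH" badge in requirement 5 hinges on optional duplicate detection between an SMS-ingested row and a CSV-uploaded row of the same transaction. A minimal sketch of that matching, assuming a same-day date comparison and amount comparison via the fixed BGN/EUR peg (1 EUR = 1.95583 BGN); the field names `date` and `amount` follow the schema above, but the matching heuristic itself is an editorial assumption, not part of the plan:

```javascript
// Fixed Bulgarian lev peg to the euro (currency board rate).
const EUR_TO_BGN = 1.95583;

// Returns true when an INGEST row (EUR) and an UPLOAD row (BGN) plausibly
// describe the same transaction: same calendar day, amounts equal after
// converting EUR → BGN at the pegged rate.
function isLikelyDuplicate(ingestRow, uploadRow) {
  if (!ingestRow.date || !uploadRow.date) return false;
  const sameDay =
    ingestRow.date.toISOString().slice(0, 10) ===
    uploadRow.date.toISOString().slice(0, 10);
  if (!sameDay) return false;
  const eurAsBgn = (ingestRow.amount ?? 0) * EUR_TO_BGN;
  return Math.abs(eurAsBgn - (uploadRow.amount ?? 0)) < 0.01;
}
```

A real implementation would likely also compare the card mask before flagging rows, since several same-amount purchases can land on one day.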
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
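Since `/api/payments/ingest` stays public, an iOS Shortcut or SMS gateway can call it without an Authentik session. A hedged sketch of such a client call; the `message` field name is an assumption based on the payments-logger description, not confirmed route code:

```javascript
// Builds the [url, fetchOptions] pair for the public SMS ingest endpoint.
// Field name `message` is assumed; adjust to the actual ingest contract.
function buildIngestRequest(baseUrl, smsText) {
  return [
    `${baseUrl}/api/payments/ingest`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ message: smsText }),
    },
  ];
}

// Usage: const [url, opts] = buildIngestRequest('http://backend:3001', smsText);
//        await fetch(url, opts);
```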
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
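The source filter in change 2 can be sketched as a small helper that builds the Prisma `where` object; the parameter names (`status`, `type`, `source`, `search`) and the searchable fields are assumptions based on the filters described above, not the final route code:

```javascript
// Hypothetical sketch: translate GET /api/payments query params into a
// Prisma `where` object, extended with the new `source` filter.
function buildPaymentWhere(query) {
  const where = {};
  if (query.status) where.status = query.status;   // UNPROCESSED / SENT / SKIPPED
  if (query.type) where.type = query.type;         // POS / ATM / INTERNET / ...
  if (query.source) where.source = query.source;   // NEW: INGEST / UPLOAD
  if (query.search) {
    where.OR = [
      { recipient: { contains: query.search, mode: 'insensitive' } },
      { rawMessage: { contains: query.search, mode: 'insensitive' } },
    ];
  }
  return where;
}

// Usage: prisma.payment.findMany({ where: buildPaymentWhere(req.query) })
```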
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
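The upload flow above can be sketched as follows; the response aggregation is shown as a pure helper, while the multer/Prisma wiring is only commented, since those parts depend on the real modules. All names here are assumptions mirroring the bullets above:

```javascript
// Aggregate per-file parser results into the response shape
// { imported, skipped, errors } described above.
function summarizeUploads(fileResults) {
  // fileResults: [{ rows, skipped, errors }] — one entry per uploaded CSV
  return fileResults.reduce(
    (acc, r) => ({
      imported: acc.imported + r.rows.length,
      skipped: acc.skipped + r.skipped,
      errors: acc.errors.concat(r.errors),
    }),
    { imported: 0, skipped: 0, errors: [] }
  );
}

// Assumed Express wiring (not the final route code):
// const upload = multer({ storage: multer.memoryStorage(),
//                         limits: { files: 10, fileSize: 10 * 1024 * 1024 } });
// router.post('/csv', upload.array('files', 10), async (req, res) => {
//   const results = await Promise.all(req.files.map(f => parseDskCsv(f.buffer)));
//   // ...persist rows with prisma.payment.create, upsert tags...
//   res.json(summarizeUploads(results));
// });
```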
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
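The UploadPanel submit step can be sketched as a small helper; the multipart field name `files` matches the multer `upload.array('files', …)` assumption above, and the fallback filename is hypothetical:

```javascript
// Build the multipart/form-data body for POST /api/upload/csv.
// Each entry is appended under the `files` field (name assumed to match
// the backend's multer configuration).
function buildUploadForm(files) {
  const form = new FormData();
  for (const f of files) form.append('files', f, f.name || 'upload.csv');
  return form;
}

// In the component:
// fetch('/api/upload/csv', { method: 'POST', body: buildUploadForm(selectedFiles) })
// Note: no Content-Type header is set manually — fetch adds the multipart
// boundary itself when given a FormData body.
```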
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"], interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
Design new payment-logge… — finance [SSH: nas]
FINANCE [SSH: NAS]
auth
dsk-uploader
finance-hub
backend
frontend
.env
.env.example
.gitignore
docker-compose.yml
payments-logger
docker-compose.yml, Editor Group 1
.env, Editor Group 1
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logge…, Editor Group 2
Design new payment-logger and dsk-uploader hybrid app
payments.js
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db, and the uploader should store data the same way /ingest does. It should be properly marked in the UI whether it is upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
payments.js
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
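The encoding fallback described above can be sketched without iconv-lite using Node's built-in `TextDecoder` (assumes a full-ICU Node build, which the official `node:20-alpine` image ships; `decodeDskBuffer` is an illustrative name, not the actual csvParser.js helper):

```js
// Sketch: decode as Windows-1251 first (the normal DSK export encoding);
// if the known header column "Дата" does not survive the decode, assume
// the file was re-saved as UTF-8 and decode again.
function decodeDskBuffer(buf) {
  const cp1251 = new TextDecoder('windows-1251').decode(buf);
  if (cp1251.includes('Дата')) return cp1251;
  return new TextDecoder('utf-8').decode(buf);
}

// cp1251 bytes for "Дата" followed by an ASCII comma
const sample = Buffer.from([0xc4, 0xe0, 0xf2, 0xe0, 0x2c]);
console.log(decodeDskBuffer(sample)); // a string starting with "Дата"
```

The header-column check is the same heuristic the real parser uses: the Cyrillic header decodes to mojibake under the wrong codec, so its presence confirms the guess.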
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
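Change 2 above (the `source` filter) could fold into the list query roughly like this (a sketch; `buildPaymentWhere` and the exact field names are assumptions, not taken from the actual payments.js):

```js
// Build a Prisma-style where clause from query-string parameters.
// Only the two known Source enum values are accepted; anything else
// is ignored so a bad ?source= cannot break the query.
function buildPaymentWhere(query) {
  const where = {};
  if (query.status) where.status = query.status;
  if (query.type) where.type = query.type;
  if (query.source === 'INGEST' || query.source === 'UPLOAD') {
    where.source = query.source;
  }
  if (query.search) {
    where.recipient = { contains: query.search, mode: 'insensitive' };
  }
  return where;
}
```

Validating against the enum in the route keeps Prisma from throwing on unknown enum literals.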
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
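Since each uploaded file is parsed independently, the route needs to fold the per-file `parseDskCsv` results into the single response shape. A minimal sketch (helper name is illustrative):

```js
// Merge per-file { rows, skipped, errors } results from parseDskCsv
// into one { imported, skipped, errors } summary for the response.
function mergeUploadResults(fileResults) {
  return fileResults.reduce(
    (acc, r) => ({
      imported: acc.imported + r.rows.length,
      skipped: acc.skipped + r.skipped,
      errors: acc.errors.concat(r.errors),
    }),
    { imported: 0, skipped: 0, errors: [] },
  );
}
```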
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=
TZ=Europe/Sofia
BACKEND_PORT=3001
FRONTEND_PORT=5175
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
paymen...
docker-compose.yml
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate, it lives on its own. First review them and see how they should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. It should be properly marked in the UI whether a record is from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
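For illustration, a front-end call against the filtered list endpoint could be sketched as follows. The `source` and `page` query-parameter names are assumptions; the plan only states that `GET /api/payments` gains a source filter.

```js
// Build the list URL with the new source filter.
// Parameter names `source`/`page` are illustrative, not a confirmed API.
function paymentsUrl(base, { source, page } = {}) {
  const qs = new URLSearchParams();
  if (source) qs.set('source', source); // 'INGEST' | 'UPLOAD'
  if (page) qs.set('page', String(page));
  const query = qs.toString();
  return query ? `${base}/api/payments?${query}` : `${base}/api/payments`;
}

// e.g. paymentsUrl('http://localhost:3001', { source: 'UPLOAD' })
```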
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
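The card-extraction and keyword-tagging rules above can be sketched in isolation (the regex and keywords are taken from the plan; the sample inputs and helper names are made up):

```js
// Card mask is the first whitespace-separated token of "Основание".
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

// Two of the keyword rules as an example (the full table has more entries).
const TAG_RULES = [
  ['reason', 'ЗАПЛАТА', 'Salary'],
  ['payee', 'LIDL', 'Groceries'],
];

function extractCard(reason) {
  const first = reason.trim().split(' ')[0];
  return CARD_REGEX.test(first) ? first : null;
}

function tagsFor(fields) {
  return TAG_RULES
    .filter(([field, keyword]) => (fields[field] || '').includes(keyword))
    .map(([, , tag]) => tag);
}
```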
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
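Folding the per-file `parseDskCsv` results into that response shape could look like this (the `summarize` helper is ours, not part of the plan):

```js
// Aggregate { rows, skipped, errors } from each parsed file into the
// { imported, skipped, errors } shape the upload route returns.
function summarize(fileResults) {
  return fileResults.reduce(
    (acc, r) => ({
      imported: acc.imported + r.rows.length,
      skipped: acc.skipped + r.skipped,
      errors: acc.errors.concat(r.errors),
    }),
    { imported: 0, skipped: 0, errors: [] },
  );
}
```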
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: "pg_isready -U finance -d finance_hub", interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
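The extraction regexes above can be sanity-checked against a message assembled from the documented format (the sample message below is synthetic, built from the format strings in the header comment, not a real bank SMS):

```javascript
// Synthetic DSK-style POS payment message following the documented format
const msg = 'DSK Bank. Na 09/05/2026 v 09:41 sa plateni 7.78 EUR s karta 400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

// Same patterns as parsePaymentSms, applied individually
const dateMatch = msg.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
const amount = parseFloat(msg.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i)[1].replace(',', '.'));
const type = msg.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i)[1].toUpperCase();
const recipient = msg.match(/s adres:\s*([^.]+)\./i)[1].trim();

console.log(type, amount, recipient); // POS 7.78 LIDL SOFIA
console.log(dateMatch.slice(1));      // [ '09', '05', '2026', '09', '41' ]
```

Note that the type regex intentionally skips "Na 09/05/2026" (the digits do not match the alternation) and lands on "na POS".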
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
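Number normalization is one of the subtler parts of the port: DSK exports use comma decimals and may contain non-breaking or regular spaces as thousands separators. `parseNum` (repeated verbatim from the listing above) handles both:

```javascript
// parseNum from csvParser.js: strips NBSP/thousands spaces, comma → dot
function parseNum(val) {
  if (val == null || val === '') return null;
  if (typeof val === 'number') return isNaN(val) ? null : val;
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return isNaN(n) ? null : n;
}

console.log(parseNum('1 234,56'));   // 1234.56
console.log(parseNum('\xa0725,00')); // 725
console.log(parseNum(''));           // null
```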
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate, it lives on its own. First review them and see how they should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db, and the uploader should store data the same way /ingest does. It should be properly marked in the UI whether a record came from upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
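The card-extraction rule from the list above can be illustrated directly with the stated regex (the sample tokens below are invented for illustration):

```javascript
// First token of "Основание" is treated as a card mask only if it matches
// six digits, six lowercase 'x', four digits — the DSK export masking format.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

console.log(CARD_REGEX.test('400915xxxxxx4447')); // true  → token becomes the card field
console.log(CARD_REGEX.test('ЗАПЛАТА'));          // false → token stays in the reason text
console.log(CARD_REGEX.test('400915***4447'));    // false → SMS-style masks use '*', not 'x'
```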
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
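Point 4 in the list above (currency-aware formatting in the notification message) could look like the following minimal sketch; `formatAmount` is a hypothetical helper name, not code from the reference apps, and the EUR fallback mirrors the schema's `currency` default:

```javascript
// Hypothetical helper: render an amount with its stored currency,
// falling back to the schema default (EUR) when currency is null.
function formatAmount(amount, currency) {
  if (amount == null) return 'unknown amount';
  return `${amount.toFixed(2)} ${currency || 'EUR'}`;
}

console.log(formatAmount(7.78, 'EUR'));  // 7.78 EUR
console.log(formatAmount(120.5, 'BGN')); // 120.50 BGN
console.log(formatAmount(null));         // unknown amount
```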
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
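The middleware's contract can be exercised without Express. A minimal sketch, assuming only plain mock `req`/`res` objects (the header values below are invented; the middleware body is an inline copy of the one above):

```js
// Inline copy of the middleware under test.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

// Authenticated request: headers as injected by NPM / Authentik (sample values).
const req = {
  path: '/api/payments',
  headers: { 'x-authentik-username': 'alice', 'x-authentik-groups': 'admins, finance' },
};
let called = false;
authentikMiddleware(req, {}, () => { called = true; });
console.log(called, req.user.username, req.user.groups); // true alice [ 'admins', 'finance' ]

// Missing header: the mock res chain records the 401.
let code = null;
const res = { status(c) { code = c; return this; }, json() { return this; } };
authentikMiddleware({ path: '/api/payments', headers: {} }, res, () => {});
console.log(code); // 401
```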
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
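The extraction regexes can be sanity-checked in isolation. A sketch against a hypothetical SMS in the documented POS format (all values invented, but the shape matches the comment block at the top of parser.js):

```js
// Each line applies one extraction regex from parsePaymentSms to a sample message.
const msg = 'DSK Bank. Na 08/05/2026 v 09:41 sa plateni 7.78 EUR s karta 400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';
const card = msg.match(/s karta\s+([\d*]+)/i)[1];
const type = msg.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i)[1].toUpperCase();
const recipient = msg.match(/s adres:\s*([^.]+)\./i)[1].trim();
const amount = parseFloat(msg.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i)[1]);
const balance = parseFloat(msg.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i)[1]);
console.log(card, type, recipient, amount, balance); // 400915***4447 POS LIDL SOFIA 7.78 2583.07

// The AMOUNT/FEE variant captures only the charged amount, not the fee:
const charged = 'sa plateni 0.50 EUR/0.50 EUR'.match(/sa plateni\s+([\d.,]+)\s+[A-Z]{3}/i)[1];
console.log(charged); // '0.50'
```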
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
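The normalization helpers are worth checking against DSK-style cell values, since Bulgarian exports use comma decimals, space/NBSP thousands padding, and DD.MM.YYYY dates. A standalone sketch (helpers copied from above; input values are invented samples):

```js
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

// Comma decimals and space/NBSP padding → float.
function parseNum(val) {
  if (val == null || val === '') return null;
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return isNaN(n) ? null : n;
}

// DD.MM.YYYY → UTC-midnight Date.
function parseDate(val) {
  const m = String(val || '').trim().match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
  return m ? new Date(Date.UTC(+m[3], +m[2] - 1, +m[1])) : null;
}

console.log(parseNum('1 234,56'));                  // 1234.56
console.log(parseDate('08.05.2026').toISOString()); // 2026-05-08T00:00:00.000Z
console.log(CARD_REGEX.test('516873xxxxxx1234'));   // true
```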
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
docker-compose.yml
version: '3.8'
services:
  db:
    image: postgres:16-alpine
    restart: unless-stopped
    environment:
      POSTGRES_USER: finance
      POSTGRES_PASSWORD: [PASSWORD]
      POSTGRES_DB: finance_hub
    volumes:
      - pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
      interval: 5s
      timeout: 5s
      retries: 5
  backend:
    build: ./backend
    restart: unless-stopped
    environment:
      DATABASE_URL: [CONNECTION_STRING]
      PORT: "3001"
      NOTIFIER_URL: ${NOTIFIER_URL}
      NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
      NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
      TZ: ${TZ:-Europe/Sofia}
    ports:
      - "${BACKEND_PORT:-3001}:3001"
    depends_on:
      db:
        condition: service_healthy
  frontend:
    build: ./frontend
    restart: unless-stopped
    ports:
      - "${FRONTEND_PORT:-5175}:5173"
    depends_on:
      - backend
volumes:
  pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate, it lives on its own. First review them and see how these should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db, and the uploader should store data the same way the /ingest does. It should be properly marked in the UI if it is upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
  UNPROCESSED
  SENT
  SKIPPED
}

enum Source {
  INGEST
  UPLOAD
}
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
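The keyword-rule tagging described above can be sketched as follows; the two rules shown are only a subset of the full `TAG_RULES` table in csvParser.js, and the input row is invented:

```js
// Each rule is [field, keyword, tagName]; a Set deduplicates tags
// when several rules resolve to the same tag name.
const TAG_RULES = [
  ['reason', 'ЗАПЛАТА', 'Salary'],
  ['payee', 'LIDL', 'Groceries'],
];
function generateTags(fields) {
  const tags = new Set();
  for (const [field, keyword, tagName] of TAG_RULES) {
    if ((fields[field] || '').includes(keyword)) tags.add(tagName);
  }
  return Array.from(tags);
}
const tags = generateTags({ reason: '', payee: 'LIDL SOFIA BG' });
console.log(tags); // [ 'Groceries' ]
```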
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
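The UploadPanel call described above might look like this minimal sketch (the `files` form-field name is an assumption and must match the multer field configured in upload.js; no auth header is needed because the Authentik session cookie travels automatically):

```javascript
// Hedged sketch of UploadPanel's upload call. The 'files' field name is an
// assumption; the response shape follows the upload.js description above.
async function uploadCsvFiles(files) {
  const form = new FormData();
  for (const file of files) form.append('files', file);
  const res = await fetch('/api/upload/csv', { method: 'POST', body: form });
  if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
  return res.json(); // expected: { imported, skipped, errors, payments }
}
```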
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
schema.prisma
schema.prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
UNPROCESSED
SENT
SKIPPED
}
enum Source {
INGEST
UPLOAD
}
Write
migration.sql
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
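As a quick sanity check, the core regexes above can be exercised against a made-up sample SMS (illustrative values only, not real bank data):

```javascript
// Illustrative sample message exercising the parser's regexes from above.
const msg = 'DSK Bank. Na 08/05/2026 v 09:41 sa plateni 7.78 EUR s karta ' +
  '400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';
const amount = parseFloat(msg.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i)[1].replace(',', '.'));
const card = msg.match(/s karta\s+([\d*]+)/i)[1];
const type = msg.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i)[1].toUpperCase();
const recipient = msg.match(/s adres:\s*([^.]+)\./i)[1].trim();
console.log(amount, card, type, recipient); // 7.78 400915***4447 POS LIDL SOFIA
```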
Write
csvParser.js
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
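The card-extraction rule can be checked in isolation (the sample `Основание` value below is made up; the helper name `extractCard` is ours — it condenses the first half of `processReasonAndCard`):

```javascript
// Standalone check of the card-extraction rule from processReasonAndCard:
// a first token matching \d{6}x{6}\d{4} is split off as the card number.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
function extractCard(reason) {
  const parts = reason.trim().split(' ');
  if (parts[0] && CARD_REGEX.test(parts[0])) {
    return { card: parts[0], reason: parts.slice(1).join(' ') };
  }
  return { card: null, reason: reason.trim() };
}
console.log(extractCard('123456xxxxxx7890 ПЛАЩАНЕ НА ПОС').card); // 123456xxxxxx7890
```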
Write
index.js
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
docker-compose.yml, Editor Group 1
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that should be a combination of payments-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payments-logger, dsk-uploader and auth) are just references; they will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (with a new folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db, and the uploader should store data the same way /ingest does. The UI should clearly mark whether a record came from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Thinking
Writing the final plan now.
Write
ets-create-a-new-generic-sun.md
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
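The middleware above is plain enough to unit-check without Express. A self-contained sketch (the middleware body is copied from `auth.js`; the stub `req`/`res` objects are illustrative):

```javascript
// Self-contained check of the Authentik header middleware.
// The middleware body is copied verbatim from auth.js above.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);

function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

// Minimal req/res stubs — just enough to observe the middleware's behavior.
function run(path, headers) {
  let status = null, called = false;
  const res = { status: c => ({ json: () => { status = c; } }) };
  const req = { path, headers };
  authentikMiddleware(req, res, () => { called = true; });
  return { status, called, user: req.user };
}

const pub = run('/api/health', {});      // public path → next() without auth
const anon = run('/api/payments', {});   // protected, no header → 401
const ok = run('/api/payments', {
  'x-authentik-username': 'alice',
  'x-authentik-groups': 'admins, finance',
});                                      // header present → req.user populated
```

Note that the group header is split on commas and trimmed, so `'admins, finance'` becomes `['admins', 'finance']`.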
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
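Change 2 amounts to folding an optional `source` query parameter into the Prisma `where` object. A sketch — the helper name and the exact set of other filters are illustrative, not the actual route code:

```javascript
// Hypothetical helper: translate query-string filters into a Prisma `where`
// object. Only `source` is new; the other filters mirror payments-logger.
function buildWhere(query) {
  const where = {};
  if (query.status) where.status = query.status;   // UNPROCESSED | SENT | SKIPPED
  if (query.source) where.source = query.source;   // INGEST | UPLOAD (new)
  if (query.search) where.recipient = { contains: query.search, mode: 'insensitive' };
  if (query.from || query.to) {
    where.date = {};
    if (query.from) where.date.gte = new Date(query.from);
    if (query.to) where.date.lte = new Date(query.to);
  }
  return where;
}

// e.g. GET /api/payments?source=UPLOAD&status=UNPROCESSED
const where = buildWhere({ source: 'UPLOAD', status: 'UNPROCESSED' });
// → { status: 'UNPROCESSED', source: 'UPLOAD' }
```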
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
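With up to 10 files per request, the response is a fold over each file's parser result. A sketch of that aggregation (the reducer itself is illustrative; the shapes follow the `parseDskCsv` → `{ rows, skipped, errors }` and response → `{ imported, skipped, errors, payments }` contracts above):

```javascript
// Fold per-file parse results ({ rows, skipped, errors }) into the single
// response shape { imported, skipped, errors, payments }.
function summarize(fileResults) {
  return fileResults.reduce(
    (acc, r) => ({
      imported: acc.imported + r.rows.length,
      skipped: acc.skipped + r.skipped,
      errors: acc.errors.concat(r.errors),
      payments: acc.payments.concat(r.rows),
    }),
    { imported: 0, skipped: 0, errors: [], payments: [] },
  );
}

const summary = summarize([
  { rows: [{ amount: 12.5 }], skipped: 1, errors: [] },
  { rows: [{ amount: 3 }, { amount: 9 }], skipped: 0, errors: ['bad row 7'] },
]);
// summary.imported === 3, summary.skipped === 1, one error carried through
```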
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
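The upload call from `UploadPanel` reduces to a plain `fetch` with `FormData` — no auth header is attached because the Authentik session cookie rides along. A sketch (the `files` field name is an assumption and must match the `multer` field name on the backend):

```javascript
// Sketch of UploadPanel's submit handler. The 'files' field name is an
// assumption — it has to match the multer configuration in upload.js.
async function uploadCsvFiles(files) {
  const form = new FormData();
  for (const file of files) form.append('files', file);
  const res = await fetch('/api/upload/csv', { method: 'POST', body: form });
  if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
  return res.json(); // { imported, skipped, errors, payments }
}
```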
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM (Nginx Proxy Manager) setup:** The frontend port is proxied through Authentik forward auth. The backend port is exposed separately (unprotected) so the public `/api/payments/ingest` stays reachable by iOS Shortcuts / SMS gateways without a login.
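For reference, NPM's forward auth amounts to a standard nginx `auth_request` arrangement following Authentik's proxy-outpost convention. The fragment below is illustrative only — NPM generates its own configuration, and the upstream host names here are hypothetical:

```nginx
# Illustrative shape only — NPM renders its own config from this pattern.
location / {
    auth_request /outpost.goauthentik.io/auth/nginx;   # ask the outpost
    auth_request_set $authentik_username $upstream_http_x_authentik_username;
    proxy_set_header X-authentik-username $authentik_username;
    proxy_pass http://finance-hub-frontend:5173;        # hypothetical upstream
}

location /outpost.goauthentik.io {
    proxy_pass http://authentik-server:9000/outpost.goauthentik.io;
    proxy_set_header X-Original-URL $scheme://$http_host$request_uri;
}
```

This is also why the app can blindly trust the `X-authentik-*` headers: they are injected by the proxy after a successful `auth_request`, and clients never reach the app directly on the protected port.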
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write migration.sql (55 lines)
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write migration_lock.toml (4 lines)
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write package.json (26 lines)
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write Dockerfile (18 lines)
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Write auth.js (27 lines)
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write parser.js (104 lines)
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
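The field regexes above can be exercised against a synthetic DSK-style message (the SMS text below is invented for illustration, following the documented format):

```javascript
// Synthetic SMS in the documented DSK format; the regexes are copied
// verbatim from parser.js above.
const sms = 'DSK Bank. Na 08/05/2026 v 14:32 sa plateni 7.78 EUR s karta 400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

const amount = parseFloat(sms.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i)[1].replace(',', '.'));
const card = sms.match(/s karta\s+([\d*]+)/i)[1];
const type = sms.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i)[1].toUpperCase();
const recipient = sms.match(/s adres:\s*([^.]+)\./i)[1].trim();
const balance = parseFloat(sms.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i)[1].replace(',', '.'));
// amount 7.78, card '400915***4447', type 'POS',
// recipient 'LIDL SOFIA', balance 2583.07
```

Note the type regex skips the leading "Na DD/MM/YYYY" because the alternation only accepts POS/ATM/INTERNET/ECOM/P2P after the preposition.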
Write csvParser.js (175 lines)
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
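`parseNum` above handles the usual Bulgarian export quirks — spaces (including non-breaking ones) as thousands separators and a decimal comma. A self-contained check, with the function copied verbatim from the listing:

```javascript
// Copied from csvParser.js: normalize "1 234,56"-style values to floats.
function parseNum(val) {
  if (val == null || val === '') return null;
  if (typeof val === 'number') return isNaN(val) ? null : val;
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return isNaN(n) ? null : n;
}

parseNum('1 234,56');    // → 1234.56 (space thousands separator, decimal comma)
parseNum('1\xa0024,00'); // → 1024 (non-breaking-space separator)
parseNum('');            // → null (empty Дебит/Кредит cells)
```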
Write index.js (39 lines)
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Write payments.js …
docker-compose.yml:
```yaml
version: '3.8'
services:
  db:
    image: postgres:16-alpine
    restart: unless-stopped
    environment:
      POSTGRES_USER: finance
      POSTGRES_PASSWORD: [PASSWORD]
      POSTGRES_DB: finance_hub
    volumes:
      - pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
      interval: 5s
      timeout: 5s
      retries: 5
  backend:
    build: ./backend
    restart: unless-stopped
    environment:
      DATABASE_URL: [CONNECTION_STRING]
      PORT: "3001"
      NOTIFIER_URL: ${NOTIFIER_URL}
      NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
      NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
      TZ: ${TZ:-Europe/Sofia}
    ports:
      - "${BACKEND_PORT:-3001}:3001"
    depends_on:
      db:
        condition: service_healthy
  frontend:
    build: ./frontend
    restart: unless-stopped
    ports:
      - "${FRONTEND_PORT:-5175}:5173"
    depends_on:
      - backend
volumes:
  pgdata:
```
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that combines payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; they will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (with its own folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. The UI should properly mark whether a record is from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
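Once the stack is running, the two write paths can be smoke-tested from a shell. The host/port, the JSON field name `message`, and the multipart field name `files` are illustrative assumptions, not confirmed by the plan above:

```bash
# Public SMS ingest (no Authentik headers required)
curl -X POST http://localhost:3001/api/payments/ingest \
  -H 'Content-Type: application/json' \
  -d '{"message": "DSK Bank. Na 09/05/2026 v 09:41 sa plateni 7.78 EUR s karta 400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR."}'

# Authenticated CSV upload; in production NPM/Authentik injects the
# X-authentik-* headers, so the manual header is only for local testing
curl -X POST http://localhost:3001/api/upload/csv \
  -H 'X-authentik-username: admin' \
  -F 'files=@export.csv'
```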
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
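Change 2 amounts to roughly the following sketch. The helper name and query parameter names are assumptions for illustration; the real route also folds status, tag, and date filters into the same where clause:

```javascript
// Hypothetical where-clause builder for GET /api/payments.
// Only the `source` branch is new relative to payments-logger.
function buildWhere(query) {
  const where = {};
  if (query.status) where.status = query.status;   // UNPROCESSED | SENT | SKIPPED
  if (query.source && query.source !== 'ALL') {
    where.source = query.source;                   // INGEST | UPLOAD
  }
  return where;
}

// Used as: prisma.payment.findMany({ where: buildWhere(req.query), include: { tags: true } })
```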
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
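The per-row create payload could be shaped as below. This is a sketch: the function name is hypothetical, and Prisma's `connectOrCreate` is used as a compact stand-in for the upsert-then-connect flow described above:

```javascript
// Hypothetical mapping from one parsed CSV row (csvParser.js output)
// to a Prisma `payment.create` data payload.
function toCreateData(row) {
  const { autoTags = [], ...fields } = row;
  return {
    ...fields,
    tags: {
      // Create the tag if it does not exist, then connect it to this payment
      connectOrCreate: autoTags.map(name => ({
        where: { name },
        create: { name },
      })),
    },
  };
}
```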
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
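The badge mapping for `PaymentTable.jsx` can be a small lookup; labels and colors follow the plan above, while the helper name itself is an assumption:

```javascript
// Hypothetical badge lookup for the Source column (Tailwind classes).
const SOURCE_BADGE = {
  INGEST: { label: 'SMS', classes: 'bg-indigo-100 text-indigo-700' },
  UPLOAD: { label: 'CSV', classes: 'bg-emerald-100 text-emerald-700' },
};

function sourceBadge(source) {
  // Fall back to a neutral badge for unexpected values
  return SOURCE_BADGE[source] ?? { label: source, classes: 'bg-gray-100 text-gray-600' };
}
```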
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
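A minimal bring-up sequence, assuming the compose file above is saved in the app folder:

```bash
cd /volume2/docker/finance/finance-hub
docker compose up -d --build
docker compose logs -f backend        # prisma migrate deploy runs on container start
curl -s http://localhost:3001/api/health
```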
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
# remaining values mirror the docker-compose defaults
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=
TZ=Europe/Sofia
BACKEND_PORT=3001
FRONTEND_PORT=5175
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
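A quick standalone sanity check of the amount, balance, and type patterns. The regexes are copied from the module above; the sample SMS is synthetic:

```javascript
// Synthetic DSK-style SMS; regexes match those in parser.js.
const msg = 'DSK Bank. Na 09/05/2026 v 09:41 sa plateni 7.78 EUR s karta ' +
  '400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

const amount  = msg.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
const balance = msg.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
const type    = msg.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);

console.log(parseFloat(amount[1]));  // 7.78
console.log(parseFloat(balance[1])); // 2583.07
console.log(type[1].toUpperCase());  // POS
```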
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
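Two of the trickier CSV rules can be verified in isolation; the logic is repeated from the module above so the snippet runs standalone:

```javascript
// Bulgarian-locale number normalization and the card-mask rule from csvParser.js.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

function parseNum(val) {
  if (val == null || val === '') return null;
  // Strip non-breaking and regular spaces (thousands separators), comma -> dot
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return Number.isNaN(n) ? null : n;
}

console.log(parseNum('1 234,56'));                // 1234.56
console.log(parseNum(''));                        // null
console.log(CARD_REGEX.test('400915xxxxxx4447')); // true
console.log(CARD_REGEX.test('400915***4447'));    // false (the SMS mask format differs)
```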
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
docker-compose.yml, Editor Group 1
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
payments.js
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (including the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. It should be properly marked in the UI whether a record came from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
payments.js
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
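The keyword-tagging heuristic described above is simple enough to sketch in a few lines of JavaScript; the rule table here is a small illustrative subset, not the full production mapping:

```javascript
// Minimal sketch of the auto-tagging heuristic: scan a transaction's text
// for known keywords and collect the matching tag names.
// The rules below are an illustrative subset of the real mapping.
const RULES = [
  ['ЗАПЛАТА', 'Salary'],
  ['NETFLIX', 'Subscriptions'],
  ['LIDL', 'Groceries'],
];

function autoTags(text) {
  const tags = new Set(); // Set deduplicates when several rules yield the same tag
  for (const [keyword, tag] of RULES) {
    if (text.includes(keyword)) tags.add(tag);
  }
  return [...tags];
}

console.log(autoTags('ПРЕВОД ЗАПЛАТА 05/2026')); // → ['Salary']
```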
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
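Since the middleware is plain header inspection, its behavior can be exercised with mock objects, no Express required. The mock `run` helper and the header values are invented for illustration:

```javascript
// Sketch: exercising the Authentik header middleware with mock req/res.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);

function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = { username };
  next();
}

// Hypothetical mock harness: returns 'next' if the chain continued,
// or the HTTP status code the middleware responded with.
function run(path, headers) {
  let result;
  const res = { status: code => ({ json: () => { result = code; } }) };
  authentikMiddleware({ path, headers }, res, () => { result = 'next'; });
  return result;
}

console.log(run('/api/health', {}));                                   // public → 'next'
console.log(run('/api/payments', {}));                                 // no header → 401
console.log(run('/api/payments', { 'x-authentik-username': 'ivan' })); // → 'next'
```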
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
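The card extraction step above hinges on one regex against the first token of `Основание`; a quick sketch (the sample strings are invented, in the documented masked-card format):

```javascript
// DSK CSV "Основание" may start with a masked card number, e.g. 516875xxxxxx1234.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

// Split the card mask (if present) from the rest of the reason text.
function splitCard(reason) {
  const [first, ...rest] = reason.trim().split(' ');
  return CARD_REGEX.test(first)
    ? { card: first, reason: rest.join(' ') }
    : { card: null, reason: reason.trim() };
}

console.log(splitCard('516875xxxxxx1234 ПЛАЩАНЕ НА ПОС'));
// → { card: '516875xxxxxx1234', reason: 'ПЛАЩАНЕ НА ПОС' }
```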
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
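UploadPanel's request body can be sketched with the standard FormData API (global in browsers and Node 20); the field name `files` is an assumption consistent with the multer setup described above:

```javascript
// Sketch: building the multipart body UploadPanel would POST to /api/upload/csv.
// The field name 'files' is assumed to match the multer configuration.
function buildUploadBody(files) {
  const fd = new FormData();
  for (const { blob, name } of files) fd.append('files', blob, name);
  return fd;
}

const csv = new Blob(['Дата,Дебит BGN\n01.05.2026,12.34\n'], { type: 'text/csv' });
const body = buildUploadBody([{ blob: csv, name: 'export.csv' }]);
console.log(body.getAll('files').length); // → 1
// In the component: fetch('/api/upload/csv', { method: 'POST', body });
```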
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=
TZ=Europe/Sofia
BACKEND_PORT=3001
FRONTEND_PORT=5175
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
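Assuming the regexes above, a sample message (invented, but following the DSK format documented in the file header) exercises each extraction path:

```javascript
// Sanity sketch: apply the parser's core regexes to a sample SMS.
// The message text is invented; it follows the documented DSK format.
const msg = 'DSK Bank. Na 08/05/2026 v 09:41 sa plateni 7.78 EUR s karta ' +
  '400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

const amount    = msg.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i)[1];
const card      = msg.match(/s karta\s+([\d*]+)/i)[1];
const type      = msg.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i)[1];
const recipient = msg.match(/s adres:\s*([^.]+)\./i)[1].trim();
const balance   = msg.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i)[1];

console.log({ amount, card, type, recipient, balance });
// → { amount: '7.78', card: '400915***4447', type: 'POS',
//     recipient: 'LIDL SOFIA', balance: '2583.07' }
```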
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
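The `parseNum` normalization above handles Bulgarian amount formatting (comma as decimal separator, non-breaking-space digit grouping); a standalone check of that logic:

```javascript
// Sketch of parseNum's normalization: strip NBSP/space grouping, comma → dot.
function parseNum(val) {
  if (val == null || val === '') return null;
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return isNaN(n) ? null : n;
}

console.log(parseNum('1\xa0234,56')); // → 1234.56
console.log(parseNum(''));            // → null
```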
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
docker-compose.yml, Editor Group 1
.env, Editor Group 1
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (with a new folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db, and the uploader should store data the same way /ingest does. The UI should properly mark whether each record came from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
Message actions
payments.js
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
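In practice the source filter is just another key on the Prisma `where` object. A minimal sketch of the filter building, with query-param names (`source`, `status`, `search`) assumed rather than taken from payments-logger:

```javascript
// Hypothetical helper for GET /api/payments: turn query params into a
// Prisma `where` object. All param names here are assumptions.
function buildWhere({ source, status, search } = {}) {
  const where = {};
  if (source && source !== 'ALL') where.source = source;   // 'INGEST' | 'UPLOAD'
  if (status && status !== 'ALL') where.status = status;   // 'UNPROCESSED' | 'SENT' | 'SKIPPED'
  if (search) {
    where.OR = [
      { recipient: { contains: search, mode: 'insensitive' } },
      { rawMessage: { contains: search, mode: 'insensitive' } },
    ];
  }
  return where;
}
```

Usage would be along the lines of `prisma.payment.findMany({ where: buildWhere(req.query), include: { tags: true } })`.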
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
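A worked micro-example of the card-extraction rule above (sample values invented; the full parser appears later in csvParser.js):

```javascript
// The card mask, when present, is the first space-separated token of "Основание".
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

function extractCard(reason) {
  const [first, ...rest] = reason.trim().split(' ');
  return CARD_REGEX.test(first)
    ? { card: first, reason: rest.join(' ') }
    : { card: null, reason: reason.trim() };
}

extractCard('123456xxxxxx7890 ПЛАЩАНЕ НА ПОС');
// → { card: '123456xxxxxx7890', reason: 'ПЛАЩАНЕ НА ПОС' }
```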
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
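The first change above amounts to spreading the parsed SMS result and adding two defaults before the create call; a sketch (the exact create-call shape in routes/payments.js is an assumption):

```javascript
// Hypothetical: wrap the parsePaymentSms() result with the two new fields
// before prisma.payment.create({ data: ... }). Field names match the schema above.
function toIngestRecord(parsed) {
  return {
    ...parsed,                           // rawMessage, date, type, card, recipient, amount, balance
    source: 'INGEST',
    currency: parsed.currency || 'EUR',  // SMS amounts are EUR per the plan
  };
}
```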
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
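The tag step can also be expressed with Prisma's `connectOrCreate` instead of the plan's `tag.upsert` + connect; a sketch under that alternative, mapping one parsed CSV row to a create input (row shape from csvParser.js):

```javascript
// Map one parsed CSV row to a prisma.payment.create() input.
// connectOrCreate creates missing tags and links existing ones in one call,
// as an alternative to the upsert-then-connect approach named in the plan.
function toCreateInput(row) {
  const { autoTags = [], ...fields } = row;
  return {
    ...fields,
    tags: {
      connectOrCreate: autoTags.map(name => ({
        where: { name },
        create: { name },
      })),
    },
  };
}
```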
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
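The upload request itself is a plain multipart POST. A minimal sketch of building the body, assuming the multer field name is `files` (an assumption; it must match the actual route config):

```javascript
// Build the multipart body for POST /api/upload/csv.
// FormData and Blob are globals in browsers and in Node 18+.
function buildUploadBody(files /* [{ name, text }] */) {
  const form = new FormData();
  for (const f of files) {
    form.append('files', new Blob([f.text], { type: 'text/csv' }), f.name);
  }
  return form;
}

// In UploadPanel, roughly:
//   const res = await fetch('/api/upload/csv', { method: 'POST', body: buildUploadBody(selected) });
//   const { imported, skipped, errors } = await res.json();
```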
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=
BACKEND_PORT=3001
FRONTEND_PORT=5175
TZ=Europe/Sofia
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
docker-compose.yml
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that is a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; they will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (with a new folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. The UI should clearly mark whether a record came from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
  UNPROCESSED
  SENT
  SKIPPED
}
enum Source {
  INGEST
  UPLOAD
}
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
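The card-extraction step can be sketched in isolation (the sample reason strings below are invented; the real parser also normalizes whitespace and POS-payment prefixes):

```javascript
// Sketch of the card-extraction rule described above: if the first token of
// "Основание" matches the DSK card mask, strip it out and keep it separately.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

function extractCard(reason) {
  const [first, ...rest] = reason.trim().split(' ');
  return CARD_REGEX.test(first)
    ? { card: first, reason: rest.join(' ').trim() }
    : { card: null, reason: reason.trim() };
}

const hit = extractCard('123456xxxxxx7890 ПЛАЩАНЕ НА ПОС');
const miss = extractCard('ЗАПЛАТА 05/2026');
```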
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
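Item 2 can be sketched as a small pure helper that builds the Prisma `where` object from query params (helper name and exact param names are illustrative, not the final route code):

```javascript
// Illustrative where-clause builder for GET /api/payments, showing where the
// new `source` filter slots in next to the existing filters.
function buildPaymentWhere(query) {
  const where = {};
  if (query.source && ['INGEST', 'UPLOAD'].includes(query.source)) {
    where.source = query.source; // new: filter by ingest vs upload
  }
  if (query.status) where.status = query.status;
  if (query.type) where.type = query.type;
  if (query.search) {
    where.OR = [
      { recipient: { contains: query.search, mode: 'insensitive' } },
      { rawMessage: { contains: query.search, mode: 'insensitive' } },
    ];
  }
  return where;
}

const uploadOnly = buildPaymentWhere({ source: 'UPLOAD' });
const invalid = buildPaymentWhere({ source: 'CSV' }); // unknown value is ignored
```

Keeping the builder pure means the filter logic stays testable without a database.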
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
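The response shape can be illustrated with a small merger over per-file parser results (hypothetical helper; the real route also persists rows via Prisma before responding):

```javascript
// Merges per-file parseDskCsv results into the { imported, skipped, errors,
// payments } response shape described above.
function summarizeImport(fileResults) {
  const summary = { imported: 0, skipped: 0, errors: [], payments: [] };
  for (const { rows, skipped, errors } of fileResults) {
    summary.imported += rows.length;
    summary.skipped += skipped;
    summary.errors.push(...errors);
    summary.payments.push(...rows);
  }
  return summary;
}

const result = summarizeImport([
  { rows: [{}, {}], skipped: 1, errors: ['bad row'] },
  { rows: [{}], skipped: 0, errors: [] },
]);
```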
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=
TZ=Europe/Sofia
BACKEND_PORT=3001
FRONTEND_PORT=5175
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
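A quick sanity check of this middleware needs no running server — restate it and call it with hand-rolled mock `req`/`res` objects (the mocks below are test scaffolding, not app code):

```javascript
// Self-contained copy of the Authentik header middleware above, exercised with mocks.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);

function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '')
      .split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

// Minimal mocks: record whether next() ran and what status was sent.
function run(path, headers) {
  const req = { path, headers };
  const res = { statusCode: 200, status(c) { this.statusCode = c; return this; }, json() { return this; } };
  let nextCalled = false;
  authentikMiddleware(req, res, () => { nextCalled = true; });
  return { req, res, nextCalled };
}

const anon = run('/api/payments', {});                // no header → 401
const authed = run('/api/payments', { 'x-authentik-username': 'alice', 'x-authentik-groups': 'admins, users' });
```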
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
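Running the individual regexes from parser.js on a sample message shows the expected captures (the message text below is invented but follows the documented DSK SMS format):

```javascript
// Each regex mirrors parser.js above; the sample SMS is made up for illustration.
const msg = 'DSK Bank. Na 08/05/2026 v 09:41 sa plateni 7.78 EUR s karta 400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

const card = msg.match(/s karta\s+([\d*]+)/i)[1];
const type = msg.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i)[1].toUpperCase();
const recipient = msg.match(/s adres:\s*([^.]+)\./i)[1].trim();
const amount = parseFloat(msg.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i)[1].replace(',', '.'));
const balance = parseFloat(msg.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i)[1].replace(',', '.'));
```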
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write
index.js
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
paymen...
docker-compose.yml, Editor Group 1
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Lets create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate, it lives on its own. First review them and see how these should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db and the uploader should store data the same way the /ingest does. It should be properly marked in the UI if it is upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
UNPROCESSED
SENT
SKIPPED
}
enum Source {
INGEST
UPLOAD
}
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
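Because the middleware is just a function of `(req, res, next)`, it can be exercised without Express. The snippet below inlines the same logic and drives it with mocked request/response objects; the mock shapes and sample header values are illustrative, not part of the real app:

```javascript
// Self-contained check of the Authentik header-trust middleware shown above,
// exercised with mocked req/res objects so no Express server is needed.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);

function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

// Missing header on a private path → 401.
let statusCode = null;
const res = { status(code) { statusCode = code; return this; }, json() { return this; } };
authentikMiddleware({ path: '/api/payments', headers: {} }, res, () => {});

// Valid headers → req.user is populated, including the parsed groups list.
const req = { path: '/api/payments', headers: { 'x-authentik-username': 'alice', 'x-authentik-groups': 'admins, finance' } };
let passed = false;
authentikMiddleware(req, res, () => { passed = true; });
console.log(statusCode, passed, req.user.groups);
```

Note that `/api/payments/ingest` passes through untouched, which is what keeps the SMS gateway path public.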
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
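The card-extraction rule can be checked in isolation. This standalone sketch restates just that rule (the sample card token and reason strings are made up for illustration):

```javascript
// Standalone check of the card-extraction rule described above: the first
// token of `Основание` is treated as a card number when it matches a
// 6-digit BIN, six literal 'x' characters, and a 4-digit tail.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

function extractCard(reason) {
  const parts = reason.trim().split(' ');
  if (parts[0] && CARD_REGEX.test(parts[0])) {
    // Card token found: strip it from the remaining reason text.
    return { card: parts[0], reason: parts.slice(1).join(' ') };
  }
  return { card: null, reason: reason.trim() };
}

console.log(extractCard('516875xxxxxx1234 ПЛАЩАНЕ НА ПОС 01.05.2026 12:00'));
console.log(extractCard('ЗАПЛАТА М.АПРИЛ')); // no card token → card: null
```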
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
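For item 2, the `source` filter addition can be sketched as a small helper that validates the query param against the enum before merging it into the Prisma where clause. Names here are illustrative, not the actual route code:

```javascript
// Hypothetical sketch of the GET /api/payments where-clause construction:
// an unknown `source` value is silently ignored rather than passed to Prisma.
const VALID_SOURCES = ['INGEST', 'UPLOAD'];

function buildWhere(query) {
  const where = {};
  if (query.status) where.status = query.status;
  if (query.source && VALID_SOURCES.includes(query.source)) {
    where.source = query.source;
  }
  return where;
}

console.log(buildWhere({ source: 'UPLOAD', status: 'SENT' }));
console.log(buildWhere({ source: 'NOTION' })); // unknown value ignored → {}
```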
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
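Since the route accepts multiple files, the per-file `parseDskCsv` results need folding into the single response shape. A minimal sketch of that aggregation, with simulated parse results (the helper name is illustrative):

```javascript
// Fold per-file parse results into the { imported, skipped, errors }
// response shape returned by POST /api/upload/csv.
function summarizeUploads(results) {
  return results.reduce(
    (acc, r) => ({
      imported: acc.imported + r.rows.length,
      skipped: acc.skipped + r.skipped,
      errors: acc.errors.concat(r.errors),
    }),
    { imported: 0, skipped: 0, errors: [] },
  );
}

const summary = summarizeUploads([
  { rows: [{}, {}], skipped: 1, errors: [] },
  { rows: [{}], skipped: 0, errors: ['bad row 7'] },
]);
console.log(summary); // { imported: 3, skipped: 1, errors: [ 'bad row 7' ] }
```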
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
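The Source badge mapping above is simple enough to pin down as a tiny helper; this is an illustrative sketch, the real component may inline it differently:

```javascript
// Map Payment.source to the badge described above:
// UPLOAD → "CSV" (emerald), anything else (INGEST) → "SMS" (indigo).
function sourceBadge(source) {
  return source === 'UPLOAD'
    ? { label: 'CSV', color: 'emerald' }
    : { label: 'SMS', color: 'indigo' };
}

console.log(sourceBadge('UPLOAD')); // { label: 'CSV', color: 'emerald' }
console.log(sourceBadge('INGEST')); // { label: 'SMS', color: 'indigo' }
```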
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
schema.prisma
schema.prisma
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
UNPROCESSED
SENT
SKIPPED
}
enum Source {
INGEST
UPLOAD
}
Write
migration.sql
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
// split() never throws here, so guard the result instead of relying on try/catch
const dateTime = posParts[0].split('ПОС ')[1];
if (dateTime) cleanReason = `POS PAYMENT ${dateTime}`;
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Lets create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate, it lives on its own. First review them and see how these should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db, and the uploader should store data the same way /ingest does. It should be properly marked in the UI if it is upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
schema.prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
UNPROCESSED
SENT
SKIPPED
}
enum Source {
INGEST
UPLOAD
}
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
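The middleware's behavior can be exercised without Express — a minimal sketch that re-states the module above and drives it with hand-rolled req/res mocks (the mocks are test scaffolding, not part of the app):

```javascript
// Re-stated from auth.js above so this snippet runs standalone.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);

function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) {
    return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
  }
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '')
      .split(',')
      .map(g => g.trim())
      .filter(Boolean),
  };
  next();
}

// Minimal mock response that records the status code and body.
function mockRes() {
  const res = { code: null, body: null };
  res.status = (c) => { res.code = c; return res; };
  res.json = (b) => { res.body = b; return res; };
  return res;
}

// 1) Public path passes through even with no headers.
let called = false;
authentikMiddleware({ path: '/api/health', headers: {} }, mockRes(), () => { called = true; });

// 2) Protected path without the Authentik header is rejected with 401.
const res401 = mockRes();
authentikMiddleware({ path: '/api/payments', headers: {} }, res401, () => {});

// 3) With headers present, req.user is populated and groups are split/trimmed.
const req = {
  path: '/api/payments',
  headers: { 'x-authentik-username': 'alice', 'x-authentik-groups': 'admins, finance' },
};
authentikMiddleware(req, mockRes(), () => {});
console.log(called, res401.code, req.user.groups);
```

Note that `req.path` ignores the query string, so the `PUBLIC_PATHS` set comparison is not bypassable via `?` suffixes.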
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
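A quick way to sanity-check the regexes and the DST-aware conversion is to run them against a constructed sample message. The merchant, card mask, and amounts below are invented; `localToUtc` is re-stated so the snippet is self-contained, and the expected offset assumes a Node build with full ICU (Europe/Sofia is UTC+3 in May):

```javascript
// Re-stated from parser.js above, with the timezone fixed for the example.
const LOCAL_TZ = 'Europe/Sofia';
function localToUtc(year, month, day, hour, minute) {
  const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
  const formatter = new Intl.DateTimeFormat('en-US', {
    timeZone: LOCAL_TZ,
    year: 'numeric', month: '2-digit', day: '2-digit',
    hour: '2-digit', minute: '2-digit', second: '2-digit',
    hour12: false,
  });
  const parts = {};
  formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
  const localAtNaive = new Date(Date.UTC(
    parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
    parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
  ));
  const offsetMs = localAtNaive.getTime() - naive.getTime();
  return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}

// Constructed sample matching the documented POS format (values are made up).
const msg = 'DSK Bank. Na 09/05/2026 v 09:41 sa plateni 7.78 EUR s karta 400915***4447 ' +
  'na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

// The same regexes parsePaymentSms applies, step by step:
const card = msg.match(/s karta\s+([\d*]+)/i)[1];
const type = msg.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i)[1].toUpperCase();
const recipient = msg.match(/s adres:\s*([^.]+)\./i)[1].trim();
const amount = parseFloat(msg.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i)[1].replace(',', '.'));
const balance = parseFloat(msg.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i)[1].replace(',', '.'));

// 09:41 local (EEST, UTC+3) should resolve to 06:41 UTC.
const d = localToUtc(2026, 5, 9, 9, 41);
console.log({ card, type, recipient, amount, balance, utc: d.toISOString() });
```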
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
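Three of the smaller pieces above — numeric normalization, card-mask detection, and keyword tagging — can be checked in isolation. The Cyrillic sample values below are invented, and only two of the TAG_RULES are re-stated:

```javascript
// Re-stated from csvParser.js above so this snippet runs standalone.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

function parseNum(val) {
  if (val == null || val === '') return null;
  if (typeof val === 'number') return isNaN(val) ? null : val;
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return isNaN(n) ? null : n;
}

// Bulgarian exports often use comma decimals and NBSP thousands separators:
console.log(parseNum('1\xa0234,56')); // 1234.56
console.log(parseNum(''));            // null

// Card mask extraction from the first token of "Основание" (sample value invented):
const reason = '483890xxxxxx7162 ПЛАЩАНЕ НА ПОС';
const firstToken = reason.split(' ')[0];
console.log(CARD_REGEX.test(firstToken)); // true

// Keyword tagging, with a two-rule subset of TAG_RULES:
const TAG_RULES = [
  ['reason', 'ЗАПЛАТА', 'Salary'],
  ['payee', 'LIDL', 'Groceries'],
];
function generateTags(fields) {
  const tags = new Set();
  for (const [field, keyword, tagName] of TAG_RULES) {
    if ((fields[field] || '').includes(keyword)) tags.add(tagName);
  }
  return Array.from(tags);
}
console.log(generateTags({ payee: 'LIDL SOFIA CENTER', reason: '' })); // [ 'Groceries' ]
```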
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
docker-compose.yml
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. It should be properly marked in the UI whether a record came from upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
Message actions
payments.js
payments.js
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
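The Source badge mapping described for PaymentTable.jsx can be sketched as a small helper. This is purely illustrative — the actual component code does not appear in this transcript; the labels and Tailwind classes follow the indigo/emerald choice noted above:

```javascript
// Hypothetical helper for the Source column badge; names and classes are
// illustrative, not taken from the real PaymentTable.jsx.
function sourceBadge(source) {
  return source === 'UPLOAD'
    ? { label: 'CSV', className: 'bg-emerald-100 text-emerald-700' }
    : { label: 'SMS', className: 'bg-indigo-100 text-indigo-700' };
}

console.log(sourceBadge('INGEST').label); // SMS
console.log(sourceBadge('UPLOAD').label); // CSV
```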
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
schema.prisma
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
UNPROCESSED
SENT
SKIPPED
}
enum Source {
INGEST
UPLOAD
}
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
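Because the middleware only inspects `req.path` and headers, it can be exercised without Express using stub req/res objects. A self-contained sketch (the middleware body is repeated inline so the snippet runs on its own):

```javascript
// Inline copy of authentikMiddleware, so this sketch is standalone.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) {
    return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
  }
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '')
      .split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

// Minimal stubs standing in for Express req/res.
function run(path, headers) {
  const req = { path, headers };
  let statusCode = 200, called = false;
  const res = { status(c) { statusCode = c; return this; }, json() { return this; } };
  authentikMiddleware(req, res, () => { called = true; });
  return { statusCode, called, user: req.user };
}

console.log(run('/api/payments', {}).statusCode);    // 401: header missing
console.log(run('/api/payments/ingest', {}).called); // true: public path
console.log(run('/api/payments', {
  'x-authentik-username': 'ivan',
  'x-authentik-groups': 'admins, finance',
}).user.groups);                                     // [ 'admins', 'finance' ]
```

Note the group parsing trims whitespace after commas, which matters because Authentik joins group names with `", "`.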
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
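The `Intl.formatToParts` trick inside `localToUtc` (resolving the zone's real UTC offset instead of hardcoding +2/+3) can be verified in isolation. A standalone helper, not part of parser.js:

```javascript
// Standalone version of the offset computation used by localToUtc:
// format a known UTC instant in the target zone, then diff the wall-clock
// reading against the instant to get the zone's UTC offset at that moment.
function utcOffsetMinutes(timeZone, instant) {
  const fmt = new Intl.DateTimeFormat('en-US', {
    timeZone,
    year: 'numeric', month: '2-digit', day: '2-digit',
    hour: '2-digit', minute: '2-digit', second: '2-digit',
    hour12: false,
  });
  const p = {};
  fmt.formatToParts(instant).forEach(part => { p[part.type] = part.value; });
  const wall = Date.UTC(+p.year, +p.month - 1, +p.day, +p.hour % 24, +p.minute, +p.second);
  return (wall - instant.getTime()) / 60000;
}

// Europe/Sofia is UTC+3 in summer (EEST) and UTC+2 in winter (EET),
// which is exactly why the parser cannot use a fixed offset.
console.log(utcOffsetMinutes('Europe/Sofia', new Date('2025-07-01T12:00:00Z'))); // 180
console.log(utcOffsetMinutes('Europe/Sofia', new Date('2025-01-15T12:00:00Z'))); // 120
```

This requires a Node build with full ICU timezone data, which has been the default since Node 13.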
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
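The card-extraction rule in `processReasonAndCard` can be sanity-checked on its own; the sample values below are illustrative, not taken from a real export:

```javascript
// Standalone check of the card-extraction rule: if the first
// whitespace-separated token of "Основание" looks like a DSK card mask
// (6 digits, 6 literal 'x', 4 digits), it is split off as the card.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

function splitCard(reason) {
  const parts = reason.trim().split(' ');
  if (parts[0] && CARD_REGEX.test(parts[0])) {
    return { card: parts[0], reason: parts.slice(1).join(' ').trim() };
  }
  return { card: null, reason: reason.trim() };
}

console.log(splitCard('516837xxxxxx1234 ПЛАЩАНЕ НА ПОС'));
// → { card: '516837xxxxxx1234', reason: 'ПЛАЩАНЕ НА ПОС' }
console.log(splitCard('ЗАПЛАТА 05/2025').card); // → null
```

Anchoring with `^...$` keeps account numbers or dates elsewhere in the reason from being mistaken for a card mask.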
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Write
payments.js
docker-compose.yml
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate, it lives on its own. First review them and see how they should be combined. It will be a whole new app (also a new folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. It should be properly marked in the UI whether it is upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
  UNPROCESSED
  SENT
  SKIPPED
}
enum Source {
  INGEST
  UPLOAD
}
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
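Because the middleware only reads `req.path` and `req.headers`, it can be exercised without Express by passing plain mock objects. A quick sanity-check sketch (the user values are made up for illustration):

```javascript
// Same middleware as above, exercised with mock req/res objects.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

// Minimal mock response that records the status code and body
function mockRes() {
  const res = { code: null, body: null };
  res.status = (c) => { res.code = c; return res; };
  res.json = (b) => { res.body = b; return res; };
  return res;
}

// 1. Public path passes straight through
let called = false;
authentikMiddleware({ path: '/api/health', headers: {} }, mockRes(), () => { called = true; });
console.log(called); // true

// 2. Protected path without the header is rejected
const rejected = mockRes();
authentikMiddleware({ path: '/api/payments', headers: {} }, rejected, () => {});
console.log(rejected.code); // 401

// 3. Protected path with headers populates req.user
const req = {
  path: '/api/payments',
  headers: { 'x-authentik-username': 'alice', 'x-authentik-groups': 'admins, finance' },
};
authentikMiddleware(req, mockRes(), () => {});
console.log(req.user.groups); // [ 'admins', 'finance' ]
```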
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
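The numeric normalization in particular has to cope with Bulgarian formatting (comma as decimal separator, NBSP or space grouping, empty cells for one of debit/credit). A standalone sketch of that helper, mirroring the behavior described above:

```javascript
// Sketch of the BGN amount normalization used by the CSV parser:
// strips NBSP/space grouping, converts the comma decimal, returns null for blanks.
function parseNum(val) {
  if (val == null || val === '') return null;
  if (typeof val === 'number') return Number.isNaN(val) ? null : val;
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return Number.isNaN(n) ? null : n;
}

console.log(parseNum('1 234,56')); // 1234.56
console.log(parseNum(''));         // null
```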
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
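Items 1-3 amount to threading `source` through the Prisma query layer. A sketch of the where-clause builder (parameter names here are assumed for illustration, not taken from the real route):

```javascript
// Sketch: build the Prisma `where` object from list-endpoint query params.
// Param names (source, status, tag) are illustrative assumptions.
function buildWhere(query) {
  const where = {};
  if (query.source) where.source = query.source; // 'INGEST' | 'UPLOAD'
  if (query.status) where.status = query.status; // 'UNPROCESSED' | 'SENT' | 'SKIPPED'
  if (query.tag) where.tags = { some: { name: query.tag } };
  return where;
}

console.log(buildWhere({ source: 'UPLOAD', status: 'UNPROCESSED' }));
```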
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
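The Source column reduces to a pure mapping, which keeps the JSX trivial. A sketch (the Tailwind class names are assumptions chosen to match the indigo/emerald badges above, not the real component's):

```javascript
// Sketch: map Payment.source to the badge rendered in PaymentTable.
// Class strings are illustrative Tailwind utilities, not confirmed names.
function sourceBadge(source) {
  return source === 'UPLOAD'
    ? { label: 'CSV', className: 'bg-emerald-100 text-emerald-700' }
    : { label: 'SMS', className: 'bg-indigo-100 text-indigo-700' };
}

console.log(sourceBadge('UPLOAD').label); // CSV
console.log(sourceBadge('INGEST').label); // SMS
```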
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM (Nginx Proxy Manager) setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write migration.sql (55 lines)
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write migration_lock.toml (4 lines)
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write package.json (26 lines)
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write Dockerfile (18 lines)
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Write auth.js (27 lines)
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write parser.js (104 lines)
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
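One subtlety worth a sanity check is the AMOUNT/FEE utility-payment format: the amount regex stops at the first currency code, so the fee after the slash is ignored. A quick check (the SMS text below is fabricated for illustration):

```javascript
// The parser's amount regex, applied to a fabricated AMOUNT/FEE message.
const amountRe = /sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i;
const sms = 'DSK Bank. Na 08/05/2026 v 09:41 sa plateni 0.50 EUR/0.50 EUR s karta 400915***4447 na ATM s adres:EVN CENTER. Nalichni: 100.00 EUR.';
const m = sms.match(amountRe);
console.log(m[1]); // '0.50', the charged amount; the '/0.50 EUR' fee suffix is not captured
```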
Write csvParser.js (175 lines)
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write index.js (39 lines)
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Write payments.js
docker-compose.yml (as open in the editor):

version: '3.8'
services:
  db:
    image: postgres:16-alpine
    restart: unless-stopped
    environment:
      POSTGRES_USER: finance
      POSTGRES_PASSWORD: [PASSWORD]
      POSTGRES_DB: finance_hub
    volumes:
      - pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
      interval: 5s
      timeout: 5s
      retries: 5
  backend:
    build: ./backend
    restart: unless-stopped
    environment:
      DATABASE_URL: [CONNECTION_STRING]
      PORT: "3001"
      NOTIFIER_URL: ${NOTIFIER_URL}
      NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
      NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
      TZ: ${TZ:-Europe/Sofia}
    ports:
      - "${BACKEND_PORT:-3001}:3001"
    depends_on:
      db:
        condition: service_healthy
  frontend:
    build: ./frontend
    restart: unless-stopped
    ports:
      - "${FRONTEND_PORT:-5175}:5173"
    depends_on:
      - backend
volumes:
  pgdata:

Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db, and the uploader should store data the same way the /ingest does. It should be properly marked in the UI if it is upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write ets-create-a-new-generic-sun.md (326 lines)
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│   ├── Dockerfile
│   ├── package.json
│   ├── prisma/
│   │   ├── schema.prisma
│   │   └── migrations/
│   │       ├── migration_lock.toml
│   │       └── 20260508_init/
│   │           └── migration.sql
│   └── src/
│       ├── index.js      ← entry point (Authentik middleware wired here)
│       ├── auth.js       ← Authentik header middleware (replaces JWT auth)
│       ├── parser.js     ← SMS parser (copy verbatim from payments-logger)
│       ├── csvParser.js  ← NEW: DSK CSV parser (port of Python dskuploader.py)
│       └── routes/
│           ├── payments.js ← existing routes + source/currency additions
│           └── upload.js   ← NEW: POST /api/upload/csv
└── frontend/
    ├── Dockerfile
    ├── package.json
    ├── vite.config.js
    ├── tailwind.config.js
    ├── postcss.config.js
    ├── index.html
    └── src/
        ├── main.jsx   ← remove AuthProvider wrapper
        ├── index.css
        ├── App.jsx    ← remove auth state, add Upload tab toggle
        └── components/
            ├── FilterBar.jsx    ← add source filter select
            ├── PaymentTable.jsx ← add Source badge column + currency display
            ├── PaymentCard.jsx  ← minor source badge addition
            ├── PaymentList.jsx  ← unchanged
            └── UploadPanel.jsx  ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
  provider = "prisma-client-js"
}

datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

model Payment {
  id              Int       @id @default(autoincrement())
  rawMessage      String    @map("raw_message")
  date            DateTime?
  type            String?
  card            String?
  recipient       String?
  amount          Float?
  currency        String?   @default("EUR")
  balance         Float?
  source          Source    @default(INGEST)
  status          Status    @default(UNPROCESSED)
  tags            Tag[]
  notifiedAt      DateTime? @map("notified_at")
  notifyPhone     String?   @map("notify_phone")
  debitBgn        Float?    @map("debit_bgn")
  creditBgn       Float?    @map("credit_bgn")
  transactionType String?   @map("transaction_type")
  payerAccount    String?   @map("payer_account")
  createdAt       DateTime  @default(now()) @map("created_at")
  updatedAt       DateTime  @updatedAt @map("updated_at")

  @@map("payments")
}

model Tag {
  id       Int       @id @default(autoincrement())
  name     String    @unique
  color    String    @default("#6b7280")
  payments Payment[]

  @@map("tags")
}

enum Status {
  UNPROCESSED
  SENT
  SKIPPED
}

enum Source {
  INGEST
  UPLOAD
}
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
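To make the shared-table decision concrete, the two row shapes differ only in which columns stay null (values below are synthetic, not from a real export):

```js
// Illustrative row shapes for the shared `payments` table (synthetic values).
// INGEST rows leave the CSV-only columns null; UPLOAD rows leave `balance` null.
const ingestRow = {
  source: 'INGEST', currency: 'EUR', amount: 7.78, balance: 2583.07,
  debitBgn: null, creditBgn: null, transactionType: null, payerAccount: null,
};
const uploadRow = {
  source: 'UPLOAD', currency: 'BGN', amount: 42.5, balance: null,
  debitBgn: 42.5, creditBgn: null, transactionType: 'КОМУНАЛНИ УСЛУГИ', payerAccount: null,
};
console.log(ingestRow.source, uploadRow.source);
```

A single `SELECT * FROM payments` serves the unified list view; the UI only needs `source` to pick the badge.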
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);

function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

module.exports = { authentikMiddleware };
```
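Because the middleware is a plain function of `(req, res, next)`, the header contract can be exercised without a server; a test sketch with hand-rolled stubs (the request objects are synthetic):

```js
// Copy of the middleware above, driven by minimal stand-ins for Express objects.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

// Stub that records what the middleware did with the response.
function makeRes() {
  const res = { code: null, body: null };
  res.status = c => { res.code = c; return res; };
  res.json = b => { res.body = b; return res; };
  return res;
}

// Authenticated request: headers as NPM/Authentik would inject them.
const req1 = {
  path: '/api/payments',
  headers: { 'x-authentik-username': 'alice', 'x-authentik-groups': 'admins, finance' },
};
let called = false;
authentikMiddleware(req1, makeRes(), () => { called = true; });
console.log(called, req1.user.groups.join('|'));

// Protected path with no header: middleware should answer 401 itself.
const res2 = makeRes();
authentikMiddleware({ path: '/api/payments', headers: {} }, res2, () => {});
console.log(res2.code);
```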
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
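The card-mask step can be checked in isolation; the regex below is the one csvParser.js uses, while the sample `Основание` value is synthetic:

```js
// Card mask appears as the first space-separated token of Основание when present.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

const reason = '123456xxxxxx7890 ПЛАЩАНЕ НА ПОС'; // synthetic example cell
const [first, ...rest] = reason.trim().split(' ');
const card = CARD_REGEX.test(first) ? first : null;        // extracted mask or null
const cleanReason = card ? rest.join(' ') : reason;         // reason without the mask
console.log(card, '|', cleanReason);
```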
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
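The source-filter change (item 2) amounts to one optional key on the Prisma `where` object; a sketch with an illustrative helper name (`buildWhere` is not the actual route code):

```js
// Builds the Prisma `where` object for GET /api/payments from query params.
// Keys mirror the schema above; unset filters are simply omitted.
function buildWhere(query) {
  const where = {};
  if (query.status) where.status = query.status; // UNPROCESSED | SENT | SKIPPED
  if (query.source) where.source = query.source; // INGEST | UPLOAD
  if (query.type) where.type = query.type;
  return where;
}

console.log(JSON.stringify(buildWhere({ source: 'UPLOAD', status: 'UNPROCESSED' })));
```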
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
  db:
    image: postgres:16-alpine
    environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
    volumes: [pgdata:/var/lib/postgresql/data]
    healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
  backend:
    build: ./backend
    environment:
      DATABASE_URL: [CONNECTION_STRING]
      PORT: "3001"
      NOTIFIER_URL: ${NOTIFIER_URL}
      NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
      NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
      TZ: ${TZ:-Europe/Sofia}
    ports: ["${BACKEND_PORT:-3001}:3001"]
    depends_on: { db: { condition: service_healthy } }
  frontend:
    build: ./frontend
    ports: ["${FRONTEND_PORT:-5175}:5173"]
    depends_on: [backend]

volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=
TZ=Europe/Sofia
BACKEND_PORT=3001
FRONTEND_PORT=5175
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');

-- CreateTable
CREATE TABLE "payments" (
    "id" SERIAL PRIMARY KEY,
    "raw_message" TEXT NOT NULL,
    "date" TIMESTAMP(3),
    "type" TEXT,
    "card" TEXT,
    "recipient" TEXT,
    "amount" DOUBLE PRECISION,
    "currency" TEXT DEFAULT 'EUR',
    "balance" DOUBLE PRECISION,
    "source" "Source" NOT NULL DEFAULT 'INGEST',
    "status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
    "notified_at" TIMESTAMP(3),
    "notify_phone" TEXT,
    "debit_bgn" DOUBLE PRECISION,
    "credit_bgn" DOUBLE PRECISION,
    "transaction_type" TEXT,
    "payer_account" TEXT,
    "created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);

-- CreateTable
CREATE TABLE "tags" (
    "id" SERIAL PRIMARY KEY,
    "name" TEXT NOT NULL,
    "color" TEXT NOT NULL DEFAULT '#6b7280'
);

-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");

-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
    "A" INTEGER NOT NULL,
    "B" INTEGER NOT NULL,
    CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");

-- AddForeignKey
ALTER TABLE "_PaymentToTag"
  ADD CONSTRAINT "_PaymentToTag_A_fkey"
  FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
  ADD CONSTRAINT "_PaymentToTag_B_fkey"
  FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
  "name": "finance-hub-backend",
  "version": "1.0.0",
  "main": "src/index.js",
  "scripts": {
    "start": "node src/index.js",
    "dev": "nodemon src/index.js",
    "prisma:generate": "prisma generate",
    "prisma:migrate": "prisma migrate deploy"
  },
  "dependencies": {
    "@prisma/client": "^5.22.0",
    "cors": "^2.8.5",
    "csv-parse": "^5.5.6",
    "express": "^4.21.1",
    "express-rate-limit": "^7.4.0",
    "iconv-lite": "^0.6.3",
    "morgan": "^1.10.0",
    "multer": "^1.4.5-lts.1"
  },
  "devDependencies": {
    "nodemon": "^3.1.7",
    "prisma": "^5.22.0"
  }
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
  '/api/health',
  '/api/payments/ingest',
]);

function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) {
    return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
  }
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '')
      .split(',')
      .map(g => g.trim())
      .filter(Boolean),
  };
  next();
}

module.exports = { authentikMiddleware };
Write
parser.js
104 lines
/**
 * Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
 *
 * Supported formats:
 *
 * POS / INTERNET / ECOM / P2P payment:
 *   DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
 *
 * ATM withdrawal:
 *   DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
 *
 * ATM utility payment (amount may include fee as AMOUNT/FEE):
 *   DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
 */
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';

/**
 * Convert a local-timezone date/time to a UTC Date object.
 * Uses Intl to resolve the actual UTC offset (DST-aware).
 */
function localToUtc(year, month, day, hour, minute) {
  const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
  const formatter = new Intl.DateTimeFormat('en-US', {
    timeZone: LOCAL_TZ,
    year: 'numeric', month: '2-digit', day: '2-digit',
    hour: '2-digit', minute: '2-digit', second: '2-digit',
    hour12: false,
  });
  const parts = {};
  formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
  const localAtNaive = new Date(Date.UTC(
    parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
    parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
  ));
  const offsetMs = localAtNaive.getTime() - naive.getTime();
  return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}

function parsePaymentSms(message) {
  const result = {
    rawMessage: message,
    date: null,
    type: null,
    card: null,
    recipient: null,
    amount: null,
    balance: null,
  };

  // Date and time: "Na DD/MM/YYYY v HH:MM"
  const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
  if (dateMatch) {
    const [, day, month, year, hour, minute] = dateMatch;
    result.date = localToUtc(
      parseInt(year), parseInt(month), parseInt(day),
      parseInt(hour), parseInt(minute),
    );
  }

  // Card mask: "s karta 400915***4447" or "s karta 483890***7162"
  const cardMatch = message.match(/s karta\s+([\d*]+)/i);
  if (cardMatch) {
    result.card = cardMatch[1];
  }

  // Transaction type: supports both prepositions
  //   "na POS" / "na ATM" / "na INTERNET" etc. (payment)
  //   "ot ATM" (withdrawal)
  const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
  if (typeMatch) {
    result.type = typeMatch[1].toUpperCase();
  }

  // Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
  const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
  if (recipientMatch) {
    result.recipient = recipientMatch[1].trim();
  }

  // Amount: handles both verbs and the AMOUNT/FEE suffix format
  //   "sa plateni 7.78 EUR"
  //   "sa iztegleni 400.00 EUR"
  //   "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
  const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
  if (amountMatch) {
    result.amount = parseFloat(amountMatch[1].replace(',', '.'));
  }

  // Balance: "Nalichni: 2583.07 EUR."
  const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
  if (balanceMatch) {
    result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
  }

  return result;
}

module.exports = { parsePaymentSms };
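The regexes above can be sanity-checked against a synthetic message in the documented POS format:

```js
// Synthetic SMS matching the POS format documented at the top of parser.js.
const msg = 'DSK Bank. Na 08/05/2026 v 09:41 sa plateni 7.78 EUR s karta 400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

const date = msg.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
const card = msg.match(/s karta\s+([\d*]+)/i);
const type = msg.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
const recipient = msg.match(/s adres:\s*([^.]+)\./i);
const amount = msg.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);

console.log(date.slice(1).join(' '), card[1], type[1], recipient[1].trim(), amount[1]);
```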
Write
csvParser.js
175 lines
/**
 * DSK Bank CSV parser — Node.js port of dskuploader.py
 *
 * DSK Bank exports use Windows-1251 (cp1251) encoding.
 * Each row maps to a Payment record with source=UPLOAD, currency=BGN.
 */
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');

const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;

const COL = {
  DATE: 'Дата',
  TYPE: 'Вид на трансакцията',
  REASON: 'Основание',
  DEBIT: 'Дебит BGN',
  CREDIT: 'Кредит BGN',
  PAYEE: 'Наредител/Получател',
  ACCT: 'Номер сметка на наредителя / получателя',
};

const TAG_RULES = [
  ['reason', 'ЗАПЛАТА', 'Salary'],
  ['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
  ['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
  ['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
  ['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
  ['payee', 'VIVACOM', 'Subscriptions'],
  ['payee', 'Google', 'Subscriptions'],
  ['payee', 'SkyShowtime', 'Subscriptions'],
  ['payee', 'NETFLIX', 'Subscriptions'],
  ['payee', 'LUKOIL', 'Bills'],
  ['payee', 'CityGate', 'Bills'],
  ['payee', 'CBA', 'Groceries'],
  ['payee', 'FANTASTICO', 'Groceries'],
  ['payee', 'LIDL', 'Groceries'],
];

function parseNum(val) {
  if (val == null || val === '') return null;
  if (typeof val === 'number') return isNaN(val) ? null : val;
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return isNaN(n) ? null : n;
}

function parseDate(val) {
  if (!val) return null;
  const s = String(val).trim();
  const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
  if (m) {
    return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
  }
  return null;
}

function processReasonAndCard(reason) {
  if (!reason || typeof reason !== 'string') return { reason: '', card: null };
  const parts = reason.trim().split(' ');
  let card = null;
  let cleanReason = reason.trim();
  if (parts[0] && CARD_REGEX.test(parts[0])) {
    card = parts[0];
    cleanReason = parts.slice(1).join(' ').trim();
  }
  if (POS_REGEX.test(cleanReason)) {
    // Collapse the verbose POS description to "POS PAYMENT <date time>".
    // String.split never throws, so guard the result instead of using try/catch.
    const posParts = cleanReason.split('<br/>');
    const dateTime = posParts[0].split('ПОС ')[1];
    if (dateTime) cleanReason = `POS PAYMENT ${dateTime}`;
  }
  return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}

function generateTags(fields) {
  const tags = new Set();
  for (const [field, keyword, tagName] of TAG_RULES) {
    if ((fields[field] || '').includes(keyword)) {
      tags.add(tagName);
    }
  }
  return Array.from(tags);
}

function processRow(row) {
  const transactionType = (row[COL.TYPE] || '').trim();
  if (transactionType === SKIP_TYPE) return null;
  const { reason, card } = processReasonAndCard(row[COL.REASON]);
  const payee = (row[COL.PAYEE] || '').trim();
  const payerAccount = (row[COL.ACCT] || '').trim();
  const debitBgn = parseNum(row[COL.DEBIT]);
  const creditBgn = parseNum(row[COL.CREDIT]);
  const date = parseDate(row[COL.DATE]);
  const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
  const amount = debitBgn ?? creditBgn ?? null;
  const rawMessage = [
    row[COL.DATE] && `Date: ${row[COL.DATE]}`,
    transactionType && `Type: ${transactionType}`,
    payee && `Payee: ${payee}`,
    debitBgn != null && `Debit: ${debitBgn} BGN`,
    creditBgn != null && `Credit: ${creditBgn} BGN`,
  ].filter(Boolean).join(' | ');
  return {
    rawMessage,
    date,
    type: null,
    card,
    recipient: payee || null,
    amount,
    currency: 'BGN',
    balance: null,
    source: 'UPLOAD',
    debitBgn,
    creditBgn,
    transactionType: transactionType || null,
    payerAccount: payerAccount || null,
    autoTags,
  };
}

/**
 * Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
 * Returns { rows, skipped, errors }.
 */
async function parseDskCsv(buffer) {
  // Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
  let text = iconv.decode(buffer, 'cp1251');
  if (!text.includes(COL.DATE)) {
    text = buffer.toString('utf-8');
  }
  return new Promise((resolve, reject) => {
    const rows = [];
    const errors = [];
    let skipped = 0;
    const parser = parse(text, {
      columns: true,
      skip_empty_lines: true,
      trim: true,
      relax_column_count: true,
    });
    parser.on('readable', () => {
      let record;
      while ((record = parser.read()) !== null) {
        try {
          const row = processRow(record);
          if (row === null) { skipped++; } else { rows.push(row); }
        } catch (err) {
          errors.push(err.message);
        }
      }
    });
    parser.on('error', reject);
    parser.on('end', () => resolve({ rows, skipped, errors }));
  });
}

module.exports = { parseDskCsv };
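`parseNum` and `parseDate` are pure functions, so they are easy to sanity-check in isolation (lightly condensed copies of the helpers above; inputs are synthetic):

```js
// Condensed copies of the normalization helpers from csvParser.js.
function parseNum(val) {
  if (val == null || val === '') return null;
  if (typeof val === 'number') return isNaN(val) ? null : val;
  // Strip NBSP/space thousands separators, use '.' as decimal point.
  const n = parseFloat(String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.'));
  return isNaN(n) ? null : n;
}
function parseDate(val) {
  if (!val) return null;
  const m = String(val).trim().match(/^(\d{2})\.(\d{2})\.(\d{4})$/); // DD.MM.YYYY
  return m ? new Date(Date.UTC(+m[3], +m[2] - 1, +m[1])) : null;
}

console.log(parseNum('1 234,56'), parseNum(''), parseDate('08.05.2026').toISOString());
```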
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');

const app = express();
const PORT = process.env.PORT || 3001;

app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));

// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
  windowMs: 60 * 1000,
  max: 200,
  standardHeaders: true,
  legacyHeaders: false,
  message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);

// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);

app.get('/api/health', (_req, res) => {
  res.json({ status: 'ok', timestamp: new Date().toISOString() });
});

app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);

app.listen(PORT, '[IP_ADDRESS]', () => {
  console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
version: '3.8'
services:
  db:
    image: postgres:16-alpine
    restart: unless-stopped
    environment:
      POSTGRES_USER: finance
      POSTGRES_PASSWORD: [PASSWORD]
      POSTGRES_DB: finance_hub
    volumes:
      - pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
      interval: 5s
      timeout: 5s
      retries: 5
  backend:
    build: ./backend
    restart: unless-stopped
    environment:
      DATABASE_URL: [CONNECTION_STRING]
      PORT: "3001"
      NOTIFIER_URL: ${NOTIFIER_URL}
      NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
      NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
      TZ: ${TZ:-Europe/Sofia}
    ports:
      - "${BACKEND_PORT:-3001}:3001"
    depends_on:
      db:
        condition: service_healthy
  frontend:
    build: ./frontend
    restart: unless-stopped
    ports:
      - "${FRONTEND_PORT:-5175}:5173"
    depends_on:
      - backend
volumes:
  pgdata:
Design new payment-logger and dsk-uploader hybrid app
payments.js
Let's create a new app that is a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; they will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (with its own folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db, and the uploader should store data the same way /ingest does. The UI should clearly mark whether a record came from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
payments.js
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
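The list-route change above can be sketched as a small where-clause builder. This is a hypothetical illustration, not existing code: `buildWhere` and the exact query-param names are assumptions; only the new `source` key is taken from the plan.

```js
// Hypothetical sketch: build the Prisma `where` object for GET /api/payments.
// The new addition vs payments-logger is the `source` filter ('INGEST' | 'UPLOAD').
function buildWhere(query) {
  const where = {};
  if (query.status) where.status = query.status;
  if (query.type) where.type = query.type;
  if (query.source) where.source = query.source; // NEW: source filter
  if (query.search) {
    // Case-insensitive search across recipient and raw message
    where.OR = [
      { recipient: { contains: query.search, mode: 'insensitive' } },
      { rawMessage: { contains: query.search, mode: 'insensitive' } },
    ];
  }
  return where;
}
```

Omitting `source` from the query returns both INGEST and UPLOAD rows, which is the unified-list default.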
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
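The multi-file response shape above can be assembled with a small reducer. `summarizeUploads` is a hypothetical helper name; the input shape `{ rows, skipped, errors }` is the `parseDskCsv` return value from this plan.

```js
// Hypothetical sketch: merge per-file parseDskCsv results into the
// POST /api/upload/csv response summary { imported, skipped, errors }.
function summarizeUploads(fileResults) {
  return fileResults.reduce(
    (acc, r) => ({
      imported: acc.imported + r.rows.length, // rows that will be inserted
      skipped: acc.skipped + r.skipped,       // internal transfers filtered out
      errors: acc.errors.concat(r.errors),    // per-row parse failures
    }),
    { imported: 0, skipped: 0, errors: [] },
  );
}
```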
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
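The badge column described above reduces to a tiny mapping. The function name and Tailwind class strings here are assumptions for illustration; the SMS/indigo and CSV/emerald pairing is from the plan.

```js
// Hypothetical sketch of the PaymentTable source badge mapping.
function sourceBadge(source) {
  return source === 'UPLOAD'
    ? { label: 'CSV', className: 'bg-emerald-100 text-emerald-800' }
    : { label: 'SMS', className: 'bg-indigo-100 text-indigo-800' };
}
```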
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=
TZ=Europe/Sofia
BACKEND_PORT=3001
FRONTEND_PORT=5175
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
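A quick standalone sanity check of the regexes above. The sample message is illustrative (values invented, format from the docblock); the five patterns are copied verbatim from parser.js.

```js
// Illustrative sample in the documented DSK Bank SMS format (not real data)
const sms = 'DSK Bank. Na 09/05/2026 v 09:41 sa plateni 7.78 EUR s karta 400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

// Same regexes as parser.js above, applied field by field
const card = (sms.match(/s karta\s+([\d*]+)/i) || [])[1];
const type = (sms.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i) || [])[1];
const recipient = ((sms.match(/s adres:\s*([^.]+)\./i) || [])[1] || '').trim();
const amount = parseFloat((sms.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i) || [])[1]);
const balance = parseFloat((sms.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i) || [])[1]);
// card → '400915***4447', type → 'POS', recipient → 'LIDL SOFIA',
// amount → 7.78, balance → 2583.07
```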
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
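Two of the helpers above are easy to sanity-check in isolation. `CARD_REGEX` and `parseNum` below are copied from csvParser.js; the sample inputs are invented illustrations of the DSK export formats they handle.

```js
// Card token in Основание: 6 digits, 6 literal 'x', 4 digits
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

// Bulgarian-locale numbers: strip (non-breaking) spaces, comma decimal separator
function parseNum(val) {
  if (val == null || val === '') return null;
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return Number.isNaN(n) ? null : n;
}
// CARD_REGEX matches '402865xxxxxx1234' but not the SMS mask '400915***4447';
// parseNum('1 234,56') → 1234.56, parseNum('') → null
```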
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate, it lives on its own. First review them and see how they should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. It should be properly marked in the UI if it is upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
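The regex extraction the parser performs can be sketched directly on a representative message. The sample SMS below is an assumption that follows the documented format (not real data); the patterns are the ones the parser itself uses:

```javascript
// Hypothetical sample following the documented DSK SMS format (not real data)
const sms = 'DSK Bank. Na 08/05/2026 v 09:41 sa plateni 7.78 EUR s karta ' +
  '400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

// Date/time: "Na DD/MM/YYYY v HH:MM"
const date = sms.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
// Card mask: digits and asterisks after "s karta"
const card = sms.match(/s karta\s+([\d*]+)/i)[1];
// Transaction type: "na POS" / "ot ATM" etc.
const type = sms.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i)[1];
// Merchant after "s adres:", up to the closing period
const recipient = sms.match(/s adres:\s*([^.]+)\./i)[1].trim();
// Amount and balance, comma tolerated as decimal separator
const amount = parseFloat(
  sms.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i)[1].replace(',', '.'));
const balance = parseFloat(
  sms.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i)[1].replace(',', '.'));
```

Note that the type pattern only matches the keyword list after "na"/"ot", so the leading "Na DD/MM/YYYY" date never collides with it.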
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
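The card-extraction step described above can be sketched as follows. The `Основание` sample value and the helper name are assumptions; the mask regex is the one from the CSV spec:

```javascript
// Card mask embedded at the start of the Основание (reason) field:
// 6 digits, 6 literal 'x' characters, 4 digits
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

// Split the card mask off the reason text when present
function splitCardFromReason(reason) {
  const parts = reason.trim().split(' ');
  if (CARD_REGEX.test(parts[0])) {
    return { card: parts[0], reason: parts.slice(1).join(' ').trim() };
  }
  return { card: null, reason: reason.trim() };
}

const withCard = splitCardFromReason('516872xxxxxx1234 ПЛАЩАНЕ НА ПОС');
const noCard = splitCardFromReason('ЗАПЛАТА 04/2026');
```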
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
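The header-trust model above boils down to a small amount of parsing on the app side. A minimal sketch (the function name is ours; the splitting logic mirrors what the middleware does):

```javascript
// Build a user object from the NPM/Authentik-injected headers.
// Returns null when the username header is absent (the middleware answers 401).
function userFromHeaders(headers) {
  const username = headers['x-authentik-username'];
  if (!username) return null;
  return {
    username,
    email: headers['x-authentik-email'] || null,
    // Groups arrive as a comma-separated string; empty string yields []
    groups: (headers['x-authentik-groups'] || '')
      .split(',').map(g => g.trim()).filter(Boolean),
  };
}
```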
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
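A minimal sketch of how the single-table decision pays off in the list endpoint: one `where` clause covers both sources, so no union query is needed. The helper name and shape are assumptions, not the final route code:

```javascript
// Build the Prisma where clause for GET /api/payments.
// 'ALL' (or an absent value) means "don't filter on this dimension".
function buildWhere({ source, status } = {}) {
  const where = {};
  if (source && source !== 'ALL') where.source = source;  // 'INGEST' | 'UPLOAD'
  if (status && status !== 'ALL') where.status = status;  // 'UNPROCESSED' | 'SENT' | 'SKIPPED'
  return where;
}
```

The route would then pass the result straight to `prisma.payment.findMany({ where })`; INGEST and UPLOAD rows mix naturally because they live in the same table.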
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
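The keyword-to-tag heuristic mentioned above is a straightforward rule table. A sketch with an abbreviated rule list (the full table has more entries):

```javascript
// [field to inspect, keyword to look for, tag to assign] — abbreviated rule set
const TAG_RULES = [
  ['reason', 'ЗАПЛАТА', 'Salary'],
  ['payee', 'NETFLIX', 'Subscriptions'],
  ['payee', 'LIDL', 'Groceries'],
];

// Collect tags for a row; a Set deduplicates when two rules yield the same tag
function generateTags(fields) {
  const tags = new Set();
  for (const [field, keyword, tagName] of TAG_RULES) {
    if ((fields[field] || '').includes(keyword)) tags.add(tagName);
  }
  return Array.from(tags);
}
```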
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
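Since the endpoint accepts multiple files, the per-file parser results need to be folded into the single response shape described above. A sketch of that aggregation (the helper name is ours):

```javascript
// Fold per-file { rows, skipped, errors } results into the
// { imported, skipped, errors } summary the endpoint returns.
function summarize(results) {
  return results.reduce((acc, r) => ({
    imported: acc.imported + r.rows.length,
    skipped: acc.skipped + r.skipped,
    errors: acc.errors.concat(r.errors),
  }), { imported: 0, skipped: 0, errors: [] });
}
```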
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=
TZ=Europe/Sofia
BACKEND_PORT=3001
FRONTEND_PORT=5175
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logge…, Editor Group 2
remote SSH: nas
SSH: nas
No Problems
0
0
No Ports Forwarded
0
Notifications
Sign In
Info: Setting up SSH Host nas: Setting up SSH tunnel
Design new payment-logger and dsk-uploader hybrid app
Session history
New session
Message actions
payments.js
Let's create a new app that combines payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; they will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (including the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. The UI should clearly mark whether a record came from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
Message actions
payments.js
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
  provider = "prisma-client-js"
}

datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

model Payment {
  id              Int       @id @default(autoincrement())
  rawMessage      String    @map("raw_message")
  date            DateTime?
  type            String?
  card            String?
  recipient       String?
  amount          Float?
  currency        String?   @default("EUR")
  balance         Float?
  source          Source    @default(INGEST)
  status          Status    @default(UNPROCESSED)
  tags            Tag[]
  notifiedAt      DateTime? @map("notified_at")
  notifyPhone     String?   @map("notify_phone")
  debitBgn        Float?    @map("debit_bgn")
  creditBgn       Float?    @map("credit_bgn")
  transactionType String?   @map("transaction_type")
  payerAccount    String?   @map("payer_account")
  createdAt       DateTime  @default(now()) @map("created_at")
  updatedAt       DateTime  @updatedAt @map("updated_at")

  @@map("payments")
}

model Tag {
  id       Int       @id @default(autoincrement())
  name     String    @unique
  color    String    @default("#6b7280")
  payments Payment[]

  @@map("tags")
}

enum Status {
  UNPROCESSED
  SENT
  SKIPPED
}

enum Source {
  INGEST
  UPLOAD
}
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
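The card-extraction rule above can be sanity-checked in isolation. A minimal sketch (illustrative values, not real card data; `extractCard` is a hypothetical helper name):

```javascript
// Illustrative check of the card-extraction rule: the first token of
// `Основание` must be 6 digits + six literal 'x' characters + 4 digits.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

function extractCard(reason) {
  const [first, ...rest] = reason.trim().split(' ');
  return CARD_REGEX.test(first)
    ? { card: first, reason: rest.join(' ') }   // strip card token from reason
    : { card: null, reason: reason.trim() };    // no card present

}

// extractCard('123456xxxxxx7890 ПЛАЩАНЕ НА ПОС')
//   → { card: '123456xxxxxx7890', reason: 'ПЛАЩАНЕ НА ПОС' }
```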
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
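The source-filter change (items 1–3 above) can be sketched as a small where-clause builder. A hedged sketch only; the helper name and the exact set of existing filters are assumptions, not the actual payments-logger code:

```javascript
// Hypothetical sketch: build the Prisma `where` object for GET /api/payments,
// extended with the new `source` filter. Invalid source values are ignored
// so "All Sources" (no param) and bad input behave the same.
const VALID_SOURCES = ['INGEST', 'UPLOAD'];

function buildPaymentsWhere(query) {
  const where = {};
  if (query.status) where.status = query.status;   // existing filter (assumed)
  if (query.type) where.type = query.type;         // existing filter (assumed)
  if (query.source && VALID_SOURCES.includes(query.source)) {
    where.source = query.source;                   // new: All / Ingest / Upload
  }
  return where;
}

// Usage (sketch): prisma.payment.findMany({ where: buildPaymentsWhere(req.query) })
```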
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
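The per-file result merging in upload.js can be sketched as a pure helper. This is a sketch under the assumptions above (each file yields a `{ rows, skipped, errors }` object from `parseDskCsv`); the multer wiring and `prisma.tag.upsert` calls are deliberately omitted:

```javascript
// Hypothetical sketch of upload.js's response assembly: merge the
// parseDskCsv() outcome of each uploaded file into the single
// { imported, skipped, errors } summary returned to the UI.
function summarizeImport(fileResults) {
  const summary = { imported: 0, skipped: 0, errors: [] };
  for (const { rows, skipped, errors } of fileResults) {
    summary.imported += rows.length;   // rows that became Payment records
    summary.skipped += skipped;        // internal transfers filtered out
    summary.errors.push(...errors);    // per-row parse failures
  }
  return summary;
}
```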
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
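UploadPanel's submit handler can be sketched as below. The `files` multipart field name and the response shape are assumptions from the plan above; `fetchImpl` is injectable only so the helper can be exercised without a running backend:

```javascript
// Hypothetical sketch of UploadPanel's upload call: send the selected
// CSV files as multipart/form-data to the new endpoint. The Authentik
// session cookie travels automatically with same-origin requests.
async function uploadCsvFiles(files, fetchImpl = fetch) {
  const form = new FormData();
  for (const file of files) form.append('files', file); // field name assumed
  const res = await fetchImpl('/api/upload/csv', { method: 'POST', body: form });
  if (!res.ok) throw new Error(`Upload failed: HTTP ${res.status}`);
  return res.json(); // expected shape: { imported, skipped, errors, payments }
}
```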
---
## Docker Compose
```yaml
services:
  db:
    image: postgres:16-alpine
    environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
    volumes: [pgdata:/var/lib/postgresql/data]
    healthcheck: { test: "pg_isready -U finance -d finance_hub", interval: 5s }
  backend:
    build: ./backend
    environment:
      DATABASE_URL: [CONNECTION_STRING]
      PORT: "3001"
      NOTIFIER_URL: ${NOTIFIER_URL}
      NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
      NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
      TZ: ${TZ:-Europe/Sofia}
    ports: ["${BACKEND_PORT:-3001}:3001"]
    depends_on: { db: { condition: service_healthy } }
  frontend:
    build: ./frontend
    ports: ["${FRONTEND_PORT:-5175}:5173"]
    depends_on: [backend]

volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
// Behind Nginx Proxy Manager: trust the first proxy hop so express-rate-limit
// keys on the real client IP from X-Forwarded-For instead of the proxy's IP
app.set('trust proxy', 1);
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
docker-compose.yml, Editor Group 1
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
payments.js
Let's create a new app that is a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; they will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (with its own folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. The UI should clearly mark whether a record came from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
payments.js
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
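As a quick sketch, the public ingest route can be called without any auth header. The `message` body field name is an assumption carried over from payments-logger; the sample SMS follows the format documented in parser.js:

```javascript
// Hypothetical client-side call to the public SMS ingest endpoint.
// The "message" field name is an assumption from payments-logger's ingest API.
const payload = {
  message:
    'DSK Bank. Na 08/05/2026 v 09:41 sa plateni 7.78 EUR s karta 400915***4447 ' +
    'na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.',
};

// Requires Node 18+ (global fetch). No Authentik session is needed for this route.
async function ingest(baseUrl) {
  const res = await fetch(`${baseUrl}/api/payments/ingest`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
  return res.json();
}
```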
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
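The card-extraction and number-normalization steps can be illustrated standalone (a sketch mirroring the logic described above; sample values are made up, and the real implementation lives in `backend/src/csvParser.js`):

```javascript
// Card mask at the start of the "Основание" column: 6 digits, 6 x's, 4 digits.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

function extractCard(reason) {
  const first = reason.trim().split(' ')[0];
  return CARD_REGEX.test(first) ? first : null;
}

// DSK exports use comma as decimal separator and (non-breaking) spaces as
// thousands separators; normalize to a JS float.
function parseNum(val) {
  const n = parseFloat(String(val).replace(/[\s\u00a0]/g, '').replace(',', '.'));
  return Number.isNaN(n) ? null : n;
}

extractCard('123456xxxxxx7890 ПЛАЩАНЕ НА ПОС'); // → '123456xxxxxx7890'
extractCard('ЗАПЛАТА МАЙ');                      // → null
parseNum('1 234,56');                            // → 1234.56
```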
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
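A minimal sketch of change 2 — how the `GET /` where clause might incorporate the new source filter (query-parameter and helper names are assumptions, not the final route code):

```javascript
// Hypothetical helper: build a Prisma `where` object from query params.
// Only params that are present end up in the clause, so an empty query
// returns all rows regardless of source.
function buildWhere(query) {
  const where = {};
  if (query.status) where.status = query.status; // 'UNPROCESSED' | 'SENT' | 'SKIPPED'
  if (query.source) where.source = query.source; // 'INGEST' | 'UPLOAD'
  if (query.card) where.card = query.card;
  return where;
}

buildWhere({ source: 'UPLOAD', status: 'UNPROCESSED' });
// → { status: 'UNPROCESSED', source: 'UPLOAD' }
```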
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
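The multi-file response shape can be sketched as a pure aggregation over per-file `parseDskCsv` results (the helper name is hypothetical; the multer route wiring is omitted):

```javascript
// Fold per-file parse results ({ rows, skipped, errors }) into the
// upload response shape described above.
function aggregateResults(fileResults) {
  return fileResults.reduce(
    (acc, r) => ({
      imported: acc.imported + r.rows.length,
      skipped: acc.skipped + r.skipped,
      errors: acc.errors.concat(r.errors),
    }),
    { imported: 0, skipped: 0, errors: [] },
  );
}
```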
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
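The Source badge mapping for PaymentTable.jsx could look like this (a sketch; the Tailwind class names are assumptions matching the indigo/emerald choice above):

```javascript
// Map a Payment.source value to its badge label and (assumed) Tailwind classes.
// Rows ingested via SMS default to the 'SMS' badge.
function sourceBadge(source) {
  return source === 'UPLOAD'
    ? { label: 'CSV', className: 'bg-emerald-100 text-emerald-800' }
    : { label: 'SMS', className: 'bg-indigo-100 text-indigo-800' };
}
```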
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
schema.prisma
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
UNPROCESSED
SENT
SKIPPED
}
enum Source {
INGEST
UPLOAD
}
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
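Two of the trickier pieces of this parser — Bulgarian number formatting and the card-mask prefix in `Основание` — can be checked in isolation (both functions are reduced copies of the file's logic; the sample values are invented):

```javascript
// Reduced copies of parseNum and the card-mask extraction from
// processReasonAndCard, exercised on made-up DSK-style values.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

function parseNum(val) {
  if (val == null || val === '') return null;
  if (typeof val === 'number') return isNaN(val) ? null : val;
  // Strip non-breaking and regular spaces (thousands separators),
  // then turn the Bulgarian decimal comma into a dot.
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return isNaN(n) ? null : n;
}

function extractCard(reason) {
  const parts = reason.trim().split(' ');
  if (parts[0] && CARD_REGEX.test(parts[0])) {
    return { card: parts[0], reason: parts.slice(1).join(' ').trim() };
  }
  return { card: null, reason: reason.trim() };
}

const debit = parseNum('1\u00a0234,56'); // nbsp thousands separator
const { card, reason } = extractCard('400915xxxxxx4447 ПЛАЩАНЕ НА ПОС');
```

The nbsp handling matters in practice: spreadsheet exports often use `\xa0` rather than a plain space between thousands groups.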
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
Explorer Section: finance [SSH: nas]
FINANCE [SSH: NAS]
auth
dsk-uploader
finance-hub
backend
frontend
.env
.env.example
.gitignore
docker-compose.yml
payments-logger
docker-compose.yml, Editor Group 1
.env, Editor Group 1
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Lets create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common db, and the uploader should store data the same way /ingest does. It should be properly marked in the UI whether a record came from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
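The keyword-rule tagging described above can be sketched as follows (the rules shown are a small subset of those in the plan; match strings and field names mirror the design but the sample row is invented):

```javascript
// Minimal sketch of keyword-based auto-tagging: each rule names a
// field to inspect, a substring to look for, and the tag to apply.
const TAG_RULES = [
  ['reason', 'ЗАПЛАТА', 'Salary'],
  ['payee', 'NETFLIX', 'Subscriptions'],
  ['payee', 'LIDL', 'Groceries'],
];

function generateTags(fields) {
  const tags = new Set(); // Set deduplicates when several rules hit the same tag
  for (const [field, keyword, tagName] of TAG_RULES) {
    if ((fields[field] || '').includes(keyword)) tags.add(tagName);
  }
  return Array.from(tags);
}

const tags = generateTags({ reason: 'ЗАПЛАТА 04/2025', payee: 'LIDL 1043 SOFIA' });
```

Substring matching keeps the rules tolerant of merchant-name suffixes (store numbers, city names) that vary between rows.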
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: "pg_isready -U finance -d finance_hub", interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
schema.prisma
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
UNPROCESSED
SENT
SKIPPED
}
enum Source {
INGEST
UPLOAD
}
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
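Since the middleware is plain request/response logic, its behavior can be exercised without an Express server, using mock objects (the middleware body is copied from the file above; the mock response shape and sample header values are made up):

```javascript
// Copy of the Authentik header middleware, exercised with mock req/res.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);

function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) {
    return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
  }
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '')
      .split(',')
      .map(g => g.trim())
      .filter(Boolean),
  };
  next();
}

// Minimal mock of the Express response chain used above.
function mockRes() {
  return {
    code: null, body: null,
    status(c) { this.code = c; return this; },
    json(o) { this.body = o; return this; },
  };
}

// 1. Protected path without the header: rejected with 401.
const res1 = mockRes();
authentikMiddleware({ path: '/api/payments', headers: {} }, res1, () => {});

// 2. Protected path with headers: req.user populated, next() runs.
const req2 = {
  path: '/api/payments',
  headers: { 'x-authentik-username': 'alice', 'x-authentik-groups': 'admins, finance' },
};
let nextCalled = false;
authentikMiddleware(req2, mockRes(), () => { nextCalled = true; });

// 3. Public path: passes through even with no headers at all.
let publicPassed = false;
authentikMiddleware({ path: '/api/health', headers: {} }, mockRes(), () => { publicPassed = true; });
```

This is also why the NPM split matters: the middleware trusts the header blindly, so it must only ever see traffic that has passed through the forward-auth proxy.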
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
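As a quick sanity check of the extraction regexes, here is a condensed stand-alone sketch (timezone conversion omitted; the sample SMS below is constructed from the format comment at the top of parser.js, not a real message):

```javascript
// Condensed field extraction mirroring parser.js (date handling omitted).
function parsePaymentSmsLite(message) {
  const pick = (re) => { const m = message.match(re); return m ? m[1] : null; };
  const num = (s) => (s == null ? null : parseFloat(s.replace(',', '.')));
  const type = pick(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
  const recipient = pick(/s adres:\s*([^.]+)\./i);
  return {
    card: pick(/s karta\s+([\d*]+)/i),
    type: type ? type.toUpperCase() : null,
    recipient: recipient ? recipient.trim() : null,
    amount: num(pick(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i)),
    balance: num(pick(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i)),
  };
}

const sample =
  'DSK Bank. Na 09/05/2026 v 09:41 sa plateni 7.78 EUR s karta 400915***4447 ' +
  'na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';
console.log(parsePaymentSmsLite(sample));
// { card: '400915***4447', type: 'POS', recipient: 'LIDL SOFIA', amount: 7.78, balance: 2583.07 }
```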
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
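Two of the helpers above are easy to check in isolation. A stand-alone sketch of the number normalisation and card-mask splitting (the sample card mask is invented to match `CARD_REGEX`, not taken from a real export):

```javascript
// Mirrors parseNum and the card-extraction half of processReasonAndCard.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

function parseNum(val) {
  if (val == null || val === '') return null;
  // Strip non-breaking spaces and thousands separators, then use '.' as decimal point.
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return Number.isNaN(n) ? null : n;
}

function splitCard(reason) {
  const parts = reason.trim().split(' ');
  return CARD_REGEX.test(parts[0])
    ? { card: parts[0], reason: parts.slice(1).join(' ') }
    : { card: null, reason: reason.trim() };
}

console.log(parseNum('1 234,56'));                         // 1234.56
console.log(splitCard('516861xxxxxx1234 ПЛАЩАНЕ НА ПОС')); // card: '516861xxxxxx1234'
```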
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
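The ingest route is throttled with a fixed window of 200 requests per 60 s. express-rate-limit implements this for real; as a toy illustration of the policy only, a self-contained fixed-window counter behaves like this:

```javascript
// Toy fixed-window limiter mirroring the 200 req / 60 s policy above
// (illustration only; the app itself uses express-rate-limit).
function makeLimiter(max, windowMs) {
  let windowStart = Date.now();
  let count = 0;
  return () => {
    const now = Date.now();
    if (now - windowStart >= windowMs) { windowStart = now; count = 0; } // new window
    return ++count <= max; // true = request allowed
  };
}

const allow = makeLimiter(200, 60_000);
let allowed = 0;
for (let i = 0; i < 250; i++) if (allow()) allowed++;
console.log(allowed); // 200 of the 250 burst requests pass; the rest are rejected
```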
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
docker-compose.yml
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that is a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (the auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; they will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (with its own folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. The UI should clearly mark whether a record came from upload, ingest, or both. First decide on the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
schema.prisma
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
UNPROCESSED
SENT
SKIPPED
}
enum Source {
INGEST
UPLOAD
}
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
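A quick standalone sanity check of the middleware logic — the stub `req`/`res` objects below are illustrative and not part of the written file:

```javascript
// Stub-based check of the header-trust logic (same body as auth.js above).
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);

function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = { username };
  next();
}

// Missing header on a protected path → 401
let status = null;
const res = { status(code) { status = code; return this; }, json() { return this; } };
authentikMiddleware({ path: '/api/payments', headers: {} }, res, () => {});

// Public path passes through even without headers
let passed = false;
authentikMiddleware({ path: '/api/payments/ingest', headers: {} }, res, () => { passed = true; });

// Authenticated request gets req.user populated
const req = { path: '/api/payments', headers: { 'x-authentik-username': 'alice' } };
authentikMiddleware(req, res, () => {});
```

Note the middleware never validates the header itself — it trusts that NPM strips client-supplied `X-authentik-*` headers before proxying, which is why the backend must not be reachable except through the proxy (ingest aside).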
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
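The core regexes can be sanity-checked in isolation against a made-up message — the sample text below is illustrative, not a real bank notification:

```javascript
// Standalone run of the parser's regexes on a sample DSK-style SMS.
const msg = 'DSK Bank. Na 09/05/2026 v 09:41 sa plateni 7.78 EUR s karta ' +
  '400915***4447 na POS s adres: LIDL SOFIA. Nalichni: 2583.07 EUR.';

const date = msg.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
const card = msg.match(/s karta\s+([\d*]+)/i);
const type = msg.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
const recipient = msg.match(/s adres:\s*([^.]+)\./i);
const amount = msg.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
const balance = msg.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);

console.log(card[1], type[1], amount[1], recipient[1].trim(), balance[1]);
```

Because `[^.]+` cannot cross a period, the recipient capture stops at the sentence boundary, and the amount regex naturally takes the first figure of an `AMOUNT/FEE` pair.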
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
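The two normalization helpers above can be exercised in isolation; the input values below are made up for illustration:

```javascript
// Standalone copies of csvParser.js's number/date normalizers.
function parseNum(val) {
  if (val == null || val === '') return null;
  if (typeof val === 'number') return isNaN(val) ? null : val;
  // Strip NBSP/space thousands separators, accept comma as decimal point.
  const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
  const n = parseFloat(s);
  return isNaN(n) ? null : n;
}

function parseDate(val) {
  if (!val) return null;
  // DSK exports dates as DD.MM.YYYY; store as UTC midnight.
  const m = String(val).trim().match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
  return m ? new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1]))) : null;
}

const n = parseNum('1\u00a0234,56');
const empty = parseNum('');
const d = parseDate('09.05.2026');
```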
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
docker-compose.yml, Editor Group 1
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (also a new folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. The UI should clearly mark whether a record came from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
  UNPROCESSED
  SENT
  SKIPPED
}
enum Source {
  INGEST
  UPLOAD
}
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
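As an illustration of the source filter on the list endpoint — the query parameter names (`source`, `page`, `pageSize`) are assumed for this sketch, not confirmed by the route code:

```javascript
// Hypothetical query construction for GET /api/payments filtered to CSV uploads.
const params = new URLSearchParams({ source: 'UPLOAD', page: '1', pageSize: '50' });
const url = `/api/payments?${params.toString()}`;
```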
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
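The card-extraction and skip rules above can be sketched in isolation (the regex and skip string are copied from the bullets; the helper names are illustrative, not from the parser file):

```javascript
// Card mask appears as the first token of 'Основание' when present.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
// Transfers between own accounts are dropped entirely.
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';

function extractCard(reason) {
  const first = reason.trim().split(' ')[0];
  return CARD_REGEX.test(first) ? first : null;
}

function shouldSkip(transactionType) {
  return transactionType.trim() === SKIP_TYPE;
}
```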
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
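Change 2 could look roughly like this — a sketch of folding an optional `?source=` query param into the Prisma where clause. The helper and its whitelisting are assumptions, not the actual route code:

```javascript
// Build a findMany where clause; only known enum values reach the source column.
function buildWhere(query) {
  const where = {};
  if (query.status) where.status = query.status;
  if (query.source === 'INGEST' || query.source === 'UPLOAD') {
    where.source = query.source;
  }
  return where;
}
// e.g. prisma.payment.findMany({ where: buildWhere(req.query), orderBy: { date: 'desc' } })
```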
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
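The upsert-then-connect step might be shaped like this. Note the plan names explicit `prisma.tag.upsert`; `connectOrCreate` shown here is an alternative Prisma idiom that achieves the same in one call, and the helper itself is hypothetical:

```javascript
// Turn one parsed CSV row (with its autoTags array) into prisma.payment.create
// arguments, creating any missing tags and connecting them in the same write.
function toCreateArgs(row) {
  const { autoTags = [], ...data } = row;
  return {
    data: {
      ...data,
      tags: {
        connectOrCreate: autoTags.map((name) => ({
          where: { name },
          create: { name },
        })),
      },
    },
  };
}
```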
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
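UploadPanel's request could be assembled along these lines — the `files` field name is an assumption; the plan only specifies `FormData` posted to `POST /api/upload/csv`:

```javascript
// Collect selected/dropped File objects into one multipart body.
// Works in the browser and in Node >= 18 (global FormData/Blob).
function buildUploadBody(files) {
  const fd = new FormData();
  for (const f of files) fd.append('files', f); // field name assumed
  return fd;
}
// fetch('/api/upload/csv', { method: 'POST', body: buildUploadBody(files) });
```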
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
    environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
```
Write
schema.prisma
schema.prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
UNPROCESSED
SENT
SKIPPED
}
enum Source {
INGEST
UPLOAD
}
Write
migration.sql
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
Write
csvParser.js
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
// String.prototype.split never throws, so a try/catch here would be dead code;
// guard against a missing 'ПОС ' delimiter instead of interpolating undefined.
const dateTime = posParts[0].split('ПОС ')[1];
if (dateTime) cleanReason = `POS PAYMENT ${dateTime}`;
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write
index.js
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
paymen...
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that is a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; they will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (with a new folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. The UI should clearly mark whether a payment came from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
  UNPROCESSED
  SENT
  SKIPPED
}
enum Source {
  INGEST
  UPLOAD
}
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
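The decisions above mean both sources collapse to one row shape at read time. A minimal sketch with plain objects (field names come from the schema; the `formatAmount` helper is illustrative, not part of the plan):

```javascript
// Illustrative rows: both sources share the payments shape, so the list
// view needs no union query. CSV-only columns are simply null for INGEST.
const ingestRow = {
  rawMessage: 'DSK Bank. Na 08/05/2026 v 09:41 sa plateni 7.78 EUR ...',
  source: 'INGEST', currency: 'EUR', amount: 7.78, balance: 2583.07,
  debitBgn: null, creditBgn: null,
};
const uploadRow = {
  rawMessage: 'Date: 08.05.2026 | Type: ... | Debit: 15.20 BGN',
  source: 'UPLOAD', currency: 'BGN', amount: 15.2, balance: null,
  debitBgn: 15.2, creditBgn: null,
};

// Hypothetical display helper: currency-aware formatting works for both.
function formatAmount(row) {
  return row.amount == null ? 'n/a' : `${row.amount.toFixed(2)} ${row.currency}`;
}
```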
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
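The source filter added to the list endpoint composes as ordinary query parameters. A small sketch (the parameter names `source`, `status`, and `page` are assumptions based on the filter description above, not confirmed route code):

```javascript
// Hypothetical query builder for GET /api/payments.
function buildListUrl(base, { source, status, page = 1 } = {}) {
  const params = new URLSearchParams();
  if (source) params.set('source', source);   // 'INGEST' | 'UPLOAD'
  if (status) params.set('status', status);   // 'UNPROCESSED' | 'SENT' | 'SKIPPED'
  params.set('page', String(page));
  return `${base}/api/payments?${params.toString()}`;
}
```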
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
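The contract is easy to exercise without Express. The sketch below reproduces the middleware verbatim and drives it with hand-rolled `req`/`res` stubs (the stubs are illustrative only):

```javascript
// The middleware from auth.js, reproduced so this sketch runs standalone.
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

// Minimal res stub recording status code and JSON body.
function makeRes() {
  const res = { code: null, body: null };
  res.status = c => { res.code = c; return res; };
  res.json = b => { res.body = b; return res; };
  return res;
}

// No header on a protected path -> 401.
const res401 = makeRes();
authentikMiddleware({ path: '/api/payments', headers: {} }, res401, () => {});

// Header present -> next() runs and groups are parsed.
const okReq = {
  path: '/api/payments',
  headers: { 'x-authentik-username': 'alice', 'x-authentik-groups': 'admins, finance' },
};
let nextCalled = false;
authentikMiddleware(okReq, makeRes(), () => { nextCalled = true; });

// Public path -> passes through without any header.
let publicOk = false;
authentikMiddleware({ path: '/api/health', headers: {} }, makeRes(), () => { publicOk = true; });
```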
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
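Two of these row-level rules can be sketched standalone (pure functions extracted for illustration; the constants mirror those in the parser):

```javascript
// Card-mask extraction and own-account transfer skip, as pure functions.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';

// Returns the card mask if the reason's first token matches, else null.
function extractCard(reason) {
  const first = (reason || '').trim().split(' ')[0];
  return CARD_REGEX.test(first) ? first : null;
}

// True for internal transfers between own accounts, which are dropped.
function shouldSkip(transactionType) {
  return (transactionType || '').trim() === SKIP_TYPE;
}
```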
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
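Item 2 amounts to folding an optional `source` value into the where clause. A hedged sketch (plain object in Prisma-style filter shape; the exact accepted query parameters are an assumption):

```javascript
// Hypothetical where-clause builder for GET /api/payments.
function buildWhere({ source, status, tag } = {}) {
  const where = {};
  if (source) where.source = source;                // 'INGEST' | 'UPLOAD'
  if (status) where.status = status;
  if (tag) where.tags = { some: { name: tag } };    // M2M relation filter
  return where;
}
```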
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
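The per-file results from `parseDskCsv` then fold into the route's summary response. A sketch of that aggregation (the reducer is illustrative, not the actual route code):

```javascript
// Combine per-file { rows, skipped, errors } results (the shape returned
// by parseDskCsv) into one { imported, skipped, errors } summary.
function summarize(fileResults) {
  return fileResults.reduce(
    (acc, r) => ({
      imported: acc.imported + r.rows.length,
      skipped: acc.skipped + r.skipped,
      errors: acc.errors.concat(r.errors),
    }),
    { imported: 0, skipped: 0, errors: [] },
  );
}
```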
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
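On the browser side, the upload body is plain `FormData`. A sketch (the multipart field name `files` is an assumption and must match the multer configuration in upload.js):

```javascript
// Build the multipart body for POST /api/upload/csv. Works with File
// objects from a drag-and-drop event or an <input type="file"> element.
function buildUploadBody(files, fieldName = 'files') {
  const fd = new FormData();
  files.forEach((f, i) => fd.append(fieldName, f, f.name || `export-${i}.csv`));
  return fd;
}
// Usage in UploadPanel (sketch):
// await fetch('/api/upload/csv', { method: 'POST', body: buildUploadBody(selected) });
```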
---
## Docker Compose
```yaml
services:
  db:
    image: postgres:16-alpine
    environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
    volumes: [pgdata:/var/lib/postgresql/data]
    healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
  backend:
    build: ./backend
    environment:
      DATABASE_URL: [CONNECTION_STRING]
      PORT: "3001"
      NOTIFIER_URL: ${NOTIFIER_URL}
      NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
      NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
      TZ: ${TZ:-Europe/Sofia}
    ports: ["${BACKEND_PORT:-3001}:3001"]
    depends_on: { db: { condition: service_healthy } }
  frontend:
    build: ./frontend
    ports: ["${FRONTEND_PORT:-5175}:5173"]
    depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** The frontend port is proxied through Authentik forward auth. The backend port is exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login. Caveat: because the backend trusts `X-authentik-*` headers, anyone who can reach that port directly can forge an identity for the protected routes, so it must only be reachable from NPM and trusted LAN clients.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=[PHONE]
TZ=Europe/Sofia
BACKEND_PORT=3001
FRONTEND_PORT=5175
```
Write schema.prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
UNPROCESSED
SENT
SKIPPED
}
enum Source {
INGEST
UPLOAD
}
Write migration.sql
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write migration_lock.toml
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write package.json
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write Dockerfile
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Write auth.js
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write parser.js
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
Write csvParser.js
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
const dateTime = posParts[0].split('ПОС ')[1];
if (dateTime) cleanReason = `POS PAYMENT ${dateTime}`; // guard instead of try/catch: an undefined split result does not throw
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write index.js
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Write payments.js
docker-compose.yml
version: '3.8'
services:
  db:
    image: postgres:16-alpine
    restart: unless-stopped
    environment:
      POSTGRES_USER: finance
      POSTGRES_PASSWORD: [PASSWORD]
      POSTGRES_DB: finance_hub
    volumes:
      - pgdata:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
      interval: 5s
      timeout: 5s
      retries: 5
  backend:
    build: ./backend
    restart: unless-stopped
    environment:
      DATABASE_URL: [CONNECTION_STRING]
      PORT: "3001"
      NOTIFIER_URL: ${NOTIFIER_URL}
      NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
      NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
      TZ: ${TZ:-Europe/Sofia}
    ports:
      - "${BACKEND_PORT:-3001}:3001"
    depends_on:
      db:
        condition: service_healthy
  frontend:
    build: ./frontend
    restart: unless-stopped
    ports:
      - "${FRONTEND_PORT:-5175}:5173"
    depends_on:
      - backend
volumes:
  pgdata:
Design new payment-logger and dsk-uploader hybrid app
Let's create a new app that should be a combination of payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; these will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (also the folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. It should be properly marked in the UI whether a record is upload or ingest or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use `csv-parse` npm package (handles encoding, streaming)
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
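To illustrate why the nullable CSV-only columns keep the list view union-free, a display helper can derive one signed amount per row regardless of source. This is a hypothetical sketch; the debit-is-negative sign convention is an assumption, not part of the schema above:

```js
// Hypothetical display helper: one signed amount per unified Payment row.
// Sign convention (debit = outgoing = negative) is an assumption for display only.
function signedAmount(p) {
  if (p.source === 'UPLOAD') {
    if (p.debitBgn != null) return -p.debitBgn;
    if (p.creditBgn != null) return p.creditBgn;
  }
  return p.amount ?? null; // INGEST rows already carry a single parsed amount
}

console.log(signedAmount({ source: 'UPLOAD', debitBgn: 12.5 }));  // -12.5
console.log(signedAmount({ source: 'UPLOAD', creditBgn: 100 }));  // 100
console.log(signedAmount({ source: 'INGEST', amount: 7.78 }));    // 7.78
```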
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
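Because the middleware only reads `req.path` and `req.headers`, it can be exercised without Express. The sketch below repeats it inline so it runs standalone, driving it with minimal mock request/response objects:

```js
// Standalone copy of the middleware above (no Express required).
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);

function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

// Minimal mocks recording what the middleware decided.
function run(path, headers) {
  const req = { path, headers };
  let outcome = null;
  const res = { status: code => ({ json: () => { outcome = code; } }) };
  authentikMiddleware(req, res, () => { outcome = 'next'; });
  return { outcome, user: req.user };
}

console.log(run('/api/payments/ingest', {}).outcome); // "next" (public path)
console.log(run('/api/payments', {}).outcome);         // 401 (no Authentik header)
console.log(run('/api/payments', {
  'x-authentik-username': 'alice',
  'x-authentik-groups': 'admins, users',
}).user.groups);                                       // ["admins", "users"]
```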
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
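The per-file parser results (`{ rows, skipped, errors }` from `parseDskCsv`) can be folded into the response shape above. `summarizeImport` is a hypothetical helper name, shown here as a runnable sketch:

```js
// Hypothetical aggregation of per-file parse results into one upload response.
function summarizeImport(perFile) {
  return perFile.reduce(
    (acc, r) => ({
      imported: acc.imported + r.rows.length,
      skipped: acc.skipped + r.skipped,
      errors: acc.errors.concat(r.errors),
    }),
    { imported: 0, skipped: 0, errors: [] }
  );
}

console.log(summarizeImport([
  { rows: [{}, {}], skipped: 1, errors: [] },
  { rows: [{}], skipped: 0, errors: ['row 7: bad date'] },
]));
// imported: 3, skipped: 1, errors: ['row 7: bad date']
```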
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
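The badge mapping can live in one small pure helper shared by `PaymentTable` and `PaymentCard`. `sourceBadge` is a hypothetical name; the labels and Tailwind color names follow the description above:

```js
// Hypothetical shared helper for the Source column badge.
function sourceBadge(source) {
  switch (source) {
    case 'INGEST': return { label: 'SMS', color: 'indigo' };
    case 'UPLOAD': return { label: 'CSV', color: 'emerald' };
    default:       return { label: source ?? '?', color: 'gray' };
  }
}

console.log(sourceBadge('UPLOAD').label);  // "CSV"
console.log(sourceBadge('INGEST').color);  // "indigo"
```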
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=
TZ=Europe/Sofia
BACKEND_PORT=3001
FRONTEND_PORT=5175
```
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js
version: '3.8'
services:
db:
image: postgres:16-alpine
restart: unless-stopped
environment:
POSTGRES_USER: finance
POSTGRES_PASSWORD: [PASSWORD]
POSTGRES_DB: finance_hub
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U finance -d finance_hub"]
interval: 5s
timeout: 5s
retries: 5
backend:
build: ./backend
restart: unless-stopped
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports:
- "${BACKEND_PORT:-3001}:3001"
depends_on:
db:
condition: service_healthy
frontend:
build: ./frontend
restart: unless-stopped
ports:
- "${FRONTEND_PORT:-5175}:5173"
depends_on:
- backend
volumes:
pgdata:
Design new payment-logger and dsk-uploader hybrid app
payments.js
Let's create a new app that combines payment-logger and dsk-uploader. It should have authorization via Authentik (auth folder). All three folders (payment-logger, dsk-uploader and auth) are just references; they will be removed later. The auth project is separate and lives on its own. First review them and see how they should be combined. It will be a whole new app (with a new folder name). Think very carefully about what these two apps do and how they could be combined. There should be a common DB, and the uploader should store data the same way /ingest does. The UI should clearly mark whether a record came from upload, ingest, or both. First think of the tech stack and plan carefully.
Thinking
I'll explore all three reference projects in parallel to understand their structure before planning.
Agent:
Explore payment-logger app structure and functionality
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How data flows (ingest, storage, retrieval)
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, and how data is ingested. Be thorough on the /ingest endpoint specifically.
Agent:
Explore dsk-uploader app structure and functionality
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Tech stack (frontend framework, backend framework, database)
2. What the app does - its core purpose and features
3. Database schema - all tables, relationships
4. API routes - all endpoints, their purpose and request/response shapes
5. How file uploads work - what files, what format, how parsed
6. Any auth/middleware setup
7. Docker/deployment setup
8. Key files and their roles
Report with: directory structure, tech stack summary, database schema details, all API endpoints listed, UI features, how file upload parsing works, and what data is extracted from uploads. Be very thorough about the data model and what fields are stored.
Agent:
Explore auth folder and authentik integration
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. What auth system is being used (Authentik)
2. How authentication/authorization is implemented - OAuth2, OIDC, session tokens, JWT?
3. Middleware - how is auth enforced on routes?
4. Any frontend auth flow - login redirect, token handling, protected routes
5. Environment variables related to auth
6. Docker/deployment setup for the auth service itself
7. Any example of how other apps integrate with this auth
Report with: directory structure, auth flow description, middleware code, environment variable names, and integration patterns. This is a reference for building auth into a new combined app.
payments.js
continue
Thinking
Launching parallel exploration of all three reference projects.
Agent:
Explore payment-logger app structure
IN
Explore the /volume2/docker/finance/payments-logger directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How data is ingested (look at /ingest route especially)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore dsk-uploader app structure
IN
Explore the /volume2/docker/finance/dsk-uploader directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. Tech stack (frontend framework, backend framework, database, etc.)
3. What the app does - its core functionality
4. How file uploads work (what format, what processing happens)
5. Database schema (what tables/collections exist)
6. API routes and their purposes
7. Frontend pages and components
8. Docker setup
9. Any auth/session handling
Please read key files: package.json files, main entry points, route files, database schema files, docker-compose.yml, and any frontend components. Be thorough - I need a complete picture to plan a new combined app.
Agent:
Explore auth project structure
IN
Explore the /volume2/docker/finance/auth directory thoroughly. I need to understand:
1. Full directory structure (tree)
2. What authentication system is used (authentik integration details)
3. How OAuth/OIDC flow works
4. Any middleware or helper functions for protecting routes
5. Session management approach
6. How frontend handles auth state
7. Any example of how other apps integrate with this auth system
8. Docker setup and environment variables
Please read all files carefully - this auth system will be integrated into a new combined app. I need to understand exactly how to use it.
Thinking
Now I have a complete picture of all three. Let me design the plan.
Agent:
Design combined app architecture
IN
Design a detailed implementation plan for a new combined web app that merges two existing applications. Here is full context:
---
## App 1: payments-logger (Node.js)
- **Purpose**: Ingest Bulgarian bank SMS → parse → store in PostgreSQL → review/tag/notify in React UI
- **Stack**: Node.js 20 + Express + Prisma 5 + PostgreSQL 16 + React 18 + Vite + Tailwind CSS
- **Auth**: Custom JWT (bcryptjs + jose), tokens in localStorage
- **Ingest**: POST /api/payments/ingest (public) — SMS text OR structured JSON (Apple Wallet)
- **Schema** (PostgreSQL via Prisma):
- `payments` table: id, raw_message, date, type (POS/ATM/INTERNET/ECOM/P2P/WALLET), card, recipient, amount, balance, status (UNPROCESSED/SENT/SKIPPED), notifyPhone, notifiedAt, created_at, updated_at
- `tags` table: id, name, color — M2M with payments via `_PaymentToTag`
- `users` table: id, username, hashed_password (this will be REMOVED)
- **UI**: Single-page React app — PaymentTable (sortable, filterable, taggable), FilterBar, status actions (send/skip), notification system
- **Parser** (backend/src/parser.js): Regex parser for Bulgarian DSK Bank SMS, extracts date/time (DD/MM/YYYY HH:MM), card mask, transaction type, recipient, amount, balance
## App 2: dsk-uploader (Python/Flask)
- **Purpose**: Upload DSK bank CSV exports → parse/normalize → upload to Notion database
- **Stack**: Python 3.11 + Flask + Pandas + Custom Notion SDK + Bootstrap 5
- **Auth**: None (open)
- **CSV format** (DSK Bank Bulgarian format, columns):
- `Дата` (date, DD.MM.YYYY)
- `Вид на трансакцията` (transaction type, Bulgarian)
- `Основание` (reason/description — contains card number regex: `^\d{6}x{6}\d{4}$`)
- `Дебит BGN` (debit amount, may be empty)
- `Кредит BGN` (credit amount, may be empty)
- `Наредител/Получател` (orderer/recipient name)
- `Номер сметка на наредителя / получателя` (account number)
- **Processing**: merge multiple CSVs, normalize dates, extract card numbers from reason via regex, auto-generate tags (keyword heuristics: ЗАПЛАТА→Salary, NETFLIX→Subscriptions, etc.), filter internal transfers
- **Output**: Notion database pages (this will be REPLACED with local PostgreSQL)
## App 3: auth (Authentik)
- **Mode**: Proxy mode via NPM (forward auth)
- **How it works**: NPM intercepts all requests, calls Authentik outpost's auth endpoint. On success, NPM injects headers into proxied request:
- `X-authentik-username`
- `X-authentik-email`
- `X-authentik-groups`
- **No code integration needed** in the app itself — just trust these headers from NPM
- **Logout**: Redirect user to `/outpost.goauthentik.io/sign_out`
---
## What the New Combined App Must Do
1. **Single PostgreSQL database** for all transactions
2. **SMS Ingest** (public endpoint) — same as payments-logger /ingest, source=INGEST
3. **CSV Upload** (authenticated) — parse DSK CSV files, store transactions with source=UPLOAD into the same DB schema
4. **Unified UI** — show all payments in one table regardless of source
5. **Source indicator** in UI — badge showing "INGEST", "UPLOAD", or "BOTH" (if duplicate detection added)
6. **Source filter** in FilterBar — filter by All/Ingest/Upload
7. **Authentik auth** — remove custom JWT entirely; trust X-authentik-* headers from NPM
8. **All existing payments-logger features**: tagging, send notification, skip, status tracking, filters, sorting, pagination
9. **Remove Notion dependency entirely** — all data stays in local PostgreSQL
---
## Technical Constraints
- Keep Node.js backend (not Python) — extend payments-logger's stack
- CSV parsing: use the `csv-parse` npm package (streaming, RFC 4180 quoted fields) together with `iconv-lite` to decode the cp1251 export encoding
- File upload: use `multer` npm middleware
- Auth middleware: read X-authentik-username header, 401 if missing (but /api/payments/ingest stays public)
- Add `source` enum field (INGEST/UPLOAD) to payments schema
- Add `currency` field (EUR for SMS ingest, BGN for CSV uploads) — or just store as-is
- Remove `users` table from schema
---
## Please Design
1. Complete folder structure for the new app (suggest name: "finance-hub" or similar — note this is a placeholder, user will confirm)
2. Prisma schema with new fields (`source`, `currency`)
3. All backend API routes
4. CSV parser design in Node.js (port of the Python logic)
5. Authentik auth middleware
6. Frontend component structure (React) — what's reused from payments-logger, what's new
7. Docker Compose setup
8. Environment variables needed
9. Step-by-step implementation order
10. Verification checklist
Be specific about file paths and code structure. This will be used to write actual implementation code.
Thinking
Good plan. Let me ask the one thing I need before writing the final plan.
AskUserQuestion
OUT
User has answered your questions: "What should the new combined app be called? This will be the folder name under /volume2/docker/finance/ and the app branding."="finance-hub". You can now continue with the user's answers in mind.
Thinking
Writing the final plan now.
Thinking
Write
ets-create-a-new-generic-sun.md
326 lines
# finance-hub — Implementation Plan
## Context
Two separate finance apps exist as references and will be retired:
- **payments-logger** (Node.js/React): Ingests Bulgarian bank SMS notifications, stores payments in PostgreSQL, provides a review/tag/notify UI.
- **dsk-uploader** (Python/Flask): Uploads DSK bank CSV exports, parses them, sends to Notion.
The new app **finance-hub** (`/volume2/docker/finance/finance-hub/`) replaces both with a single unified system: all transactions (SMS-ingested and CSV-uploaded) land in one PostgreSQL database and one React UI. Authentication is delegated entirely to Authentik (proxy mode via NPM — no custom JWT).
---
## Tech Stack
| Layer | Choice | Rationale |
|-------|--------|-----------|
| Backend | Node.js 20 + Express + Prisma 5 | Extends payments-logger; no Python needed |
| CSV parsing | `csv-parse` + `iconv-lite` | Handles Bulgarian cp1251 encoding, RFC4180 quoted fields |
| File upload | `multer` (memory storage) | Bank CSVs are small (<500 KB), no disk cleanup needed |
| Database | PostgreSQL 16 | Same as payments-logger |
| Frontend | React 18 + Vite + Tailwind CSS + Lucide React | Same as payments-logger |
| Auth | Authentik proxy via NPM headers | No custom JWT; removes users table entirely |
| Container | Docker Compose | Same pattern as both reference apps |
**Removed vs payments-logger:** `bcryptjs`, `jose`, `users` table, login form, JWT storage.
**Removed vs dsk-uploader:** Python/Flask, Pandas, Notion SDK, all Notion references.
**Added:** `csv-parse`, `iconv-lite`, `multer`, `source` + `currency` fields, `UploadPanel` component.
---
## Folder Structure
```
/volume2/docker/finance/finance-hub/
├── docker-compose.yml
├── .env
├── .env.example
├── .gitignore
├── backend/
│ ├── Dockerfile
│ ├── package.json
│ ├── prisma/
│ │ ├── schema.prisma
│ │ └── migrations/
│ │ ├── migration_lock.toml
│ │ └── 20260508_init/
│ │ └── migration.sql
│ └── src/
│ ├── index.js ← entry point (Authentik middleware wired here)
│ ├── auth.js ← Authentik header middleware (replaces JWT auth)
│ ├── parser.js ← SMS parser (copy verbatim from payments-logger)
│ ├── csvParser.js ← NEW: DSK CSV parser (port of Python dskuploader.py)
│ └── routes/
│ ├── payments.js ← existing routes + source/currency additions
│ └── upload.js ← NEW: POST /api/upload/csv
└── frontend/
├── Dockerfile
├── package.json
├── vite.config.js
├── tailwind.config.js
├── postcss.config.js
├── index.html
└── src/
├── main.jsx ← remove AuthProvider wrapper
├── index.css
├── App.jsx ← remove auth state, add Upload tab toggle
└── components/
├── FilterBar.jsx ← add source filter select
├── PaymentTable.jsx ← add Source badge column + currency display
├── PaymentCard.jsx ← minor source badge addition
├── PaymentList.jsx ← unchanged
└── UploadPanel.jsx ← NEW: drag-and-drop CSV upload UI
```
---
## Database Schema (Prisma)
File: `backend/prisma/schema.prisma`
```prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status { UNPROCESSED SENT SKIPPED }
enum Source { INGEST UPLOAD }
```
**Key decisions:**
- No `User` model — Authentik owns identity.
- `currency`: `EUR` for SMS ingest, `BGN` for CSV uploads.
- `debitBgn`, `creditBgn`, `transactionType`, `payerAccount`: nullable CSV-only columns; INGEST rows store nulls. Avoids a union query for the unified list view.
- `balance` is always null for CSV rows (DSK export does not include running balance).
- Fresh consolidated migration — no data migration from reference apps required.
---
## API Routes
| Method | Path | Auth | Description |
|--------|------|------|-------------|
| GET | /api/health | public | Health check |
| POST | /api/payments/ingest | public | SMS or structured ingest (source=INGEST) |
| GET | /api/payments | required | List with filters/sort/pagination (+ source filter) |
| GET | /api/payments/meta/tags | required | All tags |
| GET | /api/payments/meta/filters | required | Filter options incl. `sources` array |
| GET | /api/payments/:id | required | Single payment |
| PATCH | /api/payments/:id | required | Update status |
| DELETE | /api/payments/:id | required | Delete |
| POST | /api/payments/:id/send | required | Send notification |
| POST | /api/payments/:id/skip | required | Skip |
| POST | /api/payments/:id/tags | required | Add/upsert tag |
| DELETE | /api/payments/:id/tags/:tagId | required | Remove tag |
| POST | /api/upload/csv | required | DSK CSV file upload (source=UPLOAD) |
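For reference, the public ingest route can be exercised without the UI. This sketch only builds the request: the `{ message }` payload key and the `http://nas:3001` base URL are assumptions for illustration — the plan only states that ingest accepts raw SMS text or structured JSON.

```js
// Hypothetical helper for calling the public ingest endpoint.
// The { message } body shape and base URL are assumed, not confirmed by the plan.
function buildIngestRequest(baseUrl, smsText) {
  return {
    url: `${baseUrl}/api/payments/ingest`,
    options: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ message: smsText }),
    },
  };
}

const ingestReq = buildIngestRequest('http://nas:3001', 'DSK Bank. Na 09/05/2026 v 09:41 ...');
console.log(ingestReq.url); // http://nas:3001/api/payments/ingest
// An iOS Shortcut or SMS gateway would then run: fetch(ingestReq.url, ingestReq.options)
```

Because this route bypasses Authentik, the rate limiter in `index.js` is the only guard on it.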
---
## Key Implementation Details
### auth.js (replaces entire old auth module)
```js
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) return res.status(401).json({ error: 'Unauthorized' });
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
```
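The middleware above can be sanity-checked in isolation with mocked `req`/`res` objects — the mock shapes below are test stand-ins, not Express internals:

```js
// Self-check of the Authentik header middleware (body copied from the plan).
const PUBLIC_PATHS = new Set(['/api/health', '/api/payments/ingest']);
function authentikMiddleware(req, res, next) {
  if (PUBLIC_PATHS.has(req.path)) return next();
  const username = req.headers['x-authentik-username'];
  if (!username) return res.status(401).json({ error: 'Unauthorized' });
  req.user = {
    username,
    email: req.headers['x-authentik-email'] || null,
    groups: (req.headers['x-authentik-groups'] || '').split(',').map(g => g.trim()).filter(Boolean),
  };
  next();
}

// Authenticated request: headers as NPM injects them after Authentik approves.
const mockReq = {
  path: '/api/payments',
  headers: { 'x-authentik-username': 'ivan', 'x-authentik-groups': 'admins, finance' },
};
let nextCalled = false;
authentikMiddleware(mockReq, { status: () => ({ json: () => {} }) }, () => { nextCalled = true; });
console.log(nextCalled, mockReq.user.groups); // true [ 'admins', 'finance' ]

// Missing header → 401, next never called.
let code = null;
authentikMiddleware(
  { path: '/api/payments', headers: {} },
  { status: (c) => { code = c; return { json: () => {} }; } },
  () => {},
);
console.log(code); // 401
```

Note the group list round-trips through `trim`, so NPM header values like `admins, finance` parse cleanly.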
### csvParser.js (port of dskuploader.py)
- `iconv-lite` decodes buffer as cp1251 (DSK Bank export encoding), falls back to UTF-8
- `csv-parse` parses the decoded text with `columns: true`
- Columns: `Дата`, `Вид на трансакцията`, `Основание`, `Дебит BGN`, `Кредит BGN`, `Наредител/Получател`, `Номер сметка на наредителя / получателя`
- Card extraction: regex `/^\d{6}x{6}\d{4}$/` on first token of `Основание`
- Skips rows where `Вид на трансакцията === 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ'`
- Auto-tags via keyword rules (ЗАПЛАТА→Salary, LIDL→Groceries, NETFLIX→Subscriptions, etc.) — same logic as Python `generate_tags()`
- Returns `{ rows: PaymentData[], skipped: number, errors: string[] }`
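The card-extraction step above can be sketched in isolation (the sample reason strings are invented for illustration; the regex is the one from the plan):

```js
// Card mask extraction from the first token of the Основание (reason) field.
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;

function extractCard(reason) {
  const parts = reason.trim().split(' ');
  if (parts[0] && CARD_REGEX.test(parts[0])) {
    // First token is a DSK card mask → strip it from the reason text
    return { card: parts[0], reason: parts.slice(1).join(' ').trim() };
  }
  return { card: null, reason: reason.trim() };
}

console.log(extractCard('516791xxxxxx1234 ПЛАЩАНЕ НА ПОС'));
// → { card: '516791xxxxxx1234', reason: 'ПЛАЩАНЕ НА ПОС' }
console.log(extractCard('ЗАПЛАТА МЕСЕЦ АПРИЛ').card); // → null
```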
### payments.js changes from payments-logger
1. Add `source: 'INGEST'` and `currency` to the `/ingest` create call
2. Add `source` to the `GET /` where clause filter
3. Add `sources` to `meta/filters` response
4. Currency-aware amount formatting in notification message
5. Remove all JWT/auth references (no `/auth/register`, `/auth/login`)
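Change 2 can be sketched as a pure helper — the name `buildWhere` and the exact query keys are assumptions about payments-logger's internals, not its actual code:

```js
// Hypothetical sketch of extending the list query's where-clause with the
// new source filter; only values matching the Source enum pass through.
const SOURCES = new Set(['INGEST', 'UPLOAD']);

function buildWhere(query) {
  const where = {};
  if (query.status) where.status = query.status;
  if (SOURCES.has(query.source)) where.source = query.source; // new in finance-hub
  return where;
}

console.log(buildWhere({ source: 'UPLOAD' })); // { source: 'UPLOAD' }
console.log(buildWhere({ source: 'bogus' })); // {} — invalid values ignored
```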
### upload.js (new)
- `multer` memory storage, max 10 files × 10 MB
- Calls `parseDskCsv(buffer)` per file
- Upserts tags via `prisma.tag.upsert` then connects
- Returns `{ imported, skipped, errors, payments[] }`
### Frontend changes
- **Delete**: `auth.js`, `AuthProvider.jsx`
- **main.jsx**: Remove `<AuthProvider>` wrapper
- **App.jsx**: Replace `authFetch` with plain `fetch` (Authentik session cookie travels automatically); logout → `window.location.href = '/outpost.goauthentik.io/sign_out'`; add "Payments" / "Upload CSV" tab toggle
- **FilterBar.jsx**: Add source `<select>` (All / SMS Ingest / CSV Upload); widen grid to 5 cols
- **PaymentTable.jsx**: Add `Source` column with `SMS` (indigo) / `CSV` (emerald) badge; show `${amount} ${currency}` in amount cell
- **UploadPanel.jsx** (new): Drag-and-drop zone + file list + import button + result summary with error accordion; calls `POST /api/upload/csv` with `FormData`
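The UploadPanel submit can be sketched with the standard FormData API. The multipart field name `files` is an assumption — it must match whatever `multer` is configured with in `upload.js`:

```js
// Builds the multipart body for POST /api/upload/csv.
// Field name 'files' is assumed; Blob stands in for a browser File object.
function buildUploadBody(csvBuffers) {
  const form = new FormData();
  for (const buf of csvBuffers) {
    form.append('files', new Blob([buf], { type: 'text/csv' }), 'export.csv');
  }
  return form;
}

const form = buildUploadBody([Buffer.from('Дата,Основание\n')]);
console.log(form.getAll('files').length); // 1
// In the component: await fetch('/api/upload/csv', { method: 'POST', body: form });
```

No Content-Type header is set manually — `fetch` derives the multipart boundary from the FormData body itself.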
---
## Docker Compose
```yaml
services:
db:
image: postgres:16-alpine
environment: { POSTGRES_USER: finance, POSTGRES_PASSWORD: [PASSWORD], POSTGRES_DB: finance_hub }
volumes: [pgdata:/var/lib/postgresql/data]
healthcheck: { test: pg_isready -U finance -d finance_hub, interval: 5s }
backend:
build: ./backend
environment:
DATABASE_URL: [CONNECTION_STRING]
PORT: "3001"
NOTIFIER_URL: ${NOTIFIER_URL}
NOTIFIER_CHANNEL: ${NOTIFIER_CHANNEL:-viber}
NOTIFY_DEFAULT_PHONE: ${NOTIFY_DEFAULT_PHONE}
TZ: ${TZ:-Europe/Sofia}
ports: ["${BACKEND_PORT:-3001}:3001"]
depends_on: { db: { condition: service_healthy } }
frontend:
build: ./frontend
ports: ["${FRONTEND_PORT:-5175}:5173"]
depends_on: [backend]
volumes: { pgdata: }
```
**NPM setup:** Frontend port proxied through Authentik forward auth. Backend port exposed separately (unprotected) so the public `/api/payments/ingest` remains reachable by iOS Shortcuts / SMS gateways without login.
---
## Environment Variables (.env)
```bash
[ENV_SECRET]
NOTIFIER_URL=[URL_WITH_CREDENTIALS]
NOTIFIER_CHANNEL=viber
NOTIFY_DEFAULT_PHONE=
TZ=Europe/Sofia
BACKEND_PORT=3001
FRONTEND_PORT=5175
```
Write
schema.prisma
schema.prisma
generator client {
provider = "prisma-client-js"
}
datasource db {
provider = "postgresql"
url = env("DATABASE_URL")
}
model Payment {
id Int @id @default(autoincrement())
rawMessage String @map("raw_message")
date DateTime?
type String?
card String?
recipient String?
amount Float?
currency String? @default("EUR")
balance Float?
source Source @default(INGEST)
status Status @default(UNPROCESSED)
tags Tag[]
notifiedAt DateTime? @map("notified_at")
notifyPhone String? @map("notify_phone")
debitBgn Float? @map("debit_bgn")
creditBgn Float? @map("credit_bgn")
transactionType String? @map("transaction_type")
payerAccount String? @map("payer_account")
createdAt DateTime @default(now()) @map("created_at")
updatedAt DateTime @updatedAt @map("updated_at")
@@map("payments")
}
model Tag {
id Int @id @default(autoincrement())
name String @unique
color String @default("#6b7280")
payments Payment[]
@@map("tags")
}
enum Status {
UNPROCESSED
SENT
SKIPPED
}
enum Source {
INGEST
UPLOAD
}
Write
migration.sql
55 lines
-- CreateEnum
CREATE TYPE "Status" AS ENUM ('UNPROCESSED', 'SENT', 'SKIPPED');
CREATE TYPE "Source" AS ENUM ('INGEST', 'UPLOAD');
-- CreateTable
CREATE TABLE "payments" (
"id" SERIAL PRIMARY KEY,
"raw_message" TEXT NOT NULL,
"date" TIMESTAMP(3),
"type" TEXT,
"card" TEXT,
"recipient" TEXT,
"amount" DOUBLE PRECISION,
"currency" TEXT DEFAULT 'EUR',
"balance" DOUBLE PRECISION,
"source" "Source" NOT NULL DEFAULT 'INGEST',
"status" "Status" NOT NULL DEFAULT 'UNPROCESSED',
"notified_at" TIMESTAMP(3),
"notify_phone" TEXT,
"debit_bgn" DOUBLE PRECISION,
"credit_bgn" DOUBLE PRECISION,
"transaction_type" TEXT,
"payer_account" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- CreateTable
CREATE TABLE "tags" (
"id" SERIAL PRIMARY KEY,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL DEFAULT '#6b7280'
);
-- CreateUniqueIndex
CREATE UNIQUE INDEX "tags_name_key" ON "tags"("name");
-- CreateTable (M2M join)
CREATE TABLE "_PaymentToTag" (
"A" INTEGER NOT NULL,
"B" INTEGER NOT NULL,
CONSTRAINT "_PaymentToTag_AB_pkey" PRIMARY KEY ("A", "B")
);
CREATE INDEX "_PaymentToTag_B_index" ON "_PaymentToTag"("B");
-- AddForeignKey
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_A_fkey"
FOREIGN KEY ("A") REFERENCES "payments"("id") ON DELETE CASCADE ON UPDATE CASCADE;
ALTER TABLE "_PaymentToTag"
ADD CONSTRAINT "_PaymentToTag_B_fkey"
FOREIGN KEY ("B") REFERENCES "tags"("id") ON DELETE CASCADE ON UPDATE CASCADE;
Write
migration_lock.toml
4 lines
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
package.json
26 lines
{
"name": "finance-hub-backend",
"version": "1.0.0",
"main": "src/index.js",
"scripts": {
"start": "node src/index.js",
"dev": "nodemon src/index.js",
"prisma:generate": "prisma generate",
"prisma:migrate": "prisma migrate deploy"
},
"dependencies": {
"@prisma/client": "^5.22.0",
"cors": "^2.8.5",
"csv-parse": "^5.5.6",
"express": "^4.21.1",
"express-rate-limit": "^7.4.0",
"iconv-lite": "^0.6.3",
"morgan": "^1.10.0",
"multer": "^1.4.5-lts.1"
},
"devDependencies": {
"nodemon": "^3.1.7",
"prisma": "^5.22.0"
}
}
Write
Dockerfile
18 lines
FROM node:20-alpine
RUN apk add --no-cache openssl
WORKDIR /app
COPY package.json package-lock.json* ./
RUN npm install
COPY prisma ./prisma
RUN npx prisma generate
COPY src ./src
EXPOSE 3001
CMD ["sh", "-c", "npx prisma migrate deploy && node src/index.js"]
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
auth.js
27 lines
const PUBLIC_PATHS = new Set([
'/api/health',
'/api/payments/ingest',
]);
function authentikMiddleware(req, res, next) {
if (PUBLIC_PATHS.has(req.path)) return next();
const username = req.headers['x-authentik-username'];
if (!username) {
return res.status(401).json({ error: 'Unauthorized — missing Authentik session' });
}
req.user = {
username,
email: req.headers['x-authentik-email'] || null,
groups: (req.headers['x-authentik-groups'] || '')
.split(',')
.map(g => g.trim())
.filter(Boolean),
};
next();
}
module.exports = { authentikMiddleware };
Write
parser.js
104 lines
/**
* Parses SMS payment messages from Bulgarian banks (DSK Bank, etc.)
*
* Supported formats:
*
* POS / INTERNET / ECOM / P2P payment:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY s karta CARD na POS s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM withdrawal:
* DSK Bank. Na DD/MM/YYYY v HH:MM sa iztegleni AMOUNT CURRENCY s karta CARD ot ATM s adres: RECIPIENT. Nalichni: BALANCE CURRENCY.
*
* ATM utility payment (amount may include fee as AMOUNT/FEE):
* DSK Bank. Na DD/MM/YYYY v HH:MM sa plateni AMOUNT CURRENCY/FEE CURRENCY s karta CARD na ATM s adres:RECIPIENT. Nalichni: BALANCE CURRENCY.
*/
const LOCAL_TZ = process.env.TZ || 'Europe/Sofia';
/**
* Convert a local-timezone date/time to a UTC Date object.
* Uses Intl to resolve the actual UTC offset (DST-aware).
*/
function localToUtc(year, month, day, hour, minute) {
const naive = new Date(Date.UTC(year, month - 1, day, hour, minute, 0));
const formatter = new Intl.DateTimeFormat('en-US', {
timeZone: LOCAL_TZ,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false,
});
const parts = {};
formatter.formatToParts(naive).forEach(p => { parts[p.type] = p.value; });
const localAtNaive = new Date(Date.UTC(
parseInt(parts.year), parseInt(parts.month) - 1, parseInt(parts.day),
parseInt(parts.hour) % 24, parseInt(parts.minute), parseInt(parts.second),
));
const offsetMs = localAtNaive.getTime() - naive.getTime();
return new Date(Date.UTC(year, month - 1, day, hour, minute, 0) - offsetMs);
}
function parsePaymentSms(message) {
const result = {
rawMessage: message,
date: null,
type: null,
card: null,
recipient: null,
amount: null,
balance: null,
};
// Date and time: "Na DD/MM/YYYY v HH:MM"
const dateMatch = message.match(/Na (\d{2})\/(\d{2})\/(\d{4}) v (\d{2}):(\d{2})/i);
if (dateMatch) {
const [, day, month, year, hour, minute] = dateMatch;
result.date = localToUtc(
parseInt(year), parseInt(month), parseInt(day),
parseInt(hour), parseInt(minute),
);
}
// Card mask: "s karta 400915***4447" or "s karta 483890***7162"
const cardMatch = message.match(/s karta\s+([\d*]+)/i);
if (cardMatch) {
result.card = cardMatch[1];
}
// Transaction type: supports both prepositions
// "na POS" / "na ATM" / "na INTERNET" etc. (payment)
// "ot ATM" (withdrawal)
const typeMatch = message.match(/(?:na|ot)\s+(POS|ATM|INTERNET|ECOM|P2P)\b/i);
if (typeMatch) {
result.type = typeMatch[1].toUpperCase();
}
// Recipient address: "s adres: MERCHANT" or "s adres:MERCHANT" (no space variant)
const recipientMatch = message.match(/s adres:\s*([^.]+)\./i);
if (recipientMatch) {
result.recipient = recipientMatch[1].trim();
}
// Amount: handles both verbs and the AMOUNT/FEE suffix format
// "sa plateni 7.78 EUR"
// "sa iztegleni 400.00 EUR"
// "sa plateni 0.50 EUR/0.50 EUR" → captures 0.50 (the charged amount, ignoring fee)
const amountMatch = message.match(/sa (?:plateni|iztegleni)\s+([\d.,]+)\s+[A-Z]{3}/i);
if (amountMatch) {
result.amount = parseFloat(amountMatch[1].replace(',', '.'));
}
// Balance: "Nalichni: 2583.07 EUR."
const balanceMatch = message.match(/Nalichni:\s*([\d.,]+)\s+[A-Z]{3}/i);
if (balanceMatch) {
result.balance = parseFloat(balanceMatch[1].replace(',', '.'));
}
return result;
}
module.exports = { parsePaymentSms };
Write
csvParser.js
175 lines
/**
* DSK Bank CSV parser — Node.js port of dskuploader.py
*
* DSK Bank exports use Windows-1251 (cp1251) encoding.
* Each row maps to a Payment record with source=UPLOAD, currency=BGN.
*/
const { parse } = require('csv-parse');
const iconv = require('iconv-lite');
const SKIP_TYPE = 'ТРАНСФЕР СОБСТВЕНИ СМЕТКИ';
const CARD_REGEX = /^\d{6}x{6}\d{4}$/;
const POS_REGEX = /^\s*ПЛАЩАНЕ\s+НА\s+ПОС\s+\d{2}\.\d{2}\.\d{4}\s+\d{2}:\d{2}/;
const COL = {
DATE: 'Дата',
TYPE: 'Вид на трансакцията',
REASON: 'Основание',
DEBIT: 'Дебит BGN',
CREDIT: 'Кредит BGN',
PAYEE: 'Наредител/Получател',
ACCT: 'Номер сметка на наредителя / получателя',
};
const TAG_RULES = [
['reason', 'ЗАПЛАТА', 'Salary'],
['reason', 'ТЕГЛЕНЕ НА ATM', 'ATM'],
['reason', 'ПЛАЩАНЕ ПО ЗАЕМ', 'Home Credit'],
['reason', 'АВТ.ТАКСА ОБСЛУЖВАНЕ', 'Bills'],
['transactionType', 'КОМУНАЛНИ УСЛУГИ', 'Bills'],
['payee', 'VIVACOM', 'Subscriptions'],
['payee', 'Google', 'Subscriptions'],
['payee', 'SkyShowtime', 'Subscriptions'],
['payee', 'NETFLIX', 'Subscriptions'],
['payee', 'LUKOIL', 'Bills'],
['payee', 'CityGate', 'Bills'],
['payee', 'CBA', 'Groceries'],
['payee', 'FANTASTICO', 'Groceries'],
['payee', 'LIDL', 'Groceries'],
];
function parseNum(val) {
if (val == null || val === '') return null;
if (typeof val === 'number') return isNaN(val) ? null : val;
const s = String(val).trim().replace(/\xa0/g, '').replace(/ /g, '').replace(',', '.');
const n = parseFloat(s);
return isNaN(n) ? null : n;
}
function parseDate(val) {
if (!val) return null;
const s = String(val).trim();
const m = s.match(/^(\d{2})\.(\d{2})\.(\d{4})$/);
if (m) {
return new Date(Date.UTC(parseInt(m[3]), parseInt(m[2]) - 1, parseInt(m[1])));
}
return null;
}
function processReasonAndCard(reason) {
if (!reason || typeof reason !== 'string') return { reason: '', card: null };
const parts = reason.trim().split(' ');
let card = null;
let cleanReason = reason.trim();
if (parts[0] && CARD_REGEX.test(parts[0])) {
card = parts[0];
cleanReason = parts.slice(1).join(' ').trim();
}
if (POS_REGEX.test(cleanReason)) {
const posParts = cleanReason.split('<br/>');
try {
const dateTime = posParts[0].split('ПОС ')[1];
cleanReason = `POS PAYMENT ${dateTime}`;
} catch (_) { /* keep original */ }
}
return { reason: cleanReason.replace(/\s+/g, ' ').trim(), card };
}
function generateTags(fields) {
const tags = new Set();
for (const [field, keyword, tagName] of TAG_RULES) {
if ((fields[field] || '').includes(keyword)) {
tags.add(tagName);
}
}
return Array.from(tags);
}
function processRow(row) {
const transactionType = (row[COL.TYPE] || '').trim();
if (transactionType === SKIP_TYPE) return null;
const { reason, card } = processReasonAndCard(row[COL.REASON]);
const payee = (row[COL.PAYEE] || '').trim();
const payerAccount = (row[COL.ACCT] || '').trim();
const debitBgn = parseNum(row[COL.DEBIT]);
const creditBgn = parseNum(row[COL.CREDIT]);
const date = parseDate(row[COL.DATE]);
const autoTags = generateTags({ reason, transactionType, payee, debitBgn, creditBgn });
const amount = debitBgn ?? creditBgn ?? null;
const rawMessage = [
row[COL.DATE] && `Date: ${row[COL.DATE]}`,
transactionType && `Type: ${transactionType}`,
payee && `Payee: ${payee}`,
debitBgn != null && `Debit: ${debitBgn} BGN`,
creditBgn != null && `Credit: ${creditBgn} BGN`,
].filter(Boolean).join(' | ');
return {
rawMessage,
date,
type: null,
card,
recipient: payee || null,
amount,
currency: 'BGN',
balance: null,
source: 'UPLOAD',
debitBgn,
creditBgn,
transactionType: transactionType || null,
payerAccount: payerAccount || null,
autoTags,
};
}
/**
* Parse a Buffer containing a DSK Bank CSV (cp1251 or UTF-8).
* Returns { rows, skipped, errors }.
*/
async function parseDskCsv(buffer) {
// Try cp1251 first (DSK Bank export encoding), fall back to UTF-8
let text = iconv.decode(buffer, 'cp1251');
if (!text.includes(COL.DATE)) {
text = buffer.toString('utf-8');
}
return new Promise((resolve, reject) => {
const rows = [];
const errors = [];
let skipped = 0;
const parser = parse(text, {
columns: true,
skip_empty_lines: true,
trim: true,
relax_column_count: true,
});
parser.on('readable', () => {
let record;
while ((record = parser.read()) !== null) {
try {
const row = processRow(record);
if (row === null) { skipped++; } else { rows.push(row); }
} catch (err) {
errors.push(err.message);
}
}
});
parser.on('error', reject);
parser.on('end', () => resolve({ rows, skipped, errors }));
});
}
module.exports = { parseDskCsv };
Write
index.js
39 lines
const express = require('express');
const cors = require('cors');
const morgan = require('morgan');
const rateLimit = require('express-rate-limit');
const { authentikMiddleware } = require('./auth');
const paymentsRouter = require('./routes/payments');
const uploadRouter = require('./routes/upload');
const app = express();
const PORT = process.env.PORT || 3001;
app.use(cors());
app.use(express.json({ limit: '16kb' }));
app.use(morgan('combined'));
// Rate-limit the public ingest endpoint before auth middleware
const ingestLimiter = rateLimit({
windowMs: 60 * 1000,
max: 200,
standardHeaders: true,
legacyHeaders: false,
message: { error: 'Too many requests, slow down' },
});
app.use('/api/payments/ingest', ingestLimiter);
// Authentik header auth (skips /api/health and /api/payments/ingest)
app.use(authentikMiddleware);
app.get('/api/health', (_req, res) => {
res.json({ status: 'ok', timestamp: new Date().toISOString() });
});
app.use('/api/payments', paymentsRouter);
app.use('/api/upload', uploadRouter);
app.listen(PORT, '[IP_ADDRESS]', () => {
console.log(`Finance Hub API running on port ${PORT}`);
});
Update Todos
Create folder structure and root config files
Write backend Prisma schema and migration
Write backend package.json and Dockerfile
Write backend src files (auth, parser, csvParser, index)
Write backend routes (payments.js, upload.js)
Write frontend config files
Write frontend React components
Build and verify with docker compose
Write
payments.js