27 Commits

Author SHA1 Message Date
ealmeida 94db202de9 fix(monitoring): SSH into EasyPanel instead of a nonexistent API
- server-metrics: replace CWP (ed25519-only) with the Easy server
  (password auth accepted on port 22)
- monitoring-collector: drop the calls to monitor.getSystemStats and
  monitor.getDockerTaskStats (endpoint does not exist in this EasyPanel version);
  CPU/RAM metrics via SSH and containers via docker service ls over SSH

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-28 16:08:20 +01:00
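A sketch of parsing `docker service ls` output fetched over SSH, as the commit above describes; it assumes a `--format` template with tab-separated name and replica fields (the collector's actual field list is not shown here, so the shape is illustrative):

```typescript
// Parse lines like "web\t3/3", as produced by e.g.:
//   docker service ls --format '{{.Name}}\t{{.Replicas}}'
// Field choice is an assumption; the real collector may request other columns.
interface ServiceStatus { name: string; running: number; desired: number }

function parseServiceLs(output: string): ServiceStatus[] {
  return output
    .split('\n')
    .map((l) => l.trim())
    .filter(Boolean) // skip blank lines from the SSH output
    .map((line) => {
      const [name, replicas] = line.split('\t')
      const [running, desired] = (replicas ?? '0/0').split('/').map(Number)
      return { name, running, desired }
    })
}
```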
ealmeida a594df1c7c auth: longer sessions with automatic silent renew
Adds offline_access to the scope and enables automaticSilentRenew to
refresh tokens silently without forcing a re-login. Requires the Access
Token validity to be raised on the Authentik provider (from 5 min to 8 h).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-28 15:03:44 +01:00
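A minimal sketch of the client-side settings the commit above describes, assuming an oidc-client-ts-style `UserManager` configuration (the authority URL and client_id are placeholders; this repo's actual config object is not shown):

```typescript
// Hypothetical OIDC client settings matching the commit:
// offline_access in the scope + automatic silent renew.
// Field names follow oidc-client-ts conventions; the repo's real config may differ.
const oidcSettings = {
  authority: 'https://authentik.example.pt/application/o/app/', // placeholder
  client_id: 'app', // placeholder
  scope: 'openid profile email offline_access', // offline_access -> refresh token issued
  automaticSilentRenew: true, // renew tokens in the background, no forced re-login
}
// new UserManager(oidcSettings) would then schedule renewals before token expiry.
```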
ealmeida 3c85d03e70 fix: exclude long interruptions from the skill error rate
Sessions with outcome=interrupted and ≥10 events are natural user
redirects, not skill failures. The detector counted every interruption
as a failure, generating false positives for conversational skills such
as superpowers:brainstorming.

Fix: only count real errors (outcome=error) or early interruptions
(<10 events) as failures.

Resolves ticket #10407.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-04-27 11:15:09 +01:00
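The heuristic above reduces to a small predicate; a sketch (the function name and threshold constant are illustrative, the real logic lives in the detector):

```typescript
// Sketch of the failure-classification rule from the commit above.
type Outcome = 'completed' | 'error' | 'interrupted' | 'unknown'

const EARLY_INTERRUPT_MAX_EVENTS = 10 // interruptions below this count as failures

function isSkillFailure(outcome: Outcome, eventCount: number): boolean {
  if (outcome === 'error') return true // real errors always count
  // Long interrupted sessions (>=10 events) are natural user redirects, not failures
  if (outcome === 'interrupted') return eventCount < EARLY_INTERRUPT_MAX_EVENTS
  return false
}
```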
ealmeida 3887547f1c feat(observabilidade): persistent patterns propose staging entries in CARL
When a pattern reaches ≥3 consecutive weeks with warning/action severity,
besides opening a Desk ticket it is also proposed as a staging entry in
/media/ealmeida/Dados/.carl/carl.json for review and eventual promotion
to a rule. Idempotent per pattern_key. Dry-run is log-only.

Closes the Observabilidade → CARL feedback loop identified in the system
analysis: empirically detected patterns become rule proposals.
2026-04-23 03:45:51 +01:00
ealmeida c794e1b6d6 docs(observabilidade): CHANGELOG Phase 6C worklog import
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-23 03:07:37 +01:00
ealmeida afbb06a87d feat(observabilidade): daily systemd timer for the worklog import
- Oneshot service invokes sessions-worklog-import.ts --discussion all --since-days 7
- Timer OnCalendar=*-*-* 03:00:00 with Persistent=true (catch-up)
- EnvironmentFile reuses observabilidade-patterns.env (MCP_GATEWAY_TOKEN)
- Logs appended to ~/.claude-work/observabilidade-worklog-import.log

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-23 03:07:31 +01:00
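Based on the bullets above, the timer unit likely looks similar to this sketch (reconstructed from the commit message, not the actual unit file in the repo):

```ini
# observabilidade-worklog-import.timer (sketch reconstructed from the commit)
[Unit]
Description=Daily worklog import (Observabilidade)

[Timer]
OnCalendar=*-*-* 03:00:00
Persistent=true

[Install]
WantedBy=timers.target
```

With `Persistent=true`, systemd catches up on a missed 03:00 run at the next boot or user-session start.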
ealmeida 6251e0d28c feat(observabilidade): 3 cross worklog × sessions detectors
- #7 actions_never_executed: P1/P2 actions in discussion #33 pending for ≥14 days
- #8 skill_narrative_vs_data: skill reported as problematic in worklogs
  but with outcome=completed in the sessions (≥3 matches)
- #9 worklog_pattern_frequency: recurring tokens (≥3 worklogs) in patterns_text
- Integrated into detectPatterns() as an optional section when worklogs > 0

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-23 03:07:25 +01:00
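Detector #9 above can be approximated by a per-worklog token count; a sketch under stated assumptions (tokenization and the ≥3-worklog threshold follow the commit text, the real implementation in patterns.ts may differ):

```typescript
// Count in how many distinct worklogs each token appears and keep tokens
// recurring in at least `minWorklogs` of them (3 per detector #9).
function recurringTokens(worklogTexts: string[], minWorklogs = 3): Map<string, number> {
  const counts = new Map<string, number>()
  for (const text of worklogTexts) {
    // Count each token once per worklog, regardless of repeats inside it
    const tokens = new Set(text.toLowerCase().split(/\W+/).filter((t) => t.length > 3))
    for (const t of tokens) counts.set(t, (counts.get(t) ?? 0) + 1)
  }
  for (const [t, n] of counts) if (n < minWorklogs) counts.delete(t)
  return counts
}
```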
ealmeida f4adf8674d feat(observabilidade): sessions-worklog-import CLI with pagination
- CLI script with args --discussion 31|32|33|all, --since-days N, --force
- Pagination via the MCP gateway (limit 100, tree_view false)
- Progressive JSON-line output + final summary
- Tests: tolerant parseWorklogHtml (h2/h3/h4), upsert idempotency

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-23 03:07:17 +01:00
ealmeida 11f9833aac feat(observabilidade): worklog_comments table + HTML parser + MCP importer
- worklog_comments schema (id, discussion, parent, dates, staff, parsed fields as JSON)
- Tolerant HTML parser (h2/h3/h4) extracts title, task_ref, duration, work_items,
  files_modified, problems, patterns_text, actions
- worklog-import module with MCP get_discussion_comments pagination
- Shared mcp-client.ts helper (MCP gateway JSON-RPC + SSE)
- Runtime dep: node-html-parser

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-23 03:07:09 +01:00
ealmeida 86770b1570 feat(observabilidade): add --backfill flag to sessions-patterns
Iterates over the ISO weeks from the first session up to (excluding) the
current week, detecting patterns and upserting with accumulated
consecutive_weeks. Never publishes comments or opens tickets; it only
populates the patterns table with history. JSON output per week plus a
final summary.

Allows the patterns table to be filled retroactively after a fresh
install or a reset, giving the persistent-pattern detector an immediate
baseline.
2026-04-23 02:33:57 +01:00
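The week iteration described above can be sketched as a pure helper (illustrative; the real script drives the same loop through weekRange/weekIso and runs detection per week):

```typescript
// Enumerate ISO-week Monday start dates from `firstMonday` up to, but
// excluding, `currentMonday`. Sketch of the --backfill loop; the real
// implementation also runs detectPatterns and upserts for each week.
function enumerateWeekStarts(firstMonday: Date, currentMonday: Date): Date[] {
  const weeks: Date[] = []
  const cursor = new Date(firstMonday)
  while (cursor.getTime() < currentMonday.getTime()) {
    weeks.push(new Date(cursor))
    cursor.setUTCDate(cursor.getUTCDate() + 7) // advance one ISO week
  }
  return weeks
}
```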
ealmeida 9652805b1e refactor(observabilidade): pattern detector uses the MCP gateway instead of the Desk API directly
Replaces direct HTTP calls to the Desk API (/api/v1/discussions,
/api/v1/tickets) with JSON-RPC 2.0 calls to the MCP gateway (desk-crm).
The callMcpTool helper handles both JSON and SSE responses. Replaces
DESK_API_TOKEN/DESK_BASE_URL with MCP_GATEWAY_TOKEN/URL.

- add_discussion_comment: discussion_id=32, staff_id=25 (Observabilidade)
- create_ticket: subject/message/priority(1-4)/department=1

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-23 02:27:09 +01:00
ealmeida ac4e9c6f35 fix(observabilidade): parser extracts skills from tool_result.content (string and array)
Before: skills_invoked was empty in 1608/1608 sessions because
detectSkillInvoked was only applied to the text extracted from
content[type=text]. The 'Launching skill: X' string lives inside
tool_result.content (a string or an array of text blocks), which was
being ignored.

Fix: add an extractResultText(r) helper that handles both cases and
apply detectSkillInvoked + detectHook to the tool_result as well. After
a full re-index, 526/1616 sessions now have skills detected and the
pattern detector returns 6 patterns (vs a baseline of 2), including
real skills_with_high_error_rate hits.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-23 02:20:24 +01:00
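The shape of the extractResultText helper named above is roughly the following sketch (consistent with the commit description; the actual implementation in the parser may differ in details):

```typescript
// tool_result.content can be a plain string or an array of text blocks;
// normalize both to one string so skill/hook detection can run over it.
type ContentBlock = { type: string; text?: string }
type ToolResult = { content?: string | ContentBlock[] }

function extractResultText(r: ToolResult): string {
  const c = r.content
  if (typeof c === 'string') return c
  if (Array.isArray(c)) {
    return c
      .filter((b) => b.type === 'text' && typeof b.text === 'string')
      .map((b) => b.text as string)
      .join('\n')
  }
  return ''
}
```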
ealmeida 1eb4f246de docs(observabilidade): CHANGELOG Phase 6A pattern detector
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-23 02:17:43 +01:00
ealmeida 94088442c2 feat(observabilidade): weekly systemd timer for the pattern detector
User units:
  observabilidade-patterns.service (Type=oneshot, runs --publish)
  observabilidade-patterns.timer (OnCalendar=Sun 23:00, Persistent=true)

EnvironmentFile points to ~/.claude-work/observabilidade-patterns.env
(private file, not committed). Example provided in .env.example.

User activation:
  cp systemd/observabilidade-patterns.{service,timer} ~/.config/systemd/user/
  cp systemd/observabilidade-patterns.env.example ~/.claude-work/observabilidade-patterns.env
  $EDITOR ~/.claude-work/observabilidade-patterns.env  # set DESK_API_TOKEN
  systemctl --user daemon-reload
  systemctl --user enable --now observabilidade-patterns.timer

Refs Phase 6A

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-23 02:17:42 +01:00
ealmeida 5bd1459c7d feat(observabilidade): patterns CLI with dry-run and publish to Desk #32
CLI script api/scripts/sessions-patterns.ts with args --week, --publish, --force.
Default: current week, dry-run (HTML rendered to stderr + JSON summary on stdout).

With --publish:
  - POSTs the HTML comment to /api/v1/discussions/32/comments (Desk)
  - For patterns with consecutive_weeks>=3 and severity warning|action:
    auto-opens a TICKET via /api/v1/tickets (priority 3|4 by severity)

Internal pipeline: detectPatterns -> upsertPattern placeholder -> compute
consecutive_weeks -> final upsert. Defensive HTML escaping; 5 sample ids
per pattern.

Auth via DESK_API_TOKEN (env file), NEVER hardcoded.

Refs Phase 6A

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-23 02:17:31 +01:00
ealmeida 2a523a505e feat(observabilidade): patterns table + 6 SQL detectors
Adds a 'patterns' table to the sessions DB (UNIQUE on week_iso+pattern_key)
and upsertPattern/getPatternsByWeek/getConsecutiveWeeks helpers on SessionsDb.

The patterns.ts module implements 6 heuristic detectors for weekly detection:
  1. skills_with_high_error_rate (ratio > 0.2, severity warning|action)
  2. tools_low_efficiency (avg tool_calls/event_count > 0.5)
  3. skill_tool_pairs (top 5 co-occurrences)
  4. duration_outliers (sessions > p95 with outcome != completed)
  5. abandoned_sessions (event_count<3 AND outcome=unknown, >=5)
  6. growing_complexity (current avg tool_calls > previous*1.3)

5 tests cover the error detector, abandoned sessions, consecutive_weeks,
upsert idempotency and toPatternRecord.

Refs Phase 6A · Desk #2059 · Project #65

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-23 02:17:21 +01:00
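Detector 1 above reduces to an error-rate threshold per skill; a minimal in-memory sketch (the real detector runs as SQL over the sessions DB, and the action-at-double-the-threshold severity rule here is an assumption for illustration):

```typescript
// Flag skills whose failure ratio exceeds `threshold` (0.2 per detector 1).
// Severity escalation (warning -> action at 2x threshold) is a hypothetical rule.
interface SessionRow { skill: string; failed: boolean }

function skillsWithHighErrorRate(rows: SessionRow[], threshold = 0.2) {
  const bySkill = new Map<string, { total: number; failed: number }>()
  for (const r of rows) {
    const s = bySkill.get(r.skill) ?? { total: 0, failed: 0 }
    s.total++
    if (r.failed) s.failed++
    bySkill.set(r.skill, s)
  }
  const out: Array<{ skill: string; ratio: number; severity: 'warning' | 'action' }> = []
  for (const [skill, s] of bySkill) {
    const ratio = s.failed / s.total
    if (ratio > threshold) {
      out.push({ skill, ratio, severity: ratio > threshold * 2 ? 'action' : 'warning' })
    }
  }
  return out
}
```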
ealmeida 2c8525bc8a feat(observabilidade): debounced search + fully clickable rows 2026-04-23 02:01:25 +01:00
ealmeida d2452d4402 docs(observabilidade): v2.7.0: Espelho MVP delivered 2026-04-23 01:25:06 +01:00
ealmeida c590431c1f feat(observabilidade): systemd user service for the watcher
Creates a unit file to run sessions-indexer in permanent --watch mode.
Includes KillMode=mixed + TimeoutStopSec=10s + Restart=on-failure for a
graceful shutdown of the watcher+DB.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-23 01:22:44 +01:00
ealmeida 80a5f3bf42 feat(observabilidade): incremental chokidar watcher 2026-04-23 01:19:21 +01:00
ealmeida 8ca6b7e166 feat(observabilidade): per-session timeline UI with filters 2026-04-23 01:14:59 +01:00
ealmeida eb781a87ce feat(observabilidade): session list UI with filters 2026-04-23 01:10:23 +01:00
ealmeida b933b4c2e2 fix(observabilidade): close the DB on SIGTERM and distinguish ENOENT from parse errors 2026-04-23 01:07:40 +01:00
ealmeida e101577d61 feat(observabilidade): /api/sessions route with Zod validation
Task 5 of the Espelho MVP: Express endpoint with a createSessionsRouter(db)
factory exposing GET / (list filterable by days/project/tool/skill/q +
limit/offset validated via Zod) and GET /:id (meta + events via
parseSessionFile). Integrated into server.ts with the DB opened from
OBSERVABILIDADE_DB ?? DEFAULT_DB_PATH.

Empirical validation: total=559 sessions (last 7 days), detail with 37 events.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-04-23 01:04:44 +01:00
ealmeida 7a13d21caa fix(observabilidade): stub watcher exits cleanly with code 0 for Task 9 systemd 2026-04-23 01:01:42 +01:00
ealmeida cdadc89cb0 fix(observabilidade): indexer CLI exits with code 1 if failed>0 2026-04-23 00:59:22 +01:00
ealmeida 296819df63 feat(observabilidade): indexer full scan + CLI + stub watcher 2026-04-23 00:57:46 +01:00
36 changed files with 3462 additions and 122 deletions
+32
@@ -2,6 +2,38 @@
All notable changes to this project will be documented in this file.
## [2.7.0] - 2026-04-23
### Added — Observabilidade Phase 6A
- Automatic weekly detector for 6 pattern types (heuristic SQL)
- `patterns` table with week-over-week history
- `api/scripts/sessions-patterns.ts` CLI (dry-run + publish to Desk #32)
- systemd user timer `observabilidade-patterns.timer` (Sundays 23:00)
- Auto-opens a Desk ticket when a pattern persists ≥3 consecutive weeks (severity warning+)
### Added — Observabilidade Phase 6C (Worklog Import)
- `worklog_comments` table + tolerant HTML parser (h2/h3/h4) for Desk discussions #31, #32, #33
- `api/scripts/sessions-worklog-import.ts` CLI with pagination via the MCP gateway
- Daily systemd timer `observabilidade-worklog-import.timer` (03:00)
- 3 cross detectors: `actions_never_executed`, `skill_narrative_vs_data`, `worklog_pattern_frequency`
- Runtime dep: `node-html-parser`
- Initial backfill: 2312 comments (465 + 33 + 1814) imported, spanning 2026-01-27 → 2026-04-23
### Added — Observabilidade (Espelho)
- `/sessions` panel for replaying Claude Code sessions (list + detail timeline)
- `api/scripts/sessions-indexer.ts` indexer (`--full` and `--watch` modes)
- Local SQLite at `~/.claude-work/sessions.db` (1608 sessions, 61 projects)
- `GET /api/sessions` and `GET /api/sessions/:id` routes with Zod validation
- Incremental chokidar watcher + systemd user service `observabilidade-indexer.service`
- React UI with filters (period/project/tool/skill/search) and a collapsible timeline
### Technical Notes
- `better-sqlite3` (WAL + synchronous=NORMAL) + `chokidar`
- Transactional batching of 50 rows/commit in the indexer (full scan: 1603 files in 8s)
- Vite proxy `/api` → `localhost:3001`
- Hub: `/media/ealmeida/Dados/Hub/05-Projectos/Observabilidade/`
- Desk task #2059, project #65
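The 50-rows-per-commit batching noted above boils down to chunking rows and wrapping each chunk in one transaction; a dependency-free sketch (with better-sqlite3, `commitChunk` would be a function produced by `db.transaction(...)`):

```typescript
// Split `rows` into chunks of `size` (50 in the indexer) and invoke
// `commitChunk` once per chunk, each call standing in for one SQLite
// transaction. Returns the number of commits performed.
function batchCommit<T>(rows: T[], size: number, commitChunk: (chunk: T[]) => void): number {
  let commits = 0
  for (let i = 0; i < rows.length; i += size) {
    commitChunk(rows.slice(i, i + size)) // one transaction per chunk
    commits++
  }
  return commits
}
```

Amortizing fsync cost over 50 inserts per transaction is what keeps the full scan of 1603 files in the single-digit-seconds range.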
## [2.6.0] - 2026-02-14
### Security - Critical Vulnerabilities (3)
+65
@@ -0,0 +1,65 @@
/**
* /api/sessions route: list and detail of Claude Code sessions.
* Zod validation on query params; the detail view loads events via parseSessionFile.
* @author Descomplicar® | Observabilidade project (Espelho)
*/
import { Router } from 'express'
import { z } from 'zod'
import type { SessionsDb } from '../services/sessions/db.js'
import { parseSessionFile } from '../services/sessions/parser.js'
const ListQuerySchema = z.object({
days: z.coerce.number().int().min(1).max(3650).optional(),
project: z.string().min(1).max(200).optional(),
tool: z.string().min(1).max(100).optional(),
skill: z.string().min(1).max(200).optional(),
q: z.string().max(500).optional(),
limit: z.coerce.number().int().min(1).max(200).default(50),
offset: z.coerce.number().int().min(0).default(0),
})
const IdParamSchema = z.object({ id: z.string().min(1).max(200) })
export function createSessionsRouter(db: SessionsDb): Router {
const router = Router()
router.get('/', (req, res) => {
const parsed = ListQuerySchema.safeParse(req.query)
if (!parsed.success) {
return res.status(400).json({ error: 'Invalid query', details: parsed.error.format() })
}
const filters = parsed.data
const items = db.listSessions(filters)
const total = db.countSessions(filters)
return res.json({ total, items })
})
router.get('/:id', async (req, res) => {
const parsed = IdParamSchema.safeParse(req.params)
if (!parsed.success) {
return res.status(400).json({ error: 'Invalid id' })
}
const session = db.getSession(parsed.data.id)
if (!session) return res.status(404).json({ error: 'Session not found' })
try {
const { events } = await parseSessionFile(session.jsonl_path)
return res.json({ meta: session, events })
} catch (err) {
const e = err as NodeJS.ErrnoException
if (e.code === 'ENOENT') {
return res.status(410).json({
error: 'Session file missing (stale index)',
session_id: parsed.data.id,
})
}
const isProduction = process.env.NODE_ENV === 'production'
return res.status(500).json({
error: 'Failed to parse session',
...(isProduction ? {} : { message: e.message }),
})
}
})
return router
}
+53
@@ -0,0 +1,53 @@
#!/usr/bin/env tsx
/**
* Claude Code sessions indexer CLI (Observabilidade/Espelho).
*
* Modes:
* --full Full scan of ~/.claude/projects -> SQLite at ~/.claude-work/sessions.db
* --watch Incremental mode (stub; implemented in Task 8)
*
* Env:
* OBSERVABILIDADE_DB Overrides the SQLite DB path
*/
import { indexAll, DEFAULT_DB_PATH, PROJECTS_ROOT } from '../services/sessions/indexer.js'
import { startWatcher } from '../services/sessions/watcher.js'
async function main(): Promise<void> {
const args = process.argv.slice(2)
const mode = args.find((a) => a === '--full' || a === '--watch')
if (!mode) {
console.error('Usage: sessions-indexer.ts [--full|--watch]')
process.exit(1)
}
const dbPath = process.env.OBSERVABILIDADE_DB ?? DEFAULT_DB_PATH
console.log(`[indexer] mode=${mode} db=${dbPath}`)
if (mode === '--watch') {
console.log(`[indexer] watch mode on ${PROJECTS_ROOT} -> ${dbPath}`)
await indexAll({ dbPath })
await startWatcher(dbPath)
return
}
const start = Date.now()
let lastLogged = 0
const { indexed, failed } = await indexAll({
dbPath,
onProgress: (done, total) => {
if (done - lastLogged >= 50 || done === total) {
console.log(`[indexer] ${done}/${total}`)
lastLogged = done
}
},
})
const durationMs = Date.now() - start
const durationSec = (durationMs / 1000).toFixed(1)
console.log(`[indexer] done in ${durationSec}s · indexed=${indexed} failed=${failed}`)
process.exit(failed > 0 ? 1 : 0)
}
main().catch((err) => {
console.error('[indexer] fatal failure:', err)
process.exit(2)
})
+428
@@ -0,0 +1,428 @@
#!/usr/bin/env tsx
/**
* Weekly pattern detector over the Observabilidade DB (Phase 6A).
*
* Usage:
* sessions-patterns.ts [--week YYYY-Www] [--publish] [--force]
* sessions-patterns.ts --backfill
*
* Backfill:
* Iterates over every ISO week from the first session in the DB up to
* (excluding) the current week, detects patterns and upserts. Never
* publishes or opens tickets. Output: one JSON line per week plus a
* total summary at the end.
*
* Flow:
* 1. Resolve the week range (Monday 00:00 UTC → Sunday 23:59 UTC)
* 2. Run detectPatterns (6 heuristic detectors)
* 3. Persist with upsertPattern + consecutive_weeks
* 4. With --publish: call the MCP gateway (desk-crm) to publish the HTML
* comment on discussion #32, and for patterns with consecutive_weeks>=3
* and severity∈(warning,action) open a Desk Ticket.
* 5. Final JSON-line output with counts.
*
* Env required with --publish:
* MCP_GATEWAY_TOKEN Bearer token for the MCP gateway
* MCP_GATEWAY_URL desk-crm MCP URL (default https://gateway.descomplicar.pt/v1/desk-crm/mcp)
*/
import { readFileSync, writeFileSync, existsSync } from 'node:fs'
import { openSessionsDb, type PatternRecord, type SessionsDb } from '../services/sessions/db.js'
import { DEFAULT_DB_PATH } from '../services/sessions/indexer.js'
import {
detectPatterns,
toPatternRecord,
weekIso,
weekRange,
type Pattern,
} from '../services/sessions/patterns.js'
const CARL_JSON_PATH = '/media/ealmeida/Dados/.carl/carl.json'
/**
* Propose a persistent pattern as a staging entry in carl.json.
* Idempotent: does not create duplicates per pattern_key.
* Only fires for severity action or warning with ≥3 consecutive weeks.
*/
function proposeCarlStagingEntry(p: PatternRecord): void {
if (!existsSync(CARL_JSON_PATH)) return
try {
const raw = readFileSync(CARL_JSON_PATH, 'utf-8')
const carl = JSON.parse(raw) as { staging?: unknown[] }
const staging = (carl.staging ??= [])
const stagingId = `pattern-${p.pattern_key}`
const exists = staging.some((e) => typeof e === 'object' && e !== null && (e as { id?: string }).id === stagingId)
if (exists) return
staging.push({
id: stagingId,
type: 'pattern-proposal',
source: 'observabilidade-patterns',
name: `Recurring pattern: ${p.title}`,
description: p.description,
severity: p.severity,
consecutive_weeks: p.consecutive_weeks,
affected_count: p.affected_count,
sample_session_ids: p.sample_session_ids,
proposed_at: new Date().toISOString(),
status: 'pending-review',
})
writeFileSync(CARL_JSON_PATH, JSON.stringify(carl, null, 2), 'utf-8')
console.error(`[patterns] proposed as CARL staging entry: ${stagingId}`)
} catch (err) {
console.error(`[patterns] failed to propose CARL staging entry: ${(err as Error).message}`)
}
}
interface Args {
week?: string
publish: boolean
force: boolean
backfill: boolean
}
interface MCPToolCallResult {
content?: Array<{ type: string; text: string }>
isError?: boolean
}
/** Staff ID used to post automatic comments (Observabilidade) */
const OBSERVABILIDADE_STAFF_ID = 25
/** Desk discussion where the weekly summaries are published */
const OBSERVABILIDADE_DISCUSSION_ID = 32
/** "Geral" department in the Desk CRM (id=1) */
const OBSERVABILIDADE_DEPARTMENT_ID = 1
function parseArgs(argv: string[]): Args {
const a: Args = { publish: false, force: false, backfill: false }
for (let i = 0; i < argv.length; i++) {
if (argv[i] === '--week') a.week = argv[++i]
else if (argv[i] === '--publish') a.publish = true
else if (argv[i] === '--force') a.force = true
else if (argv[i] === '--backfill') a.backfill = true
}
return a
}
/** Detects and persists patterns for a single week (without publishing). */
function processWeek(db: SessionsDb, monday: Date): {
week_iso: string
detected: number
by_severity: { action: number; warning: number; info: number }
} {
const range = weekRange(monday)
const weekIsoStr = range.iso
const detected: Pattern[] = detectPatterns(db, range.start, range.end)
const records: PatternRecord[] = []
for (const p of detected) {
const tmpRec = toPatternRecord(p, weekIsoStr, 1)
db.upsertPattern(tmpRec)
const consecutive = db.getConsecutiveWeeks(p.pattern_key, weekIsoStr)
const rec = toPatternRecord(p, weekIsoStr, consecutive)
db.upsertPattern(rec)
records.push(rec)
}
return {
week_iso: weekIsoStr,
detected: records.length,
by_severity: {
action: records.filter((r) => r.severity === 'action').length,
warning: records.filter((r) => r.severity === 'warning').length,
info: records.filter((r) => r.severity === 'info').length,
},
}
}
/** Runs the backfill from the first session up to (excl.) the current week. */
function runBackfill(db: SessionsDb, dbPath: string): void {
const raw = db.rawDb()
const row = raw.prepare('SELECT MIN(started_at) as min_started FROM sessions').get() as
| { min_started: string | null }
| undefined
if (!row || !row.min_started) {
console.error('[patterns] backfill: DB has no sessions. Nothing to do.')
console.log(JSON.stringify({ backfill: true, weeks_processed: 0, total_detected: 0 }))
return
}
const firstStart = new Date(row.min_started)
const firstRange = weekRange(firstStart)
const currentRange = weekRange(new Date())
const currentIso = currentRange.iso
console.error(
`[patterns] backfill: first session ${row.min_started}, initial week ${firstRange.iso}, ` +
`current week ${currentIso} (excluded) db=${dbPath}`,
)
let cursor = new Date(firstRange.start)
let weeksProcessed = 0
let totalDetected = 0
const bySev = { action: 0, warning: 0, info: 0 }
const perWeek: Array<{ week_iso: string; detected: number }> = []
// Weekly iteration: the cursor is always a Monday at 00:00 UTC
while (weekIso(cursor) !== currentIso) {
const summary = processWeek(db, cursor)
console.log(JSON.stringify({ backfill: true, ...summary }))
weeksProcessed++
totalDetected += summary.detected
bySev.action += summary.by_severity.action
bySev.warning += summary.by_severity.warning
bySev.info += summary.by_severity.info
perWeek.push({ week_iso: summary.week_iso, detected: summary.detected })
// Advance 7 days
cursor = new Date(cursor)
cursor.setUTCDate(cursor.getUTCDate() + 7)
// Safety: avoid an infinite loop in case of a bug
if (weeksProcessed > 520) {
console.error('[patterns] backfill: safety break after 520 weeks')
break
}
}
console.log(
JSON.stringify({
backfill_summary: true,
weeks_processed: weeksProcessed,
total_detected: totalDetected,
by_severity: bySev,
first_week: perWeek[0]?.week_iso ?? null,
last_week: perWeek[perWeek.length - 1]?.week_iso ?? null,
}),
)
}
/** Converts YYYY-Www into a {start,end} UTC range. */
function weekIsoToRange(weekIsoStr: string): { start: Date; end: Date; iso: string } {
const m = weekIsoStr.match(/^(\d{4})-W(\d{2})$/)
if (!m) throw new Error(`Invalid --week format: ${weekIsoStr}`)
const year = parseInt(m[1], 10)
const week = parseInt(m[2], 10)
// ISO week rule: January 4 is always in week 1
const jan4 = new Date(Date.UTC(year, 0, 4))
const jan4Dow = jan4.getUTCDay() || 7 // 1..7 (Mon..Sun)
const mondayWeek1 = new Date(jan4)
mondayWeek1.setUTCDate(jan4.getUTCDate() - (jan4Dow - 1))
const monday = new Date(mondayWeek1)
monday.setUTCDate(mondayWeek1.getUTCDate() + (week - 1) * 7)
return weekRange(monday)
}
function escapeHtml(s: string): string {
return s.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/>/g, '&gt;').replace(/"/g, '&quot;')
}
function buildDiscussionHtml(weekIso: string, patterns: PatternRecord[]): string {
const dateStr = new Date().toISOString().slice(0, 10)
const lines: string[] = []
lines.push(`<h4>Week ${weekIso} — Automatically Detected Patterns</h4>`)
lines.push(`<p><em>Observabilidade automatic detector — ${patterns.length} patterns analyzed.</em></p>`)
if (patterns.length === 0) {
lines.push('<p>No actionable pattern detected this week.</p>')
} else {
lines.push('<h4>Patterns (severity)</h4>')
lines.push('<ul>')
for (const p of patterns) {
const samples = p.sample_session_ids
.slice(0, 5)
.map((id) => `<a href="/sessions/${encodeURIComponent(id)}"><code>${escapeHtml(id.slice(0, 8))}</code></a>`)
.join(', ')
lines.push(
`<li><strong>${escapeHtml(p.title)}</strong> [${p.severity}] — ` +
`${p.affected_count} sessions` +
(p.metric_value != null ? `, metric ${p.metric_value}` : '') +
`<br><em>Description:</em> ${escapeHtml(p.description)}` +
`<br><em>Sample:</em> ${samples}` +
`<br><em>Consecutive weeks:</em> ${p.consecutive_weeks}</li>`,
)
}
lines.push('</ul>')
}
lines.push('<hr>')
lines.push(`<p><strong>Skill:</strong> observabilidade-patterns | <strong>Date:</strong> ${dateStr}</p>`)
return lines.join('\n')
}
/**
* Calls an MCP gateway tool (JSON-RPC 2.0 over HTTP).
* The gateway may reply with SSE (text/event-stream) or JSON; both are handled.
*/
async function callMcpTool(tool: string, args: Record<string, unknown>): Promise<MCPToolCallResult> {
const url = process.env.MCP_GATEWAY_URL ?? 'https://gateway.descomplicar.pt/v1/desk-crm/mcp'
const token = process.env.MCP_GATEWAY_TOKEN
if (!token) throw new Error('MCP_GATEWAY_TOKEN is not set')
const body = {
jsonrpc: '2.0',
id: Date.now(),
method: 'tools/call',
params: { name: tool, arguments: args },
}
const res = await fetch(url, {
method: 'POST',
headers: {
Authorization: `Bearer ${token}`,
'Content-Type': 'application/json',
Accept: 'application/json, text/event-stream',
},
body: JSON.stringify(body),
})
if (!res.ok) {
const txt = await res.text().catch(() => '')
throw new Error(`MCP gateway ${res.status}: ${txt.slice(0, 300)}`)
}
const raw = await res.text()
// The reply is plain JSON or SSE ("data: {...}" lines)
let payload: string | null = null
for (const line of raw.split(/\r?\n/)) {
const trimmed = line.trim()
if (!trimmed) continue
if (trimmed.startsWith('data: ')) {
payload = trimmed.slice(6)
break
}
if (trimmed.startsWith('{')) {
payload = trimmed
break
}
}
if (!payload) throw new Error(`MCP response has no JSON payload: ${raw.slice(0, 200)}`)
const parsed = JSON.parse(payload) as { error?: unknown; result?: MCPToolCallResult }
if (parsed.error) throw new Error(`MCP error: ${JSON.stringify(parsed.error)}`)
const result = parsed.result as MCPToolCallResult | undefined
if (result?.isError) {
const txt = result.content?.map((c) => c.text).join('\n') ?? ''
throw new Error(`MCP tool ${tool} returned isError: ${txt.slice(0, 300)}`)
}
return result ?? {}
}
async function postDeskDiscussionComment(html: string): Promise<unknown> {
return callMcpTool('add_discussion_comment', {
discussion_id: OBSERVABILIDADE_DISCUSSION_ID,
content: html,
staff_id: OBSERVABILIDADE_STAFF_ID,
})
}
async function createDeskTicket(p: PatternRecord): Promise<unknown> {
const priority = p.severity === 'action' ? 4 : 3 // 4=high, 3=medium
const message = [
`<p><strong>Recurring pattern detected automatically — ${p.consecutive_weeks} consecutive weeks.</strong></p>`,
`<p>${escapeHtml(p.description)}</p>`,
`<ul>`,
`<li>Pattern key: <code>${escapeHtml(p.pattern_key)}</code></li>`,
`<li>Severity: ${p.severity}</li>`,
`<li>Affected: ${p.affected_count} sessions</li>`,
`<li>Metric: ${p.metric_value ?? 'n/a'}</li>`,
`<li>Week: ${p.week_iso}</li>`,
`</ul>`,
`<p><em>Sample sessions:</em> ${p.sample_session_ids.map((s) => `<code>${escapeHtml(s)}</code>`).join(', ')}</p>`,
].join('\n')
return callMcpTool('create_ticket', {
subject: `Recurring pattern: ${p.title}`,
message,
priority,
department: OBSERVABILIDADE_DEPARTMENT_ID,
})
}
async function main(): Promise<void> {
const args = parseArgs(process.argv.slice(2))
if (args.publish && !process.env.MCP_GATEWAY_TOKEN) {
console.error('[patterns] --publish requires MCP_GATEWAY_TOKEN. Aborting.')
process.exit(1)
}
const dbPath = process.env.OBSERVABILIDADE_DB ?? DEFAULT_DB_PATH
const db = openSessionsDb(dbPath)
if (args.backfill) {
if (args.publish) {
console.error('[patterns] --backfill is incompatible with --publish. Aborting.')
process.exit(1)
}
runBackfill(db, dbPath)
db.close()
return
}
const range = args.week ? weekIsoToRange(args.week) : weekRange(new Date())
const weekIso = args.week ?? range.iso
console.error(`[patterns] week ${weekIso} range=${range.start.toISOString()}..${range.end.toISOString()} db=${dbPath} publish=${args.publish}`)
const detected: Pattern[] = detectPatterns(db, range.start, range.end)
console.error(`[patterns] detected ${detected.length} patterns`)
const records: PatternRecord[] = []
for (const p of detected) {
// First upsert with consecutive=1 (placeholder); then recompute.
const tmpRec = toPatternRecord(p, weekIso, 1)
db.upsertPattern(tmpRec)
const consecutive = db.getConsecutiveWeeks(p.pattern_key, weekIso)
const rec = toPatternRecord(p, weekIso, consecutive)
db.upsertPattern(rec)
records.push(rec)
}
// Sort by severity (action > warning > info), then by affected_count
const sevRank: Record<string, number> = { action: 3, warning: 2, info: 1 }
records.sort((a, b) => (sevRank[b.severity] - sevRank[a.severity]) || (b.affected_count - a.affected_count))
let commentPosted = false
let ticketsCreated = 0
let publishError: string | null = null
if (args.publish) {
try {
const html = buildDiscussionHtml(weekIso, records)
await postDeskDiscussionComment(html)
commentPosted = true
for (const rec of records) {
if (rec.consecutive_weeks >= 3 && (rec.severity === 'warning' || rec.severity === 'action')) {
try {
await createDeskTicket(rec)
ticketsCreated++
} catch (e) {
console.error(`[patterns] failed to create ticket for ${rec.pattern_key}:`, (e as Error).message)
}
// Propose as a CARL staging entry (idempotent per pattern_key)
proposeCarlStagingEntry(rec)
}
}
} catch (e) {
publishError = (e as Error).message
}
}
// Dry-run: render the HTML to stderr for manual verification
if (!args.publish) {
const html = buildDiscussionHtml(weekIso, records)
console.error('\n--- HTML (dry-run) ---')
console.error(html)
console.error('--- /HTML ---\n')
for (const rec of records) {
if (rec.consecutive_weeks >= 3 && (rec.severity === 'warning' || rec.severity === 'action')) {
console.error(`[patterns] (dry-run) Ticket would be created: ${rec.title} (${rec.consecutive_weeks} wks)`)
console.error(`[patterns] (dry-run) CARL staging entry would be proposed: pattern-${rec.pattern_key}`)
}
}
}
const summary = {
week_iso: weekIso,
range: { start: range.start.toISOString(), end: range.end.toISOString() },
detected: records.length,
by_severity: {
action: records.filter((r) => r.severity === 'action').length,
warning: records.filter((r) => r.severity === 'warning').length,
info: records.filter((r) => r.severity === 'info').length,
},
published: commentPosted,
tickets_created: ticketsCreated,
publish_error: publishError,
dry_run: !args.publish,
}
console.log(JSON.stringify(summary))
db.close()
}
main().catch((err) => {
console.error('[patterns] fatal failure:', err)
process.exit(2)
})
+97
@@ -0,0 +1,97 @@
#!/usr/bin/env tsx
/**
* Imports comments from Desk discussions #31/#32/#33 (worklogs,
* reflections and improvement actions) into the Observabilidade DB.
*
* Usage:
* sessions-worklog-import.ts [--discussion 31|32|33|all] [--since-days N]
* sessions-worklog-import.ts --discussion 31 --page-size 200
*
* Default: --discussion all --since-days 365
*
* Env required:
* MCP_GATEWAY_TOKEN Bearer token for the MCP gateway
*/
import { openSessionsDb } from '../services/sessions/db.js'
import { DEFAULT_DB_PATH } from '../services/sessions/indexer.js'
import { importWorklogDiscussion, type ImportResult } from '../services/sessions/worklog-import.js'
interface Args {
discussion: 'all' | number
sinceDays: number
pageSize: number
}
function parseArgs(argv: string[]): Args {
const a: Args = { discussion: 'all', sinceDays: 365, pageSize: 500 }
for (let i = 0; i < argv.length; i++) {
if (argv[i] === '--discussion') {
const v = argv[++i]
a.discussion = v === 'all' ? 'all' : parseInt(v, 10)
} else if (argv[i] === '--since-days') {
a.sinceDays = parseInt(argv[++i], 10)
} else if (argv[i] === '--page-size') {
a.pageSize = parseInt(argv[++i], 10)
}
}
return a
}
async function main(): Promise<void> {
const args = parseArgs(process.argv.slice(2))
if (!process.env.MCP_GATEWAY_TOKEN) {
console.error('[worklog-import] MCP_GATEWAY_TOKEN não definido. Aborta.')
process.exit(1)
}
const dbPath = process.env.OBSERVABILIDADE_DB ?? DEFAULT_DB_PATH
const db = openSessionsDb(dbPath)
const discussions = args.discussion === 'all' ? [31, 32, 33] : [args.discussion as number]
const sinceIso = new Date(Date.now() - args.sinceDays * 86400_000).toISOString()
console.error(
`[worklog-import] db=${dbPath} discussions=${discussions.join(',')} since=${sinceIso} page_size=${args.pageSize}`,
)
const results: ImportResult[] = []
for (const d of discussions) {
try {
const r = await importWorklogDiscussion(db, d, { sinceIso, pageSize: args.pageSize })
results.push(r)
console.error(
`[worklog-import] #${d}: fetched=${r.fetched} inserted=${r.imported} updated=${r.updated} skipped=${r.skipped} errors=${r.errors}`,
)
} catch (e) {
console.error(`[worklog-import] falha #${d}:`, (e as Error).message)
results.push({
discussion_id: d,
fetched: 0,
imported: 0,
updated: 0,
skipped: 0,
errors: 1,
})
}
}
const summary = {
db: dbPath,
since_iso: sinceIso,
discussions: results,
totals: {
fetched: results.reduce((s, r) => s + r.fetched, 0),
imported: results.reduce((s, r) => s + r.imported, 0),
updated: results.reduce((s, r) => s + r.updated, 0),
skipped: results.reduce((s, r) => s + r.skipped, 0),
errors: results.reduce((s, r) => s + r.errors, 0),
},
total_in_db: db.countWorklogComments(),
}
console.log(JSON.stringify(summary))
db.close()
}
main().catch((err) => {
console.error('[worklog-import] falha fatal:', err)
process.exit(2)
})
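As a quick sanity check on the flag handling above: parseArgs keeps `all` as a sentinel string, converts numeric values with parseInt, and silently ignores unknown flags. A self-contained sketch of the same logic:

```typescript
// Reproduction of the parseArgs logic above, for illustration; behaviour is
// the same: defaults apply unless a known flag overrides them.
interface Args { discussion: 'all' | number; sinceDays: number; pageSize: number }

function parseArgs(argv: string[]): Args {
  const a: Args = { discussion: 'all', sinceDays: 365, pageSize: 500 }
  for (let i = 0; i < argv.length; i++) {
    if (argv[i] === '--discussion') {
      const v = argv[++i]
      a.discussion = v === 'all' ? 'all' : parseInt(v, 10)
    } else if (argv[i] === '--since-days') {
      a.sinceDays = parseInt(argv[++i], 10)
    } else if (argv[i] === '--page-size') {
      a.pageSize = parseInt(argv[++i], 10)
    }
  }
  return a
}

console.log(parseArgs(['--discussion', '31', '--page-size', '200']))
// { discussion: 31, sinceDays: 365, pageSize: 200 }
```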
@@ -21,6 +21,9 @@ import n8nRouter from './routes/n8n.js'
import paperclipRouter from './routes/paperclip.js'
import aiRouter from './routes/ai.js'
import operationsRouter from './routes/operations.js'
import { createSessionsRouter } from './routes/sessions.js'
import { openSessionsDb } from './services/sessions/db.js'
import { DEFAULT_DB_PATH } from './services/sessions/indexer.js'
import { collectAllServerMetrics } from './services/server-metrics.js'
import { collectMonitoringData } from './services/monitoring-collector.js'
@@ -133,6 +136,20 @@ app.use('/api/paperclip', paperclipRouter)
app.use('/api/ai', aiRouter)
app.use('/api/operations', operationsRouter)
// Observabilidade (Espelho): Claude Code sessions
const sessionsDb = openSessionsDb(process.env.OBSERVABILIDADE_DB ?? DEFAULT_DB_PATH)
app.use('/api/sessions', createSessionsRouter(sessionsDb))
function closeSessionsDb(): void {
try {
sessionsDb.close()
} catch (err) {
console.error('[sessionsDb] erro ao fechar:', err)
}
}
process.on('SIGTERM', closeSessionsDb)
process.on('SIGINT', closeSessionsDb)
// Serve static files in production
if (isProduction) {
// __dirname is /app/api/dist, need to go up 2 levels to /app/dist
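The shutdown wiring above relies on `close()` being safe to call more than once, since SIGTERM and SIGINT can both fire. A minimal sketch of that guard with a stand-in DB handle (FakeDb is illustrative, not part of the codebase):

```typescript
// FakeDb stands in for the better-sqlite3 handle: a second close() throws,
// which is exactly what the try/catch in closeSessionsDb absorbs.
class FakeDb {
  closed = false
  close(): void {
    if (this.closed) throw new Error('database is already closed')
    this.closed = true
  }
}

const db = new FakeDb()
function closeDb(): void {
  try {
    db.close()
  } catch {
    // log and continue: shutdown must not crash on a double close
  }
}

closeDb() // first signal closes the handle
closeDb() // second signal is a no-op
console.log(db.closed) // true
```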
@@ -19,14 +19,6 @@ interface CheckResult {
error?: string
}
-/**
- * EasyPanel API config.
- * Accessible from Docker Swarm via service name 'easypanel'.
- * Token read from EASYPANEL_API_TOKEN env var.
- */
-const EASYPANEL_API_URL = process.env.EASYPANEL_API_URL || 'http://easypanel:3000/api/trpc'
-const EASYPANEL_API_TOKEN = process.env.EASYPANEL_API_TOKEN || ''
/**
* Services to monitor via HTTP health check.
* Each entry maps to a record in tbl_eal_monitoring (category='service').
@@ -163,71 +155,57 @@ export async function checkStaleness(): Promise<number> {
}
-/**
- * Call EasyPanel tRPC API endpoint.
- * Returns parsed JSON or null on failure.
- */
-async function callEasyPanelAPI(endpoint: string): Promise<any | null> {
-if (!EASYPANEL_API_TOKEN) return null
-try {
-const controller = new AbortController()
-const timeout = setTimeout(() => controller.abort(), 10000)
-const response = await fetch(`${EASYPANEL_API_URL}/${endpoint}`, {
-headers: { 'Authorization': `Bearer ${EASYPANEL_API_TOKEN}` },
-signal: controller.signal,
-})
-clearTimeout(timeout)
-if (!response.ok) return null
-const data: any = await response.json()
-return data?.result?.data?.json ?? null
-} catch {
-return null
-}
-}
/**
- * Collect EasyPanel server metrics (CPU, RAM, disk) via API.
- * Replaces SSH-based collection for the Easy server.
 * Collect EasyPanel server metrics + container stats via SSH.
 * This EasyPanel version's tRPC API does not expose any monitor.* endpoint.
 * Password SSH to the Easy server (5.9.90.70) works from inside the container.
*/
export async function collectEasyPanelMetrics(): Promise<boolean> {
-const stats = await callEasyPanelAPI('monitor.getSystemStats')
-if (!stats) return false
-const cpu = Math.round(stats.cpuInfo?.usedPercentage ?? 0)
-const ram = Math.round((stats.memInfo?.usedMemPercentage ?? 0) * 10) / 10
-const disk = parseFloat(stats.diskInfo?.usedPercentage ?? '0')
-const load = stats.cpuInfo?.loadavg?.[0] ?? 0
-await upsertMonitoring('server', 'EasyPanel', 'up', {
-cpu, ram, disk, load,
-uptime_hours: Math.round((stats.uptime ?? 0) / 3600),
-mem_total_mb: Math.round(stats.memInfo?.totalMemMb ?? 0),
-mem_used_mb: Math.round(stats.memInfo?.usedMemMb ?? 0),
-disk_total_gb: stats.diskInfo?.totalGb,
-disk_free_gb: stats.diskInfo?.freeGb,
-})
-console.log(`[EASYPANEL] Server: CPU=${cpu}%, RAM=${ram}%, Disk=${disk}%`)
-return true
const { collectSSHMetrics } = await import('./server-metrics.js')
const result = await collectSSHMetrics()
return result.success > 0
}
/**
- * Collect Docker container/task stats via EasyPanel API.
- * Updates the 'container' category in monitoring DB.
 * Collect Docker Swarm service status via SSH to the EasyPanel server.
 * Uses `docker service ls` to read actual vs desired replicas.
*/
export async function collectEasyPanelContainers(): Promise<boolean> {
-const tasks = await callEasyPanelAPI('monitor.getDockerTaskStats')
-if (!tasks) return false
const easyHost = process.env.EASY_HOST || '5.9.90.70'
const easyUser = process.env.EASY_USER || 'root'
const easyPass = process.env.EASY_PASS || ''
if (!easyPass) return false
try {
const { Client } = await import('ssh2')
const output = await new Promise<string>((resolve, reject) => {
const conn = new Client()
let data = ''
const timer = setTimeout(() => { conn.end(); reject(new Error('timeout')) }, 20000)
conn.on('ready', () => {
conn.exec("docker service ls --format '{{.Name}} {{.Replicas}}'", (err, stream) => {
if (err) { clearTimeout(timer); conn.end(); reject(err); return }
stream.on('data', (chunk: Buffer) => { data += chunk.toString() })
stream.on('close', () => { clearTimeout(timer); conn.end(); resolve(data) })
stream.stderr.on('data', () => {})
})
})
conn.on('error', (err) => { clearTimeout(timer); reject(err) })
conn.connect({ host: easyHost, port: 22, username: easyUser, password: easyPass, readyTimeout: 15000 })
})
let total = 0, up = 0, down = 0
const unhealthy: string[] = []
-for (const [name, info] of Object.entries(tasks) as [string, { actual: number; desired: number }][]) {
for (const line of output.trim().split('\n')) {
if (!line.trim()) continue
const parts = line.trim().split(/\s+/)
const name = parts[0] || ''
const replicas = parts[1] || '0/0'
const [actual, desired] = replicas.split('/').map(Number)
total++
-if (info.actual >= info.desired) {
if (actual >= desired && desired > 0) {
up++
} else {
down++
@@ -243,6 +221,10 @@ export async function collectEasyPanelContainers(): Promise<boolean> {
console.log(`[EASYPANEL] Containers: ${up}/${total} running${down > 0 ? `, ${down} down: ${unhealthy.join(', ')}` : ''}`)
return true
} catch (err: unknown) {
console.error('[EASYPANEL] Container collection failed:', err instanceof Error ? err.message : err)
return false
}
}
/**
@@ -264,7 +246,7 @@ export async function collectMonitoringData(): Promise<void> {
const gotStats = await collectEasyPanelMetrics()
const gotContainers = await collectEasyPanelContainers()
if (!gotStats && !gotContainers) {
-console.warn('[COLLECTOR] EasyPanel API unavailable (check EASYPANEL_API_TOKEN)')
console.warn('[COLLECTOR] EasyPanel metrics unavailable (check EASY_HOST/EASY_USER/EASY_PASS)')
}
} catch (err: unknown) {
console.error('[COLLECTOR] EasyPanel collection failed:', err instanceof Error ? err.message : err)
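The container check above reduces each `docker service ls --format '{{.Name}} {{.Replicas}}'` line (e.g. `web 2/2`) to an up/down verdict: up means actual >= desired with desired > 0, so a deliberately scaled-down service (`0/0`) never counts as up. The parsing step in isolation:

```typescript
// Same line-parsing rule as collectEasyPanelContainers, extracted for clarity.
function parseServiceLines(output: string): { total: number; up: number; down: string[] } {
  let total = 0
  let up = 0
  const down: string[] = []
  for (const line of output.trim().split('\n')) {
    if (!line.trim()) continue
    const parts = line.trim().split(/\s+/)
    const name = parts[0] || ''
    const replicas = parts[1] || '0/0'
    const [actual, desired] = replicas.split('/').map(Number)
    total++
    if (actual >= desired && desired > 0) up++
    else down.push(name)
  }
  return { total, up, down }
}

console.log(parseServiceLines('web 2/2\napi 0/1\n\ncron 1/1'))
// { total: 3, up: 2, down: ['api'] }
```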
@@ -15,17 +15,17 @@ interface SSHServer {
pass: string
}
-// EasyPanel metrics: collected via API in monitoring-collector.ts
-// Gateway metrics: not needed (just Nginx proxy, covered by HTTP health check)
-// Only CWP Server remains on SSH (password auth)
// CWP Server: only accepts ed25519 key auth (not password), so it is not reachable from the container
// EasyPanel Server: accepts password auth on port 22, used for CPU/RAM/disk metrics
// Gateway: Nginx proxy only, covered by the HTTP health check
const SSH_SERVERS: SSHServer[] = [
{
-name: 'server',
-monitorName: 'CWP Server',
-host: process.env.SERVER_HOST || '5.9.90.105',
-port: parseInt(process.env.SERVER_PORT || '9443'),
-user: process.env.SERVER_USER || 'root',
-pass: process.env.SERVER_PASS || ''
name: 'easy',
monitorName: 'EasyPanel',
host: process.env.EASY_HOST || '5.9.90.70',
port: 22,
user: process.env.EASY_USER || 'root',
pass: process.env.EASY_PASS || ''
}
]
@@ -13,6 +13,45 @@ export interface ListFilters {
offset?: number
}
export interface PatternRecord {
id?: number
detected_at: string
week_iso: string
pattern_key: string
title: string
description: string
severity: 'info' | 'warning' | 'action'
metric_value: number | null
sample_session_ids: string[]
affected_count: number
consecutive_weeks: number
}
export interface WorklogCommentRecord {
id: number
discussion_id: number
created_at: string
staff_id: number | null
title: string | null
task_ref: string | null
duration_sec: number | null
work_items: string[]
files_modified: string[]
problems: { problema: string; solucao: string }[]
patterns_text: string[]
actions: { tipo: string; descricao: string; prioridade: string | null }[]
raw_html: string
imported_at: string
}
export interface WorklogFilters {
discussion_id?: number
task_ref?: string
sinceIso?: string
limit?: number
offset?: number
}
export interface SessionsDb {
upsertSession(meta: SessionMeta): void
upsertMany(metas: SessionMeta[]): void
@@ -20,6 +59,14 @@ export interface SessionsDb {
countSessions(filters: ListFilters): number
getSession(id: string): SessionMeta | null
deleteByJsonlPath(path: string): void
upsertPattern(p: PatternRecord): void
getPatternsByWeek(week: string): PatternRecord[]
getConsecutiveWeeks(pattern_key: string, uptoWeek: string): number
upsertWorklogComment(c: WorklogCommentRecord): { inserted: boolean }
hasWorklogComment(id: number): boolean
listWorklogComments(filters: WorklogFilters): WorklogCommentRecord[]
countWorklogComments(filters?: WorklogFilters): number
rawDb(): Database.Database
close(): void
}
@@ -46,6 +93,42 @@ CREATE TABLE IF NOT EXISTS sessions (
);
CREATE INDEX IF NOT EXISTS idx_started ON sessions(started_at DESC);
CREATE INDEX IF NOT EXISTS idx_project ON sessions(project_slug, started_at DESC);
CREATE TABLE IF NOT EXISTS patterns (
id INTEGER PRIMARY KEY AUTOINCREMENT,
detected_at TEXT NOT NULL,
week_iso TEXT NOT NULL,
pattern_key TEXT NOT NULL,
title TEXT NOT NULL,
description TEXT NOT NULL,
severity TEXT NOT NULL,
metric_value REAL,
sample_session_ids TEXT NOT NULL,
affected_count INTEGER NOT NULL,
consecutive_weeks INTEGER NOT NULL DEFAULT 1,
UNIQUE(week_iso, pattern_key)
);
CREATE INDEX IF NOT EXISTS idx_patterns_week ON patterns(week_iso);
CREATE INDEX IF NOT EXISTS idx_patterns_key ON patterns(pattern_key);
CREATE TABLE IF NOT EXISTS worklog_comments (
id INTEGER PRIMARY KEY,
discussion_id INTEGER NOT NULL,
created_at TEXT NOT NULL,
staff_id INTEGER,
title TEXT,
task_ref TEXT,
duration_sec INTEGER,
work_items TEXT NOT NULL,
files_modified TEXT NOT NULL,
problems_json TEXT NOT NULL,
patterns_text TEXT NOT NULL,
actions_json TEXT NOT NULL,
raw_html TEXT NOT NULL,
imported_at TEXT NOT NULL
);
CREATE INDEX IF NOT EXISTS idx_wc_discussion ON worklog_comments(discussion_id, created_at DESC);
CREATE INDEX IF NOT EXISTS idx_wc_task ON worklog_comments(task_ref);
`
function rowToMeta(row: Record<string, unknown>): SessionMeta {
@@ -177,8 +260,175 @@ export function openSessionsDb(dbPath: string): SessionsDb {
deleteByJsonlPath(path) {
db.prepare('DELETE FROM sessions WHERE jsonl_path = ?').run(path)
},
upsertPattern(p: PatternRecord) {
db.prepare(`
INSERT INTO patterns (detected_at, week_iso, pattern_key, title, description,
severity, metric_value, sample_session_ids, affected_count, consecutive_weeks)
VALUES (@detected_at, @week_iso, @pattern_key, @title, @description,
@severity, @metric_value, @sample_session_ids, @affected_count, @consecutive_weeks)
ON CONFLICT(week_iso, pattern_key) DO UPDATE SET
detected_at = excluded.detected_at,
title = excluded.title,
description = excluded.description,
severity = excluded.severity,
metric_value = excluded.metric_value,
sample_session_ids = excluded.sample_session_ids,
affected_count = excluded.affected_count,
consecutive_weeks = excluded.consecutive_weeks
`).run({
detected_at: p.detected_at,
week_iso: p.week_iso,
pattern_key: p.pattern_key,
title: p.title,
description: p.description,
severity: p.severity,
metric_value: p.metric_value,
sample_session_ids: JSON.stringify(p.sample_session_ids),
affected_count: p.affected_count,
consecutive_weeks: p.consecutive_weeks,
})
},
getPatternsByWeek(week: string): PatternRecord[] {
const rows = db.prepare('SELECT * FROM patterns WHERE week_iso = ? ORDER BY severity DESC, affected_count DESC').all(week) as Record<string, unknown>[]
return rows.map((r) => ({
id: r.id as number,
detected_at: r.detected_at as string,
week_iso: r.week_iso as string,
pattern_key: r.pattern_key as string,
title: r.title as string,
description: r.description as string,
severity: r.severity as PatternRecord['severity'],
metric_value: (r.metric_value as number | null) ?? null,
sample_session_ids: JSON.parse(r.sample_session_ids as string),
affected_count: r.affected_count as number,
consecutive_weeks: r.consecutive_weeks as number,
}))
},
getConsecutiveWeeks(pattern_key: string, uptoWeek: string): number {
// Counts consecutive weeks up to and including uptoWeek in which pattern_key appeared
const rows = db.prepare('SELECT DISTINCT week_iso FROM patterns WHERE pattern_key = ? AND week_iso <= ? ORDER BY week_iso DESC').all(pattern_key, uptoWeek) as { week_iso: string }[]
if (rows.length === 0) return 0
let count = 0
let cursor = uptoWeek
for (const row of rows) {
if (row.week_iso === cursor) {
count++
cursor = prevWeekIso(cursor)
} else {
break
}
}
return count
},
upsertWorklogComment(c: WorklogCommentRecord): { inserted: boolean } {
const existing = db.prepare('SELECT 1 FROM worklog_comments WHERE id = ?').get(c.id)
const inserted = !existing
db.prepare(`
INSERT INTO worklog_comments (id, discussion_id, created_at, staff_id, title, task_ref,
duration_sec, work_items, files_modified, problems_json, patterns_text, actions_json,
raw_html, imported_at)
VALUES (@id, @discussion_id, @created_at, @staff_id, @title, @task_ref,
@duration_sec, @work_items, @files_modified, @problems_json, @patterns_text, @actions_json,
@raw_html, @imported_at)
ON CONFLICT(id) DO UPDATE SET
discussion_id = excluded.discussion_id,
created_at = excluded.created_at,
staff_id = excluded.staff_id,
title = excluded.title,
task_ref = excluded.task_ref,
duration_sec = excluded.duration_sec,
work_items = excluded.work_items,
files_modified = excluded.files_modified,
problems_json = excluded.problems_json,
patterns_text = excluded.patterns_text,
actions_json = excluded.actions_json,
raw_html = excluded.raw_html,
imported_at = excluded.imported_at
`).run({
id: c.id,
discussion_id: c.discussion_id,
created_at: c.created_at,
staff_id: c.staff_id,
title: c.title,
task_ref: c.task_ref,
duration_sec: c.duration_sec,
work_items: JSON.stringify(c.work_items),
files_modified: JSON.stringify(c.files_modified),
problems_json: JSON.stringify(c.problems),
patterns_text: JSON.stringify(c.patterns_text),
actions_json: JSON.stringify(c.actions),
raw_html: c.raw_html,
imported_at: c.imported_at,
})
return { inserted }
},
hasWorklogComment(id: number): boolean {
return !!db.prepare('SELECT 1 FROM worklog_comments WHERE id = ?').get(id)
},
listWorklogComments(filters: WorklogFilters): WorklogCommentRecord[] {
const parts: string[] = []
const params: Record<string, unknown> = {}
if (filters.discussion_id) { parts.push('discussion_id = @discussion_id'); params.discussion_id = filters.discussion_id }
if (filters.task_ref) { parts.push('task_ref = @task_ref'); params.task_ref = filters.task_ref }
if (filters.sinceIso) { parts.push('created_at >= @since'); params.since = filters.sinceIso }
const where = parts.length ? 'WHERE ' + parts.join(' AND ') : ''
const limit = filters.limit ?? 1000
const offset = filters.offset ?? 0
const rows = db.prepare(`SELECT * FROM worklog_comments ${where} ORDER BY created_at DESC LIMIT @limit OFFSET @offset`)
.all({ ...params, limit, offset }) as Record<string, unknown>[]
return rows.map((r) => ({
id: r.id as number,
discussion_id: r.discussion_id as number,
created_at: r.created_at as string,
staff_id: (r.staff_id as number | null) ?? null,
title: (r.title as string | null) ?? null,
task_ref: (r.task_ref as string | null) ?? null,
duration_sec: (r.duration_sec as number | null) ?? null,
work_items: JSON.parse(r.work_items as string),
files_modified: JSON.parse(r.files_modified as string),
problems: JSON.parse(r.problems_json as string),
patterns_text: JSON.parse(r.patterns_text as string),
actions: JSON.parse(r.actions_json as string),
raw_html: r.raw_html as string,
imported_at: r.imported_at as string,
}))
},
countWorklogComments(filters?: WorklogFilters): number {
const parts: string[] = []
const params: Record<string, unknown> = {}
if (filters?.discussion_id) { parts.push('discussion_id = @discussion_id'); params.discussion_id = filters.discussion_id }
if (filters?.task_ref) { parts.push('task_ref = @task_ref'); params.task_ref = filters.task_ref }
if (filters?.sinceIso) { parts.push('created_at >= @since'); params.since = filters.sinceIso }
const where = parts.length ? 'WHERE ' + parts.join(' AND ') : ''
const row = db.prepare(`SELECT COUNT(*) as c FROM worklog_comments ${where}`).get(params) as { c: number }
return row.c
},
rawDb(): Database.Database {
return db
},
close() {
db.close()
},
}
}
/** Computes the previous ISO week (YYYY-Www). */
export function prevWeekIso(week: string): string {
const m = week.match(/^(\d{4})-W(\d{2})$/)
if (!m) return week
const year = parseInt(m[1], 10)
const w = parseInt(m[2], 10)
if (w > 1) return `${year}-W${String(w - 1).padStart(2, '0')}`
// Week 1 → last week of the previous year (52 or 53)
const prevYear = year - 1
const last = weeksInYear(prevYear)
return `${prevYear}-W${String(last).padStart(2, '0')}`
}
function weeksInYear(year: number): number {
// ISO: a year has 53 weeks if Jan 1 is a Thursday, or (leap year and Jan 1 is a Wednesday)
const jan1 = new Date(Date.UTC(year, 0, 1)).getUTCDay()
const isLeap = (year % 4 === 0 && year % 100 !== 0) || year % 400 === 0
if (jan1 === 4 || (isLeap && jan1 === 3)) return 53
return 52
}
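The two week helpers above compose as follows; the rollover into the previous year picks week 52 or 53 depending on the ISO rule in weeksInYear:

```typescript
// Self-contained copy of the helpers from db.ts, to show the rollover cases.
function weeksInYear(year: number): number {
  // ISO: a year has 53 weeks if Jan 1 is a Thursday, or it is a leap year
  // and Jan 1 is a Wednesday; otherwise 52.
  const jan1 = new Date(Date.UTC(year, 0, 1)).getUTCDay()
  const isLeap = (year % 4 === 0 && year % 100 !== 0) || year % 400 === 0
  return jan1 === 4 || (isLeap && jan1 === 3) ? 53 : 52
}

function prevWeekIso(week: string): string {
  const m = week.match(/^(\d{4})-W(\d{2})$/)
  if (!m) return week
  const year = parseInt(m[1], 10)
  const w = parseInt(m[2], 10)
  if (w > 1) return `${year}-W${String(w - 1).padStart(2, '0')}`
  const prev = year - 1
  return `${prev}-W${String(weeksInYear(prev)).padStart(2, '0')}`
}

console.log(prevWeekIso('2026-W02')) // 2026-W01
console.log(prevWeekIso('2026-W01')) // 2025-W52 (2025 has 52 ISO weeks)
console.log(prevWeekIso('2021-W01')) // 2020-W53 (2020: leap year, Jan 1 on a Wednesday)
```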
@@ -0,0 +1,98 @@
import { readdirSync, statSync } from 'fs'
import { homedir } from 'os'
import { join } from 'path'
import { parseSessionFile } from './parser.js'
import { openSessionsDb, type SessionsDb } from './db.js'
import type { SessionMeta } from '../../types/session.js'
export const PROJECTS_ROOT = join(homedir(), '.claude', 'projects')
export const DEFAULT_DB_PATH = join(homedir(), '.claude-work', 'sessions.db')
/**
 * Walks the Claude projects root (depth 2) and returns every .jsonl file.
 * Layout: ~/.claude/projects/<project-slug>/<session-uuid>.jsonl
*/
export function findAllJsonl(root: string = PROJECTS_ROOT): string[] {
const result: string[] = []
let entries: string[]
try {
entries = readdirSync(root)
} catch {
return result
}
for (const entry of entries) {
const projectDir = join(root, entry)
let st
try {
st = statSync(projectDir)
} catch {
continue
}
if (!st.isDirectory()) continue
let files: string[]
try {
files = readdirSync(projectDir)
} catch {
continue
}
for (const f of files) {
if (f.endsWith('.jsonl')) result.push(join(projectDir, f))
}
}
return result
}
/**
 * Indexes a single file (parse + upsert). Standalone use, handy for the watcher (Task 8).
*/
export async function indexFile(db: SessionsDb, path: string): Promise<void> {
const { meta } = await parseSessionFile(path)
db.upsertSession(meta)
}
export interface IndexAllOptions {
dbPath?: string
onProgress?: (done: number, total: number) => void
}
/**
 * Full scan: walks every JSONL file and upserts in batches (50 per transaction).
*/
export async function indexAll(
options: IndexAllOptions = {},
): Promise<{ indexed: number; failed: number }> {
const db = openSessionsDb(options.dbPath ?? DEFAULT_DB_PATH)
const files = findAllJsonl()
const BATCH = 50
let indexed = 0
let failed = 0
let batch: SessionMeta[] = []
try {
for (let i = 0; i < files.length; i++) {
try {
const { meta } = await parseSessionFile(files[i])
batch.push(meta)
if (batch.length >= BATCH) {
db.upsertMany(batch)
indexed += batch.length
batch = []
}
} catch (err) {
failed++
console.error(`[indexer] erro em ${files[i]}:`, err)
}
if (options.onProgress) {
options.onProgress(indexed + failed + batch.length, files.length)
}
}
if (batch.length > 0) {
db.upsertMany(batch)
indexed += batch.length
}
} finally {
db.close()
}
return { indexed, failed }
}
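indexAll flushes in transactions of 50 and writes any remainder after the loop. The accumulation pattern, isolated from the DB and parser:

```typescript
// Generic version of the batch-flush loop in indexAll: flush every BATCH
// items, then flush whatever is left over at the end.
const BATCH = 50

function flushInBatches<T>(items: T[], flush: (batch: T[]) => void): number {
  let written = 0
  let batch: T[] = []
  for (const item of items) {
    batch.push(item)
    if (batch.length >= BATCH) {
      flush(batch)
      written += batch.length
      batch = []
    }
  }
  if (batch.length > 0) {
    flush(batch)
    written += batch.length
  }
  return written
}

const sizes: number[] = []
const total = flushInBatches(Array.from({ length: 120 }, (_, i) => i), (b) => sizes.push(b.length))
console.log(total, sizes) // 120 [ 50, 50, 20 ]
```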
@@ -0,0 +1,69 @@
/**
 * Minimal HTTP client for the MCP gateway (JSON-RPC 2.0 over HTTP).
 *
 * Supports plain JSON or SSE (text/event-stream) responses. Shared by the
 * Observabilidade scripts (patterns + worklog import).
*/
export interface MCPToolCallResult {
content?: Array<{ type: string; text: string }>
isError?: boolean
}
export async function callMcpTool(
tool: string,
args: Record<string, unknown>,
): Promise<MCPToolCallResult> {
const url = process.env.MCP_GATEWAY_URL ?? 'https://gateway.descomplicar.pt/v1/desk-crm/mcp'
const token = process.env.MCP_GATEWAY_TOKEN
if (!token) throw new Error('MCP_GATEWAY_TOKEN não definido')
const body = {
jsonrpc: '2.0',
id: Date.now(),
method: 'tools/call',
params: { name: tool, arguments: args },
}
const res = await fetch(url, {
method: 'POST',
headers: {
Authorization: `Bearer ${token}`,
'Content-Type': 'application/json',
Accept: 'application/json, text/event-stream',
},
body: JSON.stringify(body),
})
if (!res.ok) {
const txt = await res.text().catch(() => '')
throw new Error(`MCP gateway ${res.status}: ${txt.slice(0, 300)}`)
}
const raw = await res.text()
let payload: string | null = null
for (const line of raw.split(/\r?\n/)) {
const trimmed = line.trim()
if (!trimmed) continue
if (trimmed.startsWith('data: ')) {
payload = trimmed.slice(6)
break
}
if (trimmed.startsWith('{')) {
payload = trimmed
break
}
}
if (!payload) throw new Error(`MCP resposta sem payload JSON: ${raw.slice(0, 200)}`)
const parsed = JSON.parse(payload) as { error?: unknown; result?: MCPToolCallResult }
if (parsed.error) throw new Error(`MCP error: ${JSON.stringify(parsed.error)}`)
const result = parsed.result as MCPToolCallResult | undefined
if (result?.isError) {
const txt = result.content?.map((c) => c.text).join('\n') ?? ''
throw new Error(`MCP tool ${tool} devolveu isError: ${txt.slice(0, 300)}`)
}
return result ?? {}
}
/** Extracts the first JSON-encoded text block from the MCP result. */
export function extractMcpJsonPayload<T = unknown>(r: MCPToolCallResult): T {
const text = r.content?.find((c) => c.type === 'text')?.text
if (!text) throw new Error('MCP result sem content text')
return JSON.parse(text) as T
}
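callMcpTool's dual-format handling scans the raw body line by line: the first `data: ` line wins (SSE), otherwise the first line starting with `{` (plain JSON). That scan in isolation:

```typescript
// Same payload scan as callMcpTool; `event:` lines and blanks are skipped.
function extractPayload(raw: string): string | null {
  for (const line of raw.split(/\r?\n/)) {
    const trimmed = line.trim()
    if (!trimmed) continue
    if (trimmed.startsWith('data: ')) return trimmed.slice(6)
    if (trimmed.startsWith('{')) return trimmed
  }
  return null
}

console.log(extractPayload('event: message\ndata: {"jsonrpc":"2.0","result":{}}\n\n'))
// {"jsonrpc":"2.0","result":{}}
console.log(extractPayload('{"jsonrpc":"2.0","result":{}}'))
// {"jsonrpc":"2.0","result":{}}
console.log(extractPayload('retry: 3000')) // null
```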
@@ -19,6 +19,21 @@ function detectHook(text: string | null): string | null {
return m ? m[1] : null
}
function extractResultText(r: unknown): string | null {
if (r == null) return null
if (typeof r === 'string') return r
if (Array.isArray(r)) {
const parts: string[] = []
for (const p of r) {
if (p && typeof p === 'object' && 'text' in p && typeof (p as { text: unknown }).text === 'string') {
parts.push((p as { text: string }).text)
}
}
return parts.length ? parts.join('\n') : null
}
return null
}
function extractText(rawMsg: unknown): string | null {
if (!rawMsg || typeof rawMsg !== 'object') return null
const msg = rawMsg as { content?: unknown }
@@ -132,9 +147,12 @@ export async function parseSessionFile(jsonlPath: string): Promise<ParseResult>
}
}
const resultText = extractResultText(toolResult)
const skill = detectSkillInvoked(text)
-if (skill) skillsInvoked.add(skill)
-const hook = detectHook(text)
const skillFromResult = detectSkillInvoked(resultText)
const finalSkill = skill ?? skillFromResult
if (finalSkill) skillsInvoked.add(finalSkill)
const hook = detectHook(text) ?? detectHook(resultText)
events.push({
index: idx++,
@@ -145,7 +163,7 @@ export async function parseSessionFile(jsonlPath: string): Promise<ParseResult>
tool_name: toolName,
tool_input: toolInput,
tool_result: toolResult,
-skill_invoked: skill,
skill_invoked: finalSkill,
hook_name: hook,
})
}
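extractResultText accepts either a bare string or an array of content parts, keeping only parts whose `text` field is a string. Its behaviour in isolation:

```typescript
// Same shape-handling as extractResultText in the parser hunk above.
function extractResultText(r: unknown): string | null {
  if (r == null) return null
  if (typeof r === 'string') return r
  if (Array.isArray(r)) {
    const parts: string[] = []
    for (const p of r) {
      if (p && typeof p === 'object' && typeof (p as { text?: unknown }).text === 'string') {
        parts.push((p as { text: string }).text)
      }
    }
    return parts.length ? parts.join('\n') : null
  }
  return null
}

console.log(extractResultText('ok')) // ok
console.log(extractResultText([{ type: 'text', text: 'a' }, { text: 'b' }])) // a\nb (joined)
console.log(extractResultText([{ type: 'image' }])) // null: no text parts
```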
@@ -0,0 +1,515 @@
/**
 * Automatic pattern detector over the `sessions` DB (Observabilidade Fase 6A).
 *
 * Six heuristic detectors in plain SQL (via better-sqlite3). Each detector
 * returns zero or more `Pattern` records for the analysed week. Pipeline:
 *   1. Run the detectors over the interval [weekStart, weekEnd]
 *   2. Persist via `upsertPattern` (idempotent per (week_iso, pattern_key))
 *   3. Compute `consecutive_weeks` by looking back at previous weeks
*/
import type Database from 'better-sqlite3'
import type { SessionsDb, PatternRecord } from './db.js'
export type Severity = 'info' | 'warning' | 'action'
export interface Pattern {
pattern_key: string
title: string
description: string
severity: Severity
metric_value: number | null
sample_session_ids: string[]
affected_count: number
}
export interface DetectCtx {
db: Database.Database
weekStartIso: string
weekEndIso: string
}
/** Converts a Date to a UTC ISO string. */
function iso(d: Date): string {
return d.toISOString()
}
/**
 * Computes the [Monday 00:00:00 UTC, Sunday 23:59:59.999 UTC] interval of the
 * week containing `ref` (Regra 17: the week starts on Monday).
*/
export function weekRange(ref: Date): { start: Date; end: Date; iso: string } {
const d = new Date(Date.UTC(ref.getUTCFullYear(), ref.getUTCMonth(), ref.getUTCDate()))
const dow = d.getUTCDay() // 0=Sun, 1=Mon
const diffToMonday = dow === 0 ? -6 : 1 - dow
const start = new Date(d)
start.setUTCDate(d.getUTCDate() + diffToMonday)
const end = new Date(start)
end.setUTCDate(start.getUTCDate() + 6)
end.setUTCHours(23, 59, 59, 999)
return { start, end, iso: weekIso(start) }
}
/** ISO 8601 week (YYYY-Www) for the reference Monday. */
export function weekIso(monday: Date): string {
// ISO algorithm: the Thursday of the same week determines the year
const thursday = new Date(monday)
thursday.setUTCDate(monday.getUTCDate() + 3)
const year = thursday.getUTCFullYear()
const jan1 = new Date(Date.UTC(year, 0, 1))
const week = Math.floor(
((thursday.getTime() - jan1.getTime()) / 86400000 + (jan1.getUTCDay() === 0 ? 6 : jan1.getUTCDay() - 1)) / 7
) + 1
return `${year}-W${String(week).padStart(2, '0')}`
}
/** Helper: all sessions in the interval. */
function baseRows(ctx: DetectCtx) {
return ctx.db.prepare(`
SELECT session_id, project_slug, started_at, event_count, tool_calls, tools_used, skills_invoked, outcome, duration_sec
FROM sessions
WHERE started_at >= ? AND started_at <= ?
`).all(ctx.weekStartIso, ctx.weekEndIso) as Array<{
session_id: string
project_slug: string
started_at: string
event_count: number
tool_calls: number
tools_used: string
skills_invoked: string
outcome: string
duration_sec: number | null
}>
}
/** 1. Skills with a high error/interruption rate. */
export function detectSkillsHighErrorRate(ctx: DetectCtx): Pattern[] {
const rows = baseRows(ctx)
// Aggregate by skill
const bySkill = new Map<string, { total: number; fail: number; ids: string[] }>()
for (const r of rows) {
let skills: string[] = []
try { skills = JSON.parse(r.skills_invoked) } catch {}
for (const sk of skills) {
const entry = bySkill.get(sk) ?? { total: 0, fail: 0, ids: [] }
entry.total++
// Interruptions in long sessions (≥10 events) are natural user redirects,
// not skill failures. Only count real errors or very early interruptions.
const isRealFailure = r.outcome === 'error' ||
(r.outcome === 'interrupted' && (r.event_count ?? 0) < 10)
if (isRealFailure) {
entry.fail++
if (entry.ids.length < 5) entry.ids.push(r.session_id)
}
bySkill.set(sk, entry)
}
}
const out: Pattern[] = []
for (const [skill, v] of bySkill) {
if (v.total < 3) continue
const ratio = v.fail / v.total
if (ratio <= 0.2) continue
const severity: Severity = ratio > 0.4 ? 'action' : 'warning'
out.push({
pattern_key: `skill_error_rate:${skill}`,
title: `Skill ${skill}: ${(ratio * 100).toFixed(0)}% das sessões falham`,
description: `De ${v.total} sessões que invocaram ${skill}, ${v.fail} terminaram em erro/interrupção.`,
severity,
metric_value: Math.round(ratio * 1000) / 1000,
sample_session_ids: v.ids,
affected_count: v.fail,
})
}
return out
}
/** 2. Tools with low efficiency (high tool_calls/event_count ratio). */
export function detectToolsLowEfficiency(ctx: DetectCtx): Pattern[] {
const rows = baseRows(ctx)
const byTool = new Map<string, { sum: number; count: number; ids: string[] }>()
for (const r of rows) {
if (!r.event_count || r.event_count === 0) continue
const ratio = r.tool_calls / r.event_count
let tools: string[] = []
try { tools = JSON.parse(r.tools_used) } catch {}
for (const t of tools) {
const e = byTool.get(t) ?? { sum: 0, count: 0, ids: [] }
e.sum += ratio
e.count++
if (e.ids.length < 5) e.ids.push(r.session_id)
byTool.set(t, e)
}
}
const out: Pattern[] = []
for (const [tool, v] of byTool) {
if (v.count < 5) continue
const avg = v.sum / v.count
if (avg <= 0.5) continue
out.push({
pattern_key: `tool_low_efficiency:${tool}`,
title: `Tool ${tool}: rácio tool_calls/event_count médio ${avg.toFixed(2)}`,
description: `Em ${v.count} sessões, ${tool} domina o event_count. Indício de uso ineficiente ou looping.`,
severity: 'info',
metric_value: Math.round(avg * 1000) / 1000,
sample_session_ids: v.ids,
affected_count: v.count,
})
}
return out
}
/** 3. Most frequent (skill, tool) pairs. */
export function detectSkillToolPairs(ctx: DetectCtx): Pattern[] {
const rows = baseRows(ctx)
const byPair = new Map<string, { count: number; ids: string[] }>()
for (const r of rows) {
let skills: string[] = []
let tools: string[] = []
try { skills = JSON.parse(r.skills_invoked) } catch {}
try { tools = JSON.parse(r.tools_used) } catch {}
for (const s of skills) {
for (const t of tools) {
const key = `${s}::${t}`
const e = byPair.get(key) ?? { count: 0, ids: [] }
e.count++
if (e.ids.length < 5) e.ids.push(r.session_id)
byPair.set(key, e)
}
}
}
const sorted = [...byPair.entries()].filter(([, v]) => v.count >= 5).sort((a, b) => b[1].count - a[1].count).slice(0, 5)
return sorted.map(([key, v]) => ({
pattern_key: `skill_tool_pair:${key}`,
title: `Par frequente: ${key.replace('::', ' + ')}`,
description: `Skill e tool co-ocorreram em ${v.count} sessões esta semana.`,
severity: 'info' as Severity,
metric_value: v.count,
sample_session_ids: v.ids,
affected_count: v.count,
}))
}
/** 4. Duration outliers: sessions > p95 per project with outcome != completed. */
export function detectDurationOutliers(ctx: DetectCtx): Pattern[] {
const rows = baseRows(ctx).filter((r) => r.duration_sec != null && r.duration_sec > 0)
const byProject = new Map<string, Array<typeof rows[number]>>()
for (const r of rows) {
const arr = byProject.get(r.project_slug) ?? []
arr.push(r)
byProject.set(r.project_slug, arr)
}
const out: Pattern[] = []
for (const [proj, arr] of byProject) {
if (arr.length < 4) continue
const durations = arr.map((r) => r.duration_sec as number).sort((a, b) => a - b)
const p95Idx = Math.max(0, Math.floor(durations.length * 0.95) - 1)
const p95 = durations[p95Idx]
const outliers = arr.filter((r) => (r.duration_sec as number) > p95 && r.outcome !== 'completed')
if (outliers.length < 3) continue
out.push({
pattern_key: `duration_outliers:${proj}`,
title: `Projecto ${proj}: ${outliers.length} sessões longas não concluídas`,
description: `Sessões com duração acima do p95 (${p95}s) e outcome != completed. Sinal de sessões penduradas.`,
severity: 'warning',
metric_value: p95,
sample_session_ids: outliers.slice(0, 5).map((r) => r.session_id),
affected_count: outliers.length,
})
}
return out
}
/** 5. Abandoned sessions (event_count < 3 AND outcome = 'unknown'). */
export function detectAbandonedSessions(ctx: DetectCtx): Pattern[] {
const rows = ctx.db.prepare(`
SELECT session_id FROM sessions
WHERE started_at >= ? AND started_at <= ?
AND event_count < 3 AND outcome = 'unknown'
`).all(ctx.weekStartIso, ctx.weekEndIso) as Array<{ session_id: string }>
if (rows.length < 5) return []
return [{
pattern_key: 'abandoned_sessions',
title: `${rows.length} sessões abandonadas esta semana`,
description: `Sessões com menos de 3 eventos e outcome=unknown — tipicamente abertas e descartadas.`,
severity: 'info',
metric_value: rows.length,
sample_session_ids: rows.slice(0, 5).map((r) => r.session_id),
affected_count: rows.length,
}]
}
/** 6. Growing complexity: avg(tool_calls) this week vs the previous week. */
export function detectGrowingComplexity(ctx: DetectCtx, prevWeekStartIso: string, prevWeekEndIso: string): Pattern[] {
const curRows = baseRows(ctx)
const prevRows = ctx.db.prepare(`
SELECT skills_invoked, tool_calls FROM sessions
WHERE started_at >= ? AND started_at <= ?
`).all(prevWeekStartIso, prevWeekEndIso) as Array<{ skills_invoked: string; tool_calls: number }>
const curBySkill = new Map<string, { sum: number; count: number; ids: string[] }>()
for (const r of curRows) {
let sk: string[] = []
try { sk = JSON.parse(r.skills_invoked) } catch {}
for (const s of sk) {
const e = curBySkill.get(s) ?? { sum: 0, count: 0, ids: [] }
e.sum += r.tool_calls
e.count++
if (e.ids.length < 5) e.ids.push(r.session_id)
curBySkill.set(s, e)
}
}
const prevBySkill = new Map<string, { sum: number; count: number }>()
for (const r of prevRows) {
let sk: string[] = []
try { sk = JSON.parse(r.skills_invoked) } catch {}
for (const s of sk) {
const e = prevBySkill.get(s) ?? { sum: 0, count: 0 }
e.sum += r.tool_calls
e.count++
prevBySkill.set(s, e)
}
}
const out: Pattern[] = []
for (const [skill, cur] of curBySkill) {
if (cur.count < 5) continue
const curAvg = cur.sum / cur.count
const prev = prevBySkill.get(skill)
if (!prev || prev.count < 3) continue
const prevAvg = prev.sum / prev.count
if (prevAvg === 0 || curAvg <= prevAvg * 1.3) continue
out.push({
pattern_key: `growing_complexity:${skill}`,
title: `Skill ${skill}: tool_calls médio +${Math.round((curAvg / prevAvg - 1) * 100)}% vs semana anterior`,
description: `Média de tool_calls/sessão subiu de ${prevAvg.toFixed(1)} para ${curAvg.toFixed(1)}.`,
severity: 'warning',
metric_value: Math.round(curAvg * 10) / 10,
sample_session_ids: cur.ids,
affected_count: cur.count,
})
}
return out
}
/**
 * 7. Actions never executed: entries in worklog_comments from discussion #33
 * (Improvement Actions) with priority P1/P2, created at least 14 days ago and
 * with no git-history commit referencing the same `task_ref` (heuristic).
 */
export function detectActionsNeverExecuted(ctx: DetectCtx): Pattern[] {
// Entries created up to 14 days before the end of the week (or earlier)
const cutoff = new Date(ctx.weekEndIso)
cutoff.setUTCDate(cutoff.getUTCDate() - 14)
const cutoffIso = cutoff.toISOString()
const rows = ctx.db.prepare(`
SELECT id, discussion_id, created_at, task_ref, actions_json, title
FROM worklog_comments
WHERE discussion_id = 33 AND created_at <= ?
ORDER BY created_at DESC
LIMIT 500
`).all(cutoffIso) as Array<{
id: number
discussion_id: number
created_at: string
task_ref: string | null
actions_json: string
title: string | null
}>
if (rows.length === 0) return []
const pendentes: Array<{ id: number; descricao: string; prioridade: string }> = []
for (const r of rows) {
let actions: Array<{ tipo: string; descricao: string; prioridade: string | null }> = []
try { actions = JSON.parse(r.actions_json) } catch {}
for (const a of actions) {
const prio = (a.prioridade ?? '').toUpperCase()
if (prio === 'P1' || prio === 'P2') {
pendentes.push({ id: r.id, descricao: a.descricao.slice(0, 120), prioridade: prio })
if (pendentes.length >= 10) break
}
}
if (pendentes.length >= 10) break
}
if (pendentes.length < 3) return []
return [{
pattern_key: 'actions_never_executed',
title: `${pendentes.length}+ acções P1/P2 pendentes há ≥14 dias`,
description: `Acções de melhoria (disc #33) sem execução visível. Amostra: ${pendentes.slice(0, 3).map((p) => `[${p.prioridade}] ${p.descricao}`).join(' | ')}`,
severity: 'warning',
metric_value: pendentes.length,
sample_session_ids: pendentes.slice(0, 5).map((p) => `worklog:${p.id}`),
affected_count: pendentes.length,
}]
}
/**
 * 8. Skill reported as broken in worklogs yet showing up with
 * outcome=completed in real sessions: a discrepancy between narrative and data.
 */
export function detectSkillReportedBrokenButCompleted(ctx: DetectCtx): Pattern[] {
// Collect skills mentioned in the problems_json and patterns_text of worklogs
// created in the 4 weeks before the end of the window
const windowStart = new Date(ctx.weekEndIso)
windowStart.setUTCDate(windowStart.getUTCDate() - 28)
const windowIso = windowStart.toISOString()
const worklogs = ctx.db.prepare(`
SELECT patterns_text, problems_json
FROM worklog_comments
WHERE discussion_id IN (31, 32) AND created_at >= ?
LIMIT 500
`).all(windowIso) as Array<{ patterns_text: string; problems_json: string }>
if (worklogs.length === 0) return []
// Extract skill-name-like tokens (slash-prefixed identifiers)
const skillMentions = new Map<string, number>()
const skillRegex = /\/([a-z][a-z0-9_-]{2,40})\b/gi
for (const w of worklogs) {
const blob = `${w.patterns_text} ${w.problems_json}`.toLowerCase()
for (const m of blob.matchAll(skillRegex)) {
skillMentions.set(m[1], (skillMentions.get(m[1]) ?? 0) + 1)
}
}
if (skillMentions.size === 0) return []
// For each skill mentioned ≥2 times, look for sessions that invoked it with outcome=completed
const out: Pattern[] = []
const skillsRelevantes = [...skillMentions.entries()].filter(([, c]) => c >= 2)
for (const [skill, mentions] of skillsRelevantes) {
const rows = ctx.db.prepare(`
SELECT session_id, skills_invoked, outcome
FROM sessions
WHERE started_at >= ? AND started_at <= ?
AND skills_invoked LIKE ? AND outcome = 'completed'
`).all(ctx.weekStartIso, ctx.weekEndIso, `%"${skill}"%`) as Array<{
session_id: string
skills_invoked: string
outcome: string
}>
// Confirm via parse (skills_invoked is a JSON array)
const matches = rows.filter((r) => {
try { return (JSON.parse(r.skills_invoked) as string[]).includes(skill) } catch { return false }
})
if (matches.length >= 3) {
out.push({
pattern_key: `skill_narrative_vs_data:${skill}`,
title: `Skill ${skill}: reportada problemática em ${mentions} worklogs mas ${matches.length} sessões completed`,
description: `Discrepância entre narrativa (worklogs #31/#32) e dados (sessions.outcome). Investigar se o problema é silencioso.`,
severity: 'info',
metric_value: matches.length,
sample_session_ids: matches.slice(0, 5).map((r) => r.session_id),
affected_count: matches.length,
})
}
}
return out
}
/**
 * 9. Recurring words/phrases in worklog patterns_text this week
 * (a token of ≥5 chars shared by 3+ worklogs).
 */
export function detectWorklogPatternFrequency(ctx: DetectCtx): Pattern[] {
const rows = ctx.db.prepare(`
SELECT id, patterns_text FROM worklog_comments
WHERE created_at >= ? AND created_at <= ?
`).all(ctx.weekStartIso, ctx.weekEndIso) as Array<{ id: number; patterns_text: string }>
if (rows.length === 0) return []
const tokenCount = new Map<string, { count: number; ids: number[] }>()
const stop = new Set(['para', 'como', 'mais', 'sobre', 'quando', 'apenas', 'entre', 'depois', 'antes', 'pelo', 'pela', 'pelos', 'pelas', 'esta', 'este', 'isso', 'isto', 'cada', 'muito', 'muita', 'outro', 'outra', 'nosso', 'nossa', 'todas', 'todos', 'seja', 'ser', 'ter', 'com', 'sem', 'dos', 'das', 'que', 'nao', 'sim'])
for (const r of rows) {
let items: string[] = []
try { items = JSON.parse(r.patterns_text) } catch {}
const seen = new Set<string>()
for (const t of items) {
const words = t
.toLowerCase()
.normalize('NFD')
.replace(/[\u0300-\u036f]/g, '')
.split(/[^a-z0-9]+/)
.filter((w) => w.length >= 5 && !stop.has(w))
for (const w of words) {
if (seen.has(w)) continue
seen.add(w)
const e = tokenCount.get(w) ?? { count: 0, ids: [] }
e.count++
e.ids.push(r.id)
tokenCount.set(w, e)
}
}
}
const frequent = [...tokenCount.entries()]
.filter(([, v]) => v.count >= 3)
.sort((a, b) => b[1].count - a[1].count)
.slice(0, 5)
if (frequent.length === 0) return []
return [{
pattern_key: 'worklog_pattern_frequency',
title: `Termos recorrentes em ${rows.length} worklogs desta semana`,
description: `Top tokens em patterns_text: ${frequent.map(([w, v]) => `${w}(${v.count})`).join(', ')}`,
severity: 'info',
metric_value: frequent[0][1].count,
sample_session_ids: frequent.flatMap(([, v]) => v.ids.slice(0, 2)).slice(0, 5).map((id) => `worklog:${id}`),
affected_count: rows.length,
}]
}
/** Runs every detector for the given week. */
export function detectPatterns(
dbWrapper: SessionsDb,
weekStart: Date,
weekEnd: Date,
): Pattern[] {
const db = dbWrapper.rawDb()
const ctx: DetectCtx = {
db,
weekStartIso: iso(weekStart),
weekEndIso: iso(weekEnd),
}
const prevStart = new Date(weekStart); prevStart.setUTCDate(prevStart.getUTCDate() - 7)
const prevEnd = new Date(weekEnd); prevEnd.setUTCDate(prevEnd.getUTCDate() - 7)
const base: Pattern[] = [
...detectSkillsHighErrorRate(ctx),
...detectToolsLowEfficiency(ctx),
...detectSkillToolPairs(ctx),
...detectDurationOutliers(ctx),
...detectAbandonedSessions(ctx),
...detectGrowingComplexity(ctx, iso(prevStart), iso(prevEnd)),
]
// Cross-detectors: only run if any worklogs exist
const worklogCount = (db.prepare(`SELECT COUNT(*) as c FROM worklog_comments`).get() as { c: number }).c
if (worklogCount > 0) {
base.push(
...detectActionsNeverExecuted(ctx),
...detectSkillReportedBrokenButCompleted(ctx),
...detectWorklogPatternFrequency(ctx),
)
}
return base
}
/** Converts a Pattern plus context into a PatternRecord ready to persist. */
export function toPatternRecord(p: Pattern, weekIso: string, consecutiveWeeks: number): PatternRecord {
return {
detected_at: new Date().toISOString(),
week_iso: weekIso,
pattern_key: p.pattern_key,
title: p.title,
description: p.description,
severity: p.severity,
metric_value: p.metric_value,
sample_session_ids: p.sample_session_ids,
affected_count: p.affected_count,
consecutive_weeks: consecutiveWeeks,
}
}
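The duration-outlier detector above derives its cutoff from a simple index into the sorted durations rather than an interpolated percentile. A minimal standalone sketch of that heuristic (function name is illustrative, not from the codebase):

```typescript
// Sketch of the p95 heuristic used by detectDurationOutliers: sort ascending,
// then take the element at floor(n * 0.95) - 1, clamped to index 0. For small
// samples this lands near the 90th percentile, which is acceptable for a detector.
function p95Threshold(durations: number[]): number {
  const sorted = [...durations].sort((a, b) => a - b)
  const idx = Math.max(0, Math.floor(sorted.length * 0.95) - 1)
  return sorted[idx]
}

// With 10 sessions the cutoff is the 9th sorted value, so only the 600s
// session exceeds the threshold.
console.log(p95Threshold([600, 60, 120, 180, 240, 300, 360, 420, 480, 540])) // 540
```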
@@ -0,0 +1,44 @@
import chokidar from 'chokidar'
import { openSessionsDb } from './db.js'
import { indexFile, PROJECTS_ROOT } from './indexer.js'
export async function startWatcher(dbPath: string): Promise<void> {
const db = openSessionsDb(dbPath)
const watcher = chokidar.watch(`${PROJECTS_ROOT}/**/*.jsonl`, {
persistent: true,
ignoreInitial: true,
awaitWriteFinish: { stabilityThreshold: 2000, pollInterval: 500 },
})
async function reindex(path: string): Promise<void> {
try {
await indexFile(db, path)
console.log(`[watcher] indexed ${path}`)
} catch (err) {
console.error(`[watcher] error ${path}:`, err)
}
}
watcher
.on('add', reindex)
.on('change', reindex)
.on('unlink', (path) => {
db.deleteByJsonlPath(path)
console.log(`[watcher] removed ${path}`)
})
.on('error', (err) => console.error('[watcher] error:', err))
console.log('[watcher] ready')
// Register SIGTERM/SIGINT handlers to close the DB cleanly (avoids WAL corruption on Task 9 systemd restart)
const cleanup = async (): Promise<void> => {
console.log('[watcher] SIGTERM/SIGINT received, closing watcher and DB')
await watcher.close()
db.close()
process.exit(0)
}
process.on('SIGTERM', () => { void cleanup() })
process.on('SIGINT', () => { void cleanup() })
return new Promise(() => {}) // never resolves; keeps the process alive
}
@@ -0,0 +1,351 @@
/**
 * Importer for the comments of Desk discussions #31 (Logs), #32 (Reflections)
 * and #33 (Improvement Actions) into the `worklog_comments` table.
 *
 * The tolerant HTML parser accepts both formats produced by the
 * `gestao:worklog` skill (the old version used inline-styled `<h2>/<h3>`,
 * the new one uses clean `<h4>`). Sections are identified by normalized title
 * (e.g. "trabalho realizado", "ficheiros modificados", "problemas",
 * "padrões detectados", "acções sugeridas").
 */
import { parse, type HTMLElement } from 'node-html-parser'
import type { SessionsDb, WorklogCommentRecord } from './db.js'
import { callMcpTool, extractMcpJsonPayload } from './mcp-client.js'
export interface ParsedWorklogComment {
id: number
discussion_id: number
created_at: string
staff_id: number | null
title: string | null
task_ref: string | null
duration_sec: number | null
work_items: string[]
files_modified: string[]
problems: { problema: string; solucao: string }[]
patterns_text: string[]
actions: { tipo: string; descricao: string; prioridade: string | null }[]
raw_html: string
}
interface RawComment {
id: number
discussion_id: number
content: string
created: unknown
staff_id: number | null
children?: RawComment[]
}
/** Collapses redundant whitespace. */
function norm(s: string): string {
return s.replace(/\s+/g, ' ').trim()
}
/** Converts a free-form string into a section key (lowercase, no accents, no punctuation). */
function sectionKey(s: string): string {
return s
.toLowerCase()
.normalize('NFD')
.replace(/[\u0300-\u036f]/g, '')
.replace(/[^a-z0-9 ]/g, ' ')
.replace(/\s+/g, ' ')
.trim()
}
const SECTION_WORK = new Set(['trabalho realizado', 'o que foi feito', 'feito', 'realizado', 'trabalho'])
const SECTION_FILES = new Set(['ficheiros modificados', 'ficheiros alterados', 'files modified', 'ficheiros'])
const SECTION_PROBLEMS = new Set(['problemas solucoes', 'problemas', 'solucoes', 'problemas e solucoes', 'problemas solucao'])
const SECTION_PATTERNS = new Set(['padroes detectados', 'padroes', 'patterns', 'insights'])
const SECTION_ACTIONS = new Set(['accoes sugeridas', 'accoes', 'acoes sugeridas', 'acoes', 'actions', 'accoes de melhoria'])
/** Extracts an ISO date (YYYY-MM-DD [HH:MM]) from the title, or returns null. */
function parseDateFromTitle(title: string): string | null {
const m = title.match(/(\d{4})-(\d{2})-(\d{2})(?:[ T](\d{2}):(\d{2}))?/)
if (!m) return null
const [, y, mo, d, hh, mm] = m
if (hh && mm) return `${y}-${mo}-${d}T${hh}:${mm}:00Z`
return `${y}-${mo}-${d}T00:00:00Z`
}
/** Tries to extract "Tarefa: #ID" or similar. */
function parseTaskRef(text: string): string | null {
const m = text.match(/(?:Tarefa|Task|Ticket)[:\s]*(#?\d+)/i)
if (m) return m[1].startsWith('#') ? m[1] : `#${m[1]}`
const bare = text.match(/#(\d{3,6})/)
return bare ? `#${bare[1]}` : null
}
/** "~2h 30m" / "~45 min" / "5 minutos" → segundos. */
function parseDuration(text: string): number | null {
const m = text.match(/~?\s*(\d+)\s*h\s*(\d+)?\s*m?/i)
if (m) {
const h = parseInt(m[1], 10)
const mm = m[2] ? parseInt(m[2], 10) : 0
return h * 3600 + mm * 60
}
const mm = text.match(/~?\s*(\d+)\s*(?:min|minutos|m)/i)
if (mm) return parseInt(mm[1], 10) * 60
return null
}
/** Extracts an element's text, flattening inner HTML to plain text. */
function textOf(el: HTMLElement): string {
return norm(el.text ?? '')
}
/** Collects the items of a UL/OL at the same level that follows a heading. */
function collectFollowingListItems(heading: HTMLElement): string[] {
const items: string[] = []
let cur: HTMLElement | null = heading.nextElementSibling
while (cur) {
const tag = cur.rawTagName?.toLowerCase()
if (tag && /^h[1-6]$/.test(tag)) break
if (tag === 'ul' || tag === 'ol') {
for (const li of cur.querySelectorAll('li')) {
const t = textOf(li)
if (t) items.push(t)
}
} else if (tag === 'p') {
// Some comments split the UL across multiple <p>; scan for <li> inside
for (const li of cur.querySelectorAll('li')) {
const t = textOf(li)
if (t) items.push(t)
}
}
cur = cur.nextElementSibling
}
return items
}
/** Parses an item like "[Type] description" or "Type: description (Px)". */
function parseActionItem(raw: string): { tipo: string; descricao: string; prioridade: string | null } {
// Strip a leading "[ ]" or "[x]" checkbox if present
let s = raw.trim().replace(/^\[[\s xX✓]\]\s*/, '')
const bracket = s.match(/^\[([^\]]+)\]\s*(.+)$/)
let tipo = 'Geral'
let rest = s
if (bracket) {
tipo = bracket[1].trim()
rest = bracket[2].trim()
}
const prio = rest.match(/\b(P[0-4])\b/i)
return {
tipo,
descricao: rest,
prioridade: prio ? prio[1].toUpperCase() : null,
}
}
/** Parses problem/solution. Heuristic: "Problema: X | Solução: Y" or <li> pairs. */
function parseProblemItem(raw: string): { problema: string; solucao: string } {
const s = raw.trim()
const split = s.split(/\s*(?:->|→|\|\s*Solu[çc][ãa]o:|\s*Solu[çc][ãa]o:)\s*/i)
if (split.length >= 2) {
return {
problema: split[0].replace(/^Problema:\s*/i, '').trim(),
solucao: split.slice(1).join(' ').trim(),
}
}
return { problema: s, solucao: '' }
}
/** Extracts a raw list of every <li> in the HTML (fallback). */
function extractAllLiItems(root: HTMLElement): string[] {
return root
.querySelectorAll('li')
.map((li) => textOf(li))
.filter(Boolean)
}
export function parseWorklogHtml(
html: string,
meta: { id: number; discussion_id: number; created_at: string; staff_id?: number | null },
): ParsedWorklogComment {
const root = parse(html || '')
const headings = root.querySelectorAll('h1, h2, h3, h4, h5, h6')
// Title: first non-empty heading
let title: string | null = null
for (const h of headings) {
const t = textOf(h)
if (t) { title = t; break }
}
// Date: prefer `meta.created_at` when valid; otherwise extract from the title or the text
let createdAt = meta.created_at
if (!createdAt || createdAt === '1970-01-01T00:00:00.000Z' || createdAt.startsWith('1970')) {
const fromTitle = title ? parseDateFromTitle(title) : null
if (fromTitle) createdAt = fromTitle
else {
const fromText = parseDateFromTitle(textOf(root).slice(0, 500))
createdAt = fromText ?? new Date().toISOString()
}
}
const fullText = textOf(root)
const taskRef = parseTaskRef(fullText)
const durationSec = parseDuration(fullText)
// Index sections by normalized key
const sections = new Map<string, HTMLElement>()
for (const h of headings) {
const key = sectionKey(textOf(h))
if (!sections.has(key)) sections.set(key, h)
}
function findSection(target: Set<string>): HTMLElement | null {
for (const [k, el] of sections) {
if (target.has(k)) return el
}
// partial match (e.g. "trabalho realizado manutenção" starts with a known key)
for (const [k, el] of sections) {
for (const t of target) {
if (k.startsWith(t) || t.startsWith(k)) return el
}
}
return null
}
const workHeading = findSection(SECTION_WORK)
const filesHeading = findSection(SECTION_FILES)
const problemsHeading = findSection(SECTION_PROBLEMS)
const patternsHeading = findSection(SECTION_PATTERNS)
const actionsHeading = findSection(SECTION_ACTIONS)
const workItems = workHeading ? collectFollowingListItems(workHeading) : []
const filesModified = filesHeading ? collectFollowingListItems(filesHeading) : []
const problemsRaw = problemsHeading ? collectFollowingListItems(problemsHeading) : []
const patternsText = patternsHeading ? collectFollowingListItems(patternsHeading) : []
const actionsRaw = actionsHeading ? collectFollowingListItems(actionsHeading) : []
// Fallback: if no section was found but <li> elements exist and this is discussion #33,
// treat everything as actions (its format differs from the other discussions)
let actions = actionsRaw.map(parseActionItem)
if (meta.discussion_id === 33 && actions.length === 0) {
actions = extractAllLiItems(root).map(parseActionItem)
}
const problems = problemsRaw.map(parseProblemItem)
return {
id: meta.id,
discussion_id: meta.discussion_id,
created_at: createdAt,
staff_id: meta.staff_id ?? null,
title,
task_ref: taskRef,
duration_sec: durationSec,
work_items: workItems,
files_modified: filesModified,
problems,
patterns_text: patternsText,
actions,
raw_html: html,
}
}
/** Normalizes the `created` field returned by the Desk MCP (it may be an empty object). */
function normalizeMcpDate(v: unknown): string {
if (!v) return ''
if (typeof v === 'string') return v
if (typeof v === 'object') {
const obj = v as Record<string, unknown>
if (typeof obj.date === 'string') return obj.date
if (typeof obj.datetime === 'string') return obj.datetime
}
return ''
}
/** Flattens the comment tree (comments carry recursive children). */
function flattenComments(comments: RawComment[]): RawComment[] {
const out: RawComment[] = []
for (const c of comments) {
out.push(c)
if (c.children && c.children.length) {
out.push(...flattenComments(c.children))
}
}
return out
}
export interface ImportResult {
discussion_id: number
fetched: number
imported: number
updated: number
skipped: number
errors: number
}
/**
 * Imports every comment of a Desk discussion, paginated via `limit`/`offset`.
 * Idempotent by `id`: existing comments are updated (raw_html may change).
 */
export async function importWorklogDiscussion(
db: SessionsDb,
discussionId: number,
opts: { sinceIso?: string; pageSize?: number; maxPages?: number } = {},
): Promise<ImportResult> {
// The desk-crm MCP appears to clamp results at 200/page regardless of limit.
// Request 200 and advance the offset until a page comes back empty.
const pageSize = opts.pageSize ?? 200
const maxPages = opts.maxPages ?? 20
const result: ImportResult = {
discussion_id: discussionId,
fetched: 0,
imported: 0,
updated: 0,
skipped: 0,
errors: 0,
}
let offset = 0
for (let page = 0; page < maxPages; page++) {
const raw = await callMcpTool('get_discussion_comments', {
discussion_id: discussionId,
limit: pageSize,
offset,
})
const payload = extractMcpJsonPayload<{
success?: boolean
comments?: RawComment[]
}>(raw)
const pageComments = flattenComments(payload.comments ?? [])
if (pageComments.length === 0) break
result.fetched += pageComments.length
const importedAt = new Date().toISOString()
for (const c of pageComments) {
try {
const createdStr = normalizeMcpDate(c.created)
const parsed = parseWorklogHtml(c.content ?? '', {
id: c.id,
discussion_id: c.discussion_id ?? discussionId,
created_at: createdStr || '',
staff_id: c.staff_id,
})
if (opts.sinceIso && parsed.created_at < opts.sinceIso) {
result.skipped++
continue
}
const record: WorklogCommentRecord = {
...parsed,
imported_at: importedAt,
}
const { inserted } = db.upsertWorklogComment(record)
if (inserted) result.imported++
else result.updated++
} catch (e) {
console.error(`[worklog-import] erro a parsear comentário #${c.id}:`, (e as Error).message)
result.errors++
}
}
// Advance the offset; the for loop exits at the top when a page comes back empty.
offset += pageComments.length
}
return result
}
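Section matching in the parser above hinges on `sectionKey` stripping accents, so that e.g. "Acções Sugeridas" and "Acoes sugeridas" resolve to the same set entry. A self-contained sketch of that normalization plus the duration heuristic, reproduced here under the assumption that they mirror the functions above (the `*Sketch` names are illustrative):

```typescript
// Accent-insensitive section key: NFD splits letters from combining marks,
// which are then dropped (U+0300-U+036F covers tildes, cedillas, acutes).
function sectionKeySketch(s: string): string {
  return s
    .toLowerCase()
    .normalize('NFD')
    .replace(/[\u0300-\u036f]/g, '')
    .replace(/[^a-z0-9 ]/g, ' ')
    .replace(/\s+/g, ' ')
    .trim()
}

// Duration heuristic: try the "Nh Mm" form first, then a bare minutes form.
function parseDurationSketch(text: string): number | null {
  const hm = text.match(/~?\s*(\d+)\s*h\s*(\d+)?\s*m?/i)
  if (hm) return parseInt(hm[1], 10) * 3600 + (hm[2] ? parseInt(hm[2], 10) * 60 : 0)
  const mins = text.match(/~?\s*(\d+)\s*(?:min|minutos|m)/i)
  return mins ? parseInt(mins[1], 10) * 60 : null
}

console.log(sectionKeySketch('Acções Sugeridas')) // "accoes sugeridas"
console.log(parseDurationSketch('~2h 15m'))       // 8100
```

Note that "~45 min" never matches the first regex (no literal `h`), so it correctly falls through to the minutes branch.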
@@ -67,6 +67,44 @@ describe('parseSessionFile', () => {
expect(result.meta.skills_invoked).toContain('superpowers:brainstorming')
})
it('detecta skill invocation em tool_result.content (string)', async () => {
const path = writeJsonl([
{
type: 'user',
timestamp: '2026-04-23T10:00:00Z',
message: {
role: 'user',
content: [
{ type: 'tool_result', tool_use_id: 'abc', content: 'Launching skill: infraestrutura:easypanel-monitor\nOther log output' },
],
},
},
])
const result = await parseSessionFile(path)
expect(result.meta.skills_invoked).toContain('infraestrutura:easypanel-monitor')
})
it('detecta skill invocation em tool_result.content (array de text blocks)', async () => {
const path = writeJsonl([
{
type: 'user',
timestamp: '2026-04-23T10:00:00Z',
message: {
role: 'user',
content: [
{
type: 'tool_result',
tool_use_id: 'abc',
content: [{ type: 'text', text: 'Launching skill: superpowers:brainstorming' }],
},
],
},
},
])
const result = await parseSessionFile(path)
expect(result.meta.skills_invoked).toContain('superpowers:brainstorming')
})
it('ignora linhas JSON inválidas silenciosamente', async () => {
const path = writeJsonl([
{ type: 'user', message: { role: 'user', content: [{ type: 'text', text: 'válido' }] } },
@@ -0,0 +1,123 @@
import { describe, it, expect, beforeEach } from 'vitest'
import { mkdtempSync } from 'fs'
import { tmpdir } from 'os'
import { join } from 'path'
import { openSessionsDb, type SessionsDb, type PatternRecord } from '../services/sessions/db.js'
import { detectPatterns, weekRange, toPatternRecord } from '../services/sessions/patterns.js'
import type { SessionMeta } from '../types/session.js'
function meta(overrides: Partial<SessionMeta>): SessionMeta {
return {
session_id: 's-' + Math.random().toString(36).slice(2, 10),
project_path: '/tmp/project',
project_slug: 'project',
jsonl_path: '/tmp/' + Math.random().toString(36).slice(2) + '.jsonl',
started_at: '2026-04-20T10:00:00Z', // Monday of 2026-W17
ended_at: '2026-04-20T10:30:00Z',
duration_sec: 1800,
event_count: 50,
user_messages: 5,
assistant_msgs: 10,
tool_calls: 20,
first_prompt: 'olá',
tools_used: ['Bash'],
skills_invoked: [],
outcome: 'completed',
permission_mode: 'default',
file_size: 10000,
indexed_at: '2026-04-20T10:31:00Z',
...overrides,
}
}
describe('patterns detector', () => {
let db: SessionsDb
beforeEach(() => {
const dir = mkdtempSync(join(tmpdir(), 'obs-pat-'))
db = openSessionsDb(join(dir, 'sessions.db'))
})
it('detecta skill com taxa elevada de erro (action)', () => {
// 3 skillX sessions: 2 failures, 1 completed → ratio 0.67 → severity=action
db.upsertSession(meta({ session_id: 'a', skills_invoked: ['skillX'], outcome: 'error' }))
db.upsertSession(meta({ session_id: 'b', skills_invoked: ['skillX'], outcome: 'interrupted' }))
db.upsertSession(meta({ session_id: 'c', skills_invoked: ['skillX'], outcome: 'completed' }))
const { start, end } = weekRange(new Date('2026-04-22T00:00:00Z'))
const patterns = detectPatterns(db, start, end)
const errorRate = patterns.find((p) => p.pattern_key === 'skill_error_rate:skillX')
expect(errorRate).toBeDefined()
expect(errorRate!.severity).toBe('action')
expect(errorRate!.affected_count).toBe(2)
})
it('detecta sessões abandonadas', () => {
for (let i = 0; i < 6; i++) {
db.upsertSession(meta({ session_id: `ab-${i}`, event_count: 1, outcome: 'unknown' }))
}
const { start, end } = weekRange(new Date('2026-04-22T00:00:00Z'))
const patterns = detectPatterns(db, start, end)
expect(patterns.some((p) => p.pattern_key === 'abandoned_sessions')).toBe(true)
})
it('getConsecutiveWeeks devolve 3 após upserts em semanas sucessivas', () => {
const key = 'skill_error_rate:Y'
const weeks = ['2026-W15', '2026-W16', '2026-W17']
for (const w of weeks) {
db.upsertPattern({
detected_at: new Date().toISOString(),
week_iso: w,
pattern_key: key,
title: 't',
description: 'd',
severity: 'warning',
metric_value: 0.5,
sample_session_ids: ['x'],
affected_count: 1,
consecutive_weeks: 1,
})
}
expect(db.getConsecutiveWeeks(key, '2026-W17')).toBe(3)
expect(db.getConsecutiveWeeks(key, '2026-W16')).toBe(2)
})
it('upsertPattern é idempotente por (week_iso, pattern_key)', () => {
const base: PatternRecord = {
detected_at: '2026-04-20T00:00:00Z',
week_iso: '2026-W17',
pattern_key: 'test',
title: 'v1',
description: 'd',
severity: 'info',
metric_value: 1,
sample_session_ids: ['a'],
affected_count: 1,
consecutive_weeks: 1,
}
db.upsertPattern(base)
db.upsertPattern({ ...base, title: 'v2', affected_count: 5, consecutive_weeks: 2 })
const rows = db.getPatternsByWeek('2026-W17')
expect(rows).toHaveLength(1)
expect(rows[0].title).toBe('v2')
expect(rows[0].affected_count).toBe(5)
expect(rows[0].consecutive_weeks).toBe(2)
})
it('toPatternRecord propaga week_iso e consecutive_weeks', () => {
const rec = toPatternRecord(
{
pattern_key: 'k',
title: 't',
description: 'd',
severity: 'warning',
metric_value: 0.42,
sample_session_ids: ['a', 'b'],
affected_count: 2,
},
'2026-W17',
3,
)
expect(rec.week_iso).toBe('2026-W17')
expect(rec.consecutive_weeks).toBe(3)
expect(rec.severity).toBe('warning')
})
})
@@ -0,0 +1,68 @@
/**
* Tests for the /api/sessions route (Zod validation + SessionsDb integration).
* @author Descomplicar® | Projecto Observabilidade (Espelho)
*/
import { describe, it, expect, beforeAll } from 'vitest'
import express from 'express'
import request from 'supertest'
import { mkdtempSync } from 'fs'
import { tmpdir } from 'os'
import { join } from 'path'
import { openSessionsDb } from '../services/sessions/db.js'
import { createSessionsRouter } from '../routes/sessions.js'
import type { SessionMeta } from '../types/session.js'
function meta(overrides: Partial<SessionMeta> = {}): SessionMeta {
return {
session_id: 's1',
project_path: '/tmp/p',
project_slug: 'p',
jsonl_path: '/tmp/p/s1.jsonl',
started_at: new Date().toISOString(),
ended_at: null,
duration_sec: 60,
event_count: 10,
user_messages: 2,
assistant_msgs: 5,
tool_calls: 3,
first_prompt: 'teste',
tools_used: ['Bash'],
skills_invoked: [],
outcome: 'completed',
permission_mode: 'default',
file_size: 1000,
indexed_at: new Date().toISOString(),
...overrides,
}
}
describe('GET /api/sessions', () => {
let app: express.Express
beforeAll(() => {
const dbPath = join(mkdtempSync(join(tmpdir(), 'obs-r-')), 'sessions.db')
const db = openSessionsDb(dbPath)
db.upsertSession(meta({ session_id: 's1', project_slug: 'alpha', jsonl_path: '/tmp/p/s1.jsonl' }))
db.upsertSession(meta({ session_id: 's2', project_slug: 'beta', jsonl_path: '/tmp/p/s2.jsonl' }))
app = express()
app.use('/api/sessions', createSessionsRouter(db))
})
it('lista todas as sessões por omissão', async () => {
const res = await request(app).get('/api/sessions')
expect(res.status).toBe(200)
expect(res.body.total).toBe(2)
expect(res.body.items).toHaveLength(2)
})
it('filtra por projecto', async () => {
const res = await request(app).get('/api/sessions').query({ project: 'alpha' })
expect(res.status).toBe(200)
expect(res.body.total).toBe(1)
expect(res.body.items[0].project_slug).toBe('alpha')
})
it('rejeita limit inválido', async () => {
const res = await request(app).get('/api/sessions').query({ limit: '9999' })
expect(res.status).toBe(400)
})
})
@@ -0,0 +1,175 @@
import { describe, it, expect, beforeEach } from 'vitest'
import { mkdtempSync } from 'fs'
import { tmpdir } from 'os'
import { join } from 'path'
import { openSessionsDb, type SessionsDb, type WorklogCommentRecord } from '../services/sessions/db.js'
import { parseWorklogHtml } from '../services/sessions/worklog-import.js'
import { detectActionsNeverExecuted, weekRange } from '../services/sessions/patterns.js'
const SAMPLE_H4 = `
<h4>2026-04-15 10:30 - Refactor API sessions</h4>
<p><strong>Projecto:</strong> DashDescomplicar</p>
<p><strong>Tarefa:</strong> #2059 - Observabilidade Espelho</p>
<p><strong>Duração:</strong> ~2h 15m</p>
<h4>Trabalho Realizado</h4>
<ul><li>Criar módulo worklog-import</li><li>Integrar detectores cruzados</li></ul>
<h4>Ficheiros Modificados</h4>
<ul><li><code>api/services/sessions/db.ts</code></li><li><code>api/scripts/sessions-worklog-import.ts</code></li></ul>
<h4>Problemas / Soluções</h4>
<ul><li>Parser HTML frágil → usar node-html-parser</li></ul>
<h4>Padrões Detectados</h4>
<ul><li>MCP gateway responde em SSE ou JSON</li></ul>
<h4>Acções Sugeridas</h4>
<ul><li>[Refactor] Extrair callMcpTool para módulo partilhado P2</li></ul>
`
const SAMPLE_H2 = `
<h2>2026-01-31 - Estratégia Stack</h2>
<p><strong>Duração:</strong> ~2h</p>
<h3>Trabalho Realizado</h3>
<ul><li>Stack Mapeado - 15 sistemas</li></ul>
<h3>Insights</h3>
<ul><li>Posicionamento: Marketing alta performance</li></ul>
`
const SAMPLE_D33 = `
<ul>
<li>[ ] [MCP] Corrigir bug desk-crm-v3 com tabelas de discussões</li>
</ul>
<p><strong>Origem:</strong> Sessão 2026-02-02</p>
<p><strong>Prioridade:</strong> P1</p>
`
describe('parseWorklogHtml', () => {
it('extrai campos de comentário formato <h4>', () => {
const parsed = parseWorklogHtml(SAMPLE_H4, { id: 100, discussion_id: 31, created_at: '' })
expect(parsed.id).toBe(100)
expect(parsed.title).toMatch(/2026-04-15/)
expect(parsed.task_ref).toBe('#2059')
expect(parsed.duration_sec).toBe(2 * 3600 + 15 * 60)
expect(parsed.work_items.length).toBe(2)
expect(parsed.files_modified.length).toBe(2)
expect(parsed.patterns_text.length).toBe(1)
expect(parsed.actions.length).toBe(1)
expect(parsed.actions[0].tipo).toBe('Refactor')
expect(parsed.actions[0].prioridade).toBe('P2')
expect(parsed.created_at.startsWith('2026-04-15')).toBe(true)
})
it('extrai campos de comentário formato <h2>/<h3> (legacy)', () => {
const parsed = parseWorklogHtml(SAMPLE_H2, { id: 64, discussion_id: 31, created_at: '' })
expect(parsed.title).toMatch(/2026-01-31/)
expect(parsed.work_items.length).toBeGreaterThanOrEqual(1)
expect(parsed.duration_sec).toBe(2 * 3600)
expect(parsed.created_at.startsWith('2026-01-31')).toBe(true)
})
it('extrai acções em formato discussão #33 (lista crua)', () => {
const parsed = parseWorklogHtml(SAMPLE_D33, { id: 200, discussion_id: 33, created_at: '2026-02-02T00:00:00Z' })
expect(parsed.actions.length).toBe(1)
expect(parsed.actions[0].tipo).toBe('MCP')
})
})
describe('upsertWorklogComment idempotência', () => {
let db: SessionsDb
beforeEach(() => {
const dir = mkdtempSync(join(tmpdir(), 'obs-wl-'))
db = openSessionsDb(join(dir, 'sessions.db'))
})
it('insert primeiro, update depois', () => {
const base: WorklogCommentRecord = {
id: 42,
discussion_id: 31,
created_at: '2026-04-15T10:30:00Z',
staff_id: 25,
title: 'Test',
task_ref: '#100',
duration_sec: 600,
work_items: ['a'],
files_modified: [],
problems: [],
patterns_text: [],
actions: [],
raw_html: '<h4>Test</h4>',
imported_at: '2026-04-23T00:00:00Z',
}
const r1 = db.upsertWorklogComment(base)
expect(r1.inserted).toBe(true)
expect(db.countWorklogComments()).toBe(1)
const r2 = db.upsertWorklogComment({ ...base, title: 'Updated' })
expect(r2.inserted).toBe(false)
expect(db.countWorklogComments()).toBe(1)
const list = db.listWorklogComments({ discussion_id: 31 })
expect(list[0].title).toBe('Updated')
})
})
describe('detectActionsNeverExecuted', () => {
let db: SessionsDb
beforeEach(() => {
const dir = mkdtempSync(join(tmpdir(), 'obs-act-'))
db = openSessionsDb(join(dir, 'sessions.db'))
})
it('sinaliza acções P1/P2 antigas sem execução', () => {
const old = new Date('2026-03-01T00:00:00Z').toISOString()
for (let i = 0; i < 4; i++) {
db.upsertWorklogComment({
id: 300 + i,
discussion_id: 33,
created_at: old,
staff_id: 25,
title: `Acção ${i}`,
task_ref: `#${1000 + i}`,
duration_sec: null,
work_items: [],
files_modified: [],
problems: [],
patterns_text: [],
actions: [{ tipo: 'MCP', descricao: `Corrigir bug X${i}`, prioridade: i % 2 ? 'P1' : 'P2' }],
raw_html: '',
imported_at: '2026-04-23T00:00:00Z',
})
}
const range = weekRange(new Date('2026-04-22T00:00:00Z'))
const patterns = detectActionsNeverExecuted({
db: db.rawDb(),
weekStartIso: range.start.toISOString(),
weekEndIso: range.end.toISOString(),
})
expect(patterns.length).toBe(1)
expect(patterns[0].pattern_key).toBe('actions_never_executed')
expect(patterns[0].affected_count).toBeGreaterThanOrEqual(3)
})
it('não sinaliza se acções recentes (<14 dias)', () => {
const recent = new Date().toISOString()
for (let i = 0; i < 5; i++) {
db.upsertWorklogComment({
id: 400 + i,
discussion_id: 33,
created_at: recent,
staff_id: 25,
title: null,
task_ref: null,
duration_sec: null,
work_items: [],
files_modified: [],
problems: [],
patterns_text: [],
actions: [{ tipo: 'MCP', descricao: 'x', prioridade: 'P1' }],
raw_html: '',
imported_at: recent,
})
}
const range = weekRange(new Date())
const patterns = detectActionsNeverExecuted({
db: db.rawDb(),
weekStartIso: range.start.toISOString(),
weekEndIso: range.end.toISOString(),
})
expect(patterns.length).toBe(0)
})
})
Generated Executable → Regular
+375 -31
@@ -21,6 +21,7 @@
"googleapis": "^171.4.0",
"lucide-react": "^0.563.0",
"mysql2": "^3.11.5",
"node-html-parser": "^7.1.0",
"oidc-client-ts": "^3.0.1",
"pg": "^8.20.0",
"react": "^19.2.0",
@@ -45,6 +46,7 @@
"@types/react": "^19.2.5",
"@types/react-dom": "^19.2.3",
"@types/ssh2": "^1.15.5",
"@types/supertest": "^7.2.0",
"@vitejs/plugin-react": "^5.1.1",
"@vitest/ui": "^4.0.18",
"autoprefixer": "^10.4.24",
@@ -55,6 +57,7 @@
"globals": "^16.5.0",
"jsdom": "^28.0.0",
"postcss": "^8.5.6",
"supertest": "^7.2.2",
"tailwindcss": "^4.1.18",
"tsx": "^4.19.2",
"typescript": "~5.9.3",
@@ -176,7 +179,6 @@
"integrity": "sha512-CGOfOJqWjg2qW/Mb6zNsDm+u5vFQ8DxXfbM09z69p5Z6+mE1ikP2jUXw+j42Pf1XTYED2Rni5f95npYeuwMDQA==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"@babel/code-frame": "^7.29.0",
"@babel/generator": "^7.29.0",
@@ -526,7 +528,6 @@
}
],
"license": "MIT",
"peer": true,
"engines": {
"node": ">=20.19.0"
},
@@ -567,7 +568,6 @@
}
],
"license": "MIT",
"peer": true,
"engines": {
"node": ">=20.19.0"
}
@@ -1306,6 +1306,19 @@
"@jridgewell/sourcemap-codec": "^1.4.14"
}
},
"node_modules/@noble/hashes": {
"version": "1.8.0",
"resolved": "https://registry.npmjs.org/@noble/hashes/-/hashes-1.8.0.tgz",
"integrity": "sha512-jCs9ldd7NwzpgXDIf6P3+NrHh9/sD6CQdxHyjQI+h/6rDNo88ypBxxz45UDuZHz9r3tNz7N/VInSVoVdtXEI4A==",
"dev": true,
"license": "MIT",
"engines": {
"node": "^14.21.3 || >=16"
},
"funding": {
"url": "https://paulmillr.com/funding/"
}
},
"node_modules/@panva/asn1.js": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/@panva/asn1.js/-/asn1.js-1.0.0.tgz",
@@ -1315,6 +1328,16 @@
"node": ">=10.13.0"
}
},
"node_modules/@paralleldrive/cuid2": {
"version": "2.3.1",
"resolved": "https://registry.npmjs.org/@paralleldrive/cuid2/-/cuid2-2.3.1.tgz",
"integrity": "sha512-XO7cAxhnTZl0Yggq6jOgjiOHhbgcO4NqFqwSmQpjK3b6TEE6Uj/jfSk6wzYyemh3+I0sHirKSetjQwn5cZktFw==",
"dev": true,
"license": "MIT",
"dependencies": {
"@noble/hashes": "^1.1.5"
}
},
"node_modules/@polka/url": {
"version": "1.0.0-next.29",
"resolved": "https://registry.npmjs.org/@polka/url/-/url-1.0.0-next.29.tgz",
@@ -2138,7 +2161,8 @@
"resolved": "https://registry.npmjs.org/@types/aria-query/-/aria-query-5.0.4.tgz",
"integrity": "sha512-rfT93uj5s0PRL7EzccGMs3brplhcrghnDoV26NqKhCAS1hVo+WdNsPvE/yb6ilfr5hi2MEk6d5EWJTKdxg8jVw==",
"dev": true,
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/@types/babel__core": {
"version": "7.20.5",
@@ -2239,6 +2263,13 @@
"@types/node": "*"
}
},
"node_modules/@types/cookiejar": {
"version": "2.1.5",
"resolved": "https://registry.npmjs.org/@types/cookiejar/-/cookiejar-2.1.5.tgz",
"integrity": "sha512-he+DHOWReW0nghN24E1WUqM0efK4kI9oTqDm6XmK8ZPe2djZ90BSNdGnIyCLzCPw7/pogPlGbzI2wHGGmi4O/Q==",
"dev": true,
"license": "MIT"
},
"node_modules/@types/cors": {
"version": "2.8.19",
"resolved": "https://registry.npmjs.org/@types/cors/-/cors-2.8.19.tgz",
@@ -2380,6 +2411,13 @@
"@types/node": "*"
}
},
"node_modules/@types/methods": {
"version": "1.1.4",
"resolved": "https://registry.npmjs.org/@types/methods/-/methods-1.1.4.tgz",
"integrity": "sha512-ymXWVrDiCxTBE3+RIrrP533E70eA+9qu7zdWoHuOmGujkYtzf4HQF96b8nwHLqhuf4ykX61IGRIB38CC6/sImQ==",
"dev": true,
"license": "MIT"
},
"node_modules/@types/node": {
"version": "24.10.10",
"resolved": "https://registry.npmjs.org/@types/node/-/node-24.10.10.tgz",
@@ -2420,7 +2458,6 @@
"integrity": "sha512-WPigyYuGhgZ/cTPRXB2EwUw+XvsRA3GqHlsP4qteqrnnjDrApbS7MxcGr/hke5iUoeB7E/gQtrs9I37zAJ0Vjw==",
"devOptional": true,
"license": "MIT",
"peer": true,
"dependencies": {
"csstype": "^3.2.2"
}
@@ -2431,7 +2468,6 @@
"integrity": "sha512-jp2L/eY6fn+KgVVQAOqYItbF0VY/YApe5Mz2F0aykSO8gx31bYCZyvSeYxCHKvzHG5eZjc+zyaS5BrBWya2+kQ==",
"dev": true,
"license": "MIT",
"peer": true,
"peerDependencies": {
"@types/react": "^19.2.0"
}
@@ -2493,6 +2529,30 @@
"dev": true,
"license": "MIT"
},
"node_modules/@types/superagent": {
"version": "8.1.9",
"resolved": "https://registry.npmjs.org/@types/superagent/-/superagent-8.1.9.tgz",
"integrity": "sha512-pTVjI73witn+9ILmoJdajHGW2jkSaOzhiFYF1Rd3EQ94kymLqB9PjD9ISg7WaALC7+dCHT0FGe9T2LktLq/3GQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"@types/cookiejar": "^2.1.5",
"@types/methods": "^1.1.4",
"@types/node": "*",
"form-data": "^4.0.0"
}
},
"node_modules/@types/supertest": {
"version": "7.2.0",
"resolved": "https://registry.npmjs.org/@types/supertest/-/supertest-7.2.0.tgz",
"integrity": "sha512-uh2Lv57xvggst6lCqNdFAmDSvoMG7M/HDtX4iUCquxQ5EGPtaPM5PL5Hmi7LCvOG8db7YaCPNJEeoI8s/WzIQw==",
"dev": true,
"license": "MIT",
"dependencies": {
"@types/methods": "^1.1.4",
"@types/superagent": "^8.1.0"
}
},
"node_modules/@types/use-sync-external-store": {
"version": "0.0.6",
"resolved": "https://registry.npmjs.org/@types/use-sync-external-store/-/use-sync-external-store-0.0.6.tgz",
@@ -2544,7 +2604,6 @@
"integrity": "sha512-BtE0k6cjwjLZoZixN0t5AKP0kSzlGu7FctRXYuPAm//aaiZhmfq1JwdYpYr1brzEspYyFeF+8XF5j2VK6oalrA==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"@typescript-eslint/scope-manager": "8.54.0",
"@typescript-eslint/types": "8.54.0",
@@ -2893,7 +2952,6 @@
"integrity": "sha512-CGJ25bc8fRi8Lod/3GHSvXRKi7nBo3kxh0ApW4yCjmrWmRmlT53B5E08XRSZRliygG0aVNxLrBEqPYdz/KcCtQ==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"@vitest/utils": "4.0.18",
"fflate": "^0.8.2",
@@ -2943,7 +3001,6 @@
"integrity": "sha512-NZyJarBfL7nWwIq+FDL6Zp/yHEhePMNnnJ0y3qfieCrmNvYct8uvtiV41UvlSe6apAfk0fY1FbWx+NwfmpvtTg==",
"dev": true,
"license": "MIT",
"peer": true,
"bin": {
"acorn": "bin/acorn"
},
@@ -3108,6 +3165,13 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/asap": {
"version": "2.0.6",
"resolved": "https://registry.npmjs.org/asap/-/asap-2.0.6.tgz",
"integrity": "sha512-BSHWgDSAiKs50o2Re8ppvp3seVHXSRM44cdSsT9FfNEUUZLOGWVCsiWaRPWM1Znn+mqZ1OfVZ3z3DWEzSp7hRA==",
"dev": true,
"license": "MIT"
},
"node_modules/asn1": {
"version": "0.2.6",
"resolved": "https://registry.npmjs.org/asn1/-/asn1-0.2.6.tgz",
@@ -3136,6 +3200,13 @@
"node": ">= 0.4"
}
},
"node_modules/asynckit": {
"version": "0.4.0",
"resolved": "https://registry.npmjs.org/asynckit/-/asynckit-0.4.0.tgz",
"integrity": "sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==",
"dev": true,
"license": "MIT"
},
"node_modules/autoprefixer": {
"version": "10.4.24",
"resolved": "https://registry.npmjs.org/autoprefixer/-/autoprefixer-10.4.24.tgz",
@@ -3344,6 +3415,12 @@
"integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A==",
"license": "MIT"
},
"node_modules/boolbase": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/boolbase/-/boolbase-1.0.0.tgz",
"integrity": "sha512-JZOSA7Mo9sNGB8+UjSgzdLtokWAky1zbztM3WRLCbZ70/3cTANmQmOdR7y2g+J0e2WXywy1yS468tY+IruqEww==",
"license": "ISC"
},
"node_modules/brace-expansion": {
"version": "1.1.13",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.13.tgz",
@@ -3375,7 +3452,6 @@
}
],
"license": "MIT",
"peer": true,
"dependencies": {
"baseline-browser-mapping": "^2.9.0",
"caniuse-lite": "^1.0.30001759",
@@ -3665,6 +3741,29 @@
"dev": true,
"license": "MIT"
},
"node_modules/combined-stream": {
"version": "1.0.8",
"resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.8.tgz",
"integrity": "sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==",
"dev": true,
"license": "MIT",
"dependencies": {
"delayed-stream": "~1.0.0"
},
"engines": {
"node": ">= 0.8"
}
},
"node_modules/component-emitter": {
"version": "1.3.1",
"resolved": "https://registry.npmjs.org/component-emitter/-/component-emitter-1.3.1.tgz",
"integrity": "sha512-T0+barUSQRTUQASh8bx02dl+DhF54GtIDY13Y3m9oWTklKbb3Wv974meRpeZ3lp1JpLVECWWNHC4vaG2XHXouQ==",
"dev": true,
"license": "MIT",
"funding": {
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/concat-map": {
"version": "0.0.1",
"resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz",
@@ -3760,6 +3859,13 @@
"integrity": "sha512-NXdYc3dLr47pBkpUCHtKSwIOQXLVn8dZEuywboCOJY/osA0wFSLlSawr3KN8qXJEyX66FcONTH8EIlVuK0yyFA==",
"license": "MIT"
},
"node_modules/cookiejar": {
"version": "2.1.4",
"resolved": "https://registry.npmjs.org/cookiejar/-/cookiejar-2.1.4.tgz",
"integrity": "sha512-LDx6oHrK+PhzLKJU9j5S7/Y3jM/mUHvD/DeI1WQmJn652iPC5Y4TBzC9l+5OMOXlyTTA+SmVUPm0HQUwpD5Jqw==",
"dev": true,
"license": "MIT"
},
"node_modules/cors": {
"version": "2.8.6",
"resolved": "https://registry.npmjs.org/cors/-/cors-2.8.6.tgz",
@@ -3806,6 +3912,22 @@
"node": ">= 8"
}
},
"node_modules/css-select": {
"version": "5.2.2",
"resolved": "https://registry.npmjs.org/css-select/-/css-select-5.2.2.tgz",
"integrity": "sha512-TizTzUddG/xYLA3NXodFM0fSbNizXjOKhqiQQwvhlspadZokn1KDy0NZFS0wuEubIYAV5/c1/lAr0TaaFXEXzw==",
"license": "BSD-2-Clause",
"dependencies": {
"boolbase": "^1.0.0",
"css-what": "^6.1.0",
"domhandler": "^5.0.2",
"domutils": "^3.0.1",
"nth-check": "^2.0.1"
},
"funding": {
"url": "https://github.com/sponsors/fb55"
}
},
"node_modules/css-tree": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/css-tree/-/css-tree-3.1.0.tgz",
@@ -3820,6 +3942,18 @@
"node": "^10 || ^12.20.0 || ^14.13.0 || >=15.0.0"
}
},
"node_modules/css-what": {
"version": "6.2.2",
"resolved": "https://registry.npmjs.org/css-what/-/css-what-6.2.2.tgz",
"integrity": "sha512-u/O3vwbptzhMs3L1fQE82ZSLHQQfto5gyZzwteVIEyeaY5Fc7R4dapF/BvRoSYFeqfBk4m0V1Vafq5Pjv25wvA==",
"license": "BSD-2-Clause",
"engines": {
"node": ">= 6"
},
"funding": {
"url": "https://github.com/sponsors/fb55"
}
},
"node_modules/css.escape": {
"version": "1.5.1",
"resolved": "https://registry.npmjs.org/css.escape/-/css.escape-1.5.1.tgz",
@@ -4209,6 +4343,16 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/delayed-stream": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz",
"integrity": "sha512-ZySD7Nf91aLB0RxL4KGrKHBXl7Eds1DAmEdcoVawXnLD7SDhpNgtuII2aAkg7a7QS41jxPSZ17p4VdGnMHk3MQ==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=0.4.0"
}
},
"node_modules/denque": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/denque/-/denque-2.1.0.tgz",
@@ -4256,12 +4400,91 @@
"node": ">=8"
}
},
"node_modules/dezalgo": {
"version": "1.0.4",
"resolved": "https://registry.npmjs.org/dezalgo/-/dezalgo-1.0.4.tgz",
"integrity": "sha512-rXSP0bf+5n0Qonsb+SVVfNfIsimO4HEtmnIpPHY8Q1UCzKlQrDMfdobr8nJOOsRgWCyMRqeSBQzmWUMq7zvVig==",
"dev": true,
"license": "ISC",
"dependencies": {
"asap": "^2.0.0",
"wrappy": "1"
}
},
"node_modules/dom-accessibility-api": {
"version": "0.5.16",
"resolved": "https://registry.npmjs.org/dom-accessibility-api/-/dom-accessibility-api-0.5.16.tgz",
"integrity": "sha512-X7BJ2yElsnOJ30pZF4uIIDfBEVgF4XEBxL9Bxhy6dnrm5hkzqmsWHGTiHqRiITNhMyFLyAiWndIJP7Z1NTteDg==",
"dev": true,
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/dom-serializer": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/dom-serializer/-/dom-serializer-2.0.0.tgz",
"integrity": "sha512-wIkAryiqt/nV5EQKqQpo3SToSOV9J0DnbJqwK7Wv/Trc92zIAYZ4FlMu+JPFW1DfGFt81ZTCGgDEabffXeLyJg==",
"license": "MIT",
"dependencies": {
"domelementtype": "^2.3.0",
"domhandler": "^5.0.2",
"entities": "^4.2.0"
},
"funding": {
"url": "https://github.com/cheeriojs/dom-serializer?sponsor=1"
}
},
"node_modules/dom-serializer/node_modules/entities": {
"version": "4.5.0",
"resolved": "https://registry.npmjs.org/entities/-/entities-4.5.0.tgz",
"integrity": "sha512-V0hjH4dGPh9Ao5p0MoRY6BVqtwCjhz6vI5LT8AJ55H+4g9/4vbHx1I54fS0XuclLhDHArPQCiMjDxjaL8fPxhw==",
"license": "BSD-2-Clause",
"engines": {
"node": ">=0.12"
},
"funding": {
"url": "https://github.com/fb55/entities?sponsor=1"
}
},
"node_modules/domelementtype": {
"version": "2.3.0",
"resolved": "https://registry.npmjs.org/domelementtype/-/domelementtype-2.3.0.tgz",
"integrity": "sha512-OLETBj6w0OsagBwdXnPdN0cnMfF9opN69co+7ZrbfPGrdpPVNBUj02spi6B1N7wChLQiPn4CSH/zJvXw56gmHw==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/fb55"
}
],
"license": "BSD-2-Clause"
},
"node_modules/domhandler": {
"version": "5.0.3",
"resolved": "https://registry.npmjs.org/domhandler/-/domhandler-5.0.3.tgz",
"integrity": "sha512-cgwlv/1iFQiFnU96XXgROh8xTeetsnJiDsTc7TYCLFd9+/WNkIqPTxiM/8pSd8VIrhXGTf1Ny1q1hquVqDJB5w==",
"license": "BSD-2-Clause",
"dependencies": {
"domelementtype": "^2.3.0"
},
"engines": {
"node": ">= 4"
},
"funding": {
"url": "https://github.com/fb55/domhandler?sponsor=1"
}
},
"node_modules/domutils": {
"version": "3.2.2",
"resolved": "https://registry.npmjs.org/domutils/-/domutils-3.2.2.tgz",
"integrity": "sha512-6kZKyUajlDuqlHKVX1w7gyslj9MPIXzIFiz/rGu35uC1wMi+kMhQwGhl4lt9unC9Vb9INnY9Z3/ZA3+FhASLaw==",
"license": "BSD-2-Clause",
"dependencies": {
"dom-serializer": "^2.0.0",
"domelementtype": "^2.3.0",
"domhandler": "^5.0.3"
},
"funding": {
"url": "https://github.com/fb55/domutils?sponsor=1"
}
},
"node_modules/dotenv": {
"version": "16.6.1",
@@ -4593,7 +4816,6 @@
"integrity": "sha512-LEyamqS7W5HB3ujJyvi0HQK/dtVINZvd5mAAp9eT5S/ujByGjiZLCzPcHVzuXbpJDJF/cxwHlfceVUDZ2lnSTw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"@eslint-community/eslint-utils": "^4.8.0",
"@eslint-community/regexpp": "^4.12.1",
@@ -4821,7 +5043,6 @@
"resolved": "https://registry.npmjs.org/express/-/express-4.22.1.tgz",
"integrity": "sha512-F2X8g9P1X7uCPZMA3MVf9wcTqlyNp7IhH5qPCI0izhaOIYXaW9L535tGA3qmjRzpH+bZczqq7hVKxTR4NWnu+g==",
"license": "MIT",
"peer": true,
"dependencies": {
"accepts": "~1.3.8",
"array-flatten": "1.1.1",
@@ -5001,6 +5222,13 @@
"dev": true,
"license": "MIT"
},
"node_modules/fast-safe-stringify": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/fast-safe-stringify/-/fast-safe-stringify-2.1.1.tgz",
"integrity": "sha512-W+KJc2dmILlPplD/H4K9l9LcAHAfPtP6BY84uVLXQ6Evcz9Lcg33Y2z1IVblT6xdY54PXYVHEv+0Wpq8Io6zkA==",
"dev": true,
"license": "MIT"
},
"node_modules/fdir": {
"version": "6.5.0",
"resolved": "https://registry.npmjs.org/fdir/-/fdir-6.5.0.tgz",
@@ -5154,6 +5382,23 @@
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/form-data": {
"version": "4.0.5",
"resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.5.tgz",
"integrity": "sha512-8RipRLol37bNs2bhoV67fiTEvdTrbMUYcFTiy3+wuuOnUog2QBHCZWXDRijWQfAkhBj2Uf5UnVaiWwA5vdd82w==",
"dev": true,
"license": "MIT",
"dependencies": {
"asynckit": "^0.4.0",
"combined-stream": "^1.0.8",
"es-set-tostringtag": "^2.1.0",
"hasown": "^2.0.2",
"mime-types": "^2.1.12"
},
"engines": {
"node": ">= 6"
}
},
"node_modules/formdata-polyfill": {
"version": "4.0.10",
"resolved": "https://registry.npmjs.org/formdata-polyfill/-/formdata-polyfill-4.0.10.tgz",
@@ -5166,6 +5411,24 @@
"node": ">=12.20.0"
}
},
"node_modules/formidable": {
"version": "3.5.4",
"resolved": "https://registry.npmjs.org/formidable/-/formidable-3.5.4.tgz",
"integrity": "sha512-YikH+7CUTOtP44ZTnUhR7Ic2UASBPOqmaRkRKxRbywPTe5VxF7RRCck4af9wutiZ/QKM5nME9Bie2fFaPz5Gug==",
"dev": true,
"license": "MIT",
"dependencies": {
"@paralleldrive/cuid2": "^2.2.2",
"dezalgo": "^1.0.4",
"once": "^1.4.0"
},
"engines": {
"node": ">=14.0.0"
},
"funding": {
"url": "https://ko-fi.com/tunnckoCore/commissions"
}
},
"node_modules/forwarded": {
"version": "0.2.0",
"resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.2.0.tgz",
@@ -5676,6 +5939,15 @@
"node": ">= 0.4"
}
},
"node_modules/he": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/he/-/he-1.2.0.tgz",
"integrity": "sha512-F/1DnUGPopORZi0ni+CvrCgHQ5FyEAHRLSApuYWMmrbSwoN2Mn/7k+Gl38gJnR7yyDZk6WLXwiGod1JOWNDKGw==",
"license": "MIT",
"bin": {
"he": "bin/he"
}
},
"node_modules/hermes-estree": {
"version": "0.25.1",
"resolved": "https://registry.npmjs.org/hermes-estree/-/hermes-estree-0.25.1.tgz",
@@ -6877,6 +7149,7 @@
"integrity": "sha512-h5bgJWpxJNswbU7qCrV0tIKQCaS3blPDrqKWx+QxzuzL1zGUzij9XCWLrSLsJPu5t+eWA/ycetzYAO5IOMcWAQ==",
"dev": true,
"license": "MIT",
"peer": true,
"bin": {
"lz-string": "bin/bin.js"
}
@@ -7209,6 +7482,16 @@
"url": "https://opencollective.com/node-fetch"
}
},
"node_modules/node-html-parser": {
"version": "7.1.0",
"resolved": "https://registry.npmjs.org/node-html-parser/-/node-html-parser-7.1.0.tgz",
"integrity": "sha512-iJo8b2uYGT40Y8BTyy5ufL6IVbN8rbm/1QK2xffXU/1a/v3AAa0d1YAoqBNYqaS4R/HajkWIpIfdE6KcyFh1AQ==",
"license": "MIT",
"dependencies": {
"css-select": "^5.1.0",
"he": "1.2.0"
}
},
"node_modules/node-releases": {
"version": "2.0.27",
"resolved": "https://registry.npmjs.org/node-releases/-/node-releases-2.0.27.tgz",
@@ -7228,6 +7511,18 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/nth-check": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/nth-check/-/nth-check-2.1.1.tgz",
"integrity": "sha512-lqjrjmaOoAnWfMmBPL+XNnynZh2+swxiX3WUE0s4yEHI6m+AwrK2UZOimIRl3X/4QctVqS8AiZjFqyOGrMXb/w==",
"license": "BSD-2-Clause",
"dependencies": {
"boolbase": "^1.0.0"
},
"funding": {
"url": "https://github.com/fb55/nth-check?sponsor=1"
}
},
"node_modules/object-assign": {
"version": "4.1.1",
"resolved": "https://registry.npmjs.org/object-assign/-/object-assign-4.1.1.tgz",
@@ -7324,7 +7619,6 @@
"resolved": "https://registry.npmjs.org/oidc-client-ts/-/oidc-client-ts-3.4.1.tgz",
"integrity": "sha512-jNdst/U28Iasukx/L5MP6b274Vr7ftQs6qAhPBCvz6Wt5rPCA+Q/tUmCzfCHHWweWw5szeMy2Gfrm1rITwUKrw==",
"license": "Apache-2.0",
"peer": true,
"dependencies": {
"jwt-decode": "^4.0.0"
},
@@ -7559,7 +7853,6 @@
"resolved": "https://registry.npmjs.org/pg/-/pg-8.20.0.tgz",
"integrity": "sha512-ldhMxz2r8fl/6QkXnBD3CR9/xg694oT6DZQ2s6c/RI28OjtSOpxnPrUCGOBJ46RCUxcWdx3p6kw/xnDHjKvaRA==",
"license": "MIT",
"peer": true,
"dependencies": {
"pg-connection-string": "^2.12.0",
"pg-pool": "^3.13.0",
@@ -7657,7 +7950,6 @@
"integrity": "sha512-QP88BAKvMam/3NxH6vj2o21R6MjxZUAd6nlwAS/pnGvN9IVLocLHxGYIzFhg6fUQ+5th6P4dv4eW9jX3DSIj7A==",
"dev": true,
"license": "MIT",
"peer": true,
"engines": {
"node": ">=12"
},
@@ -7675,9 +7967,9 @@
}
},
"node_modules/postcss": {
"version": "8.5.6",
"resolved": "https://registry.npmjs.org/postcss/-/postcss-8.5.6.tgz",
"integrity": "sha512-3Ybi1tAuwAP9s0r1UQ2J4n5Y0G05bJkpUIO0/bI9MhwmD70S5aTWbXGBwxHrelT+XM1k6dM0pk+SwNkpTRN7Pg==",
"version": "8.5.12",
"resolved": "https://registry.npmjs.org/postcss/-/postcss-8.5.12.tgz",
"integrity": "sha512-W62t/Se6rA0Az3DfCL0AqJwXuKwBeYg6nOaIgzP+xZ7N5BFCI7DYi1qs6ygUYT6rvfi6t9k65UMLJC+PHZpDAA==",
"dev": true,
"funding": [
{
@@ -7694,7 +7986,6 @@
}
],
"license": "MIT",
"peer": true,
"dependencies": {
"nanoid": "^3.3.11",
"picocolors": "^1.1.1",
@@ -7793,6 +8084,7 @@
"integrity": "sha512-Qb1gy5OrP5+zDf2Bvnzdl3jsTf1qXVMazbvCoKhtKqVs4/YK4ozX4gKQJJVyNe+cajNPn0KoC0MC3FUmaHWEmQ==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"ansi-regex": "^5.0.1",
"ansi-styles": "^5.0.0",
@@ -7808,6 +8100,7 @@
"integrity": "sha512-Cxwpt2SfTzTtXcfOlzGEee8O+c+MmUgGrNiBcXnuWxuFJHe6a5Hz7qwhwe5OgaSYI0IJvkLqWX1ASG+cJOkEiA==",
"dev": true,
"license": "MIT",
"peer": true,
"engines": {
"node": ">=10"
},
@@ -7820,7 +8113,8 @@
"resolved": "https://registry.npmjs.org/react-is/-/react-is-17.0.2.tgz",
"integrity": "sha512-w2GsyukL62IJnlaff/nRegPQR94C/XXamvMWmSHRJ4y7Ts/4ocGRmTHvOs8PSE6pB3dWOrD/nueuU5sduBsQ4w==",
"dev": true,
"license": "MIT"
"license": "MIT",
"peer": true
},
"node_modules/proxy-addr": {
"version": "2.0.7",
@@ -7935,7 +8229,6 @@
"resolved": "https://registry.npmjs.org/react/-/react-19.2.4.tgz",
"integrity": "sha512-9nfp2hYpCwOjAN+8TZFGhtWEwgvWHXqESH8qT89AT/lWklpLON22Lc8pEtnpsZz7VmawabSU0gCjnj8aC0euHQ==",
"license": "MIT",
"peer": true,
"engines": {
"node": ">=0.10.0"
}
@@ -7945,7 +8238,6 @@
"resolved": "https://registry.npmjs.org/react-dom/-/react-dom-19.2.4.tgz",
"integrity": "sha512-AXJdLo8kgMbimY95O2aKQqsz2iWi9jMgKJhRBAxECE4IFxfcazB2LmzloIoibJI3C12IlY20+KFaLv+71bUJeQ==",
"license": "MIT",
"peer": true,
"dependencies": {
"scheduler": "^0.27.0"
},
@@ -7978,7 +8270,6 @@
"resolved": "https://registry.npmjs.org/react-redux/-/react-redux-9.2.0.tgz",
"integrity": "sha512-ROY9fvHhwOD9ySfrF0wmvu//bKCQ6AeZZq1nJNtbDC+kk5DuSuNX/n6YWYF/SYy7bSba4D4FSz8DJeKY/S/r+g==",
"license": "MIT",
"peer": true,
"dependencies": {
"@types/use-sync-external-store": "^0.0.6",
"use-sync-external-store": "^1.4.0"
@@ -8120,8 +8411,7 @@
"version": "5.0.1",
"resolved": "https://registry.npmjs.org/redux/-/redux-5.0.1.tgz",
"integrity": "sha512-M9/ELqF6fy8FwmkpnF0S3YKOqMyoWJ4+CS5Efg2ct3oY9daQvd/Pc71FpGZsVsbl3Cpb+IIcjBDUnnyBdQbq4w==",
"license": "MIT",
"peer": true
"license": "MIT"
},
"node_modules/redux-thunk": {
"version": "3.1.0",
@@ -8892,6 +9182,65 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/superagent": {
"version": "10.3.0",
"resolved": "https://registry.npmjs.org/superagent/-/superagent-10.3.0.tgz",
"integrity": "sha512-B+4Ik7ROgVKrQsXTV0Jwp2u+PXYLSlqtDAhYnkkD+zn3yg8s/zjA2MeGayPoY/KICrbitwneDHrjSotxKL+0XQ==",
"dev": true,
"license": "MIT",
"dependencies": {
"component-emitter": "^1.3.1",
"cookiejar": "^2.1.4",
"debug": "^4.3.7",
"fast-safe-stringify": "^2.1.1",
"form-data": "^4.0.5",
"formidable": "^3.5.4",
"methods": "^1.1.2",
"mime": "2.6.0",
"qs": "^6.14.1"
},
"engines": {
"node": ">=14.18.0"
}
},
"node_modules/superagent/node_modules/mime": {
"version": "2.6.0",
"resolved": "https://registry.npmjs.org/mime/-/mime-2.6.0.tgz",
"integrity": "sha512-USPkMeET31rOMiarsBNIHZKLGgvKc/LrjofAnBlOttf5ajRvqiRA8QsenbcooctK6d6Ts6aqZXBA+XbkKthiQg==",
"dev": true,
"license": "MIT",
"bin": {
"mime": "cli.js"
},
"engines": {
"node": ">=4.0.0"
}
},
"node_modules/supertest": {
"version": "7.2.2",
"resolved": "https://registry.npmjs.org/supertest/-/supertest-7.2.2.tgz",
"integrity": "sha512-oK8WG9diS3DlhdUkcFn4tkNIiIbBx9lI2ClF8K+b2/m8Eyv47LSawxUzZQSNKUrVb2KsqeTDCcjAAVPYaSLVTA==",
"dev": true,
"license": "MIT",
"dependencies": {
"cookie-signature": "^1.2.2",
"methods": "^1.1.2",
"superagent": "^10.3.0"
},
"engines": {
"node": ">=14.18.0"
}
},
"node_modules/supertest/node_modules/cookie-signature": {
"version": "1.2.2",
"resolved": "https://registry.npmjs.org/cookie-signature/-/cookie-signature-1.2.2.tgz",
"integrity": "sha512-D76uU73ulSXrD1UXF4KE2TMxVVwhsnCgfAyTg9k8P6KGZjlXKrOLe4dJQKI3Bxi5wjesZoFXJWElNWBjPZMbhg==",
"dev": true,
"license": "MIT",
"engines": {
"node": ">=6.6.0"
}
},
"node_modules/supports-color": {
"version": "7.2.0",
"resolved": "https://registry.npmjs.org/supports-color/-/supports-color-7.2.0.tgz",
@@ -9108,7 +9457,6 @@
"integrity": "sha512-5C1sg4USs1lfG0GFb2RLXsdpXqBSEhAaA/0kPL01wxzpMqLILNxIxIOKiILz+cdg/pLnOUxFYOR5yhHU666wbw==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"esbuild": "~0.27.0",
"get-tsconfig": "^4.7.5"
@@ -9247,7 +9595,6 @@
"integrity": "sha512-jl1vZzPDinLr9eUt3J/t7V6FgNEw9QjvBPdysz9KfQDD41fQrC2Y4vKQdiaUpFT4bXlb1RHhLpp8wtm6M5TgSw==",
"dev": true,
"license": "Apache-2.0",
"peer": true,
"bin": {
"tsc": "bin/tsc",
"tsserver": "bin/tsserver"
@@ -9446,7 +9793,6 @@
"integrity": "sha512-Bby3NOsna2jsjfLVOHKes8sGwgl4TT0E6vvpYgnAYDIF/tie7MRaFthmKuHx1NSXjiTueXH3do80FMQgvEktRg==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"esbuild": "^0.27.0",
"fdir": "^6.5.0",
@@ -9522,7 +9868,6 @@
"integrity": "sha512-hOQuK7h0FGKgBAas7v0mSAsnvrIgAvWmRFjmzpJ7SwFHH3g1k2u37JtYwOwmEKhK6ZO3v9ggDBBm0La1LCK4uQ==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"@vitest/expect": "4.0.18",
"@vitest/mocker": "4.0.18",
@@ -9869,7 +10214,6 @@
"resolved": "https://registry.npmjs.org/zod/-/zod-4.3.6.tgz",
"integrity": "sha512-rftlrkhHZOcjDwkGlnUtZZkvaPHCsDATp4pGpuOOMDaTdDDXF91wuVDJoWoPsKX/3YPQ5fHuF3STjcYyKr+Qhg==",
"license": "MIT",
"peer": true,
"funding": {
"url": "https://github.com/sponsors/colinhacks"
}
Executable → Regular
+3
@@ -29,6 +29,7 @@
"googleapis": "^171.4.0",
"lucide-react": "^0.563.0",
"mysql2": "^3.11.5",
"node-html-parser": "^7.1.0",
"oidc-client-ts": "^3.0.1",
"pg": "^8.20.0",
"react": "^19.2.0",
@@ -53,6 +54,7 @@
"@types/react": "^19.2.5",
"@types/react-dom": "^19.2.3",
"@types/ssh2": "^1.15.5",
"@types/supertest": "^7.2.0",
"@vitejs/plugin-react": "^5.1.1",
"@vitest/ui": "^4.0.18",
"autoprefixer": "^10.4.24",
@@ -63,6 +65,7 @@
"globals": "^16.5.0",
"jsdom": "^28.0.0",
"postcss": "^8.5.6",
"supertest": "^7.2.2",
"tailwindcss": "^4.1.18",
"tsx": "^4.19.2",
"typescript": "~5.9.3",
+2 -1
@@ -5,8 +5,9 @@ export const oidcConfig = {
client_id: 'OKRSM2FZeSxJDhoV9e17dGRU1L1NEE1JBdnPVWTO',
redirect_uri: window.location.origin + '/callback',
post_logout_redirect_uri: window.location.origin,
scope: 'openid profile email',
scope: 'openid profile email offline_access',
userStore: new WebStorageStateStore({ store: window.localStorage }),
automaticSilentRenew: true,
onSigninCallback: () => {
window.history.replaceState({}, document.title, window.location.pathname);
},
+2
@@ -10,6 +10,7 @@ import {
Bot,
Brain,
ClipboardList,
Eye,
Zap,
ChevronLeft,
ChevronRight,
@@ -35,6 +36,7 @@ const NAV_ITEMS: NavItem[] = [
{ to: '/paperclip', label: 'Paperclip', icon: Bot },
{ to: '/ai', label: 'IA / Claude', icon: Brain },
{ to: '/operations', label: 'Operações', icon: ClipboardList },
{ to: '/sessions', label: 'Espelho', icon: Eye },
]
function useIsMobile() {
+87
@@ -0,0 +1,87 @@
import { useState } from 'react'
import type { SessionEvent } from '../../../api/types/session'
function truncate(s: string, n: number): string {
return s.length > n ? s.slice(0, n) + '…' : s
}
interface Props {
event: SessionEvent
defaultCollapsed: boolean
}
export function EventBlock({ event, defaultCollapsed }: Props) {
const [collapsed, setCollapsed] = useState(defaultCollapsed)
const base = 'rounded border px-3 py-2 my-1 text-sm'
switch (event.type) {
case 'user':
if (event.tool_result !== null && event.tool_result !== undefined) {
return (
<div id={`evt-${event.index}`} className={`${base} bg-slate-900/40 border-slate-700`}>
<button onClick={() => setCollapsed(!collapsed)} className="text-xs text-slate-500 uppercase">
tool_result {collapsed ? '▸' : '▾'}
</button>
{!collapsed && (
<pre className="mt-2 text-xs overflow-x-auto whitespace-pre-wrap text-slate-300">
{typeof event.tool_result === 'string' ? event.tool_result : JSON.stringify(event.tool_result, null, 2)}
</pre>
)}
</div>
)
}
return (
<div id={`evt-${event.index}`} className={`${base} bg-blue-500/10 border-blue-500/30`}>
<div className="text-xs text-blue-300 uppercase mb-1">user</div>
<div className="whitespace-pre-wrap text-slate-100">{event.text ?? '—'}</div>
</div>
)
case 'assistant':
if (event.tool_name) {
return (
<div id={`evt-${event.index}`} className={`${base} bg-amber-500/10 border-amber-500/30`}>
<button onClick={() => setCollapsed(!collapsed)} className="text-xs text-amber-300 uppercase">
tool_use: {event.tool_name} {collapsed ? '▸' : '▾'}
</button>
{!collapsed && event.tool_input && (
<pre className="mt-2 text-xs overflow-x-auto whitespace-pre-wrap text-slate-300">
{JSON.stringify(event.tool_input, null, 2)}
</pre>
)}
</div>
)
}
return (
<div id={`evt-${event.index}`} className={`${base} bg-slate-800/40 border-slate-700`}>
<div className="text-xs text-slate-500 uppercase mb-1">assistant</div>
<div className="whitespace-pre-wrap text-slate-200">{truncate(event.text ?? '—', collapsed ? 300 : Number.MAX_SAFE_INTEGER)}</div>
{(event.text?.length ?? 0) > 300 && (
<button onClick={() => setCollapsed(!collapsed)} className="text-xs text-slate-500 mt-1">
{collapsed ? 'Expandir' : 'Colapsar'}
</button>
)}
</div>
)
case 'system':
return (
<div id={`evt-${event.index}`} className={`${base} bg-slate-900/30 border-slate-800 text-xs text-slate-500`}>
<button onClick={() => setCollapsed(!collapsed)} className="uppercase">
system {event.skill_invoked ? `· skill: ${event.skill_invoked}` : ''} {event.hook_name ? `· hook: ${event.hook_name}` : ''} {collapsed ? '▸' : '▾'}
</button>
{!collapsed && <div className="mt-2 whitespace-pre-wrap">{event.text ?? JSON.stringify(event.raw)}</div>}
</div>
)
case 'attachment':
return (
<div id={`evt-${event.index}`} className={`${base} bg-purple-500/10 border-purple-500/30 text-xs text-purple-300`}>
📎 attachment
</div>
)
default:
return (
<div id={`evt-${event.index}`} className={`${base} bg-slate-900/20 border-slate-800 text-xs text-slate-500`}>
{event.type}
</div>
)
}
}
+102
@@ -0,0 +1,102 @@
import { useEffect, useState } from 'react'
export interface Filters {
days: number
project: string
tool: string
skill: string
q: string
}
interface Props {
initial: Filters
projects: string[]
tools: string[]
skills: string[]
onChange: (f: Filters) => void
}
export function FilterBar({ initial, projects, tools, skills, onChange }: Props) {
const [f, setF] = useState<Filters>(initial)
const [qLocal, setQLocal] = useState<string>(initial.q)
useEffect(() => {
// 300 ms debounce; f/onChange included in deps so a pending q update never overwrites fresher filter state
const t = setTimeout(() => {
if (qLocal !== f.q) {
const next = { ...f, q: qLocal }
setF(next)
onChange(next)
}
}, 300)
return () => clearTimeout(t)
}, [qLocal, f, onChange])
function update(partial: Partial<Filters>) {
const next = { ...f, ...partial }
setF(next)
onChange(next)
}
return (
<div className="flex flex-wrap gap-3 p-4 bg-white/5 rounded-lg backdrop-blur border border-white/10">
<select
value={f.days}
onChange={(e) => update({ days: Number(e.target.value) })}
className="bg-slate-900 border border-white/10 rounded px-3 py-2 text-sm"
>
<option value={1}>24h</option>
<option value={7}>7 dias</option>
<option value={30}>30 dias</option>
<option value={90}>90 dias</option>
<option value={3650}>Tudo</option>
</select>
<select
value={f.project}
onChange={(e) => update({ project: e.target.value })}
className="bg-slate-900 border border-white/10 rounded px-3 py-2 text-sm"
>
<option value="">Todos os projectos</option>
{projects.map((p) => (
<option key={p} value={p}>
{p}
</option>
))}
</select>
<select
value={f.tool}
onChange={(e) => update({ tool: e.target.value })}
className="bg-slate-900 border border-white/10 rounded px-3 py-2 text-sm"
>
<option value="">Qualquer tool</option>
{tools.map((t) => (
<option key={t} value={t}>
{t}
</option>
))}
</select>
<select
value={f.skill}
onChange={(e) => update({ skill: e.target.value })}
className="bg-slate-900 border border-white/10 rounded px-3 py-2 text-sm"
>
<option value="">Qualquer skill</option>
{skills.map((s) => (
<option key={s} value={s}>
{s}
</option>
))}
</select>
<input
type="search"
placeholder="Pesquisar no prompt inicial…"
value={qLocal}
onChange={(e) => setQLocal(e.target.value)}
className="flex-1 min-w-[200px] bg-slate-900 border border-white/10 rounded px-3 py-2 text-sm"
/>
</div>
)
}
@@ -0,0 +1,57 @@
import { useNavigate } from 'react-router-dom'
import type { SessionMeta } from '../../../api/types/session'
function formatDuration(sec: number | null): string {
if (sec == null) return '—'
if (sec < 60) return `${sec}s`
if (sec < 3600) return `${Math.floor(sec / 60)}min`
return `${Math.floor(sec / 3600)}h${Math.floor((sec % 3600) / 60)}m`
}
function outcomeIcon(o: SessionMeta['outcome']): string {
switch (o) {
case 'completed':
return '✓'
case 'error':
return '✗'
case 'interrupted':
return '⚠'
default:
return '?'
}
}
interface Props {
session: SessionMeta
}
export function SessionRow({ session }: Props) {
const navigate = useNavigate()
const when = new Date(session.started_at).toLocaleString('pt-PT', { dateStyle: 'short', timeStyle: 'short' })
return (
<tr
className="border-b border-white/5 hover:bg-white/5 cursor-pointer"
onClick={() => navigate(`/sessions/${session.session_id}`)}
>
<td className="px-3 py-2 text-sm text-slate-300">{when}</td>
<td className="px-3 py-2 text-sm text-slate-400">{session.project_slug}</td>
<td className="px-3 py-2 text-sm text-slate-200">
{session.first_prompt?.slice(0, 80) ?? '—'}
{(session.first_prompt?.length ?? 0) > 80 ? '…' : ''}
</td>
<td className="px-3 py-2 text-sm text-slate-400">{formatDuration(session.duration_sec)}</td>
<td className="px-3 py-2 text-sm text-right text-slate-400">{session.event_count}</td>
<td className="px-3 py-2 text-sm text-right text-slate-400">{session.tool_calls}</td>
<td className="px-3 py-2 text-xs">
<div className="flex flex-wrap gap-1">
{session.skills_invoked.slice(0, 2).map((s) => (
<span key={s} className="px-2 py-0.5 bg-indigo-500/20 text-indigo-300 rounded">
{s}
</span>
))}
</div>
</td>
<td className="px-3 py-2 text-center">{outcomeIcon(session.outcome)}</td>
</tr>
)
}
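The compact duration label in SessionRow can be exercised in isolation. A standalone sketch of the same mapping (assuming a zero-second duration should render as `0s`, with `—` reserved for missing values):

```typescript
// Standalone sketch of SessionRow's duration label.
// Assumption: only null/undefined renders as '—'; 0 is a valid duration.
function formatDuration(sec: number | null): string {
  if (sec == null) return '—'
  if (sec < 60) return `${sec}s`
  if (sec < 3600) return `${Math.floor(sec / 60)}min`
  return `${Math.floor(sec / 3600)}h${Math.floor((sec % 3600) / 60)}m`
}

console.log(formatDuration(45))   // → 45s
console.log(formatDuration(3725)) // → 1h2m
```

`Math.floor` rather than `Math.round` keeps edge cases like 3599 s from rendering as "60min".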
@@ -0,0 +1,36 @@
import type { SessionMeta, SessionEvent } from '../../../api/types/session'
export interface ListParams {
days?: number
project?: string
tool?: string
skill?: string
q?: string
limit?: number
offset?: number
}
export interface ListResponse {
total: number
items: SessionMeta[]
}
const API_BASE = import.meta.env.VITE_API_BASE ?? ''
function buildQuery(params: Record<string, unknown>): string {
const entries = Object.entries(params).filter(([, v]) => v !== undefined && v !== '' && v !== null)
if (entries.length === 0) return ''
return '?' + new URLSearchParams(entries.map(([k, v]) => [k, String(v)])).toString()
}
export async function listSessions(params: ListParams): Promise<ListResponse> {
const res = await fetch(`${API_BASE}/api/sessions${buildQuery(params as Record<string, unknown>)}`)
if (!res.ok) throw new Error(`listSessions failed: ${res.status}`)
return res.json()
}
export async function getSession(id: string): Promise<{ meta: SessionMeta; events: SessionEvent[] }> {
const res = await fetch(`${API_BASE}/api/sessions/${encodeURIComponent(id)}`)
if (!res.ok) throw new Error(`getSession failed: ${res.status}`)
return res.json()
}
@@ -11,6 +11,8 @@ import N8nMonitor from './pages/N8nMonitor.tsx'
import Paperclip from './pages/Paperclip.tsx'
import AiOverview from './pages/AiOverview.tsx'
import Operations from './pages/Operations.tsx'
import Sessions from './pages/Sessions.tsx'
import SessionDetail from './pages/SessionDetail.tsx'
import Layout from './components/Layout.tsx'
import { oidcConfig } from './auth/config.ts'
import { AuthWrapper } from './auth/AuthWrapper.tsx'
@@ -30,6 +32,8 @@ createRoot(document.getElementById('root')!).render(
<Route path="/paperclip" element={<Paperclip />} />
<Route path="/ai" element={<AiOverview />} />
<Route path="/operations" element={<Operations />} />
<Route path="/sessions" element={<Sessions />} />
<Route path="/sessions/:id" element={<SessionDetail />} />
<Route path="/callback" element={<App />} />
</Route>
</Routes>
@@ -0,0 +1,83 @@
import { useEffect, useState } from 'react'
import { useParams, Link } from 'react-router-dom'
import { getSession } from '../lib/api/sessions'
import { EventBlock } from '../components/sessions/EventBlock'
import type { SessionMeta, SessionEvent } from '../../api/types/session'
type FilterMode = 'all' | 'no-system' | 'tools-only' | 'prompts-only'
export default function SessionDetail() {
const { id } = useParams<{ id: string }>()
const [meta, setMeta] = useState<SessionMeta | null>(null)
const [events, setEvents] = useState<SessionEvent[]>([])
const [mode, setMode] = useState<FilterMode>('all')
const [loading, setLoading] = useState(true)
const [error, setError] = useState<string | null>(null)
useEffect(() => {
if (!id) return
setLoading(true)
getSession(id)
.then((r) => {
setMeta(r.meta)
setEvents(r.events)
})
.catch((e: Error) => setError(e.message))
.finally(() => setLoading(false))
}, [id])
const visible = events.filter((e) => {
if (mode === 'no-system') return e.type !== 'system'
if (mode === 'tools-only') return e.tool_name != null || e.tool_result != null
if (mode === 'prompts-only') return e.type === 'user' && e.tool_result == null
return true
})
if (loading) return <div className="p-6 text-slate-400">A carregar…</div>
if (error) return <div className="p-6 text-red-300">{error}</div>
if (!meta) return <div className="p-6 text-slate-400">Sessão não encontrada.</div>
return (
<div className="p-6 space-y-4">
<Link to="/sessions" className="text-sm text-slate-400 hover:underline">
Voltar à lista
</Link>
<header className="space-y-1">
<h1 className="text-xl font-semibold text-white">{meta.first_prompt?.slice(0, 120) ?? meta.session_id}</h1>
<div className="text-sm text-slate-400 flex flex-wrap gap-4">
<span>{new Date(meta.started_at).toLocaleString('pt-PT')}</span>
<span>Projecto: {meta.project_slug}</span>
<span>Duração: {meta.duration_sec ?? 0}s</span>
<span>Eventos: {meta.event_count}</span>
<span>Tool calls: {meta.tool_calls}</span>
<span>Skills: {meta.skills_invoked.join(', ') || '—'}</span>
<span>Resultado: {meta.outcome}</span>
</div>
<div className="text-xs text-slate-500">JSONL: {meta.jsonl_path}</div>
</header>
<div className="flex gap-2 text-sm">
{(['all', 'no-system', 'tools-only', 'prompts-only'] as FilterMode[]).map((m) => (
<button
key={m}
onClick={() => setMode(m)}
className={`px-3 py-1 rounded ${mode === m ? 'bg-indigo-500/30 text-indigo-200' : 'bg-white/5 text-slate-400'}`}
>
{m === 'all' ? 'Tudo' : m === 'no-system' ? 'Esconder system' : m === 'tools-only' ? 'Só tools' : 'Só prompts'}
</button>
))}
</div>
<div className="text-xs text-slate-500">
A mostrar {visible.length} de {events.length} eventos.
</div>
<div>
{visible.map((e) => (
<EventBlock key={e.index} event={e} defaultCollapsed={e.type === 'system' || (e.tool_result !== null && e.tool_result !== undefined)} />
))}
</div>
</div>
)
}
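The event-visibility rule in SessionDetail can be factored into a pure helper for illustration (hypothetical `filterEvents`; it uses loose `!= null` so `undefined` tool fields behave like `null`, matching the `tool_result !== undefined` check in the render path):

```typescript
type FilterMode = 'all' | 'no-system' | 'tools-only' | 'prompts-only'
interface Ev { type: string; tool_name?: string | null; tool_result?: unknown }

// Same predicate as the `visible` computation in SessionDetail above.
function filterEvents(mode: FilterMode, events: Ev[]): Ev[] {
  return events.filter((e) => {
    if (mode === 'no-system') return e.type !== 'system'
    if (mode === 'tools-only') return e.tool_name != null || e.tool_result != null
    if (mode === 'prompts-only') return e.type === 'user' && e.tool_result == null
    return true
  })
}

const sample: Ev[] = [
  { type: 'user' },
  { type: 'assistant', tool_name: 'Bash' },
  { type: 'system' },
]
console.log(filterEvents('tools-only', sample).length) // → 1
```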
@@ -0,0 +1,131 @@
import { useEffect, useMemo, useState } from 'react'
import { listSessions, type ListResponse } from '../lib/api/sessions'
import { SessionRow } from '../components/sessions/SessionRow'
import { FilterBar, type Filters } from '../components/sessions/FilterBar'
const PAGE_SIZE = 50
export default function Sessions() {
const [filters, setFilters] = useState<Filters>({ days: 7, project: '', tool: '', skill: '', q: '' })
const [data, setData] = useState<ListResponse | null>(null)
const [offset, setOffset] = useState(0)
const [loading, setLoading] = useState(false)
const [error, setError] = useState<string | null>(null)
useEffect(() => {
let cancelled = false
setLoading(true)
setError(null)
listSessions({
days: filters.days,
project: filters.project || undefined,
tool: filters.tool || undefined,
skill: filters.skill || undefined,
q: filters.q || undefined,
limit: PAGE_SIZE,
offset,
})
.then((r) => {
if (!cancelled) setData(r)
})
.catch((e: Error) => {
if (!cancelled) setError(e.message)
})
.finally(() => {
if (!cancelled) setLoading(false)
})
return () => {
cancelled = true
}
}, [filters, offset])
const { projects, tools, skills } = useMemo(() => {
const p = new Set<string>()
const t = new Set<string>()
const s = new Set<string>()
data?.items.forEach((it) => {
p.add(it.project_slug)
it.tools_used.forEach((x) => t.add(x))
it.skills_invoked.forEach((x) => s.add(x))
})
return { projects: [...p].sort(), tools: [...t].sort(), skills: [...s].sort() }
}, [data])
return (
<div className="p-6 space-y-4">
<header>
<h1 className="text-2xl font-semibold text-white">Espelho Sessões Claude</h1>
<p className="text-sm text-slate-400">Replay de sessões para observar comportamento real.</p>
</header>
<FilterBar
initial={filters}
projects={projects}
tools={tools}
skills={skills}
onChange={(f) => {
setFilters(f)
setOffset(0)
}}
/>
{error && <div className="p-3 bg-red-500/10 border border-red-500/30 rounded text-red-300 text-sm">{error}</div>}
<div className="overflow-x-auto rounded-lg border border-white/10">
<table className="w-full">
<thead className="bg-white/5 text-xs uppercase text-slate-400">
<tr>
<th className="px-3 py-2 text-left">Início</th>
<th className="px-3 py-2 text-left">Projecto</th>
<th className="px-3 py-2 text-left">Prompt</th>
<th className="px-3 py-2 text-left">Duração</th>
<th className="px-3 py-2 text-right">Eventos</th>
<th className="px-3 py-2 text-right">Tools</th>
<th className="px-3 py-2 text-left">Skills</th>
<th className="px-3 py-2 text-center">OK</th>
</tr>
</thead>
<tbody>
{loading && (
<tr>
<td colSpan={8} className="px-3 py-8 text-center text-slate-500">
A carregar…
</td>
</tr>
)}
{!loading && data?.items.length === 0 && (
<tr>
<td colSpan={8} className="px-3 py-8 text-center text-slate-500">
Sem sessões para estes filtros.
</td>
</tr>
)}
{data?.items.map((s) => <SessionRow key={s.session_id} session={s} />)}
</tbody>
</table>
</div>
<div className="flex items-center justify-between text-sm text-slate-400">
<span>
{data ? `${offset + 1}–${Math.min(offset + PAGE_SIZE, data.total)} de ${data.total}` : ''}
</span>
<div className="flex gap-2">
<button
disabled={offset === 0}
onClick={() => setOffset(Math.max(0, offset - PAGE_SIZE))}
className="px-3 py-1 bg-white/5 rounded disabled:opacity-30"
>
Anterior
</button>
<button
disabled={!data || offset + PAGE_SIZE >= data.total}
onClick={() => setOffset(offset + PAGE_SIZE)}
className="px-3 py-1 bg-white/5 rounded disabled:opacity-30"
>
Seguinte
</button>
</div>
</div>
</div>
)
}
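The pagination footer renders an inclusive 1-based range. A standalone sketch of that arithmetic (hypothetical `pageLabel` helper; the zero-total branch is an assumption, since the page shows an empty string when no data has loaded):

```typescript
// Inclusive 1-based range label, as rendered in the Sessions footer above.
function pageLabel(offset: number, pageSize: number, total: number): string {
  if (total === 0) return '0 de 0' // assumed fallback for an empty result set
  return `${offset + 1}–${Math.min(offset + pageSize, total)} de ${total}`
}

console.log(pageLabel(50, 50, 137)) // → 51–100 de 137
```

The `Math.min` clamp keeps the last page from overshooting the total.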
@@ -0,0 +1,20 @@
[Unit]
Description=Observabilidade (Espelho) — indexer incremental de sessões Claude
After=default.target
[Service]
Type=simple
WorkingDirectory=/media/ealmeida/Dados/Dev/DashDescomplicar
ExecStart=/home/ealmeida/.nvm/versions/node/v22.22.2/bin/npx tsx api/scripts/sessions-indexer.ts --watch
Restart=on-failure
RestartSec=5
KillMode=mixed
KillSignal=SIGTERM
TimeoutStopSec=10s
StandardOutput=append:/home/ealmeida/.claude-work/observabilidade-indexer.log
StandardError=append:/home/ealmeida/.claude-work/observabilidade-indexer.log
Environment="OBSERVABILIDADE_DB=/home/ealmeida/.claude-work/sessions.db"
Environment="PATH=/home/ealmeida/.nvm/versions/node/v22.22.2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
[Install]
WantedBy=default.target
@@ -0,0 +1,2 @@
MCP_GATEWAY_TOKEN=coloca-token-aqui
MCP_GATEWAY_URL=https://gateway.descomplicar.pt/v1/desk-crm/mcp
@@ -0,0 +1,13 @@
[Unit]
Description=Observabilidade — detector semanal de padrões
After=default.target
[Service]
Type=oneshot
WorkingDirectory=/media/ealmeida/Dados/Dev/DashDescomplicar
ExecStart=/home/ealmeida/.nvm/versions/node/v22.22.2/bin/npx tsx api/scripts/sessions-patterns.ts --publish
Environment="OBSERVABILIDADE_DB=/home/ealmeida/.claude-work/sessions.db"
Environment="PATH=/home/ealmeida/.nvm/versions/node/v22.22.2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
EnvironmentFile=/home/ealmeida/.claude-work/observabilidade-patterns.env
StandardOutput=append:/home/ealmeida/.claude-work/observabilidade-patterns.log
StandardError=append:/home/ealmeida/.claude-work/observabilidade-patterns.log
@@ -0,0 +1,9 @@
[Unit]
Description=Observabilidade — detector semanal
[Timer]
OnCalendar=Sun 23:00
Persistent=true
[Install]
WantedBy=timers.target
@@ -0,0 +1,13 @@
[Unit]
Description=Observabilidade — import diário de worklogs Desk (#31/#32/#33)
After=default.target
[Service]
Type=oneshot
WorkingDirectory=/media/ealmeida/Dados/Dev/DashDescomplicar
ExecStart=/home/ealmeida/.nvm/versions/node/v22.22.2/bin/npx tsx api/scripts/sessions-worklog-import.ts --discussion all --since-days 7
Environment="OBSERVABILIDADE_DB=/home/ealmeida/.claude-work/sessions.db"
Environment="PATH=/home/ealmeida/.nvm/versions/node/v22.22.2/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
EnvironmentFile=/home/ealmeida/.claude-work/observabilidade-patterns.env
StandardOutput=append:/home/ealmeida/.claude-work/observabilidade-worklog-import.log
StandardError=append:/home/ealmeida/.claude-work/observabilidade-worklog-import.log
@@ -0,0 +1,10 @@
[Unit]
Description=Observabilidade — import diário de worklogs Desk
[Timer]
OnCalendar=*-*-* 03:00:00
Persistent=true
[Install]
WantedBy=timers.target