diff --git a/docs/next-session.md b/docs/next-session.md index c0542fd..6d6b30c 100644 --- a/docs/next-session.md +++ b/docs/next-session.md @@ -1,6 +1,43 @@ # Next Session — Outbound Call UI + Remaining Work -## Priority 0: Kookoo Dial to SIP Extension +## Priority 0: Outbound Call — CloudAgent WebSocket Integration + +**CRITICAL FINDING:** The Ozonetel toolbar's outbound dial works via TWO connections: +1. CloudAgent WebSocket (mdlConnection.php) — sends `tbManualDial` with `browserSessionId` + `usId` +2. SIP WebSocket (blr-pub-rtc4.ozonetel.com:444) — receives the SIP INVITE and auto-answers + +The `browserSessionId` and `usId` come from the CloudAgent WebSocket session handshake. Without them, CloudAgent doesn't know which browser to route the SIP INVITE to. + +**The toolbar's tbManualDial payload:** +```json +{ + "type": "tbManualDial", + "ns": "ozonetel.cloudagent", + "customer": "global_healthx", + "agentId": "global", + "agentUniqId": 374804, + "browserSessionId": "e15cd447-...", // FROM WEBSOCKET SESSION + "usId": "af7hkcT3BcwCG-g=", // FROM WEBSOCKET SESSION + "isSip": "true", + "mode": "manual", + "params": "312792,9949879837,523591,SIP:true", // campaignId,phone,sipExt,SIP:true + "utid": 57 +} +``` + +**What we need to do:** +1. Connect to CloudAgent WebSocket from our browser (same mdlConnection.php endpoint) +2. Establish session → get `usId` and `browserSessionId` +3. Include these in `tbManualDial` requests +4. CloudAgent will then send SIP INVITE to our JsSIP + +**The toolbar's SIP service code** is at: user pasted it in conversation. Key function: `handleSipAutoAnswer()` which auto-answers based on agent's `autoAnswer` setting (0=none, 1=all, 2=inbound, 3=outbound). + +**SIP config from toolbar:** password = extension number (523590), registrar = `sip:blr-pub-rtc4.ozonetel.com`, session_timers = false. Same as what we have. + +**Kookoo approach is abandoned** — `` only works with PSTN numbers, not SIP extensions. 
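The handshake-then-dial flow above can be sketched as a payload builder. This is a sketch under assumptions: the field values mirror the captured toolbar payload, and the session type, function name, and framing are ours, not verified against the toolbar code.

```typescript
// Sketch of step 3: build tbManualDial once the CloudAgent WebSocket
// handshake has yielded usId and browserSessionId. Static fields copy the
// captured toolbar payload; everything else here is an assumption.
type CloudAgentSession = { usId: string; browserSessionId: string };

function buildManualDial(session: CloudAgentSession, phone: string): Record<string, unknown> {
  return {
    type: 'tbManualDial',
    ns: 'ozonetel.cloudagent',
    customer: 'global_healthx',
    agentId: 'global',
    agentUniqId: 374804,
    browserSessionId: session.browserSessionId, // from WebSocket session handshake
    usId: session.usId,                         // from WebSocket session handshake
    isSip: 'true',
    mode: 'manual',
    // params format: campaignId,phone,sipExt,SIP:true (as captured)
    params: `312792,${phone},523591,SIP:true`,
    utid: 57,
  };
}
```

Once the session values are threaded through, the browser sends `JSON.stringify(buildManualDial(session, phone))` over the same CloudAgent WebSocket connection.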
+ +## Priority 0 (OLD): Kookoo Dial to SIP Extension **Status:** Kookoo IVR endpoint works. When customer answers, Kookoo hits /kookoo/ivr, we respond with `523590`. But Kookoo tries to call 523590 as a PSTN number — status=not_answered. diff --git a/docs/superpowers/plans/2026-03-20-worklist-ux-redesign.md b/docs/superpowers/plans/2026-03-20-worklist-ux-redesign.md new file mode 100644 index 0000000..5bb02de --- /dev/null +++ b/docs/superpowers/plans/2026-03-20-worklist-ux-redesign.md @@ -0,0 +1,435 @@ +# Worklist UX Redesign — Implementation Plan + +> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking. + +**Goal:** Redesign the call desk worklist table for faster agent action — clickable phone numbers, last interaction context, campaign tags, context menus for SMS/WhatsApp, and meaningful SLA indicators. + +**Architecture:** All changes are frontend-only. The data model already has everything needed (`lastContacted`, `contactAttempts`, `source`, `utmCampaign`, `interestedService`, `disposition` on calls). We enrich the worklist rows with this data and redesign the table columns. + +**Tech Stack:** React 19, Untitled UI components, FontAwesome Pro Duotone icons, Jotai + +--- + +## Current problems + +1. Phone column is passive text — separate Call button in Actions column wastes space +2. No last interaction context — agent doesn't know what happened before +3. No campaign/source — agent can't personalize the opening +4. SLA shows time since creation, not time since last contact +5. Rows without phone numbers are dead weight +6. 
No way to SMS or WhatsApp from the worklist + +## Column redesign + +| Before | After | +|--------|-------| +| PRIORITY \| PATIENT \| PHONE \| TYPE \| SLA \| ACTIONS | PRIORITY \| PATIENT \| PHONE \| SOURCE \| SLA | + +- **PRIORITY** — badge, same as now +- **PATIENT** — name + sub-line: last interaction context ("Called 2h ago — Info Provided") or interested service +- **PHONE** — clickable number with phone icon. Hover shows context menu (Call / SMS / WhatsApp). On mobile, long-press shows the same menu. No separate Actions column. +- **SOURCE** — campaign/source tag (e.g., "Facebook", "Google", "Walk-in") +- **SLA** — time since `lastContacted` (not `createdAt`). Falls back to `createdAt` if never contacted. + +## File map + +| File | Responsibility | Action | +|------|---------------|--------| +| `src/components/call-desk/worklist-panel.tsx` | Worklist table + tabs | Modify: redesign columns, add phone context menu, enrich rows | +| `src/components/call-desk/phone-action-cell.tsx` | Clickable phone with context menu | Create: encapsulates call/SMS/WhatsApp actions | +| `src/hooks/use-worklist.ts` | Worklist data fetching | Modify: pass through `lastContacted`, `source`, `utmCampaign` fields | + +--- + +## Task 1: Enrich worklist data with last interaction and source + +Pass through the additional fields that already exist in the Lead data but aren't currently used in the worklist row. 
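The SMS and WhatsApp actions planned for the PHONE column reduce to deep links. A minimal sketch, assuming Indian 10-digit numbers; the helper names are ours, and the link formats follow this plan's notes (wa.me takes the country code without `+`, `sms:` takes the `+` prefix):

```typescript
// Hypothetical link builders for the PHONE column context menu.
// Keeps only the last 10 digits, then prepends the 91 country code.
const digitsOnly = (phone: string): string => phone.replace(/\D/g, '').slice(-10);

export const smsLink = (phone: string): string => `sms:+91${digitsOnly(phone)}`;

export const whatsAppLink = (phone: string): string => `https://wa.me/91${digitsOnly(phone)}`;
```

Opening them is then `window.open(whatsAppLink(n), '_blank')` for WhatsApp and `window.location.href = smsLink(n)` for SMS.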
+ +**Files:** +- Modify: `helix-engage/src/components/call-desk/worklist-panel.tsx` + +- [ ] **Step 1: Extend WorklistLead type in worklist-panel** + +Add fields that are already returned by the hook but not typed: + +```typescript +type WorklistLead = { + id: string; + createdAt: string; + contactName: { firstName: string; lastName: string } | null; + contactPhone: { number: string; callingCode: string }[] | null; + leadSource: string | null; + leadStatus: string | null; + interestedService: string | null; + aiSummary: string | null; + aiSuggestedAction: string | null; + // New fields (already in API response) + lastContacted: string | null; + contactAttempts: number | null; + utmCampaign: string | null; + campaignId: string | null; +}; +``` + +- [ ] **Step 2: Extend WorklistRow with new fields** + +```typescript +type WorklistRow = { + // ... existing fields ... + lastContactedAt: string | null; + contactAttempts: number; + source: string | null; // leadSource or utmCampaign + lastDisposition: string | null; +}; +``` + +- [ ] **Step 3: Populate new fields in buildRows** + +For leads: +```typescript +rows.push({ + // ... existing ... + lastContactedAt: lead.lastContacted ?? null, + contactAttempts: lead.contactAttempts ?? 0, + source: lead.leadSource ?? lead.utmCampaign ?? null, + lastDisposition: null, +}); +``` + +For missed calls: +```typescript +rows.push({ + // ... existing ... + lastContactedAt: call.startedAt ?? call.createdAt, + contactAttempts: 0, + source: null, + lastDisposition: call.disposition ?? null, +}); +``` + +For follow-ups: +```typescript +rows.push({ + // ... existing ... + lastContactedAt: fu.scheduledAt ?? fu.createdAt ?? null, + contactAttempts: 0, + source: null, + lastDisposition: null, +}); +``` + +- [ ] **Step 4: Update MissedCall type to include disposition** + +The hook already returns `disposition` but the worklist panel type doesn't have it: + +```typescript +type MissedCall = { + // ... existing ... 
+ disposition: string | null; +}; +``` + +- [ ] **Step 5: Commit** + +``` +feat: enrich worklist rows with last interaction and source data +``` + +--- + +## Task 2: Create PhoneActionCell component + +A reusable cell that shows the phone number as a clickable element with a context menu for Call, SMS, and WhatsApp. + +**Files:** +- Create: `helix-engage/src/components/call-desk/phone-action-cell.tsx` + +- [ ] **Step 1: Create the component** + +```typescript +import { useState, useRef } from 'react'; +import { FontAwesomeIcon } from '@fortawesome/react-fontawesome'; +import { faPhone, faComment, faEllipsisVertical } from '@fortawesome/pro-duotone-svg-icons'; +import type { FC, HTMLAttributes } from 'react'; +import { useSip } from '@/providers/sip-provider'; +import { useSetAtom } from 'jotai'; +import { sipCallStateAtom, sipCallerNumberAtom, sipCallUcidAtom } from '@/state/sip-state'; +import { setOutboundPending } from '@/state/sip-manager'; +import { apiClient } from '@/lib/api-client'; +import { notify } from '@/lib/toast'; +import { cx } from '@/utils/cx'; + +type PhoneActionCellProps = { + phoneNumber: string; + displayNumber: string; + leadId?: string; +}; +``` + +The component renders: +- The formatted phone number as clickable text (triggers call on click) +- A small kebab menu icon (⋮) on hover that opens a popover with: + - 📞 Call + - 💬 SMS (opens `sms:` link) + - 📱 WhatsApp (opens `https://wa.me/{number}`) +- On mobile: long-press on the phone number opens the same menu + +Implementation: +- Use a simple `useState` for menu open/close +- Position the menu absolutely below the phone number +- Click outside closes it +- The Call action uses the same logic as ClickToCallButton (setCallState, setCallerNumber, setOutboundPending, apiClient.post dial) +- SMS opens `sms:+91${phoneNumber}` +- WhatsApp opens `https://wa.me/91${phoneNumber}` in a new tab + +- [ ] **Step 2: Handle long-press for mobile** + +Add `onContextMenu` (prevents default) and 
`onTouchStart`/`onTouchEnd` for 500ms long-press detection:

```typescript
const touchTimer = useRef<number | null>(null);

const onTouchStart = () => {
  touchTimer.current = window.setTimeout(() => {
    setMenuOpen(true);
  }, 500);
};

const onTouchEnd = () => {
  if (touchTimer.current) {
    clearTimeout(touchTimer.current);
    touchTimer.current = null;
  }
};
```

- [ ] **Step 3: Commit**

```
feat: create PhoneActionCell with call/SMS/WhatsApp context menu
```

---

## Task 3: Redesign the worklist table columns

Replace the current 6-column layout with the new 5-column layout.

**Files:**
- Modify: `helix-engage/src/components/call-desk/worklist-panel.tsx`

- [ ] **Step 1: Import PhoneActionCell**

```typescript
import { PhoneActionCell } from './phone-action-cell';
```

- [ ] **Step 2: Replace table headers**

```typescript
{/* five columns per the redesign table above; cell markup is illustrative */}
<tr>
  <th>PRIORITY</th>
  <th>PATIENT</th>
  <th>PHONE</th>
  <th>SOURCE</th>
  <th>SLA</th>
</tr>
```

- [ ] **Step 3: Redesign PATIENT cell with sub-line**

```typescript
<td>
  <div className="flex items-center gap-2">
    {/* direction icon choice is illustrative */}
    {row.direction === 'inbound' && (
      <FontAwesomeIcon icon={faPhoneArrowDownLeft} />
    )}
    {row.direction === 'outbound' && (
      <FontAwesomeIcon icon={faPhoneArrowUpRight} />
    )}
    <div className="flex flex-col">
      <span className="font-medium">{row.name}</span>
      <span className="text-xs text-tertiary">
        {row.lastContactedAt
          ? `${formatTimeAgo(row.lastContactedAt)}${row.lastDisposition ? ` — ${formatDisposition(row.lastDisposition)}` : ''}`
          : row.reason || row.typeLabel}
      </span>
    </div>
  </div>
</td>
```

- [ ] **Step 4: Replace PHONE cell with PhoneActionCell**

```typescript
// props per the PhoneActionCell type; row field names beyond phoneRaw are illustrative
<td>
  {row.phoneRaw ? (
    <PhoneActionCell phoneNumber={row.phoneRaw} displayNumber={row.phone} leadId={row.leadId} />
  ) : (
    <span className="italic">No phone</span>
  )}
</td>
```

- [ ] **Step 5: Add SOURCE cell**

```typescript
// badge styling and the empty-state dash are illustrative
<td>
  {row.source ? (
    <span className="badge">{formatSource(row.source)}</span>
  ) : (
    <span>—</span>
  )}
</td>
```

- [ ] **Step 6: Update SLA to use lastContacted**

Change `computeSla` to accept a `lastContactedAt` fallback:

```typescript
const sla = computeSla(row.lastContactedAt ?? row.createdAt);
```

- [ ] **Step 7: Remove ACTIONS column and TYPE column**

The TYPE info moves to the tab filter (already there) and the badge on the patient sub-line. The ACTIONS column is replaced by the clickable phone.

- [ ] **Step 8: Add helper functions**

```typescript
const formatTimeAgo = (dateStr: string): string => {
  const minutes = Math.round((Date.now() - new Date(dateStr).getTime()) / 60000);
  if (minutes < 1) return 'Just now';
  if (minutes < 60) return `${minutes}m ago`;
  const hours = Math.floor(minutes / 60);
  if (hours < 24) return `${hours}h ago`;
  return `${Math.floor(hours / 24)}d ago`;
};

const formatDisposition = (disposition: string): string => {
  return disposition.replace(/_/g, ' ').replace(/\b\w/g, c => c.toUpperCase());
};

const formatSource = (source: string): string => {
  const map: Record<string, string> = {
    FACEBOOK_AD: 'Facebook',
    GOOGLE_AD: 'Google',
    WALK_IN: 'Walk-in',
    REFERRAL: 'Referral',
    WEBSITE: 'Website',
    PHONE_INQUIRY: 'Phone',
  };
  return map[source] ?? source.replace(/_/g, ' ');
};
```

- [ ] **Step 9: Remove ClickToCallButton import**

No longer needed in the worklist panel — PhoneActionCell handles it.

- [ ] **Step 10: Commit**

```
feat: redesign worklist table with clickable phones and interaction context
```
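`computeSla` itself isn't shown in this plan. A sketch consistent with the thresholds used in Task 5's verification (green under 15m, amber 15 to 30m, red over 30m); the return shape and function body here are assumptions:

```typescript
// Hypothetical computeSla: colour-codes time since last contact.
// Thresholds follow the plan's test step: green < 15m, amber 15-30m, red > 30m.
type Sla = { minutes: number; tone: 'green' | 'amber' | 'red' };

const computeSla = (sinceIso: string, now: number = Date.now()): Sla => {
  const minutes = Math.round((now - new Date(sinceIso).getTime()) / 60000);
  if (minutes < 15) return { minutes, tone: 'green' };
  if (minutes <= 30) return { minutes, tone: 'amber' };
  return { minutes, tone: 'red' };
};
```

With Step 6 applied, the call site stays `computeSla(row.lastContactedAt ?? row.createdAt)`, so never-contacted rows fall back to age since creation.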
+ +**Files:** +- Modify: `helix-engage/src/components/call-desk/worklist-panel.tsx` + +- [ ] **Step 1: Track previous counts to detect new items** + +```typescript +const [prevMissedCount, setPrevMissedCount] = useState(missedCount); + +useEffect(() => { + if (missedCount > prevMissedCount && prevMissedCount > 0) { + notify.info('New Missed Call', `${missedCount - prevMissedCount} new missed call(s)`); + } + setPrevMissedCount(missedCount); +}, [missedCount, prevMissedCount]); +``` + +- [ ] **Step 2: Add pulsing dot to tab badges when new items exist** + +In the tab items, add a visual indicator for tabs with urgent items: + +```typescript +const tabItems = [ + { id: 'all' as const, label: 'All Tasks', badge: allRows.length > 0 ? String(allRows.length) : undefined }, + { id: 'missed' as const, label: 'Missed Calls', badge: missedCount > 0 ? String(missedCount) : undefined, hasNew: missedCount > prevMissedCount }, + // ... +]; +``` + +The Tab component already supports badges. For the "new" indicator, append a small red dot after the badge number using a custom render if needed. + +- [ ] **Step 3: Commit** + +``` +feat: add notification for new missed calls in worklist +``` + +--- + +## Task 5: Deploy and verify + +- [ ] **Step 1: Type check** + +```bash +cd helix-engage && npx tsc --noEmit +``` + +- [ ] **Step 2: Build and deploy** + +```bash +VITE_API_URL=https://engage-api.srv1477139.hstgr.cloud \ +VITE_SIP_URI=sip:523590@blr-pub-rtc4.ozonetel.com \ +VITE_SIP_PASSWORD=523590 \ +VITE_SIP_WS_SERVER=wss://blr-pub-rtc4.ozonetel.com:444 \ +npm run build +``` + +- [ ] **Step 3: Test clickable phone** + +1. Hover over a phone number — kebab menu icon appears +2. Click phone number directly — places outbound call +3. Click kebab → SMS — opens SMS app +4. Click kebab → WhatsApp — opens WhatsApp web +5. On mobile: long-press phone number — context menu appears + +- [ ] **Step 4: Test last interaction context** + +1. 
Leads with `lastContacted` show "2h ago — Info Provided" sub-line +2. Leads without `lastContacted` show interested service or type +3. Missed calls show "Missed at 2:30 PM" + +- [ ] **Step 5: Test SLA** + +1. SLA shows time since last contact (not creation) +2. Green < 15m, amber 15-30m, red > 30m + +--- + +## Notes + +- **No schema changes needed** — all data is already available from the platform +- **ClickToCallButton stays** — it's still used in the active call card for the ringing-out End Call button. Only the worklist replaces it with PhoneActionCell. +- **WhatsApp link format** — `https://wa.me/91XXXXXXXXXX` (no + prefix, includes country code) +- **SMS link format** — `sms:+91XXXXXXXXXX` (with + prefix) +- **The TYPE column is removed** — the tab filter already categorizes by type, and the patient sub-line shows context. Adding a TYPE badge to each row is redundant. +- **Filter out no-phone follow-ups** — optional future improvement. For now, show "No phone" in italic which makes it clear the agent can't call. diff --git a/docs/superpowers/plans/2026-03-21-cc-agent-features.md b/docs/superpowers/plans/2026-03-21-cc-agent-features.md new file mode 100644 index 0000000..907b026 --- /dev/null +++ b/docs/superpowers/plans/2026-03-21-cc-agent-features.md @@ -0,0 +1,480 @@ +# CC Agent Features — Phase 1 Implementation Plan + +> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking. + +**Goal:** Add call transfer, recording pause, and missed call queue to the CC agent's call desk — the three most impactful features for daily workflow. + +**Architecture:** Three new service methods in the NestJS sidecar (callControl, pauseRecording, getAbandonCalls), exposed via REST endpoints. 
Frontend adds Transfer and Pause Recording buttons to the active call card, and a missed call queue that pulls from Ozonetel instead of our webhook-created records. + +**Tech Stack:** NestJS sidecar (Ozonetel Token Auth APIs), React 19 + Jotai + Untitled UI + +**Ozonetel API endpoints used:** +- Call Control: `POST /ca_apis/CallControl_V4` — Token auth — CONFERENCE, HOLD, UNHOLD, MUTE, UNMUTE, KICK_CALL +- Recording: `GET /CAServices/Call/Record.php` — apiKey in query string — pause/unPause +- Abandon Calls: `GET /ca_apis/abandonCalls` — Token auth — missed calls list + +--- + +## File Map + +### Sidecar (helix-engage-server) +| File | Action | +|------|--------| +| `src/ozonetel/ozonetel-agent.service.ts` | Modify: add `callControl()`, `pauseRecording()`, `getAbandonCalls()` | +| `src/ozonetel/ozonetel-agent.controller.ts` | Modify: add `POST /api/ozonetel/call-control`, `POST /api/ozonetel/recording`, `GET /api/ozonetel/missed-calls` | + +### Frontend (helix-engage) +| File | Action | +|------|--------| +| `src/components/call-desk/active-call-card.tsx` | Modify: add Transfer button + transfer input, Pause Recording button | +| `src/components/call-desk/transfer-dialog.tsx` | Create: inline transfer form (enter number, confirm) | +| `src/hooks/use-worklist.ts` | Modify: fetch missed calls from Ozonetel API instead of platform | + +--- + +## Task 1: Add call control service methods + +Three new methods in the Ozonetel service: `callControl()` (generic), `pauseRecording()`, and `getAbandonCalls()`. 
+ +**Files:** +- Modify: `helix-engage-server/src/ozonetel/ozonetel-agent.service.ts` + +- [ ] **Step 1: Add `callControl()` method** + +```typescript +async callControl(params: { + action: 'CONFERENCE' | 'HOLD' | 'UNHOLD' | 'MUTE' | 'UNMUTE' | 'KICK_CALL'; + ucid: string; + conferenceNumber?: string; +}): Promise<{ status: string; message: string; ucid?: string }> { + const url = `https://${this.apiDomain}/ca_apis/CallControl_V4`; + const did = process.env.OZONETEL_DID ?? '918041763265'; + const agentPhoneName = process.env.OZONETEL_SIP_ID ?? '523590'; + + this.logger.log(`Call control: action=${params.action} ucid=${params.ucid} conference=${params.conferenceNumber ?? 'none'}`); + + try { + const token = await this.getToken(); + const body: Record = { + userName: this.accountId, + action: params.action, + ucid: params.ucid, + did, + agentPhoneName, + }; + if (params.conferenceNumber) { + body.conferenceNumber = params.conferenceNumber; + } + + const response = await axios.post(url, body, { + headers: { + Authorization: `Bearer ${token}`, + 'Content-Type': 'application/json', + }, + }); + + this.logger.log(`Call control response: ${JSON.stringify(response.data)}`); + return response.data; + } catch (error: any) { + const responseData = error?.response?.data ? 
JSON.stringify(error.response.data) : ''; + this.logger.error(`Call control failed: ${error.message} ${responseData}`); + throw error; + } +} +``` + +- [ ] **Step 2: Add `pauseRecording()` method** + +This uses apiKey in query params, not token auth: + +```typescript +async pauseRecording(params: { + ucid: string; + action: 'pause' | 'unPause'; +}): Promise<{ status: string; message: string }> { + const url = `https://${this.apiDomain}/CAServices/Call/Record.php`; + + this.logger.log(`Recording ${params.action}: ucid=${params.ucid}`); + + try { + const response = await axios.get(url, { + params: { + userName: this.accountId, + apiKey: this.apiKey, + action: params.action, + ucid: params.ucid, + }, + }); + + this.logger.log(`Recording control response: ${JSON.stringify(response.data)}`); + return response.data; + } catch (error: any) { + const responseData = error?.response?.data ? JSON.stringify(error.response.data) : ''; + this.logger.error(`Recording control failed: ${error.message} ${responseData}`); + throw error; + } +} +``` + +- [ ] **Step 3: Add `getAbandonCalls()` method** + +```typescript +async getAbandonCalls(params?: { + fromTime?: string; + toTime?: string; + campaignName?: string; +}): Promise> { + const url = `https://${this.apiDomain}/ca_apis/abandonCalls`; + + this.logger.log('Fetching abandon calls'); + + try { + const token = await this.getToken(); + const body: Record = { userName: this.accountId }; + if (params?.fromTime) body.fromTime = params.fromTime; + if (params?.toTime) body.toTime = params.toTime; + if (params?.campaignName) body.campaignName = params.campaignName; + + const response = await axios.get(url, { + headers: { + Authorization: `Bearer ${token}`, + 'Content-Type': 'application/json', + }, + data: body, + }); + + const data = response.data; + if (data.status === 'success' && Array.isArray(data.message)) { + return data.message; + } + return []; + } catch (error: any) { + this.logger.error(`Abandon calls failed: 
${error.message}`); + return []; + } +} +``` + +- [ ] **Step 4: Type check and commit** + +```bash +cd helix-engage-server && npx tsc --noEmit +``` + +``` +feat: add call control, recording pause, and abandon calls to Ozonetel service +``` + +--- + +## Task 2: Add sidecar REST endpoints + +**Files:** +- Modify: `helix-engage-server/src/ozonetel/ozonetel-agent.controller.ts` + +- [ ] **Step 1: Add `POST /api/ozonetel/call-control`** + +```typescript +@Post('call-control') +async callControl( + @Body() body: { + action: 'CONFERENCE' | 'HOLD' | 'UNHOLD' | 'MUTE' | 'UNMUTE' | 'KICK_CALL'; + ucid: string; + conferenceNumber?: string; + }, +) { + if (!body.action || !body.ucid) { + throw new HttpException('action and ucid required', 400); + } + if (body.action === 'CONFERENCE' && !body.conferenceNumber) { + throw new HttpException('conferenceNumber required for CONFERENCE action', 400); + } + + this.logger.log(`Call control: ${body.action} ucid=${body.ucid}`); + + try { + const result = await this.ozonetelAgent.callControl(body); + return result; + } catch (error: any) { + const message = error.response?.data?.message ?? error.message ?? 'Call control failed'; + throw new HttpException(message, error.response?.status ?? 502); + } +} +``` + +- [ ] **Step 2: Add `POST /api/ozonetel/recording`** + +```typescript +@Post('recording') +async recording( + @Body() body: { ucid: string; action: 'pause' | 'unPause' }, +) { + if (!body.ucid || !body.action) { + throw new HttpException('ucid and action required', 400); + } + + try { + const result = await this.ozonetelAgent.pauseRecording(body); + return result; + } catch (error: any) { + const message = error.response?.data?.message ?? error.message ?? 'Recording control failed'; + throw new HttpException(message, error.response?.status ?? 
502); + } +} +``` + +- [ ] **Step 3: Add `GET /api/ozonetel/missed-calls`** + +Import `Get` from `@nestjs/common`: + +```typescript +@Get('missed-calls') +async missedCalls() { + const result = await this.ozonetelAgent.getAbandonCalls(); + return result; +} +``` + +- [ ] **Step 4: Type check and commit** + +```bash +cd helix-engage-server && npx tsc --noEmit +``` + +``` +feat: add call control, recording, and missed calls REST endpoints +``` + +--- + +## Task 3: Add Transfer and Pause Recording to active call UI + +During an active call, the agent gets two new buttons: +- **Transfer** — opens an inline input for the transfer number, then does CONFERENCE + KICK_CALL +- **Pause Rec** — toggles recording pause + +**Files:** +- Create: `helix-engage/src/components/call-desk/transfer-dialog.tsx` +- Modify: `helix-engage/src/components/call-desk/active-call-card.tsx` + +- [ ] **Step 1: Create transfer-dialog.tsx** + +A simple inline form: text input for phone number + "Transfer" button. On submit, calls the sidecar's call-control endpoint twice: CONFERENCE (dial the target), then after confirming, KICK_CALL (drop the agent). 
+ +```typescript +import { useState } from 'react'; +import { FontAwesomeIcon } from '@fortawesome/react-fontawesome'; +import { faPhoneArrowRight, faXmark } from '@fortawesome/pro-duotone-svg-icons'; +import { Input } from '@/components/base/input/input'; +import { Button } from '@/components/base/buttons/button'; +import { apiClient } from '@/lib/api-client'; +import { notify } from '@/lib/toast'; + +type TransferDialogProps = { + ucid: string; + onClose: () => void; + onTransferred: () => void; +}; + +export const TransferDialog = ({ ucid, onClose, onTransferred }: TransferDialogProps) => { + const [number, setNumber] = useState(''); + const [transferring, setTransferring] = useState(false); + const [stage, setStage] = useState<'input' | 'connected'>('input'); + + const handleConference = async () => { + if (!number.trim()) return; + setTransferring(true); + try { + // Add the target to the conference + await apiClient.post('/api/ozonetel/call-control', { + action: 'CONFERENCE', + ucid, + conferenceNumber: `0${number.replace(/\D/g, '')}`, + }); + notify.success('Connected', 'Third party connected. Click Complete to transfer.'); + setStage('connected'); + } catch { + notify.error('Transfer Failed', 'Could not connect to the target number'); + } finally { + setTransferring(false); + } + }; + + const handleComplete = async () => { + setTransferring(true); + try { + // Drop the agent from the call — customer stays with the target + await apiClient.post('/api/ozonetel/call-control', { + action: 'KICK_CALL', + ucid, + conferenceNumber: `0${number.replace(/\D/g, '')}`, + }); + notify.success('Transferred', 'Call transferred successfully'); + onTransferred(); + } catch { + notify.error('Transfer Failed', 'Could not complete transfer'); + } finally { + setTransferring(false); + } + }; + + return ( +
<div className="flex w-full flex-col gap-2 rounded-lg border p-3">
  {/* markup reconstructed from the component logic above; classNames and props are illustrative */}
  <div className="flex items-center justify-between">
    <span className="text-sm font-medium">Transfer Call</span>
    <button type="button" onClick={onClose} aria-label="Close">
      <FontAwesomeIcon icon={faXmark} />
    </button>
  </div>
  {stage === 'input' ? (
    <div className="flex gap-2">
      <Input value={number} onChange={setNumber} placeholder="Phone number" />
      <Button isDisabled={transferring} onClick={handleConference}>
        <FontAwesomeIcon icon={faPhoneArrowRight} /> Connect
      </Button>
    </div>
  ) : (
    <div className="flex items-center justify-between">
      <span className="text-sm">Connected to {number}</span>
      <Button isDisabled={transferring} onClick={handleComplete}>
        Complete Transfer
      </Button>
    </div>
  )}
</div>
+ ); +}; +``` + +- [ ] **Step 2: Add Transfer and Pause Recording buttons to active call card** + +In `active-call-card.tsx`, add imports: +```typescript +import { faPhoneArrowRight, faRecordVinyl } from '@fortawesome/pro-duotone-svg-icons'; +import { TransferDialog } from './transfer-dialog'; +``` + +Add state: +```typescript +const [transferOpen, setTransferOpen] = useState(false); +const [recordingPaused, setRecordingPaused] = useState(false); +``` + +In the active call button row (around line 241), add two new buttons before the End button: + +```typescript + + +``` + +After the button row, before the AppointmentForm, add the transfer dialog: +```typescript +{transferOpen && callUcid && ( + setTransferOpen(false)} + onTransferred={() => { + setTransferOpen(false); + hangup(); + setPostCallStage('disposition'); + }} + /> +)} +``` + +- [ ] **Step 3: Type check and commit** + +```bash +cd helix-engage && npx tsc --noEmit +``` + +``` +feat: add call transfer and recording pause to active call UI +``` + +--- + +## Task 4: Deploy and verify + +- [ ] **Step 1: Build and deploy sidecar** + +```bash +cd helix-engage-server && npm run build +# tar + scp + docker cp + restart +``` + +- [ ] **Step 2: Build and deploy frontend** + +```bash +cd helix-engage +VITE_API_URL=https://engage-api.srv1477139.hstgr.cloud \ +VITE_SIP_URI=sip:523590@blr-pub-rtc4.ozonetel.com \ +VITE_SIP_PASSWORD=523590 \ +VITE_SIP_WS_SERVER=wss://blr-pub-rtc4.ozonetel.com:444 \ +npm run build +``` + +- [ ] **Step 3: Test call transfer** + +1. Place an outbound call +2. Click "Transfer" → enter a phone number → "Connect" +3. Third party should ring and join the call +4. Click "Complete Transfer" → agent drops, customer stays with target +5. Disposition form shows + +- [ ] **Step 4: Test recording pause** + +1. During an active call, click "Pause Rec" +2. Button changes to "Resume Rec" (destructive color) +3. Check Ozonetel reports — recording should have a gap +4. 
Click "Resume Rec" — recording resumes + +- [ ] **Step 5: Test missed calls endpoint** + +```bash +curl -s https://engage-api.srv1477139.hstgr.cloud/api/ozonetel/missed-calls | python3 -m json.tool +``` + +Verify it returns abandon call data from Ozonetel. + +--- + +## Notes + +- **Call Transfer is two-step**: CONFERENCE adds the target, KICK_CALL drops the agent. This is a "warm transfer" — all three parties are briefly connected before the agent drops. For "cold transfer" (blind), we'd CONFERENCE + immediately KICK_CALL without waiting. +- **Recording pause uses apiKey in query params** — different auth pattern from other `/ca_apis/` endpoints. This is the `/CAServices/` path. +- **KICK_CALL note from docs**: "Always pass the agent phone number in the conferenceNumber parameter to use KICK_CALL action." This means to drop the agent, pass the agent's phone number as conferenceNumber. To drop the transferred party, pass their number. +- **Missed calls API** — the `getAbandonCalls` returns today's data by default. For historical data, pass fromTime/toTime. +- **The active call button row is getting crowded** (Mute, Hold, Book Appt, Transfer, Pause Rec, End — 6 buttons). If this is too many, we can group Transfer + Pause Rec under a "More" dropdown. diff --git a/docs/superpowers/plans/2026-03-21-live-call-assist.md b/docs/superpowers/plans/2026-03-21-live-call-assist.md new file mode 100644 index 0000000..079e1d3 --- /dev/null +++ b/docs/superpowers/plans/2026-03-21-live-call-assist.md @@ -0,0 +1,796 @@ +# Live Call Assist — Implementation Plan + +> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking. 
+
+**Goal:** Stream customer audio during calls to Deepgram for transcription, feed transcript + lead context to OpenAI every 10 seconds for suggestions, and display the live transcript + AI suggestions in the sidebar.
+
+**Architecture:** Browser captures remote WebRTC audio with a Web Audio capture node (ScriptProcessorNode for now — see the note in Task 3), streams 16kHz PCM over Socket.IO to the sidecar. Sidecar pipes audio to the Deepgram Nova WebSocket for STT, accumulates the transcript, and every 10 seconds sends transcript + pre-loaded lead context to OpenAI gpt-4o-mini for suggestions. Results stream back to the browser via Socket.IO.
+
+**Tech Stack:** Socket.IO (already installed), Deepgram SDK (`@deepgram/sdk`), OpenAI via Vercel AI SDK (already installed), Web Audio API (browser)
+
+---
+
+## File Map
+
+### Sidecar (helix-engage-server)
+| File | Action |
+|------|--------|
+| `src/call-assist/call-assist.gateway.ts` | Create: Socket.IO gateway handling audio stream, Deepgram + OpenAI orchestration |
+| `src/call-assist/call-assist.service.ts` | Create: Lead context loading from platform, OpenAI prompt building |
+| `src/call-assist/call-assist.module.ts` | Create: Module registration |
+| `src/app.module.ts` | Modify: import CallAssistModule |
+| `package.json` | Modify: add `@deepgram/sdk` |
+
+### Frontend (helix-engage)
+| File | Action |
+|------|--------|
+| `src/lib/audio-capture.ts` | Create: Capture remote audio track, downsample to 16kHz PCM, emit chunks |
+| `src/hooks/use-call-assist.ts` | Create: Socket.IO connection, manages transcript + suggestions state |
+| `src/components/call-desk/live-transcript.tsx` | Create: Scrolling transcript + AI suggestion cards |
+| `src/components/call-desk/context-panel.tsx` | Modify: show LiveTranscript during active calls instead of AiChatPanel |
+| `src/pages/call-desk.tsx` | Modify: remove CallPrepCard during active calls |
+
+---
+
+## Task 1: Sidecar — Call Assist service (context loading + OpenAI)
+
+**Files:**
+- Create: `helix-engage-server/src/call-assist/call-assist.service.ts`
+
+- [ ] **Step 1: Create the service**
+
+```typescript
+import { Injectable, Logger } from '@nestjs/common';
+import { ConfigService } from '@nestjs/config';
+import { generateText } from 'ai';
+import { PlatformGraphqlService } from '../platform/platform-graphql.service';
+import { createAiModel } from '../ai/ai-provider';
+import type { LanguageModel } from 'ai';
+
+@Injectable()
+export class CallAssistService {
+  private readonly logger = new Logger(CallAssistService.name);
+  private readonly aiModel: LanguageModel | null;
+  private readonly platformApiKey: string;
+
+  constructor(
+    private config: ConfigService,
+    private platform: PlatformGraphqlService,
+  ) {
+    this.aiModel = createAiModel(config);
+    this.platformApiKey = config.get('platform.apiKey') ?? '';
+  }
+
+  async loadCallContext(leadId: string | null, callerPhone: string | null): Promise<string> {
+    const authHeader = this.platformApiKey ? `Bearer ${this.platformApiKey}` : '';
+    if (!authHeader) return 'No platform context available.';
+
+    try {
+      const parts: string[] = [];
+
+      // Load lead details
+      if (leadId) {
+        const leadResult = await this.platform.queryWithAuth(
+          `{ leads(filter: { id: { eq: "${leadId}" } }) { edges { node {
+            id name contactName { firstName lastName }
+            contactPhone { primaryPhoneNumber }
+            source status interestedService
+            lastContacted contactAttempts
+            aiSummary aiSuggestedAction
+          } } } }`,
+          undefined, authHeader,
+        );
+        const lead = leadResult.leads.edges[0]?.node;
+        if (lead) {
+          const name = lead.contactName ? `${lead.contactName.firstName} ${lead.contactName.lastName}`.trim() : lead.name;
+          parts.push(`CALLER: ${name}`);
+          parts.push(`Phone: ${lead.contactPhone?.primaryPhoneNumber ?? callerPhone}`);
+          parts.push(`Source: ${lead.source ?? 'Unknown'}`);
+          parts.push(`Interested in: ${lead.interestedService ?? 'Not specified'}`);
+          parts.push(`Contact attempts: ${lead.contactAttempts ?? 0}`);
+          if (lead.aiSummary) parts.push(`AI Summary: ${lead.aiSummary}`);
+        }
+
+        // Load past appointments
+        const apptResult = await this.platform.queryWithAuth(
+          `{ appointments(filter: { patientId: { eq: "${leadId}" } }, first: 10, orderBy: [{ scheduledAt: DescNullsLast }]) { edges { node {
+            id scheduledAt appointmentStatus doctorName department reasonForVisit
+          } } } }`,
+          undefined, authHeader,
+        );
+        const appts = apptResult.appointments.edges.map((e: any) => e.node);
+        if (appts.length > 0) {
+          parts.push(`\nPAST APPOINTMENTS:`);
+          for (const a of appts) {
+            const date = a.scheduledAt ? new Date(a.scheduledAt).toLocaleDateString('en-IN') : '?';
+            parts.push(`- ${date}: ${a.doctorName ?? '?'} (${a.department ?? '?'}) — ${a.appointmentStatus}`);
+          }
+        }
+      } else if (callerPhone) {
+        parts.push(`CALLER: Unknown (${callerPhone})`);
+        parts.push('No lead record found — this may be a new enquiry.');
+      }
+
+      // Load doctors
+      const docResult = await this.platform.queryWithAuth(
+        `{ doctors(first: 20) { edges { node {
+          fullName { firstName lastName } department specialty clinic { clinicName }
+        } } } }`,
+        undefined, authHeader,
+      );
+      const docs = docResult.doctors.edges.map((e: any) => e.node);
+      if (docs.length > 0) {
+        parts.push(`\nAVAILABLE DOCTORS:`);
+        for (const d of docs) {
+          const name = d.fullName ? `Dr. ${d.fullName.firstName} ${d.fullName.lastName}`.trim() : 'Unknown';
+          parts.push(`- ${name} — ${d.department ?? '?'} — ${d.clinic?.clinicName ?? '?'}`);
+        }
+      }
+
+      return parts.join('\n') || 'No context available.';
+    } catch (err) {
+      this.logger.error(`Failed to load call context: ${err}`);
+      return 'Context loading failed.';
+    }
+  }
+
+  async getSuggestion(transcript: string, context: string): Promise<string> {
+    if (!this.aiModel || !transcript.trim()) return '';
+
+    try {
+      const { text } = await generateText({
+        model: this.aiModel,
+        system: `You are a real-time call assistant for Global Hospital Bangalore.
+You listen to the customer's words and provide brief, actionable suggestions for the CC agent. + +${context} + +RULES: +- Keep suggestions under 2 sentences +- Focus on actionable next steps the agent should take NOW +- If customer mentions a doctor or department, suggest available slots +- If customer wants to cancel or reschedule, note relevant appointment details +- If customer sounds upset, suggest empathetic response +- Do NOT repeat what the agent already knows`, + prompt: `Conversation transcript so far:\n${transcript}\n\nProvide a brief suggestion for the agent based on what was just said.`, + maxTokens: 150, + }); + return text; + } catch (err) { + this.logger.error(`AI suggestion failed: ${err}`); + return ''; + } + } +} +``` + +- [ ] **Step 2: Type check and commit** + +``` +feat: add CallAssistService for context loading and AI suggestions +``` + +--- + +## Task 2: Sidecar — Call Assist WebSocket gateway + +**Files:** +- Create: `helix-engage-server/src/call-assist/call-assist.gateway.ts` +- Create: `helix-engage-server/src/call-assist/call-assist.module.ts` +- Modify: `helix-engage-server/src/app.module.ts` +- Modify: `helix-engage-server/package.json` + +- [ ] **Step 1: Install Deepgram SDK** + +```bash +cd helix-engage-server && npm install @deepgram/sdk +``` + +- [ ] **Step 2: Create the gateway** + +```typescript +import { + WebSocketGateway, + SubscribeMessage, + MessageBody, + ConnectedSocket, + OnGatewayDisconnect, +} from '@nestjs/websockets'; +import { Logger } from '@nestjs/common'; +import { ConfigService } from '@nestjs/config'; +import { Socket } from 'socket.io'; +import { createClient, LiveTranscriptionEvents } from '@deepgram/sdk'; +import { CallAssistService } from './call-assist.service'; + +type SessionState = { + deepgramConnection: any; + transcript: string; + context: string; + suggestionTimer: NodeJS.Timeout | null; +}; + +@WebSocketGateway({ + cors: { origin: process.env.CORS_ORIGIN ?? 
'*', credentials: true },
+  namespace: '/call-assist',
+})
+export class CallAssistGateway implements OnGatewayDisconnect {
+  private readonly logger = new Logger(CallAssistGateway.name);
+  private readonly sessions = new Map<string, SessionState>();
+  private readonly deepgramApiKey: string;
+
+  constructor(
+    private readonly callAssist: CallAssistService,
+    private readonly config: ConfigService,
+  ) {
+    this.deepgramApiKey = process.env.DEEPGRAM_API_KEY ?? '';
+  }
+
+  @SubscribeMessage('call-assist:start')
+  async handleStart(
+    @ConnectedSocket() client: Socket,
+    @MessageBody() data: { ucid: string; leadId?: string; callerPhone?: string },
+  ) {
+    this.logger.log(`Call assist start: ucid=${data.ucid} lead=${data.leadId ?? 'none'}`);
+
+    // Load lead context
+    const context = await this.callAssist.loadCallContext(
+      data.leadId ?? null,
+      data.callerPhone ?? null,
+    );
+    client.emit('call-assist:context', { context: context.substring(0, 200) + '...' });
+
+    // Connect to Deepgram
+    if (!this.deepgramApiKey) {
+      this.logger.warn('DEEPGRAM_API_KEY not set — transcription disabled');
+      client.emit('call-assist:error', { message: 'Transcription not configured' });
+      return;
+    }
+
+    const deepgram = createClient(this.deepgramApiKey);
+    const dgConnection = deepgram.listen.live({
+      model: 'nova-2',
+      language: 'en',
+      smart_format: true,
+      interim_results: true,
+      endpointing: 300,
+      sample_rate: 16000,
+      encoding: 'linear16',
+      channels: 1,
+    });
+
+    const session: SessionState = {
+      deepgramConnection: dgConnection,
+      transcript: '',
+      context,
+      suggestionTimer: null,
+    };
+
+    dgConnection.on(LiveTranscriptionEvents.Open, () => {
+      this.logger.log(`Deepgram connected for ${data.ucid}`);
+    });
+
+    dgConnection.on(LiveTranscriptionEvents.Transcript, (result: any) => {
+      const text = result.channel?.alternatives?.[0]?.transcript;
+      if (!text) return;
+
+      const isFinal = result.is_final;
+      client.emit('call-assist:transcript', { text, isFinal });
+
+      if (isFinal) {
+        session.transcript += `Customer: ${text}\n`;
+      }
+    });
+
+    dgConnection.on(LiveTranscriptionEvents.Error, (err: any) => {
+      this.logger.error(`Deepgram error: ${err.message}`);
+    });
+
+    dgConnection.on(LiveTranscriptionEvents.Close, () => {
+      this.logger.log(`Deepgram closed for ${data.ucid}`);
+    });
+
+    // AI suggestion every 10 seconds
+    session.suggestionTimer = setInterval(async () => {
+      if (!session.transcript.trim()) return;
+      const suggestion = await this.callAssist.getSuggestion(session.transcript, session.context);
+      if (suggestion) {
+        client.emit('call-assist:suggestion', { text: suggestion });
+      }
+    }, 10000);
+
+    this.sessions.set(client.id, session);
+  }
+
+  @SubscribeMessage('call-assist:audio')
+  handleAudio(
+    @ConnectedSocket() client: Socket,
+    @MessageBody() audioData: ArrayBuffer,
+  ) {
+    const session = this.sessions.get(client.id);
+    if (session?.deepgramConnection) {
+      session.deepgramConnection.send(Buffer.from(audioData));
+    }
+  }
+
+  @SubscribeMessage('call-assist:stop')
+  handleStop(@ConnectedSocket() client: Socket) {
+    this.cleanup(client.id);
+    this.logger.log(`Call assist stopped: ${client.id}`);
+  }
+
+  handleDisconnect(client: Socket) {
+    this.cleanup(client.id);
+  }
+
+  private cleanup(clientId: string) {
+    const session = this.sessions.get(clientId);
+    if (session) {
+      if (session.suggestionTimer) clearInterval(session.suggestionTimer);
+      if (session.deepgramConnection) {
+        try { session.deepgramConnection.finish(); } catch {}
+      }
+      this.sessions.delete(clientId);
+    }
+  }
+}
+```
+
+- [ ] **Step 3: Create the module**
+
+```typescript
+import { Module } from '@nestjs/common';
+import { CallAssistGateway } from './call-assist.gateway';
+import { CallAssistService } from './call-assist.service';
+import { PlatformModule } from '../platform/platform.module';
+
+@Module({
+  imports: [PlatformModule],
+  providers: [CallAssistGateway, CallAssistService],
+})
+export class CallAssistModule {}
+```
+
+- [ ] **Step 4: Register in app.module.ts**
+
+Add 
`CallAssistModule` to imports. + +- [ ] **Step 5: Add DEEPGRAM_API_KEY to docker-compose env** + +The env var needs to be set in the VPS docker-compose for the sidecar container. + +- [ ] **Step 6: Type check and commit** + +``` +feat: add call assist WebSocket gateway with Deepgram STT + OpenAI suggestions +``` + +--- + +## Task 3: Frontend — Audio capture utility + +Capture the remote audio track from WebRTC, downsample to 16kHz 16-bit PCM, and provide chunks via callback. + +**Files:** +- Create: `helix-engage/src/lib/audio-capture.ts` + +- [ ] **Step 1: Create the audio capture module** + +```typescript +type AudioChunkCallback = (chunk: ArrayBuffer) => void; + +let audioContext: AudioContext | null = null; +let mediaStreamSource: MediaStreamAudioSourceNode | null = null; +let scriptProcessor: ScriptProcessorNode | null = null; + +export function startAudioCapture(remoteStream: MediaStream, onChunk: AudioChunkCallback): void { + stopAudioCapture(); + + audioContext = new AudioContext({ sampleRate: 16000 }); + mediaStreamSource = audioContext.createMediaStreamSource(remoteStream); + + // Use ScriptProcessorNode (deprecated but universally supported) + // AudioWorklet would be better but requires a separate file + scriptProcessor = audioContext.createScriptProcessor(4096, 1, 1); + + scriptProcessor.onaudioprocess = (event) => { + const inputData = event.inputBuffer.getChannelData(0); + + // Convert Float32 to Int16 PCM + const pcm = new Int16Array(inputData.length); + for (let i = 0; i < inputData.length; i++) { + const s = Math.max(-1, Math.min(1, inputData[i])); + pcm[i] = s < 0 ? 
s * 0x8000 : s * 0x7FFF; + } + + onChunk(pcm.buffer); + }; + + mediaStreamSource.connect(scriptProcessor); + scriptProcessor.connect(audioContext.destination); +} + +export function stopAudioCapture(): void { + if (scriptProcessor) { + scriptProcessor.disconnect(); + scriptProcessor = null; + } + if (mediaStreamSource) { + mediaStreamSource.disconnect(); + mediaStreamSource = null; + } + if (audioContext) { + audioContext.close().catch(() => {}); + audioContext = null; + } +} +``` + +- [ ] **Step 2: Commit** + +``` +feat: add audio capture utility for remote WebRTC stream +``` + +--- + +## Task 4: Frontend — useCallAssist hook + +Manages Socket.IO connection to `/call-assist`, sends audio, receives transcript + suggestions. + +**Files:** +- Create: `helix-engage/src/hooks/use-call-assist.ts` + +- [ ] **Step 1: Create the hook** + +```typescript +import { useEffect, useRef, useState, useCallback } from 'react'; +import { io, Socket } from 'socket.io-client'; +import { startAudioCapture, stopAudioCapture } from '@/lib/audio-capture'; +import { getSipClient } from '@/state/sip-manager'; + +const SIDECAR_URL = import.meta.env.VITE_SIDECAR_URL ?? 
'http://localhost:4100';
+
+type TranscriptLine = {
+  id: string;
+  text: string;
+  isFinal: boolean;
+  timestamp: Date;
+};
+
+type Suggestion = {
+  id: string;
+  text: string;
+  timestamp: Date;
+};
+
+export const useCallAssist = (active: boolean, ucid: string | null, leadId: string | null, callerPhone: string | null) => {
+  const [transcript, setTranscript] = useState<TranscriptLine[]>([]);
+  const [suggestions, setSuggestions] = useState<Suggestion[]>([]);
+  const [connected, setConnected] = useState(false);
+  const socketRef = useRef<Socket | null>(null);
+  const idCounter = useRef(0);
+
+  const nextId = useCallback(() => `ca-${++idCounter.current}`, []);
+
+  useEffect(() => {
+    if (!active || !ucid) return;
+
+    const socket = io(`${SIDECAR_URL}/call-assist`, {
+      transports: ['websocket'],
+    });
+    socketRef.current = socket;
+
+    socket.on('connect', () => {
+      setConnected(true);
+      socket.emit('call-assist:start', { ucid, leadId, callerPhone });
+
+      // Start capturing remote audio from the SIP session
+      const sipClient = getSipClient();
+      const audioElement = (sipClient as any)?.audioElement as HTMLAudioElement | null;
+      if (audioElement?.srcObject) {
+        startAudioCapture(audioElement.srcObject as MediaStream, (chunk) => {
+          socket.emit('call-assist:audio', chunk);
+        });
+      }
+    });
+
+    socket.on('call-assist:transcript', (data: { text: string; isFinal: boolean }) => {
+      if (!data.text.trim()) return;
+      setTranscript(prev => {
+        if (!data.isFinal) {
+          // Replace last interim line
+          const withoutLastInterim = prev.filter(l => l.isFinal);
+          return [...withoutLastInterim, { id: nextId(), text: data.text, isFinal: false, timestamp: new Date() }];
+        }
+        // Add final line, remove interims
+        const finals = prev.filter(l => l.isFinal);
+        return [...finals, { id: nextId(), text: data.text, isFinal: true, timestamp: new Date() }];
+      });
+    });
+
+    socket.on('call-assist:suggestion', (data: { text: string }) => {
+      setSuggestions(prev => [...prev, { id: nextId(), text: data.text, timestamp: new Date() }]);
+    });
+
+    socket.on('disconnect', () => setConnected(false));
+
+    return () => {
+      stopAudioCapture();
+      socket.emit('call-assist:stop');
+      socket.disconnect();
+      socketRef.current = null;
+      setConnected(false);
+    };
+  }, [active, ucid, leadId, callerPhone, nextId]);
+
+  // Reset state when call ends
+  useEffect(() => {
+    if (!active) {
+      setTranscript([]);
+      setSuggestions([]);
+    }
+  }, [active]);
+
+  return { transcript, suggestions, connected };
+};
+```
+
+- [ ] **Step 2: Install socket.io-client in frontend**
+
+```bash
+cd helix-engage && npm install socket.io-client
+```
+
+- [ ] **Step 3: Expose audioElement in SIPClient**
+
+In `helix-engage/src/lib/sip-client.ts`, the `audioElement` is private. Add a public getter:
+
+```typescript
+getAudioElement(): HTMLAudioElement | null {
+  return this.audioElement;
+}
+```
+
+Update `getSipClient` usage in the hook — access via `getSipClient()?.getAudioElement()?.srcObject`.
+
+- [ ] **Step 4: Type check and commit**
+
+```
+feat: add useCallAssist hook for live transcription WebSocket
+```
+
+---
+
+## Task 5: Frontend — LiveTranscript component
+
+**Files:**
+- Create: `helix-engage/src/components/call-desk/live-transcript.tsx`
+
+- [ ] **Step 1: Create the component**
+
+Scrolling list of transcript lines with AI suggestion cards interspersed. Auto-scrolls to bottom.
+
+```typescript
+import { useEffect, useRef } from 'react';
+import { FontAwesomeIcon } from '@fortawesome/react-fontawesome';
+import { faSparkles, faMicrophone } from '@fortawesome/pro-duotone-svg-icons';
+import { cx } from '@/utils/cx';
+
+type TranscriptLine = {
+  id: string;
+  text: string;
+  isFinal: boolean;
+  timestamp: Date;
+};
+
+type Suggestion = {
+  id: string;
+  text: string;
+  timestamp: Date;
+};
+
+type LiveTranscriptProps = {
+  transcript: TranscriptLine[];
+  suggestions: Suggestion[];
+  connected: boolean;
+};
+
+export const LiveTranscript = ({ transcript, suggestions, connected }: LiveTranscriptProps) => {
+  const scrollRef = useRef<HTMLDivElement>(null);
+
+  // Auto-scroll to bottom
+  useEffect(() => {
+    if (scrollRef.current) {
+      scrollRef.current.scrollTop = scrollRef.current.scrollHeight;
+    }
+  }, [transcript.length, suggestions.length]);
+
+  // Merge transcript and suggestions by timestamp
+  const items = [
+    ...transcript.map(t => ({ ...t, kind: 'transcript' as const })),
+    ...suggestions.map(s => ({ ...s, kind: 'suggestion' as const, isFinal: true })),
+  ].sort((a, b) => a.timestamp.getTime() - b.timestamp.getTime());
+
+  return (
+    <div className="flex h-full flex-col">
+      {/* Header */}
+      <div className="flex items-center gap-2 border-b px-4 py-2">
+        <span className={cx('size-2 rounded-full', connected ? 'bg-green-500' : 'bg-gray-300')} />
+        <FontAwesomeIcon icon={faMicrophone} className="size-4 text-gray-500" />
+        <span className="text-sm font-semibold">Live Assist</span>
+      </div>
+
+      {/* Transcript body */}
+      <div ref={scrollRef} className="flex-1 overflow-y-auto px-4 py-3">
+        {items.length === 0 && (
+          <div className="flex h-full flex-col items-center justify-center gap-1 text-center">
+            <FontAwesomeIcon icon={faMicrophone} className="mb-1 size-6 text-gray-400" />
+            <p className="text-sm text-gray-500">Listening to customer...</p>
+            <p className="text-xs text-gray-400">Transcript will appear here</p>
+          </div>
+        )}
+
+        {items.map(item => {
+          if (item.kind === 'suggestion') {
+            return (
+              <div key={item.id} className="my-2 rounded-lg border border-purple-200 bg-purple-50 p-3">
+                <div className="mb-1 flex items-center gap-1.5 text-xs font-semibold text-purple-700">
+                  <FontAwesomeIcon icon={faSparkles} className="size-3.5" />
+                  <span>AI Suggestion</span>
+                </div>
+                <p className="text-sm text-purple-900">{item.text}</p>
+              </div>
+            );
+          }
+
+          return (
+            <div key={item.id} className={cx('py-1 text-sm', !item.isFinal && 'italic text-gray-400')}>
+              <span className="mr-2 text-xs text-gray-400">
+                {item.timestamp.toLocaleTimeString('en-IN', { hour: '2-digit', minute: '2-digit', second: '2-digit' })}
+              </span>
+              {item.text}
+            </div>
+          );
+        })}
+      </div>
+    </div>
+  );
+};
+```
+
+- [ ] **Step 2: Commit**
+
+```
+feat: add LiveTranscript component for call sidebar
+```
+
+---
+
+## Task 6: Wire live transcript into the call desk
+
+**Files:**
+- Modify: `helix-engage/src/components/call-desk/context-panel.tsx`
+- Modify: `helix-engage/src/pages/call-desk.tsx`
+
+- [ ] **Step 1: Update context-panel.tsx to show LiveTranscript during calls**
+
+Import the hook and component:
+```typescript
+import { useCallAssist } from '@/hooks/use-call-assist';
+import { LiveTranscript } from './live-transcript';
+```
+
+Accept new props:
+```typescript
+interface ContextPanelProps {
+  selectedLead: Lead | null;
+  activities: LeadActivity[];
+  callerPhone?: string;
+  isInCall?: boolean;
+  callUcid?: string | null;
+}
+```
+
+Inside the component, use the hook:
+```typescript
+const { transcript, suggestions, connected } = useCallAssist(
+  isInCall ?? false,
+  callUcid ?? null,
+  selectedLead?.id ?? null,
+  callerPhone ?? null,
+);
+```
+
+When `isInCall` is true, replace the AI Assistant tab content with LiveTranscript:
+```typescript
+{activeTab === 'ai' && (
+  isInCall ? (
+    <LiveTranscript transcript={transcript} suggestions={suggestions} connected={connected} />
+  ) : (
+    <AiChatPanel callerContext={callerContext} />
+  )
+)}
+```
+
+- [ ] **Step 2: Pass isInCall and callUcid to ContextPanel in call-desk.tsx**
+
+```typescript
+<ContextPanel
+  selectedLead={selectedLead}
+  activities={activities}
+  callerPhone={callerNumber}
+  isInCall={isInCall}
+  callUcid={callUcid}
+/>
+```
+
+Also get `callUcid` from `useSip()`:
+```typescript
+const { connectionStatus, isRegistered, callState, callerNumber, callUcid } = useSip();
+```
+
+- [ ] **Step 3: Remove CallPrepCard during active calls**
+
+In `call-desk.tsx`, remove the CallPrepCard from the active call area:
+
+```typescript
+{isInCall && (
+  <div className="p-4">
+    <ActiveCallCard lead={selectedLead} callerPhone={callerNumber} />
+  </div>
+)} +``` + +Keep the CallPrepCard import for now — it might be useful in other contexts later. + +- [ ] **Step 4: Type check and commit** + +``` +feat: wire live transcript into call desk sidebar +``` + +--- + +## Task 7: Deploy and verify + +- [ ] **Step 1: Get Deepgram API key** + +Sign up at deepgram.com — free tier includes $200 credit. Set `DEEPGRAM_API_KEY` in the sidecar's docker-compose env. + +- [ ] **Step 2: Build and deploy sidecar** + +```bash +cd helix-engage-server && npm install && npm run build +``` + +- [ ] **Step 3: Build and deploy frontend** + +```bash +cd helix-engage && npm install && npm run build +``` + +- [ ] **Step 4: Test end-to-end** + +1. Login as CC agent +2. Place or receive a call +3. Sidebar should show "Live Assist" with green dot +4. Customer speaks → transcript appears in real-time +5. Every 10 seconds → AI suggestion card appears with contextual advice +6. Call ends → transcript stays visible during disposition + +--- + +## Notes + +- **ScriptProcessorNode is deprecated** but universally supported. AudioWorklet would require a separate JS file served via a URL. Can upgrade later. +- **Deepgram `interim_results: true`** gives streaming partial results (updated as words are recognized). `isFinal` results are the confirmed transcription. +- **Socket.IO binary support** — `socket.emit('call-assist:audio', chunk)` sends ArrayBuffer natively. No base64 encoding needed. +- **The `audioElement.srcObject`** is the remote MediaStream — this is the customer's audio only. We don't send the agent's mic to avoid echo/feedback in transcription. +- **Cost**: ~₹2 per 5-minute call (Deepgram + OpenAI combined). +- **If DEEPGRAM_API_KEY is not set**, the gateway logs a warning and sends an error event to the client. Transcription is disabled gracefully — the app still works without it. 
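
The first note above flags ScriptProcessorNode as deprecated, with AudioWorklet as the upgrade path. A minimal sketch of what that separate worklet file could look like; the file name (`pcm-capture.worklet.ts`), the registered processor name (`pcm-capture`), and the `PcmCaptureProcessor` class are illustrative, not existing code:

```typescript
// Same Float32 → Int16 conversion as in audio-capture.ts, factored out for reuse.
function floatTo16BitPCM(input: Float32Array): Int16Array {
  const pcm = new Int16Array(input.length);
  for (let i = 0; i < input.length; i++) {
    const s = Math.max(-1, Math.min(1, input[i]));
    pcm[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
  }
  return pcm;
}

// The processor only exists inside an AudioWorkletGlobalScope, i.e. after the
// browser loads this file via audioContext.audioWorklet.addModule(...).
const AWP = (globalThis as any).AudioWorkletProcessor;
if (AWP) {
  class PcmCaptureProcessor extends AWP {
    // Called with 128-sample frames; forward each frame to the main thread as Int16 PCM.
    process(inputs: Float32Array[][]): boolean {
      const channel = inputs[0]?.[0];
      if (channel) (this as any).port.postMessage(floatTo16BitPCM(channel).buffer);
      return true; // keep the processor alive for the whole call
    }
  }
  (globalThis as any).registerProcessor('pcm-capture', PcmCaptureProcessor);
}
```

On the main thread, an `AudioWorkletNode` constructed with the `'pcm-capture'` name would replace the ScriptProcessorNode, and chunks would arrive via `node.port.onmessage` instead of `onaudioprocess`.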
diff --git a/docs/superpowers/specs/2026-03-21-live-call-assist-design.md b/docs/superpowers/specs/2026-03-21-live-call-assist-design.md new file mode 100644 index 0000000..bf24eee --- /dev/null +++ b/docs/superpowers/specs/2026-03-21-live-call-assist-design.md @@ -0,0 +1,173 @@ +# Live Call Assist — Design Spec + +## Problem + +CC agents have no real-time intelligence during calls. The AI sidebar shows a static pre-call summary and a chat interface that requires manual typing — useless when the agent is on the phone. The agent has to remember lead history, doctor availability, and past interactions from memory. + +## Solution + +Stream the call's remote audio (customer voice) to the sidecar, transcribe via Deepgram Nova, and every 10 seconds feed the accumulated transcript + full lead context to OpenAI for real-time suggestions. Display a scrolling transcript with AI suggestion cards in the sidebar. + +## Architecture + +``` +Browser (WebRTC call) + │ + ├─ Remote audio track (customer) ──► AudioWorklet (PCM 16-bit, 16kHz) + │ │ + │ WebSocket to sidecar + │ │ + │ ┌──────────▼──────────┐ + │ │ Sidecar Gateway │ + │ │ ws://api/call-assist│ + │ └──────────┬──────────┘ + │ │ + │ ┌──────────────┼──────────────┐ + │ ▼ ▼ + │ Deepgram Nova WS Every 10s: OpenAI + │ (audio → text) (transcript + context + │ → suggestions) + │ │ │ + │ ▼ ▼ + │ Transcript lines AI suggestion cards + │ │ │ + │ └──────────────┬──────────────┘ + │ │ + │ WebSocket to browser + │ │ + └─────────────────────────────────────────▼ + AI Sidebar + (transcript + suggestions) +``` + +## Components + +### 1. Browser: Audio capture + WebSocket client + +**Audio capture**: When call becomes `active`, grab the remote audio track from the peer connection. Use an `AudioWorklet` to downsample to 16-bit PCM at 16kHz (Deepgram's preferred format). Send raw audio chunks (~100ms each) over WebSocket. + +**WebSocket client**: Connects to `wss://engage-api.srv1477139.hstgr.cloud/api/call-assist`. 
Sends: +- Initial message: `{ type: "start", ucid, leadId, callerPhone }` +- Audio chunks: binary PCM data +- End: `{ type: "stop" }` + +Receives: +- `{ type: "transcript", text: "...", isFinal: boolean }` — real-time transcript lines +- `{ type: "suggestion", text: "...", action?: "book_appointment" | "transfer" }` — AI suggestions +- `{ type: "context_loaded", leadName: "...", summary: "..." }` — confirmation that lead context was loaded + +### 2. Sidecar: WebSocket Gateway + +**NestJS WebSocket Gateway** at `/api/call-assist`. On connection: + +1. Receives `start` message with `ucid`, `leadId`, `callerPhone` +2. Loads lead context from platform: lead details, past calls, appointments, doctors, follow-ups +3. Opens Deepgram Nova WebSocket (`wss://api.deepgram.com/v1/listen`) +4. Pipes incoming audio chunks to Deepgram +5. Deepgram returns transcript chunks — forwards to browser +6. Every 10 seconds, sends accumulated transcript + lead context to OpenAI `gpt-4o-mini` for suggestions +7. Returns suggestions to browser + +**System prompt for OpenAI** (loaded once with lead context): +``` +You are a real-time call assistant for Global Hospital Bangalore. +You listen to the conversation and provide brief, actionable suggestions. 
+ +CALLER CONTEXT: +- Name: {leadName} +- Phone: {phone} +- Source: {source} ({campaign}) +- Previous calls: {callCount} (last: {lastCallDate}, disposition: {lastDisposition}) +- Appointments: {appointmentHistory} +- Interested in: {interestedService} +- AI Summary: {aiSummary} + +AVAILABLE RESOURCES: +- Doctors: {doctorList with departments and clinics} +- Next available slots: {availableSlots} + +RULES: +- Keep suggestions under 2 sentences +- Focus on actionable next steps +- If customer mentions a doctor/department, show available slots +- If customer wants to cancel, note the appointment ID +- Flag if customer sounds upset or mentions a complaint +- Do NOT repeat information the agent already said +``` + +**OpenAI call** (every 10 seconds): +```typescript +const response = await openai.chat.completions.create({ + model: 'gpt-4o-mini', + messages: [ + { role: 'system', content: systemPrompt }, + { role: 'user', content: `Conversation so far:\n${transcript}\n\nProvide a brief suggestion for the agent.` }, + ], + max_tokens: 150, +}); +``` + +### 3. Frontend: Live transcript sidebar + +Replace the AI chat tab content during active calls with a live transcript view: + +- Scrolling transcript with timestamps +- Customer lines in one color, suggestions in a highlighted card +- Auto-scroll to bottom as new lines arrive +- Suggestions appear as colored cards between transcript lines +- When call ends, transcript stays visible for reference during disposition + +### 4. Context loading + +On `start` message, the sidecar queries the platform for: +```graphql +# Lead details +{ leads(filter: { id: { eq: "{leadId}" } }) { edges { node { ... } } } } + +# Past appointments +{ appointments(filter: { patientId: { eq: "{leadId}" } }) { edges { node { ... } } } } + +# Doctors +{ doctors(first: 20) { edges { node { id fullName department clinic } } } } +``` + +This context is loaded once and injected into the system prompt. No mid-call refresh needed. 
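
The start/stop and transcript/suggestion/context messages listed in section 1 can be pinned down as a discriminated union. A sketch (type and helper names are illustrative, not taken from the codebase):

```typescript
// Wire protocol from section 1, written as TypeScript types (illustrative names).
type ClientMessage =
  | { type: 'start'; ucid: string; leadId?: string; callerPhone?: string }
  | { type: 'stop' };
// Audio chunks travel as raw binary PCM frames, not JSON.

type ServerMessage =
  | { type: 'transcript'; text: string; isFinal: boolean }
  | { type: 'suggestion'; text: string; action?: 'book_appointment' | 'transfer' }
  | { type: 'context_loaded'; leadName: string; summary: string };

// An exhaustive switch keeps the browser client honest about every server event.
function describeServerMessage(msg: ServerMessage): string {
  switch (msg.type) {
    case 'transcript':
      return msg.isFinal ? `final: ${msg.text}` : `interim: ${msg.text}`;
    case 'suggestion':
      return `suggestion: ${msg.text}`;
    case 'context_loaded':
      return `context loaded for ${msg.leadName}`;
  }
}
```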
+ +## File structure + +### Sidecar (helix-engage-server) +| File | Responsibility | +|------|---------------| +| `src/call-assist/call-assist.gateway.ts` | WebSocket gateway — handles audio streaming, Deepgram connection, OpenAI calls | +| `src/call-assist/call-assist.module.ts` | Module registration | +| `src/call-assist/call-assist.service.ts` | Context loading from platform, OpenAI prompt building | + +### Frontend (helix-engage) +| File | Responsibility | +|------|---------------| +| `src/lib/audio-capture.ts` | AudioWorklet to capture + downsample remote audio track | +| `src/hooks/use-call-assist.ts` | WebSocket connection to sidecar, manages transcript + suggestion state | +| `src/components/call-desk/live-transcript.tsx` | Scrolling transcript + suggestion cards UI | +| `src/components/call-desk/context-panel.tsx` | Modify: show LiveTranscript instead of AiChatPanel during active calls | +| `src/pages/call-desk.tsx` | Modify: remove CallPrepCard during active calls | + +## Dependencies + +- **Deepgram SDK**: `@deepgram/sdk` in sidecar (or raw WebSocket) +- **DEEPGRAM_API_KEY**: environment variable in sidecar +- **AudioWorklet**: browser API, no dependencies (supported in all modern browsers) +- **OpenAI**: already configured in sidecar (`gpt-4o-mini`) + +## Cost estimate + +Per 5-minute call: +- Deepgram Nova: ~$0.02 (at $0.0043/min) +- OpenAI gpt-4o-mini: ~$0.005 (30 calls × ~500 tokens each) +- Total: ~$0.025 per call (~₹2) + +## Out of scope + +- Agent mic transcription (only customer audio for now — agent's words are visible in the AI suggestions context) +- Voice response from AI (text only) +- Persistent transcript storage (future: save to Call record after call ends) +- Multi-language support (English only for now) diff --git a/public/helix-logo.png b/public/helix-logo.png new file mode 100644 index 0000000..72dcc1c Binary files /dev/null and b/public/helix-logo.png differ diff --git 
a/src/components/application/app-navigation/base-components/nav-account-card.tsx b/src/components/application/app-navigation/base-components/nav-account-card.tsx
index a627498..2d565b7 100644
--- a/src/components/application/app-navigation/base-components/nav-account-card.tsx
+++ b/src/components/application/app-navigation/base-components/nav-account-card.tsx
@@ -1,7 +1,14 @@
 import type { FC, HTMLAttributes } from "react";
 import { useCallback, useEffect, useRef } from "react";
 import type { Placement } from "@react-types/overlays";
-import { ChevronSelectorVertical, LogOut01, PhoneCall01, Settings01, User01 } from "@untitledui/icons";
+import { ChevronSelectorVertical } from "@untitledui/icons";
+import { FontAwesomeIcon } from "@fortawesome/react-fontawesome";
+import { faUser, faGear, faArrowRightFromBracket, faPhoneVolume } from "@fortawesome/pro-duotone-svg-icons";
+
+const IconUser: FC<{ className?: string }> = ({ className }) => <FontAwesomeIcon icon={faUser} className={className} />;
+const IconSettings: FC<{ className?: string }> = ({ className }) => <FontAwesomeIcon icon={faGear} className={className} />;
+const IconLogout: FC<{ className?: string }> = ({ className }) => <FontAwesomeIcon icon={faArrowRightFromBracket} className={className} />;
+const IconForceReady: FC<{ className?: string }> = ({ className }) => <FontAwesomeIcon icon={faPhoneVolume} className={className} />;
 import { useFocusManager } from "react-aria";
 import type { DialogProps as AriaDialogProps } from "react-aria-components";
 import { Button as AriaButton, Dialog as AriaDialog, DialogTrigger as AriaDialogTrigger, Popover as AriaPopover } from "react-aria-components";
@@ -67,14 +74,14 @@ export const NavAccountMenu = ({
 >
- - - + + +
- +
); diff --git a/src/components/call-desk/active-call-card.tsx b/src/components/call-desk/active-call-card.tsx index 76d6d34..465de61 100644 --- a/src/components/call-desk/active-call-card.tsx +++ b/src/components/call-desk/active-call-card.tsx @@ -3,6 +3,7 @@ import { FontAwesomeIcon } from '@fortawesome/react-fontawesome'; import { faPhone, faPhoneHangup, faMicrophone, faMicrophoneSlash, faPause, faPlay, faCalendarPlus, faCheckCircle, + faPhoneArrowRight, faRecordVinyl, } from '@fortawesome/pro-duotone-svg-icons'; import { Button } from '@/components/base/buttons/button'; import { Badge } from '@/components/base/badges/badges'; @@ -12,6 +13,7 @@ import { setOutboundPending } from '@/state/sip-manager'; import { useSip } from '@/providers/sip-provider'; import { DispositionForm } from './disposition-form'; import { AppointmentForm } from './appointment-form'; +import { TransferDialog } from './transfer-dialog'; import { formatPhone } from '@/lib/format'; import { apiClient } from '@/lib/api-client'; import { notify } from '@/lib/toast'; @@ -39,6 +41,8 @@ export const ActiveCallCard = ({ lead, callerPhone }: ActiveCallCardProps) => { const [savedDisposition, setSavedDisposition] = useState(null); const [appointmentOpen, setAppointmentOpen] = useState(false); const [appointmentBookedDuringCall, setAppointmentBookedDuringCall] = useState(false); + const [transferOpen, setTransferOpen] = useState(false); + const [recordingPaused, setRecordingPaused] = useState(false); // Capture direction at mount — survives through disposition stage const callDirectionRef = useRef(callState === 'ringing-out' ? 'OUTBOUND' : 'INBOUND'); @@ -248,11 +252,36 @@ export const ActiveCallCard = ({ lead, callerPhone }: ActiveCallCardProps) => { + +
+
+      {/* Transfer dialog */}
+      {transferOpen && callUcid && (
+        <TransferDialog
+          ucid={callUcid}
+          onClose={() => setTransferOpen(false)}
+          onTransferred={() => {
+            setTransferOpen(false);
+            hangup();
+            setPostCallStage('disposition');
+          }}
+        />
+      )}
+
       {/* Appointment form accessible during call */}
       <AppointmentForm
diff --git a/src/components/call-desk/context-panel.tsx b/src/components/call-desk/context-panel.tsx
--- a/src/components/call-desk/context-panel.tsx
+++ b/src/components/call-desk/context-panel.tsx
@@ ... @@
-export const ContextPanel = ({ selectedLead, activities, callerPhone }: ContextPanelProps) => {
+export const ContextPanel = ({ selectedLead, activities, callerPhone, isInCall, callUcid }: ContextPanelProps) => {
   const [activeTab, setActiveTab] = useState('ai');
 
   // Auto-switch to lead 360 when a lead is selected
@@ -25,6 +29,13 @@
   }
   }, [selectedLead?.id]);
 
+  const { transcript, suggestions, connected: assistConnected } = useCallAssist(
+    isInCall ?? false,
+    callUcid ?? null,
+    selectedLead?.id ?? null,
+    callerPhone ?? null,
+  );
+
   const callerContext = selectedLead ? {
     callerPhone: selectedLead.contactPhone?.[0]?.number ?? callerPhone,
     leadId: selectedLead.id,
@@ -64,9 +75,13 @@
         {/* Tab content */}
{activeTab === 'ai' && ( -
- -
+ isInCall ? ( + + ) : ( +
+ +
+ ) )} {activeTab === 'lead360' && ( diff --git a/src/components/call-desk/live-transcript.tsx b/src/components/call-desk/live-transcript.tsx new file mode 100644 index 0000000..47c247a --- /dev/null +++ b/src/components/call-desk/live-transcript.tsx @@ -0,0 +1,90 @@ +import { useEffect, useRef } from 'react'; +import { FontAwesomeIcon } from '@fortawesome/react-fontawesome'; +import { faSparkles, faMicrophone } from '@fortawesome/pro-duotone-svg-icons'; +import { cx } from '@/utils/cx'; + +type TranscriptLine = { + id: string; + text: string; + isFinal: boolean; + timestamp: Date; +}; + +type Suggestion = { + id: string; + text: string; + timestamp: Date; +}; + +type LiveTranscriptProps = { + transcript: TranscriptLine[]; + suggestions: Suggestion[]; + connected: boolean; +}; + +export const LiveTranscript = ({ transcript, suggestions, connected }: LiveTranscriptProps) => { + const scrollRef = useRef(null); + + useEffect(() => { + if (scrollRef.current) { + scrollRef.current.scrollTop = scrollRef.current.scrollHeight; + } + }, [transcript.length, suggestions.length]); + + // Merge transcript and suggestions by timestamp + const items = [ + ...transcript.map(t => ({ ...t, kind: 'transcript' as const })), + ...suggestions.map(s => ({ ...s, kind: 'suggestion' as const, isFinal: true })), + ].sort((a, b) => a.timestamp.getTime() - b.timestamp.getTime()); + + return ( +
+ {/* Header */} +
+ + Live Assist +
+
+ + {/* Transcript body */} +
+ {items.length === 0 && ( +
+ +
Listening to customer...
+
Transcript will appear here
+
+ )} + + {items.map(item => { + if (item.kind === 'suggestion') { + return ( +
+
+ + AI Suggestion +
+
{item.text}
+
+ ); + } + + return ( +
+ + {item.timestamp.toLocaleTimeString('en-IN', { hour: '2-digit', minute: '2-digit', second: '2-digit' })} + + {item.text} +
+ ); + })} +
+
+ ); +}; diff --git a/src/components/call-desk/phone-action-cell.tsx b/src/components/call-desk/phone-action-cell.tsx new file mode 100644 index 0000000..a1464b7 --- /dev/null +++ b/src/components/call-desk/phone-action-cell.tsx @@ -0,0 +1,150 @@ +import { useState, useRef, useEffect } from 'react'; +import { FontAwesomeIcon } from '@fortawesome/react-fontawesome'; +import { faPhone, faCommentDots, faEllipsisVertical, faMessageDots } from '@fortawesome/pro-duotone-svg-icons'; +import { useSetAtom } from 'jotai'; +import { useSip } from '@/providers/sip-provider'; +import { sipCallStateAtom, sipCallerNumberAtom, sipCallUcidAtom } from '@/state/sip-state'; +import { setOutboundPending } from '@/state/sip-manager'; +import { apiClient } from '@/lib/api-client'; +import { notify } from '@/lib/toast'; +import { cx } from '@/utils/cx'; + +type PhoneActionCellProps = { + phoneNumber: string; + displayNumber: string; + leadId?: string; +}; + +export const PhoneActionCell = ({ phoneNumber, displayNumber, leadId: _leadId }: PhoneActionCellProps) => { + const { isRegistered, isInCall } = useSip(); + const setCallState = useSetAtom(sipCallStateAtom); + const setCallerNumber = useSetAtom(sipCallerNumberAtom); + const setCallUcid = useSetAtom(sipCallUcidAtom); + const [menuOpen, setMenuOpen] = useState(false); + const [dialing, setDialing] = useState(false); + const menuRef = useRef(null); + const touchTimer = useRef(null); + + // Close menu on click outside + useEffect(() => { + if (!menuOpen) return; + const handleClick = (e: MouseEvent) => { + if (menuRef.current && !menuRef.current.contains(e.target as Node)) { + setMenuOpen(false); + } + }; + document.addEventListener('mousedown', handleClick); + return () => document.removeEventListener('mousedown', handleClick); + }, [menuOpen]); + + const handleCall = async () => { + if (!isRegistered || isInCall || dialing) return; + setMenuOpen(false); + setDialing(true); + setCallState('ringing-out'); + setCallerNumber(phoneNumber); + 
setOutboundPending(true); + const safetyTimer = setTimeout(() => setOutboundPending(false), 30000); + + try { + const result = await apiClient.post<{ ucid?: string }>('/api/ozonetel/dial', { phoneNumber }); + if (result?.ucid) setCallUcid(result.ucid); + } catch { + clearTimeout(safetyTimer); + setCallState('idle'); + setCallerNumber(null); + setOutboundPending(false); + setCallUcid(null); + notify.error('Dial Failed', 'Could not place the call'); + } finally { + setDialing(false); + } + }; + + const handleSms = () => { + setMenuOpen(false); + window.open(`sms:+91${phoneNumber}`, '_self'); + }; + + const handleWhatsApp = () => { + setMenuOpen(false); + window.open(`https://wa.me/91${phoneNumber}`, '_blank'); + }; + + // Long-press for mobile + const onTouchStart = () => { + touchTimer.current = window.setTimeout(() => setMenuOpen(true), 500); + }; + + const onTouchEnd = () => { + if (touchTimer.current) { + clearTimeout(touchTimer.current); + touchTimer.current = null; + } + }; + + const canCall = isRegistered && !isInCall && !dialing; + + return ( +
+ {/* Clickable phone number — calls directly */} + + + {/* Kebab menu trigger — desktop */} + + + {/* Context menu */} + {menuOpen && ( +
+ + + +
+ )} +
+ ); +}; diff --git a/src/components/call-desk/transfer-dialog.tsx b/src/components/call-desk/transfer-dialog.tsx new file mode 100644 index 0000000..c7113fa --- /dev/null +++ b/src/components/call-desk/transfer-dialog.tsx @@ -0,0 +1,91 @@ +import { useState } from 'react'; +import { FontAwesomeIcon } from '@fortawesome/react-fontawesome'; +import { faXmark } from '@fortawesome/pro-duotone-svg-icons'; +import { Input } from '@/components/base/input/input'; +import { Button } from '@/components/base/buttons/button'; +import { apiClient } from '@/lib/api-client'; +import { notify } from '@/lib/toast'; + +type TransferDialogProps = { + ucid: string; + onClose: () => void; + onTransferred: () => void; +}; + +export const TransferDialog = ({ ucid, onClose, onTransferred }: TransferDialogProps) => { + const [number, setNumber] = useState(''); + const [transferring, setTransferring] = useState(false); + const [stage, setStage] = useState<'input' | 'connected'>('input'); + + const handleConference = async () => { + if (!number.trim()) return; + setTransferring(true); + try { + await apiClient.post('/api/ozonetel/call-control', { + action: 'CONFERENCE', + ucid, + conferenceNumber: `0${number.replace(/\D/g, '')}`, + }); + notify.success('Connected', 'Third party connected. Click Complete to transfer.'); + setStage('connected'); + } catch { + notify.error('Transfer Failed', 'Could not connect to the target number'); + } finally { + setTransferring(false); + } + }; + + const handleComplete = async () => { + setTransferring(true); + try { + await apiClient.post('/api/ozonetel/call-control', { + action: 'KICK_CALL', + ucid, + conferenceNumber: `0${number.replace(/\D/g, '')}`, + }); + notify.success('Transferred', 'Call transferred successfully'); + onTransferred(); + } catch { + notify.error('Transfer Failed', 'Could not complete transfer'); + } finally { + setTransferring(false); + } + }; + + return ( +
+
+ Transfer Call + +
+ {stage === 'input' ? ( +
+ + +
+ ) : ( +
+ Connected to {number} + +
+ )} +
+ ); +}; diff --git a/src/components/call-desk/worklist-panel.tsx b/src/components/call-desk/worklist-panel.tsx index ec79408..1f9c72d 100644 --- a/src/components/call-desk/worklist-panel.tsx +++ b/src/components/call-desk/worklist-panel.tsx @@ -1,4 +1,4 @@ -import { useCallback, useMemo, useState } from 'react'; +import { useCallback, useEffect, useMemo, useRef, useState } from 'react'; import type { FC, HTMLAttributes } from 'react'; import { FontAwesomeIcon } from '@fortawesome/react-fontawesome'; import { @@ -10,8 +10,9 @@ import { Table } from '@/components/application/table/table'; import { Badge } from '@/components/base/badges/badges'; import { Input } from '@/components/base/input/input'; import { Tabs, TabList, Tab } from '@/components/application/tabs/tabs'; -import { ClickToCallButton } from './click-to-call-button'; +import { PhoneActionCell } from './phone-action-cell'; import { formatPhone } from '@/lib/format'; +import { notify } from '@/lib/toast'; import { cx } from '@/utils/cx'; type WorklistLead = { @@ -24,6 +25,10 @@ type WorklistLead = { interestedService: string | null; aiSummary: string | null; aiSuggestedAction: string | null; + lastContacted: string | null; + contactAttempts: number | null; + utmCampaign: string | null; + campaignId: string | null; }; type WorklistFollowUp = { @@ -42,6 +47,7 @@ type MissedCall = { callerNumber: { number: string; callingCode: string }[] | null; startedAt: string | null; leadId: string | null; + disposition: string | null; }; interface WorklistPanelProps { @@ -55,7 +61,6 @@ interface WorklistPanelProps { type TabKey = 'all' | 'missed' | 'callbacks' | 'follow-ups'; -// Unified row type for the table type WorklistRow = { id: string; type: 'missed' | 'callback' | 'follow-up' | 'lead'; @@ -70,6 +75,10 @@ type WorklistRow = { taskState: 'PENDING' | 'ATTEMPTED' | 'SCHEDULED'; leadId: string | null; originalLead: WorklistLead | null; + lastContactedAt: string | null; + contactAttempts: number; + source: string | 
null; + lastDisposition: string | null; }; const priorityConfig: Record = { @@ -87,9 +96,8 @@ const followUpLabel: Record = { REVIEW_REQUEST: 'Review', }; -// Compute SLA: minutes since created, color-coded -const computeSla = (createdAt: string): { label: string; color: 'success' | 'warning' | 'error' } => { - const minutes = Math.max(0, Math.round((Date.now() - new Date(createdAt).getTime()) / 60000)); +const computeSla = (dateStr: string): { label: string; color: 'success' | 'warning' | 'error' } => { + const minutes = Math.max(0, Math.round((Date.now() - new Date(dateStr).getTime()) / 60000)); if (minutes < 1) return { label: '<1m', color: 'success' }; if (minutes < 15) return { label: `${minutes}m`, color: 'success' }; if (minutes < 30) return { label: `${minutes}m`, color: 'warning' }; @@ -99,6 +107,30 @@ const computeSla = (createdAt: string): { label: string; color: 'success' | 'war return { label: `${Math.floor(hours / 24)}d`, color: 'error' }; }; +const formatTimeAgo = (dateStr: string): string => { + const minutes = Math.round((Date.now() - new Date(dateStr).getTime()) / 60000); + if (minutes < 1) return 'Just now'; + if (minutes < 60) return `${minutes}m ago`; + const hours = Math.floor(minutes / 60); + if (hours < 24) return `${hours}h ago`; + return `${Math.floor(hours / 24)}d ago`; +}; + +const formatDisposition = (disposition: string): string => + disposition.replace(/_/g, ' ').replace(/\b\w/g, c => c.toUpperCase()); + +const formatSource = (source: string): string => { + const map: Record = { + FACEBOOK_AD: 'Facebook', + GOOGLE_AD: 'Google', + WALK_IN: 'Walk-in', + REFERRAL: 'Referral', + WEBSITE: 'Website', + PHONE_INQUIRY: 'Phone', + }; + return map[source] ?? source.replace(/_/g, ' '); +}; + const IconInbound: FC> = ({ className }) => ( ); @@ -127,6 +159,10 @@ const buildRows = (missedCalls: MissedCall[], followUps: WorklistFollowUp[], lea taskState: 'PENDING', leadId: call.leadId, originalLead: null, + lastContactedAt: call.startedAt ?? 
call.createdAt, + contactAttempts: 0, + source: null, + lastDisposition: call.disposition ?? null, }); } @@ -149,6 +185,10 @@ const buildRows = (missedCalls: MissedCall[], followUps: WorklistFollowUp[], lea taskState: isOverdue ? 'PENDING' : (fu.followUpStatus === 'COMPLETED' ? 'ATTEMPTED' : 'SCHEDULED'), leadId: null, originalLead: null, + lastContactedAt: fu.scheduledAt ?? fu.createdAt ?? null, + contactAttempts: 0, + source: null, + lastDisposition: null, }); } @@ -171,25 +211,24 @@ const buildRows = (missedCalls: MissedCall[], followUps: WorklistFollowUp[], lea taskState: 'PENDING', leadId: lead.id, originalLead: lead, + lastContactedAt: lead.lastContacted ?? null, + contactAttempts: lead.contactAttempts ?? 0, + source: lead.leadSource ?? lead.utmCampaign ?? null, + lastDisposition: null, }); } - // Sort by priority (urgent first), then by creation time (oldest first) - rows.sort((a, b) => { + // Remove rows without a phone number — agent can't act on them + const actionableRows = rows.filter(r => r.phoneRaw); + + actionableRows.sort((a, b) => { const pa = priorityConfig[a.priority]?.sort ?? 2; const pb = priorityConfig[b.priority]?.sort ?? 
2; if (pa !== pb) return pa - pb; return new Date(a.createdAt).getTime() - new Date(b.createdAt).getTime(); }); - return rows; -}; - -const typeConfig: Record = { - missed: { color: 'error' }, - callback: { color: 'brand' }, - 'follow-up': { color: 'blue-light' }, - lead: { color: 'gray' }, + return actionableRows; }; export const WorklistPanel = ({ missedCalls, followUps, leads, loading, onSelectLead, selectedLeadId }: WorklistPanelProps) => { @@ -203,13 +242,10 @@ export const WorklistPanel = ({ missedCalls, followUps, leads, loading, onSelect const filteredRows = useMemo(() => { let rows = allRows; - - // Tab filter if (tab === 'missed') rows = rows.filter((r) => r.type === 'missed'); else if (tab === 'callbacks') rows = rows.filter((r) => r.type === 'callback'); else if (tab === 'follow-ups') rows = rows.filter((r) => r.type === 'follow-up'); - // Search filter if (search.trim()) { const q = search.toLowerCase(); rows = rows.filter( @@ -224,10 +260,18 @@ export const WorklistPanel = ({ missedCalls, followUps, leads, loading, onSelect const callbackCount = allRows.filter((r) => r.type === 'callback').length; const followUpCount = allRows.filter((r) => r.type === 'follow-up').length; + // Notification for new missed calls + const prevMissedCount = useRef(missedCount); + useEffect(() => { + if (missedCount > prevMissedCount.current && prevMissedCount.current > 0) { + notify.info('New Missed Call', `${missedCount - prevMissedCount.current} new missed call(s)`); + } + prevMissedCount.current = missedCount; + }, [missedCount]); + const PAGE_SIZE = 15; const [page, setPage] = useState(1); - // Reset page when filters change const handleTabChange = useCallback((key: TabKey) => { setTab(key); setPage(1); }, []); const handleSearch = useCallback((value: string) => { setSearch(value); setPage(1); }, []); @@ -262,7 +306,7 @@ export const WorklistPanel = ({ missedCalls, followUps, leads, loading, onSelect return (
- {/* Filter tabs + search — single row */} + {/* Filter tabs + search */}
handleTabChange(key as TabKey)}> @@ -294,28 +338,29 @@ export const WorklistPanel = ({ missedCalls, followUps, leads, loading, onSelect - - - + + {(row) => { const priority = priorityConfig[row.priority] ?? priorityConfig.NORMAL; - const sla = computeSla(row.createdAt); - const typeCfg = typeConfig[row.type]; + const sla = computeSla(row.lastContactedAt ?? row.createdAt); const isSelected = row.originalLead !== null && row.originalLead.id === selectedLeadId; + // Sub-line: last interaction context + const subLine = row.lastContactedAt + ? `${formatTimeAgo(row.lastContactedAt)}${row.lastDisposition ? ` — ${formatDisposition(row.lastDisposition)}` : ''}` + : row.reason || row.typeLabel; + return ( { - if (row.originalLead) { - onSelectLead(row.originalLead); - } + if (row.originalLead) onSelectLead(row.originalLead); }} > @@ -326,44 +371,46 @@ export const WorklistPanel = ({ missedCalls, followUps, leads, loading, onSelect
{row.direction === 'inbound' && ( - + )} {row.direction === 'outbound' && ( - + )} - - {row.name} - +
+ + {row.name} + + + {subLine} + +
- - {row.phone || '\u2014'} - + {row.phoneRaw ? ( + + ) : ( + No phone + )} - - {row.typeLabel} - + {row.source ? ( + + {formatSource(row.source)} + + ) : ( + + )} {sla.label} - -
- {row.phoneRaw ? ( - - ) : ( - No phone - )} -
-
); }} diff --git a/src/hooks/use-call-assist.ts b/src/hooks/use-call-assist.ts new file mode 100644 index 0000000..e03401f --- /dev/null +++ b/src/hooks/use-call-assist.ts @@ -0,0 +1,90 @@ +import { useEffect, useRef, useState, useCallback } from 'react'; +import { io, Socket } from 'socket.io-client'; +import { startAudioCapture, stopAudioCapture } from '@/lib/audio-capture'; +import { getSipClient } from '@/state/sip-manager'; + +const SIDECAR_URL = import.meta.env.VITE_SIDECAR_URL ?? 'http://localhost:4100'; + +type TranscriptLine = { + id: string; + text: string; + isFinal: boolean; + timestamp: Date; +}; + +type Suggestion = { + id: string; + text: string; + timestamp: Date; +}; + +export const useCallAssist = ( + active: boolean, + ucid: string | null, + leadId: string | null, + callerPhone: string | null, +) => { + const [transcript, setTranscript] = useState([]); + const [suggestions, setSuggestions] = useState([]); + const [connected, setConnected] = useState(false); + const socketRef = useRef(null); + const idCounter = useRef(0); + + const nextId = useCallback(() => `ca-${++idCounter.current}`, []); + + useEffect(() => { + if (!active || !ucid) return; + + const socket = io(`${SIDECAR_URL}/call-assist`); + socketRef.current = socket; + + socket.on('connect', () => { + setConnected(true); + socket.emit('call-assist:start', { ucid, leadId, callerPhone }); + + // Start capturing remote audio from the SIP session + const sipClient = getSipClient(); + const audioElement = sipClient?.getAudioElement(); + if (audioElement?.srcObject) { + startAudioCapture(audioElement.srcObject as MediaStream, (chunk) => { + socket.emit('call-assist:audio', chunk); + }); + } + }); + + socket.on('call-assist:transcript', (data: { text: string; isFinal: boolean }) => { + if (!data.text.trim()) return; + setTranscript(prev => { + if (!data.isFinal) { + const finals = prev.filter(l => l.isFinal); + return [...finals, { id: nextId(), text: data.text, isFinal: false, timestamp: new 
Date() }]; + } + const finals = prev.filter(l => l.isFinal); + return [...finals, { id: nextId(), text: data.text, isFinal: true, timestamp: new Date() }]; + }); + }); + + socket.on('call-assist:suggestion', (data: { text: string }) => { + setSuggestions(prev => [...prev, { id: nextId(), text: data.text, timestamp: new Date() }]); + }); + + socket.on('disconnect', () => setConnected(false)); + + return () => { + stopAudioCapture(); + socket.emit('call-assist:stop'); + socket.disconnect(); + socketRef.current = null; + setConnected(false); + }; + }, [active, ucid, leadId, callerPhone, nextId]); + + useEffect(() => { + if (!active) { + setTranscript([]); + setSuggestions([]); + } + }, [active]); + + return { transcript, suggestions, connected }; +}; diff --git a/src/hooks/use-worklist.ts b/src/hooks/use-worklist.ts index e632e96..9971e26 100644 --- a/src/hooks/use-worklist.ts +++ b/src/hooks/use-worklist.ts @@ -47,6 +47,8 @@ type WorklistLead = { isSpam: boolean | null; aiSummary: string | null; aiSuggestedAction: string | null; + lastContacted: string | null; + utmCampaign: string | null; }; type WorklistData = { diff --git a/src/lib/audio-capture.ts b/src/lib/audio-capture.ts new file mode 100644 index 0000000..becb4d0 --- /dev/null +++ b/src/lib/audio-capture.ts @@ -0,0 +1,45 @@ +type AudioChunkCallback = (chunk: ArrayBuffer) => void; + +let audioContext: AudioContext | null = null; +let mediaStreamSource: MediaStreamAudioSourceNode | null = null; +let scriptProcessor: ScriptProcessorNode | null = null; + +export function startAudioCapture(remoteStream: MediaStream, onChunk: AudioChunkCallback): void { + stopAudioCapture(); + + audioContext = new AudioContext({ sampleRate: 16000 }); + mediaStreamSource = audioContext.createMediaStreamSource(remoteStream); + + scriptProcessor = audioContext.createScriptProcessor(4096, 1, 1); + + scriptProcessor.onaudioprocess = (event) => { + const inputData = event.inputBuffer.getChannelData(0); + + // Convert Float32 to Int16 PCM 
+ const pcm = new Int16Array(inputData.length); + for (let i = 0; i < inputData.length; i++) { + const s = Math.max(-1, Math.min(1, inputData[i])); + pcm[i] = s < 0 ? s * 0x8000 : s * 0x7FFF; + } + + onChunk(pcm.buffer); + }; + + mediaStreamSource.connect(scriptProcessor); + scriptProcessor.connect(audioContext.destination); +} + +export function stopAudioCapture(): void { + if (scriptProcessor) { + scriptProcessor.disconnect(); + scriptProcessor = null; + } + if (mediaStreamSource) { + mediaStreamSource.disconnect(); + mediaStreamSource = null; + } + if (audioContext) { + audioContext.close().catch(() => {}); + audioContext = null; + } +} diff --git a/src/lib/sip-client.ts b/src/lib/sip-client.ts index 46480ed..4e27b67 100644 --- a/src/lib/sip-client.ts +++ b/src/lib/sip-client.ts @@ -211,6 +211,10 @@ export class SIPClient { return this.ua?.isRegistered() ?? false; } + getAudioElement(): HTMLAudioElement | null { + return this.audioElement; + } + private resetSession(): void { this.currentSession = null; this.cleanupAudio(); diff --git a/src/pages/call-desk.tsx b/src/pages/call-desk.tsx index 00f91e6..ed3e3b2 100644 --- a/src/pages/call-desk.tsx +++ b/src/pages/call-desk.tsx @@ -9,14 +9,14 @@ import { WorklistPanel } from '@/components/call-desk/worklist-panel'; import type { WorklistLead } from '@/components/call-desk/worklist-panel'; import { ContextPanel } from '@/components/call-desk/context-panel'; import { ActiveCallCard } from '@/components/call-desk/active-call-card'; -import { CallPrepCard } from '@/components/call-desk/call-prep-card'; + import { BadgeWithDot, Badge } from '@/components/base/badges/badges'; import { cx } from '@/utils/cx'; export const CallDeskPage = () => { const { user } = useAuth(); const { leadActivities } = useData(); - const { connectionStatus, isRegistered, callState, callerNumber } = useSip(); + const { connectionStatus, isRegistered, callState, callerNumber, callUcid } = useSip(); const { missedCalls, followUps, marketingLeads, 
totalPending, loading } = useWorklist(); const [selectedLead, setSelectedLead] = useState(null); const [contextOpen, setContextOpen] = useState(true); @@ -66,9 +66,8 @@ export const CallDeskPage = () => {
{/* Active call */} {isInCall && ( -
+
-
)} @@ -95,6 +94,8 @@ export const CallDeskPage = () => { selectedLead={activeLeadFull} activities={leadActivities} callerPhone={callerNumber ?? undefined} + isInCall={isInCall} + callUcid={callUcid} /> )}
diff --git a/src/pages/login.tsx b/src/pages/login.tsx index a844a8c..1b9e659 100644 --- a/src/pages/login.tsx +++ b/src/pages/login.tsx @@ -117,9 +117,7 @@ export const LoginPage = () => {
{/* Logo lockup */}
-
- H -
+ Helix Engage Helix Engage
diff --git a/src/styles/globals.css b/src/styles/globals.css index 2449136..c5bad04 100755 --- a/src/styles/globals.css +++ b/src/styles/globals.css @@ -29,6 +29,14 @@ transition-timing-function: inherit; } +/* FontAwesome duotone icon colors — uses brand tokens */ +:root { + --fa-primary-color: var(--color-fg-brand-primary); + --fa-secondary-color: var(--color-fg-brand-secondary); + --fa-primary-opacity: 1; + --fa-secondary-opacity: 0.4; +} + html, body { font-family: var(--font-body); diff --git a/src/styles/theme.css b/src/styles/theme.css index 91a03fa..3fbe8aa 100644 --- a/src/styles/theme.css +++ b/src/styles/theme.css @@ -351,18 +351,18 @@ --color-blue-light-900: rgb(11 74 111); --color-blue-light-950: rgb(6 44 65); - --color-blue-25: rgb(245 250 255); - --color-blue-50: rgb(239 248 255); - --color-blue-100: rgb(209 233 255); - --color-blue-200: rgb(178 221 255); - --color-blue-300: rgb(132 202 255); - --color-blue-400: rgb(83 177 253); - --color-blue-500: rgb(46 144 250); - --color-blue-600: rgb(21 112 239); - --color-blue-700: rgb(23 92 211); - --color-blue-800: rgb(24 73 169); - --color-blue-900: rgb(25 65 133); - --color-blue-950: rgb(16 42 86); + --color-blue-25: rgb(246 249 253); + --color-blue-50: rgb(235 243 250); + --color-blue-100: rgb(214 230 245); + --color-blue-200: rgb(178 207 235); + --color-blue-300: rgb(138 180 220); + --color-blue-400: rgb(96 150 200); + --color-blue-500: rgb(56 120 180); + --color-blue-600: rgb(32 96 160); + --color-blue-700: rgb(24 76 132); + --color-blue-800: rgb(18 60 108); + --color-blue-900: rgb(14 46 84); + --color-blue-950: rgb(8 28 56); --color-blue-dark-25: rgb(245 248 255); --color-blue-dark-50: rgb(239 244 255); @@ -758,8 +758,8 @@ --color-bg-brand-secondary: var(--color-brand-100); --color-bg-brand-solid: var(--color-brand-600); --color-bg-brand-solid_hover: var(--color-brand-700); - --color-bg-brand-section: var(--color-brand-800); - --color-bg-brand-section_subtle: var(--color-brand-700); + 
--color-bg-brand-section: var(--color-brand-600); + --color-bg-brand-section_subtle: var(--color-brand-500); /* COMPONENT COLORS */ --color-app-store-badge-border: rgb(166 166 166);
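
---

Reviewer note on `src/lib/audio-capture.ts`: the Float32-to-Int16 conversion inside `onaudioprocess` is easy to unit-test if pulled out into a pure helper. A minimal sketch — the helper name `floatTo16BitPCM` is illustrative, not taken from the diff:

```typescript
// Convert Web Audio Float32 samples (range [-1, 1]) to 16-bit signed PCM,
// mirroring the loop in src/lib/audio-capture.ts. Out-of-range samples are
// clamped before scaling, so the output saturates at the int16 extremes.
function floatTo16BitPCM(input: Float32Array): Int16Array {
  const pcm = new Int16Array(input.length);
  for (let i = 0; i < input.length; i++) {
    const s = Math.max(-1, Math.min(1, input[i]));
    // Asymmetric scaling: -1 maps to -32768, +1 maps to +32767.
    pcm[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
  }
  return pcm;
}

// Clipped inputs (±2) produce the same extremes as ±1:
const out = floatTo16BitPCM(new Float32Array([0, 1, -1, 2, -2]));
console.log(Array.from(out)); // values: 0, 32767, -32768, 32767, -32768
```

This keeps the streaming callback trivial (`onChunk(floatTo16BitPCM(inputData).buffer)`) and lets the clamping behavior be verified without an `AudioContext`.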