mirror of
https://dev.azure.com/globalhealthx/EMR/_git/helix-engage
synced 2026-04-11 10:23:27 +00:00
feat: CC agent features, live call assist, worklist redesign, brand tokens
CC Agent: - Call transfer (CONFERENCE + KICK_CALL) with inline transfer dialog - Recording pause/resume during active calls - Missed calls API (Ozonetel abandonCalls) - Call history API (Ozonetel fetchCDRDetails) Live Call Assist: - Deepgram Nova STT via raw WebSocket - OpenAI suggestions every 10s with lead context - LiveTranscript component in sidebar during calls - Browser audio capture from remote WebRTC stream Worklist: - Redesigned table: clickable phones, context menu (Call/SMS/WhatsApp) - Last interaction sub-line, source column, improved SLA - Filtered out rows without phone numbers - New missed call notifications Brand: - Logo on login page - Blue scale rebuilt from logo blue rgb(32, 96, 160) - FontAwesome duotone CSS variables set globally - Profile menu icons switched to duotone Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@@ -1,6 +1,43 @@
# Next Session — Outbound Call UI + Remaining Work

## Priority 0: Outbound Call — CloudAgent WebSocket Integration

**CRITICAL FINDING:** The Ozonetel toolbar's outbound dial works via TWO connections:

1. CloudAgent WebSocket (mdlConnection.php) — sends `tbManualDial` with `browserSessionId` + `usId`
2. SIP WebSocket (blr-pub-rtc4.ozonetel.com:444) — receives the SIP INVITE and auto-answers

The `browserSessionId` and `usId` come from the CloudAgent WebSocket session handshake. Without them, CloudAgent doesn't know which browser to route the SIP INVITE to.

**The toolbar's tbManualDial payload:**
```json
{
  "type": "tbManualDial",
  "ns": "ozonetel.cloudagent",
  "customer": "global_healthx",
  "agentId": "global",
  "agentUniqId": 374804,
  "browserSessionId": "e15cd447-...", // FROM WEBSOCKET SESSION
  "usId": "af7hkcT3BcwCG-g=",         // FROM WEBSOCKET SESSION
  "isSip": "true",
  "mode": "manual",
  "params": "312792,9949879837,523591,SIP:true", // campaignId,phone,sipExt,SIP:true
  "utid": 57
}
```
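Once the session handshake yields `usId` and `browserSessionId`, the payload can be assembled mechanically. A minimal sketch — the helper name and option shape are ours, the field values come from the capture above:

```typescript
type ManualDialSession = { browserSessionId: string; usId: string };

// Builds the tbManualDial message the toolbar sends over the CloudAgent
// WebSocket. `params` is the comma-joined "campaignId,phone,sipExt,SIP:true".
const buildManualDial = (
  session: ManualDialSession,
  opts: { campaignId: string; phone: string; sipExt: string; utid: number },
) => ({
  type: 'tbManualDial',
  ns: 'ozonetel.cloudagent',
  customer: 'global_healthx',
  agentId: 'global',
  agentUniqId: 374804,
  browserSessionId: session.browserSessionId,
  usId: session.usId,
  isSip: 'true',
  mode: 'manual',
  params: `${opts.campaignId},${opts.phone},${opts.sipExt},SIP:true`,
  utid: opts.utid,
});
```

The message would go out with `ws.send(JSON.stringify(buildManualDial(...)))` once the CloudAgent socket has reported a session.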
**What we need to do:**

1. Connect to CloudAgent WebSocket from our browser (same mdlConnection.php endpoint)
2. Establish session → get `usId` and `browserSessionId`
3. Include these in `tbManualDial` requests
4. CloudAgent will then send SIP INVITE to our JsSIP

**The toolbar's SIP service code** was pasted earlier in the conversation. Key function: `handleSipAutoAnswer()`, which auto-answers based on the agent's `autoAnswer` setting (0 = none, 1 = all, 2 = inbound, 3 = outbound).
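The setting-to-behavior mapping is easy to get backwards; a sketch of the decision (function name ours, semantics as described above):

```typescript
type CallDirection = 'inbound' | 'outbound';

// autoAnswer: 0 = none, 1 = all, 2 = inbound only, 3 = outbound only
const shouldAutoAnswer = (autoAnswer: number, direction: CallDirection): boolean => {
  if (autoAnswer === 1) return true;
  if (autoAnswer === 2) return direction === 'inbound';
  if (autoAnswer === 3) return direction === 'outbound';
  return false;
};
```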
**SIP config from toolbar:** password = extension number (523590), registrar = `sip:blr-pub-rtc4.ozonetel.com`, session_timers = false. Same as what we have.

**Kookoo approach is abandoned** — `<dial>` only works with PSTN numbers, not SIP extensions.

## Priority 0 (OLD): Kookoo Dial to SIP Extension

**Status:** Kookoo IVR endpoint works. When the customer answers, Kookoo hits /kookoo/ivr and we respond with `<dial>523590</dial>`. But Kookoo tries to call 523590 as a PSTN number — status=not_answered.
435
docs/superpowers/plans/2026-03-20-worklist-ux-redesign.md
Normal file
@@ -0,0 +1,435 @@
# Worklist UX Redesign — Implementation Plan

> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Redesign the call desk worklist table for faster agent action — clickable phone numbers, last interaction context, campaign tags, context menus for SMS/WhatsApp, and meaningful SLA indicators.

**Architecture:** All changes are frontend-only. The data model already has everything needed (`lastContacted`, `contactAttempts`, `source`, `utmCampaign`, `interestedService`, `disposition` on calls). We enrich the worklist rows with this data and redesign the table columns.

**Tech Stack:** React 19, Untitled UI components, FontAwesome Pro Duotone icons, Jotai

---
## Current problems

1. Phone column is passive text — separate Call button in Actions column wastes space
2. No last interaction context — agent doesn't know what happened before
3. No campaign/source — agent can't personalize the opening
4. SLA shows time since creation, not time since last contact
5. Rows without phone numbers are dead weight
6. No way to SMS or WhatsApp from the worklist

## Column redesign

| Before | After |
|--------|-------|
| PRIORITY \| PATIENT \| PHONE \| TYPE \| SLA \| ACTIONS | PRIORITY \| PATIENT \| PHONE \| SOURCE \| SLA |

- **PRIORITY** — badge, same as now
- **PATIENT** — name + sub-line: last interaction context ("Called 2h ago — Info Provided") or interested service
- **PHONE** — clickable number with phone icon. Hover shows context menu (Call / SMS / WhatsApp). On mobile, long-press shows the same menu. No separate Actions column.
- **SOURCE** — campaign/source tag (e.g., "Facebook", "Google", "Walk-in")
- **SLA** — time since `lastContacted` (not `createdAt`). Falls back to `createdAt` if never contacted.
## File map

| File | Responsibility | Action |
|------|---------------|--------|
| `src/components/call-desk/worklist-panel.tsx` | Worklist table + tabs | Modify: redesign columns, add phone context menu, enrich rows |
| `src/components/call-desk/phone-action-cell.tsx` | Clickable phone with context menu | Create: encapsulates call/SMS/WhatsApp actions |
| `src/hooks/use-worklist.ts` | Worklist data fetching | Modify: pass through `lastContacted`, `source`, `utmCampaign` fields |

---
## Task 1: Enrich worklist data with last interaction and source

Pass through the additional fields that already exist in the Lead data but aren't currently used in the worklist row.

**Files:**

- Modify: `helix-engage/src/components/call-desk/worklist-panel.tsx`

- [ ] **Step 1: Extend WorklistLead type in worklist-panel**

Add fields that are already returned by the hook but not typed:

```typescript
type WorklistLead = {
  id: string;
  createdAt: string;
  contactName: { firstName: string; lastName: string } | null;
  contactPhone: { number: string; callingCode: string }[] | null;
  leadSource: string | null;
  leadStatus: string | null;
  interestedService: string | null;
  aiSummary: string | null;
  aiSuggestedAction: string | null;
  // New fields (already in API response)
  lastContacted: string | null;
  contactAttempts: number | null;
  utmCampaign: string | null;
  campaignId: string | null;
};
```
- [ ] **Step 2: Extend WorklistRow with new fields**

```typescript
type WorklistRow = {
  // ... existing fields ...
  lastContactedAt: string | null;
  contactAttempts: number;
  source: string | null; // leadSource or utmCampaign
  lastDisposition: string | null;
};
```
- [ ] **Step 3: Populate new fields in buildRows**

For leads:

```typescript
rows.push({
  // ... existing ...
  lastContactedAt: lead.lastContacted ?? null,
  contactAttempts: lead.contactAttempts ?? 0,
  source: lead.leadSource ?? lead.utmCampaign ?? null,
  lastDisposition: null,
});
```

For missed calls:

```typescript
rows.push({
  // ... existing ...
  lastContactedAt: call.startedAt ?? call.createdAt,
  contactAttempts: 0,
  source: null,
  lastDisposition: call.disposition ?? null,
});
```

For follow-ups:

```typescript
rows.push({
  // ... existing ...
  lastContactedAt: fu.scheduledAt ?? fu.createdAt ?? null,
  contactAttempts: 0,
  source: null,
  lastDisposition: null,
});
```
- [ ] **Step 4: Update MissedCall type to include disposition**

The hook already returns `disposition` but the worklist panel type doesn't have it:

```typescript
type MissedCall = {
  // ... existing ...
  disposition: string | null;
};
```

- [ ] **Step 5: Commit**

```
feat: enrich worklist rows with last interaction and source data
```
---

## Task 2: Create PhoneActionCell component

A reusable cell that shows the phone number as a clickable element with a context menu for Call, SMS, and WhatsApp.

**Files:**

- Create: `helix-engage/src/components/call-desk/phone-action-cell.tsx`
- [ ] **Step 1: Create the component**

```typescript
import { useState, useRef } from 'react';
import { FontAwesomeIcon } from '@fortawesome/react-fontawesome';
import { faPhone, faComment, faEllipsisVertical } from '@fortawesome/pro-duotone-svg-icons';
import type { FC, HTMLAttributes } from 'react';
import { useSip } from '@/providers/sip-provider';
import { useSetAtom } from 'jotai';
import { sipCallStateAtom, sipCallerNumberAtom, sipCallUcidAtom } from '@/state/sip-state';
import { setOutboundPending } from '@/state/sip-manager';
import { apiClient } from '@/lib/api-client';
import { notify } from '@/lib/toast';
import { cx } from '@/utils/cx';

type PhoneActionCellProps = {
  phoneNumber: string;
  displayNumber: string;
  leadId?: string;
};
```
The component renders:

- The formatted phone number as clickable text (triggers call on click)
- A small kebab menu icon (⋮) on hover that opens a popover with:
  - 📞 Call
  - 💬 SMS (opens `sms:` link)
  - 📱 WhatsApp (opens `https://wa.me/{number}`)
- On mobile: long-press on the phone number opens the same menu

Implementation:

- Use a simple `useState` for menu open/close
- Position the menu absolutely below the phone number
- Click outside closes it
- The Call action uses the same logic as ClickToCallButton (setCallState, setCallerNumber, setOutboundPending, apiClient.post dial)
- SMS opens `sms:+91${phoneNumber}`
- WhatsApp opens `https://wa.me/91${phoneNumber}` in a new tab
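The two links differ only in prefix handling (`+` for `sms:`, bare digits for `wa.me`). A sketch of the builders, assuming Indian 10-digit numbers as the plan does (helper names ours):

```typescript
// Strip formatting and keep the last 10 digits (local Indian number)
const normalizePhone = (raw: string): string => raw.replace(/\D/g, '').slice(-10);

// sms: links take the country code with a '+' prefix
const smsLink = (raw: string): string => `sms:+91${normalizePhone(raw)}`;

// wa.me wants the country code with no '+' prefix
const whatsAppLink = (raw: string): string => `https://wa.me/91${normalizePhone(raw)}`;
```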
- [ ] **Step 2: Handle long-press for mobile**

Add `onContextMenu` (prevents default) and `onTouchStart`/`onTouchEnd` for 500ms long-press detection:

```typescript
const touchTimer = useRef<number | null>(null);

const onTouchStart = () => {
  touchTimer.current = window.setTimeout(() => {
    setMenuOpen(true);
  }, 500);
};

const onTouchEnd = () => {
  if (touchTimer.current) {
    clearTimeout(touchTimer.current);
    touchTimer.current = null;
  }
};
```
- [ ] **Step 3: Commit**

```
feat: create PhoneActionCell with call/SMS/WhatsApp context menu
```

---

## Task 3: Redesign the worklist table columns

Replace the current 6-column layout with the new 5-column layout.

**Files:**

- Modify: `helix-engage/src/components/call-desk/worklist-panel.tsx`

- [ ] **Step 1: Import PhoneActionCell**

```typescript
import { PhoneActionCell } from './phone-action-cell';
```
- [ ] **Step 2: Replace table headers**

```typescript
<Table.Header>
  <Table.Head label="PRIORITY" className="w-20" isRowHeader />
  <Table.Head label="PATIENT" />
  <Table.Head label="PHONE" />
  <Table.Head label="SOURCE" className="w-28" />
  <Table.Head label="SLA" className="w-24" />
</Table.Header>
```
- [ ] **Step 3: Redesign PATIENT cell with sub-line**

```typescript
<Table.Cell>
  <div className="flex items-center gap-2">
    {row.direction === 'inbound' && (
      <IconInbound className="size-3.5 text-fg-success-secondary shrink-0" />
    )}
    {row.direction === 'outbound' && (
      <IconOutbound className="size-3.5 text-fg-brand-secondary shrink-0" />
    )}
    <div className="min-w-0">
      <span className="text-sm font-medium text-primary truncate block max-w-[180px]">
        {row.name}
      </span>
      <span className="text-xs text-tertiary truncate block max-w-[180px]">
        {row.lastContactedAt
          ? `${formatTimeAgo(row.lastContactedAt)}${row.lastDisposition ? ` — ${formatDisposition(row.lastDisposition)}` : ''}`
          : row.reason || row.typeLabel}
      </span>
    </div>
  </div>
</Table.Cell>
```
- [ ] **Step 4: Replace PHONE cell with PhoneActionCell**

```typescript
<Table.Cell>
  {row.phoneRaw ? (
    <PhoneActionCell
      phoneNumber={row.phoneRaw}
      displayNumber={row.phone}
      leadId={row.leadId ?? undefined}
    />
  ) : (
    <span className="text-xs text-quaternary italic">No phone</span>
  )}
</Table.Cell>
```
- [ ] **Step 5: Add SOURCE cell**

```typescript
<Table.Cell>
  {row.source ? (
    <span className="text-xs text-tertiary truncate block max-w-[100px]">
      {formatSource(row.source)}
    </span>
  ) : (
    <span className="text-xs text-quaternary">—</span>
  )}
</Table.Cell>
```
- [ ] **Step 6: Update SLA to use lastContacted**

Change `computeSla` to accept a `lastContactedAt` fallback:

```typescript
const sla = computeSla(row.lastContactedAt ?? row.createdAt);
```
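`computeSla` itself isn't reproduced in this plan; a sketch consistent with the thresholds listed under Task 5 (green < 15m, amber 15-30m, red > 30m) — type and parameter names are ours:

```typescript
type SlaStatus = 'green' | 'amber' | 'red';

// Minutes elapsed since the reference timestamp decide the SLA color
const computeSla = (sinceIso: string, now: number = Date.now()): SlaStatus => {
  const minutes = (now - new Date(sinceIso).getTime()) / 60000;
  if (minutes < 15) return 'green';
  if (minutes <= 30) return 'amber';
  return 'red';
};
```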
- [ ] **Step 7: Remove ACTIONS column and TYPE column**

The TYPE info moves to the tab filter (already there) and the badge on the patient sub-line. The ACTIONS column is replaced by the clickable phone.

- [ ] **Step 8: Add helper functions**

```typescript
const formatTimeAgo = (dateStr: string): string => {
  const minutes = Math.round((Date.now() - new Date(dateStr).getTime()) / 60000);
  if (minutes < 1) return 'Just now';
  if (minutes < 60) return `${minutes}m ago`;
  const hours = Math.floor(minutes / 60);
  if (hours < 24) return `${hours}h ago`;
  return `${Math.floor(hours / 24)}d ago`;
};

const formatDisposition = (disposition: string): string => {
  return disposition.replace(/_/g, ' ').replace(/\b\w/g, c => c.toUpperCase());
};

const formatSource = (source: string): string => {
  const map: Record<string, string> = {
    FACEBOOK_AD: 'Facebook',
    GOOGLE_AD: 'Google',
    WALK_IN: 'Walk-in',
    REFERRAL: 'Referral',
    WEBSITE: 'Website',
    PHONE_INQUIRY: 'Phone',
  };
  return map[source] ?? source.replace(/_/g, ' ');
};
```
- [ ] **Step 9: Remove ClickToCallButton import**

No longer needed in the worklist panel — PhoneActionCell handles it.

- [ ] **Step 10: Commit**

```
feat: redesign worklist table with clickable phones and interaction context
```
---

## Task 4: Add notification badges for new items

When new missed calls or follow-ups arrive (detected via the 30-second refresh), show a visual indicator.

**Files:**

- Modify: `helix-engage/src/components/call-desk/worklist-panel.tsx`

- [ ] **Step 1: Track previous counts to detect new items**

```typescript
const [prevMissedCount, setPrevMissedCount] = useState(missedCount);

useEffect(() => {
  if (missedCount > prevMissedCount && prevMissedCount > 0) {
    notify.info('New Missed Call', `${missedCount - prevMissedCount} new missed call(s)`);
  }
  setPrevMissedCount(missedCount);
}, [missedCount, prevMissedCount]);
```
- [ ] **Step 2: Add pulsing dot to tab badges when new items exist**

In the tab items, add a visual indicator for tabs with urgent items:

```typescript
const tabItems = [
  { id: 'all' as const, label: 'All Tasks', badge: allRows.length > 0 ? String(allRows.length) : undefined },
  { id: 'missed' as const, label: 'Missed Calls', badge: missedCount > 0 ? String(missedCount) : undefined, hasNew: missedCount > prevMissedCount },
  // ...
];
```

The Tab component already supports badges. For the "new" indicator, append a small red dot after the badge number using a custom render if needed.

- [ ] **Step 3: Commit**

```
feat: add notification for new missed calls in worklist
```
---

## Task 5: Deploy and verify

- [ ] **Step 1: Type check**

```bash
cd helix-engage && npx tsc --noEmit
```

- [ ] **Step 2: Build and deploy**

```bash
VITE_API_URL=https://engage-api.srv1477139.hstgr.cloud \
VITE_SIP_URI=sip:523590@blr-pub-rtc4.ozonetel.com \
VITE_SIP_PASSWORD=523590 \
VITE_SIP_WS_SERVER=wss://blr-pub-rtc4.ozonetel.com:444 \
npm run build
```
- [ ] **Step 3: Test clickable phone**

1. Hover over a phone number — kebab menu icon appears
2. Click phone number directly — places outbound call
3. Click kebab → SMS — opens SMS app
4. Click kebab → WhatsApp — opens WhatsApp web
5. On mobile: long-press phone number — context menu appears

- [ ] **Step 4: Test last interaction context**

1. Leads with `lastContacted` show "2h ago — Info Provided" sub-line
2. Leads without `lastContacted` show interested service or type
3. Missed calls show "Missed at 2:30 PM"

- [ ] **Step 5: Test SLA**

1. SLA shows time since last contact (not creation)
2. Green < 15m, amber 15-30m, red > 30m

---
## Notes

- **No schema changes needed** — all data is already available from the platform
- **ClickToCallButton stays** — it's still used in the active call card for the ringing-out End Call button. Only the worklist replaces it with PhoneActionCell.
- **WhatsApp link format** — `https://wa.me/91XXXXXXXXXX` (no + prefix, includes country code)
- **SMS link format** — `sms:+91XXXXXXXXXX` (with + prefix)
- **The TYPE column is removed** — the tab filter already categorizes by type, and the patient sub-line shows context. Adding a TYPE badge to each row is redundant.
- **Filter out no-phone follow-ups** — optional future improvement. For now, show "No phone" in italic, which makes it clear the agent can't call.
480
docs/superpowers/plans/2026-03-21-cc-agent-features.md
Normal file
@@ -0,0 +1,480 @@
# CC Agent Features — Phase 1 Implementation Plan

> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Add call transfer, recording pause, and missed call queue to the CC agent's call desk — the three most impactful features for daily workflow.

**Architecture:** Three new service methods in the NestJS sidecar (callControl, pauseRecording, getAbandonCalls), exposed via REST endpoints. Frontend adds Transfer and Pause Recording buttons to the active call card, and a missed call queue that pulls from Ozonetel instead of our webhook-created records.

**Tech Stack:** NestJS sidecar (Ozonetel Token Auth APIs), React 19 + Jotai + Untitled UI

**Ozonetel API endpoints used:**

- Call Control: `POST /ca_apis/CallControl_V4` — Token auth — CONFERENCE, HOLD, UNHOLD, MUTE, UNMUTE, KICK_CALL
- Recording: `GET /CAServices/Call/Record.php` — apiKey in query string — pause/unPause
- Abandon Calls: `GET /ca_apis/abandonCalls` — Token auth — missed calls list

---
## File Map

### Sidecar (helix-engage-server)

| File | Action |
|------|--------|
| `src/ozonetel/ozonetel-agent.service.ts` | Modify: add `callControl()`, `pauseRecording()`, `getAbandonCalls()` |
| `src/ozonetel/ozonetel-agent.controller.ts` | Modify: add `POST /api/ozonetel/call-control`, `POST /api/ozonetel/recording`, `GET /api/ozonetel/missed-calls` |

### Frontend (helix-engage)

| File | Action |
|------|--------|
| `src/components/call-desk/active-call-card.tsx` | Modify: add Transfer button + transfer input, Pause Recording button |
| `src/components/call-desk/transfer-dialog.tsx` | Create: inline transfer form (enter number, confirm) |
| `src/hooks/use-worklist.ts` | Modify: fetch missed calls from Ozonetel API instead of platform |
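The `use-worklist.ts` change isn't spelled out elsewhere in this excerpt; a minimal sketch of mapping an Ozonetel abandon-call row (field names from the `getAbandonCalls()` return type in Task 1) into a worklist-shaped record — the helper and the target type are ours:

```typescript
// Subset of the abandonCalls response fields we actually need
type AbandonCall = { monitorUCID: string; callerID: string; callTime: string; campaign: string };

type MissedCallRow = { id: string; phone: string; startedAt: string; campaign: string };

// Ozonetel reports callTime as a timestamp string; pass it through as-is
const toMissedCallRow = (call: AbandonCall): MissedCallRow => ({
  id: call.monitorUCID,
  phone: call.callerID,
  startedAt: call.callTime,
  campaign: call.campaign,
});
```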
---
## Task 1: Add call control service methods

Three new methods in the Ozonetel service: `callControl()` (generic), `pauseRecording()`, and `getAbandonCalls()`.

**Files:**

- Modify: `helix-engage-server/src/ozonetel/ozonetel-agent.service.ts`

- [ ] **Step 1: Add `callControl()` method**

```typescript
async callControl(params: {
  action: 'CONFERENCE' | 'HOLD' | 'UNHOLD' | 'MUTE' | 'UNMUTE' | 'KICK_CALL';
  ucid: string;
  conferenceNumber?: string;
}): Promise<{ status: string; message: string; ucid?: string }> {
  const url = `https://${this.apiDomain}/ca_apis/CallControl_V4`;
  const did = process.env.OZONETEL_DID ?? '918041763265';
  const agentPhoneName = process.env.OZONETEL_SIP_ID ?? '523590';

  this.logger.log(`Call control: action=${params.action} ucid=${params.ucid} conference=${params.conferenceNumber ?? 'none'}`);

  try {
    const token = await this.getToken();
    const body: Record<string, string> = {
      userName: this.accountId,
      action: params.action,
      ucid: params.ucid,
      did,
      agentPhoneName,
    };
    if (params.conferenceNumber) {
      body.conferenceNumber = params.conferenceNumber;
    }

    const response = await axios.post(url, body, {
      headers: {
        Authorization: `Bearer ${token}`,
        'Content-Type': 'application/json',
      },
    });

    this.logger.log(`Call control response: ${JSON.stringify(response.data)}`);
    return response.data;
  } catch (error: any) {
    const responseData = error?.response?.data ? JSON.stringify(error.response.data) : '';
    this.logger.error(`Call control failed: ${error.message} ${responseData}`);
    throw error;
  }
}
```
- [ ] **Step 2: Add `pauseRecording()` method**

This uses apiKey in query params, not token auth:

```typescript
async pauseRecording(params: {
  ucid: string;
  action: 'pause' | 'unPause';
}): Promise<{ status: string; message: string }> {
  const url = `https://${this.apiDomain}/CAServices/Call/Record.php`;

  this.logger.log(`Recording ${params.action}: ucid=${params.ucid}`);

  try {
    const response = await axios.get(url, {
      params: {
        userName: this.accountId,
        apiKey: this.apiKey,
        action: params.action,
        ucid: params.ucid,
      },
    });

    this.logger.log(`Recording control response: ${JSON.stringify(response.data)}`);
    return response.data;
  } catch (error: any) {
    const responseData = error?.response?.data ? JSON.stringify(error.response.data) : '';
    this.logger.error(`Recording control failed: ${error.message} ${responseData}`);
    throw error;
  }
}
```
- [ ] **Step 3: Add `getAbandonCalls()` method**

```typescript
async getAbandonCalls(params?: {
  fromTime?: string;
  toTime?: string;
  campaignName?: string;
}): Promise<Array<{
  monitorUCID: string;
  type: string;
  status: string;
  campaign: string;
  callerID: string;
  did: string;
  agentID: string;
  agent: string;
  hangupBy: string;
  callTime: string;
}>> {
  const url = `https://${this.apiDomain}/ca_apis/abandonCalls`;

  this.logger.log('Fetching abandon calls');

  try {
    const token = await this.getToken();
    const body: Record<string, string> = { userName: this.accountId };
    if (params?.fromTime) body.fromTime = params.fromTime;
    if (params?.toTime) body.toTime = params.toTime;
    if (params?.campaignName) body.campaignName = params.campaignName;

    const response = await axios.get(url, {
      headers: {
        Authorization: `Bearer ${token}`,
        'Content-Type': 'application/json',
      },
      // abandonCalls takes its filters as a JSON body even on GET;
      // axios only transmits `data` for GET in the Node http adapter
      data: body,
    });

    const data = response.data;
    if (data.status === 'success' && Array.isArray(data.message)) {
      return data.message;
    }
    return [];
  } catch (error: any) {
    this.logger.error(`Abandon calls failed: ${error.message}`);
    return [];
  }
}
```
- [ ] **Step 4: Type check and commit**

```bash
cd helix-engage-server && npx tsc --noEmit
```

```
feat: add call control, recording pause, and abandon calls to Ozonetel service
```
---

## Task 2: Add sidecar REST endpoints

**Files:**

- Modify: `helix-engage-server/src/ozonetel/ozonetel-agent.controller.ts`
- [ ] **Step 1: Add `POST /api/ozonetel/call-control`**

```typescript
@Post('call-control')
async callControl(
  @Body() body: {
    action: 'CONFERENCE' | 'HOLD' | 'UNHOLD' | 'MUTE' | 'UNMUTE' | 'KICK_CALL';
    ucid: string;
    conferenceNumber?: string;
  },
) {
  if (!body.action || !body.ucid) {
    throw new HttpException('action and ucid required', 400);
  }
  if (body.action === 'CONFERENCE' && !body.conferenceNumber) {
    throw new HttpException('conferenceNumber required for CONFERENCE action', 400);
  }

  this.logger.log(`Call control: ${body.action} ucid=${body.ucid}`);

  try {
    const result = await this.ozonetelAgent.callControl(body);
    return result;
  } catch (error: any) {
    const message = error.response?.data?.message ?? error.message ?? 'Call control failed';
    throw new HttpException(message, error.response?.status ?? 502);
  }
}
```
- [ ] **Step 2: Add `POST /api/ozonetel/recording`**

```typescript
@Post('recording')
async recording(
  @Body() body: { ucid: string; action: 'pause' | 'unPause' },
) {
  if (!body.ucid || !body.action) {
    throw new HttpException('ucid and action required', 400);
  }

  try {
    const result = await this.ozonetelAgent.pauseRecording(body);
    return result;
  } catch (error: any) {
    const message = error.response?.data?.message ?? error.message ?? 'Recording control failed';
    throw new HttpException(message, error.response?.status ?? 502);
  }
}
```
- [ ] **Step 3: Add `GET /api/ozonetel/missed-calls`**

Import `Get` from `@nestjs/common`:

```typescript
@Get('missed-calls')
async missedCalls() {
  const result = await this.ozonetelAgent.getAbandonCalls();
  return result;
}
```
- [ ] **Step 4: Type check and commit**

```bash
cd helix-engage-server && npx tsc --noEmit
```

```
feat: add call control, recording, and missed calls REST endpoints
```
---

## Task 3: Add Transfer and Pause Recording to active call UI

During an active call, the agent gets two new buttons:

- **Transfer** — opens an inline input for the transfer number, then does CONFERENCE + KICK_CALL
- **Pause Rec** — toggles recording pause

**Files:**

- Create: `helix-engage/src/components/call-desk/transfer-dialog.tsx`
- Modify: `helix-engage/src/components/call-desk/active-call-card.tsx`
- [ ] **Step 1: Create transfer-dialog.tsx**

A simple inline form: text input for phone number + "Transfer" button. On submit, calls the sidecar's call-control endpoint twice: CONFERENCE (dial the target), then after confirming, KICK_CALL (drop the agent).

```typescript
import { useState } from 'react';
import { FontAwesomeIcon } from '@fortawesome/react-fontawesome';
import { faPhoneArrowRight, faXmark } from '@fortawesome/pro-duotone-svg-icons';
import { Input } from '@/components/base/input/input';
import { Button } from '@/components/base/buttons/button';
import { apiClient } from '@/lib/api-client';
import { notify } from '@/lib/toast';

type TransferDialogProps = {
  ucid: string;
  onClose: () => void;
  onTransferred: () => void;
};

export const TransferDialog = ({ ucid, onClose, onTransferred }: TransferDialogProps) => {
  const [number, setNumber] = useState('');
  const [transferring, setTransferring] = useState(false);
  const [stage, setStage] = useState<'input' | 'connected'>('input');

  const handleConference = async () => {
    if (!number.trim()) return;
    setTransferring(true);
    try {
      // Add the target to the conference
      await apiClient.post('/api/ozonetel/call-control', {
        action: 'CONFERENCE',
        ucid,
        conferenceNumber: `0${number.replace(/\D/g, '')}`,
      });
      notify.success('Connected', 'Third party connected. Click Complete to transfer.');
      setStage('connected');
    } catch {
      notify.error('Transfer Failed', 'Could not connect to the target number');
    } finally {
      setTransferring(false);
    }
  };

  const handleComplete = async () => {
    setTransferring(true);
    try {
      // Drop the agent from the call — customer stays with the target
      await apiClient.post('/api/ozonetel/call-control', {
        action: 'KICK_CALL',
        ucid,
        conferenceNumber: `0${number.replace(/\D/g, '')}`,
      });
      notify.success('Transferred', 'Call transferred successfully');
      onTransferred();
    } catch {
      notify.error('Transfer Failed', 'Could not complete transfer');
    } finally {
      setTransferring(false);
    }
  };

  return (
    <div className="mt-3 rounded-lg border border-secondary bg-secondary p-3">
      <div className="flex items-center justify-between mb-2">
        <span className="text-xs font-semibold text-secondary">Transfer Call</span>
        <button onClick={onClose} className="text-fg-quaternary hover:text-fg-secondary">
          <FontAwesomeIcon icon={faXmark} className="size-3" />
        </button>
      </div>
      {stage === 'input' ? (
        <div className="flex gap-2">
          <Input
            size="sm"
            placeholder="Enter phone number"
            value={number}
            onChange={setNumber}
          />
          <Button
            size="sm"
            color="primary"
            isLoading={transferring}
            onClick={handleConference}
            isDisabled={!number.trim()}
          >
            Connect
          </Button>
        </div>
      ) : (
        <div className="flex items-center justify-between">
          <span className="text-xs text-tertiary">Connected to {number}</span>
          <Button size="sm" color="primary" isLoading={transferring} onClick={handleComplete}>
            Complete Transfer
          </Button>
        </div>
      )}
    </div>
  );
};
```
|
||||
|
||||
- [ ] **Step 2: Add Transfer and Pause Recording buttons to active call card**

In `active-call-card.tsx`, add imports:
```typescript
import { faPhoneArrowRight, faRecordVinyl } from '@fortawesome/pro-duotone-svg-icons';
import { TransferDialog } from './transfer-dialog';
```

Add state:
```typescript
const [transferOpen, setTransferOpen] = useState(false);
const [recordingPaused, setRecordingPaused] = useState(false);
```

In the active call button row (around line 241), add two new buttons before the End button:

```typescript
<Button
  size="sm"
  color="secondary"
  iconLeading={({ className }: { className?: string }) => <FontAwesomeIcon icon={faPhoneArrowRight} className={className} />}
  onClick={() => setTransferOpen(!transferOpen)}
>
  Transfer
</Button>
<Button
  size="sm"
  color={recordingPaused ? 'primary-destructive' : 'secondary'}
  iconLeading={({ className }: { className?: string }) => <FontAwesomeIcon icon={faRecordVinyl} className={className} />}
  onClick={async () => {
    const action = recordingPaused ? 'unPause' : 'pause';
    if (callUcid) {
      apiClient.post('/api/ozonetel/recording', { ucid: callUcid, action }).catch(() => {});
    }
    setRecordingPaused(!recordingPaused);
  }}
>
  {recordingPaused ? 'Resume Rec' : 'Pause Rec'}
</Button>
```

After the button row, before the AppointmentForm, add the transfer dialog:

```typescript
{transferOpen && callUcid && (
  <TransferDialog
    ucid={callUcid}
    onClose={() => setTransferOpen(false)}
    onTransferred={() => {
      setTransferOpen(false);
      hangup();
      setPostCallStage('disposition');
    }}
  />
)}
```

- [ ] **Step 3: Type check and commit**

```bash
cd helix-engage && npx tsc --noEmit
```

```
feat: add call transfer and recording pause to active call UI
```

---

## Task 4: Deploy and verify

- [ ] **Step 1: Build and deploy sidecar**

```bash
cd helix-engage-server && npm run build
# tar + scp + docker cp + restart
```

- [ ] **Step 2: Build and deploy frontend**

```bash
cd helix-engage
VITE_API_URL=https://engage-api.srv1477139.hstgr.cloud \
VITE_SIP_URI=sip:523590@blr-pub-rtc4.ozonetel.com \
VITE_SIP_PASSWORD=523590 \
VITE_SIP_WS_SERVER=wss://blr-pub-rtc4.ozonetel.com:444 \
npm run build
```

- [ ] **Step 3: Test call transfer**

1. Place an outbound call
2. Click "Transfer" → enter a phone number → "Connect"
3. Third party should ring and join the call
4. Click "Complete Transfer" → agent drops, customer stays with target
5. Disposition form shows

- [ ] **Step 4: Test recording pause**

1. During an active call, click "Pause Rec"
2. Button changes to "Resume Rec" (destructive color)
3. Check Ozonetel reports — recording should have a gap
4. Click "Resume Rec" — recording resumes

- [ ] **Step 5: Test missed calls endpoint**

```bash
curl -s https://engage-api.srv1477139.hstgr.cloud/api/ozonetel/missed-calls | python3 -m json.tool
```

Verify it returns abandon call data from Ozonetel.

---

## Notes

- **Call transfer is two-step**: `CONFERENCE` adds the target, `KICK_CALL` drops the agent. This is a "warm transfer" — all three parties are briefly connected before the agent drops. For a "cold" (blind) transfer, we'd send `CONFERENCE` and immediately `KICK_CALL` without waiting.
- **Recording pause passes `apiKey` in query params** — a different auth pattern from the other `/ca_apis/` endpoints; this one lives under the `/CAServices/` path.
- **KICK_CALL note from the docs**: "Always pass the agent phone number in the conferenceNumber parameter to use KICK_CALL action." In other words, to drop the agent, pass the agent's phone number as `conferenceNumber`; to drop the transferred party, pass their number.
- **Missed calls API** — `getAbandonCalls` returns today's data by default; for historical data, pass `fromTime`/`toTime`.
- **The active call button row is getting crowded** (Mute, Hold, Book Appt, Transfer, Pause Rec, End — six buttons). If that's too many, Transfer and Pause Rec can move under a "More" dropdown.
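The cold-transfer variant described in the first note can be sketched as a small helper. This is a hypothetical `coldTransfer` function, not code from the repo: the `post` stub here only records requests where the real code would call `apiClient.post('/api/ozonetel/call-control', …)`, and the agent-number parameter follows the KICK_CALL note above.

```typescript
// Hypothetical sketch of a blind ("cold") transfer: CONFERENCE the target,
// then immediately KICK_CALL without waiting for the parties to talk.
type CallControlBody = {
  action: 'CONFERENCE' | 'KICK_CALL';
  ucid: string;
  conferenceNumber: string;
};

// Stub that records requests; a real implementation would POST to
// /api/ozonetel/call-control via apiClient.
const sent: CallControlBody[] = [];
async function post(_url: string, body: CallControlBody): Promise<void> {
  sent.push(body);
}

async function coldTransfer(ucid: string, targetNumber: string, agentNumber: string): Promise<void> {
  // Step 1: pull the target into the conference
  await post('/api/ozonetel/call-control', { action: 'CONFERENCE', ucid, conferenceNumber: targetNumber });
  // Step 2: drop the agent right away — per the docs note, the number passed
  // in conferenceNumber selects which party KICK_CALL removes
  await post('/api/ozonetel/call-control', { action: 'KICK_CALL', ucid, conferenceNumber: agentNumber });
}
```

The only difference from the warm transfer in `TransferDialog` is that there is no "connected" stage between the two requests.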
---

docs/superpowers/plans/2026-03-21-live-call-assist.md (new file, 796 lines)

# Live Call Assist — Implementation Plan

> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Stream customer audio during calls to Deepgram for transcription, feed the transcript plus lead context to OpenAI every 10 seconds for suggestions, and display the live transcript and AI suggestions in the sidebar.

**Architecture:** The browser captures remote WebRTC audio via the Web Audio API and streams PCM over Socket.IO to the sidecar. The sidecar pipes audio to a Deepgram Nova WebSocket for STT, accumulates the transcript, and every 10 seconds sends the transcript plus pre-loaded lead context to OpenAI gpt-4o-mini for suggestions. Results stream back to the browser via Socket.IO.

**Tech Stack:** Socket.IO (already installed), Deepgram Nova SDK, OpenAI via the Vercel AI SDK (already installed), Web Audio API (browser)

---

## File Map

### Sidecar (helix-engage-server)

| File | Action |
|------|--------|
| `src/call-assist/call-assist.gateway.ts` | Create: Socket.IO gateway handling audio stream, Deepgram + OpenAI orchestration |
| `src/call-assist/call-assist.service.ts` | Create: Lead context loading from platform, OpenAI prompt building |
| `src/call-assist/call-assist.module.ts` | Create: Module registration |
| `src/app.module.ts` | Modify: import CallAssistModule |
| `package.json` | Modify: add `@deepgram/sdk` |

### Frontend (helix-engage)

| File | Action |
|------|--------|
| `src/lib/audio-capture.ts` | Create: Capture remote audio track, downsample to 16kHz PCM, emit chunks |
| `src/hooks/use-call-assist.ts` | Create: Socket.IO connection, manages transcript + suggestions state |
| `src/components/call-desk/live-transcript.tsx` | Create: Scrolling transcript + AI suggestion cards |
| `src/components/call-desk/context-panel.tsx` | Modify: show LiveTranscript during active calls instead of AiChatPanel |
| `src/pages/call-desk.tsx` | Modify: remove CallPrepCard during active calls |

---

## Task 1: Sidecar — Call Assist service (context loading + OpenAI)

**Files:**
- Create: `helix-engage-server/src/call-assist/call-assist.service.ts`

- [ ] **Step 1: Create the service**

```typescript
import { Injectable, Logger } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { generateText } from 'ai';
import { PlatformGraphqlService } from '../platform/platform-graphql.service';
import { createAiModel } from '../ai/ai-provider';
import type { LanguageModel } from 'ai';

@Injectable()
export class CallAssistService {
  private readonly logger = new Logger(CallAssistService.name);
  private readonly aiModel: LanguageModel | null;
  private readonly platformApiKey: string;

  constructor(
    private config: ConfigService,
    private platform: PlatformGraphqlService,
  ) {
    this.aiModel = createAiModel(config);
    this.platformApiKey = config.get<string>('platform.apiKey') ?? '';
  }

  async loadCallContext(leadId: string | null, callerPhone: string | null): Promise<string> {
    const authHeader = this.platformApiKey ? `Bearer ${this.platformApiKey}` : '';
    if (!authHeader) return 'No platform context available.';

    try {
      const parts: string[] = [];

      // Load lead details
      if (leadId) {
        const leadResult = await this.platform.queryWithAuth<any>(
          `{ leads(filter: { id: { eq: "${leadId}" } }) { edges { node {
            id name contactName { firstName lastName }
            contactPhone { primaryPhoneNumber }
            source status interestedService
            lastContacted contactAttempts
            aiSummary aiSuggestedAction
          } } } }`,
          undefined, authHeader,
        );
        const lead = leadResult.leads.edges[0]?.node;
        if (lead) {
          const name = lead.contactName ? `${lead.contactName.firstName} ${lead.contactName.lastName}`.trim() : lead.name;
          parts.push(`CALLER: ${name}`);
          parts.push(`Phone: ${lead.contactPhone?.primaryPhoneNumber ?? callerPhone}`);
          parts.push(`Source: ${lead.source ?? 'Unknown'}`);
          parts.push(`Interested in: ${lead.interestedService ?? 'Not specified'}`);
          parts.push(`Contact attempts: ${lead.contactAttempts ?? 0}`);
          if (lead.aiSummary) parts.push(`AI Summary: ${lead.aiSummary}`);
        }

        // Load past appointments
        const apptResult = await this.platform.queryWithAuth<any>(
          `{ appointments(filter: { patientId: { eq: "${leadId}" } }, first: 10, orderBy: [{ scheduledAt: DescNullsLast }]) { edges { node {
            id scheduledAt appointmentStatus doctorName department reasonForVisit
          } } } }`,
          undefined, authHeader,
        );
        const appts = apptResult.appointments.edges.map((e: any) => e.node);
        if (appts.length > 0) {
          parts.push(`\nPAST APPOINTMENTS:`);
          for (const a of appts) {
            const date = a.scheduledAt ? new Date(a.scheduledAt).toLocaleDateString('en-IN') : '?';
            parts.push(`- ${date}: ${a.doctorName ?? '?'} (${a.department ?? '?'}) — ${a.appointmentStatus}`);
          }
        }
      } else if (callerPhone) {
        parts.push(`CALLER: Unknown (${callerPhone})`);
        parts.push('No lead record found — this may be a new enquiry.');
      }

      // Load doctors
      const docResult = await this.platform.queryWithAuth<any>(
        `{ doctors(first: 20) { edges { node {
          fullName { firstName lastName } department specialty clinic { clinicName }
        } } } }`,
        undefined, authHeader,
      );
      const docs = docResult.doctors.edges.map((e: any) => e.node);
      if (docs.length > 0) {
        parts.push(`\nAVAILABLE DOCTORS:`);
        for (const d of docs) {
          const name = d.fullName ? `Dr. ${d.fullName.firstName} ${d.fullName.lastName}`.trim() : 'Unknown';
          parts.push(`- ${name} — ${d.department ?? '?'} — ${d.clinic?.clinicName ?? '?'}`);
        }
      }

      return parts.join('\n') || 'No context available.';
    } catch (err) {
      this.logger.error(`Failed to load call context: ${err}`);
      return 'Context loading failed.';
    }
  }

  async getSuggestion(transcript: string, context: string): Promise<string> {
    if (!this.aiModel || !transcript.trim()) return '';

    try {
      const { text } = await generateText({
        model: this.aiModel,
        system: `You are a real-time call assistant for Global Hospital Bangalore.
You listen to the customer's words and provide brief, actionable suggestions for the CC agent.

${context}

RULES:
- Keep suggestions under 2 sentences
- Focus on actionable next steps the agent should take NOW
- If customer mentions a doctor or department, suggest available slots
- If customer wants to cancel or reschedule, note relevant appointment details
- If customer sounds upset, suggest empathetic response
- Do NOT repeat what the agent already knows`,
        prompt: `Conversation transcript so far:\n${transcript}\n\nProvide a brief suggestion for the agent based on what was just said.`,
        maxTokens: 150,
      });
      return text;
    } catch (err) {
      this.logger.error(`AI suggestion failed: ${err}`);
      return '';
    }
  }
}
```

- [ ] **Step 2: Type check and commit**

```
feat: add CallAssistService for context loading and AI suggestions
```

---

## Task 2: Sidecar — Call Assist WebSocket gateway

**Files:**
- Create: `helix-engage-server/src/call-assist/call-assist.gateway.ts`
- Create: `helix-engage-server/src/call-assist/call-assist.module.ts`
- Modify: `helix-engage-server/src/app.module.ts`
- Modify: `helix-engage-server/package.json`

- [ ] **Step 1: Install Deepgram SDK**

```bash
cd helix-engage-server && npm install @deepgram/sdk
```

- [ ] **Step 2: Create the gateway**

```typescript
import {
  WebSocketGateway,
  SubscribeMessage,
  MessageBody,
  ConnectedSocket,
  OnGatewayDisconnect,
} from '@nestjs/websockets';
import { Logger } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { Socket } from 'socket.io';
import { createClient, LiveTranscriptionEvents } from '@deepgram/sdk';
import { CallAssistService } from './call-assist.service';

type SessionState = {
  deepgramConnection: any;
  transcript: string;
  context: string;
  suggestionTimer: NodeJS.Timeout | null;
};

@WebSocketGateway({
  cors: { origin: process.env.CORS_ORIGIN ?? '*', credentials: true },
  namespace: '/call-assist',
})
export class CallAssistGateway implements OnGatewayDisconnect {
  private readonly logger = new Logger(CallAssistGateway.name);
  private readonly sessions = new Map<string, SessionState>();
  private readonly deepgramApiKey: string;

  constructor(
    private readonly callAssist: CallAssistService,
    private readonly config: ConfigService,
  ) {
    // Read through the injected ConfigService (it falls back to process.env)
    this.deepgramApiKey = this.config.get<string>('DEEPGRAM_API_KEY') ?? '';
  }

  @SubscribeMessage('call-assist:start')
  async handleStart(
    @ConnectedSocket() client: Socket,
    @MessageBody() data: { ucid: string; leadId?: string; callerPhone?: string },
  ) {
    this.logger.log(`Call assist start: ucid=${data.ucid} lead=${data.leadId ?? 'none'}`);

    // Load lead context
    const context = await this.callAssist.loadCallContext(
      data.leadId ?? null,
      data.callerPhone ?? null,
    );
    client.emit('call-assist:context', { context: context.substring(0, 200) + '...' });

    // Connect to Deepgram
    if (!this.deepgramApiKey) {
      this.logger.warn('DEEPGRAM_API_KEY not set — transcription disabled');
      client.emit('call-assist:error', { message: 'Transcription not configured' });
      return;
    }

    const deepgram = createClient(this.deepgramApiKey);
    const dgConnection = deepgram.listen.live({
      model: 'nova-2',
      language: 'en',
      smart_format: true,
      interim_results: true,
      endpointing: 300,
      sample_rate: 16000,
      encoding: 'linear16',
      channels: 1,
    });

    const session: SessionState = {
      deepgramConnection: dgConnection,
      transcript: '',
      context,
      suggestionTimer: null,
    };

    dgConnection.on(LiveTranscriptionEvents.Open, () => {
      this.logger.log(`Deepgram connected for ${data.ucid}`);
    });

    dgConnection.on(LiveTranscriptionEvents.Transcript, (result: any) => {
      const text = result.channel?.alternatives?.[0]?.transcript;
      if (!text) return;

      const isFinal = result.is_final;
      client.emit('call-assist:transcript', { text, isFinal });

      if (isFinal) {
        session.transcript += `Customer: ${text}\n`;
      }
    });

    dgConnection.on(LiveTranscriptionEvents.Error, (err: any) => {
      this.logger.error(`Deepgram error: ${err.message}`);
    });

    dgConnection.on(LiveTranscriptionEvents.Close, () => {
      this.logger.log(`Deepgram closed for ${data.ucid}`);
    });

    // AI suggestion every 10 seconds
    session.suggestionTimer = setInterval(async () => {
      if (!session.transcript.trim()) return;
      const suggestion = await this.callAssist.getSuggestion(session.transcript, session.context);
      if (suggestion) {
        client.emit('call-assist:suggestion', { text: suggestion });
      }
    }, 10000);

    this.sessions.set(client.id, session);
  }

  @SubscribeMessage('call-assist:audio')
  handleAudio(
    @ConnectedSocket() client: Socket,
    @MessageBody() audioData: ArrayBuffer,
  ) {
    const session = this.sessions.get(client.id);
    if (session?.deepgramConnection) {
      session.deepgramConnection.send(Buffer.from(audioData));
    }
  }

  @SubscribeMessage('call-assist:stop')
  handleStop(@ConnectedSocket() client: Socket) {
    this.cleanup(client.id);
    this.logger.log(`Call assist stopped: ${client.id}`);
  }

  handleDisconnect(client: Socket) {
    this.cleanup(client.id);
  }

  private cleanup(clientId: string) {
    const session = this.sessions.get(clientId);
    if (session) {
      if (session.suggestionTimer) clearInterval(session.suggestionTimer);
      if (session.deepgramConnection) {
        try { session.deepgramConnection.finish(); } catch {}
      }
      this.sessions.delete(clientId);
    }
  }
}
```

- [ ] **Step 3: Create the module**

```typescript
import { Module } from '@nestjs/common';
import { CallAssistGateway } from './call-assist.gateway';
import { CallAssistService } from './call-assist.service';
import { PlatformModule } from '../platform/platform.module';

@Module({
  imports: [PlatformModule],
  providers: [CallAssistGateway, CallAssistService],
})
export class CallAssistModule {}
```

- [ ] **Step 4: Register in app.module.ts**

Add `CallAssistModule` to imports.

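A minimal sketch of what the registration looks like — the surrounding import list and module array are assumptions about the existing file, not its actual contents:

```typescript
import { Module } from '@nestjs/common';
import { CallAssistModule } from './call-assist/call-assist.module';

@Module({
  imports: [
    // ...existing modules stay as they are...
    CallAssistModule,
  ],
})
export class AppModule {}
```
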
- [ ] **Step 5: Add DEEPGRAM_API_KEY to docker-compose env**

The env var needs to be set in the VPS docker-compose for the sidecar container.

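A hedged docker-compose fragment — the service name is an assumption about the VPS compose file:

```yaml
services:
  helix-engage-server:          # assumed service name on the VPS
    environment:
      - DEEPGRAM_API_KEY=${DEEPGRAM_API_KEY}
```
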
- [ ] **Step 6: Type check and commit**

```
feat: add call assist WebSocket gateway with Deepgram STT + OpenAI suggestions
```

---

## Task 3: Frontend — Audio capture utility

Capture the remote audio track from WebRTC, downsample to 16kHz 16-bit PCM, and provide chunks via callback.

**Files:**
- Create: `helix-engage/src/lib/audio-capture.ts`

- [ ] **Step 1: Create the audio capture module**

```typescript
type AudioChunkCallback = (chunk: ArrayBuffer) => void;

let audioContext: AudioContext | null = null;
let mediaStreamSource: MediaStreamAudioSourceNode | null = null;
let scriptProcessor: ScriptProcessorNode | null = null;

export function startAudioCapture(remoteStream: MediaStream, onChunk: AudioChunkCallback): void {
  stopAudioCapture();

  audioContext = new AudioContext({ sampleRate: 16000 });
  mediaStreamSource = audioContext.createMediaStreamSource(remoteStream);

  // Use ScriptProcessorNode (deprecated but universally supported)
  // AudioWorklet would be better but requires a separate file
  scriptProcessor = audioContext.createScriptProcessor(4096, 1, 1);

  scriptProcessor.onaudioprocess = (event) => {
    const inputData = event.inputBuffer.getChannelData(0);

    // Convert Float32 to Int16 PCM
    const pcm = new Int16Array(inputData.length);
    for (let i = 0; i < inputData.length; i++) {
      const s = Math.max(-1, Math.min(1, inputData[i]));
      pcm[i] = s < 0 ? s * 0x8000 : s * 0x7FFF;
    }

    onChunk(pcm.buffer);
  };

  mediaStreamSource.connect(scriptProcessor);
  // Route through a muted gain node: the processor must reach the destination
  // to fire, but the remote audio already plays through the <audio> element,
  // so connecting it directly would double the playback
  const silentGain = audioContext.createGain();
  silentGain.gain.value = 0;
  scriptProcessor.connect(silentGain);
  silentGain.connect(audioContext.destination);
}

export function stopAudioCapture(): void {
  if (scriptProcessor) {
    scriptProcessor.disconnect();
    scriptProcessor = null;
  }
  if (mediaStreamSource) {
    mediaStreamSource.disconnect();
    mediaStreamSource = null;
  }
  if (audioContext) {
    audioContext.close().catch(() => {});
    audioContext = null;
  }
}
```

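The Float32 → Int16 conversion in `startAudioCapture` is easy to get wrong at the clamp boundaries. Extracted as a pure function (the `floatToPcm16` name is hypothetical, introduced here for illustration), it can be sanity-checked without any browser audio APIs:

```typescript
// Pure version of the Float32 → Int16 loop from startAudioCapture,
// extracted so it can be checked without an AudioContext.
function floatToPcm16(input: Float32Array): Int16Array {
  const pcm = new Int16Array(input.length);
  for (let i = 0; i < input.length; i++) {
    // Clamp to [-1, 1], then scale asymmetrically: -1 → -32768, +1 → +32767
    const s = Math.max(-1, Math.min(1, input[i]));
    pcm[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
  }
  return pcm;
}

const out = floatToPcm16(new Float32Array([0, 1, -1, 0.5, 2]));
console.log(Array.from(out)); // → [0, 32767, -32768, 16383, 32767]
```

The asymmetric scale factors matter: Int16 ranges from -32768 to +32767, so using a single factor would either clip positive full scale or waste one negative code.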
- [ ] **Step 2: Commit**

```
feat: add audio capture utility for remote WebRTC stream
```

---

## Task 4: Frontend — useCallAssist hook

Manages the Socket.IO connection to `/call-assist`, sends audio, and receives transcript + suggestions.

**Files:**
- Create: `helix-engage/src/hooks/use-call-assist.ts`

- [ ] **Step 1: Create the hook**

```typescript
import { useEffect, useRef, useState, useCallback } from 'react';
import { io, Socket } from 'socket.io-client';
import { startAudioCapture, stopAudioCapture } from '@/lib/audio-capture';
import { getSipClient } from '@/state/sip-manager';

const SIDECAR_URL = import.meta.env.VITE_SIDECAR_URL ?? 'http://localhost:4100';

type TranscriptLine = {
  id: string;
  text: string;
  isFinal: boolean;
  timestamp: Date;
};

type Suggestion = {
  id: string;
  text: string;
  timestamp: Date;
};

export const useCallAssist = (active: boolean, ucid: string | null, leadId: string | null, callerPhone: string | null) => {
  const [transcript, setTranscript] = useState<TranscriptLine[]>([]);
  const [suggestions, setSuggestions] = useState<Suggestion[]>([]);
  const [connected, setConnected] = useState(false);
  const socketRef = useRef<Socket | null>(null);
  const idCounter = useRef(0);

  const nextId = useCallback(() => `ca-${++idCounter.current}`, []);

  useEffect(() => {
    if (!active || !ucid) return;

    const socket = io(`${SIDECAR_URL}/call-assist`, {
      transports: ['websocket'],
    });
    socketRef.current = socket;

    socket.on('connect', () => {
      setConnected(true);
      socket.emit('call-assist:start', { ucid, leadId, callerPhone });

      // Start capturing remote audio from the SIP session
      const sipClient = getSipClient();
      // Private-field access for now — Step 3 below adds a public getAudioElement()
      const audioElement = (sipClient as any)?.audioElement as HTMLAudioElement | null;
      if (audioElement?.srcObject) {
        startAudioCapture(audioElement.srcObject as MediaStream, (chunk) => {
          socket.emit('call-assist:audio', chunk);
        });
      }
    });

    socket.on('call-assist:transcript', (data: { text: string; isFinal: boolean }) => {
      if (!data.text.trim()) return;
      setTranscript(prev => {
        if (!data.isFinal) {
          // Replace last interim line
          const withoutLastInterim = prev.filter(l => l.isFinal);
          return [...withoutLastInterim, { id: nextId(), text: data.text, isFinal: false, timestamp: new Date() }];
        }
        // Add final line, remove interims
        const finals = prev.filter(l => l.isFinal);
        return [...finals, { id: nextId(), text: data.text, isFinal: true, timestamp: new Date() }];
      });
    });

    socket.on('call-assist:suggestion', (data: { text: string }) => {
      setSuggestions(prev => [...prev, { id: nextId(), text: data.text, timestamp: new Date() }]);
    });

    socket.on('disconnect', () => setConnected(false));

    return () => {
      stopAudioCapture();
      socket.emit('call-assist:stop');
      socket.disconnect();
      socketRef.current = null;
      setConnected(false);
    };
  }, [active, ucid, leadId, callerPhone, nextId]);

  // Reset state when call ends
  useEffect(() => {
    if (!active) {
      setTranscript([]);
      setSuggestions([]);
    }
  }, [active]);

  return { transcript, suggestions, connected };
};
```

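The interim/final handling in the `call-assist:transcript` listener reduces to one pure rule — drop all non-final lines, then append the incoming line — so interims replace each other in place while finals accumulate. A standalone check of that rule (the `mergeTranscript` helper is hypothetical, extracted here for illustration):

```typescript
// Pure form of the setTranscript updater from the hook above:
// interim lines replace each other, final lines accumulate.
type Line = { text: string; isFinal: boolean };

function mergeTranscript(prev: Line[], incoming: Line): Line[] {
  return [...prev.filter(l => l.isFinal), incoming];
}

let lines: Line[] = [];
lines = mergeTranscript(lines, { text: 'I need an', isFinal: false });
lines = mergeTranscript(lines, { text: 'I need an appointment', isFinal: false });
lines = mergeTranscript(lines, { text: 'I need an appointment', isFinal: true });
console.log(lines.length); // → 1 (the two interim lines collapsed into one final line)
```

This keeps the UI from stacking up partial Deepgram results while still preserving every finalized utterance.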
- [ ] **Step 2: Install socket.io-client in frontend**

```bash
cd helix-engage && npm install socket.io-client
```

- [ ] **Step 3: Expose audioElement in SIPClient**

In `helix-engage/src/lib/sip-client.ts`, the `audioElement` is private. Add a public getter:

```typescript
getAudioElement(): HTMLAudioElement | null {
  return this.audioElement;
}
```

Update `getSipClient` usage in the hook — access via `getSipClient()?.getAudioElement()?.srcObject`.

- [ ] **Step 4: Type check and commit**

```
feat: add useCallAssist hook for live transcription WebSocket
```

---

## Task 5: Frontend — LiveTranscript component

**Files:**
- Create: `helix-engage/src/components/call-desk/live-transcript.tsx`

- [ ] **Step 1: Create the component**

Scrolling list of transcript lines with AI suggestion cards interspersed. Auto-scrolls to the bottom.

```typescript
import { useEffect, useRef } from 'react';
import { FontAwesomeIcon } from '@fortawesome/react-fontawesome';
import { faSparkles, faMicrophone } from '@fortawesome/pro-duotone-svg-icons';
import { cx } from '@/utils/cx';

type TranscriptLine = {
  id: string;
  text: string;
  isFinal: boolean;
  timestamp: Date;
};

type Suggestion = {
  id: string;
  text: string;
  timestamp: Date;
};

type LiveTranscriptProps = {
  transcript: TranscriptLine[];
  suggestions: Suggestion[];
  connected: boolean;
};

export const LiveTranscript = ({ transcript, suggestions, connected }: LiveTranscriptProps) => {
  const scrollRef = useRef<HTMLDivElement>(null);

  // Auto-scroll to bottom
  useEffect(() => {
    if (scrollRef.current) {
      scrollRef.current.scrollTop = scrollRef.current.scrollHeight;
    }
  }, [transcript.length, suggestions.length]);

  // Merge transcript and suggestions by timestamp
  const items = [
    ...transcript.map(t => ({ ...t, kind: 'transcript' as const })),
    ...suggestions.map(s => ({ ...s, kind: 'suggestion' as const, isFinal: true })),
  ].sort((a, b) => a.timestamp.getTime() - b.timestamp.getTime());

  return (
    <div className="flex flex-1 flex-col overflow-hidden">
      {/* Header */}
      <div className="flex items-center gap-2 px-4 py-3 border-b border-secondary">
        <FontAwesomeIcon icon={faSparkles} className="size-3.5 text-fg-brand-primary" />
        <span className="text-xs font-bold uppercase tracking-wider text-brand-secondary">Live Assist</span>
        <div className={cx(
          "ml-auto size-2 rounded-full",
          connected ? "bg-success-solid" : "bg-disabled",
        )} />
      </div>

      {/* Transcript body */}
      <div ref={scrollRef} className="flex-1 overflow-y-auto px-4 py-3 space-y-2">
        {items.length === 0 && (
          <div className="flex flex-col items-center justify-center py-8 text-center">
            <FontAwesomeIcon icon={faMicrophone} className="size-6 text-fg-quaternary mb-2" />
            <p className="text-xs text-quaternary">Listening to customer...</p>
            <p className="text-xs text-quaternary">Transcript will appear here</p>
          </div>
        )}

        {items.map(item => {
          if (item.kind === 'suggestion') {
            return (
              <div key={item.id} className="rounded-lg bg-brand-primary p-3 border border-brand">
                <div className="flex items-center gap-1.5 mb-1">
                  <FontAwesomeIcon icon={faSparkles} className="size-3 text-fg-brand-primary" />
                  <span className="text-xs font-semibold text-brand-secondary">AI Suggestion</span>
                </div>
                <p className="text-sm text-primary">{item.text}</p>
              </div>
            );
          }

          return (
            <div key={item.id} className={cx(
              "text-sm",
              item.isFinal ? "text-primary" : "text-tertiary italic",
            )}>
              <span className="text-xs text-quaternary mr-2">
                {item.timestamp.toLocaleTimeString('en-IN', { hour: '2-digit', minute: '2-digit', second: '2-digit' })}
              </span>
              {item.text}
            </div>
          );
        })}
      </div>
    </div>
  );
};
```

- [ ] **Step 2: Commit**

```
feat: add LiveTranscript component for call sidebar
```

---

## Task 6: Wire live transcript into the call desk

**Files:**
- Modify: `helix-engage/src/components/call-desk/context-panel.tsx`
- Modify: `helix-engage/src/pages/call-desk.tsx`

- [ ] **Step 1: Update context-panel.tsx to show LiveTranscript during calls**

Import the hook and component:
```typescript
import { useCallAssist } from '@/hooks/use-call-assist';
import { LiveTranscript } from './live-transcript';
```

Accept new props:
```typescript
interface ContextPanelProps {
  selectedLead: Lead | null;
  activities: LeadActivity[];
  callerPhone?: string;
  isInCall?: boolean;
  callUcid?: string | null;
}
```

Inside the component, use the hook:
```typescript
const { transcript, suggestions, connected } = useCallAssist(
  isInCall ?? false,
  callUcid ?? null,
  selectedLead?.id ?? null,
  callerPhone ?? null,
);
```

When `isInCall` is true, replace the AI Assistant tab content with LiveTranscript:
```typescript
{activeTab === 'ai' && (
  isInCall ? (
    <LiveTranscript transcript={transcript} suggestions={suggestions} connected={connected} />
  ) : (
    <AiChatPanel callerContext={callerContext} role={...} />
  )
)}
```

- [ ] **Step 2: Pass isInCall and callUcid to ContextPanel in call-desk.tsx**
|
||||
|
||||
```typescript
|
||||
<ContextPanel
|
||||
selectedLead={activeLeadFull}
|
||||
activities={leadActivities}
|
||||
callerPhone={callerNumber ?? undefined}
|
||||
isInCall={isInCall}
|
||||
callUcid={callUcid}
|
||||
/>
|
||||
```
|
||||
|
||||
Also get `callUcid` from `useSip()`:
|
||||
```typescript
|
||||
const { connectionStatus, isRegistered, callState, callerNumber, callUcid } = useSip();
|
||||
```
|
||||
|
||||
- [ ] **Step 3: Remove CallPrepCard during active calls**
|
||||
|
||||
In `call-desk.tsx`, remove the CallPrepCard from the active call area:
|
||||
|
||||
```typescript
|
||||
{isInCall && (
|
||||
<div className="space-y-4 p-5">
|
||||
<ActiveCallCard lead={activeLeadFull} callerPhone={callerNumber ?? ''} />
|
||||
</div>
|
||||
)}
|
||||
```
|
||||
|
||||
Keep the CallPrepCard import for now — it might be useful in other contexts later.
|
||||
|
||||
- [ ] **Step 4: Type check and commit**
|
||||
|
||||
```
|
||||
feat: wire live transcript into call desk sidebar
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Task 7: Deploy and verify
|
||||
|
||||
- [ ] **Step 1: Get Deepgram API key**
|
||||
|
||||
Sign up at deepgram.com — free tier includes $200 credit. Set `DEEPGRAM_API_KEY` in the sidecar's docker-compose env.
|
||||
|
||||
- [ ] **Step 2: Build and deploy sidecar**
|
||||
|
||||
```bash
|
||||
cd helix-engage-server && npm install && npm run build
|
||||
```
|
||||
|
||||
- [ ] **Step 3: Build and deploy frontend**
|
||||
|
||||
```bash
|
||||
cd helix-engage && npm install && npm run build
|
||||
```
|
||||
|
||||
- [ ] **Step 4: Test end-to-end**
|
||||
|
||||
1. Login as CC agent
|
||||
2. Place or receive a call
|
||||
3. Sidebar should show "Live Assist" with green dot
|
||||
4. Customer speaks → transcript appears in real-time
|
||||
5. Every 10 seconds → AI suggestion card appears with contextual advice
|
||||
6. Call ends → transcript stays visible during disposition
|
||||
|
||||
---
|
||||
|
||||
## Notes
|
||||
|
||||
- **ScriptProcessorNode is deprecated** but universally supported. AudioWorklet would require a separate JS file served via a URL. Can upgrade later.
|
||||
- **Deepgram `interim_results: true`** gives streaming partial results (updated as words are recognized). `isFinal` results are the confirmed transcription.
|
||||
- **Socket.IO binary support** — `socket.emit('call-assist:audio', chunk)` sends ArrayBuffer natively. No base64 encoding needed.
|
||||
- **The `audioElement.srcObject`** is the remote MediaStream — this is the customer's audio only. We don't send the agent's mic to avoid echo/feedback in transcription.
|
||||
- **Cost**: ~₹2 per 5-minute call (Deepgram + OpenAI combined).
|
||||
- **If DEEPGRAM_API_KEY is not set**, the gateway logs a warning and sends an error event to the client. Transcription is disabled gracefully — the app still works without it.
|
||||
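The transcript/suggestion state that `use-call-assist` manages can be sketched as a pure reducer over the server events described in the design spec. This is a sketch of one possible organization, not the hook's actual code; the `id` field on transcript events is an assumption used to replace interim lines in place.

```typescript
// Server events, following the shapes in the live-call-assist spec.
// The `id` field is a hypothetical addition for interim-line replacement.
type AssistEvent =
  | { type: 'transcript'; id: string; text: string; isFinal: boolean }
  | { type: 'suggestion'; id: string; text: string };

type AssistState = {
  transcript: { id: string; text: string; isFinal: boolean }[];
  suggestions: { id: string; text: string }[];
};

// Interim transcript lines overwrite the previous interim line with the same id;
// final lines and suggestions are appended.
function applyAssistEvent(state: AssistState, ev: AssistEvent): AssistState {
  if (ev.type === 'suggestion') {
    return { ...state, suggestions: [...state.suggestions, { id: ev.id, text: ev.text }] };
  }
  const line = { id: ev.id, text: ev.text, isFinal: ev.isFinal };
  const idx = state.transcript.findIndex(t => t.id === ev.id && !t.isFinal);
  const transcript = idx >= 0
    ? [...state.transcript.slice(0, idx), line, ...state.transcript.slice(idx + 1)]
    : [...state.transcript, line];
  return { ...state, transcript };
}
```

Keeping the reducer pure makes the interim-vs-final behavior unit-testable without a WebSocket.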
docs/superpowers/specs/2026-03-21-live-call-assist-design.md (new file, +173 lines)
# Live Call Assist — Design Spec

## Problem

CC agents have no real-time intelligence during calls. The AI sidebar shows a static pre-call summary and a chat interface that requires manual typing — useless when the agent is on the phone. The agent has to remember lead history, doctor availability, and past interactions from memory.

## Solution

Stream the call's remote audio (customer voice) to the sidecar, transcribe via Deepgram Nova, and every 10 seconds feed the accumulated transcript + full lead context to OpenAI for real-time suggestions. Display a scrolling transcript with AI suggestion cards in the sidebar.

## Architecture

```
Browser (WebRTC call)
  │
  ├─ Remote audio track (customer) ──► AudioWorklet (PCM 16-bit, 16kHz)
  │                                          │
  │                                  WebSocket to sidecar
  │                                          │
  │                               ┌──────────▼──────────┐
  │                               │   Sidecar Gateway   │
  │                               │ ws://api/call-assist│
  │                               └──────────┬──────────┘
  │                                          │
  │                           ┌──────────────┼──────────────┐
  │                           ▼                             ▼
  │                    Deepgram Nova WS            Every 10s: OpenAI
  │                    (audio → text)             (transcript + context
  │                           │                     → suggestions)
  │                           ▼                             ▼
  │                    Transcript lines           AI suggestion cards
  │                           │                             │
  │                           └──────────────┬──────────────┘
  │                                          │
  │                                 WebSocket to browser
  │                                          │
  └──────────────────────────────────────────▼
                                        AI Sidebar
                               (transcript + suggestions)
```

## Components

### 1. Browser: Audio capture + WebSocket client

**Audio capture**: When the call becomes `active`, grab the remote audio track from the peer connection. Use an `AudioWorklet` to downsample to 16-bit PCM at 16kHz (Deepgram's preferred format). Send raw audio chunks (~100ms each) over WebSocket.

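The worklet's core transform can be sketched as a pure function. This is a sketch, not the actual implementation: nearest-sample decimation is assumed, and a production worklet would low-pass filter before decimating.

```typescript
// Minimal sketch of the downsample + PCM conversion the audio worklet would run.
// Assumes mono Float32 samples at `inputRate` (e.g. 48000 from WebRTC).
function downsampleToPcm16(input: Float32Array, inputRate: number, targetRate = 16000): Int16Array {
  const ratio = inputRate / targetRate;
  const outLength = Math.floor(input.length / ratio);
  const out = new Int16Array(outLength);
  for (let i = 0; i < outLength; i++) {
    // Nearest-sample decimation; a real worklet should low-pass filter first.
    const sample = input[Math.floor(i * ratio)];
    const clamped = Math.max(-1, Math.min(1, sample));
    // Map [-1, 1] floats onto the signed 16-bit range.
    out[i] = clamped < 0 ? clamped * 0x8000 : clamped * 0x7fff;
  }
  return out;
}
```

The resulting `Int16Array.buffer` is what gets sent over the socket as a binary chunk.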
**WebSocket client**: Connects to `wss://engage-api.srv1477139.hstgr.cloud/api/call-assist`. Sends:
- Initial message: `{ type: "start", ucid, leadId, callerPhone }`
- Audio chunks: binary PCM data
- End: `{ type: "stop" }`

Receives:
- `{ type: "transcript", text: "...", isFinal: boolean }` — real-time transcript lines
- `{ type: "suggestion", text: "...", action?: "book_appointment" | "transfer" }` — AI suggestions
- `{ type: "context_loaded", leadName: "...", summary: "..." }` — confirmation that lead context was loaded

### 2. Sidecar: WebSocket Gateway

**NestJS WebSocket Gateway** at `/api/call-assist`. On connection:

1. Receives the `start` message with `ucid`, `leadId`, `callerPhone`
2. Loads lead context from the platform: lead details, past calls, appointments, doctors, follow-ups
3. Opens a Deepgram Nova WebSocket (`wss://api.deepgram.com/v1/listen`)
4. Pipes incoming audio chunks to Deepgram
5. Forwards the transcript chunks Deepgram returns to the browser
6. Every 10 seconds, sends the accumulated transcript + lead context to OpenAI `gpt-4o-mini` for suggestions
7. Returns suggestions to the browser

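The connection in step 3 can be sketched as a URL builder. The query parameters (`model`, `encoding`, `sample_rate`, `interim_results`) are documented Deepgram `/v1/listen` options; the specific values chosen here are assumptions consistent with the 16kHz PCM pipeline above.

```typescript
// Builds the Deepgram streaming endpoint URL the gateway would connect to.
// encoding=linear16 matches the raw 16-bit PCM chunks the browser sends.
function buildDeepgramUrl(opts: { model?: string; sampleRate?: number } = {}): string {
  const params = new URLSearchParams({
    model: opts.model ?? 'nova-2',          // assumed model name
    encoding: 'linear16',
    sample_rate: String(opts.sampleRate ?? 16000),
    interim_results: 'true',
  });
  return `wss://api.deepgram.com/v1/listen?${params.toString()}`;
}
```

The gateway would open a `ws` connection to this URL with an `Authorization: Token ${DEEPGRAM_API_KEY}` header and write audio frames straight through.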
**System prompt for OpenAI** (loaded once with lead context):
```
You are a real-time call assistant for Global Hospital Bangalore.
You listen to the conversation and provide brief, actionable suggestions.

CALLER CONTEXT:
- Name: {leadName}
- Phone: {phone}
- Source: {source} ({campaign})
- Previous calls: {callCount} (last: {lastCallDate}, disposition: {lastDisposition})
- Appointments: {appointmentHistory}
- Interested in: {interestedService}
- AI Summary: {aiSummary}

AVAILABLE RESOURCES:
- Doctors: {doctorList with departments and clinics}
- Next available slots: {availableSlots}

RULES:
- Keep suggestions under 2 sentences
- Focus on actionable next steps
- If customer mentions a doctor/department, show available slots
- If customer wants to cancel, note the appointment ID
- Flag if customer sounds upset or mentions a complaint
- Do NOT repeat information the agent already said
```

**OpenAI call** (every 10 seconds):
```typescript
const response = await openai.chat.completions.create({
  model: 'gpt-4o-mini',
  messages: [
    { role: 'system', content: systemPrompt },
    { role: 'user', content: `Conversation so far:\n${transcript}\n\nProvide a brief suggestion for the agent.` },
  ],
  max_tokens: 150,
});
```

### 3. Frontend: Live transcript sidebar

Replace the AI chat tab content during active calls with a live transcript view:

- Scrolling transcript with timestamps
- Customer lines in one color, suggestions in a highlighted card
- Auto-scroll to bottom as new lines arrive
- Suggestions appear as colored cards between transcript lines
- When the call ends, the transcript stays visible for reference during disposition

### 4. Context loading

On the `start` message, the sidecar queries the platform for:
```graphql
# Lead details
{ leads(filter: { id: { eq: "{leadId}" } }) { edges { node { ... } } } }

# Past appointments
{ appointments(filter: { patientId: { eq: "{leadId}" } }) { edges { node { ... } } } }

# Doctors
{ doctors(first: 20) { edges { node { id fullName department clinic } } } }
```

This context is loaded once and injected into the system prompt. No mid-call refresh needed.

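The prompt interpolation can be sketched as below. The `LeadContext` shape and its field names are assumptions for illustration, and only a few of the template's placeholders are shown.

```typescript
// Hypothetical shape of the loaded context; field names are assumptions.
interface LeadContext {
  leadName: string;
  phone: string;
  callCount: number;
  lastDisposition: string;
}

// Fills a slice of the system-prompt template with loaded lead context.
function buildSystemPrompt(ctx: LeadContext): string {
  return [
    'You are a real-time call assistant for Global Hospital Bangalore.',
    'CALLER CONTEXT:',
    `- Name: ${ctx.leadName}`,
    `- Phone: ${ctx.phone}`,
    `- Previous calls: ${ctx.callCount} (disposition: ${ctx.lastDisposition})`,
  ].join('\n');
}
```

Building the prompt once per call keeps the per-10-seconds OpenAI request down to the system prompt plus the accumulated transcript.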
## File structure

### Sidecar (helix-engage-server)

| File | Responsibility |
|------|---------------|
| `src/call-assist/call-assist.gateway.ts` | WebSocket gateway — handles audio streaming, Deepgram connection, OpenAI calls |
| `src/call-assist/call-assist.module.ts` | Module registration |
| `src/call-assist/call-assist.service.ts` | Context loading from platform, OpenAI prompt building |

### Frontend (helix-engage)

| File | Responsibility |
|------|---------------|
| `src/lib/audio-capture.ts` | AudioWorklet to capture + downsample remote audio track |
| `src/hooks/use-call-assist.ts` | WebSocket connection to sidecar, manages transcript + suggestion state |
| `src/components/call-desk/live-transcript.tsx` | Scrolling transcript + suggestion cards UI |
| `src/components/call-desk/context-panel.tsx` | Modify: show LiveTranscript instead of AiChatPanel during active calls |
| `src/pages/call-desk.tsx` | Modify: remove CallPrepCard during active calls |

## Dependencies

- **Deepgram SDK**: `@deepgram/sdk` in sidecar (or raw WebSocket)
- **DEEPGRAM_API_KEY**: environment variable in sidecar
- **AudioWorklet**: browser API, no dependencies (supported in all modern browsers)
- **OpenAI**: already configured in sidecar (`gpt-4o-mini`)

## Cost estimate

Per 5-minute call:
- Deepgram Nova: ~$0.02 (at $0.0043/min)
- OpenAI gpt-4o-mini: ~$0.005 (30 calls × ~500 tokens each)
- Total: ~$0.025 per call (~₹2)

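A quick sanity check of the arithmetic, using the rates quoted in the bullets (the OpenAI figure is taken as the lump sum stated above):

```typescript
// Rough per-call cost check using the rates quoted in the estimate.
const minutes = 5;
const deepgramPerMin = 0.0043;   // Nova streaming rate quoted above
const openaiPerCall = 0.005;     // ~30 suggestion requests x ~500 tokens
const total = minutes * deepgramPerMin + openaiPerCall;
console.log(total.toFixed(4));   // "0.0265" USD, i.e. roughly the ~$0.025 quoted
```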
## Out of scope

- Agent mic transcription (only customer audio for now — the agent's words are visible in the AI suggestions context)
- Voice response from AI (text only)
- Persistent transcript storage (future: save to Call record after call ends)
- Multi-language support (English only for now)
public/helix-logo.png (new binary file, 918 KiB; binary file not shown)
@@ -1,7 +1,14 @@
 import type { FC, HTMLAttributes } from "react";
 import { useCallback, useEffect, useRef } from "react";
 import type { Placement } from "@react-types/overlays";
-import { ChevronSelectorVertical, LogOut01, PhoneCall01, Settings01, User01 } from "@untitledui/icons";
+import { ChevronSelectorVertical } from "@untitledui/icons";
+import { FontAwesomeIcon } from "@fortawesome/react-fontawesome";
+import { faUser, faGear, faArrowRightFromBracket, faPhoneVolume } from "@fortawesome/pro-duotone-svg-icons";
+
+const IconUser: FC<{ className?: string }> = ({ className }) => <FontAwesomeIcon icon={faUser} className={className} />;
+const IconSettings: FC<{ className?: string }> = ({ className }) => <FontAwesomeIcon icon={faGear} className={className} />;
+const IconLogout: FC<{ className?: string }> = ({ className }) => <FontAwesomeIcon icon={faArrowRightFromBracket} className={className} />;
+const IconForceReady: FC<{ className?: string }> = ({ className }) => <FontAwesomeIcon icon={faPhoneVolume} className={className} />;
 import { useFocusManager } from "react-aria";
 import type { DialogProps as AriaDialogProps } from "react-aria-components";
 import { Button as AriaButton, Dialog as AriaDialog, DialogTrigger as AriaDialogTrigger, Popover as AriaPopover } from "react-aria-components";
@@ -67,14 +74,14 @@ export const NavAccountMenu = ({
         >
             <div className="rounded-xl bg-primary ring-1 ring-secondary">
                 <div className="flex flex-col gap-0.5 py-1.5">
-                    <NavAccountCardMenuItem label="View profile" icon={User01} shortcut="⌘K->P" />
-                    <NavAccountCardMenuItem label="Account settings" icon={Settings01} shortcut="⌘S" />
-                    <NavAccountCardMenuItem label="Force Ready" icon={PhoneCall01} onClick={onForceReady} />
+                    <NavAccountCardMenuItem label="View profile" icon={IconUser} shortcut="⌘K->P" />
+                    <NavAccountCardMenuItem label="Account settings" icon={IconSettings} shortcut="⌘S" />
+                    <NavAccountCardMenuItem label="Force Ready" icon={IconForceReady} onClick={onForceReady} />
                 </div>
             </div>

             <div className="pt-1 pb-1.5">
-                <NavAccountCardMenuItem label="Sign out" icon={LogOut01} shortcut="⌥⇧Q" onClick={onSignOut} />
+                <NavAccountCardMenuItem label="Sign out" icon={IconLogout} shortcut="⌥⇧Q" onClick={onSignOut} />
             </div>
         </AriaDialog>
     );

@@ -3,6 +3,7 @@ import { FontAwesomeIcon } from '@fortawesome/react-fontawesome';
 import {
   faPhone, faPhoneHangup, faMicrophone, faMicrophoneSlash,
   faPause, faPlay, faCalendarPlus, faCheckCircle,
+  faPhoneArrowRight, faRecordVinyl,
 } from '@fortawesome/pro-duotone-svg-icons';
 import { Button } from '@/components/base/buttons/button';
 import { Badge } from '@/components/base/badges/badges';
@@ -12,6 +13,7 @@ import { setOutboundPending } from '@/state/sip-manager';
 import { useSip } from '@/providers/sip-provider';
 import { DispositionForm } from './disposition-form';
 import { AppointmentForm } from './appointment-form';
+import { TransferDialog } from './transfer-dialog';
 import { formatPhone } from '@/lib/format';
 import { apiClient } from '@/lib/api-client';
 import { notify } from '@/lib/toast';
@@ -39,6 +41,8 @@ export const ActiveCallCard = ({ lead, callerPhone }: ActiveCallCardProps) => {
   const [savedDisposition, setSavedDisposition] = useState<CallDisposition | null>(null);
   const [appointmentOpen, setAppointmentOpen] = useState(false);
   const [appointmentBookedDuringCall, setAppointmentBookedDuringCall] = useState(false);
+  const [transferOpen, setTransferOpen] = useState(false);
+  const [recordingPaused, setRecordingPaused] = useState(false);
   // Capture direction at mount — survives through disposition stage
   const callDirectionRef = useRef(callState === 'ringing-out' ? 'OUTBOUND' : 'INBOUND');

@@ -248,11 +252,36 @@ export const ActiveCallCard = ({ lead, callerPhone }: ActiveCallCardProps) => {
       <Button size="sm" color="secondary"
         iconLeading={({ className }: { className?: string }) => <FontAwesomeIcon icon={faCalendarPlus} className={className} />}
         onClick={() => setAppointmentOpen(true)}>Book Appt</Button>
+      <Button size="sm" color="secondary"
+        iconLeading={({ className }: { className?: string }) => <FontAwesomeIcon icon={faPhoneArrowRight} className={className} />}
+        onClick={() => setTransferOpen(!transferOpen)}>Transfer</Button>
+      <Button size="sm" color={recordingPaused ? 'primary-destructive' : 'secondary'}
+        iconLeading={({ className }: { className?: string }) => <FontAwesomeIcon icon={faRecordVinyl} className={className} />}
+        onClick={() => {
+          const action = recordingPaused ? 'unPause' : 'pause';
+          if (callUcid) {
+            apiClient.post('/api/ozonetel/recording', { ucid: callUcid, action }).catch(() => {});
+          }
+          setRecordingPaused(!recordingPaused);
+        }}>{recordingPaused ? 'Resume Rec' : 'Pause Rec'}</Button>
       <Button size="sm" color="primary-destructive" className="ml-auto"
         iconLeading={({ className }: { className?: string }) => <FontAwesomeIcon icon={faPhoneHangup} className={className} />}
         onClick={() => { hangup(); setPostCallStage('disposition'); }}>End</Button>
     </div>

+    {/* Transfer dialog */}
+    {transferOpen && callUcid && (
+      <TransferDialog
+        ucid={callUcid}
+        onClose={() => setTransferOpen(false)}
+        onTransferred={() => {
+          setTransferOpen(false);
+          hangup();
+          setPostCallStage('disposition');
+        }}
+      />
+    )}
+
     {/* Appointment form accessible during call */}
     <AppointmentForm
       isOpen={appointmentOpen}

@@ -2,6 +2,8 @@ import { useEffect, useState } from 'react';
 import { FontAwesomeIcon } from '@fortawesome/react-fontawesome';
 import { faSparkles, faUser } from '@fortawesome/pro-duotone-svg-icons';
 import { AiChatPanel } from './ai-chat-panel';
+import { LiveTranscript } from './live-transcript';
+import { useCallAssist } from '@/hooks/use-call-assist';
 import { Badge } from '@/components/base/badges/badges';
 import { formatPhone, formatShortDate } from '@/lib/format';
 import { cx } from '@/utils/cx';
@@ -13,9 +15,11 @@ interface ContextPanelProps {
   selectedLead: Lead | null;
   activities: LeadActivity[];
   callerPhone?: string;
+  isInCall?: boolean;
+  callUcid?: string | null;
 }

-export const ContextPanel = ({ selectedLead, activities, callerPhone }: ContextPanelProps) => {
+export const ContextPanel = ({ selectedLead, activities, callerPhone, isInCall, callUcid }: ContextPanelProps) => {
   const [activeTab, setActiveTab] = useState<ContextTab>('ai');

   // Auto-switch to lead 360 when a lead is selected
@@ -25,6 +29,13 @@ export const ContextPanel = ({ selectedLead, activities, callerPhone }: ContextP
     }
   }, [selectedLead?.id]);

+  const { transcript, suggestions, connected: assistConnected } = useCallAssist(
+    isInCall ?? false,
+    callUcid ?? null,
+    selectedLead?.id ?? null,
+    callerPhone ?? null,
+  );
+
   const callerContext = selectedLead ? {
     callerPhone: selectedLead.contactPhone?.[0]?.number ?? callerPhone,
     leadId: selectedLead.id,
@@ -64,9 +75,13 @@ export const ContextPanel = ({ selectedLead, activities, callerPhone }: ContextP
       {/* Tab content */}
       <div className="flex-1 overflow-y-auto">
         {activeTab === 'ai' && (
+          isInCall ? (
+            <LiveTranscript transcript={transcript} suggestions={suggestions} connected={assistConnected} />
+          ) : (
             <div className="flex h-full flex-col p-4">
               <AiChatPanel callerContext={callerContext} />
             </div>
+          )
         )}
         {activeTab === 'lead360' && (
           <Lead360Tab lead={selectedLead} activities={activities} />

src/components/call-desk/live-transcript.tsx (new file, +90 lines)
|
||||
import { useEffect, useRef } from 'react';
|
||||
import { FontAwesomeIcon } from '@fortawesome/react-fontawesome';
|
||||
import { faSparkles, faMicrophone } from '@fortawesome/pro-duotone-svg-icons';
|
||||
import { cx } from '@/utils/cx';
|
||||
|
||||
type TranscriptLine = {
|
||||
id: string;
|
||||
text: string;
|
||||
isFinal: boolean;
|
||||
timestamp: Date;
|
||||
};
|
||||
|
||||
type Suggestion = {
|
||||
id: string;
|
||||
text: string;
|
||||
timestamp: Date;
|
||||
};
|
||||
|
||||
type LiveTranscriptProps = {
|
||||
transcript: TranscriptLine[];
|
||||
suggestions: Suggestion[];
|
||||
connected: boolean;
|
||||
};
|
||||
|
||||
export const LiveTranscript = ({ transcript, suggestions, connected }: LiveTranscriptProps) => {
|
||||
const scrollRef = useRef<HTMLDivElement>(null);
|
||||
|
||||
useEffect(() => {
|
||||
if (scrollRef.current) {
|
||||
scrollRef.current.scrollTop = scrollRef.current.scrollHeight;
|
||||
}
|
||||
}, [transcript.length, suggestions.length]);
|
||||
|
||||
// Merge transcript and suggestions by timestamp
|
||||
const items = [
|
||||
...transcript.map(t => ({ ...t, kind: 'transcript' as const })),
|
||||
...suggestions.map(s => ({ ...s, kind: 'suggestion' as const, isFinal: true })),
|
||||
].sort((a, b) => a.timestamp.getTime() - b.timestamp.getTime());
|
||||
|
||||
return (
|
||||
<div className="flex flex-1 flex-col overflow-hidden">
|
||||
{/* Header */}
|
||||
<div className="flex items-center gap-2 px-4 py-3 border-b border-secondary">
|
||||
<FontAwesomeIcon icon={faSparkles} className="size-3.5 text-fg-brand-primary" />
|
||||
<span className="text-xs font-bold uppercase tracking-wider text-brand-secondary">Live Assist</span>
|
||||
<div className={cx(
|
||||
"ml-auto size-2 rounded-full",
|
||||
connected ? "bg-success-solid" : "bg-disabled",
|
||||
)} />
|
||||
</div>
|
||||
|
||||
{/* Transcript body */}
|
||||
<div ref={scrollRef} className="flex-1 overflow-y-auto px-4 py-3 space-y-2">
|
||||
{items.length === 0 && (
|
||||
<div className="flex flex-col items-center justify-center py-8 text-center">
|
||||
<FontAwesomeIcon icon={faMicrophone} className="size-6 text-fg-quaternary mb-2" />
|
||||
<p className="text-xs text-quaternary">Listening to customer...</p>
|
||||
<p className="text-xs text-quaternary">Transcript will appear here</p>
|
||||
</div>
|
||||
)}
|
||||
|
||||
{items.map(item => {
|
||||
if (item.kind === 'suggestion') {
|
||||
return (
|
||||
<div key={item.id} className="rounded-lg bg-brand-primary p-3 border border-brand">
|
||||
<div className="flex items-center gap-1.5 mb-1">
|
||||
<FontAwesomeIcon icon={faSparkles} className="size-3 text-fg-brand-primary" />
|
||||
<span className="text-xs font-semibold text-brand-secondary">AI Suggestion</span>
|
||||
</div>
|
||||
<p className="text-sm text-primary">{item.text}</p>
|
||||
</div>
|
||||
);
|
||||
}
|
||||
|
||||
return (
|
||||
<div key={item.id} className={cx(
|
||||
"text-sm",
|
||||
item.isFinal ? "text-primary" : "text-tertiary italic",
|
||||
)}>
|
||||
<span className="text-xs text-quaternary mr-2">
|
||||
{item.timestamp.toLocaleTimeString('en-IN', { hour: '2-digit', minute: '2-digit', second: '2-digit' })}
|
||||
</span>
|
||||
{item.text}
|
||||
</div>
|
||||
);
|
||||
})}
|
||||
</div>
|
||||
</div>
|
||||
);
|
||||
};
|
||||
src/components/call-desk/phone-action-cell.tsx (new file, +150 lines)
|
||||
import { useState, useRef, useEffect } from 'react';
|
||||
import { FontAwesomeIcon } from '@fortawesome/react-fontawesome';
|
||||
import { faPhone, faCommentDots, faEllipsisVertical, faMessageDots } from '@fortawesome/pro-duotone-svg-icons';
|
||||
import { useSetAtom } from 'jotai';
|
||||
import { useSip } from '@/providers/sip-provider';
|
||||
import { sipCallStateAtom, sipCallerNumberAtom, sipCallUcidAtom } from '@/state/sip-state';
|
||||
import { setOutboundPending } from '@/state/sip-manager';
|
||||
import { apiClient } from '@/lib/api-client';
|
||||
import { notify } from '@/lib/toast';
|
||||
import { cx } from '@/utils/cx';
|
||||
|
||||
type PhoneActionCellProps = {
|
||||
phoneNumber: string;
|
||||
displayNumber: string;
|
||||
leadId?: string;
|
||||
};
|
||||
|
||||
export const PhoneActionCell = ({ phoneNumber, displayNumber, leadId: _leadId }: PhoneActionCellProps) => {
|
||||
const { isRegistered, isInCall } = useSip();
|
||||
const setCallState = useSetAtom(sipCallStateAtom);
|
||||
const setCallerNumber = useSetAtom(sipCallerNumberAtom);
|
||||
const setCallUcid = useSetAtom(sipCallUcidAtom);
|
||||
const [menuOpen, setMenuOpen] = useState(false);
|
||||
const [dialing, setDialing] = useState(false);
|
||||
const menuRef = useRef<HTMLDivElement>(null);
|
||||
const touchTimer = useRef<number | null>(null);
|
||||
|
||||
// Close menu on click outside
|
||||
useEffect(() => {
|
||||
if (!menuOpen) return;
|
||||
const handleClick = (e: MouseEvent) => {
|
||||
if (menuRef.current && !menuRef.current.contains(e.target as Node)) {
|
||||
setMenuOpen(false);
|
||||
}
|
||||
};
|
||||
document.addEventListener('mousedown', handleClick);
|
||||
return () => document.removeEventListener('mousedown', handleClick);
|
||||
}, [menuOpen]);
|
||||
|
||||
const handleCall = async () => {
|
||||
if (!isRegistered || isInCall || dialing) return;
|
||||
setMenuOpen(false);
|
||||
setDialing(true);
|
||||
setCallState('ringing-out');
|
||||
setCallerNumber(phoneNumber);
|
||||
setOutboundPending(true);
|
||||
const safetyTimer = setTimeout(() => setOutboundPending(false), 30000);
|
||||
|
||||
try {
|
||||
const result = await apiClient.post<{ ucid?: string }>('/api/ozonetel/dial', { phoneNumber });
|
||||
if (result?.ucid) setCallUcid(result.ucid);
|
||||
} catch {
|
||||
clearTimeout(safetyTimer);
|
||||
setCallState('idle');
|
||||
setCallerNumber(null);
|
||||
setOutboundPending(false);
|
||||
setCallUcid(null);
|
||||
notify.error('Dial Failed', 'Could not place the call');
|
||||
} finally {
|
||||
setDialing(false);
|
||||
}
|
||||
};
|
||||
|
||||
const handleSms = () => {
|
||||
setMenuOpen(false);
|
||||
window.open(`sms:+91${phoneNumber}`, '_self');
|
||||
};
|
||||
|
||||
const handleWhatsApp = () => {
|
||||
setMenuOpen(false);
|
||||
window.open(`https://wa.me/91${phoneNumber}`, '_blank');
|
||||
};
|
||||
|
||||
// Long-press for mobile
|
||||
const onTouchStart = () => {
|
||||
touchTimer.current = window.setTimeout(() => setMenuOpen(true), 500);
|
||||
};
|
||||
|
||||
const onTouchEnd = () => {
|
||||
if (touchTimer.current) {
|
||||
clearTimeout(touchTimer.current);
|
||||
touchTimer.current = null;
|
||||
}
|
||||
};
|
||||
|
||||
const canCall = isRegistered && !isInCall && !dialing;
|
||||
|
||||
return (
|
||||
<div className="relative flex items-center gap-1" ref={menuRef}>
|
||||
{/* Clickable phone number — calls directly */}
|
||||
<button
|
||||
type="button"
|
||||
onClick={handleCall}
|
||||
onTouchStart={onTouchStart}
|
||||
onTouchEnd={onTouchEnd}
|
||||
onContextMenu={(e) => { e.preventDefault(); setMenuOpen(true); }}
|
||||
disabled={!canCall}
|
||||
className={cx(
|
||||
'flex items-center gap-1.5 rounded-md px-1.5 py-1 text-sm transition duration-100 ease-linear',
|
||||
canCall
|
||||
? 'cursor-pointer text-brand-secondary hover:bg-brand-primary hover:text-brand-secondary'
|
||||
: 'cursor-default text-tertiary',
|
||||
)}
|
||||
>
|
||||
<FontAwesomeIcon icon={faPhone} className="size-3" />
|
||||
<span className="whitespace-nowrap">{displayNumber}</span>
|
||||
</button>
|
||||
|
||||
{/* Kebab menu trigger — desktop */}
|
||||
<button
|
||||
type="button"
|
||||
onClick={(e) => { e.stopPropagation(); setMenuOpen(!menuOpen); }}
|
||||
className="flex size-6 items-center justify-center rounded-md text-fg-quaternary opacity-0 group-hover/row:opacity-100 hover:text-fg-secondary hover:bg-primary_hover transition duration-100 ease-linear"
|
||||
>
|
||||
<FontAwesomeIcon icon={faEllipsisVertical} className="size-3" />
|
||||
</button>
|
||||
|
||||
{/* Context menu */}
|
||||
{menuOpen && (
|
||||
<div className="absolute left-0 top-full z-50 mt-1 w-40 rounded-lg bg-primary shadow-lg ring-1 ring-secondary py-1">
|
||||
<button
|
||||
type="button"
|
||||
onClick={handleCall}
|
||||
disabled={!canCall}
|
||||
className="flex w-full items-center gap-2 px-3 py-2 text-sm text-secondary hover:bg-primary_hover disabled:text-disabled"
|
||||
>
|
||||
<FontAwesomeIcon icon={faPhone} className="size-3.5 text-fg-success-secondary" />
|
||||
Call
|
||||
</button>
|
||||
<button
|
||||
type="button"
|
||||
onClick={handleSms}
|
||||
className="flex w-full items-center gap-2 px-3 py-2 text-sm text-secondary hover:bg-primary_hover"
|
||||
>
|
||||
<FontAwesomeIcon icon={faCommentDots} className="size-3.5 text-fg-brand-secondary" />
|
||||
SMS
|
||||
</button>
|
||||
<button
|
||||
type="button"
|
||||
onClick={handleWhatsApp}
|
||||
className="flex w-full items-center gap-2 px-3 py-2 text-sm text-secondary hover:bg-primary_hover"
|
||||
>
|
||||
<FontAwesomeIcon icon={faMessageDots} className="size-3.5 text-[#25D366]" />
|
||||
WhatsApp
|
||||
</button>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
);
|
||||
};
|
||||
src/components/call-desk/transfer-dialog.tsx (new file, +91 lines)
|
||||
import { useState } from 'react';
|
||||
import { FontAwesomeIcon } from '@fortawesome/react-fontawesome';
|
||||
import { faXmark } from '@fortawesome/pro-duotone-svg-icons';
|
||||
import { Input } from '@/components/base/input/input';
|
||||
import { Button } from '@/components/base/buttons/button';
|
||||
import { apiClient } from '@/lib/api-client';
|
||||
import { notify } from '@/lib/toast';
|
||||
|
||||
type TransferDialogProps = {
|
||||
ucid: string;
|
||||
onClose: () => void;
|
||||
onTransferred: () => void;
|
||||
};
|
||||
|
||||
export const TransferDialog = ({ ucid, onClose, onTransferred }: TransferDialogProps) => {
|
||||
const [number, setNumber] = useState('');
|
||||
const [transferring, setTransferring] = useState(false);
|
||||
const [stage, setStage] = useState<'input' | 'connected'>('input');
|
||||
|
||||
const handleConference = async () => {
|
||||
if (!number.trim()) return;
|
||||
setTransferring(true);
|
||||
try {
|
||||
await apiClient.post('/api/ozonetel/call-control', {
|
||||
action: 'CONFERENCE',
|
||||
ucid,
|
||||
        conferenceNumber: `0${number.replace(/\D/g, '')}`,
      });
      notify.success('Connected', 'Third party connected. Click Complete to transfer.');
      setStage('connected');
    } catch {
      notify.error('Transfer Failed', 'Could not connect to the target number');
    } finally {
      setTransferring(false);
    }
  };

  const handleComplete = async () => {
    setTransferring(true);
    try {
      await apiClient.post('/api/ozonetel/call-control', {
        action: 'KICK_CALL',
        ucid,
        conferenceNumber: `0${number.replace(/\D/g, '')}`,
      });
      notify.success('Transferred', 'Call transferred successfully');
      onTransferred();
    } catch {
      notify.error('Transfer Failed', 'Could not complete transfer');
    } finally {
      setTransferring(false);
    }
  };

  return (
    <div className="mt-3 rounded-lg border border-secondary bg-secondary p-3">
      <div className="flex items-center justify-between mb-2">
        <span className="text-xs font-semibold text-secondary">Transfer Call</span>
        <button onClick={onClose} className="text-fg-quaternary hover:text-fg-secondary transition duration-100 ease-linear">
          <FontAwesomeIcon icon={faXmark} className="size-3" />
        </button>
      </div>
      {stage === 'input' ? (
        <div className="flex gap-2">
          <Input
            size="sm"
            placeholder="Enter phone number"
            value={number}
            onChange={setNumber}
          />
          <Button
            size="sm"
            color="primary"
            isLoading={transferring}
            onClick={handleConference}
            isDisabled={!number.trim()}
          >
            Connect
          </Button>
        </div>
      ) : (
        <div className="flex items-center justify-between">
          <span className="text-xs text-tertiary">Connected to {number}</span>
          <Button size="sm" color="primary" isLoading={transferring} onClick={handleComplete}>
            Complete Transfer
          </Button>
        </div>
      )}
    </div>
  );
};
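Both call-control requests above post the same shape: an `action`, the call's `ucid`, and a target number normalized to digits with a leading `0`. A standalone sketch of that payload construction — `buildCallControlPayload` is an illustrative name, not a helper that exists in the repo, and only the fields visible in this fragment are included:

```typescript
// Hypothetical helper mirroring the bodies sent to /api/ozonetel/call-control
// above. The action/ucid/conferenceNumber fields match the component's
// apiClient.post calls; the function itself is a sketch for illustration.
type CallControlAction = 'CONFERENCE' | 'KICK_CALL';

function buildCallControlPayload(action: CallControlAction, ucid: string, rawNumber: string) {
  // Strip every non-digit, then prefix the trunk "0", as the component does.
  const conferenceNumber = `0${rawNumber.replace(/\D/g, '')}`;
  return { action, ucid, conferenceNumber };
}

const payload = buildCallControlPayload('KICK_CALL', 'u-123', '+91 98765-43210');
// payload.conferenceNumber is "0919876543210"
```

The normalization is idempotent-friendly for messy input (spaces, dashes, `+` prefixes all collapse to the same dial string), which is why both the CONFERENCE and KICK_CALL steps can reuse it on the raw user-entered number.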
@@ -1,4 +1,4 @@
import { useCallback, useMemo, useState } from 'react';
import { useCallback, useEffect, useMemo, useRef, useState } from 'react';
import type { FC, HTMLAttributes } from 'react';
import { FontAwesomeIcon } from '@fortawesome/react-fontawesome';
import {
@@ -10,8 +10,9 @@ import { Table } from '@/components/application/table/table';
import { Badge } from '@/components/base/badges/badges';
import { Input } from '@/components/base/input/input';
import { Tabs, TabList, Tab } from '@/components/application/tabs/tabs';
import { ClickToCallButton } from './click-to-call-button';
import { PhoneActionCell } from './phone-action-cell';
import { formatPhone } from '@/lib/format';
import { notify } from '@/lib/toast';
import { cx } from '@/utils/cx';

type WorklistLead = {
@@ -24,6 +25,10 @@ type WorklistLead = {
  interestedService: string | null;
  aiSummary: string | null;
  aiSuggestedAction: string | null;
  lastContacted: string | null;
  contactAttempts: number | null;
  utmCampaign: string | null;
  campaignId: string | null;
};

type WorklistFollowUp = {
@@ -42,6 +47,7 @@ type MissedCall = {
  callerNumber: { number: string; callingCode: string }[] | null;
  startedAt: string | null;
  leadId: string | null;
  disposition: string | null;
};

interface WorklistPanelProps {
@@ -55,7 +61,6 @@ interface WorklistPanelProps {

type TabKey = 'all' | 'missed' | 'callbacks' | 'follow-ups';

// Unified row type for the table
type WorklistRow = {
  id: string;
  type: 'missed' | 'callback' | 'follow-up' | 'lead';
@@ -70,6 +75,10 @@ type WorklistRow = {
  taskState: 'PENDING' | 'ATTEMPTED' | 'SCHEDULED';
  leadId: string | null;
  originalLead: WorklistLead | null;
  lastContactedAt: string | null;
  contactAttempts: number;
  source: string | null;
  lastDisposition: string | null;
};

const priorityConfig: Record<string, { color: 'error' | 'warning' | 'brand' | 'gray'; label: string; sort: number }> = {
@@ -87,9 +96,8 @@ const followUpLabel: Record<string, string> = {
  REVIEW_REQUEST: 'Review',
};

// Compute SLA: minutes since created, color-coded
const computeSla = (createdAt: string): { label: string; color: 'success' | 'warning' | 'error' } => {
  const minutes = Math.max(0, Math.round((Date.now() - new Date(createdAt).getTime()) / 60000));
const computeSla = (dateStr: string): { label: string; color: 'success' | 'warning' | 'error' } => {
  const minutes = Math.max(0, Math.round((Date.now() - new Date(dateStr).getTime()) / 60000));
  if (minutes < 1) return { label: '<1m', color: 'success' };
  if (minutes < 15) return { label: `${minutes}m`, color: 'success' };
  if (minutes < 30) return { label: `${minutes}m`, color: 'warning' };
@@ -99,6 +107,30 @@ const computeSla = (createdAt: string): { label: string; color: 'success' | 'war
  return { label: `${Math.floor(hours / 24)}d`, color: 'error' };
};
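The SLA buckets in `computeSla` can be exercised in isolation. The branch between 30 minutes and the day rollover falls inside lines elided by the diff hunk, so in this sketch the `<60m` and `<24h` escalation steps are assumptions (marked in comments); the sub-minute, 15-minute, 30-minute, and day buckets are taken directly from the code above:

```typescript
// Standalone replica of computeSla's thresholds, taking minutes directly
// instead of a date string so it can be tested without clock mocking.
type SlaColor = 'success' | 'warning' | 'error';

function slaFor(minutes: number): { label: string; color: SlaColor } {
  if (minutes < 1) return { label: '<1m', color: 'success' };
  if (minutes < 15) return { label: `${minutes}m`, color: 'success' };
  if (minutes < 30) return { label: `${minutes}m`, color: 'warning' };
  if (minutes < 60) return { label: `${minutes}m`, color: 'error' }; // assumed: elided in the diff hunk
  const hours = Math.floor(minutes / 60);
  if (hours < 24) return { label: `${hours}h`, color: 'error' };     // assumed: elided in the diff hunk
  return { label: `${Math.floor(hours / 24)}d`, color: 'error' };
}
```

Note the signature change in the diff (`createdAt` renamed to `dateStr`) matters downstream: the table now passes `row.lastContactedAt ?? row.createdAt`, so the SLA clock measures time since the last interaction, not since row creation.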

const formatTimeAgo = (dateStr: string): string => {
  const minutes = Math.round((Date.now() - new Date(dateStr).getTime()) / 60000);
  if (minutes < 1) return 'Just now';
  if (minutes < 60) return `${minutes}m ago`;
  const hours = Math.floor(minutes / 60);
  if (hours < 24) return `${hours}h ago`;
  return `${Math.floor(hours / 24)}d ago`;
};

const formatDisposition = (disposition: string): string =>
  disposition.replace(/_/g, ' ').replace(/\b\w/g, c => c.toUpperCase());

const formatSource = (source: string): string => {
  const map: Record<string, string> = {
    FACEBOOK_AD: 'Facebook',
    GOOGLE_AD: 'Google',
    WALK_IN: 'Walk-in',
    REFERRAL: 'Referral',
    WEBSITE: 'Website',
    PHONE_INQUIRY: 'Phone',
  };
  return map[source] ?? source.replace(/_/g, ' ');
};

const IconInbound: FC<HTMLAttributes<HTMLOrSVGElement>> = ({ className }) => (
  <FontAwesomeIcon icon={faPhoneArrowDown} className={className} />
);
@@ -127,6 +159,10 @@ const buildRows = (missedCalls: MissedCall[], followUps: WorklistFollowUp[], lea
      taskState: 'PENDING',
      leadId: call.leadId,
      originalLead: null,
      lastContactedAt: call.startedAt ?? call.createdAt,
      contactAttempts: 0,
      source: null,
      lastDisposition: call.disposition ?? null,
    });
  }

@@ -149,6 +185,10 @@ const buildRows = (missedCalls: MissedCall[], followUps: WorklistFollowUp[], lea
      taskState: isOverdue ? 'PENDING' : (fu.followUpStatus === 'COMPLETED' ? 'ATTEMPTED' : 'SCHEDULED'),
      leadId: null,
      originalLead: null,
      lastContactedAt: fu.scheduledAt ?? fu.createdAt ?? null,
      contactAttempts: 0,
      source: null,
      lastDisposition: null,
    });
  }

@@ -171,25 +211,24 @@ const buildRows = (missedCalls: MissedCall[], followUps: WorklistFollowUp[], lea
      taskState: 'PENDING',
      leadId: lead.id,
      originalLead: lead,
      lastContactedAt: lead.lastContacted ?? null,
      contactAttempts: lead.contactAttempts ?? 0,
      source: lead.leadSource ?? lead.utmCampaign ?? null,
      lastDisposition: null,
    });
  }

  // Sort by priority (urgent first), then by creation time (oldest first)
  rows.sort((a, b) => {
  // Remove rows without a phone number — agent can't act on them
  const actionableRows = rows.filter(r => r.phoneRaw);

  actionableRows.sort((a, b) => {
    const pa = priorityConfig[a.priority]?.sort ?? 2;
    const pb = priorityConfig[b.priority]?.sort ?? 2;
    if (pa !== pb) return pa - pb;
    return new Date(a.createdAt).getTime() - new Date(b.createdAt).getTime();
  });

  return rows;
};

const typeConfig: Record<WorklistRow['type'], { color: 'error' | 'brand' | 'blue-light' | 'gray' }> = {
  missed: { color: 'error' },
  callback: { color: 'brand' },
  'follow-up': { color: 'blue-light' },
  lead: { color: 'gray' },
  return actionableRows;
};

export const WorklistPanel = ({ missedCalls, followUps, leads, loading, onSelectLead, selectedLeadId }: WorklistPanelProps) => {
@@ -203,13 +242,10 @@ export const WorklistPanel = ({ missedCalls, followUps, leads, loading, onSelect

  const filteredRows = useMemo(() => {
    let rows = allRows;

    // Tab filter
    if (tab === 'missed') rows = rows.filter((r) => r.type === 'missed');
    else if (tab === 'callbacks') rows = rows.filter((r) => r.type === 'callback');
    else if (tab === 'follow-ups') rows = rows.filter((r) => r.type === 'follow-up');

    // Search filter
    if (search.trim()) {
      const q = search.toLowerCase();
      rows = rows.filter(
@@ -224,10 +260,18 @@ export const WorklistPanel = ({ missedCalls, followUps, leads, loading, onSelect
  const callbackCount = allRows.filter((r) => r.type === 'callback').length;
  const followUpCount = allRows.filter((r) => r.type === 'follow-up').length;

  // Notification for new missed calls
  const prevMissedCount = useRef(missedCount);
  useEffect(() => {
    if (missedCount > prevMissedCount.current && prevMissedCount.current > 0) {
      notify.info('New Missed Call', `${missedCount - prevMissedCount.current} new missed call(s)`);
    }
    prevMissedCount.current = missedCount;
  }, [missedCount]);

  const PAGE_SIZE = 15;
  const [page, setPage] = useState(1);

  // Reset page when filters change
  const handleTabChange = useCallback((key: TabKey) => { setTab(key); setPage(1); }, []);
  const handleSearch = useCallback((value: string) => { setSearch(value); setPage(1); }, []);

@@ -262,7 +306,7 @@ export const WorklistPanel = ({ missedCalls, followUps, leads, loading, onSelect

  return (
    <div className="flex flex-1 flex-col">
      {/* Filter tabs + search — single row */}
      {/* Filter tabs + search */}
      <div className="flex items-end justify-between border-b border-secondary px-5 pt-3 pb-0.5">
        <Tabs selectedKey={tab} onSelectionChange={(key) => handleTabChange(key as TabKey)}>
          <TabList items={tabItems} type="underline" size="sm">
@@ -294,28 +338,29 @@ export const WorklistPanel = ({ missedCalls, followUps, leads, loading, onSelect
            <Table.Head label="PRIORITY" className="w-20" isRowHeader />
            <Table.Head label="PATIENT" />
            <Table.Head label="PHONE" />
            <Table.Head label="TYPE" />
            <Table.Head label="SLA" className="w-20" />
            <Table.Head label="ACTIONS" className="w-24" />
            <Table.Head label="SOURCE" className="w-28" />
            <Table.Head label="SLA" className="w-24" />
          </Table.Header>
          <Table.Body items={pagedRows}>
            {(row) => {
              const priority = priorityConfig[row.priority] ?? priorityConfig.NORMAL;
              const sla = computeSla(row.createdAt);
              const typeCfg = typeConfig[row.type];
              const sla = computeSla(row.lastContactedAt ?? row.createdAt);
              const isSelected = row.originalLead !== null && row.originalLead.id === selectedLeadId;

              // Sub-line: last interaction context
              const subLine = row.lastContactedAt
                ? `${formatTimeAgo(row.lastContactedAt)}${row.lastDisposition ? ` — ${formatDisposition(row.lastDisposition)}` : ''}`
                : row.reason || row.typeLabel;

              return (
                <Table.Row
                  id={row.id}
                  className={cx(
                    'cursor-pointer',
                    'cursor-pointer group/row',
                    isSelected && 'bg-brand-primary',
                  )}
                  onAction={() => {
                    if (row.originalLead) {
                      onSelectLead(row.originalLead);
                    }
                    if (row.originalLead) onSelectLead(row.originalLead);
                  }}
                >
                  <Table.Cell>
@@ -326,44 +371,46 @@ export const WorklistPanel = ({ missedCalls, followUps, leads, loading, onSelect
                  <Table.Cell>
                    <div className="flex items-center gap-2">
                      {row.direction === 'inbound' && (
                        <IconInbound className="size-3.5 text-fg-success-secondary" />
                        <IconInbound className="size-3.5 text-fg-success-secondary shrink-0" />
                      )}
                      {row.direction === 'outbound' && (
                        <IconOutbound className="size-3.5 text-fg-brand-secondary" />
                        <IconOutbound className="size-3.5 text-fg-brand-secondary shrink-0" />
                      )}
                      <span className="text-sm font-medium text-primary truncate max-w-[140px]">
                      <div className="min-w-0">
                        <span className="text-sm font-medium text-primary truncate block max-w-[180px]">
                          {row.name}
                        </span>
                        <span className="text-xs text-tertiary truncate block max-w-[200px]">
                          {subLine}
                        </span>
                      </div>
                    </div>
                  </Table.Cell>
                  <Table.Cell>
                    <span className="text-sm text-tertiary whitespace-nowrap">
                      {row.phone || '\u2014'}
                    </span>
                    {row.phoneRaw ? (
                      <PhoneActionCell
                        phoneNumber={row.phoneRaw}
                        displayNumber={row.phone}
                        leadId={row.leadId ?? undefined}
                      />
                    ) : (
                      <span className="text-xs text-quaternary italic">No phone</span>
                    )}
                  </Table.Cell>
                  <Table.Cell>
                    <Badge size="sm" color={typeCfg.color} type="pill-color">
                      {row.typeLabel}
                    </Badge>
                    {row.source ? (
                      <span className="text-xs text-tertiary truncate block max-w-[100px]">
                        {formatSource(row.source)}
                      </span>
                    ) : (
                      <span className="text-xs text-quaternary">—</span>
                    )}
                  </Table.Cell>
                  <Table.Cell>
                    <Badge size="sm" color={sla.color} type="pill-color">
                      {sla.label}
                    </Badge>
                  </Table.Cell>
                  <Table.Cell>
                    <div className="flex items-center gap-1">
                      {row.phoneRaw ? (
                        <ClickToCallButton
                          phoneNumber={row.phoneRaw}
                          leadId={row.leadId ?? undefined}
                          size="sm"
                        />
                      ) : (
                        <span className="text-xs text-quaternary">No phone</span>
                      )}
                    </div>
                  </Table.Cell>
                </Table.Row>
              );
            }}
src/hooks/use-call-assist.ts — new file (90 lines)
@@ -0,0 +1,90 @@
import { useEffect, useRef, useState, useCallback } from 'react';
import { io, Socket } from 'socket.io-client';
import { startAudioCapture, stopAudioCapture } from '@/lib/audio-capture';
import { getSipClient } from '@/state/sip-manager';

const SIDECAR_URL = import.meta.env.VITE_SIDECAR_URL ?? 'http://localhost:4100';

type TranscriptLine = {
  id: string;
  text: string;
  isFinal: boolean;
  timestamp: Date;
};

type Suggestion = {
  id: string;
  text: string;
  timestamp: Date;
};

export const useCallAssist = (
  active: boolean,
  ucid: string | null,
  leadId: string | null,
  callerPhone: string | null,
) => {
  const [transcript, setTranscript] = useState<TranscriptLine[]>([]);
  const [suggestions, setSuggestions] = useState<Suggestion[]>([]);
  const [connected, setConnected] = useState(false);
  const socketRef = useRef<Socket | null>(null);
  const idCounter = useRef(0);

  const nextId = useCallback(() => `ca-${++idCounter.current}`, []);

  useEffect(() => {
    if (!active || !ucid) return;

    const socket = io(`${SIDECAR_URL}/call-assist`);
    socketRef.current = socket;

    socket.on('connect', () => {
      setConnected(true);
      socket.emit('call-assist:start', { ucid, leadId, callerPhone });

      // Start capturing remote audio from the SIP session
      const sipClient = getSipClient();
      const audioElement = sipClient?.getAudioElement();
      if (audioElement?.srcObject) {
        startAudioCapture(audioElement.srcObject as MediaStream, (chunk) => {
          socket.emit('call-assist:audio', chunk);
        });
      }
    });

    socket.on('call-assist:transcript', (data: { text: string; isFinal: boolean }) => {
      if (!data.text.trim()) return;
      setTranscript(prev => {
        if (!data.isFinal) {
          const finals = prev.filter(l => l.isFinal);
          return [...finals, { id: nextId(), text: data.text, isFinal: false, timestamp: new Date() }];
        }
        const finals = prev.filter(l => l.isFinal);
        return [...finals, { id: nextId(), text: data.text, isFinal: true, timestamp: new Date() }];
      });
    });

    socket.on('call-assist:suggestion', (data: { text: string }) => {
      setSuggestions(prev => [...prev, { id: nextId(), text: data.text, timestamp: new Date() }]);
    });

    socket.on('disconnect', () => setConnected(false));

    return () => {
      stopAudioCapture();
      socket.emit('call-assist:stop');
      socket.disconnect();
      socketRef.current = null;
      setConnected(false);
    };
  }, [active, ucid, leadId, callerPhone, nextId]);

  useEffect(() => {
    if (!active) {
      setTranscript([]);
      setSuggestions([]);
    }
  }, [active]);

  return { transcript, suggestions, connected };
};
@@ -47,6 +47,8 @@ type WorklistLead = {
  isSpam: boolean | null;
  aiSummary: string | null;
  aiSuggestedAction: string | null;
  lastContacted: string | null;
  utmCampaign: string | null;
};

type WorklistData = {
src/lib/audio-capture.ts — new file (45 lines)
@@ -0,0 +1,45 @@
type AudioChunkCallback = (chunk: ArrayBuffer) => void;

let audioContext: AudioContext | null = null;
let mediaStreamSource: MediaStreamAudioSourceNode | null = null;
let scriptProcessor: ScriptProcessorNode | null = null;

export function startAudioCapture(remoteStream: MediaStream, onChunk: AudioChunkCallback): void {
  stopAudioCapture();

  audioContext = new AudioContext({ sampleRate: 16000 });
  mediaStreamSource = audioContext.createMediaStreamSource(remoteStream);

  scriptProcessor = audioContext.createScriptProcessor(4096, 1, 1);

  scriptProcessor.onaudioprocess = (event) => {
    const inputData = event.inputBuffer.getChannelData(0);

    // Convert Float32 to Int16 PCM
    const pcm = new Int16Array(inputData.length);
    for (let i = 0; i < inputData.length; i++) {
      const s = Math.max(-1, Math.min(1, inputData[i]));
      pcm[i] = s < 0 ? s * 0x8000 : s * 0x7FFF;
    }

    onChunk(pcm.buffer);
  };

  mediaStreamSource.connect(scriptProcessor);
  scriptProcessor.connect(audioContext.destination);
}

export function stopAudioCapture(): void {
  if (scriptProcessor) {
    scriptProcessor.disconnect();
    scriptProcessor = null;
  }
  if (mediaStreamSource) {
    mediaStreamSource.disconnect();
    mediaStreamSource = null;
  }
  if (audioContext) {
    audioContext.close().catch(() => {});
    audioContext = null;
  }
}
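The Float32-to-Int16 conversion inside `onaudioprocess` is the piece worth testing in isolation, since it is pure math with no Web Audio dependencies. A self-contained sketch of the same mapping (`floatToPcm16` is an illustrative name; the capture code does this inline):

```typescript
// Same clamping + scaling as startAudioCapture's onaudioprocess handler:
// clamp to [-1, 1], scale negatives by 0x8000 and positives by 0x7FFF so
// the full Int16 range [-32768, 32767] is reachable.
function floatToPcm16(input: Float32Array): Int16Array {
  const pcm = new Int16Array(input.length);
  for (let i = 0; i < input.length; i++) {
    const s = Math.max(-1, Math.min(1, input[i]));
    pcm[i] = s < 0 ? s * 0x8000 : s * 0x7fff;
  }
  return pcm;
}

const pcm = floatToPcm16(new Float32Array([-1, -0.5, 0, 1, 2]));
// -1 maps to -32768, 1 maps to 32767, and the out-of-range 2 clamps to 32767
```

One caveat on the capture path itself: `ScriptProcessorNode` is deprecated in favor of `AudioWorkletNode`, which runs off the main thread; the conversion math stays identical if the capture is ever migrated.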
@@ -211,6 +211,10 @@ export class SIPClient {
    return this.ua?.isRegistered() ?? false;
  }

  getAudioElement(): HTMLAudioElement | null {
    return this.audioElement;
  }

  private resetSession(): void {
    this.currentSession = null;
    this.cleanupAudio();

@@ -9,14 +9,14 @@ import { WorklistPanel } from '@/components/call-desk/worklist-panel';
import type { WorklistLead } from '@/components/call-desk/worklist-panel';
import { ContextPanel } from '@/components/call-desk/context-panel';
import { ActiveCallCard } from '@/components/call-desk/active-call-card';
import { CallPrepCard } from '@/components/call-desk/call-prep-card';

import { BadgeWithDot, Badge } from '@/components/base/badges/badges';
import { cx } from '@/utils/cx';

export const CallDeskPage = () => {
  const { user } = useAuth();
  const { leadActivities } = useData();
  const { connectionStatus, isRegistered, callState, callerNumber } = useSip();
  const { connectionStatus, isRegistered, callState, callerNumber, callUcid } = useSip();
  const { missedCalls, followUps, marketingLeads, totalPending, loading } = useWorklist();
  const [selectedLead, setSelectedLead] = useState<WorklistLead | null>(null);
  const [contextOpen, setContextOpen] = useState(true);
@@ -66,9 +66,8 @@ export const CallDeskPage = () => {
      <div className="flex flex-1 flex-col overflow-y-auto">
        {/* Active call */}
        {isInCall && (
          <div className="space-y-4 p-5">
          <div className="p-5">
            <ActiveCallCard lead={activeLeadFull} callerPhone={callerNumber ?? ''} />
            <CallPrepCard lead={activeLeadFull} callerPhone={callerNumber ?? ''} activities={leadActivities} />
          </div>
        )}

@@ -95,6 +94,8 @@ export const CallDeskPage = () => {
            selectedLead={activeLeadFull}
            activities={leadActivities}
            callerPhone={callerNumber ?? undefined}
            isInCall={isInCall}
            callUcid={callUcid}
          />
        )}
      </div>

@@ -117,9 +117,7 @@ export const LoginPage = () => {
      <div className="relative z-10 flex flex-col gap-10 w-full max-w-[560px] px-12">
        {/* Logo lockup */}
        <div className="flex items-center gap-3">
          <div className="flex items-center justify-center bg-brand-solid rounded-xl p-2 size-10 shrink-0">
            <span className="text-white font-bold text-lg leading-none font-display">H</span>
          </div>
          <img src="/helix-logo.png" alt="Helix Engage" className="size-10 rounded-xl shrink-0" />
          <span className="text-white font-bold text-xl font-display tracking-tight">Helix Engage</span>
        </div>


@@ -29,6 +29,14 @@
  transition-timing-function: inherit;
}

/* FontAwesome duotone icon colors — uses brand tokens */
:root {
  --fa-primary-color: var(--color-fg-brand-primary);
  --fa-secondary-color: var(--color-fg-brand-secondary);
  --fa-primary-opacity: 1;
  --fa-secondary-opacity: 0.4;
}

html,
body {
  font-family: var(--font-body);

@@ -351,18 +351,18 @@
  --color-blue-light-900: rgb(11 74 111);
  --color-blue-light-950: rgb(6 44 65);

  --color-blue-25: rgb(245 250 255);
  --color-blue-50: rgb(239 248 255);
  --color-blue-100: rgb(209 233 255);
  --color-blue-200: rgb(178 221 255);
  --color-blue-300: rgb(132 202 255);
  --color-blue-400: rgb(83 177 253);
  --color-blue-500: rgb(46 144 250);
  --color-blue-600: rgb(21 112 239);
  --color-blue-700: rgb(23 92 211);
  --color-blue-800: rgb(24 73 169);
  --color-blue-900: rgb(25 65 133);
  --color-blue-950: rgb(16 42 86);
  --color-blue-25: rgb(246 249 253);
  --color-blue-50: rgb(235 243 250);
  --color-blue-100: rgb(214 230 245);
  --color-blue-200: rgb(178 207 235);
  --color-blue-300: rgb(138 180 220);
  --color-blue-400: rgb(96 150 200);
  --color-blue-500: rgb(56 120 180);
  --color-blue-600: rgb(32 96 160);
  --color-blue-700: rgb(24 76 132);
  --color-blue-800: rgb(18 60 108);
  --color-blue-900: rgb(14 46 84);
  --color-blue-950: rgb(8 28 56);

  --color-blue-dark-25: rgb(245 248 255);
  --color-blue-dark-50: rgb(239 244 255);
@@ -758,8 +758,8 @@
  --color-bg-brand-secondary: var(--color-brand-100);
  --color-bg-brand-solid: var(--color-brand-600);
  --color-bg-brand-solid_hover: var(--color-brand-700);
  --color-bg-brand-section: var(--color-brand-800);
  --color-bg-brand-section_subtle: var(--color-brand-700);
  --color-bg-brand-section: var(--color-brand-600);
  --color-bg-brand-section_subtle: var(--color-brand-500);

  /* COMPONENT COLORS */
  --color-app-store-badge-border: rgb(166 166 166);