mirror of
https://dev.azure.com/globalhealthx/EMR/_git/helix-engage
synced 2026-04-11 18:28:15 +00:00
feat: appointments page, data refresh on login, multi-agent spec + plan

- Appointment Master page with status tabs, search, PhoneActionCell
- Login calls DataProvider.refresh() to load data after auth
- Sidebar: appointments nav for CC agents + executives
- Multi-agent SIP + lockout spec and implementation plan

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

docs/superpowers/plans/2026-03-23-multi-agent-sip-lockout.md (new file, 643 lines)
# Multi-Agent SIP + Duplicate Login Lockout — Implementation Plan

> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Per-agent Ozonetel/SIP credentials resolved from the platform Agent entity on login, with Redis-backed duplicate login lockout.

**Architecture:** Sidecar queries the Agent entity on CC login, checks Redis for active sessions, returns per-agent SIP config. Frontend SIP provider uses dynamic credentials from the login response. Heartbeat keeps the session alive.

**Tech Stack:** NestJS sidecar + ioredis + FortyTwo platform GraphQL + React frontend

**Spec:** `docs/superpowers/specs/2026-03-23-multi-agent-sip-lockout.md`

---

## File Map

### Sidecar (`helix-engage-server/src/`)

| File | Action | Responsibility |
|------|--------|----------------|
| `auth/session.service.ts` | Create | Redis session lock/unlock/refresh |
| `auth/agent-config.service.ts` | Create | Query Agent entity, cache agent configs |
| `auth/auth.controller.ts` | Modify | Use agent config + session locking on login; add logout + heartbeat |
| `auth/auth.module.ts` | Modify | Register new services, import Redis |
| `config/configuration.ts` | Modify | Add `REDIS_URL` + SIP domain config |

### Frontend (`helix-engage/src/`)

| File | Action | Responsibility |
|------|--------|----------------|
| `pages/login.tsx` | Modify | Store agentConfig, handle 403/409 errors |
| `providers/sip-provider.tsx` | Modify | Read SIP config from agentConfig instead of env vars |
| `components/layout/app-shell.tsx` | Modify | Add heartbeat interval for CC agents |
| `lib/api-client.ts` | Modify | Add logout API call |
| `providers/auth-provider.tsx` | Modify | Call sidecar logout on sign-out |

### Docker

| File | Action | Responsibility |
|------|--------|----------------|
| VPS `docker-compose.yml` | Modify | Add `REDIS_URL` to sidecar env |

---
## Task 1: Install ioredis + Redis Session Service

**Files:**
- Modify: `helix-engage-server/package.json`
- Create: `helix-engage-server/src/auth/session.service.ts`
- Modify: `helix-engage-server/src/config/configuration.ts`

- [ ] **Step 1: Install ioredis**

```bash
cd helix-engage-server && npm install ioredis
```

- [ ] **Step 2: Add Redis URL to config**

In `config/configuration.ts`, add to the returned object:

```typescript
redis: {
  url: process.env.REDIS_URL ?? 'redis://localhost:6379',
},
sip: {
  domain: process.env.SIP_DOMAIN ?? 'blr-pub-rtc4.ozonetel.com',
  wsPort: process.env.SIP_WS_PORT ?? '444',
},
```

- [ ] **Step 3: Create session service**

```typescript
// src/auth/session.service.ts
import { Injectable, Logger, OnModuleInit } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import Redis from 'ioredis';

const SESSION_TTL = 3600; // seconds (1 hour)

@Injectable()
export class SessionService implements OnModuleInit {
  private readonly logger = new Logger(SessionService.name);
  private redis: Redis;

  constructor(private config: ConfigService) {}

  onModuleInit() {
    const url = this.config.get<string>('redis.url', 'redis://localhost:6379');
    this.redis = new Redis(url);
    this.redis.on('connect', () => this.logger.log('Redis connected'));
    this.redis.on('error', (err) => this.logger.error(`Redis error: ${err.message}`));
  }

  private key(agentId: string): string {
    return `agent:session:${agentId}`;
  }

  // Note: SET ... EX overwrites any existing lock, so callers must check
  // isSessionLocked() first; a SET ... NX variant would make check-and-lock
  // atomic if that race ever matters.
  async lockSession(agentId: string, memberId: string): Promise<void> {
    await this.redis.set(this.key(agentId), memberId, 'EX', SESSION_TTL);
  }

  async isSessionLocked(agentId: string): Promise<string | null> {
    return this.redis.get(this.key(agentId));
  }

  async refreshSession(agentId: string): Promise<void> {
    await this.redis.expire(this.key(agentId), SESSION_TTL);
  }

  async unlockSession(agentId: string): Promise<void> {
    await this.redis.del(this.key(agentId));
  }
}
```

- [ ] **Step 4: Verify sidecar compiles**

```bash
cd helix-engage-server && npm run build
```

- [ ] **Step 5: Commit**

```bash
git add package.json package-lock.json src/auth/session.service.ts src/config/configuration.ts
git commit -m "feat: Redis session service for agent login lockout"
```
---

## Task 2: Agent Config Service

**Files:**
- Create: `helix-engage-server/src/auth/agent-config.service.ts`

- [ ] **Step 1: Create agent config service**

```typescript
// src/auth/agent-config.service.ts
import { Injectable, Logger } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { PlatformGraphqlService } from '../platform/platform-graphql.service';

export type AgentConfig = {
  id: string;
  ozonetelAgentId: string;
  sipExtension: string;
  sipPassword: string;
  campaignName: string;
  sipUri: string;
  sipWsServer: string;
};

@Injectable()
export class AgentConfigService {
  private readonly logger = new Logger(AgentConfigService.name);
  private readonly cache = new Map<string, AgentConfig>();
  private readonly sipDomain: string;
  private readonly sipWsPort: string;

  constructor(
    private platform: PlatformGraphqlService,
    private config: ConfigService,
  ) {
    this.sipDomain = config.get<string>('sip.domain', 'blr-pub-rtc4.ozonetel.com');
    this.sipWsPort = config.get<string>('sip.wsPort', '444');
  }

  async getByMemberId(memberId: string): Promise<AgentConfig | null> {
    // Check cache first
    const cached = this.cache.get(memberId);
    if (cached) return cached;

    try {
      // memberId comes from the platform profile query, not raw user input;
      // if that ever changes, prefer GraphQL variables over string interpolation.
      const data = await this.platform.query<any>(
        `{ agents(first: 1, filter: { wsmemberId: { eq: "${memberId}" } }) { edges { node {
          id ozonetelagentid sipextension sippassword campaignname
        } } } }`,
      );

      const node = data?.agents?.edges?.[0]?.node;
      if (!node || !node.ozonetelagentid || !node.sipextension) return null;

      const agentConfig: AgentConfig = {
        id: node.id,
        ozonetelAgentId: node.ozonetelagentid,
        sipExtension: node.sipextension,
        sipPassword: node.sippassword ?? node.sipextension,
        campaignName: node.campaignname ?? process.env.OZONETEL_CAMPAIGN_NAME ?? 'Inbound_918041763265',
        sipUri: `sip:${node.sipextension}@${this.sipDomain}`,
        sipWsServer: `wss://${this.sipDomain}:${this.sipWsPort}`,
      };

      this.cache.set(memberId, agentConfig);
      this.logger.log(`Loaded agent config for member ${memberId}: ${agentConfig.ozonetelAgentId} / ${agentConfig.sipExtension}`);
      return agentConfig;
    } catch (err) {
      this.logger.warn(`Failed to fetch agent config: ${err}`);
      return null;
    }
  }

  getFromCache(memberId: string): AgentConfig | null {
    return this.cache.get(memberId) ?? null;
  }

  clearCache(memberId: string): void {
    this.cache.delete(memberId);
  }
}
```

- [ ] **Step 2: Verify sidecar compiles**

```bash
cd helix-engage-server && npm run build
```

- [ ] **Step 3: Commit**

```bash
git add src/auth/agent-config.service.ts
git commit -m "feat: agent config service with platform query + in-memory cache"
```

---
## Task 3: Update Auth Module + Controller

**Files:**
- Modify: `helix-engage-server/src/auth/auth.module.ts`
- Modify: `helix-engage-server/src/auth/auth.controller.ts`

- [ ] **Step 1: Update auth module to register new services**

Read `src/auth/auth.module.ts` and add imports:

```typescript
import { SessionService } from './session.service';
import { AgentConfigService } from './agent-config.service';
import { PlatformModule } from '../platform/platform.module';

@Module({
  imports: [PlatformModule],
  controllers: [AuthController],
  providers: [SessionService, AgentConfigService],
  exports: [SessionService, AgentConfigService],
})
```

- [ ] **Step 2: Rewrite auth controller login for multi-agent**

Inject the new services into `AuthController`:

```typescript
constructor(
  private config: ConfigService,
  private ozonetelAgent: OzonetelAgentService,
  private sessionService: SessionService,
  private agentConfigService: AgentConfigService,
) { ... }
```

Modify the CC agent section of `login()` (currently lines 115-128). Replace the hardcoded Ozonetel login with:

```typescript
if (appRole === 'cc-agent') {
  const memberId = workspaceMember?.id;
  if (!memberId) throw new HttpException('Workspace member not found', 400);

  // Look up agent config from platform
  const agentConfig = await this.agentConfigService.getByMemberId(memberId);
  if (!agentConfig) {
    throw new HttpException('Agent account not configured. Contact administrator.', 403);
  }

  // Check for duplicate login
  const existingSession = await this.sessionService.isSessionLocked(agentConfig.ozonetelAgentId);
  if (existingSession && existingSession !== memberId) {
    throw new HttpException('You are already logged in on another device. Please log out there first.', 409);
  }

  // Lock session (also refreshes the TTL for a same-user re-login)
  await this.sessionService.lockSession(agentConfig.ozonetelAgentId, memberId);

  // Login to Ozonetel with agent-specific credentials (fire-and-forget)
  const ozAgentPassword = process.env.OZONETEL_AGENT_PASSWORD ?? 'Test123$';
  this.ozonetelAgent.loginAgent({
    agentId: agentConfig.ozonetelAgentId,
    password: ozAgentPassword,
    phoneNumber: agentConfig.sipExtension,
    mode: 'blended',
  }).catch(err => {
    this.logger.warn(`Ozonetel agent login failed (non-blocking): ${err.message}`);
  });

  // Return agent config to frontend
  return {
    accessToken,
    refreshToken: tokens.refreshToken.token,
    user: { ... }, // same as today
    agentConfig: {
      ozonetelAgentId: agentConfig.ozonetelAgentId,
      sipExtension: agentConfig.sipExtension,
      sipPassword: agentConfig.sipPassword,
      sipUri: agentConfig.sipUri,
      sipWsServer: agentConfig.sipWsServer,
      campaignName: agentConfig.campaignName,
    },
  };
}
```
Note: `workspaceMember.id` is already available from the profile query on lines 87-88 of the existing code.

- [ ] **Step 3: Add logout endpoint**

Add after the `refresh` endpoint:

```typescript
@Post('logout')
async logout(@Headers('authorization') auth: string) {
  if (!auth) throw new HttpException('Authorization required', 401);

  try {
    // Resolve workspace member from JWT
    const profileRes = await axios.post(this.graphqlUrl, {
      query: '{ currentUser { workspaceMember { id } } }',
    }, { headers: { 'Content-Type': 'application/json', Authorization: auth } });

    const memberId = profileRes.data?.data?.currentUser?.workspaceMember?.id;
    if (!memberId) return { status: 'ok' };

    const agentConfig = this.agentConfigService.getFromCache(memberId);
    if (agentConfig) {
      // Unlock Redis session
      await this.sessionService.unlockSession(agentConfig.ozonetelAgentId);

      // Logout from Ozonetel (fire-and-forget)
      this.ozonetelAgent.logoutAgent({
        agentId: agentConfig.ozonetelAgentId,
        password: process.env.OZONETEL_AGENT_PASSWORD ?? 'Test123$',
      }).catch(err => this.logger.warn(`Ozonetel logout failed: ${err.message}`));

      // Clear cache
      this.agentConfigService.clearCache(memberId);
    }

    return { status: 'ok' };
  } catch (err) {
    this.logger.warn(`Logout cleanup failed: ${err}`);
    return { status: 'ok' };
  }
}
```

- [ ] **Step 4: Add heartbeat endpoint**

```typescript
@Post('heartbeat')
async heartbeat(@Headers('authorization') auth: string) {
  if (!auth) throw new HttpException('Authorization required', 401);

  try {
    const profileRes = await axios.post(this.graphqlUrl, {
      query: '{ currentUser { workspaceMember { id } } }',
    }, { headers: { 'Content-Type': 'application/json', Authorization: auth } });

    const memberId = profileRes.data?.data?.currentUser?.workspaceMember?.id;
    const agentConfig = memberId ? this.agentConfigService.getFromCache(memberId) : null;

    if (agentConfig) {
      await this.sessionService.refreshSession(agentConfig.ozonetelAgentId);
    }

    return { status: 'ok' };
  } catch {
    return { status: 'ok' };
  }
}
```

- [ ] **Step 5: Verify sidecar compiles**

```bash
cd helix-engage-server && npm run build
```

- [ ] **Step 6: Commit**

```bash
git add src/auth/auth.module.ts src/auth/auth.controller.ts
git commit -m "feat: multi-agent login with Redis lockout, logout, heartbeat"
```

---
## Task 4: Update Ozonetel Controller for Per-Agent Calls

**Files:**
- Modify: `helix-engage-server/src/ozonetel/ozonetel-agent.controller.ts`

- [ ] **Step 1: Add AgentConfigService to Ozonetel controller**

Import and inject `AgentConfigService`. Add a helper to resolve the agent config from the auth header:

```typescript
import { AgentConfigService } from '../auth/agent-config.service';

// In constructor:
private readonly agentConfig: AgentConfigService,

// Helper method:
private async resolveAgentId(authHeader: string): Promise<string> {
  try {
    const data = await this.platform.queryWithAuth<any>(
      '{ currentUser { workspaceMember { id } } }',
      undefined, authHeader,
    );
    const memberId = data.currentUser?.workspaceMember?.id;
    const config = memberId ? this.agentConfig.getFromCache(memberId) : null;
    return config?.ozonetelAgentId ?? this.defaultAgentId;
  } catch {
    return this.defaultAgentId;
  }
}
```

- [ ] **Step 2: Update dispose, agent-state, dial, and other endpoints**

Replace `this.defaultAgentId` with `await this.resolveAgentId(authHeader)` in the endpoints that pass the auth header. The key endpoints to update:

- `dispose()` — add `@Headers('authorization') auth: string` param, resolve agent ID
- `agentState()` — same
- `dial()` — same
- `agentReady()` — same

For endpoints that don't currently take the auth header, add it as a parameter.
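The per-endpoint change boils down to one pattern: look up the member's cached config and fall back to the legacy shared agent. A minimal standalone sketch of that fallback logic (hypothetical function name; in the real code this lives inside the controller's `resolveAgentId` helper):

```typescript
// Sketch of the resolve-with-fallback pattern each endpoint uses.
// `cache` stands in for AgentConfigService's in-memory map.
type AgentConfig = { ozonetelAgentId: string };

function resolveAgentIdFromCache(
  cache: Map<string, AgentConfig>,
  memberId: string | null,
  defaultAgentId: string,
): string {
  const config = memberId ? cache.get(memberId) : undefined;
  // A cached per-agent ID wins; otherwise keep today's shared default.
  return config?.ozonetelAgentId ?? defaultAgentId;
}
```

Given a cache entry `'member-1' → { ozonetelAgentId: 'agent2' }`, resolving `'member-1'` yields `'agent2'`, while an unknown or missing member falls back to `'global'`.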

- [ ] **Step 3: Update auth module to handle circular dependency**

The `OzonetelAgentModule` now needs `AgentConfigService` from `AuthModule`. Use `forwardRef` if needed, or export `AgentConfigService` from a shared module.

Simplest approach: export `AgentConfigService` from `AuthModule` and import `AuthModule` in `OzonetelAgentModule`.

- [ ] **Step 4: Verify sidecar compiles**

```bash
cd helix-engage-server && npm run build
```

- [ ] **Step 5: Commit**

```bash
git add src/ozonetel/ozonetel-agent.controller.ts src/ozonetel/ozonetel-agent.module.ts src/auth/auth.module.ts
git commit -m "feat: per-agent Ozonetel credentials in all controller endpoints"
```

---
## Task 5: Frontend — Store Agent Config + Dynamic SIP

**Files:**
- Modify: `helix-engage/src/pages/login.tsx`
- Modify: `helix-engage/src/providers/sip-provider.tsx`
- Modify: `helix-engage/src/providers/auth-provider.tsx`

- [ ] **Step 1: Store agentConfig on login**

In `login.tsx`, after a successful login, store the agent config:

```typescript
if (response.agentConfig) {
  localStorage.setItem('helix_agent_config', JSON.stringify(response.agentConfig));
}
```

Handle the new error codes:

```typescript
} catch (err: any) {
  if (err.message?.includes('not configured')) {
    setError('Agent account not configured. Contact your administrator.');
  } else if (err.message?.includes('already logged in')) {
    setError('You are already logged in on another device. Please log out there first.');
  } else {
    setError(err.message);
  }
  setIsLoading(false);
}
```

- [ ] **Step 2: Update SIP provider to use stored agent config**

In `sip-provider.tsx`, replace the hardcoded `DEFAULT_CONFIG`:

```typescript
const getAgentSipConfig = (): SIPConfig => {
  try {
    const stored = localStorage.getItem('helix_agent_config');
    if (stored) {
      const config = JSON.parse(stored);
      return {
        displayName: 'Helix Agent',
        uri: config.sipUri,
        password: config.sipPassword,
        wsServer: config.sipWsServer,
        stunServers: 'stun:stun.l.google.com:19302',
      };
    }
  } catch {
    // Malformed JSON in storage — fall through to env vars
  }
  // Fallback to env vars
  return {
    displayName: import.meta.env.VITE_SIP_DISPLAY_NAME ?? 'Helix Agent',
    uri: import.meta.env.VITE_SIP_URI ?? '',
    password: import.meta.env.VITE_SIP_PASSWORD ?? '',
    wsServer: import.meta.env.VITE_SIP_WS_SERVER ?? '',
    stunServers: 'stun:stun.l.google.com:19302',
  };
};
```

Use `getAgentSipConfig()` where `DEFAULT_CONFIG` was used.

- [ ] **Step 3: Update auth provider logout to call sidecar**

In `auth-provider.tsx`, modify `logout()` to call the sidecar first:

```typescript
const logout = async () => {
  try {
    const token = localStorage.getItem('helix_access_token');
    if (token) {
      await fetch(`${API_URL}/auth/logout`, {
        method: 'POST',
        headers: { Authorization: `Bearer ${token}` },
      }).catch(() => {});
    }
  } finally {
    localStorage.removeItem('helix_access_token');
    localStorage.removeItem('helix_refresh_token');
    localStorage.removeItem('helix_user');
    localStorage.removeItem('helix_agent_config');
    setUser(null);
  }
};
```

Note: `API_URL` needs to be available here. Import it from `api-client.ts` or read it from env.

- [ ] **Step 4: Verify frontend compiles**

```bash
cd helix-engage && npm run build
```

- [ ] **Step 5: Commit**

```bash
git add src/pages/login.tsx src/providers/sip-provider.tsx src/providers/auth-provider.tsx
git commit -m "feat: dynamic SIP config from login response, logout cleanup"
```

---
## Task 6: Frontend — Heartbeat

**Files:**
- Modify: `helix-engage/src/components/layout/app-shell.tsx`

- [ ] **Step 1: Add heartbeat interval for CC agents**

In `AppShell`, add a heartbeat effect:

```typescript
const { isCCAgent } = useAuth();

useEffect(() => {
  if (!isCCAgent) return;

  const interval = setInterval(() => {
    const token = localStorage.getItem('helix_access_token');
    if (token) {
      fetch(`${import.meta.env.VITE_API_URL ?? 'http://localhost:4100'}/auth/heartbeat`, {
        method: 'POST',
        headers: { Authorization: `Bearer ${token}` },
      }).catch(() => {});
    }
  }, 5 * 60 * 1000); // Every 5 minutes

  return () => clearInterval(interval);
}, [isCCAgent]);
```

- [ ] **Step 2: Verify frontend compiles**

```bash
cd helix-engage && npm run build
```

- [ ] **Step 3: Commit**

```bash
git add src/components/layout/app-shell.tsx
git commit -m "feat: heartbeat every 5 min to keep agent session alive"
```

---
## Task 7: Docker + Deploy

**Files:**
- Modify: VPS `docker-compose.yml`

- [ ] **Step 1: Add REDIS_URL to sidecar in docker-compose**

SSH to the VPS and add `REDIS_URL: redis://redis:6379` to the sidecar environment section. Also add `redis` to the sidecar's `depends_on`.
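The resulting compose fragment should look roughly like this (service names are assumptions taken from the spec; match them to the actual compose file on the VPS):

```yaml
sidecar:
  environment:
    REDIS_URL: redis://redis:6379
  depends_on:
    - redis
```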

- [ ] **Step 2: Deploy using deploy script**

```bash
./deploy.sh all
```

- [ ] **Step 3: Verify sidecar connects to Redis**

```bash
ssh -i ~/Downloads/fortytwoai_hostinger root@148.230.67.184 "docker logs fortytwo-staging-sidecar-1 --tail 10 2>&1 | grep -i redis"
```

Expected: `Redis connected`

- [ ] **Step 4: Test login flow**

Login as rekha.cc → should get `agentConfig` in the response. SIP should connect with her specific extension. Try logging in from another browser → should get the "already logged in" error.

- [ ] **Step 5: Commit docker-compose change and push all to Azure**

```bash
cd helix-engage && git add . && git push origin dev
cd helix-engage-server && git add . && git push origin dev
```
docs/superpowers/specs/2026-03-23-multi-agent-sip-lockout.md (new file, 176 lines)
# Multi-Agent SIP Credentials + Duplicate Login Lockout

**Date**: 2026-03-23
**Status**: Approved design

---

## Problem

A single Ozonetel agent account (`global`) and SIP extension (`523590`) are shared across all CC agents. When multiple agents log in, calls route to whichever browser registered last. There is no way to have multiple simultaneous CC agents.

## Solution

Per-agent Ozonetel credentials stored in the platform's Agent entity, resolved on login. Redis-backed session locking prevents duplicate logins. The frontend SIP provider uses dynamic credentials from the login response.

---

## 1. Data Model

**Agent entity** (already created on the platform via the admin portal):

| Field (GraphQL) | Type | Purpose |
|---|---|---|
| `wsmemberId` | Relation | Links to workspace member |
| `ozonetelagentid` | Text | Ozonetel agent ID (e.g. "global", "agent2") |
| `sipextension` | Text | SIP extension number (e.g. "523590") |
| `sippassword` | Text | SIP auth password |
| `campaignname` | Text | Ozonetel campaign (e.g. "Inbound_918041763265") |

Custom fields use **all-lowercase** GraphQL names. One Agent record per CC user.
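For reference, the lookup the sidecar performs against this entity, as written in the implementation plan (`<memberId>` is a placeholder; note the all-lowercase custom field names):

```graphql
{
  agents(first: 1, filter: { wsmemberId: { eq: "<memberId>" } }) {
    edges {
      node { id ozonetelagentid sipextension sippassword campaignname }
    }
  }
}
```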

---

## 2. Sidecar Changes

### 2.1 Redis Integration

Add the `ioredis` dependency to `helix-engage-server`. Connect to `REDIS_URL` (default `redis://redis:6379`).

New service: `src/auth/session.service.ts`

```
lockSession(agentId, memberId) → SET agent:session:{agentId} {memberId} EX 3600
isSessionLocked(agentId)       → GET agent:session:{agentId} → returns memberId or null
refreshSession(agentId)        → EXPIRE agent:session:{agentId} 3600
unlockSession(agentId)         → DEL agent:session:{agentId}
```
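To reason about the lock semantics, here is an in-memory model of the four operations (an illustrative sketch with hypothetical names; the real service is Redis-backed and Redis handles expiry itself):

```typescript
// In-memory model of the session lock: one holder per agent ID, with a TTL.
type Entry = { memberId: string; expiresAt: number };

class SessionModel {
  private store = new Map<string, Entry>();
  constructor(private ttlMs = 3600_000, private now = () => Date.now()) {}

  // lockSession: record the holder and (re)start the TTL.
  lock(agentId: string, memberId: string): void {
    this.store.set(agentId, { memberId, expiresAt: this.now() + this.ttlMs });
  }

  // isSessionLocked: return the holding memberId, or null if absent/expired.
  holder(agentId: string): string | null {
    const e = this.store.get(agentId);
    if (!e || e.expiresAt <= this.now()) return null;
    return e.memberId;
  }

  // refreshSession: extend the TTL without changing the holder.
  refresh(agentId: string): void {
    const e = this.store.get(agentId);
    if (e) e.expiresAt = this.now() + this.ttlMs;
  }

  // unlockSession: drop the lock immediately.
  unlock(agentId: string): void {
    this.store.delete(agentId);
  }
}
```

A duplicate login is then simply `holder(agentId) !== null && holder(agentId) !== memberId`, which is exactly the check the login flow performs.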

### 2.2 Auth Controller — Login Flow

Modify `POST /auth/login`:

1. Authenticate with platform → get JWT + user profile + workspace member ID
2. Determine role (same as today)
3. **If CC agent:**
   a. Query platform: `agents(filter: { wsmemberId: { eq: "<memberId>" } })` using the server API key
   b. No Agent record → `403: "Agent account not configured. Contact administrator."`
   c. Check Redis: `isSessionLocked(agent.ozonetelagentid)`
   d. Locked by a different user → `409: "You are already logged in on another device. Please log out there first."`
   e. Locked by the same user → refresh TTL (re-login from the same browser)
   f. Not locked → `lockSession(agent.ozonetelagentid, memberId)`
   g. Login to Ozonetel with the agent's specific credentials
   h. Return `agentConfig` in the response
4. **If manager/executive:** No Agent query, no Redis, no SIP. Same as today.

**Login response** (CC agent):

```json
{
  "accessToken": "...",
  "refreshToken": "...",
  "user": { "id": "...", "role": "cc-agent", ... },
  "agentConfig": {
    "ozonetelAgentId": "global",
    "sipExtension": "523590",
    "sipPassword": "523590",
    "sipUri": "sip:523590@blr-pub-rtc4.ozonetel.com",
    "sipWsServer": "wss://blr-pub-rtc4.ozonetel.com:444",
    "campaignName": "Inbound_918041763265"
  }
}
```

SIP domain (`blr-pub-rtc4.ozonetel.com`) and WS port (`444`) remain from env vars — these are shared infrastructure, not per-agent.

### 2.3 Auth Controller — Logout

Modify `POST /auth/logout` (or add it if it doesn't exist):
1. Resolve the agent from the JWT
2. `unlockSession(agent.ozonetelagentid)`
3. Ozonetel agent logout

### 2.4 Auth Controller — Heartbeat

New endpoint: `POST /auth/heartbeat`
1. Resolve the agent from the JWT
2. `refreshSession(agent.ozonetelagentid)` → extends TTL to 1 hour
3. Return `{ status: 'ok' }`

### 2.5 Agent Config Cache

On login, store the agent config in an in-memory `Map<workspaceMemberId, AgentConfig>`.

All Ozonetel controller endpoints currently use `this.defaultAgentId`. Change to:
1. Resolve the workspace member from the JWT (already done in the worklist controller's `resolveAgentName`)
2. Look up the agent config from the in-memory map
3. Use the agent's `ozonetelagentid` for Ozonetel API calls

This avoids querying Redis/platform on every API call.

Clear the cache entry on logout.

### 2.6 Config

New env var: `REDIS_URL` (default: `redis://redis:6379`)

Existing env vars (`OZONETEL_AGENT_ID`, `OZONETEL_SIP_ID`, etc.) become fallbacks only — used when no Agent record exists (backward compatibility for dev).

---

## 3. Frontend Changes

### 3.1 Store Agent Config

On login, store `agentConfig` from the response in localStorage (`helix_agent_config`).

On logout, clear it.

### 3.2 SIP Provider

`sip-provider.tsx`: Read SIP credentials from the stored `agentConfig` instead of env vars.

```
const agentConfig = JSON.parse(localStorage.getItem('helix_agent_config') ?? 'null');
const sipUri = agentConfig?.sipUri ?? import.meta.env.VITE_SIP_URI;
const sipPassword = agentConfig?.sipPassword ?? import.meta.env.VITE_SIP_PASSWORD;
const sipWsServer = agentConfig?.sipWsServer ?? import.meta.env.VITE_SIP_WS_SERVER;
```

If there is no `agentConfig` and no env vars → don't connect SIP.

### 3.3 Heartbeat

Add a heartbeat interval in `AppShell` (only for CC agents):
- Every 5 minutes: `POST /auth/heartbeat`
- If the heartbeat fails with 401 → session expired, redirect to login
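A sketch of that client-side tick as a standalone function (hypothetical helper and parameter names; the actual effect lives inside `AppShell`):

```typescript
// One heartbeat tick: skip when logged out, force re-login on 401,
// and swallow transient network errors (the next tick retries).
async function heartbeatTick(
  post: (path: string, token: string) => Promise<{ status: number }>,
  token: string | null,
  onSessionExpired: () => void,
): Promise<void> {
  if (!token) return;
  try {
    const res = await post('/auth/heartbeat', token);
    if (res.status === 401) onSessionExpired(); // session expired → redirect to login
  } catch {
    // Network error — ignore; the interval fires again in 5 minutes.
  }
}
```

In the app, `post` would be a `fetch` wrapper and `onSessionExpired` would clear tokens and navigate to the login page.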

### 3.4 Login Error Handling

Handle the new error codes from login:
- `403` → "Agent account not configured. Contact administrator."
- `409` → "You are already logged in on another device. Please log out there first."

### 3.5 Logout

On logout, call `POST /auth/logout` before clearing tokens (so the sidecar can clean up Redis + Ozonetel).

---

## 4. Docker Compose

Add `REDIS_URL` to the sidecar environment in `docker-compose.yml`:

```yaml
sidecar:
  environment:
    REDIS_URL: redis://redis:6379
```

---

## 5. Edge Cases

- **Sidecar restart**: Redis retains session locks. The agent config cache is lost but rebuilt on the next API call (query the Agent entity lazily).
- **Redis restart**: All session locks cleared. Agents can re-login. Acceptable — same as TTL expiry.
- **Browser crash (no logout)**: Heartbeat stops → Redis key expires in ≤1 hour → lock clears.
- **Same user, same browser re-login**: Detected by comparing `memberId` in Redis → refreshes TTL instead of blocking.
- **Agent record deleted while logged in**: Next Ozonetel API call fails → sidecar clears cache → agent gets logged out.