Black-Box AI vs Receipted AI
Every other AI phone agent is a black box. You hear the call. You might get a transcript. But you don't know what it wrote to your system.
AliceHQ is different. Every action produces a receipt. Booking created. CRM updated. Calendar entry made. Timestamped, immutable, and yours.
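To make the idea concrete, here is a minimal sketch of what a receipt record could look like. This is illustrative only, not AliceHQ's actual schema: the field names, the `make_receipt` helper, and the use of a content hash to make tampering detectable are all assumptions.

```python
import hashlib
import json
from datetime import datetime, timezone

def make_receipt(action: str, details: dict) -> dict:
    """Build a minimal receipt: what happened, when, plus a content hash.

    The hash lets anyone later verify the record has not been altered.
    """
    body = {
        "action": action,        # e.g. "booking.created", "crm.updated"
        "details": details,      # e.g. rate quoted, booking reference
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    canonical = json.dumps(
        {k: body[k] for k in ("action", "details", "timestamp")},
        sort_keys=True,
    )
    body["hash"] = hashlib.sha256(canonical.encode()).hexdigest()
    return body

# One receipt per action: here, a booking created at the published rate.
receipt = make_receipt("booking.created", {"rate_nzd": 140, "ref": "BK-1042"})
```

The point of the hash is immutability in the practical sense: if anyone edits the stored record after the fact, the hash no longer matches the content.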
What Black-Box AI Costs When Something Goes Wrong
These aren't hypothetical risks. They are the three scenarios that NZ service business owners actually face. In each case, a receipt resolves the situation in minutes. Without one, the liability sits with you.
The Hospitality Incident: “Your AI Told Me This Price”
It is 7am on a busy Saturday. A guest arrives at your motel reception and presents a screenshot from their phone: a conversation with what they say is your AI booking system, quoting $89 per night. Your published rate is $140. They are not asking; they are informing you that they hold a confirmed booking at $89, and they expect you to honour it. Air Canada faced exactly this scenario when its chatbot invented a bereavement discount policy, and the tribunal ruled the airline liable for what the AI said, regardless of what the company intended. Under New Zealand consumer law, the same principle applies: your AI's representations bind your business. A black-box AI leaves you with no record to verify or contest the claim. A receipt shows the exact conversation, the rate communicated, and the booking reference created, and it settles the dispute before it escalates.
The Property Management Audit: TCIT Asks for the Maintenance Record
The Tenancy Compliance and Investigations Team audits 400–500 property management companies each year. During an inspection, they request evidence that a specific tenant maintenance call was received and actioned within your required response window. Your AI handled the call three months ago. Your black-box AI system has the call recording, if it recorded it at all. But the question is not whether the call happened — the question is what happened after the call. Was the maintenance request logged? Was it classified correctly? Was the property manager notified? Was a ticket created? Without receipts for each of those downstream actions, you cannot answer the auditor's question. That gap is a compliance finding. With receipts, you hand them a timestamped record of every action taken from the moment the tenant called.
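The auditor's question above is a query over a receipt trail. The sketch below shows the shape of that query, assuming a hypothetical receipt format with a call ID, action name, and timestamp; the data and the `actioned_within` helper are illustrative, not a real AliceHQ API.

```python
from datetime import datetime, timedelta

# Hypothetical receipt trail for one tenant maintenance call.
receipts = [
    {"call_id": "C-881", "action": "maintenance.logged",     "at": datetime(2025, 3, 1, 9, 2)},
    {"call_id": "C-881", "action": "maintenance.classified", "at": datetime(2025, 3, 1, 9, 2)},
    {"call_id": "C-881", "action": "manager.notified",       "at": datetime(2025, 3, 1, 9, 3)},
    {"call_id": "C-881", "action": "ticket.created",         "at": datetime(2025, 3, 1, 9, 3)},
]

def actioned_within(receipts: list, call_id: str, window: timedelta) -> bool:
    """Was every downstream action for this call completed within the
    required response window of the first recorded action?"""
    trail = sorted(r["at"] for r in receipts if r["call_id"] == call_id)
    return bool(trail) and (trail[-1] - trail[0]) <= window
```

With receipts for each downstream step, the answer to the auditor is one lookup; without them, there is nothing to query.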
The Healthcare Inquiry: Privacy Commissioner Wants to Know
A patient contacts the Office of the Privacy Commissioner. They claim that when they called your practice, the AI shared information about their appointment that should not have been disclosed, or made a triage decision that they believe was incorrect. Under HIPC 2020, your practice has obligations about how health information is handled at every touchpoint — including AI-handled calls. A black-box AI system that logs only a transcript cannot answer the Commissioner's question about what decision the AI made and on what basis. A Receipted AI system provides a record of the triage decision: the intent identified, the information accessed, the action taken, and the timestamp. That record is your evidence of compliant handling. Without it, the inquiry becomes an investigation.
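The triage record described above can be sketched as a single structured entry. The field names here are assumptions chosen to mirror the four elements named in the text (intent, information accessed, action, timestamp), not AliceHQ's real schema.

```python
from datetime import datetime, timezone

# Hypothetical triage receipt: the four elements a regulator would ask about.
triage_receipt = {
    "intent": "appointment.reschedule",         # what the AI understood the caller to want
    "records_accessed": ["appointment:A-203"],  # exactly which information it read
    "action": "transferred_to_reception",       # what it did on that basis
    "timestamp": datetime.now(timezone.utc).isoformat(),
}
```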
The Receipted AI Difference
Side by side, the gap between Black-Box AI and Receipted AI is the gap between “I hope nothing goes wrong” and “I can prove exactly what happened.”
| Feature | Black-Box AI | Receipted AI (AliceHQ) |
|---|---|---|
| Call handled | ✓ | ✓ |
| Call recorded | Sometimes | ✓ |
| Booking receipt produced | ✗ | ✓ |
| CRM write logged with action ID | ✗ | ✓ |
| Triage decision recorded | ✗ | ✓ |
| Evidence for TCIT audit | ✗ | ✓ |
| HIPC 2020 compliant | ✗ | ✓ |
| Can prove what AI said and did | ✗ | ✓ |
If Something Goes Wrong, You Have the Receipt
The promise of Receipted AI is not that your AI will never make a mistake. The promise is that when something happens — and something always eventually happens — you have the same paper trail you would have if a staff member had handled it. You can see what was said. You can see what was written. You can show the receipt to a customer, a regulator, or a lawyer. That is what accountability looks like when AI is doing operational work on your behalf.