Toast GHL Pipeline — Step-by-Step Diagnostic

Updated 2026-04-26. All 12 steps operational.
Pipeline overview:

  1. Toast auth
  2. Switch location
  3. Click Download
  4. Wait for email
  5. Download CSV
  6. Row count check
  7. Parse + clean names
  8. Phone dedup
  9. GHL upsert
  10. GHL workflow
  11. Save ledger
  12. Slack notify
01
Toast Session Auth
Working

The session cookies (auth0, TOAST_SESSION, etc.) are loaded from toast_session.json and refreshed on each run. session.ensure_session() handles re-login if cookies expire.

Code
session.py · ensure_session(headless=True)
Status
Re-authenticated cleanly on every run. No session-expiry failures.
02
Switch Toast Location
Working

Each location gets its own isolated browser session with a fresh OAuth login. The browser switches to the target restaurant, waits for Toast's full re-authorization to complete, and verifies the correct restaurant GUID appears in network traffic before proceeding.

Smoke test results (2026-04-26):

Location           | Switch needed?             | Verified XHRs | Export triggered?
Kaju Buena Park    | No (default)               | 37            | Yes
Kaju Garden Grove  | Yes (same mgmt group)      | 39            | Yes
Kaju Irvine Culver | Yes (same mgmt group)      | 39            | Yes
Oji Sushi Pasadena | Yes (different mgmt group) | 39            | Yes
Code
unified_export.py · run_export() + switch_location()
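The GUID verification can be approximated as a predicate over captured XHR URLs. The regex and URL shape are illustrative assumptions; the real check lives in switch_location():

```python
import re

# Standard 8-4-4-4-12 hex GUID, case-insensitive
GUID_RE = re.compile(
    r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}", re.I
)

def guids_in_traffic(xhr_urls: list[str]) -> set[str]:
    """Collect every GUID-shaped token seen in captured XHR URLs."""
    found: set[str] = set()
    for url in xhr_urls:
        found.update(m.group(0).lower() for m in GUID_RE.finditer(url))
    return found

def verify_location(xhr_urls: list[str], expected_guid: str) -> bool:
    """True once the target restaurant's GUID shows up in network traffic."""
    return expected_guid.lower() in guids_in_traffic(xhr_urls)
```

In practice the URL list would come from a browser-automation network listener; the export is only triggered after `verify_location()` returns true.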
03
Click Download Button
Working

The export is now triggered via a direct API call in the same isolated browser session, using that session's restaurant-scoped OAuth token. Toast accepts the request, queues the export, and emails the download link within ~30-90s.

Code
unified_export.py · run_export()
04
Wait for Toast Email (Gmail IMAP)
Working

Polls rlee@tiny-mammoth.com via IMAP for emails with subject "Your Guestbook contacts are ready to be downloaded". Filters by:

  • triggered_after timestamp — skips emails that pre-date this trigger
  • exclude_uuids — skips UUIDs already consumed in this run

Hardened against IMAP connection drops + bad Date headers.

Code
fetch_toast_link.py
05
Download CSV (Same Session)
Working

Downloads the CSV in the same browser session that triggered the export. The browser navigates to ?downloadReportUUID=<uuid> using the session's existing cookies, ensuring the correct restaurant's data is returned.

Code
unified_export.py · BrowserSession.download()
06
Row Count Smell Test
Safety Net

Compares the downloaded CSV's row count to the expected baseline per location, with a +/-30% tolerance band. This gate caught all 4 mis-routings on April 25 before any bad data reached GHL. It remains as a safety net even with the fix in place.

Location           | Expected | Tolerance band
Kaju Buena Park    | 4,476    | 3,133 – 5,818
Kaju Garden Grove  | 12,661   | 8,862 – 16,459
Kaju Irvine Culver | 21,682   | 15,177 – 28,186
Oji Sushi Pasadena | 13,423   | 9,396 – 17,449
Code
process_and_upload.py · assert_row_count_matches()
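A minimal sketch of the gate, using the baselines from the table above; the function name matches the doc, but the exact error handling is an assumption:

```python
EXPECTED_ROWS = {
    "Kaju Buena Park": 4476,
    "Kaju Garden Grove": 12661,
    "Kaju Irvine Culver": 21682,
    "Oji Sushi Pasadena": 13423,
}
TOLERANCE = 0.30  # +/-30% band

def assert_row_count_matches(location: str, actual_rows: int) -> None:
    """Abort the run if the CSV row count falls outside the tolerance band."""
    expected = EXPECTED_ROWS[location]
    lo, hi = expected * (1 - TOLERANCE), expected * (1 + TOLERANCE)
    if not (lo <= actual_rows <= hi):
        raise ValueError(
            f"{location}: got {actual_rows} rows, expected {lo:.0f}-{hi:.0f}"
        )
```

A mis-routed CSV (wrong restaurant's data) lands far outside the band, which is how this gate caught the April 25 incidents.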
07
Parse + Clean Names
Working

Parses the Toast CSV. Filters by lastVisitDate against the location's checkpoint. Dedupes within-batch by phone. Cleans junk names from card swipes.

'Visa | Cardholder'    → 'Kajufam'   # credit card "name"
'Valued Customer | '   → 'Kajufam'   # generic placeholder
'10m??'                → 'Kajufam'   # digit-prefix garbage
'Mike'                 → 'Mike'      # real names preserved
'Sarah | Cardholder'   → 'Sarah'     # first name still real
Code
process_and_upload.py · load_and_clean_csv() + clean_first_name()
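The cleaning rules implied by the examples can be sketched as below; the junk-name list and digit-prefix rule are inferred from the samples, not lifted from clean_first_name():

```python
import re

FALLBACK = "Kajufam"  # brand-friendly placeholder used in the examples
JUNK_NAMES = {"visa", "valued customer", "cardholder", ""}

def clean_first_name(raw: str) -> str:
    """Strip card-swipe junk; fall back to the placeholder when nothing real survives."""
    first = raw.split("|")[0].strip()
    if first.lower() in JUNK_NAMES:
        return FALLBACK
    if re.match(r"^\d", first):  # digit-prefix garbage like '10m??'
        return FALLBACK
    return first
```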
08
Phone-Level Dedup (180-day cooldown)
Working

Each candidate phone gets bucketed against the per-location ledger (uploaded_phones.json):

  • NEW — phone not in ledger → enroll
  • RE-ENTRY — in ledger, last_enrolled >= 180 days ago → enroll, increment count
  • COOLDOWN — in ledger, last_enrolled < 180 days ago → skip
Code
process_and_upload.py · bucket_contacts() · state.record_enrollment()
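The bucketing rule is simple enough to state directly. The ledger entry shape (an ISO last_enrolled timestamp per phone) is an assumption about uploaded_phones.json:

```python
from datetime import datetime, timedelta

COOLDOWN = timedelta(days=180)

def bucket_phone(phone: str, ledger: dict[str, dict], now: datetime) -> str:
    """Classify one phone against the per-location ledger."""
    entry = ledger.get(phone)
    if entry is None:
        return "NEW"          # not in ledger -> enroll
    last = datetime.fromisoformat(entry["last_enrolled"])
    if now - last >= COOLDOWN:
        return "RE-ENTRY"     # enroll again, increment count
    return "COOLDOWN"         # skip
```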
09
GHL Contact Upsert
Working

POST https://services.leadconnectorhq.com/contacts/upsert with each location's API key + locationId. Creates new contacts or updates existing ones (matched by phone). Returns the GHL contactId.

Code
process_and_upload.py · upsert_contact()
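A hedged sketch of the upsert call using only the standard library. The payload fields beyond locationId and phone, the Version header, and the response shape are assumptions about the LeadConnector API, not confirmed from this pipeline's code:

```python
import json
import urllib.request

UPSERT_URL = "https://services.leadconnectorhq.com/contacts/upsert"

def build_upsert_payload(location_id: str, contact: dict) -> dict:
    """Shape one cleaned CSV row into an upsert body (field names assumed)."""
    return {
        "locationId": location_id,
        "phone": contact["phone"],
        "firstName": contact.get("first_name", ""),
        "email": contact.get("email", ""),
    }

def upsert_contact(api_key: str, location_id: str, contact: dict) -> str:
    """POST the upsert and return the GHL contactId (response shape assumed)."""
    req = urllib.request.Request(
        UPSERT_URL,
        data=json.dumps(build_upsert_payload(location_id, contact)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
            "Version": "2021-07-28",  # GHL v2 API version header (assumed)
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)["contact"]["id"]
```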
10
GHL Workflow Enrollment
Working

POST /contacts/{contactId}/workflow/{workflow_id} — enrolls each contact in their location's "2b Meals Momentum" workflow. The workflow then fires the SMS sequence.

Code
process_and_upload.py · add_to_workflow()
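The enrollment loop can be sketched with an injectable poster, which also keeps it testable; `enroll_all` and the `post` callback are illustrative, not the real add_to_workflow() signature:

```python
from typing import Callable

GHL_BASE = "https://services.leadconnectorhq.com"

def workflow_url(contact_id: str, workflow_id: str) -> str:
    """Build the enrollment endpoint path from the doc's route template."""
    return f"{GHL_BASE}/contacts/{contact_id}/workflow/{workflow_id}"

def enroll_all(contact_ids: list[str], workflow_id: str,
               post: Callable[[str], bool]) -> tuple[int, int]:
    """Enroll each contact; returns (ok, failed). `post` does the HTTP POST."""
    ok = failed = 0
    for cid in contact_ids:
        if post(workflow_url(cid, workflow_id)):
            ok += 1
        else:
            failed += 1
    return ok, failed
```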
11
Save Checkpoint + Phone Ledger
Working

After each location: writes the max lastVisitDate to run_state.json (date checkpoint) and adds enrolled phones to uploaded_phones.json (ledger). The CI workflow commits both files back to master with [skip ci] after each run.

Code
state.py · save_state() · save_ledger()
12
Slack Notification
Working

Posts a per-location summary to #system-notifications via the TM Notify worker. Format includes total guest rows, % with phone, % with email, new vs re-entry counts, cooldown-skipped count.

Code
process_and_upload.py · notify()
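The summary line might be formatted like this; the stats keys and exact layout are assumptions matching the fields listed above:

```python
def format_summary(location: str, stats: dict) -> str:
    """Render the per-location Slack summary line."""
    total = stats["total_rows"]

    def pct(n: int) -> str:
        return f"{100 * n / total:.0f}%" if total else "0%"

    return (
        f"*{location}*: {total:,} guest rows | "
        f"phone {pct(stats['with_phone'])} | email {pct(stats['with_email'])} | "
        f"new {stats['new']} | re-entry {stats['re_entry']} | "
        f"cooldown-skipped {stats['cooldown']}"
    )
```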

Status: All 12 steps operational

The location-switching bug (Step 2) that had blocked the pipeline since April 25 is resolved. The fix uses isolated browser sessions per location, each with its own OAuth-scoped login token. Smoke tests verified that all 4 locations produce the correct restaurant ID in network traffic.

Remaining items:

1. Merge PR — the feature/isolated-sessions branch is ready for review.

2. First production run — re-enable the daily cron in .github/workflows/daily-upload.yml after merge. Run with seed=true first to verify row counts match before enabling live GHL enrollment.

3. Dupe cleanup — ~1,434 duplicate workflow enrollments from April 25 still need to be cleaned up in GHL (separate task from the pipeline fix).