ENG-2851: Fix website monitor LLM toggle to set context_classifier#7566

Merged
dsill-ethyca merged 8 commits into main from ENG-2851-website-monitor-llm-context-classifier on Mar 5, 2026

Conversation

@dsill-ethyca
Contributor

@dsill-ethyca dsill-ethyca commented Mar 4, 2026

Description Of Changes

Fixes the website monitor LLM classifier toggle to properly set context_classifier='llm' when enabled, matching the behaviour of the database monitor form.

Previously the toggle only set llm_model_override, which meant the backend gate on context_classifier == 'llm' (added in the fidesplus BE) would never fire correctly for website monitors.

Also fixes the Save button being briefly disabled when flipping the LLM toggle, caused by validateFields running before the dynamically mounted model field finished registering with the form.

Code Changes

  • clients/admin-ui/src/features/integrations/configure-monitor/ConfigureWebsiteMonitorForm.tsx
    • Detect existing LLM config via context_classifier === "llm" instead of !!llm_model_override
    • Set context_classifier: "llm" when toggle is enabled, clear it when disabled
    • Include llm_model_override only when explicitly provided (optional — omitting it lets the BE default to DEFAULT_CLASSIFICATION_MODEL)
    • Defer validateFields with setTimeout(0) so dynamically mounted form fields finish registering before validation runs
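
The detection change above can be sketched in plain TypeScript. This is an illustrative sketch, not the actual component code — the helper names `wasLlmEnabledOld` and `isLlmEnabled` are invented here to contrast the two checks:

```typescript
type ClassifyParams = {
  context_classifier?: string;
  llm_model_override?: string;
};

// Old check: truthiness of the model override. A monitor that enabled LLM
// classification without overriding the default model reads as "off".
const wasLlmEnabledOld = (params?: ClassifyParams): boolean =>
  !!params?.llm_model_override;

// New check: matches the database monitor form and the backend gate
// on context_classifier === "llm".
const isLlmEnabled = (params?: ClassifyParams): boolean =>
  params?.context_classifier === "llm";
```

With the new check, a monitor saved with `context_classifier: "llm"` and no override is correctly detected as LLM-enabled, which the old override-based check missed.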

Steps to Confirm

  1. Open a website monitor config form
  2. Enable the LLM classifier toggle — Save button should remain enabled immediately
  3. Save without specifying a model override — monitor should save successfully
  4. Verify classify_params.context_classifier is "llm" in the saved monitor
  5. Disable the toggle, save — verify context_classifier is cleared

Summary by CodeRabbit

  • Bug Fixes
    • The website monitor LLM toggle now correctly enables LLM-based classification.
    • The LLM model field is consistently handled (mounted but hidden when not applicable), improving form layout and validation.
    • Disabling LLM classification now reliably clears any selected model settings.

- Set context_classifier: "llm" when LLM toggle is enabled, clear it when disabled
- Detect existing LLM config via context_classifier === "llm" instead of llm_model_override
- Include llm_model_override only when explicitly provided (optional)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@vercel
Contributor

vercel bot commented Mar 4, 2026

The latest updates on your projects.

Project | Deployment | Actions | Updated (UTC)
fides-plus-nightly | Ready | Preview, Comment | Mar 5, 2026 4:57pm

1 Skipped Deployment

Project | Deployment | Updated (UTC)
fides-privacy-center | Ignored | Mar 5, 2026 4:57pm

@coderabbitai

coderabbitai bot commented Mar 4, 2026

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.

📝 Walkthrough

Website monitor LLM toggle now derives from classify_params.context_classifier === "llm". Form initial state and payload assembly set or clear context_classifier and llm_model_override accordingly. LLM model selector field is always mounted and conditionally hidden to preserve form state.

Changes

Changelog — changelog/7566-fix-website-monitor-llm-context-classifier.yaml
  Added changelog entry documenting the fix: enabling LLM classification now sets context_classifier to "llm".

Website Monitor Form Logic — clients/admin-ui/src/features/integrations/configure-monitor/ConfigureWebsiteMonitorForm.tsx
  Detect use_llm_classifier from classify_params.context_classifier === "llm"; drive initial form value from that check; assemble classify_params to set/clear context_classifier and always pass llm_model_override (possibly undefined); minor merge logic change using ??.

LLM Model Selector UI — clients/admin-ui/src/features/common/form/LlmModelSelector.tsx
  Replaced conditional render of the model override field with an always-mounted Form.Item that is hidden when not applicable; keeps label/tooltip and disabled state but ensures consistent form registration and layout.

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

Poem

🐰 I hopped through props and fields so sly,
"llm" now flags the classifier high,
Hidden but present, the model waits,
Payloads tidy, no stray gates,
A little hop — the fix goes by! 🥕

🚥 Pre-merge checks — ✅ 3 passed

Title check — ✅ Passed: The pull request title directly and specifically identifies the main change: fixing the website monitor LLM toggle to set the context_classifier field.
Description check — ✅ Passed: The description covers the key sections including what changed, why it matters, and steps to confirm. However, the pre-merge checklist is not explicitly addressed in the provided description.
Docstring Coverage — ✅ Passed: No functions found in the changed files to evaluate docstring coverage.


@dsill-ethyca dsill-ethyca marked this pull request as ready for review March 4, 2026 21:03
@dsill-ethyca dsill-ethyca requested a review from a team as a code owner March 4, 2026 21:03
@greptile-apps
Contributor

greptile-apps bot commented Mar 4, 2026

Greptile Summary

This PR correctly fixes the website monitor LLM classifier toggle to properly set context_classifier: "llm" when enabled, addressing the issue where only llm_model_override was previously set.

Key changes:

  • Detection of existing LLM config now uses context_classifier === "llm" (matching the database monitor form) instead of !!llm_model_override
  • The classifyParams payload now explicitly sets context_classifier: "llm" on enable, and clears both context_classifier and llm_model_override on disable
  • llm_model_override is only included when explicitly provided, allowing the backend to fall back to its default model

The fix correctly aligns the website monitor form with the established pattern in the database monitor form and implements the intended behavior described in the PR.

Confidence Score: 5/5

  • Safe to merge — the fix is correct, focused, and properly implements the intended behavior for website monitor LLM classification.
  • The core fix is straightforward and correct. The change from detecting LLM usage via !!llm_model_override to context_classifier === "llm" matches the database monitor form pattern. The classifyParams construction properly sets and clears the context_classifier flag based on the toggle state. The fix is minimal, focused, and addresses exactly what the PR describes.
  • No files require special attention

Last reviewed commit: f02a25b


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In
`clients/admin-ui/src/features/integrations/configure-monitor/ConfigureWebsiteMonitorForm.tsx`:
- Around line 154-161: The current construction of classifyParams spreads
monitor?.classify_params which may include a stale llm_model_override; if
values.llm_model_override is cleared the conditional spread won't remove that
old value. Fix by starting from a copy of monitor?.classify_params (or a fresh
object), explicitly set context_classifier: "llm", then if
values.llm_model_override is present add llm_model_override, otherwise ensure
any existing llm_model_override key from monitor?.classify_params is removed (or
omitted) before sending; refer to classifyParams, values.use_llm_classifier,
monitor?.classify_params, and values.llm_model_override to locate and implement
this change.
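
The fix the review comment suggests might be sketched as follows. This is a hypothetical `buildLlmClassifyParams` helper assuming the toggle is on — not the committed code:

```typescript
type ClassifyParams = {
  context_classifier?: string;
  llm_model_override?: string;
  [key: string]: unknown;
};

// Start from a copy of the saved params, set the classifier flag, then either
// apply the new override or delete any stale one carried over by the spread.
const buildLlmClassifyParams = (
  existing: ClassifyParams | undefined,
  llmModelOverride?: string,
): ClassifyParams => {
  const params: ClassifyParams = { ...existing, context_classifier: "llm" };
  if (llmModelOverride) {
    params.llm_model_override = llmModelOverride;
  } else {
    // A cleared override no longer survives via the spread of saved params.
    delete params.llm_model_override;
  }
  return params;
};
```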


📥 Commits

Reviewing files that changed from the base of the PR and between f02a25b and 57ae375.

📒 Files selected for processing (1)
  • clients/admin-ui/src/features/integrations/configure-monitor/ConfigureWebsiteMonitorForm.tsx

- Simplify classifyParams to a flat object spread
- Use hidden prop on LLM model override Form.Item instead of conditional
  rendering, eliminating the need for the setTimeout(0) validation workaround

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
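
The flat object spread this commit describes might look roughly like the following (an illustrative sketch, not the actual diff): explicit undefined values mask any stale keys from the saved monitor and are dropped when the payload is JSON-serialized for the request.

```typescript
type ClassifyParams = {
  context_classifier?: string;
  llm_model_override?: string;
  [key: string]: unknown;
};

// One flat spread covers both toggle states: when the toggle is off, both
// keys are explicitly set to undefined, overriding stale saved values.
const assembleClassifyParams = (
  existing: ClassifyParams | undefined,
  useLlm: boolean,
  modelOverride?: string,
): ClassifyParams => ({
  ...existing,
  context_classifier: useLlm ? "llm" : undefined,
  llm_model_override: useLlm ? modelOverride : undefined,
});
```

This avoids the imperative copy-and-delete shape entirely, since `JSON.stringify` omits keys whose value is undefined.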
Contributor

@speaker-ender speaker-ender left a comment


@dsill-ethyca approving as the changes look good, but I think there are some tests that need to be fixed with the latest change before it can be merged.

The model override Form.Item now uses hidden prop instead of conditional
rendering, so the input is always in the DOM. Update assertions from
not.toBeInTheDocument / not.exist to not.toBeVisible / not.be.visible.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Wait for dictionary to load before checking rows (dictionary reload was
resetting row selections), and wait for POST to complete before checking
URL redirect.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
@dsill-ethyca dsill-ethyca added this pull request to the merge queue Mar 5, 2026
Merged via the queue into main with commit aa26617 Mar 5, 2026
47 checks passed
@dsill-ethyca dsill-ethyca deleted the ENG-2851-website-monitor-llm-context-classifier branch March 5, 2026 20:32