ENG-2851: Fix website monitor LLM toggle to set context_classifier (#7566)
Conversation
- Set `context_classifier: "llm"` when the LLM toggle is enabled, clear it when disabled
- Detect existing LLM config via `context_classifier === "llm"` instead of `llm_model_override`
- Include `llm_model_override` only when explicitly provided (optional)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~10 minutes
🚥 Pre-merge checks: ✅ 3 passed
Greptile Summary

This PR correctly fixes the website monitor LLM classifier toggle to properly set `context_classifier: "llm"`. Key changes:
The fix correctly aligns the website monitor form with the established pattern in the database monitor form and implements the intended behavior described in the PR.

Confidence Score: 5/5

Last reviewed commit: f02a25b
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In
`@clients/admin-ui/src/features/integrations/configure-monitor/ConfigureWebsiteMonitorForm.tsx`:
- Around lines 154-161: The current construction of `classifyParams` spreads `monitor?.classify_params`, which may include a stale `llm_model_override`; if `values.llm_model_override` is cleared, the conditional spread won't remove the old value. Fix by starting from a copy of `monitor?.classify_params` (or a fresh object), explicitly setting `context_classifier: "llm"`, then adding `llm_model_override` only if `values.llm_model_override` is present; otherwise ensure any existing `llm_model_override` key from `monitor?.classify_params` is removed (or omitted) before sending. Refer to `classifyParams`, `values.use_llm_classifier`, `monitor?.classify_params`, and `values.llm_model_override` to locate and implement this change.
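The suggested fix can be sketched roughly as follows. This is an illustrative sketch, not the PR's actual code: `buildClassifyParams`, the `ClassifyParams` shape, and the parameter names are all hypothetical stand-ins for the form's submit logic.

```typescript
// Hypothetical helper sketching the reviewer's suggestion: start from a copy
// of the stored params, then explicitly add or remove llm_model_override so
// a stale override can never survive a cleared form field.

interface ClassifyParams {
  context_classifier?: string;
  llm_model_override?: string;
}

function buildClassifyParams(
  existing: ClassifyParams | undefined,
  useLlmClassifier: boolean,
  llmModelOverride?: string,
): ClassifyParams {
  // Copy so the monitor's stored params are never mutated
  const params: ClassifyParams = { ...existing };

  if (useLlmClassifier) {
    params.context_classifier = "llm";
    if (llmModelOverride) {
      params.llm_model_override = llmModelOverride;
    } else {
      // Remove any stale override so the backend falls back to its default model
      delete params.llm_model_override;
    }
  } else {
    // Toggle off: clear both the classifier flag and any override
    delete params.context_classifier;
    delete params.llm_model_override;
  }
  return params;
}
```

The key design point is that `delete` (rather than a conditional spread) guarantees the stale key is absent from the payload sent to the backend.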
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 75abecbb-bdd5-4840-aea1-0c1ab4ed56c1
📒 Files selected for processing (1)
clients/admin-ui/src/features/integrations/configure-monitor/ConfigureWebsiteMonitorForm.tsx
- Simplify `classifyParams` to a flat object spread
- Use the `hidden` prop on the LLM model override `Form.Item` instead of conditional rendering, eliminating the need for the `setTimeout(0)` validation workaround

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
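The race that the `setTimeout(0)` workaround papered over can be illustrated with a minimal sketch. This is plain TypeScript, not real AntD code: `MockForm`, `register`, and this `validateFields` are hypothetical stand-ins for the form instance's field registry.

```typescript
// Sketch of why validating right after conditionally mounting a field can
// fail, and why a field that stays mounted (merely hidden) validates fine.

class MockForm {
  private fields = new Set<string>();

  register(name: string) {
    this.fields.add(name);
  }

  // Mirrors the idea of validateFields: fields the form has never seen fail
  validateFields(names: string[]): boolean {
    return names.every((n) => this.fields.has(n));
  }
}

// Conditional rendering: the field registers asynchronously after the toggle
// flips, so a synchronous validateFields call races against registration.
const conditional = new MockForm();
queueMicrotask(() => conditional.register("llm_model_override")); // registers "later"
const raced = conditional.validateFields(["llm_model_override"]); // too early: fails

// Hidden prop: the field is always mounted, so it is registered up front and
// validation can run immediately with no setTimeout(0) deferral.
const hidden = new MockForm();
hidden.register("llm_model_override");
const ok = hidden.validateFields(["llm_model_override"]);

if (raced || !ok) throw new Error("unexpected validation ordering");
```

Under this model, keeping the field in the DOM with `hidden` removes the timing dependency entirely instead of deferring around it.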
speaker-ender left a comment:
@dsill-ethyca approving as the changes look good, but I think there are some tests that need to be fixed with the latest change before it can be merged.
The model override `Form.Item` now uses the `hidden` prop instead of conditional rendering, so the input is always in the DOM. Update assertions from `not.toBeInTheDocument` / `not.exist` to `not.toBeVisible` / `not.be.visible`.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Wait for the dictionary to load before checking rows (dictionary reload was resetting row selections), and wait for the POST to complete before checking the URL redirect.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Description Of Changes
Fixes the website monitor LLM classifier toggle to properly set `context_classifier: "llm"` when enabled, matching the behaviour of the database monitor form. Previously the toggle only set `llm_model_override`, which meant the backend gate on `context_classifier == "llm"` (added in the fidesplus BE) would never fire correctly for website monitors.

Also fixes the Save button being briefly disabled when flipping the LLM toggle, caused by `validateFields` running before the dynamically mounted model field finished registering with the form.

Code Changes
`clients/admin-ui/src/features/integrations/configure-monitor/ConfigureWebsiteMonitorForm.tsx`:
- Detect existing LLM config via `context_classifier === "llm"` instead of `!!llm_model_override`
- Set `context_classifier: "llm"` when the toggle is enabled, clear it when disabled
- Include `llm_model_override` only when explicitly provided (optional — omitting it lets the BE default to `DEFAULT_CLASSIFICATION_MODEL`)
- Wrap `validateFields` with `setTimeout(0)` so dynamically mounted form fields finish registering before validation runs

Steps to Confirm
- With the toggle enabled, verify `classify_params.context_classifier` is `"llm"` in the saved monitor
- With the toggle disabled, verify `context_classifier` is cleared
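The detection change described above can be sketched as a small predicate. This is illustrative only — the helper name and parameter shape are assumptions, not the form's actual code.

```typescript
// Illustrative sketch: derive the toggle's initial state from
// context_classifier rather than from the presence of llm_model_override.

interface WebsiteClassifyParams {
  context_classifier?: string;
  llm_model_override?: string;
}

function deriveUseLlmClassifier(params?: WebsiteClassifyParams): boolean {
  // A leftover llm_model_override is not a reliable signal: the backend
  // gate only fires on context_classifier === "llm".
  return params?.context_classifier === "llm";
}
```

Checking the classifier field directly means a monitor with a stale override but no LLM classifier renders with the toggle off, matching what the backend will actually do.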