
Feature/publication tree#1249

Open
limina1 wants to merge 36 commits into damus-io:master from limina1:feature/publication-tree

Conversation


@limina1 limina1 commented Jan 18, 2026

Summary

  • Add notedeck_publications crate for NKBIP-01 (kind 30040/30041) publication support
  • Implement hierarchical publication tree navigation with breadcrumbs
  • Add Publications feed column and publication reader view
  • Add NIP-11 relay info caching to respect relay limits when fetching
  • Add "Load More" pagination for publications timeline
  • Add CI workflow for Android APK builds and GitHub Releases

Features

  • Parse and display 30040 index events and 30041 section events
  • Tree navigation: parent/child/sibling traversal
  • Index view with section counts on feed cards
  • Reader mode toggle for publications
  • Relay info caching to batch subscription filters appropriately

Summary by CodeRabbit

  • New Features

    • Publication reader UI (Continuous, Paginated, Outline) with TOC, navigation, and reader app integration.
    • Publications timeline/column, publication cards, and "load more" for older entries.
    • Relay info fetching + caching with aggregated relay limit detection.
  • Documentation

    • Added comprehensive publication reader design document.
  • Chores

    • Multi-platform release automation for Android/Linux/macOS/Windows.
    • Added additional bootstrap relays and workspace crates for publications/reader.


limina1 and others added 22 commits January 6, 2026 17:12
Implements core data structures for working with Nostr curated
publications (kinds 30040/30041) as defined in NKBIP-01:

- EventAddress: NIP-33 event addressing (kind:pubkey:dtag)
- PublicationTreeNode: Branch/leaf node types with lazy resolution
- PublicationTree: Tree structure with depth-first traversal
- PublicationFetcher: Filter building for relay subscriptions
- D-tag utilities: Namespaced d-tag generation to avoid collisions

This lays the foundation for a Knowledge Base tab that will allow
users to browse and understand the Nostr ecosystem through curated
publications.
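The NIP-33 addressing format described above (`kind:pubkey:dtag`) can be sketched as follows. This is an illustrative stand-in, not the actual `EventAddress` type from `notedeck_publications`; note that the split must stop after the second colon, since a d-tag may itself contain `:`.

```rust
// Hypothetical sketch of kind:pubkey:dtag parsing; the real EventAddress
// in notedeck_publications may store the pubkey as raw bytes instead.
#[derive(Debug, PartialEq)]
struct EventAddress {
    kind: u32,
    pubkey: String, // hex-encoded, 64 chars
    dtag: String,
}

fn parse_address(s: &str) -> Option<EventAddress> {
    // Split on the first two colons only: the d-tag itself may contain ':'.
    let mut parts = s.splitn(3, ':');
    let kind = parts.next()?.parse().ok()?;
    let pubkey = parts.next()?.to_string();
    if pubkey.len() != 64 || !pubkey.chars().all(|c| c.is_ascii_hexdigit()) {
        return None;
    }
    let dtag = parts.next()?.to_string();
    Some(EventAddress { kind, pubkey, dtag })
}
```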

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Demonstrates fetching NKBIP-01 publications from wss://thecitadel.nostr1.com:
- Discovers publication indices (kind 30040)
- Parses a-tags to find section references
- Fetches content sections (kind 30041)
- Uses PublicationFetcher.build_filters() for relay subscriptions

Run with: cargo run --example fetch_publication -p notedeck_publications

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Adds support for displaying NKBIP-01 publication indices (kind 30040)
as a timeline feed. This enables users to browse curated publications.

Changes:
- Add TimelineKind::Publications variant to kind.rs
- Add publications_filter() returning kind 30040 filter
- Add serialization/parsing for "publications" token
- Update all match statements across app.rs, route.rs, header.rs

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
The hyper 1.x API requires explicit type annotations for async response
handling. Simplified body reading using Limited::new() with BodyExt::collect()
instead of frame-by-frame processing.

Also removed fallback to webpki_roots as it requires a feature flag.
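The effect of the `Limited::new()` + `BodyExt::collect()` change — capping how many bytes of a response body are read — can be illustrated with the standard library's `Read::take`. This is an analogy only, not the hyper 1.x code: one difference, handled explicitly below, is that `http_body_util::Limited` errors when the limit is exceeded, whereas `take` truncates silently.

```rust
use std::io::Read;

// Illustrative cap on body size using std::io, mirroring the intent of
// http_body_util::Limited in the commit above.
fn read_capped(mut source: impl Read, max_bytes: u64) -> std::io::Result<Vec<u8>> {
    let mut buf = Vec::new();
    // `take` yields an adapter that stops after `max_bytes` bytes.
    source.by_ref().take(max_bytes).read_to_end(&mut buf)?;
    // Probe for leftover data so an oversized body becomes an error,
    // like Limited, instead of being silently truncated.
    let mut probe = [0u8; 1];
    if source.read(&mut probe)? > 0 {
        return Err(std::io::Error::new(
            std::io::ErrorKind::InvalidData,
            "body exceeded length limit",
        ));
    }
    Ok(buf)
}
```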

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Users can now add a Publications column from the Add Column menu.
This creates a timeline that subscribes to kind 30040 events
(NKBIP-01 publication indices).

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Adds the infrastructure for viewing publications:
- PublicationSelection type to hold publication context
- Route::Publication variant for navigation
- Serialization/parsing support for the route
- Placeholder reader view in nav.rs

Also adds docs/publication-reader-design.org with layout mockups.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Feed improvements:
- Filter 30040 events to require title tag and empty content
- Custom publication card with title, author tag, and index author
- Cover image thumbnail from "image" tag (right-justified)
- Options button on each card
- Whole-card clickable via hitbox pattern

Publication reader:
- Dual reading modes: continuous scroll and paginated
- TOC overlay with dimmed background drawer
- Section cards with options buttons
- Text word-wrapping with paragraph breaks
- Previous/Next chapter navigation in paginated mode

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Copy Note ID and Copy Note Link now correctly use naddr (instead of
nevent) for parameterized replaceable events (kinds 30000-39999).

This ensures publications (30040/30041) are referenced by their
coordinate (pubkey + kind + d-tag) rather than event ID, which is
the correct way to reference addressable events per NIP-19.
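The decision rule behind this fix is a simple kind-range check: per NIP-19/NIP-33 semantics, parameterized replaceable events occupy kinds 30000–39999 and should be referenced by coordinate (naddr) rather than event id (nevent). A minimal sketch of that check (the function name is illustrative, not the Damus code):

```rust
// Parameterized replaceable events (kinds 30000..40000) are addressed by
// coordinate (naddr); everything else by event id (nevent).
fn use_naddr(kind: u32) -> bool {
    (30000..40000).contains(&kind)
}
```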

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Replace flat section list with PublicationTree from notedeck_publications
crate to properly handle NKBIP-01 nested publication structures.

Key improvements:
- Use PublicationTree for lazy loading of nested 30040/30041 hierarchies
- Add iterative resolution that cascades through nested branches
- Pre-resolve nodes already in nostrdb before subscribing
- Group subscription filters by (kind, pubkey) for multi-author publications
- Add collapsible TOC rendering with expand/collapse for branch nodes
- Add diagnostic logging for subscription and relay debugging
- Support MAX_INLINE_DEPTH (5 levels) before opening new publication view

The tree-based approach correctly handles publications where index nodes
(30040) can contain references to other index nodes, not just content
sections (30041).

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add history stack to PublicationSelection for tracking navigation through
nested publications. This enables back navigation when drilling into
deeply nested publication trees.

- Add navigate_into() to push current publication and switch to nested one
- Add navigate_back() to return to previous publication
- Add can_go_back() and breadcrumbs() for UI integration
- Hash/Eq only considers current index_id for timeline cache lookups
- Remove redundant PublicationHistory struct from publication.rs
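The history-stack mechanics described above can be sketched like this. Field and method names follow the commit message, but this is a simplified stand-in for the real `PublicationSelection`, which tracks richer publication context than a bare id:

```rust
// Simplified navigation history: a stack of parent publications plus the
// publication currently being viewed.
struct PublicationSelection {
    current: String,      // id of the publication being viewed
    history: Vec<String>, // stack of parent publications
}

impl PublicationSelection {
    fn navigate_into(&mut self, nested: String) {
        // Push the current publication, then switch to the nested one.
        let prev = std::mem::replace(&mut self.current, nested);
        self.history.push(prev);
    }

    fn navigate_back(&mut self) -> bool {
        match self.history.pop() {
            Some(prev) => {
                self.current = prev;
                true
            }
            None => false,
        }
    }

    fn can_go_back(&self) -> bool {
        !self.history.is_empty()
    }
}
```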

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add back button and breadcrumb trail for navigating nested publications.
When drilling into nested publications, users can now see their path
and navigate back to previous publications.

- Add PublicationNavAction enum for Back and NavigateInto actions
- Add back button (←) in header when navigation history exists
- Add breadcrumb trail showing parent publication titles
- Add RenderNavAction::PublicationNav variant for action processing
- Add routes_mut() to ColumnsRouter for modifying routes in place
- Update PublicationViewResponse to include navigation actions

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add a method to get previous and next sibling indices for a node,
enabling horizontal navigation between siblings at the same tree level.
This supports the upcoming index view mode with prev/next navigation.
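Sibling lookup of this kind can be sketched on a flat node arena, assuming each node stores its parent index and parents keep ordered child lists (as the publication tree does). The `Node` shape here is illustrative, not the crate's actual type:

```rust
// Minimal arena node: index-based parent link and ordered children.
struct Node {
    parent: Option<usize>,
    children: Vec<usize>,
}

// Returns (previous sibling, next sibling) for the node at `idx`.
fn sibling_indices(nodes: &[Node], idx: usize) -> (Option<usize>, Option<usize>) {
    let parent = match nodes.get(idx).and_then(|n| n.parent) {
        Some(p) => p,
        None => return (None, None), // the root has no siblings
    };
    let siblings = &nodes[parent].children;
    let pos = match siblings.iter().position(|&c| c == idx) {
        Some(p) => p,
        None => return (None, None),
    };
    let prev = pos.checked_sub(1).map(|p| siblings[p]);
    let next = siblings.get(pos + 1).copied();
    (prev, next)
}
```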

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add the foundational types for Index view mode:
- Add Index variant to ReaderMode enum
- Add IndexViewState to track current node position
- Extend PublicationNavAction with DrillDown, DrillUp, PrevSibling, NextSibling
- Add placeholder handling in nav.rs and render_publication

The Index view mode will allow drilling down through the publication tree
layer by layer, showing immediate children of the current node instead of
flattening all content.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add render_index_view method that shows the publication tree structure
layer by layer:
- Shows immediate children of current node
- 30040 (branch) children display as clickable cards to drill down
- 30041 (leaf) children display as cards with their content
- Breadcrumb trail shows path from root to current node
- Type indicators (📁 for branches, 📄 for leaves)
- Loading spinners for unresolved nodes

This enables navigating deeply nested publications without loading
all content at once.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add navigation controls for index view mode in the header:
- Toggle button (📑) to enter index view from continuous/paginated modes
- Up button (⬆) to navigate to parent when not at root
- Prev/Next sibling navigation (◀/▶) for horizontal tree traversal

This allows full navigation through the publication tree structure
using the header controls in addition to clicking on branch cards.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Show the number of sections (a-tags) in publication cards in the feed.
This gives users a preview of how much content a publication contains
before opening it.

Displays as "N sections" below the "published by" line.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add a mode toggle to the Publications feed header that sets the default
reader mode when opening publications:
- 📜 Reader (default): Opens publications in continuous/paginated view
- 📑 Index: Opens publications in index view for tree navigation

When a publication is opened, it now checks for this preference and
uses it as the initial reader mode. The toggle only affects newly
opened publications - it doesn't change the display of the feed itself.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add GitCitadel's relay for fetching publications content.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add RelayInfoCache to fetch and cache NIP-11 relay information documents.
This allows respecting relay limits (max_event_tags, max_limit, etc.) when
building subscription filters for publication trees.

- Add relay_info module with async NIP-11 fetching via hyper
- Integrate RelayInfoCache into AppContext for app-wide access
- Use relay limits to determine batch sizes for publication fetches
- Add "Load More" button to publications timeline for fetching older events
- Add NoteAction::LoadMore variant for triggering pagination
- Add oldest_ref() to NoteUnits for determining pagination cursor
- Add thecitadel.nostr1.com to bootstrap relays
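Using a relay's advertised limits to size filter batches might look like the sketch below. The shape is illustrative: NIP-11 exposes limits under a `limitation` object (e.g. `max_limit`), and this PR aggregates cached values across relays, while the sketch simply chunks a d-tag list by a single cap.

```rust
// Chunk d-tags into batches no larger than the relay's advertised cap,
// falling back to a conservative default (100 here is an assumption)
// when the relay reports no limit.
fn batch_dtags(dtags: &[String], relay_max: Option<usize>) -> Vec<&[String]> {
    let batch = relay_max.unwrap_or(100).max(1);
    dtags.chunks(batch).collect()
}
```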

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add android-apk job to build APK on master/ci branches and tags
- Create release.yml workflow triggered on version tags (v*)
- Builds all platforms: Android, Linux, macOS, Windows
- Creates GitHub Release with all artifacts attached
- Enables Obtainium support for Android app distribution

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

coderabbitai bot commented Jan 18, 2026

📝 Walkthrough

Walkthrough

Adds NKBIP‑01 publication support (new notedeck_publications crate and reader), a reader app (notedeck_reader) and UI integration, NIP‑11 relay info caching, timeline/route/nav changes for Publications and LoadMore, timeline publication cards, workspace manifest updates, and new cross‑platform GitHub Actions release and Android APK CI jobs.

Changes

Cohort / File(s) Summary
CI / Release Workflows
.github/workflows/release.yml, .github/workflows/rust.yml
New cross‑platform release workflow (android, linux, macos, windows) with artifact aggregation and GitHub Release creation; adds android-apk job to existing Rust workflow.
Workspace Manifests
Cargo.toml, crates/notedeck_publications/Cargo.toml, crates/notedeck_chrome/Cargo.toml, crates/notedeck_reader/Cargo.toml, crates/notedeck_columns/Cargo.toml
Adds notedeck_publications and notedeck_reader to workspace and dependencies; optional notedeck_reader feature for chrome.
Relay info & network
crates/notedeck/src/relay_info.rs, crates/notedeck/src/lib.rs, crates/notedeck/src/app.rs, crates/notedeck/src/context.rs, crates/notedeck/src/account/relay.rs, crates/notedeck/src/media/network.rs
New RelayInfo module and RelayInfoCache with async NIP‑11 fetching, caching, in‑flight tracking and aggregation; integrated into Notedeck/AppContext; adds two bootstrap relays and tighter HTTP body handling.
Publications core crate
crates/notedeck_publications/src/{lib.rs,address.rs,constants.rs,dtag.rs,fetcher.rs,node.rs,tree.rs}, crates/notedeck_publications/examples/fetch_publication.rs
New crate implementing EventAddress parsing, publication kind constants, d‑tag utilities, PublicationTree/node types, PublicationFetcher, PublicationRequest builder, tests, and example.
Reader crate & integration
crates/notedeck_reader/src/{lib.rs,nav.rs,state.rs}, crates/notedeck_reader/src/ui/{mod.rs,reader.rs}, crates/notedeck_reader/Cargo.toml
New reader crate exposing PublicationSelection, Publications/PublicationTreeState, PublicationView and UI; ReaderApp type with App impl and navigation state.
Columns / Timeline core
crates/notedeck_columns/src/timeline/{kind.rs,mod.rs/publication.rs,note_units.rs,timeline_units.rs,route.rs}, crates/notedeck_columns/src/timeline/publication.rs
Adds TimelineKind::Publications, PublicationSelection, publications_load_more_filter, oldest/oldest_ref helpers, and PublicationTreeState integration.
Columns UI: publication cards & timeline
crates/notedeck_columns/src/ui/{publication.rs,mod.rs,timeline.rs,add_column.rs,column/header.rs}
New PublicationView UI (Continuous/Paginated/Outline) in columns crate; timeline renders publication cards with thumbnail/metadata; Add Column option for Publications; per-column load‑more support.
Columns navigation & actions
crates/notedeck_columns/src/{nav.rs,route.rs,actionbar.rs,note/action.rs,context.rs,app.rs}
New Route::Publication, navigation plumbing, NoteAction::LoadMore, NoteActionResult/AppAction propagation, and app-level OpenPublication/SwitchToColumns actions.
Publication reader UI (reader crate)
crates/notedeck_reader/src/ui/reader.rs
Full publication reader UI: PublicationView, PublicationNavAction, ReaderMode, TOC overlay, multi‑mode rendering and navigation actions.
Integration in Chrome / App
crates/notedeck_chrome/src/{app.rs,chrome.rs}
Reader app registration and NotedeckApp::Reader variant; Chrome::open_publication and AppAction routing to Reader (feature‑gated), plus borrow‑safe action processing refactor.
Docs & design
docs/publication-reader-design.org
New design document describing publication reader layouts, loading strategy, state model, and mobile guidelines.

Sequence Diagram(s)

sequenceDiagram
  autonumber
  participant UI as UI/PublicationView
  participant PubMgr as Publications
  participant Fetcher as PublicationFetcher
  participant Pool as RelayPool
  participant Relay as Remote Relays / WebSocket
  participant DB as NostrDB

  UI->>PubMgr: open(index_id, batch_size)
  PubMgr->>DB: query index note
  DB-->>PubMgr: index note
  PubMgr->>Fetcher: build request for pending addresses
  Fetcher->>Pool: subscribe(filters, sub_id)
  Pool->>Relay: send SUBSCRIBE (filters)
  Relay-->>Pool: EVENT / EOSE
  Pool-->>Fetcher: deliver events
  Fetcher->>DB: store events
  Fetcher-->>PubMgr: resolved nodes (updates)
  PubMgr-->>UI: render updated tree/state

Estimated code review effort

🎯 5 (Critical) | ⏱️ ~120 minutes

Poem

🐰 I hopped through relays and tree‑lined threads,

Collected tags and sculpted tiny d‑tag beds,
Cached whispers, stitched an outline bright and true,
Opened reader doors and let publications through,
A joyful hop — new stories bloom for you!

🚥 Pre-merge checks: 2 passed, 1 inconclusive

❓ Title check — Inconclusive. The title "Feature/publication tree" is vague: it uses a generic "Feature/" prefix and the term "publication tree" without context, so someone unfamiliar with the codebase can't tell what the main change entails. Consider a more descriptive title like "Add NKBIP-01 publication reader with hierarchical tree navigation" or "Implement publication tree support with reader UI and relay info caching".
✅ Description check — Passed. Check skipped; CodeRabbit's high-level summary is enabled.
✅ Docstring coverage — Passed. Docstring coverage is 100.00%, above the required 80.00% threshold.




@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 12

🤖 Fix all issues with AI agents
In @.github/workflows/release.yml:
- Around line 83-92: In the "Collect packages" GitHub Actions step the cp
commands for RPM/DEB use "|| true" which masks missing artifacts and allows
incomplete releases; update the step (the cp lines inside the "Collect packages"
block) to remove "|| true" and instead explicitly check for matching files
before copying (for example test -e target/.../*.rpm || { echo "No RPMs found";
exit 1; } and similarly for .deb, or use a safe glob check) so the job fails
fast when no packages are produced rather than silently continuing.
- Around line 193-205: Update the release job's conditional and action version:
replace the current if condition (if: always() && needs.build-android.result ==
'success') with a gate that requires all build jobs to succeed (e.g. if:
needs.build-android.result == 'success' && needs.build-linux.result == 'success'
&& needs.build-macos.result == 'success' && needs.build-windows.result ==
'success') and update the step using softprops/action-gh-release@v1 to the
latest v2.x release (e.g. softprops/action-gh-release@v2.5.0) in the "Create
Release" step so the release only publishes when all builds pass and uses the
current action version.

In @.github/workflows/rust.yml:
- Around line 51-76: The job-level if conditional currently contains an
unreachable check startsWith(github.ref, 'refs/tags/') because the workflow
trigger only defines push branches (master, ci); either remove that
startsWith(...) clause from the job's if conditional so it reads: if:
github.ref_name == 'master' || github.ref_name == 'ci', or alternatively add tag
pushes to the workflow trigger (on.push.tags) so the existing
startsWith(github.ref, 'refs/tags/') condition becomes reachable; update the if
clause or the workflow trigger accordingly and keep the rest of the job (steps
like Build JNI library, Build APK, Upload APK) unchanged.

In `@crates/notedeck_columns/src/actionbar.rs`:
- Around line 224-262: The LoadMore handler (NoteAction::LoadMore) currently
only queries the local nostrdb via ndb.query using the filter built by
crate::timeline::kind::publications_load_more_filter(until); change it so that
when the local query returns no/newer results you also trigger a remote fetch:
call the existing relay-subscription helper (e.g. try_add_remote) with the same
filters/limit to request older publications from relays, and ensure you keep the
existing timeline_cache/timeline_res behavior so that when remote events arrive
they will be merged into the timeline (you may need to register the subscription
id or a callback to update TimelineOpenResult/NotesOpenResult when remote
results are delivered).

In `@crates/notedeck_columns/src/timeline/publication.rs`:
- Around line 54-61: needs_fetch currently prevents re-fetch of addresses still
in pending_fetch but those are only removed when a note resolves, so publication
subscriptions can leave addresses "in-flight" forever; add a new
SubKind::Publication variant wherever SubKind is defined and ensure
subscriptions created for publications mark addresses into pending_fetch, then
update handle_eose(&mut self, sub_id: &SubscriptionId) to detect EOSE for
SubKind::Publication and remove any addresses that remain unresolved from
pending_fetch (and optionally record them as unresolved for retry), and add
timeout-based retry logic (e.g., store a timestamp per address or a retry queue)
that will move addresses back out of pending_fetch after the timeout to allow
needs_fetch(&self, batch_size: usize) to include them again; reference symbols:
needs_fetch, pending_fetch, pending_addresses, handle_eose,
SubKind::Publication, tree, and SubscriptionId.

In `@crates/notedeck_columns/src/ui/publication.rs`:
- Around line 546-552: The progress label currently uses the same value twice
because sections is the resolved set; update the ui.label inside the is_complete
check to show resolved vs total by using the resolved count and the total count
from the surrounding data (e.g., use the resolved length and the total_sections
value carried in the section_data tuple or captured earlier) instead of calling
sections.len() for both placeholders; locate the ui.horizontal block with
ui.spinner and ui.label and replace the second sections.len() with the
appropriate total variable (or extract total from section_data) so it reads
"Fetching... (resolved/total)".
- Around line 401-406: The current truncation uses a byte slice
&crumb_title[..12] which can panic on multi-byte UTF-8 characters; change the
logic in the display_title computation to use char-aware operations: check the
character count (e.g., crumb_title.chars().count()) and when it exceeds the
threshold, build the truncated title with
crumb_title.chars().take(12).collect::<String>() (then append the ellipsis)
instead of slicing by byte indices; update the display_title creation
accordingly to avoid UTF-8 boundary panics.
- Around line 763-767: The truncation uses byte-slicing (&title[..17]) which can
panic on multi-byte UTF-8; update the logic around the display variable to
operate on characters/graphemes instead: check length with title.chars().count()
(or use unicode_segmentation::UnicodeSegmentation and graphemes(true).count())
and build the truncated string with a char/grapheme-aware operation (e.g.,
take(17) and collect into a String) when creating display instead of
byte-slicing the title.
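Both truncation comments above call for the same fix; a char-aware helper along these lines covers them (the function name is illustrative — the PR truncates inline rather than through a shared helper):

```rust
// Char-aware truncation: byte slicing like &title[..17] panics when the
// index falls inside a multi-byte UTF-8 character, so count and take
// chars instead.
fn truncate_title(title: &str, max_chars: usize) -> String {
    if title.chars().count() <= max_chars {
        title.to_string()
    } else {
        let mut out: String = title.chars().take(max_chars).collect();
        out.push('…');
        out
    }
}
```

For titles mixing scripts where a "character" should mean a user-perceived grapheme (as the second comment notes), `unicode_segmentation::graphemes(true)` would replace `chars()` here.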

In `@crates/notedeck_publications/examples/fetch_publication.rs`:
- Around line 83-88: The code prints a truncated pubkey with &pubkey[..16],
which can panic if pubkey is shorter (e.g., when JSON parsing defaulted to
empty); update the print to safely take at most 16 chars using pubkey.get(..16)
(or equivalent safe slicing) and fall back to the full pubkey or a placeholder,
while keeping the surrounding context in the block where kind ==
KIND_PUBLICATION_INDEX and the variables title, dtag, pubkey, and a_tags are
used.

In `@crates/notedeck_publications/src/dtag.rs`:
- Around line 125-130: generate_dtag currently concatenates title_abbreviation
and slug with a hyphen which yields a leading hyphen for empty titles; change
generate_dtag to call title_abbreviation (and/or slugify) and only prepend the
"-" when the abbreviation is non-empty (i.e., if title_abbreviation(...) != ""
then format "{abbr}-{slug}" else return slug), and adjust tests to expect "test"
for generate_dtag("", "test"); also ensure extract_abbreviation still behaves
correctly with the new format (it should return None or Some("") according to
your semantics) by updating its expectations if needed.
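The suggested `generate_dtag` fix amounts to joining with a hyphen only when the abbreviation is non-empty. A sketch, with `title_abbreviation` stubbed as first-letter-of-each-word (an assumption — the real helper in dtag.rs may abbreviate differently):

```rust
// Stub abbreviation: lowercase first letter of each whitespace-separated word.
fn title_abbreviation(title: &str) -> String {
    title
        .split_whitespace()
        .filter_map(|w| w.chars().next())
        .collect::<String>()
        .to_lowercase()
}

// Only prepend "{abbr}-" when the abbreviation is non-empty, so an empty
// title no longer yields a leading hyphen.
fn generate_dtag(title: &str, slug: &str) -> String {
    let abbr = title_abbreviation(title);
    if abbr.is_empty() {
        slug.to_string()
    } else {
        format!("{abbr}-{slug}")
    }
}
```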

In `@crates/notedeck_publications/src/fetcher.rs`:
- Around line 174-179: The reset() method currently resets state,
subscription_id, and events_received but fails to clear the batch counter;
update Fetcher::reset (the reset method on the struct that has the
batches_completed field) to also set self.batches_completed = 0 so
batches_completed does not carry over into a new fetch operation.

In `@crates/notedeck/src/media/network.rs`:
- Around line 93-98: The error mapping for collecting the limited body must box
the LengthLimitError to match the HyperHttpError::Hyper variant; update the
map_err in the Limited::new / BodyExt::collect call so it converts the error
into HyperHttpError::Hyper(Box::new(e)) (i.e., use Box::new(e) or equivalent)
when mapping the collection error coming from BodyExt::collect, keeping the rest
of the flow (limited_body, collected, bytes) unchanged.
🧹 Nitpick comments (10)
crates/notedeck_publications/src/address.rs (1)

69-83: Consider removing redundant to_string_format method.

The Display implementation already provides string formatting via to_string(). Having both to_string_format() and Display::fmt is redundant and may confuse API users.

♻️ Suggested simplification

Either remove to_string_format and rely on Display:

-    /// Convert to string format
-    pub fn to_string_format(&self) -> String {
-        format!("{}:{}:{}", self.kind, hex::encode(self.pubkey), self.dtag)
-    }
-

 impl fmt::Display for EventAddress {
     fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
-        write!(f, "{}", self.to_string_format())
+        write!(f, "{}:{}:{}", self.kind, hex::encode(self.pubkey), self.dtag)
     }
 }

Or keep to_string_format as a private helper called by Display.

crates/notedeck_publications/src/tree.rs (2)

30-35: Dead code: cursor is unused and bookmark is set but never read.

Both fields are marked or appear unused:

  • cursor has #[allow(dead_code)] indicating it's not implemented yet
  • bookmark can be set via set_bookmark/clear_bookmark but is never read

Consider removing these or documenting them as placeholders for future functionality.

♻️ If these are planned features, add TODO comments
     /// Current cursor position for iteration
-    #[allow(dead_code)]
-    cursor: Option<usize>,
+    // TODO: Implement cursor-based iteration for continuous reading
+    #[allow(dead_code)]
+    cursor: Option<usize>,

     /// Bookmark for resuming iteration
+    // TODO: Implement bookmark restoration on tree reload
     bookmark: Option<usize>,

389-420: LeafIterator relies on insertion order rather than explicit order field.

The iterator pushes children in reverse insertion order, assuming children vector matches the order field. The children() method (line 217) explicitly sorts by order, but LeafIterator doesn't.

If nodes are ever reordered or added out of sequence, the iterator would yield incorrect order. Consider sorting before pushing to stack.

♻️ Optional: Sort by order for robustness
 impl<'a> LeafIterator<'a> {
     fn new(tree: &'a PublicationTree) -> Self {
         let mut stack = Vec::new();
         // Push children of root in reverse order (so first child is processed first)
         if let Some(root) = tree.nodes.get(tree.root_index) {
-            for &child_idx in root.children.iter().rev() {
-                stack.push(child_idx);
-            }
+            let mut children: Vec<usize> = root.children.clone();
+            children.sort_by_key(|&idx| {
+                tree.nodes.get(idx).map(|n| n.order).unwrap_or(0)
+            });
+            for child_idx in children.into_iter().rev() {
+                stack.push(child_idx);
+            }
         }
         Self { tree, stack }
     }
 }

Apply similar sorting in the next() method when pushing branch children.

crates/notedeck_columns/src/timeline/publication.rs (1)

481-515: Consider reusing the shared filter builder to avoid drift.

This logic mirrors PublicationFetcher::build_filters. Sharing a single implementation reduces divergence risk between crates.

♻️ One way to reuse the shared helper
-use notedeck_publications::{
-    EventAddress, NodeType, PublicationTree, PublicationTreeNode,
-};
+use notedeck_publications::{
+    EventAddress, NodeType, PublicationTree, PublicationTreeNode, PublicationFetcher,
+};

 fn build_filters_for_addresses(addresses: &[EventAddress]) -> Vec<Filter> {
     if addresses.is_empty() {
         return vec![];
     }

-    // Group addresses by (kind, pubkey) since different sections may have different authors
-    let mut by_kind_and_author: HashMap<(u32, [u8; 32]), Vec<&str>> = HashMap::new();
-    for addr in addresses {
-        by_kind_and_author
-            .entry((addr.kind, addr.pubkey))
-            .or_default()
-            .push(&addr.dtag);
-    }
-
-    // Build a filter for each (kind, author) combination
-    let filters: Vec<Filter> = by_kind_and_author
-        .into_iter()
-        .map(|((kind, author), dtags)| {
-            debug!(
-                "Building filter: kind={}, author={}, dtags={:?}",
-                kind,
-                hex::encode(&author[..8]),
-                dtags
-            );
-            Filter::new()
-                .kinds([kind as u64])
-                .authors([&author])
-                .tags(dtags, 'd')
-                .build()
-        })
-        .collect();
-
-    debug!("Built {} filters total", filters.len());
-    filters
+    let refs: Vec<&EventAddress> = addresses.iter().collect();
+    let filters = PublicationFetcher::build_filters(&refs);
+    debug!("Built {} filters total", filters.len());
+    filters
 }
crates/notedeck_columns/src/timeline/timeline_units.rs (1)

141-165: Consider validating that the title value is non-empty.

The title tag validation checks for tag.count() >= 2 and tag.get_str(0) == Some("title"), but doesn't verify that the title value at index 1 is actually non-empty. A tag like ["title", ""] would pass validation but represents a malformed publication.

♻️ Suggested improvement
             // Require a title tag
             let has_title = payload.note.tags().iter().any(|tag| {
-                tag.count() >= 2 && tag.get_str(0) == Some("title")
+                tag.count() >= 2
+                    && tag.get_str(0) == Some("title")
+                    && tag.get_str(1).is_some_and(|t| !t.is_empty())
             });
crates/notedeck/src/app.rs (1)

84-84: Missing docstring for new field.

As per coding guidelines, ensure docstring coverage for any code added. The relay_info_cache field should have a doc comment explaining its purpose.

📝 Suggested documentation
     i18n: Localization,
+    /// Cache for relay information (NIP-11 documents) used to respect relay limits
+    /// when fetching publications and other data.
     relay_info_cache: RelayInfoCache,
crates/notedeck_columns/src/app.rs (1)

53-53: Missing docstring for the publications field.

Consider adding a doc comment similar to the threads field to explain the purpose of publications.

crates/notedeck_columns/src/route.rs (1)

591-593: Consider documenting the intended use case for routes_mut().

This method exposes internal mutable state directly, which is needed for in-place route modification (like updating PublicationSelection navigation state). Consider adding a doc comment explaining when this should be used vs. the existing navigation methods.

📝 Suggested documentation
+    /// Returns mutable access to the routes vector for in-place modifications.
+    /// Use sparingly - prefer navigation methods like `route_to()` when possible.
+    /// This is needed for updating internal state of existing routes (e.g., publication navigation).
     pub fn routes_mut(&mut self) -> &mut Vec<R> {
         &mut self.router_internal.routes
     }
crates/notedeck_columns/src/ui/timeline.rs (2)

772-772: Consider graceful handling for missing note key.

note.key().expect() will panic if the note lacks a key. While timeline notes from nostrdb queries should always have keys, a defensive approach would be safer.

♻️ Optional defensive handling
-    let note_key = note.key().expect("note should have key");
+    let Some(note_key) = note.key() else {
+        tracing::warn!("Publication card: note has no key");
+        return RenderEntryResponse::Unsuccessful;
+    };

867-871: Missing localization for section count text.

Per coding guidelines, user-facing strings should be wrapped with tr! or tr_plural! macros. The section count text is hardcoded in English.

♻️ Add localization support
-                let section_text = if section_count == 1 {
-                    "1 section".to_string()
-                } else {
-                    format!("{} sections", section_count)
-                };
+                let section_text = tr_plural!(
+                    note_context.i18n,
+                    "{count} section",
+                    "{count} sections",
+                    "section count in publication card",
+                    section_count,
+                    count = section_count
+                );

Based on coding guidelines, wrap user-facing strings with tr! or tr_plural! macros for localization.

Comment on lines +83 to +92
      - name: Collect packages
        run: |
          mkdir -p packages
          if [ "${{ matrix.arch }}" == "x86_64" ]; then
            cp target/release/generate-rpm/*.rpm packages/ || true
            cp target/release/debian/*.deb packages/ || true
          else
            cp target/${{ matrix.arch }}-unknown-linux-gnu/generate-rpm/*.rpm packages/ || true
            cp target/${{ matrix.arch }}-unknown-linux-gnu/debian/*.deb packages/ || true
          fi
⚠️ Potential issue | 🟠 Major

Don’t mask missing package outputs in a release.

Lines 87-91 use || true, which can publish a release without RPM/DEB assets if the build didn’t actually produce them. Prefer failing fast to avoid incomplete releases.

✅ Suggested change
-            cp target/release/generate-rpm/*.rpm packages/ || true
-            cp target/release/debian/*.deb packages/ || true
+            cp target/release/generate-rpm/*.rpm packages/
+            cp target/release/debian/*.deb packages/
           else
-            cp target/${{ matrix.arch }}-unknown-linux-gnu/generate-rpm/*.rpm packages/ || true
-            cp target/${{ matrix.arch }}-unknown-linux-gnu/debian/*.deb packages/ || true
+            cp target/${{ matrix.arch }}-unknown-linux-gnu/generate-rpm/*.rpm packages/
+            cp target/${{ matrix.arch }}-unknown-linux-gnu/debian/*.deb packages/
           fi
🤖 Prompt for AI Agents
In @.github/workflows/release.yml around lines 83 - 92, In the "Collect
packages" GitHub Actions step the cp commands for RPM/DEB use "|| true" which
masks missing artifacts and allows incomplete releases; update the step (the cp
lines inside the "Collect packages" block) to remove "|| true" and instead
explicitly check for matching files before copying (for example test -e
target/.../*.rpm || { echo "No RPMs found"; exit 1; } and similarly for .deb, or
use a safe glob check) so the job fails fast when no packages are produced
rather than silently continuing.
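A sketch of the fail-fast glob check described above (the `require_glob` helper name and the `/tmp/pkgdemo` path are illustrative, not part of the workflow):

```shell
#!/bin/sh
# Fail-fast glob check: succeeds only if the pattern matches at least one
# existing file, so a missing package aborts the step instead of being
# silently masked by `|| true`.
require_glob() {
    for f in $1; do            # unquoted so the shell expands the glob
        [ -e "$f" ] && return 0
    done
    echo "No files match: $1" >&2
    return 1
}

# Demo: one .rpm exists, no .deb does.
mkdir -p /tmp/pkgdemo && touch /tmp/pkgdemo/a.rpm
require_glob '/tmp/pkgdemo/*.rpm' && echo "rpm found"
require_glob '/tmp/pkgdemo/*.deb' || echo "deb missing, would fail the step"
```

If the glob does not expand, `$f` stays the literal pattern, `-e` fails, and the helper returns nonzero, which fails the workflow step.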

Comment on lines +193 to +205
    if: always() && needs.build-android.result == 'success'
    permissions:
      contents: write
    steps:
      - name: Download all artifacts
        uses: actions/download-artifact@v4
        with:
          path: artifacts
      - name: Display artifacts
        run: find artifacts -type f
      - name: Create Release
        uses: softprops/action-gh-release@v1
        with:
⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🌐 Web query:

softprops/action-gh-release latest version 2024 2025

💡 Result:

  • Latest in 2024: v2.0.4 — released March 12, 2024. [1]
  • Latest in 2025: v2.5.0 — released December 1, 2025. [2]

Sources:
[1] action-gh-release CHANGELOG (2024 entries).
[2] softprops/action-gh-release Releases (v2.5.0).


🌐 Web query:

softprops/action-gh-release runner requirements compatibility

💡 Result:

Short summary — softprops/action-gh-release runner requirements & compatibility

  • Supported runners: GitHub Actions virtual environments on Linux, Windows and macOS (e.g., ubuntu-latest, windows-latest, macos-latest). [1]
  • Self-hosted: action is cross-platform (no Docker-only limitation anymore), so it can run on any runner that provides the usual filesystem and network access (virtual or self-hosted). (README notes cross‑platform support). [1]
  • Windows specifics: use forward slashes (/) in file/glob paths (not backslashes). [1]
  • Token & permissions: action uses a GitHub token (defaults to ${{ github.token }}); job/workflow must grant contents: write. If you use discussion_category_name, also grant discussions: write. If you need the workflow to trigger on a release event, use a personal access token (secrets) because the default GITHUB_TOKEN won’t trigger other workflows. [2]

Sources:

  • softprops/action-gh-release README (platforms, Windows note, inputs). [1]
  • softprops/action-gh-release README (Permissions & PAT note). [2]

Update release action to latest version and gate on all builds.

Line 204 uses softprops/action-gh-release@v1, which is outdated; update to the latest v2.x (v2.5.0 as of December 2025). Line 193 uses always() and gates only on the Android build, so a release can publish even when the Linux/macOS/Windows builds fail. Gate on all four build jobs to ensure complete releases.

🔧 Suggested update
-    if: always() && needs.build-android.result == 'success'
+    if: needs.build-android.result == 'success' && needs.build-linux.result == 'success' && needs.build-macos.result == 'success' && needs.build-windows.result == 'success'
-      - name: Create Release
-        uses: softprops/action-gh-release@v1
+      - name: Create Release
+        uses: softprops/action-gh-release@v2.5.0
🧰 Tools
🪛 actionlint (1.7.10)

204-204: the runner of "softprops/action-gh-release@v1" action is too old to run on GitHub Actions. update the action's version to fix this issue

(action)

🤖 Prompt for AI Agents
In @.github/workflows/release.yml around lines 193 - 205, Update the release
job's conditional and action version: replace the current if condition (if:
always() && needs.build-android.result == 'success') with a gate that requires
all build jobs to succeed (e.g. if: needs.build-android.result == 'success' &&
needs.build-linux.result == 'success' && needs.build-macos.result == 'success'
&& needs.build-windows.result == 'success') and update the step using
softprops/action-gh-release@v1 to the latest v2.x release (e.g.
softprops/action-gh-release@v2.5.0) in the "Create Release" step so the release
only publishes when all builds pass and uses the current action version.

Comment on lines +51 to +76
    if: github.ref_name == 'master' || github.ref_name == 'ci' || startsWith(github.ref, 'refs/tags/')
    steps:
      - uses: actions/checkout@v4
      - uses: dtolnay/rust-toolchain@stable
      - name: Setup Java JDK
        uses: actions/setup-java@v4.5.0
        with:
          java-version: '17'
          distribution: 'temurin'
      - name: Setup Android SDK
        uses: android-actions/setup-android@v3
      - name: Add android rust target
        run: rustup target add aarch64-linux-android
      - name: Install Cargo NDK
        run: cargo install cargo-ndk
      - name: Rust cache
        uses: Swatinem/rust-cache@v2
      - name: Build JNI library
        run: make jni
      - name: Build APK
        run: make apk
      - name: Upload APK
        uses: actions/upload-artifact@v4
        with:
          name: notedeck-arm64.apk
          path: crates/notedeck_chrome/android/app/build/outputs/apk/debug/app-debug.apk
⚠️ Potential issue | 🟡 Minor

Remove the unreachable tag condition from the job's if clause.

The workflow only triggers on push: branches: [master, ci] (lines 4-7), so the condition startsWith(github.ref, 'refs/tags/') on line 51 never matches. Either add on.push.tags to the workflow trigger if tag builds are intended, or remove the tag check from the job condition. The APK output path is correct for the debug builds produced by gradlew build.

🤖 Prompt for AI Agents
In @.github/workflows/rust.yml around lines 51 - 76, The job-level if
conditional currently contains an unreachable check startsWith(github.ref,
'refs/tags/') because the workflow trigger only defines push branches (master,
ci); either remove that startsWith(...) clause from the job's if conditional so
it reads: if: github.ref_name == 'master' || github.ref_name == 'ci', or
alternatively add tag pushes to the workflow trigger (on.push.tags) so the
existing startsWith(github.ref, 'refs/tags/') condition becomes reachable;
update the if clause or the workflow trigger accordingly and keep the rest of
the job (steps like Build JNI library, Build APK, Upload APK) unchanged.

Comment on lines +224 to +262
            NoteAction::LoadMore(until) => {
                // Load more publications (older than `until` timestamp)
                tracing::info!("LoadMore action triggered with until={}", until);

                let kind = TimelineKind::Publications;
                if timeline_cache.get(&kind).is_some() {
                    // Create filter for older publications
                    let filters =
                        crate::timeline::kind::publications_load_more_filter(until);

                    // Query nostrdb for older publications
                    let limit = filters.first().and_then(|f| f.limit()).unwrap_or(500) as i32;
                    match ndb.query(txn, &filters, limit) {
                        Ok(results) => {
                            let note_refs: Vec<notedeck::NoteRef> = results
                                .into_iter()
                                .map(notedeck::NoteRef::from_query_result)
                                .collect();

                            tracing::info!(
                                "LoadMore: found {} older publications",
                                note_refs.len()
                            );

                            if !note_refs.is_empty() {
                                // Insert the notes into the timeline
                                let note_keys: Vec<NoteKey> =
                                    note_refs.iter().map(|nr| nr.key).collect();
                                timeline_res = Some(NotesOpenResult::Timeline(
                                    TimelineOpenResult::new_notes(note_keys, kind),
                                ));
                            }
                        }
                        Err(e) => {
                            error!("LoadMore: failed to query older publications: {:?}", e);
                        }
                    }
                }
            }
⚠️ Potential issue | 🟠 Major

LoadMore queries only local nostrdb without triggering relay subscriptions for older publications.

The implementation correctly queries the local database but relies entirely on the initial Publications subscription to have already fetched older events. If older publications aren't already cached locally, "Load More" will return empty results with no fallback to request them from relays.

Consider whether a remote fetch (via try_add_remote with the load-more filter) should accompany the local query, or whether the initial subscription is expected to be long-lived enough to backfill the necessary history.

🤖 Prompt for AI Agents
In `@crates/notedeck_columns/src/actionbar.rs` around lines 224 - 262, The
LoadMore handler (NoteAction::LoadMore) currently only queries the local nostrdb
via ndb.query using the filter built by
crate::timeline::kind::publications_load_more_filter(until); change it so that
when the local query returns no/newer results you also trigger a remote fetch:
call the existing relay-subscription helper (e.g. try_add_remote) with the same
filters/limit to request older publications from relays, and ensure you keep the
existing timeline_cache/timeline_res behavior so that when remote events arrive
they will be merged into the timeline (you may need to register the subscription
id or a callback to update TimelineOpenResult/NotesOpenResult when remote
results are delivered).
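A minimal sketch of that local-then-remote fallback; `Filter`, `load_more`, and the two closures are hypothetical stand-ins for `ndb.query` and `try_add_remote`:

```rust
// Stand-in for the real nostrdb filter; only the `until` bound is modeled.
struct Filter {
    until: u64,
}

/// Query the local store first; only when nothing older is cached, ask relays.
fn load_more<L, R>(local_query: L, remote_fetch: R, until: u64) -> Vec<u64>
where
    L: Fn(&Filter) -> Vec<u64>,
    R: Fn(&Filter),
{
    let filter = Filter { until };
    let cached = local_query(&filter);
    if cached.is_empty() {
        // No older publications cached locally: request them from relays;
        // results are merged into the timeline when the subscription delivers.
        remote_fetch(&filter);
    }
    cached
}

fn main() {
    use std::cell::Cell;
    let remote_called = Cell::new(false);
    let got = load_more(|_| Vec::new(), |_| remote_called.set(true), 100);
    assert!(got.is_empty() && remote_called.get()); // empty cache -> remote fetch

    let got = load_more(|f| vec![f.until - 1], |_| unreachable!(), 100);
    assert_eq!(got, vec![99]); // cache hit -> no remote fetch
    println!("ok");
}
```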

Comment on lines +54 to +61
    pub fn needs_fetch(&self, batch_size: usize) -> Vec<EventAddress> {
        self.tree
            .pending_addresses()
            .into_iter()
            .filter(|addr| !self.pending_fetch.contains(addr))
            .take(batch_size)
            .cloned()
            .collect()
⚠️ Potential issue | 🟠 Major

Add a SubKind variant for publication subscriptions and handle EOSE to clear unresolved addresses.

needs_fetch() excludes pending_fetch, but addresses only leave that set when a note resolves. If a relay doesn't return a note (or the subscription ends), those addresses remain "in-flight" forever and won't be retried. Add SubKind::Publication to track publication subscriptions, handle EOSE in handle_eose() to clear unresolved addresses from pending_fetch, and implement retry logic for addresses that don't resolve within a timeout period.

🤖 Prompt for AI Agents
In `@crates/notedeck_columns/src/timeline/publication.rs` around lines 54 - 61,
needs_fetch currently prevents re-fetch of addresses still in pending_fetch but
those are only removed when a note resolves, so publication subscriptions can
leave addresses "in-flight" forever; add a new SubKind::Publication variant
wherever SubKind is defined and ensure subscriptions created for publications
mark addresses into pending_fetch, then update handle_eose(&mut self, sub_id:
&SubscriptionId) to detect EOSE for SubKind::Publication and remove any
addresses that remain unresolved from pending_fetch (and optionally record them
as unresolved for retry), and add timeout-based retry logic (e.g., store a
timestamp per address or a retry queue) that will move addresses back out of
pending_fetch after the timeout to allow needs_fetch(&self, batch_size: usize)
to include them again; reference symbols: needs_fetch, pending_fetch,
pending_addresses, handle_eose, SubKind::Publication, tree, and SubscriptionId.
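A self-contained sketch of the timeout-based retry described above; `PendingFetches` and the string addresses are hypothetical stand-ins for the real `pending_fetch` set of `EventAddress` values:

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

/// Tracks in-flight fetches with a per-address timestamp. An address counts as
/// in-flight only while its entry is younger than `timeout`, so unresolved
/// addresses automatically become fetchable again instead of sticking forever.
struct PendingFetches {
    inflight: HashMap<String, Instant>,
    timeout: Duration,
}

impl PendingFetches {
    fn new(timeout: Duration) -> Self {
        Self { inflight: HashMap::new(), timeout }
    }

    /// Record that a fetch was issued for this address.
    fn mark_fetching(&mut self, addr: &str) {
        self.inflight.insert(addr.to_string(), Instant::now());
    }

    /// True if `needs_fetch()` should still skip this address.
    fn is_inflight(&self, addr: &str) -> bool {
        self.inflight
            .get(addr)
            .is_some_and(|t| t.elapsed() < self.timeout)
    }

    /// Called when a note resolves, or on EOSE for the publication subscription.
    fn clear(&mut self, addr: &str) {
        self.inflight.remove(addr);
    }
}

fn main() {
    let mut pending = PendingFetches::new(Duration::from_secs(60));
    pending.mark_fetching("30041:abcd:chapter-1");
    assert!(pending.is_inflight("30041:abcd:chapter-1"));
    pending.clear("30041:abcd:chapter-1"); // e.g. EOSE arrived with no match
    assert!(!pending.is_inflight("30041:abcd:chapter-1"));
    println!("ok");
}
```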

Comment on lines +763 to +767
                    let display = if title.len() > 20 {
                        format!("{}...", &title[..17])
                    } else {
                        title.to_string()
                    };
⚠️ Potential issue | 🟡 Minor

Same UTF-8 slicing issue in index breadcrumbs.

The same multi-byte character panic risk exists here with &title[..17].

🐛 Use char-aware truncation
                     let display = if title.len() > 20 {
-                        format!("{}...", &title[..17])
+                        format!("{}...", title.chars().take(17).collect::<String>())
                     } else {
                         title.to_string()
                     };
🤖 Prompt for AI Agents
In `@crates/notedeck_columns/src/ui/publication.rs` around lines 763 - 767, The
truncation uses byte-slicing (&title[..17]) which can panic on multi-byte UTF-8;
update the logic around the display variable to operate on characters/graphemes
instead: check length with title.chars().count() (or use
unicode_segmentation::UnicodeSegmentation and graphemes(true).count()) and build
the truncated string with a char/grapheme-aware operation (e.g., take(17) and
collect into a String) when creating display instead of byte-slicing the title.
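A standalone illustration of char-aware truncation (the `abbreviate` helper is hypothetical; `chars()` avoids the panic, though it can still split grapheme clusters such as combining accents or emoji ZWJ sequences):

```rust
/// Truncate a title to at most `max_chars` characters, appending "..." when
/// shortened. Byte slicing like `&title[..17]` panics when byte 17 falls
/// inside a multi-byte UTF-8 sequence; taking chars never does.
fn abbreviate(title: &str, max_chars: usize) -> String {
    if title.chars().count() > max_chars {
        let head: String = title.chars().take(max_chars.saturating_sub(3)).collect();
        format!("{head}...")
    } else {
        title.to_string()
    }
}

fn main() {
    assert_eq!(abbreviate("short", 20), "short");
    // 25 two-byte chars: byte slicing at index 17 would panic, this does not.
    assert_eq!(abbreviate(&"é".repeat(25), 20), format!("{}...", "é".repeat(17)));
    println!("ok");
}
```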

Comment on lines +83 to +88
                                if kind == KIND_PUBLICATION_INDEX as u64 {
                                    println!("📚 Publication Index Found:");
                                    println!("   Title: {}", title.as_deref().unwrap_or("(untitled)"));
                                    println!("   D-tag: {}", dtag.as_deref().unwrap_or("(none)"));
                                    println!("   Author: {}...", &pubkey[..16]);
                                    println!("   Sections: {} referenced", a_tags.len());
⚠️ Potential issue | 🟡 Minor

Avoid slicing pubkey without a length check.

&pubkey[..16] can panic if the pubkey is shorter than 16 bytes. Since pubkey defaults to an empty string when JSON parsing fails, the slice operation will panic on malformed events. Use get(..16) for safe slicing:

-                                    println!("   Author: {}...", &pubkey[..16]);
+                                    let author_preview = pubkey.get(..16).unwrap_or(pubkey);
+                                    println!("   Author: {}...", author_preview);
🤖 Prompt for AI Agents
In `@crates/notedeck_publications/examples/fetch_publication.rs` around lines 83 -
88, The code prints a truncated pubkey with &pubkey[..16], which can panic if
pubkey is shorter (e.g., when JSON parsing defaulted to empty); update the print
to safely take at most 16 chars using pubkey.get(..16) (or equivalent safe
slicing) and fall back to the full pubkey or a placeholder, while keeping the
surrounding context in the block where kind == KIND_PUBLICATION_INDEX and the
variables title, dtag, pubkey, and a_tags are used.
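A minimal sketch of the safe-slicing fix in isolation (`author_preview` is a hypothetical name). `str::get(..16)` returns `None` both when the string is shorter than 16 bytes and when byte 16 is not a char boundary, so the fallback covers all panic cases:

```rust
/// Safely preview the first 16 bytes of a pubkey string, falling back to
/// the whole string when it is too short (or when the cut would land
/// mid-character). Hex pubkeys are ASCII, so the boundary case is only
/// a concern for malformed input.
fn author_preview(pubkey: &str) -> &str {
    pubkey.get(..16).unwrap_or(pubkey)
}

fn main() {
    assert_eq!(author_preview("deadbeefdeadbeefcafebabe"), "deadbeefdeadbeef");
    // An empty pubkey (the JSON-parse-failure default) no longer panics.
    assert_eq!(author_preview(""), "");
}
```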

Comment on lines +125 to +130
#[test]
fn test_empty_inputs() {
assert_eq!(title_abbreviation(""), "");
assert_eq!(slugify(""), "");
assert_eq!(generate_dtag("", "test"), "-test");
}

⚠️ Potential issue | 🟡 Minor

Edge case: Empty publication title produces d-tag with leading hyphen.

The test confirms generate_dtag("", "test") returns "-test". This leading hyphen may cause issues:

  • extract_abbreviation("-test") returns Some("") (empty string)
  • D-tags starting with hyphen may look unusual in URLs/displays

Consider handling empty abbreviations explicitly.

💡 Potential fix
 pub fn generate_dtag(publication_title: &str, section_title: &str) -> String {
     let abbrev = title_abbreviation(publication_title);
     let slug = slugify(section_title);
-    format!("{}-{}", abbrev, slug)
+    if abbrev.is_empty() {
+        slug
+    } else {
+        format!("{}-{}", abbrev, slug)
+    }
 }
🤖 Prompt for AI Agents
In `@crates/notedeck_publications/src/dtag.rs` around lines 125 - 130,
generate_dtag currently concatenates title_abbreviation and slug with a hyphen
which yields a leading hyphen for empty titles; change generate_dtag to call
title_abbreviation (and/or slugify) and only prepend the "-" when the
abbreviation is non-empty (i.e., if title_abbreviation(...) != "" then format
"{abbr}-{slug}" else return slug), and adjust tests to expect "test" for
generate_dtag("", "test"); also ensure extract_abbreviation still behaves
correctly with the new format (it should return None or Some("") according to
your semantics) by updating its expectations if needed.
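A self-contained sketch of the proposed fix. The `title_abbreviation` and `slugify` bodies here are simplified stand-ins (the crate's real implementations may differ); only the empty-abbreviation guard in `generate_dtag` is the point:

```rust
// Stand-in: first letter of each word, lowercased.
fn title_abbreviation(title: &str) -> String {
    title
        .split_whitespace()
        .filter_map(|w| w.chars().next())
        .collect::<String>()
        .to_lowercase()
}

// Stand-in: lowercase, non-alphanumeric chars become hyphens.
fn slugify(s: &str) -> String {
    s.to_lowercase()
        .chars()
        .map(|c| if c.is_alphanumeric() { c } else { '-' })
        .collect()
}

/// Only prepend the abbreviation (and its hyphen) when it is non-empty,
/// so an empty publication title no longer produces a leading hyphen.
fn generate_dtag(publication_title: &str, section_title: &str) -> String {
    let abbrev = title_abbreviation(publication_title);
    let slug = slugify(section_title);
    if abbrev.is_empty() {
        slug
    } else {
        format!("{}-{}", abbrev, slug)
    }
}

fn main() {
    assert_eq!(generate_dtag("", "test"), "test"); // was "-test"
    assert_eq!(generate_dtag("My Great Book", "Chapter One"), "mgb-chapter-one");
}
```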

Comment on lines +174 to +179
/// Reset for a new fetch operation
pub fn reset(&mut self) {
self.state = FetchState::Idle;
self.subscription_id = None;
self.events_received = 0;
}

⚠️ Potential issue | 🟡 Minor

Reset should clear batches_completed for a new fetch.

reset() claims to start a new fetch operation, but the batch counter carries over, which can skew logs/metrics.

🛠️ Proposed fix
 pub fn reset(&mut self) {
     self.state = FetchState::Idle;
     self.subscription_id = None;
     self.events_received = 0;
+    self.batches_completed = 0;
 }
🤖 Prompt for AI Agents
In `@crates/notedeck_publications/src/fetcher.rs` around lines 174 - 179, The
reset() method currently resets state, subscription_id, and events_received but
fails to clear the batch counter; update Fetcher::reset (the reset method on the
struct that has the batches_completed field) to also set self.batches_completed
= 0 so batches_completed does not carry over into a new fetch operation.
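One way to make this class of bug structurally impossible is to derive `Default` and reassign wholesale, so newly added fields cannot be forgotten. The struct below mirrors only the fields named in this review, not the real `Fetcher`:

```rust
#[derive(Default, Debug, PartialEq)]
enum FetchState {
    #[default]
    Idle,
    Active,
}

#[derive(Default)]
struct Fetcher {
    state: FetchState,
    subscription_id: Option<String>,
    events_received: u32,
    batches_completed: u32,
}

impl Fetcher {
    /// Reset for a new fetch operation. Reassigning the default value
    /// clears every field, including any counters added later.
    fn reset(&mut self) {
        *self = Self::default();
    }
}

fn main() {
    let mut f = Fetcher {
        state: FetchState::Active,
        subscription_id: Some("sub".into()),
        events_received: 7,
        batches_completed: 3,
    };
    f.reset();
    assert_eq!(f.state, FetchState::Idle);
    assert_eq!(f.batches_completed, 0);
}
```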

Comment on lines +93 to +98
// Limit body size and collect all bytes
let limited_body = Limited::new(res.into_body(), MAX_BODY_BYTES);
let collected = BodyExt::collect(limited_body)
.await
.map_err(|e| HyperHttpError::Hyper(e))?;
let bytes = collected.to_bytes().to_vec();

⚠️ Potential issue | 🔴 Critical

Compilation error: LengthLimitError needs boxing.

The Limited::new wrapper returns a LengthLimitError when the limit is exceeded. The error needs to be boxed to match the HyperHttpError::Hyper(Box<...>) variant.

🐛 Proposed fix
     // Limit body size and collect all bytes
     let limited_body = Limited::new(res.into_body(), MAX_BODY_BYTES);
     let collected = BodyExt::collect(limited_body)
         .await
-        .map_err(|e| HyperHttpError::Hyper(e))?;
+        .map_err(|e| HyperHttpError::Hyper(Box::new(e)))?;
     let bytes = collected.to_bytes().to_vec();
🤖 Prompt for AI Agents
In `@crates/notedeck/src/media/network.rs` around lines 93 - 98, The error mapping
for collecting the limited body must box the LengthLimitError to match the
HyperHttpError::Hyper variant; update the map_err in the Limited::new /
BodyExt::collect call so it converts the error into
HyperHttpError::Hyper(Box::new(e)) (i.e., use Box::new(e) or equivalent) when
mapping the collection error coming from BodyExt::collect, keeping the rest of
the flow (limited_body, collected, bytes) unchanged.
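A compact illustration of why the boxing is required, with the error enum and error type reduced to stand-ins (the real `HyperHttpError` and hyper's `LengthLimitError` have more to them). A variant holding `Box<dyn Error + Send + Sync>` cannot accept a concrete error value directly:

```rust
use std::error::Error;
use std::fmt;

// Stand-in mirroring the review: the variant stores a boxed trait object.
#[derive(Debug)]
enum HyperHttpError {
    Hyper(Box<dyn Error + Send + Sync>),
}

#[derive(Debug)]
struct LengthLimitError;

impl fmt::Display for LengthLimitError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "length limit exceeded")
    }
}
impl Error for LengthLimitError {}

fn collect_body() -> Result<Vec<u8>, HyperHttpError> {
    let result: Result<Vec<u8>, LengthLimitError> = Err(LengthLimitError);
    // `HyperHttpError::Hyper(e)` would not compile: `e` is a concrete
    // LengthLimitError, not a Box<dyn Error + Send + Sync>.
    result.map_err(|e| HyperHttpError::Hyper(Box::new(e)))
}

fn main() {
    assert!(matches!(collect_body(), Err(HyperHttpError::Hyper(_))));
}
```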

@alltheseas
Contributor

@limina1 can you please explain in human speak what this means?

What is the use case?

limina1 and others added 5 commits January 18, 2026 23:43
Change ReaderMode::Index to ReaderMode::Outline for better mental model.
Users will see "Outline" instead of "Index" in the Publications view toggle
and when switching modes within a publication.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Instead of rendering full content for leaf nodes in the Outline view,
show a truncated ~100 character preview with ellipsis. This makes it
easier to navigate through the outline without being overwhelmed by
full section content.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
On narrow screens (mobile devices), publications now default to Outline
view for easier navigation through hierarchical content. Desktop users
continue to get their preference from the Publications feed toggle.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add ResolutionStrategy enum and PublicationConfig struct to control how
pending nodes are resolved:

- Incremental (default): Resolve N nodes per frame to prevent UI freezing
- AllAtOnce: Resolve everything immediately (may freeze on large books)
- LazyOnDemand: Only resolve when visible (preparation for visibility tracking)

Key changes:
- poll_updates() now does incremental cache resolution each frame
- Config controls max_nodes_per_frame (default: 10)
- open() respects the strategy when pre-resolving from cache

This prevents the UI from freezing when opening complex books with many
cached chapters.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
…cking

Add LazyOnDemand resolution strategy that only resolves nodes that are
visible on screen (plus a prefetch buffer for smooth scrolling).

Key changes:
- Add visible_nodes tracking to PublicationTreeState
- Add set_visible_nodes() and visible_pending_addresses() methods
- Track rendered nodes in ReaderState during UI rendering
- Pass visibility info from UI to publication state after each frame
- poll_updates() respects visibility when strategy is LazyOnDemand

This enables the most performant mode for large publications where only
visible content is fetched and resolved.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
limina1 and others added 6 commits January 19, 2026 00:48
Replace preview snippets with expandable cards for leaf nodes in Outline
view. Clicking a leaf (either the arrow or title) toggles between:
- Collapsed: shows only the title with a ▶ indicator
- Expanded: shows full content with a ▼ indicator

This provides better navigation - users can see the full structure at a
glance and expand only the sections they want to read.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add "Loading:" toggle in the Publications feed section with three options:
- Incremental: Load 10 sections per frame (default, smooth)
- Lazy: Only load visible sections (best for huge publications)
- All at Once: Load everything immediately (may freeze)

The selected strategy is applied when opening/viewing publications.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Change default resolution strategy from Incremental to LazyOnDemand
  for best performance on large publications
- Add "▶ Expand All" / "▼ Collapse All" button in Outline view
  that toggles all leaf nodes (30041 sections) at once

The button only appears when viewing a node that has leaf children,
and toggles based on current state (if all are expanded, shows collapse).

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Remove Incremental and AllAtOnce resolution strategies - lazy on-demand
is the only mode now (best performance for all publication sizes).

Also reorder the mode toggle buttons to show Outline first, then Reader,
since Outline is the preferred navigation mode.

Removed:
- ResolutionStrategy enum (only had one variant)
- try_resolve_existing methods (not needed for lazy)
- Loading strategy toggle in Publications feed
- DEFAULT_MAX_NODES_PER_FRAME constant

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Change the default ReaderMode from Continuous to Outline so publications
open in Outline view by default. This provides better navigation for
hierarchical content.

Also simplified the mode selection logic since we no longer need the
special narrow-screen case (Outline is now the default for all screens).

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Remove the "Open as: Outline | Reader" toggle from Publications feed
- Publications now always open in Outline view by default
- Mode switching is done via buttons in the publication view header
- Add breadcrumbs to paginated view showing the path to current section

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
@limina1
Author

limina1 commented Jan 19, 2026

This pull request enables users to read books from relays, and offers different ways to view the content.
https://v.nostr.build/Q3SaJPJTh5pztyx9.mp4

  • First, it loads the index events, which act like book covers.
  • When a book is clicked, it opens in one of two views:
      • Outline view: shows each chapter and nested chapters
      • Continuous view: the book displays as you scroll
  • There is also a paginated view, which displays one chapter at a time, navigable by arrows.

You can also navigate by the table of contents, which takes the user directly to the chapter in the paginated view.

notedeckpub_compatible.mp4

@alltheseas
Contributor

alltheseas commented Jan 19, 2026

Suggestions:

  1. Extract into a notedeck app - see "Dave" app as example
  2. Try using shadcn-egui elements to beautify a bit: AGENTS.md: add shadcn-egui UI element reference #1248
  3. Verify that you followed instructions at bottom of notedeck's agents.md file: https://github.com/damus-io/notedeck/blob/master/AGENTS.md

@alltheseas
Contributor

@limina1 I'm pretty excited to run this on my Android tablet, so that I can read books and create highlights that I can publish to nostr

limina1 and others added 3 commits January 20, 2026 21:53
Create a new notedeck_reader crate that provides a dedicated reader
application for NKBIP-01 publications, following the Dave app pattern.

New crate structure:
- lib.rs: ReaderApp struct with App trait implementation
- nav.rs: PublicationSelection for navigation state
- state.rs: PublicationTreeState and Publications manager
- ui/reader.rs: Full publication viewer with Outline/Continuous/Paginated modes

Integration:
- Add NotedeckApp::Reader variant to Chrome (feature-gated)
- Add AppAction::OpenPublication and SwitchToColumns for cross-app navigation
- Route kind 30040 clicks from Columns to Reader app
- Add close button in Reader to return to Columns

Also fixes NIP-11 fetching to skip non-websocket URLs like "multicast".

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add #[profiling::function] attributes to key render functions in
notedeck_reader for visibility in the puffin profiler:

- ui()
- render_header()
- render_content()
- render_publication()
- render_continuous()
- render_paginated()
- render_outline_view()
- render_section_card()
- render_toc_overlay()

These annotations allow identifying performance bottlenecks when
viewing large publications with many sections.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Flatten deeply nested conditionals in reader.rs using early returns
and let-else patterns for improved readability:

- render_header: Use Option combinators for parent navigation lookup
- render_outline_child_card: Flatten leaf content rendering with
  early returns for each error case
- render_section_card: Same pattern for section content

Before (4+ levels of nesting):
```rust
if !is_branch && is_expanded {
    if let Some(note_key) = note_key {
        if let Ok(note) = self.ndb.get_note_by_key(txn, note_key) {
            if !content.is_empty() {
                // actual logic
            }
        }
    }
}
```

After (flat with early returns):
```rust
if is_branch || !is_expanded { return; }
let Some(note_key) = note_key else { return; };
let Ok(note) = self.ndb.get_note_by_key(txn, note_key) else { return; };
if content.is_empty() { return; }
// actual logic
```

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
@jb55
Contributor

jb55 commented Jan 21, 2026

this would be better as a separate app?

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 10

🤖 Fix all issues with AI agents
In `@crates/notedeck_chrome/src/chrome.rs`:
- Around line 499-502: The call to
columns.decks_cache.active_columns_mut(ctx.i18n, ctx.accounts).unwrap() can
panic; replace the unwrap with fallible handling (use the ? operator if the
surrounding function returns a compatible Result, or a match/let-else that
returns or propagates an appropriate error) so that active_columns_mut's
None/Err is handled consistently with the surrounding error style; locate the
call to active_columns_mut in the function where cols is assigned and change it
to propagate or return an error (or handle the None case) instead of unwrapping.
- Around line 616-621: The inline comment incorrectly claims we handle
app_action for consistency; update the comment near the call to
result.process_router_action(&mut col.router, &mut col.sheet_router) to state
that process_router_action on NoteActionResult only acts on router_action and
intentionally ignores app_action in this path (or remove the "we handle it for
consistency" phrase), so readers understand app_action is not processed here.

In `@crates/notedeck_columns/src/timeline/publication.rs`:
- Around line 384-458: In poll_updates, avoid calling
state.visible_pending_addresses() twice; compute it once into a local (e.g., let
visible_addrs = state.visible_pending_addresses()) and reuse that variable for
both the lazy-resolution loop and for building the pending Vec, replacing the
second call to state.visible_pending_addresses() used to create visible/pending;
ensure you use that same visible_addrs (or a cloned/borrowed copy if needed)
when filtering against state.pending_fetch and when computing batch_size-limited
pending addresses.

In `@crates/notedeck_columns/src/timeline/timeline_units.rs`:
- Around line 141-167: The 30040 handling in timeline_units.rs currently rejects
events missing a "title" tag but other code (PublicationTree::from_root_note and
UI) treat title as optional; remove the strict title requirement by deleting the
has_title check (the payload.note.tags() .any(...) and the subsequent if
!has_title return None) so the 30040 branch returns
Some(NoteUnitFragmentResponse { ... }) even when the "title" tag is absent; keep
or adjust the existing content check as desired to remain consistent with other
consumers and update any affected tests or callers accordingly.

In `@crates/notedeck_columns/src/ui/timeline.rs`:
- Around line 432-441: The "Load More" label is hardcoded; update the button
text to use the localization macro by replacing the literal in the RichText
creation used for the load_more button (see load_more_text in the if
enable_load_more && !self.tab.units.is_empty() / if let Some(oldest) =
self.tab.units.oldest() block). Use tr! (e.g., pass tr!("Load More") to
RichText::new) so the user-facing string is localized while keeping the rest of
the logic (button click ->
NoteAction::LoadMore(oldest.created_at.saturating_sub(1))) unchanged.
- Around line 747-831: Replace hard-coded user-facing strings in the publication
card with localization macros: use tr! for single strings and tr_plural! for
pluralizable text. Specifically, change the default title ("Untitled
Publication") to tr!("Untitled Publication") where title.unwrap_or is used, wrap
the "by" and "published by" labels (in the publication_author and index author
horizontal blocks) with tr!, and replace the section count logic that builds "1
section"/"N sections" to use tr_plural!(section_count, "1 section", "{}
sections") (or the project's tr_plural! signature) instead of manual format!.
Update any RichText::new(...) calls that currently contain those literals to
pass the tr!/tr_plural! results so UI strings use the app's i18n pipeline.

In `@crates/notedeck_reader/src/state.rs`:
- Around line 541-544: The new_sub_id() function in this module generates UUIDs
directly, which diverges from the established subscription ID generator used
elsewhere (subscriptions::new_sub_id()); replace the local implementation with a
single canonical generator call: remove the local
uuid::Uuid::new_v4().to_string() implementation and call the shared
subscriptions::new_sub_id() (or extract the shared generator into a common
utility and call that) so all subscription IDs share the same format and
behavior across new_sub_id().
- Around line 1-11: This module duplicates logic from the columns crate; extract
the shared publication state logic (types and functions around PublicationTree,
PublicationTreeNode management, sub id generation, and related helpers) into the
notedeck_publications crate or a new common module and replace the duplicated
code in this file by importing and using that shared API; specifically, refactor
new_sub_id generation to use the unified subscriptions::new_sub_id() helper
instead of calling uuid::Uuid::new_v4() directly, move any duplicated helper
functions into the shared module, and update usages in functions and structs
that reference PublicationTree, PublicationTreeNode, and sub id creation to the
consolidated implementations.

In `@crates/notedeck_reader/src/ui/reader.rs`:
- Around line 574-582: The UI fetch status is using sections.len() for both
numerator and denominator which always shows identical values; inside the
conditional that renders the spinner (the block guarded by is_complete and the
closure passed to ui.horizontal), change the label to display meaningful
progress by using the count of resolved/loaded sections as the numerator and the
total expected sections as the denominator (e.g., use a resolved_sections_count
or loaded_count variable instead of the left-hand sections.len(), or if you only
have the current loaded count, switch to a single-value message like
"Fetching... ({} loaded)" using sections.len()), updating the format call that
currently references sections.len() twice and keeping the surrounding
ui.spinner()/ui.label code in reader.rs.

In `@crates/notedeck/src/media/network.rs`:
- Around line 19-22: The use of
HttpsConnectorBuilder::with_native_roots().expect(...) can panic on platforms
without OS trust stores; replace the expect call with proper error propagation
similar to the pattern in relay_info.rs by mapping the error (e.g., with
.map_err(...) or ?) and returning a Result from the surrounding function instead
of unwrapping; update the surrounding function signature to return Result if
needed and ensure the builder variable is constructed via
HttpsConnectorBuilder::new().with_native_roots().map_err(|e| /* convert to
appropriate error type */)? so the failure is propagated rather than panicking.
♻️ Duplicate comments (10)
crates/notedeck/src/media/network.rs (1)

91-96: Box LengthLimitError before wrapping in HyperHttpError.

HyperHttpError::Hyper expects Box<dyn Error + Send + Sync>, so this currently won’t type‑check. This looks like the same boxing issue flagged previously.

🐛 Suggested fix
-    let collected = BodyExt::collect(limited_body)
-        .await
-        .map_err(|e| HyperHttpError::Hyper(e))?;
+    let collected = BodyExt::collect(limited_body)
+        .await
+        .map_err(|e| HyperHttpError::Hyper(Box::new(e)))?;
crates/notedeck_columns/src/actionbar.rs (1)

243-276: LoadMore only queries local cache; no relay backfill.

If older publications aren’t already cached, the button yields empty results. Consider triggering a remote fetch/subscription with the same filter to backfill older events.
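A sketch of what the suggested relay backfill could look like: a one-shot REQ with the same kind filter plus an `until` cutoff just below the oldest cached event, per the NIP-01 filter format. The function and subscription-id naming are illustrative, not from the codebase:

```rust
/// Build a NIP-01 REQ message asking relays for publication index events
/// older than the oldest one already cached. `kinds`, `until`, and
/// `limit` are standard NIP-01 filter fields.
fn backfill_req(sub_id: &str, oldest_created_at: u64, limit: u32) -> String {
    format!(
        r#"["REQ","{}",{{"kinds":[30040],"until":{},"limit":{}}}]"#,
        sub_id,
        // Subtract one so we don't re-fetch the oldest cached event.
        oldest_created_at.saturating_sub(1),
        limit
    )
}

fn main() {
    let req = backfill_req("pub-load-more", 1_700_000_000, 50);
    assert!(req.contains(r#""until":1699999999"#));
}
```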

crates/notedeck_publications/examples/fetch_publication.rs (1)

90-90: Avoid slicing pubkey without a length check.

&pubkey[..16] can panic if pubkey is shorter than 16 characters. Since pubkey defaults to an empty string when the JSON field is missing, this will panic on malformed events.

🐛 Proposed fix
-                                    println!("   Author: {}...", &pubkey[..16]);
+                                    let author_preview = pubkey.get(..16).unwrap_or(pubkey);
+                                    println!("   Author: {}...", author_preview);
crates/notedeck_publications/src/fetcher.rs (1)

177-182: Reset should clear batches_completed for a new fetch.

reset() claims to start a new fetch operation, but the batch counter carries over, which can skew logs/metrics.

🛠️ Proposed fix
 pub fn reset(&mut self) {
     self.state = FetchState::Idle;
     self.subscription_id = None;
     self.events_received = 0;
+    self.batches_completed = 0;
 }
crates/notedeck_columns/src/ui/publication.rs (3)

406-411: Potential panic from string slicing on multi-byte characters.

&crumb_title[..12] can panic if the title contains multi-byte UTF-8 characters (e.g., emoji, non-ASCII text) and index 12 falls in the middle of a character.

🐛 Use char-aware truncation
             // Truncate long titles
-            let display_title = if crumb_title.len() > 15 {
-                format!("{}…", &crumb_title[..12])
+            let display_title = if crumb_title.chars().count() > 15 {
+                format!("{}…", crumb_title.chars().take(12).collect::<String>())
             } else {
                 crumb_title
             };

845-849: Same UTF-8 slicing issue in outline breadcrumbs.

The same multi-byte character panic risk exists here with &title[..17].

🐛 Use char-aware truncation
                     let display = if title.len() > 20 {
-                        format!("{}...", &title[..17])
+                        format!("{}...", title.chars().take(17).collect::<String>())
                     } else {
                         title.to_string()
                     };

560-570: Loading progress shows same value twice.

The progress text format!("Fetching... ({}/{})", sections.len(), sections.len()) displays the same value for both placeholders. This should show resolved sections vs total sections.

🐛 Fix progress display

The sections variable here is already the resolved sections, so you'd need access to the total count. Consider passing it through from the section_data tuple or displaying a simpler message:

-                ui.label(format!("Fetching... ({}/{})", sections.len(), sections.len()));
+                ui.label("Fetching more sections...");

Or capture total count earlier if you want to show progress.

crates/notedeck_reader/src/state.rs (2)

616-628: get_tag_value helper - same duplication issue.

This is the third instance of this helper function. When consolidating, move this to notedeck_publications as a public utility.


380-448: Same inefficiency: visible_pending_addresses() called twice.

This has the same issue as the columns version - visible_pending_addresses() is called at lines 396 and 412. Apply the same fix when consolidating.

Additionally, per coding guidelines, add #[profiling::function] to this hot path.

crates/notedeck_columns/src/timeline/publication.rs (1)

141-148: Addresses remain in pending_fetch indefinitely if relay doesn't respond.

needs_fetch() excludes addresses in pending_fetch, but these are only removed when a note is found in process_notes(). If a relay never returns a note (timeout, missing data, EOSE without result), addresses stay "in-flight" forever and won't be retried.

Consider implementing:

  1. A timeout mechanism to expire stale pending fetches
  2. EOSE handling to clear unresolved addresses from pending_fetch
  3. A retry queue with exponential backoff for failed fetches
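A sketch of option 1, the timeout mechanism: record when each address went in-flight and periodically expire stale entries so they become eligible for re-fetch. `Address` stands in for the crate's `EventAddress`, and all names here are illustrative:

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

type Address = String; // stand-in for EventAddress

struct PendingFetches {
    in_flight: HashMap<Address, Instant>,
    timeout: Duration,
}

impl PendingFetches {
    fn new(timeout: Duration) -> Self {
        Self { in_flight: HashMap::new(), timeout }
    }

    fn mark_sent(&mut self, addr: Address) {
        self.in_flight.insert(addr, Instant::now());
    }

    fn mark_resolved(&mut self, addr: &Address) {
        self.in_flight.remove(addr);
    }

    /// Remove and return entries older than the timeout; the caller can
    /// re-queue them (optionally with backoff, per suggestion 3).
    fn expire_stale(&mut self) -> Vec<Address> {
        let now = Instant::now();
        let stale: Vec<Address> = self
            .in_flight
            .iter()
            .filter(|(_, sent)| now.duration_since(**sent) > self.timeout)
            .map(|(a, _)| a.clone())
            .collect();
        for a in &stale {
            self.in_flight.remove(a);
        }
        stale
    }

    fn is_pending(&self, addr: &Address) -> bool {
        self.in_flight.contains_key(addr)
    }
}

fn main() {
    let mut p = PendingFetches::new(Duration::from_millis(1));
    p.mark_sent("30040:abc:book".to_string());
    std::thread::sleep(Duration::from_millis(10));
    // After the timeout the address is no longer "in-flight forever".
    assert_eq!(p.expire_stale(), vec!["30040:abc:book".to_string()]);
    assert!(!p.is_pending(&"30040:abc:book".to_string()));
}
```

Calling `expire_stale()` from `poll_updates()` each frame would keep the bookkeeping in one place.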
🧹 Nitpick comments (18)
crates/notedeck_columns/src/actionbar.rs (1)

39-48: Add a doc comment for the new public helper.

process_router_action is public but undocumented. Please add a brief doc comment. As per coding guidelines, ensure docstring coverage for new public APIs.

✏️ Suggested doc comment
 impl NoteActionResult {
+    /// Execute the router action (if any) against the stack and sheet routers.
     pub fn process_router_action(
         self,
         router: &mut crate::route::ColumnsRouter<crate::route::Route>,
         sheet_router: &mut crate::route::SingletonRouter<crate::route::Route>,
     ) {
crates/notedeck_columns/src/ui/timeline.rs (1)

393-401: Document the new public load‑more API.

show_with_load_more is public but undocumented. As per coding guidelines, add doc comments for new public methods.

✏️ Suggested doc comments
     pub fn show(&mut self, ui: &mut egui::Ui) -> Option<NoteAction> {
         self.show_with_load_more(ui, false)
     }
 
+    /// Render the tab and optionally show a "Load More" control for pagination.
     pub fn show_with_load_more(
         &mut self,
         ui: &mut egui::Ui,
         enable_load_more: bool,
     ) -> Option<NoteAction> {
crates/notedeck_columns/src/route.rs (1)

593-595: Constrain routes_mut to prevent accidental invariant violations.

Exposing mutable access to the internal routes vector risks breaking ColumnsRouter invariants, as overlay_ranges indices depend on the routes vector length and structure. Current usage (nav.rs:627) is safe because it only mutates the last route's content, not the vector itself. Consider providing constrained alternatives like last_route_mut() or route_mut(index) instead of exposing the full vector.
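A minimal sketch of the constrained alternative, with `Route` and `ColumnsRouter` reduced to stand-ins. Exposing only the top route mutably is enough for the nav.rs usage while keeping the vector's length, which `overlay_ranges` depends on, out of reach:

```rust
// Stand-ins for the real crate types.
enum Route {
    Publication(String),
    Timeline,
}

struct ColumnsRouter {
    routes: Vec<Route>,
}

impl ColumnsRouter {
    /// Mutable access to the top route only. Callers can rewrite the
    /// route's contents but cannot push/pop/clear the Vec, so
    /// length-based invariants stay intact.
    fn last_route_mut(&mut self) -> Option<&mut Route> {
        self.routes.last_mut()
    }
}

fn main() {
    let mut r = ColumnsRouter {
        routes: vec![Route::Timeline, Route::Publication("old".into())],
    };
    // Mirrors the nav.rs pattern: mutate the selection in place.
    if let Some(Route::Publication(sel)) = r.last_route_mut() {
        *sel = "new".into();
    }
    assert!(matches!(r.routes.last(), Some(Route::Publication(s)) if s.as_str() == "new"));
}
```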

crates/notedeck_columns/src/nav.rs (1)

623-647: Consider handling the case where route is not a Publication.

The code silently does nothing if cols.router.routes_mut().last_mut() returns None or if the route isn't a Publication. While this may be intentional (defensive coding), consider adding a debug log for unexpected states.

💡 Optional: Add debug logging for unexpected route states
 RenderNavAction::PublicationNav(nav_action) => {
     // Get mutable access to the current route and modify the publication selection
     let cols = get_active_columns_mut(ctx.i18n, ctx.accounts, &mut app.decks_cache)
         .column_mut(col);
     if let Some(route) = cols.router.routes_mut().last_mut() {
         if let crate::route::Route::Publication(ref mut selection) = route {
             match nav_action {
                 ui::publication::PublicationNavAction::Back => {
                     selection.navigate_back();
                 }
                 ui::publication::PublicationNavAction::NavigateInto(note_id) => {
                     selection.navigate_into(note_id);
                 }
                 // Index view navigation is handled locally in the UI via ReaderState
                 // stored in egui memory - these variants are only defined for future
                 // extensibility if we need to propagate them
                 ui::publication::PublicationNavAction::DrillDown(_)
                 | ui::publication::PublicationNavAction::DrillUp
                 | ui::publication::PublicationNavAction::PrevSibling
                 | ui::publication::PublicationNavAction::NextSibling => {}
             }
+        } else {
+            tracing::debug!("PublicationNav action received but current route is not a Publication");
         }
+    } else {
+        tracing::debug!("PublicationNav action received but no routes present");
     }
     return None;
 }
crates/notedeck_reader/src/lib.rs (1)

128-141: Consider localizing user-facing strings.

The empty state text is hardcoded in English. Per the coding guidelines, user-facing strings should be wrapped with tr! or tr_plural! macros for localization.

🌐 Wrap strings with tr! for localization

This would require passing i18n to render_empty_state. Example:

-impl ReaderApp {
-    fn render_empty_state(&self, ui: &mut egui::Ui) {
+impl ReaderApp {
+    fn render_empty_state(&self, ui: &mut egui::Ui, i18n: &Localization) {
         ui.vertical_centered(|ui| {
             ui.add_space(80.0);
-            ui.heading("Publications Reader");
+            ui.heading(notedeck::tr!(i18n, "Publications Reader", "Header for empty reader state"));
             ui.add_space(16.0);
-            ui.label("Select a publication from the Publications timeline to start reading.");
+            ui.label(notedeck::tr!(i18n, "Select a publication from the Publications timeline to start reading.", "Instructions for empty reader state"));
             // ...
         });
     }
 }
crates/notedeck_reader/src/ui/reader.rs (3)

362-364: Early return inside closure may cause unexpected behavior.

The return; at line 363 exits the closure passed to ui.horizontal, not the entire render_header method. While this may work as intended (early exit from the horizontal layout), consider restructuring for clarity.

💡 Consider restructuring to avoid return inside closure

The current pattern works but can be confusing. An alternative approach:

-                    let Some(ps) = pub_state else {
-                        return;
-                    };
-                    let (prev_sibling, next_sibling) = ps.tree.siblings(current_node);
+                    if let Some(ps) = pub_state {
+                        let (prev_sibling, next_sibling) = ps.tree.siblings(current_node);
+                        // ... sibling navigation buttons ...
+                    }

This makes the control flow clearer and avoids the early return pattern inside a closure.


884-884: Function has many parameters - consider using a builder or struct.

render_outline_child_card has 8 parameters, which makes it harder to read and maintain. The #[allow(clippy::too_many_arguments)] suppresses the warning, but consider grouping related parameters.

💡 Consider grouping parameters into a struct
struct OutlineChildData {
    child_idx: usize,
    title: String,
    is_branch: bool,
    is_resolved: bool,
    note_key: Option<NoteKey>,
}

fn render_outline_child_card(
    &mut self,
    ui: &mut egui::Ui,
    txn: &Transaction,
    state: &mut ReaderState,
    child: &OutlineChildData,
) -> Option<NoteAction> { ... }

This would reduce the parameter count and make call sites cleaner.


1309-1321: Potential code duplication with state.rs helper.

This get_tag_value method appears to duplicate functionality from crates/notedeck_reader/src/state.rs (lines 616-627 in relevant snippets). Consider reusing the existing helper to maintain consistency.

#!/bin/bash
# Search for get_tag_value implementations to confirm duplication
rg -n "fn get_tag_value" --type rust
crates/notedeck_publications/src/fetcher.rs (1)

126-135: Search filter is a stub that ignores the search term.

The function accepts a _search_term but doesn't use it, returning only recent publications. While the comment explains NIP-50 limitations, callers may not realize search isn't functional. Consider either:

  1. Removing this function until search is implemented
  2. Returning an error/Option to indicate search isn't supported
  3. Adding #[allow(unused_variables)] explicitly and a more prominent doc warning
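Option 2 can be sketched as follows. This is a minimal illustration with stand-in types (`FetchError` and `fetch_recent_publications` are hypothetical, not the fetcher's real API): the point is that an unsupported search term produces an explicit error instead of silently returning unrelated results.

```rust
// Hypothetical error type making the NIP-50 limitation explicit to callers.
#[derive(Debug, PartialEq)]
pub enum FetchError {
    /// NIP-50 search is not implemented yet; callers should fall back.
    SearchUnsupported,
}

pub fn search_publications(search_term: &str) -> Result<Vec<String>, FetchError> {
    if !search_term.is_empty() {
        // Surface the limitation instead of returning unrelated results.
        return Err(FetchError::SearchUnsupported);
    }
    // Empty term: behave like the plain "recent publications" query.
    Ok(fetch_recent_publications())
}

// Stand-in for the existing recent-publications query.
fn fetch_recent_publications() -> Vec<String> {
    vec!["30040:pubkey:intro".to_string()]
}

fn main() {
    assert_eq!(
        search_publications("rust"),
        Err(FetchError::SearchUnsupported)
    );
    assert!(search_publications("").is_ok());
}
```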
crates/notedeck_columns/src/ui/publication.rs (1)

240-246: User-facing strings should use tr! macro for localization.

The hover text "Back to previous publication" and other UI strings throughout this file are not wrapped with the tr! macro. Per coding guidelines, user-facing strings should be localized.

Examples of strings needing localization:

  • Line 241: "Back to previous publication"
  • Line 266: "Switch to paginated view"
  • Line 344: "Go up to parent"
  • Line 440: "Loading Publication..."
  • Line 478: "by {}"
  • And many others throughout the file
♻️ Example fix for back button
-                    .on_hover_text("Back to previous publication")
+                    .on_hover_text(tr!(self.i18n, "Back to previous publication", "Hover text for back button in publication reader"))
crates/notedeck/src/relay_info.rs (1)

95-101: Consider using tokio::sync::RwLock for cross-thread sharing.

Per coding guidelines, tokio::sync::RwLock is preferred over std::sync::RwLock for cross-thread sharing in Notedeck. While the current usage with std::sync::RwLock works, the cache is accessed from both UI code and tokio-spawned tasks.

The risk is low since the operations are quick HashMap lookups/inserts, but switching to tokio::sync::RwLock would align with project conventions and provide better guarantees for async code.

Note: If you keep std::sync::RwLock, ensure all UI-path accesses use try_read()/try_write() with fallbacks to avoid any potential blocking.
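The non-blocking UI-path pattern can be sketched like this, assuming a cache shaped roughly like `RelayInfoCache` (the struct and field names here are illustrative, not the real API): reads on the UI thread use `try_read()` and fall back to a conservative default rather than blocking a frame.

```rust
use std::collections::HashMap;
use std::sync::RwLock;

// Illustrative stand-in for a relay-info cache shared between UI and tasks.
pub struct RelayLimitCache {
    limits: RwLock<HashMap<String, u32>>, // relay url -> max event tags
}

impl RelayLimitCache {
    pub fn new() -> Self {
        Self {
            limits: RwLock::new(HashMap::new()),
        }
    }

    /// UI path: never block the frame. If the lock is contended,
    /// fall back to a conservative default instead of waiting on a writer.
    pub fn max_event_tags_or_default(&self, url: &str, default: u32) -> u32 {
        match self.limits.try_read() {
            Ok(map) => map.get(url).copied().unwrap_or(default),
            Err(_) => default, // lock held elsewhere; use the safe fallback
        }
    }

    /// Background/task path: a blocking write is acceptable off the UI thread.
    pub fn insert(&self, url: &str, max_tags: u32) {
        self.limits.write().unwrap().insert(url.to_string(), max_tags);
    }
}

fn main() {
    let cache = RelayLimitCache::new();
    assert_eq!(cache.max_event_tags_or_default("wss://relay.example", 100), 100);
    cache.insert("wss://relay.example", 2000);
    assert_eq!(cache.max_event_tags_or_default("wss://relay.example", 100), 2000);
}
```

Switching to `tokio::sync::RwLock` would change the read path to an async `read().await` in task code and `try_read()` in UI code, but the fallback-on-contention shape stays the same.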

crates/notedeck_columns/src/timeline/publication.rs (4)

17-22: Consider removing or deferring empty config struct.

PublicationConfig is currently empty with only a comment about future use. Per YAGNI principle, consider removing this until actual configuration fields are needed, or add a #[allow(dead_code)] annotation if you want to keep the placeholder.


390-397: Add #[profiling::function] annotation for performance visibility.

Per coding guidelines, performance-critical functions should be marked with #[profiling::function] for visibility in the puffin profiler. poll_updates is called each frame when a publication is open and should be profiled.

🔧 Proposed fix
     /// Returns true if any nodes were resolved
+    #[profiling::function]
     pub fn poll_updates(
         &mut self,

317-346: Consider gating diagnostic queries behind a feature flag or debug build.

The diagnostic query that checks for existing 30040/30041 events from the author is useful for debugging but adds overhead in production. Consider:

  1. Moving it behind #[cfg(debug_assertions)]
  2. Making it conditional on a tracing level check
  3. Removing it entirely once the feature is stable

621-633: get_tag_value is duplicated across four files.

This helper function exists in:

  • crates/notedeck_columns/src/timeline/publication.rs (line 622)
  • crates/notedeck_reader/src/state.rs (line 617)
  • crates/notedeck_reader/src/ui/reader.rs (line 1309)
  • crates/notedeck_columns/src/ui/publication.rs (line 1277)

All implementations are identical. Consider extracting this utility to a shared module (e.g., notedeck_publications or a common utilities crate) to reduce duplication.
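A shared helper could look like the following sketch. The real call sites operate on `nostrdb::Note` tags; here the tag list is modeled as `(name, value)` pairs purely for illustration, so the extraction logic itself can live in one place (e.g. `notedeck_publications`):

```rust
// Generic tag lookup: returns the value of the first tag whose name matches.
// Real code would adapt this to iterate nostrdb tag structures.
pub fn get_tag_value<'a, I>(tags: I, name: &str) -> Option<&'a str>
where
    I: IntoIterator<Item = (&'a str, &'a str)>,
{
    tags.into_iter()
        .find(|(tag_name, _)| *tag_name == name)
        .map(|(_, value)| value)
}

fn main() {
    let tags = [("d", "my-book"), ("title", "My Book"), ("author", "Alice")];
    assert_eq!(get_tag_value(tags, "title"), Some("My Book"));
    assert_eq!(get_tag_value(tags, "image"), None);
}
```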

crates/notedeck_publications/src/tree.rs (3)

30-32: Unused cursor field - consider removing or implementing.

The cursor field has #[allow(dead_code)] but isn't used anywhere in the implementation. Either implement cursor-based iteration (if planned) or remove the field to reduce cognitive load.


243-244: Consider caching sorted children to avoid repeated cloning and sorting.

siblings() clones and sorts the parent's children array each call. If this method is called frequently (e.g., during navigation), consider caching the sorted order or maintaining children in sorted order during insertion.
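The maintain-sorted-on-insert alternative can be sketched as below. Child ids are plain `u64`s here for illustration (the real tree stores `EventAddress`-keyed nodes): each insert pays an ordered insertion once, and `siblings()` becomes a binary search with no clone and no sort.

```rust
// Illustrative branch node that keeps its children sorted at all times.
pub struct BranchNode {
    children: Vec<u64>, // invariant: always sorted ascending
}

impl BranchNode {
    pub fn new() -> Self {
        Self { children: Vec::new() }
    }

    /// Insert preserving order: O(log n) search + O(n) shift, paid once per
    /// insert instead of an O(n log n) sort on every siblings() call.
    pub fn insert_child(&mut self, id: u64) {
        let pos = self.children.partition_point(|&c| c < id);
        self.children.insert(pos, id);
    }

    /// Previous and next sibling of `id`, if any. No clone, no sort.
    pub fn siblings(&self, id: u64) -> (Option<u64>, Option<u64>) {
        match self.children.binary_search(&id) {
            Ok(i) => (
                i.checked_sub(1).map(|p| self.children[p]),
                self.children.get(i + 1).copied(),
            ),
            Err(_) => (None, None),
        }
    }
}

fn main() {
    let mut node = BranchNode::new();
    for id in [30, 10, 20] {
        node.insert_child(id);
    }
    assert_eq!(node.siblings(20), (Some(10), Some(30)));
    assert_eq!(node.siblings(10), (None, Some(20)));
    assert_eq!(node.siblings(30), (Some(20), None));
}
```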


422-513: Consider expanding test coverage.

The existing tests cover basic hierarchy and counts, but consider adding tests for:

  • siblings() navigation
  • LeafIterator traversal order
  • resolve_node() state transitions
  • Edge cases like empty trees or single-node trees

Per coding guidelines, tests should live alongside modules, which is correctly done here.

Comment on lines +499 to +502
let cols = columns
.decks_cache
.active_columns_mut(ctx.i18n, ctx.accounts)
.unwrap();

⚠️ Potential issue | 🟡 Minor

Potential panic from unwrap() on columns access.

The unwrap() call on active_columns_mut could panic if no columns are available. Consider using ? or a match/let-else pattern for consistency with the surrounding error handling style.

🐛 Suggested fix
-            let cols = columns
-                .decks_cache
-                .active_columns_mut(ctx.i18n, ctx.accounts)
-                .unwrap();
+            let Some(cols) = columns
+                .decks_cache
+                .active_columns_mut(ctx.i18n, ctx.accounts)
+            else {
+                return;
+            };
🤖 Prompt for AI Agents
In `@crates/notedeck_chrome/src/chrome.rs` around lines 499 - 502, The call to
columns.decks_cache.active_columns_mut(ctx.i18n, ctx.accounts).unwrap() can
panic; replace the unwrap with fallible handling (use the ? operator if the
surrounding function returns a compatible Result, or a match/let-else that
returns or propagates an appropriate error) so that active_columns_mut's
None/Err is handled consistently with the surrounding error style; locate the
call to active_columns_mut in the function where cols is assigned and change it
to propagate or return an error (or handle the None case) instead of unwrapping.

Comment on lines +616 to 621
if let Some(result) = m_result {
// Note: app_action is not expected for profile navigation
// but we handle it for consistency
let col = cols.selected_mut();

action.process_router_action(&mut col.router, &mut col.sheet_router);
result.process_router_action(&mut col.router, &mut col.sheet_router);
}

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

# Search for NoteActionResult definition and process_router_action implementation
rg "struct NoteActionResult|fn process_router_action" -A 3

Repository: damus-io/notedeck

Length of output: 178


🏁 Script executed:

# Search more specifically for NoteActionResult in the codebase
fd -e rs -x rg -l "NoteActionResult"

Repository: damus-io/notedeck

Length of output: 142


🏁 Script executed:

# Look at the chrome.rs file around the lines in question for more context
sed -n '610,630p' crates/notedeck_chrome/src/chrome.rs

Repository: damus-io/notedeck

Length of output: 646


🏁 Script executed:

# Search for NoteActionResult definition
rg "struct NoteActionResult" -A 10

Repository: damus-io/notedeck

Length of output: 877


🏁 Script executed:

# Search for process_router_action method implementation
rg "process_router_action" -B 2 -A 8

Repository: damus-io/notedeck

Length of output: 4106


🏁 Script executed:

# Check if NoteActionResult is defined in multiple files
fd -e rs -x rg -l "struct NoteActionResult"

Repository: damus-io/notedeck

Length of output: 103


Fix misleading inline comment about app_action handling.

The process_router_action method on NoteActionResult only processes the router_action field and completely ignores app_action. The inline comment stating "we handle it for consistency" is inaccurate—the method does not handle app_action at all. Update the comment to clarify that app_action is intentionally ignored in this code path (or remove the misleading phrase).

🤖 Prompt for AI Agents
In `@crates/notedeck_chrome/src/chrome.rs` around lines 616 - 621, The inline
comment incorrectly claims we handle app_action for consistency; update the
comment near the call to result.process_router_action(&mut col.router, &mut
col.sheet_router) to state that process_router_action on NoteActionResult only
acts on router_action and intentionally ignores app_action in this path (or
remove the "we handle it for consistency" phrase), so readers understand
app_action is not processed here.

Comment on lines +384 to +458
/// Poll for updates - resolves pending nodes from nostrdb
///
/// `batch_size` controls how many addresses to fetch per subscription.
/// Use `RelayInfoCache::min_max_event_tags()` to get this value based on relay limits.
///
/// Returns true if any nodes were resolved
pub fn poll_updates(
&mut self,
ndb: &Ndb,
pool: &mut RelayPool,
txn: &Transaction,
index_id: &NoteId,
batch_size: usize,
) -> bool {
let Some(state) = self.publications.get_mut(index_id) else {
return false;
};

// Process notes that were fetched from subscriptions (pending_fetch)
let mut resolved = state.process_notes(ndb, txn);

// Lazy resolution: only resolve visible nodes from cache
let visible_addrs = state.visible_pending_addresses();
if !visible_addrs.is_empty() {
for addr in &visible_addrs {
if let Some(note) = find_note_by_address(ndb, txn, addr) {
if state.tree.resolve_node(addr, note).is_some() {
debug!(
"poll_updates: lazy-resolved visible node {}:{}",
addr.kind, addr.dtag
);
resolved = true;
}
}
}
}

// Subscribe for visible pending addresses only
let visible = state.visible_pending_addresses();
let pending: Vec<_> = visible
.into_iter()
.filter(|addr| !state.pending_fetch.contains(addr))
.take(batch_size)
.collect();
if !pending.is_empty() {
debug!(
"poll_updates: {} pending addresses need fetching (batch_size={})",
pending.len(),
batch_size
);

let filters = build_filters_for_addresses(&pending);

if !filters.is_empty() {
// Unsubscribe from old subscription if exists
if let Some(old_sub) = state.sub_id.take() {
pool.unsubscribe(old_sub);
}

let sub_id = subscriptions::new_sub_id();
info!(
"poll_updates: creating subscription {} for {} addresses",
sub_id,
pending.len()
);
pool.subscribe(sub_id.clone(), filters);
state.sub_id = Some(sub_id);

// Mark these as being fetched
state.mark_fetching(&pending);
}
}

resolved
}

⚠️ Potential issue | 🟡 Minor

visible_pending_addresses() is called twice per poll.

Lines 406 and 422 both call state.visible_pending_addresses(), which iterates through visible nodes, children, and siblings each time. This is wasteful when the result hasn't changed between calls.

🔧 Proposed fix to reuse the computed addresses
     pub fn poll_updates(
         &mut self,
         ndb: &Ndb,
         pool: &mut RelayPool,
         txn: &Transaction,
         index_id: &NoteId,
         batch_size: usize,
     ) -> bool {
         let Some(state) = self.publications.get_mut(index_id) else {
             return false;
         };

         // Process notes that were fetched from subscriptions (pending_fetch)
         let mut resolved = state.process_notes(ndb, txn);

-        // Lazy resolution: only resolve visible nodes from cache
-        let visible_addrs = state.visible_pending_addresses();
+        // Compute visible addresses once for both resolution and subscription
+        let visible_addrs = state.visible_pending_addresses();
+
+        // Lazy resolution: only resolve visible nodes from cache
         if !visible_addrs.is_empty() {
             for addr in &visible_addrs {
                 if let Some(note) = find_note_by_address(ndb, txn, addr) {
                     if state.tree.resolve_node(addr, note).is_some() {
                         debug!(
                             "poll_updates: lazy-resolved visible node {}:{}",
                             addr.kind, addr.dtag
                         );
                         resolved = true;
                     }
                 }
             }
         }

         // Subscribe for visible pending addresses only
-        let visible = state.visible_pending_addresses();
-        let pending: Vec<_> = visible
+        let pending: Vec<_> = visible_addrs
             .into_iter()
             .filter(|addr| !state.pending_fetch.contains(addr))
             .take(batch_size)
             .collect();
🤖 Prompt for AI Agents
In `@crates/notedeck_columns/src/timeline/publication.rs` around lines 384 - 458,
In poll_updates, avoid calling state.visible_pending_addresses() twice; compute
it once into a local (e.g., let visible_addrs =
state.visible_pending_addresses()) and reuse that variable for both the
lazy-resolution loop and for building the pending Vec, replacing the second call
to state.visible_pending_addresses() used to create visible/pending; ensure you
use that same visible_addrs (or a cloned/borrowed copy if needed) when filtering
against state.pending_fetch and when computing batch_size-limited pending
addresses.

Comment on lines +141 to +167
// NKBIP-01 Publication index (kind 30040)
// Filter: must have title tag and no content (defensive programming)
30040 => {
// Reject 30040 events with content (index notes should be empty)
let content = payload.note.content();
if !content.is_empty() && !content.chars().all(|c| c.is_whitespace()) {
return None;
}

// Require a title tag
let has_title = payload
.note
.tags()
.iter()
.any(|tag| tag.count() >= 2 && tag.get_str(0) == Some("title"));
if !has_title {
return None;
}

Some(NoteUnitFragmentResponse {
fragment: NoteUnitFragment::Single(NoteRef {
key: payload.key,
created_at: payload.note.created_at(),
}),
unknown_pk: None,
})
}

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🌐 Web query:

NKBIP-01 kind 30040 publication index specification

💡 Result:

NKBIP-01 is the Nostr Knowledge Base/Publication Interchange Proposal that specifies the "Curated Publication Index" / modular article header format used for kind 30040 (and related 30041 content). It defines how to structure publication-index events (30040) that reference/organize modular article content (30041, 30023, 30818, etc.). See the NIPs index and the NKBIP-01 publication page for the full spec and examples. [1][2]

Sources:

  • NIPs index showing kind 30040 → NKBIP-01. [1]
  • NKBIP-01 spec/publication page (Next-Alexandria). [2]

🌐 Web query:

NKBIP-01 specification title tag required content field

💡 Result:

I can’t find a specification named “NKBIP-01.” Can you give more context or a link (organization, country, or file) so I can look up the required content for the title tag?


🏁 Script executed:

# Search for references to NKBIP-01, kind 30040, or related specifications in the codebase
rg "NKBIP|30040|publication.*index" --type rs -B 2 -A 2 | head -100

Repository: damus-io/notedeck

Length of output: 87


🏁 Script executed:

# Search for references to NKBIP-01, kind 30040, or related specifications
rg "NKBIP|30040|publication.*index" -B 2 -A 2 | head -100

Repository: damus-io/notedeck

Length of output: 6337


🏁 Script executed:

# Search for notedeck_publications crate structure
find . -path "*/notedeck_publications/src*" -name "*.rs" | head -20

Repository: damus-io/notedeck

Length of output: 371


🏁 Script executed:

# Read the tree.rs file to understand kind 30040 validation
cat -n crates/notedeck_publications/src/tree.rs | head -200

Repository: damus-io/notedeck

Length of output: 7480


🏁 Script executed:

# Continue reading tree.rs, particularly the extract_address and other validation methods
cat -n crates/notedeck_publications/src/tree.rs | sed -n '200,300p'

Repository: damus-io/notedeck

Length of output: 3950


🏁 Script executed:

# Continue reading to find the extract_address and validation methods
cat -n crates/notedeck_publications/src/tree.rs | sed -n '300,400p'

Repository: damus-io/notedeck

Length of output: 3760


🏁 Script executed:

# Search for any spec files or NIPs documentation in the repo
find . -name "*spec*" -o -name "*nip*" -o -name "NKBIP*" 2>/dev/null | grep -v ".git" | head -20

Repository: damus-io/notedeck

Length of output: 205


🏁 Script executed:

# Read the specific lines in timeline_units.rs being reviewed
cat -n crates/notedeck_columns/src/timeline/timeline_units.rs | sed -n '130,180p'

Repository: damus-io/notedeck

Length of output: 2070


🏁 Script executed:

# Search for tests or documentation about kind 30040 validation requirements
rg "30040" -B 3 -A 3 | grep -v "Binary" | head -150

Repository: damus-io/notedeck

Length of output: 10676


🏁 Script executed:

# Search for any specification or documentation about kind 30040 requirements in the codebase
rg "title.*tag|content.*empty|NKBIP.*30040" -B 2 -A 2 | head -100

Repository: damus-io/notedeck

Length of output: 7769


🏁 Script executed:

# Look for test files or examples that show what a valid kind 30040 should look like
find . -name "*.rs" -type f | xargs grep -l "30040" | xargs grep -l "test\|example" | head -5

Repository: damus-io/notedeck

Length of output: 315


🏁 Script executed:

# Check the example file to see what a valid kind 30040 should look like
cat crates/notedeck_publications/examples/fetch_publication.rs

Repository: damus-io/notedeck

Length of output: 11935


🌐 Web query:

Nostr NIP-23 NIP-30 publication standard kind 30040 specification

💡 Result:

Summary — where to find each spec and what it defines:

  • NIP‑23 — Long‑form content (kind 30023 / 30024): defines Markdown articles/blog posts, metadata tags (d, title, image, summary, published_at, t), and editability via parameterized replaceable events. See the NIP-23 spec. [1]

  • NIP‑30 — Custom emojis / emoji sets (implemented as kind 30030 via NIP‑51 Lists): NIP‑30 describes emoji shortcodes and URLs; NIP‑51 maps emoji sets to kind 30030 (parameterized replaceable). See NIP‑51 and the NIP‑30/emoji-set docs. [2][3]

  • kind 30040 / related (publication standard): community/registry entries map kind 30040 → “Curated Publication Index” (or “Modular Article Header”) and kind 30041 → “Curated Publication Content” (spec referenced as NKBIP‑01 in the NIPs index). Implementations/projects (e.g., GitCitadel / Next‑Alexandria) are using 30040/30041 to represent multi‑section publications (index + section/content events). The formal detailed spec is tracked in the NIPs/GitHub/NKBIP‑01 references and related community PRs. [4][5]

If you want, I can fetch the NKBIP‑01 draft (30040/30041) contents from the NIPs repo or the specific GitHub PRs and extract the concrete event/tag schema for 30040/30041.


🌐 Web query:

NKBIP-01 GitHub Nostr NIPs kind 30040 specification schema

💡 Result:

NKBIP-01 is the proposal referenced by the Nostr NIPs list for kinds 30040 (Curated Publication Index) and 30041 (Curated Publication Content). The NIPs README lists those kinds and points to NKBIP-01 as the specification source; the NKBIP-01 draft/content is hosted externally (wikistr). [1][2]

Sources:

  • Nostr NIPs README (lists kind 30040 → NKBIP-01). [1]
  • nostrability schemata / package mapping of event kinds (includes 30040/30041 references). [2]

Relax kind 30040 validation to match the rest of the codebase.

The timeline_units.rs validation is stricter than how kind 30040 events are handled elsewhere. PublicationTree::from_root_note() and the UI code both treat the title tag as optional and handle missing titles gracefully (showing "(untitled)"). The current filter rejecting events without a title tag will drop valid publication indices that the rest of the application can process.

Recommend removing or relaxing the title requirement to align with how PublicationTree and UI components handle kind 30040 events. The content check (rejecting events whose content is non-empty and not all whitespace) may also be overly defensive; consider whether it should be relaxed to match the rest of the codebase's expectations.
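The relaxed acceptance rule can be sketched as follows. Tags are modeled as `(name, value)` pairs for brevity (not the real nostrdb API): only non-empty content is rejected, and a missing title falls back to "(untitled)" instead of dropping the event.

```rust
// Sketch: accept a 30040 index event, returning its display title, or None to
// reject it. Only the content check filters; the title is merely a fallback.
pub fn accept_30040(content: &str, tags: &[(&str, &str)]) -> Option<String> {
    // Index notes should carry no content; whitespace-only counts as empty.
    if !content.trim().is_empty() {
        return None;
    }
    let title = tags
        .iter()
        .find(|(name, _)| *name == "title")
        .map(|(_, v)| v.to_string())
        .unwrap_or_else(|| "(untitled)".to_string());
    Some(title)
}

fn main() {
    assert_eq!(
        accept_30040("", &[("title", "Book")]),
        Some("Book".to_string())
    );
    // No title tag: still accepted, matching PublicationTree's behavior.
    assert_eq!(accept_30040("  \n", &[]), Some("(untitled)".to_string()));
    // Non-empty content: rejected.
    assert_eq!(accept_30040("hello", &[("title", "Book")]), None);
}
```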

🤖 Prompt for AI Agents
In `@crates/notedeck_columns/src/timeline/timeline_units.rs` around lines 141 -
167, The 30040 handling in timeline_units.rs currently rejects events missing a
"title" tag but other code (PublicationTree::from_root_note and UI) treat title
as optional; remove the strict title requirement by deleting the has_title check
(the payload.note.tags() .any(...) and the subsequent if !has_title return None)
so the 30040 branch returns Some(NoteUnitFragmentResponse { ... }) even when the
"title" tag is absent; keep or adjust the existing content check as desired to
remain consistent with other consumers and update any affected tests or callers
accordingly.

Comment on lines +432 to +441
// Show "Load More" button if enabled and there are notes
if enable_load_more && !self.tab.units.is_empty() {
if let Some(oldest) = self.tab.units.oldest() {
ui.add_space(16.0);
ui.vertical_centered(|ui| {
let load_more_text = RichText::new("Load More")
.size(get_font_size(ui.ctx(), &NotedeckTextStyle::Body));
if ui.button(load_more_text).clicked() {
// Use the oldest note's created_at - 1 as the "until" parameter
action = Some(NoteAction::LoadMore(oldest.created_at.saturating_sub(1)));

⚠️ Potential issue | 🟡 Minor

Localize the “Load More” label.

This is user‑facing text and should use tr! for localization. As per coding guidelines, wrap new strings with localization macros.

🌍 Suggested localization
-                let load_more_text = RichText::new("Load More")
+                let load_more_text = RichText::new(tr!(
+                    self.note_context.i18n,
+                    "Load More",
+                    "load more publications button"
+                ))
                     .size(get_font_size(ui.ctx(), &NotedeckTextStyle::Body));
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
// Show "Load More" button if enabled and there are notes
if enable_load_more && !self.tab.units.is_empty() {
if let Some(oldest) = self.tab.units.oldest() {
ui.add_space(16.0);
ui.vertical_centered(|ui| {
let load_more_text = RichText::new("Load More")
.size(get_font_size(ui.ctx(), &NotedeckTextStyle::Body));
if ui.button(load_more_text).clicked() {
// Use the oldest note's created_at - 1 as the "until" parameter
action = Some(NoteAction::LoadMore(oldest.created_at.saturating_sub(1)));
// Show "Load More" button if enabled and there are notes
if enable_load_more && !self.tab.units.is_empty() {
if let Some(oldest) = self.tab.units.oldest() {
ui.add_space(16.0);
ui.vertical_centered(|ui| {
let load_more_text = RichText::new(tr!(
self.note_context.i18n,
"Load More",
"load more publications button"
))
.size(get_font_size(ui.ctx(), &NotedeckTextStyle::Body));
if ui.button(load_more_text).clicked() {
// Use the oldest note's created_at - 1 as the "until" parameter
action = Some(NoteAction::LoadMore(oldest.created_at.saturating_sub(1)));
🤖 Prompt for AI Agents
In `@crates/notedeck_columns/src/ui/timeline.rs` around lines 432 - 441, The "Load
More" label is hardcoded; update the button text to use the localization macro
by replacing the literal in the RichText creation used for the load_more button
(see load_more_text in the if enable_load_more && !self.tab.units.is_empty() /
if let Some(oldest) = self.tab.units.oldest() block). Use tr! (e.g., pass
tr!("Load More") to RichText::new) so the user-facing string is localized while
keeping the rest of the logic (button click ->
NoteAction::LoadMore(oldest.created_at.saturating_sub(1))) unchanged.

Comment on lines +747 to +831
let title = title.unwrap_or("Untitled Publication");

// Get the index author's profile (the pubkey who published the event)
let txn = note.txn().expect("note should have txn");
let index_author_profile = note_context.ndb.get_profile_by_pubkey(txn, note.pubkey());
let index_author_name = get_display_name(index_author_profile.as_ref().ok()).name();

// Set up hitbox for whole-card click detection (same pattern as NoteView)
let hitbox_id = egui::Id::new(("pub_card_hitbox", note_key));
let maybe_hitbox: Option<egui::Response> =
ui.ctx()
.data_mut(|d| d.get_temp(hitbox_id))
.map(|note_size: egui::Vec2| {
let container_rect = ui.max_rect();
let rect = egui::Rect {
min: egui::pos2(container_rect.min.x, container_rect.min.y),
max: egui::pos2(container_rect.max.x, container_rect.min.y + note_size.y),
};
ui.interact(rect, ui.id().with(hitbox_id), Sense::click())
});

// Calculate thumbnail size and position before rendering content
let thumbnail_size = 64.0;
let right_edge = ui.max_rect().right();
let options_button_width = NoteContextButton::max_width();

let response = notedeck_ui::padding(8.0, ui, |ui| {
// Publication title - large and prominent
ui.add(
egui::Label::new(
RichText::new(title)
.size(get_font_size(ui.ctx(), &NotedeckTextStyle::Heading3))
.strong(),
)
.wrap(),
);

ui.add_space(4.0);

// Publication author (from "author" tag)
if let Some(author) = publication_author {
ui.horizontal(|ui| {
ui.label(
RichText::new("by ")
.size(get_font_size(ui.ctx(), &NotedeckTextStyle::Small))
.color(ui.visuals().weak_text_color()),
);
ui.label(
RichText::new(author)
.size(get_font_size(ui.ctx(), &NotedeckTextStyle::Small))
.strong(),
);
});
}

ui.add_space(4.0);

// Index author (who published this to nostr)
ui.horizontal(|ui| {
ui.label(
RichText::new("published by ")
.size(get_font_size(ui.ctx(), &NotedeckTextStyle::Small))
.color(ui.visuals().weak_text_color()),
);
ui.label(
RichText::new(index_author_name)
.size(get_font_size(ui.ctx(), &NotedeckTextStyle::Small)),
);
});

// Section count badge
if section_count > 0 {
ui.add_space(4.0);
ui.horizontal(|ui| {
let section_text = if section_count == 1 {
"1 section".to_string()
} else {
format!("{} sections", section_count)
};
ui.label(
RichText::new(section_text)
.size(get_font_size(ui.ctx(), &NotedeckTextStyle::Small))
.color(ui.visuals().weak_text_color()),
);
});

⚠️ Potential issue | 🟡 Minor

Localize publication card strings and pluralization.

Strings like “Untitled Publication”, “by”, “published by”, and the section count should use tr!/tr_plural! to support i18n. As per coding guidelines, wrap user‑facing strings.

🌍 Suggested localization (examples)
-    let title = title.unwrap_or("Untitled Publication");
+    let title = title.unwrap_or(tr!(
+        note_context.i18n,
+        "Untitled Publication",
+        "fallback title for publications without a title tag"
+    ));
@@
-                let section_text = if section_count == 1 {
-                    "1 section".to_string()
-                } else {
-                    format!("{} sections", section_count)
-                };
+                let section_text = tr_plural!(
+                    note_context.i18n,
+                    "{count} section",
+                    "{count} sections",
+                    "publication section count",
+                    section_count,
+                    count = section_count
+                );
🤖 Prompt for AI Agents
In `@crates/notedeck_columns/src/ui/timeline.rs` around lines 747 - 831, Replace
hard-coded user-facing strings in the publication card with localization macros:
use tr! for single strings and tr_plural! for pluralizable text. Specifically,
change the default title ("Untitled Publication") to tr!("Untitled Publication")
where title.unwrap_or is used, wrap the "by" and "published by" labels (in the
publication_author and index author horizontal blocks) with tr!, and replace the
section count logic that builds "1 section"/"N sections" to use
tr_plural!(section_count, "1 section", "{} sections") (or the project's
tr_plural! signature) instead of manual format!. Update any RichText::new(...)
calls that currently contain those literals to pass the tr!/tr_plural! results
so UI strings use the app's i18n pipeline.

Comment on lines +1 to +11
//! Publication state management for NKBIP-01 publications (kind 30040/30041)
//!
//! Uses PublicationTree from notedeck_publications crate for hierarchical
//! navigation of nested publications.

use enostr::{NoteId, RelayPool};
use nostrdb::{Filter, Ndb, Note, Transaction};
use notedeck_publications::{EventAddress, NodeType, PublicationTree, PublicationTreeNode};
use std::collections::{HashMap, HashSet};
use tracing::{debug, info};


🛠️ Refactor suggestion | 🟠 Major

This file duplicates crates/notedeck_columns/src/timeline/publication.rs.

As noted in the review of the columns crate, this entire module is nearly identical to crates/notedeck_columns/src/timeline/publication.rs. The shared functionality should be extracted to notedeck_publications or a common module to avoid maintenance burden.

Key differences observed:

  • This file uses uuid::Uuid::new_v4() directly for new_sub_id()
  • The columns version uses subscriptions::new_sub_id()

This inconsistency suggests the duplication happened during development and should be reconciled.

🤖 Prompt for AI Agents
In `@crates/notedeck_reader/src/state.rs` around lines 1 - 11, This module
duplicates logic from the columns crate; extract the shared publication state
logic (types and functions around PublicationTree, PublicationTreeNode
management, sub id generation, and related helpers) into the
notedeck_publications crate or a new common module and replace the duplicated
code in this file by importing and using that shared API; specifically, refactor
new_sub_id generation to use the unified subscriptions::new_sub_id() helper
instead of calling uuid::Uuid::new_v4() directly, move any duplicated helper
functions into the shared module, and update usages in functions and structs
that reference PublicationTree, PublicationTreeNode, and sub id creation to the
consolidated implementations.

Comment on lines +541 to +544
/// Generate a new subscription ID
fn new_sub_id() -> String {
    uuid::Uuid::new_v4().to_string()
}

⚠️ Potential issue | 🟡 Minor

Subscription ID generation differs from notedeck_columns.

This file generates subscription IDs with uuid::Uuid::new_v4().to_string(), while notedeck_columns/src/timeline/publication.rs uses subscriptions::new_sub_id(). This inconsistency could lead to:

  1. Different ID formats across the codebase
  2. Potential issues if subscription tracking expects a consistent format

When consolidating the duplicate code, ensure a single subscription ID generation strategy is used.

🤖 Prompt for AI Agents
In `@crates/notedeck_reader/src/state.rs` around lines 541 - 544, The new_sub_id()
function in this module generates UUIDs directly, which diverges from the
established subscription ID generator used elsewhere
(subscriptions::new_sub_id()); replace the local implementation with a single
canonical generator call: remove the local uuid::Uuid::new_v4().to_string()
implementation and call the shared subscriptions::new_sub_id() (or extract the
shared generator into a common utility and call that) so all subscription IDs
share the same format and behavior across new_sub_id().
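One shape the consolidation could take, sketched without the uuid dependency (all names hypothetical; the project's real helper is subscriptions::new_sub_id(), whose implementation may differ): a single shared generator that both crates import, so every subscription ID has one format.

```rust
use std::sync::atomic::{AtomicU64, Ordering};

// Process-wide counter backing the canonical generator.
static NEXT_SUB_ID: AtomicU64 = AtomicU64::new(1);

/// Hypothetical canonical generator both crates would import instead of
/// each calling uuid::Uuid::new_v4() locally.
fn new_sub_id() -> String {
    format!("sub-{}", NEXT_SUB_ID.fetch_add(1, Ordering::Relaxed))
}
```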

Comment on lines +574 to +582
if !is_complete {
    ui.horizontal(|ui| {
        ui.spinner();
        ui.label(format!(
            "Fetching... ({}/{})",
            sections.len(),
            sections.len()
        ));
    });

⚠️ Potential issue | 🟡 Minor

Fetch status displays incorrect denominator.

The fetch status shows sections.len() / sections.len() which will always show equal values (e.g., "5/5"). This should likely compare resolved sections against total expected sections.

🐛 Fix fetch status to show meaningful progress
         // Fetch status at bottom
         if !is_complete {
             ui.horizontal(|ui| {
                 ui.spinner();
-                ui.label(format!(
-                    "Fetching... ({}/{})",
-                    sections.len(),
-                    sections.len()
-                ));
+                ui.label("Fetching more sections...");
             });
         }

Or if you have access to the total expected count, use that instead:

ui.label(format!("Fetching... ({} loaded)", sections.len()));
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change

    // Before:
    if !is_complete {
        ui.horizontal(|ui| {
            ui.spinner();
            ui.label(format!(
                "Fetching... ({}/{})",
                sections.len(),
                sections.len()
            ));
        });

    // After:
    if !is_complete {
        ui.horizontal(|ui| {
            ui.spinner();
            ui.label("Fetching more sections...");
        });
🤖 Prompt for AI Agents
In `@crates/notedeck_reader/src/ui/reader.rs` around lines 574 - 582, The UI fetch
status is using sections.len() for both numerator and denominator which always
shows identical values; inside the conditional that renders the spinner (the
block guarded by is_complete and the closure passed to ui.horizontal), change
the label to display meaningful progress by using the count of resolved/loaded
sections as the numerator and the total expected sections as the denominator
(e.g., use a resolved_sections_count or loaded_count variable instead of the
left-hand sections.len(), or if you only have the current loaded count, switch
to a single-value message like "Fetching... ({} loaded)" using sections.len()),
updating the format call that currently references sections.len() twice and
keeping the surrounding ui.spinner()/ui.label code in reader.rs.
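Both fallback shapes the suggestion mentions can be captured in one string-building helper (hypothetical name; egui's ui.label accepts the resulting String, so the text can be built separately from the UI code):

```rust
/// Builds the fetch-status text: "loaded/total" when the expected total is
/// known, otherwise a single-value "n loaded" message. Hypothetical helper,
/// sketching the suggested fix rather than the actual reader.rs code.
fn fetch_status_label(loaded: usize, expected_total: Option<usize>) -> String {
    match expected_total {
        Some(total) => format!("Fetching... ({}/{})", loaded, total),
        None => format!("Fetching... ({} loaded)", loaded),
    }
}
```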

Comment on lines +19 to 22
let builder = HttpsConnectorBuilder::new()
    .with_native_roots()
    .expect("Failed to load native root certificates");


⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

cat -n crates/notedeck/src/media/network.rs | head -40

Repository: damus-io/notedeck

Length of output: 1580


🏁 Script executed:

rg "HyperHttpError" crates/notedeck/src/media/ -A 2 -B 2

Repository: damus-io/notedeck

Length of output: 5271


🏁 Script executed:

rg "with_native_roots\|with_system_roots\|with_webpki_roots" --type rust

Repository: damus-io/notedeck

Length of output: 43


🌐 Web query:

hyper_rustls HttpsConnectorBuilder with_native_roots error handling supported platforms Android iOS

💡 Result:

Short answer

  • HttpsConnectorBuilder::with_native_roots is provided only when hyper-rustls is built with the rustls-native-certs feature; it uses rustls-native-certs::load_native_certs which can fail with a std::io::Error if system roots cannot be loaded. [1][2]
  • Platform support: iOS/macOS and Windows are supported via OS APIs (Security.framework / Windows store). Android is partial: rustls-native-certs sometimes cannot find or load the Android system trust store (or needs specific runtime/API behaviour) so you can see "no valid native root CA certificates found" or io errors on Android. For a more robust OS-integrated verifier prefer rustls-platform-verifier (uses OS verifier on iOS/macOS/Windows and a best-effort Android path; Linux/WASM fall back to webpki). [2][3][4]

What the errors look like and where they come from

  • rustls-native-certs::load_native_certs -> Result<Vec<Certificate>, std::io::Error>. If it fails you get an io::Error and hyper-rustls connector construction (with_native_roots path) will return an Err. Docs mention messages like "with_native_roots processed 0 valid and 0 invalid certs" / "no valid native root CA certificates found" in Android logs. [2][5]
  • hyper_rustls HttpsConnectorBuilder functions that wrap native root loading are gated by crate features and will propagate the load error (check docs for the exact Error type in your hyper-rustls version). [6]

Practical recommendations

  • For iOS: using with_native_roots generally works (uses Security.framework). [3]
  • For Android:
    • Prefer rustls-platform-verifier if you need OS verification semantics (it documents Android caveats and API-level limitations such as revocation behavior). [3][4]
    • If rustls-native-certs fails on your Android runtime (common when using a non-Java runtime, NDK, or certain containers), either bundle explicit CA roots, use webpki-roots, or implement platform-specific loading (call Java APIs / TrustManager) and add them as extra roots. [2][3][5]
  • Always handle errors from load_native_certs and from HttpsConnectorBuilder::with_native_roots (propagated Result) and provide a fallback: webpki-roots or explicit CA bundle. [2][6]

Sources

  • rustls-native-certs (API, platform notes). [2]
  • rustls-platform-verifier (platform support, Android caveats). [3][4]
  • hyper-rustls HttpsConnectorBuilder docs (feature gating and with_native_roots note). [6]

If you want, I can fetch the exact error types/messages for the hyper-rustls version you are using and show a short code example that handles failures and falls back to webpki-roots—tell me the crate versions (hyper-rustls, rustls, rustls-native-certs) or paste your Cargo.toml.


🏁 Script executed:

cat crates/notedeck/Cargo.toml | grep -A 5 -B 5 "hyper"

Repository: damus-io/notedeck

Length of output: 442


🏁 Script executed:

rg "with_webpki_roots\|with_system_roots\|HttpsConnectorBuilder" crates/notedeck/src --type rust

Repository: damus-io/notedeck

Length of output: 43


🏁 Script executed:

# Check hyper-rustls documentation for available connector builders
rg "fn.*HttpsConnectorBuilder" --type rust -A 1

Repository: damus-io/notedeck

Length of output: 43


🏁 Script executed:

cat Cargo.toml | grep -A 2 "hyper-rustls\|rustls-native"

Repository: damus-io/notedeck

Length of output: 128


🏁 Script executed:

# Check if there are other connector builder options mentioned in dependencies
rg "with_webpki_roots" crates/ --type rust

Repository: damus-io/notedeck

Length of output: 43


🏁 Script executed:

# Check the actual error type that with_native_roots returns by looking at type signatures
rg "with_native_roots" crates/ -B 2 -A 2 --type rust

Repository: damus-io/notedeck

Length of output: 795


Replace .expect() with proper error handling to prevent runtime crashes on platforms where native TLS roots are unavailable.

The call to with_native_roots().expect(...) at line 20-21 will panic at runtime on platforms like Android where the OS trust store cannot be loaded. This crate already demonstrates the correct pattern in crates/notedeck/src/relay_info.rs, which uses .map_err() to propagate the error instead.

-        let builder = HttpsConnectorBuilder::new()
-            .with_native_roots()
-            .expect("Failed to load native root certificates");
+        let builder = HttpsConnectorBuilder::new()
+            .with_native_roots()
+            .map_err(|e| HyperHttpError::Hyper(Box::new(e)))?;
📝 Committable suggestion


Suggested change

    // Before:
    let builder = HttpsConnectorBuilder::new()
        .with_native_roots()
        .expect("Failed to load native root certificates");

    // After:
    let builder = HttpsConnectorBuilder::new()
        .with_native_roots()
        .map_err(|e| HyperHttpError::Hyper(Box::new(e)))?;
🤖 Prompt for AI Agents
In `@crates/notedeck/src/media/network.rs` around lines 19 - 22, The use of
HttpsConnectorBuilder::with_native_roots().expect(...) can panic on platforms
without OS trust stores; replace the expect call with proper error propagation
similar to the pattern in relay_info.rs by mapping the error (e.g., with
.map_err(...) or ?) and returning a Result from the surrounding function instead
of unwrapping; update the surrounding function signature to return Result if
needed and ensure the builder variable is constructed via
HttpsConnectorBuilder::new().with_native_roots().map_err(|e| /* convert to
appropriate error type */)? so the failure is propagated rather than panicking.
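The propagate-or-fall-back pattern the web query recommends can be sketched with std types only (the root-loading functions here are stand-ins, not the real rustls-native-certs or webpki-roots APIs):

```rust
use std::io;

/// Stand-in for rustls_native_certs::load_native_certs(), which can return an
/// io::Error when the OS trust store cannot be read (common on Android).
fn load_native_roots() -> Result<Vec<String>, io::Error> {
    Err(io::Error::new(io::ErrorKind::NotFound, "no native root store"))
}

/// Stand-in for a bundled webpki-roots fallback set.
fn bundled_roots() -> Vec<String> {
    vec!["bundled-root".to_string()]
}

/// Try the OS store first; fall back to bundled roots instead of panicking.
fn load_roots_with_fallback() -> Vec<String> {
    load_native_roots().unwrap_or_else(|_| bundled_roots())
}
```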

@limina1
Author

limina1 commented Jan 21, 2026

@jb55 For those additional features of commenting, highlighting, and writing these structures, I'd say so. This PR is just to display the events as a feed.

@limina1
Author

limina1 commented Jan 23, 2026

@alltheseas I made the adjustments you suggested. I tried to include shadcn-egui, but the components didn't fit with the rest of the app's style.

@alltheseas
Contributor

Thanks, I'll check it out in the next few days. I've been adding some other potential upgrades to Notedeck.

