
Speed up #10

@whilp

Description

Per @anachronistic:

so - i did a couple of benchmarks on the lib and found differing speeds + heap allocations depending on the incoming string
and i believe that it's possible to 1) normalize the speed to within reason (i.e. longer strings will naturally take a bit longer to parse), 2) normalize the allocations (i believe you can produce a 0 allocation lib if you write the parser yourself)
i did a quick-and-dirty proof of concept and for local paths i had it down to 150ns / 0 allocations by having a Parse method that accepts a pointer to a net.URL structure ... ex func Parse(url string, handle *net.URL) error { ... }
that was leaning mostly on strings.Index, which reports the first appearance of a pattern, so ex. colon := strings.Index(url, ":") ... slash := strings.Index(url, "/") ... return colon == -1 || (slash != -1 && slash > colon)
so file:///whatever and /foo/bar/whatever returned true very quickly
it gets a touch more complicated from there, but yeah: that's my challenge to you ... normalized speed (for some reasonable definition of normalized compared to string length) and 0 allocations (or at most 1 if you don't like the C-style of passing in the structure)
