Custom lexer with custom Token type · Issue #803 · lalrpop/lalrpop
Closed
@ethindp

Description


I was looking through the custom lexer tutorial, but I'm not really sure how to use the lexer I've written with LALRPOP. In particular, the regular expressions I use rely on features that the regex crate does not (and most likely will not) support; I use fancy_regex for those instead. My lexer returns tokens like this:

use serde::{Deserialize, Serialize};

#[derive(Clone, Debug, Serialize, Deserialize)]
pub enum Token {
    Identifier {
        line: usize,
        column: usize,
        value: String,
    },
// ...

(It also returns an anyhow::Result<Vec<Token>>.) What would be the correct way of incorporating this lexer into a LALRPOP grammar? (I just have a standalone tokenize function, though I can change that to a full Lexer type if that's required.)
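For context, LALRPOP's custom-lexer convention is that the parser consumes any iterator of `Result<(Loc, Tok, Loc), Error>` triples ("spanned" tokens). A minimal sketch of wrapping an already-tokenized `Vec<Token>` in such an iterator might look like the following. This is an illustration, not a definitive answer: the serde derives are dropped to keep it dependency-free, the error type is `String` rather than `anyhow::Error`, and the span arithmetic (column plus value length) is an assumption about how the asker's positions are measured.

```rust
use std::vec::IntoIter;

// Simplified version of the Token type from the issue (serde derives omitted).
#[derive(Clone, Debug, PartialEq)]
pub enum Token {
    Identifier {
        line: usize,
        column: usize,
        value: String,
    },
    // ...
}

// LALRPOP's expected item shape for an external lexer:
// a token bracketed by its start and end locations.
pub type Spanned<Tok, Loc, Error> = Result<(Loc, Tok, Loc), Error>;

// Adapter that turns the output of a standalone `tokenize` function
// into the iterator the generated parser consumes.
pub struct Lexer {
    tokens: IntoIter<Token>,
}

impl Lexer {
    pub fn new(tokens: Vec<Token>) -> Self {
        Lexer {
            tokens: tokens.into_iter(),
        }
    }
}

impl Iterator for Lexer {
    // (line, column) pairs as locations; String as the error type (assumption).
    type Item = Spanned<Token, (usize, usize), String>;

    fn next(&mut self) -> Option<Self::Item> {
        self.tokens.next().map(|tok| {
            // Derive the span from the token's own line/column fields;
            // the end column is start + value length (assumed convention).
            let (line, column, len) = match &tok {
                Token::Identifier { line, column, value } => (*line, *column, value.len()),
            };
            Ok(((line, column), tok, (line, column + len)))
        })
    }
}

fn main() {
    let lexer = Lexer::new(vec![Token::Identifier {
        line: 1,
        column: 0,
        value: "foo".to_string(),
    }]);
    for spanned in lexer {
        println!("{:?}", spanned);
    }
}
```

On the grammar side, the token type and its location/error types are then declared in an `extern` block (`type Location = ...; type Error = ...; enum Token { ... }`), and the generated parser is invoked as something like `grammar::ExprParser::new().parse(Lexer::new(tokens))`; the exact names here are placeholders for whatever the grammar defines.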
