lucaspoffo/renet · Pull Request #173
[go: up one dir, main page]
More Web Proxy on the site http://driver.im/
Skip to content

RenetVisualizer: Fixes panic when max value is 0 in visualizer #173


Merged: 1 commit merged into lucaspoffo:master on Apr 28, 2025

Conversation

@jfto23 (Contributor) commented on Feb 23, 2025

This bug is reproducible in the bevy_demo example:

  1. Run cargo run --bin server --features netcode
  2. Make sure Show all clients is checked
  3. Run cargo run --bin client --features netcode

I get the following crash on the server:

2025-02-22T15:58:57.015140Z  INFO bevy diagnostic: frame_count: 3593.000000   (avg 3533.500000)
Player 1740239936136 connected.
thread 'main' panicked at C:\Users\JF\.cargo\registry\src\index.crates.io-6f17d22bba15001f\emath-0.29.1\src\lib.rs:148:5:
assertion failed: from.start() != from.end()
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
Encountered a panic in system `server::update_visulizer_system`!
2025-02-22T15:58:57.832135Z  WARN bevy_ecs::world::command_queue: CommandQueue has un-applied commands being dropped. Did you forget to call SystemState::apply?
Encountered a panic in system `bevy_app::main_schedule::Main::run_main`!
error: process didn't exit successfully: `F:\programming\renet\target\debug\server.exe` (exit code: 101)
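
For context, the assertion fires inside emath::remap, which requires a non-degenerate input range (from.start() != from.end()). When no data has arrived yet, the visualizer's max value is 0, so the input range collapses to 0.0..=0.0 and the assert trips. A minimal sketch of the failure mode and the kind of guard that avoids it follows; the function name and parameters are illustrative, not taken from the actual diff:

```rust
use emath::remap;

// Illustrative only: the visualizer's real function and field names differ.
fn bar_height(value: f32, max_value: f32, plot_height: f32) -> f32 {
    // emath::remap asserts from.start() != from.end(), so passing the
    // degenerate range 0.0..=0.0 (max_value == 0) hits the panic above.
    if max_value <= 0.0 {
        0.0 // nothing to draw yet; skip the degenerate remap
    } else {
        remap(value, 0.0..=max_value, 0.0..=plot_height)
    }
}
```

Early-returning (or clamping the range width to a small positive value) keeps the visualizer from feeding an empty range into remap before any samples have been recorded.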

@lucaspoffo merged commit 02c97e8 into lucaspoffo:master on Apr 28, 2025
@lucaspoffo (Owner) commented

Thanks!
