Oscillator synchronization #21
Cool! As you might've guessed, this happens because the clocks were started at different times and there's no way to reset them. A simple solution would be adding a global "Reset" button which resets everything, or do you have some other solution in mind? Your demo also gives me some non-critical node ideas:
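As a toy model of the behaviour described above (not Audionodes' actual implementation): if each oscillator derives its phase from its own free-running clock, two oscillators with identical settings still disagree whenever their clocks started at different times, and a global reset that restarts every clock at the same instant brings them back into phase.

```python
import math

class ToyOscillator:
    """Toy stand-in for an oscillator with its own free-running clock."""
    def __init__(self, freq, phase_offset=0.0, start_time=0.0):
        self.freq = freq
        self.phase_offset = phase_offset
        self.start_time = start_time  # when this oscillator's clock began

    def sample(self, now):
        # Phase depends on how long *this* oscillator has been running,
        # so equal phase offsets don't help if the clocks started apart.
        t = now - self.start_time
        return math.sin(2 * math.pi * self.freq * t + self.phase_offset)

    def reset(self, now):
        self.start_time = now

# Same settings, but the second clock started a quarter period later:
a = ToyOscillator(440.0, start_time=0.0)
b = ToyOscillator(440.0, start_time=0.25 / 440.0)
print(a.sample(1.0), b.sample(1.0))   # clearly out of phase

# A global "Reset" button would restart every clock at the same instant:
for osc in (a, b):
    osc.reset(1.0)
print(a.sample(2.0), b.sample(2.0))   # identical again
```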
Maybe you need some kind of time variable input into each of the oscillators, so that you can synchronize them and work with time more precisely.
Time could also be plugged into the noise node and everything else that creates waves.
This way you could use anything as time, including the "value" node with a keyframed time value.
And you could make repeating patterns with noise (using modulo on time).
The only problem I see with this is that keyframing time is a bad idea, because you usually render at 24 fps, which is a huge time step compared to the sound waves.
Maybe you could keyframe when the clock starts instead?
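A rough sketch of the shared-time idea above (plain NumPy, not Audionodes code): when every wave-producing node is driven by the same explicit time input, oscillators are phase-locked by construction, and wrapping that time axis with a modulo gives repeating noise patterns.

```python
import numpy as np

SAMPLE_RATE = 44100
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE      # one second of shared time

# Oscillators driven by the same time input are phase-locked by
# construction: at t = 0 both are at phase 0, no matter when the
# nodes themselves were created.
root  = np.sin(2 * np.pi * 220.0 * t)
fifth = np.sin(2 * np.pi * 330.0 * t)
chord = 0.5 * (root + fifth)

# "Repeated patterns with noise (using modulo on time)": reuse the same
# 0.1 s noise segment by wrapping the sample index.
rng = np.random.default_rng(0)
pattern = rng.uniform(-1.0, 1.0, int(0.1 * SAMPLE_RATE))
looped_noise = pattern[np.arange(t.size) % pattern.size]
```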
I've also thought about some kind of time signal system, but tying that to Blender's timeline seems really intriguing. I hadn't really thought of that. Ideally keyframed animations would be sampled in the backend (i.e. at full audio sampling frequency), such that the keyframe points and maybe even curve types are copied to the backend. This would also mean that the sound could be exactly the same each time, instead of having the inevitable jitter when the values are updated from Blender continually. Not to mention the ability to bounce the audio (render to a file faster than real-time).

Maybe play a MIDI file according to the time as well? I dream of a piano roll editor inside Blender, but that's probably not going to happen without an external window.

The Oscillator node is fairly cluttered already, so I was thinking this externally clocked version would be a separate node; not sure if that's a good idea. I feel like live MIDI-based stuff is still the main paradigm with Audionodes. But hey, this timeline paradigm could turn out really fun. I guess having a node that exposes the current Blender playback position would be a good start then.
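A minimal sketch of the Blender side of such a playback-position node (the push into the audio backend is only a placeholder; Audionodes' actual node and backend API aren't shown here):

```python
import bpy
from bpy.app.handlers import persistent

def playback_time_seconds(scene):
    """Current timeline position in seconds: frame / effective fps."""
    fps = scene.render.fps / scene.render.fps_base
    return scene.frame_current / fps

@persistent
def on_frame_change(scene, *args):
    # A timeline node would push this value to the audio backend, which
    # could then interpolate between updates at full audio rate.
    t = playback_time_seconds(scene)
    print(f"timeline position: {t:.3f} s")  # placeholder for a backend call

def register():
    bpy.app.handlers.frame_change_post.append(on_frame_change)

def unregister():
    if on_frame_change in bpy.app.handlers.frame_change_post:
        bpy.app.handlers.frame_change_post.remove(on_frame_change)
```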
I want these two oscillators to have the same phase offset, so that they are synchronized and form a chord.

Both phase offsets are set to 0, yet their playback is not synchronized at all.
Here's what I'm doing, just some fun experiments.
https://www.youtube.com/watch?v=EKgEFXkxCBg