
Synchronizing to music

Started by badbrad29 April 04, 2005 10:11 PM
4 comments, last by fractoid 19 years, 5 months ago
How do you synchronize demos and stuff to the music? What is the trick?
How I've done it in the past is by opening up my music file in a sound editor and noting the exact times at which the parts of the music I want to synchronise to occur. Then I render my scene with those times in mind.

E.g. In my LOTR Two Towers demo I had a single mp3 with all my music and sound effects. There were some visual effects I wanted to synchronise with sound effects and changes in the music (e.g. the horn shattering and the transitions in the music between different scenes). All I did was render each scene for exactly the duration I had noted in the sound editor and then immediately start the next. Implementation-wise, I just kept an "elapsed time" variable which I checked each frame, sort of like this:

void RenderScene(GLfloat deltaTime)
{
    // update the elapsed time with the "delta" (change) in time since
    // the last render
    m_elapsedTime += deltaTime;

    // render the intro scene for the first 32 seconds
    if (m_elapsedTime < 32.0f)
    {
        RenderIntroScene(deltaTime);
    }
    // render the ring scene for the next 13 seconds (32s to 45s)
    else if (m_elapsedTime < 45.0f)
    {
        RenderRingScene(deltaTime);
    }
    /* etc. etc. */
}


My actual implementation was a little more complicated but essentially worked like the code above. Check out the source of my demo if you want to see how I did it in more detail. Have a look at OpenGLRenderer.cpp. Apologies for the messy code though [smile]

There are probably better ways of doing this, but it worked out nicely for me and wasn't too complicated.

Hope that helps. [smile]

Michael Wallace
Alternatively, if you're using tracker-style music (.mod files or similar) then you can get the exact position from the music itself. It's broken up into rows and patterns, with each row made up of 64 patterns and each pattern of 64 sub-patterns (the naming here is loose; in standard tracker terms the song is an ordered list of patterns, and each pattern typically has 64 rows). A pattern is like your standard beat. If you're using something like BASS then you can query it directly for the current row/pat/subpat. I use this to, for instance, write a function such as:
bool DetectSnareDrum(int row, int pat, int subpat)
{
    // purely illustrative: trigger on odd rows near the end of the
    // pattern, on every fifth sub-pattern
    return ((row % 2) != 0 && (pat == 62 || pat == 63) && (subpat % 5) == 0);
}

// ...
if (row == 5)
    DrawSceneForRow5();
else if (DetectSnareDrum(row, pat, subpat))
    DrawThingThatOnlyAppearsWhenSnareDrumGoes();
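For reference, here's a minimal sketch of how you might pull the current position out of BASS each frame and feed it into checks like the ones above. It assumes a reasonably recent BASS (the BASS_POS_MUSIC_ORDER mode of BASS_ChannelGetPosition, which packs the order into the low 16 bits and the row into the high 16 bits); older versions expose the same information slightly differently, so check the docs for your version. HandleMusicPosition is a hypothetical dispatch function, not part of BASS:

#include "bass.h"

// Hypothetical dispatch: map the current position to demo events
// (mirrors the row/pat checks in the post above).
static void HandleMusicPosition(int order, int row)
{
    if (order == 5 && row == 0)
    {
        // e.g. start the scene that kicks off at order 5
    }
}

// Query BASS each frame for the current order/row of a playing module.
void UpdateMusicSync(HMUSIC music)
{
    // BASS_POS_MUSIC_ORDER packs the order into the low 16 bits and the
    // row into the high 16 bits (check the BASS docs for your version).
    QWORD pos = BASS_ChannelGetPosition(music, BASS_POS_MUSIC_ORDER);
    if (pos == (QWORD)-1)
        return; // position not available

    int order = (int)(pos & 0xFFFF);
    int row   = (int)((pos >> 16) & 0xFFFF);

    HandleMusicPosition(order, row);
}

Load the module with BASS_MusicLoad(), start it playing, and call UpdateMusicSync() once per frame from your main loop.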
Quote: Original post by badbrad29
How do you synchronize demos and stuff to the music? What is the trick?


There's support for this in the sound lib FMOD.

From the doc that comes with FMOD:
Quote: All you have to do is drop 'markers' into a wav editing program like SoundForge, and FMOD will automatically generate callbacks as the stream plays when the play cursor runs over the markers!
...
Note that only WAV files and MP3 with RIFF wrappers will work here. The markers are saved in a RIFF chunk.
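The way this usually gets wired up (the exact registration call depends on your FMOD version, so check its docs) is that FMOD invokes a callback you registered whenever the play cursor crosses a marker, passing along the marker's name. Here's a rough, hypothetical sketch of the receiving side; the marker names ("ring_scene", "horn_shatter") and the OnMusicMarker / ProcessPendingMusicEvents functions are made up for illustration and are not part of FMOD's API:

#include <cstring>

// Hypothetical sketch: whatever callback mechanism your FMOD version
// provides, the body usually just maps the marker name that was embedded
// in the WAV to a demo event.
enum DemoEvent { EVENT_NONE, EVENT_START_RING_SCENE, EVENT_HORN_SHATTER };

static volatile DemoEvent g_pendingEvent = EVENT_NONE;

// Called (from FMOD's marker/sync callback) with the name of the marker
// the play cursor just passed. The names are whatever you typed into
// SoundForge when you dropped the markers.
void OnMusicMarker(const char* markerName)
{
    if (std::strcmp(markerName, "ring_scene") == 0)
        g_pendingEvent = EVENT_START_RING_SCENE;
    else if (std::strcmp(markerName, "horn_shatter") == 0)
        g_pendingEvent = EVENT_HORN_SHATTER;
}

// Consumed once per frame on the main thread.
void ProcessPendingMusicEvents()
{
    switch (g_pendingEvent)
    {
    case EVENT_START_RING_SCENE: /* switch scenes */  break;
    case EVENT_HORN_SHATTER:     /* trigger effect */ break;
    default: break;
    }
    g_pendingEvent = EVENT_NONE;
}

Deferring the actual scene switch to the main loop keeps you from doing rendering work inside the audio callback.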

Don't these methods make the scene stay up for extra time on a faster machine, and not allow enough time to finish scenes on a slower one?

About this FMOD stuff: I'm new to FMOD and actually only first used it the other day. Could I get an explanation of how this would be done? It would be appreciated. I'll download the source and check it out; if I have any questions I'll ask. Thanks.
No. Methods based on frame counters, for instance, would vary with the speed of the computer. Time-based synchronization (such as Custard Slice's), on the other hand, runs at the same speed on any computer by definition. Row/pattern synchronization (if you're using a music format with that information built in) locks particular actions to particular positions in the music. Skanatic's approach is more flexible than either of these (you can, for instance, move scene start/end points or switch effects on and off directly from your music editor) but requires more work to set up; it's generally more artist-friendly and less programmer-friendly. A time-based sketch follows below.
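To make the time-based approach concrete: the deltaTime fed into something like the RenderScene() code earlier should come from a real clock rather than from counting frames, so that 32 seconds of intro is 32 wall-clock seconds on any machine. A minimal sketch using std::chrono (the FrameTimer class is made up for illustration; the RenderScene call in the comment refers to the earlier post):

#include <chrono>

// Minimal sketch: measure real elapsed time between frames so that
// time-based sync runs at the same speed on fast and slow machines.
class FrameTimer
{
public:
    FrameTimer() : m_last(std::chrono::steady_clock::now()) {}

    // Returns the number of seconds since the previous call.
    float Tick()
    {
        auto now = std::chrono::steady_clock::now();
        std::chrono::duration<float> delta = now - m_last;
        m_last = now;
        return delta.count();
    }

private:
    std::chrono::steady_clock::time_point m_last;
};

// In the main loop (RenderScene is the function from the earlier post):
//   FrameTimer timer;
//   while (running)
//       RenderScene(timer.Tick());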

