Understanding the sound architecture

Your applications can obtain sound data from three main sources:

  • External sound files loaded at run time

  • Sound resources embedded as assets within the application

  • Sound data generated dynamically at run time through the sampleData event handler
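
The third source, dynamic generation, works by attaching a listener for the sampleData event and writing raw sample values into the event's data buffer each time it fires. A minimal sketch, generating a 440 Hz sine tone (the frequency, amplitude, and 44.1 kHz sample rate are illustrative choices, not requirements):

```haxe
import openfl.events.SampleDataEvent;
import openfl.media.Sound;

class SineWaveExample {
    static function main():Void {
        var sound = new Sound();
        var phase = 0.0;
        sound.addEventListener(SampleDataEvent.SAMPLE_DATA, function(e:SampleDataEvent) {
            // Write a block of stereo samples for a 440 Hz sine wave.
            for (i in 0...8192) {
                var sample = Math.sin(phase) * 0.5;
                e.data.writeFloat(sample); // left channel
                e.data.writeFloat(sample); // right channel
                phase += 2 * Math.PI * 440 / 44100;
            }
        });
        // Playing a Sound with a sampleData listener pulls audio
        // from the handler instead of from a loaded file.
        sound.play();
    }
}
```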

Sound data can be fully loaded before it is played back, or it can be streamed, meaning that it is played back while it is still loading.
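
The difference comes down to when play() is called. A minimal sketch, assuming a hypothetical file named "music.mp3": waiting for the COMPLETE event gives fully loaded playback, while calling play() right after load() streams the sound as it buffers.

```haxe
import openfl.events.Event;
import openfl.media.Sound;
import openfl.net.URLRequest;

class LoadExample {
    static function main():Void {
        var sound = new Sound();
        sound.addEventListener(Event.COMPLETE, function(e:Event) {
            // Option 1: the sound is now fully loaded; playing here
            // guarantees no buffering gaps.
            trace("Loaded " + sound.length + " ms of audio");
        });
        sound.load(new URLRequest("music.mp3"));
        // Option 2: calling play() immediately streams the sound;
        // playback begins before loading has finished.
        var channel = sound.play();
    }
}
```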

The OpenFL sound classes support sound files that are stored in the mp3, ogg, or wav formats. They cannot directly load or play sound files in other formats, such as AAC or AIFF.

The OpenFL sound architecture makes use of the following classes in the openfl.media package:

  • openfl.media.Sound: Handles the loading of a sound, manages basic sound properties, and starts a sound playing.

  • openfl.media.SoundChannel: When an application plays a Sound object, a new SoundChannel object is created to control the playback. The SoundChannel object controls the volume of both the left and right playback channels of the sound. Each sound that plays has its own SoundChannel object.

  • openfl.media.SoundLoaderContext: Specifies how many milliseconds of buffering to use when loading a sound, and whether OpenFL looks for a policy file from the server when loading a file. A SoundLoaderContext object is passed as a parameter to the Sound.load() method.

  • openfl.media.SoundTransform: Contains values that control sound volume and panning. A SoundTransform object can be applied to an individual SoundChannel object, to the global SoundMixer object, or to a Microphone object, among others.

  • openfl.media.ID3Info: Contains properties that represent the ID3 metadata often stored in mp3 sound files.

Each sound that is loaded and played needs its own instance of the Sound class and the SoundChannel class.
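
A sketch showing how these classes fit together for one sound (the file name "effect.ogg" and the 5000 ms buffer value are hypothetical):

```haxe
import openfl.media.Sound;
import openfl.media.SoundChannel;
import openfl.media.SoundLoaderContext;
import openfl.media.SoundTransform;
import openfl.net.URLRequest;

class PlaybackExample {
    static function main():Void {
        // Each sound gets its own Sound and SoundChannel instance.
        var sound = new Sound();

        // Buffer 5000 ms of audio before streaming playback begins.
        var context = new SoundLoaderContext(5000);
        sound.load(new URLRequest("effect.ogg"), context);

        // play() returns the SoundChannel that controls this playback.
        var channel:SoundChannel = sound.play();

        // Half volume, panned fully to the left channel.
        channel.soundTransform = new SoundTransform(0.5, -1);
    }
}
```

Because every call to play() returns a fresh SoundChannel, the same Sound object can drive several overlapping playbacks, each with its own volume and pan settings.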
