swift - Realtime Audio with AVAudioEngine in iOS 8
Hi. I want to implement a realtime audio application with the new AVAudioEngine in Swift. Does anyone have experience with the new framework? How do real-time applications work with it?
My first thought was to store the (processed) input data into an AVAudioPCMBuffer object and then let an AVAudioPlayerNode play it, as you can see in my demo class:
    import AVFoundation

    class AudioIO {
        var audioEngine: AVAudioEngine
        var audioInputNode: AVAudioInputNode
        var audioPlayerNode: AVAudioPlayerNode
        var audioMixerNode: AVAudioMixerNode
        var audioBuffer: AVAudioPCMBuffer

        init() {
            audioEngine = AVAudioEngine()
            audioPlayerNode = AVAudioPlayerNode()
            audioMixerNode = audioEngine.mainMixerNode

            let frameLength = UInt32(256)
            audioBuffer = AVAudioPCMBuffer(pcmFormat: audioPlayerNode.outputFormatForBus(0), frameCapacity: frameLength)
            audioBuffer.frameLength = frameLength

            audioInputNode = audioEngine.inputNode

            audioInputNode.installTapOnBus(0, bufferSize: frameLength, format: audioInputNode.outputFormatForBus(0), block: { (buffer, time) in
                let channels = UnsafeArray(start: buffer.floatChannelData, length: Int(buffer.format.channelCount))
                let floats = UnsafeArray(start: channels[0], length: Int(buffer.frameLength))

                for var i = 0; i < Int(self.audioBuffer.frameLength); i += Int(self.audioMixerNode.outputFormatForBus(0).channelCount) {
                    // doing real-time stuff
                    self.audioBuffer.floatChannelData.memory[i] = floats[i]
                }
            })

            // set up the audio engine
            audioEngine.attachNode(audioPlayerNode)
            audioEngine.connect(audioPlayerNode, to: audioMixerNode, format: audioPlayerNode.outputFormatForBus(0))
            audioEngine.startAndReturnError(nil)

            // play the player node and loop the buffer
            audioPlayerNode.play()
            audioPlayerNode.scheduleBuffer(audioBuffer, atTime: nil, options: .Loops, completionHandler: nil)
        }
    }

But this is far away from real time and not very efficient. Any ideas or experiences? And it does not matter whether you prefer Objective-C or Swift; I am grateful for all notes, remarks, comments, solutions, etc.
I've been experimenting with AVAudioEngine in both Objective-C and Swift. In the Objective-C version of my engine, all audio processing is done purely in C (by caching the raw C sample pointers available through AVAudioPCMBuffer, and operating on the data with only C code). The performance is impressive. Out of curiosity, I ported this engine to Swift. With tasks like playing an audio file linearly, or generating tones via FM synthesis, the performance is quite good, but as soon as arrays are involved (e.g. with granular synthesis, where sections of audio are played back and manipulated in a non-linear fashion), there is a significant performance hit. Even with the best optimization, CPU usage was 30-40% greater than with the Objective-C/C version. I'm new to Swift, so perhaps there are other optimizations of which I am ignorant, but as far as I can tell, C/C++ is still the best choice for realtime audio. Also have a look at The Amazing Audio Engine. I'm considering this, as well as direct use of the older C API.
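To illustrate the pointer-based approach this answer describes, here is a minimal sketch in current Swift syntax (the iOS 8-era spellings were installTapOnBus:bufferSize:format:block: and outputFormatForBus:). It processes samples through the raw floatChannelData pointers inside a tap; the function name and the gain parameter are made up for the example, standing in for whatever per-sample work you actually do:

    import AVFoundation

    // Sketch only: apply a gain to every sample of every channel by walking
    // the raw C pointers that AVAudioPCMBuffer exposes. No Swift arrays are
    // created, so the inner loop is plain pointer arithmetic without bounds
    // checking, which is the optimization the answer above relies on.
    func installGainTap(on engine: AVAudioEngine, gain: Float) {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)

        input.installTap(onBus: 0, bufferSize: 256, format: format) { buffer, _ in
            // floatChannelData is an UnsafePointer<UnsafeMutablePointer<Float>>?:
            // one raw sample pointer per channel.
            guard let channels = buffer.floatChannelData else { return }
            let frames = Int(buffer.frameLength)

            for ch in 0..<Int(format.channelCount) {
                let samples = channels[ch]
                for i in 0..<frames {
                    samples[i] *= gain
                }
            }
        }
    }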
If you need to process live audio, then AVAudioEngine may not be for you. See my answer to this question: I want to call installTapOnBus:bufferSize:format:block: 20 times per second.
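The limitation this answer refers to is easy to observe: on iOS the bufferSize requested from installTap is, in practice, not honored below roughly 100 ms of audio, so the tap block cannot fire anywhere near 20 times per second. A small sketch (current Swift syntax, assumptions as noted in the comments) that simply prints the buffer size actually delivered:

    import AVFoundation

    // Sketch: request a tiny tap buffer and print what the engine actually
    // delivers. On iOS, asking for 256 frames typically still yields buffers
    // of around 100 ms (e.g. ~4800 frames at 48 kHz), which caps how often
    // the tap block can run.
    func demonstrateTapBufferSize() throws {
        let engine = AVAudioEngine()
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)

        input.installTap(onBus: 0, bufferSize: 256, format: format) { buffer, when in
            print("requested 256 frames, got \(buffer.frameLength) at sample time \(when.sampleTime)")
        }
        try engine.start()
    }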
 swift ios8 avfoundation core-audio avaudioengine 
 