To record audio on an iOS device, you can use the AVAudioRecorder class. Make sure you have added AVFoundation.framework (which provides AVAudioRecorder) and CoreAudio.framework to your target.
The AVAudioRecorder class in the AV Foundation framework makes recording audio on iOS simple. To start recording, you need to supply a few parameters to AVAudioRecorder's initWithURL:settings:error: method (a short sketch follows this list):
The URL at which to save the recording
This must be a local file URL. AV Foundation uses the URL's file extension to decide the audio format of the recorded file, so choose the extension carefully.
The settings to use before and during sampling
A dictionary holding the sample rate, number of channels, and other information the recorder needs before it starts recording.
An error pointer that is filled in if initialization fails
You can inspect this value to find out what went wrong.
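As a rough sketch, the call looks like the following; recorderSettings here is a placeholder for the settings dictionary discussed next, and Recording.m4a is simply the file name this section happens to use:

NSArray *folders = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *path = [[folders objectAtIndex:0] stringByAppendingPathComponent:@"Recording.m4a"];
NSURL *recordingURL = [NSURL fileURLWithPath:path];   /* must be a local file URL */

NSError *error = nil;
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:recordingURL
                                                        settings:recorderSettings /* see the keys below */
                                                           error:&error];
if (recorder == nil) {
    /* Initialization failed; the error variable describes what went wrong */
    NSLog(@"Could not create the recorder: %@", error);
}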
The settings parameter of initWithURL:settings:error: is the interesting one. Many keys can go into this dictionary, but in this section we only cover the most important ones (a sketch of such a dictionary follows the list):
AVFormatIDKey
The format of the recording. Possible values include:
• kAudioFormatLinearPCM
• kAudioFormatAppleLossless
AVSampleRateKey
The sample rate of the recorded audio.
AVNumberOfChannelsKey
The number of channels in the recorded audio.
AVEncoderAudioQualityKey
The quality of the recorded audio. Possible values are:
• AVAudioQualityMin
• AVAudioQualityLow
• AVAudioQualityMedium
• AVAudioQualityHigh
• AVAudioQualityMax
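For example, a settings dictionary asking for Apple Lossless, a 44.1 kHz sample rate, a single channel, and low encoder quality could be built as follows. This is only a sketch that mirrors the audioRecordingSettings method in the full listing below, and it assumes the AVFoundation and CoreAudio imports shown in the header file:

NSDictionary *recorderSettings = @{
    AVFormatIDKey            : [NSNumber numberWithInteger:kAudioFormatAppleLossless],
    AVSampleRateKey          : [NSNumber numberWithFloat:44100.0f],
    AVNumberOfChannelsKey    : [NSNumber numberWithInteger:1],
    AVEncoderAudioQualityKey : [NSNumber numberWithInteger:AVAudioQualityLow],
};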
With all this information in hand, we can write an app that records an audio file and then plays it back with AVAudioPlayer. Specifically, we will:
1. Record the audio in Apple Lossless format.
2. Save the recording to a file named Recording.m4a in the app's Documents directory.
3. Stop recording 10 seconds after it starts and immediately begin playing back the recorded audio.
The code. First, the header file:
#import <UIKit/UIKit.h>
#import <CoreAudio/CoreAudioTypes.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController <AVAudioPlayerDelegate, AVAudioRecorderDelegate>

@property (nonatomic, strong) AVAudioRecorder *audioRecorder;
@property (nonatomic, strong) AVAudioPlayer *audioPlayer;

- (NSString *)audioRecordingPath;
- (NSDictionary *)audioRecordingSettings;

@end
Implementation file:
//
//  ViewController.m
//  录制音频
//
//  Created by Rio.King on 13-11-2.
//  Copyright (c) 2013 Rio.King. All rights reserved.
//

#import "ViewController.h"

@interface ViewController ()

@end

@implementation ViewController

- (void)viewDidLoad{
    [super viewDidLoad];

    NSError *error = nil;
    NSString *pathAsString = [self audioRecordingPath];
    NSURL *audioRecordingURL = [NSURL fileURLWithPath:pathAsString];
    self.audioRecorder = [[AVAudioRecorder alloc] initWithURL:audioRecordingURL
                                                     settings:[self audioRecordingSettings]
                                                        error:&error];
    if (self.audioRecorder != nil) {
        self.audioRecorder.delegate = self;

        /* Prepare the recorder and then start the recording */
        if ([self.audioRecorder prepareToRecord] && [self.audioRecorder record]) {
            NSLog(@"Successfully started to record.");

            /* After ten seconds, let's stop the recording process */
            [self performSelector:@selector(stopRecordingOnAudioRecorder:)
                       withObject:self.audioRecorder
                       afterDelay:10.0f];
        } else {
            NSLog(@"Failed to record.");
            self.audioRecorder = nil;
        }
    } else {
        NSLog(@"Failed to create an instance of the audio recorder.");
    }
}

- (NSString *)audioRecordingPath{
    NSString *result = nil;
    NSArray *folders = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsFolder = [folders objectAtIndex:0];
    result = [documentsFolder stringByAppendingPathComponent:@"Recording.m4a"];
    return result;
}

- (NSDictionary *)audioRecordingSettings{
    NSDictionary *result = nil;

    /* Let's prepare the audio recorder options in the dictionary.
       Later we will use this dictionary to instantiate an audio
       recorder of type AVAudioRecorder */
    NSMutableDictionary *settings = [[NSMutableDictionary alloc] init];
    [settings setValue:[NSNumber numberWithInteger:kAudioFormatAppleLossless] forKey:AVFormatIDKey];
    [settings setValue:[NSNumber numberWithFloat:44100.0f] forKey:AVSampleRateKey];
    [settings setValue:[NSNumber numberWithInteger:1] forKey:AVNumberOfChannelsKey];
    [settings setValue:[NSNumber numberWithInteger:AVAudioQualityLow] forKey:AVEncoderAudioQualityKey];

    result = [NSDictionary dictionaryWithDictionary:settings];
    return result;
}

- (void)stopRecordingOnAudioRecorder:(AVAudioRecorder *)paramRecorder{
    /* Just stop the audio recorder here */
    [paramRecorder stop];
}

- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder successfully:(BOOL)flag{
    if (flag) {
        NSLog(@"Successfully stopped the audio recording process.");

        /* Let's try to retrieve the data for the recorded file */
        NSError *playbackError = nil;
        NSError *readingError = nil;
        NSData *fileData = [NSData dataWithContentsOfFile:[self audioRecordingPath]
                                                  options:NSDataReadingMapped
                                                    error:&readingError];

        /* Form an audio player and make it play the recorded data */
        self.audioPlayer = [[AVAudioPlayer alloc] initWithData:fileData error:&playbackError];

        /* Could we instantiate the audio player? */
        if (self.audioPlayer != nil) {
            self.audioPlayer.delegate = self;

            /* Prepare to play and start playing */
            if ([self.audioPlayer prepareToPlay] && [self.audioPlayer play]) {
                NSLog(@"Started playing the recorded audio.");
            } else {
                NSLog(@"Could not play the audio.");
            }
        } else {
            NSLog(@"Failed to create an audio player.");
        }
    } else {
        NSLog(@"Stopping the audio recording failed.");
    }

    /* Here we don't need the audio recorder anymore */
    self.audioRecorder = nil;
}

- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag{
    if (flag) {
        NSLog(@"Audio player stopped correctly.");
    } else {
        NSLog(@"Audio player did not stop correctly.");
    }

    if ([player isEqual:self.audioPlayer]) {
        self.audioPlayer = nil;
    } else {
        /* This is not the player */
    }
}

- (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player{
    /* The audio session has been deactivated here */
}

- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player withFlags:(NSUInteger)flags{
    if (flags == AVAudioSessionInterruptionFlags_ShouldResume) {
        [player play];
    }
}

#pragma mark - Handling interruptions during audio recording

- (void)audioRecorderBeginInterruption:(AVAudioRecorder *)recorder{
    NSLog(@"Recording process is interrupted");
}

- (void)audioRecorderEndInterruption:(AVAudioRecorder *)recorder withFlags:(NSUInteger)flags{
    if (flags == AVAudioSessionInterruptionFlags_ShouldResume) {
        NSLog(@"Resuming the recording...");
        [recorder record];
    }
}

- (void)didReceiveMemoryWarning{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end
Output:
2013-11-02 21:01:52.144 录制音频[1786:a0b] Successfully started to record.
2013-11-02 21:02:02.146 录制音频[1786:a0b] Successfully stopped the audio recording process.
2013-11-02 21:02:02.160 录制音频[1786:a0b] Started playing the recorded audio.
2013-11-02 21:02:12.128 录制音频[1786:a0b] Audio player stopped correctly.
To play an audio file in your app, you can use the AVAudioPlayer class from the AV Foundation framework (the Audio and Video framework).
Playing audio:

- (void)viewDidLoad{
    [super viewDidLoad];

    dispatch_queue_t dispatchQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    dispatch_async(dispatchQueue, ^(void){
        NSBundle *mainBundle = [NSBundle mainBundle];
        NSString *filePath = [mainBundle pathForResource:@"慢慢" ofType:@"mp3"];
        NSData *fileData = [NSData dataWithContentsOfFile:filePath];

        NSError *error = nil;

        /* Start the audio player */
        self.audioPlay = [[AVAudioPlayer alloc] initWithData:fileData error:&error];

        /* Did we get an instance of AVAudioPlayer? */
        if (self.audioPlay != nil) {
            /* Set the delegate and start playing */
            self.audioPlay.delegate = self;
            if ([self.audioPlay prepareToPlay] && [self.audioPlay play]) {
                /* Successfully started playing */
            } else {
                /* Failed to play */
            }
        } else {
            /* Failed to instantiate AVAudioPlayer */
        }
    });
}
In viewDidLoad, we use GCD to load the song's data into an NSData instance asynchronously and then hand that data to the audio player. We do this because loading audio files of arbitrary length can take a long time; doing it on the main thread would risk degrading the UI experience. So we dispatch the work onto a global concurrent queue, which guarantees the code does not run on the main thread.
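If you need to update the UI once playback starts (for example, a label), that work must go back to the main queue. A minimal sketch of the pattern, assuming a hypothetical statusLabel outlet on the view controller:

dispatch_async(dispatchQueue, ^(void){
    /* ... load the file data and start self.audioPlay, as shown above ... */

    dispatch_async(dispatch_get_main_queue(), ^(void){
        /* UIKit must only be touched on the main thread */
        self.statusLabel.text = @"Playing...";   /* statusLabel is a hypothetical outlet */
    });
});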
Handling interruptions during audio playback
You want your AVAudioPlayer instance to resume playing after it has been interrupted, for example by an incoming phone call. Implement the relevant delegate methods:
#pragma mark - Handling interruptions during audio playback

- (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player{
    /* The audio session has been interrupted. The player is paused at this point. */
}

- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player withFlags:(NSUInteger)flags{
    if (flags == AVAudioSessionInterruptionFlags_ShouldResume && player != nil) {
        [player play];
    }
}
Of course, the simulator cannot simulate an incoming-call interruption; you have to debug this on a real device.