This app was developed and works fine under iOS 5.0, but crashes under iOS 4.3







Maybe you can try the code snippet below:

NSError *audio_session_err = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&audio_session_err];
[[AVAudioSession sharedInstance] setDelegate:self];
NSLog(@"!");

UInt32 audioRouteOverride = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryDefaultToSpeaker,sizeof (audioRouteOverride),&audioRouteOverride);
UInt32 allowMixing = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(allowMixing), &allowMixing);    

if (audio_session_err) {
  NSLog(@"audioSession: %@ %d %@", [audio_session_err domain], [audio_session_err code], [audio_session_err description]);
} else {
  audio_session_err = nil;
  [[AVAudioSession sharedInstance] setActive:YES error:&audio_session_err];
  if (!audio_session_err) NSLog(@"audio session is activated successfully");
}

I think audio_session = [[AVAudioSession sharedInstance] retain]; dispatches setMode: behind the scenes, and setMode: is only available in iOS 5.0 and later (see the documentation).

Or you can try to comment out the code:

UInt32 audioRouteOverride = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryDefaultToSpeaker,sizeof (audioRouteOverride),&audioRouteOverride);
UInt32 allowMixing = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(allowMixing), &allowMixing);

There must be some method that dispatches setMode: behind the scenes. Try it yourself. :p
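
If you do keep those two AudioSessionSetProperty calls, it may also be worth checking their return values. This is only a small sketch (not part of the original code): AudioSessionSetProperty returns an OSStatus, so logging it tells you whether one of the calls is failing on 4.3.

UInt32 audioRouteOverride = 1;
OSStatus overrideStatus = AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryDefaultToSpeaker,
                                                  sizeof(audioRouteOverride), &audioRouteOverride);
if (overrideStatus != kAudioSessionNoError) {
    // A non-zero status means the property could not be set on this OS version.
    NSLog(@"OverrideCategoryDefaultToSpeaker failed: %ld", (long)overrideStatus);
}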


Try info malloc 0x706a7f0 in gdb to find out which object the selector was sent to. Note that 0x706a7f0 is the address shown in your crash output, the one quoted in your first code snippet.

One more tip: you might do a clean (Product > Clean) and rebuild.

I developed an iPhone app under iOS 5.0, and it works fine. But when it runs on iOS 4.3 (Base SDK = latest iOS 5.0, compiler = Apple LLVM 3.0, Deployment Target = iOS 4.3), it crashes after launching.

The output around crash point looks like:

2011-12-06 16:25:08.177 FMWei[466:c203] -[AVAudioSession setMode:error:]: unrecognized selector sent to instance 0x706a7f0
2011-12-06 16:25:08.181 FMWei[466:c203] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[AVAudioSession setMode:error:]: unrecognized selector sent to instance 0x706a7f0'

It looks like AVAudioSession doesn't have a method named setMode:error: at the point where it is invoked. But what's strange is that I never call any method named setMode:error:. The audio-related code is:

audio_session = [[AVAudioSession sharedInstance] retain];
audio_session_err = nil;
[audio_session setCategory: AVAudioSessionCategoryPlayAndRecord error:&audio_session_err];
NSLog(@"!");

UInt32 audioRouteOverride = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryDefaultToSpeaker,sizeof (audioRouteOverride),&audioRouteOverride);
UInt32 allowMixing = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(allowMixing), &allowMixing);    

if (audio_session_err) 
{
    NSLog(@"audioSession: %@ %d %@", [audio_session_err domain], [audio_session_err code], [audio_session_err description]);
}
else
{
    audio_session_err = nil;
    [audio_session setActive:YES error:&audio_session_err];
    if (!audio_session_err) NSLog(@"audio session is activated successfully");
}

Please help me figure out why it crashes under iOS 4.3 with the strange error. Thank you!




At runtime, many methods that don't appear in your code are invoked behind the scenes as a result of the API calls you make.

I would focus not on the method that is being called, but on why the object it is sent to is unable to respond to the selector. The object could have been cast as the wrong type, and so is not inheriting the right methods. (In the code snippet you show, you don't explicitly cast AVAudioSession *audio_session.) The other direction is to check that you're not using some other API call that is iOS 5 only, which in the background is calling this method and thus generating the error.
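
If you want to confirm what that instance really is before it crashes, a couple of log statements right after the session is obtained can help. This is just a debugging sketch using the audio_session variable from the question:

// Debugging aid (sketch): check the runtime class of audio_session and whether
// it claims to respond to the selector from the crash log.
NSLog(@"audio_session is a %@", NSStringFromClass([audio_session class]));
NSLog(@"responds to setMode:error:? %d",
      (int)[audio_session respondsToSelector:@selector(setMode:error:)]);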

Finally, if you've only recently changed your build target to include iOS 4.3, you may simply need to do a clean build (Product > Clean) so that it compiles iOS 4.3-compatible code.




How do I record audio on iPhone with AVAudioRecorder?

Actually, there are no examples at all. Here is my working code. Recording is triggered by the user pressing a button on the navBar. The recording uses CD quality (44100 Hz), stereo (2 channels), linear PCM.

Beware: if you want to use a different format, especially an encoded one, make sure you fully understand how to set the AVAudioRecorder settings (read the audio types documentation carefully), otherwise you will never be able to initialize it correctly.

One more thing: in the code I am not showing how to handle metering data, but you can figure that out easily. Finally, note that the AVAudioRecorder method deleteRecording, as of this writing, crashes your application; that is why I remove the recorded file through the file manager instead. When recording is done, I save the recorded audio as NSData in the currently edited object using KVC.

#define DOCUMENTS_FOLDER [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"]


- (void) startRecording{

UIBarButtonItem *stopButton = [[UIBarButtonItem alloc] initWithTitle:@"Stop" style:UIBarButtonItemStyleBordered  target:self action:@selector(stopRecording)];
self.navigationItem.rightBarButtonItem = stopButton;
[stopButton release];

AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError *err = nil;
[audioSession setCategory :AVAudioSessionCategoryPlayAndRecord error:&err];
if(err){
    NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
    return;
}
err = nil;
[audioSession setActive:YES error:&err];
if(err){
    NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
    return;
}

recordSetting = [[NSMutableDictionary alloc] init];

[recordSetting setValue :[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
[recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey]; 
[recordSetting setValue:[NSNumber numberWithInt: 2] forKey:AVNumberOfChannelsKey];

[recordSetting setValue :[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
[recordSetting setValue :[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
[recordSetting setValue :[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];



// Create a new dated file
NSDate *now = [NSDate dateWithTimeIntervalSinceNow:0];
NSString *caldate = [now description];
recorderFilePath = [[NSString stringWithFormat:@"%@/%@.caf", DOCUMENTS_FOLDER, caldate] retain];

NSURL *url = [NSURL fileURLWithPath:recorderFilePath];
err = nil;
recorder = [[ AVAudioRecorder alloc] initWithURL:url settings:recordSetting error:&err];
if(!recorder){
    NSLog(@"recorder: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
    UIAlertView *alert =
    [[UIAlertView alloc] initWithTitle: @"Warning"
                               message: [err localizedDescription]
                              delegate: nil
                     cancelButtonTitle:@"OK"
                     otherButtonTitles:nil];
    [alert show];
    [alert release];
    return;
}

//prepare to record
[recorder setDelegate:self];
[recorder prepareToRecord];
recorder.meteringEnabled = YES;

BOOL audioHWAvailable = audioSession.inputIsAvailable;
if (! audioHWAvailable) {
    UIAlertView *cantRecordAlert =
    [[UIAlertView alloc] initWithTitle: @"Warning"
                               message: @"Audio input hardware not available"
                              delegate: nil
                     cancelButtonTitle:@"OK"
                     otherButtonTitles:nil];
    [cantRecordAlert show];
    [cantRecordAlert release]; 
    return;
}

// start recording
[recorder recordForDuration:(NSTimeInterval) 10];

}

- (void) stopRecording{

[recorder stop];

NSURL *url = [NSURL fileURLWithPath: recorderFilePath];
NSError *err = nil;
NSData *audioData = [NSData dataWithContentsOfFile:[url path] options: 0 error:&err];
if(!audioData)
    NSLog(@"audio data: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
[editedObject setValue:audioData forKey:editedFieldKey]; // reuse the NSData loaded above instead of reading the file a second time

//[recorder deleteRecording];


NSFileManager *fm = [NSFileManager defaultManager];

err = nil;
[fm removeItemAtPath:[url path] error:&err];
if(err)
    NSLog(@"File Manager: %@ %d %@", [err domain], [err code], [[err userInfo] description]);



UIBarButtonItem *startButton = [[UIBarButtonItem alloc] initWithTitle:@"Record" style:UIBarButtonItemStyleBordered  target:self action:@selector(startRecording)];
self.navigationItem.rightBarButtonItem = startButton;
[startButton release];

}

- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *) aRecorder successfully:(BOOL)flag
{

NSLog (@"audioRecorderDidFinishRecording:successfully:");
// your actions here

}
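
For completeness, the controller that owns these methods should adopt AVAudioRecorderDelegate (it calls [recorder setDelegate:self]) and declare the ivars used above. A minimal header sketch, with a hypothetical class name:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

// Hypothetical controller name; the ivars mirror what the code above uses.
@interface RecordingViewController : UIViewController <AVAudioRecorderDelegate> {
    AVAudioRecorder *recorder;
    NSMutableDictionary *recordSetting;
    NSString *recorderFilePath;
    id editedObject;          // object that receives the recorded NSData via KVC
    NSString *editedFieldKey; // key used with setValue:forKey:
}
- (void)startRecording;
- (void)stopRecording;
@end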



I have uploaded a sample project. You can take a look.

VoiceRecorder




Unrecognized selector sent to instance error from Utility App navigation controller view

You incorrectly assigned the type UIViewController to your custom view controller. You should select the class type that actually applies to it (the one that holds the implementation of sendMail).

The problem with your code/setup is that your custom view controller gets instantiated as a plain UIViewController. UIViewController does not implement any method called sendMail, hence the exception.

As you are not specifying the class name of your custom view controller, I will simply assume one for the sake of this answer: MyCustomViewController.

Since you appear to use Interface Builder to set this up, use it to change the class of your view controller to MyCustomViewController.

EDIT

From your comments, I can see that you actually instantiate the view controller in code. In that case, replace your version with this:

MyCustomViewController *controller = [[MyCustomViewController alloc] initWithNibName:@"ExMobSendFeedback" bundle:nil]; 
controller.title = @"Feedback"; 
[self.navigationController pushViewController:controller animated:YES];
[controller release];
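
For reference, here is a minimal sketch of what MyCustomViewController needs to declare so that the sendMail action can actually be resolved at runtime (the selector name comes from the question; everything else is assumed):

#import <UIKit/UIKit.h>

@interface MyCustomViewController : UIViewController
// Without this method on the class that is actually instantiated,
// the runtime throws "unrecognized selector sent to instance".
- (IBAction)sendMail;
@end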



Although this is an answered question (and kind of old), I have decided to post my full working code for others who found it hard to track down a good, working (out of the box) playing and recording example, including encoded and PCM formats, playing via the speaker, and writing to a file. Here it is:

AudioPlayerViewController.h:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface AudioPlayerViewController : UIViewController {
AVAudioPlayer *audioPlayer;
AVAudioRecorder *audioRecorder;
int recordEncoding;
enum
{
    ENC_AAC = 1,
    ENC_ALAC = 2,
    ENC_IMA4 = 3,
    ENC_ILBC = 4,
    ENC_ULAW = 5,
    ENC_PCM = 6,
} encodingTypes;
}

-(IBAction) startRecording;
-(IBAction) stopRecording;
-(IBAction) playRecording;
-(IBAction) stopPlaying;

@end

AudioPlayerViewController.m:

#import "AudioPlayerViewController.h"

@implementation AudioPlayerViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    recordEncoding = ENC_AAC;
}

-(IBAction) startRecording
{
NSLog(@"startRecording");
[audioRecorder release];
audioRecorder = nil;

// Init audio with record capability
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryRecord error:nil];

NSMutableDictionary *recordSettings = [[NSMutableDictionary alloc] initWithCapacity:10];
if(recordEncoding == ENC_PCM)
{
    [recordSettings setObject:[NSNumber numberWithInt: kAudioFormatLinearPCM] forKey: AVFormatIDKey];
    [recordSettings setObject:[NSNumber numberWithFloat:44100.0] forKey: AVSampleRateKey];
    [recordSettings setObject:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
    [recordSettings setObject:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
    [recordSettings setObject:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
    [recordSettings setObject:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];   
}
else
{
    NSNumber *formatObject;

    switch (recordEncoding) {
        case (ENC_AAC): 
            formatObject = [NSNumber numberWithInt: kAudioFormatMPEG4AAC];
            break;
        case (ENC_ALAC):
            formatObject = [NSNumber numberWithInt: kAudioFormatAppleLossless];
            break;
        case (ENC_IMA4):
            formatObject = [NSNumber numberWithInt: kAudioFormatAppleIMA4];
            break;
        case (ENC_ILBC):
            formatObject = [NSNumber numberWithInt: kAudioFormatiLBC];
            break;
        case (ENC_ULAW):
            formatObject = [NSNumber numberWithInt: kAudioFormatULaw];
            break;
        default:
            formatObject = [NSNumber numberWithInt: kAudioFormatAppleIMA4];
    }

    [recordSettings setObject:formatObject forKey: AVFormatIDKey];
    [recordSettings setObject:[NSNumber numberWithFloat:44100.0] forKey: AVSampleRateKey];
    [recordSettings setObject:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
    [recordSettings setObject:[NSNumber numberWithInt:12800] forKey:AVEncoderBitRateKey];
    [recordSettings setObject:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
    [recordSettings setObject:[NSNumber numberWithInt: AVAudioQualityHigh] forKey: AVEncoderAudioQualityKey];
}

NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/recordTest.caf", [[NSBundle mainBundle] resourcePath]]];


NSError *error = nil;
audioRecorder = [[ AVAudioRecorder alloc] initWithURL:url settings:recordSettings error:&error];

if ([audioRecorder prepareToRecord] == YES){
    [audioRecorder record];
}else {
    int errorCode = CFSwapInt32HostToBig ([error code]); 
    NSLog(@"Error: %@ [%4.4s])" , [error localizedDescription], (char*)&errorCode); 

}
NSLog(@"recording");
}

-(IBAction) stopRecording
{
NSLog(@"stopRecording");
[audioRecorder stop];
NSLog(@"stopped");
}

-(IBAction) playRecording
{
NSLog(@"playRecording");
// Init audio with playback capability
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayback error:nil];

NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/recordTest.caf", [[NSBundle mainBundle] resourcePath]]];
NSError *error;
audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
audioPlayer.numberOfLoops = 0;
[audioPlayer play];
NSLog(@"playing");
}

-(IBAction) stopPlaying
{
NSLog(@"stopPlaying");
[audioPlayer stop];
NSLog(@"stopped");
}

- (void)dealloc
{
[audioPlayer release];
[audioRecorder release];
[super dealloc];
}

@end

Hope this will help some of you guys.




The AVAudioSession method

- (BOOL)setMode:(NSString *)theMode error:(NSError **)outError

is marked in the documentation as available only in iOS 5 and later. In fact, given the recent addition of modes to the documentation, it looks like audio session modes are not available at all prior to iOS 5.
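
If you want to set a mode on iOS 5 while still supporting 4.3, the usual pattern is to check for the selector (and the weak-linked constant) at runtime before calling it. A minimal sketch, assuming the default mode is what you want on iOS 5:

AVAudioSession *session = [AVAudioSession sharedInstance];
// setMode:error: and AVAudioSessionModeDefault exist only on iOS 5 and later,
// so guard both before touching them; on 4.3 this branch is simply skipped.
if ([session respondsToSelector:@selector(setMode:error:)] && &AVAudioSessionModeDefault != NULL) {
    NSError *modeError = nil;
    [session setMode:AVAudioSessionModeDefault error:&modeError];
}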