Audio playback fails - E/android.media.AudioTrack: Front channels must be present in multichannel configurations



I have rarely used Android's audio recording classes before, so I don't know much about this area.

I wrote a small app that records audio in the background and then plays it back as PCM (I'm running some tests to see how much battery the microphone uses when it runs in the background). But when I try to run the play() method, I get these logcat errors:

11-03 00:20:05.744  18248-18248/com.bacon.corey.audiotimeshift E/android.media.AudioTrack﹕ Front channels must be present in multichannel configurations
11-03 00:20:05.748  18248-18248/com.bacon.corey.audiotimeshift E/AudioTrack﹕ Playback Failed

I have googled these errors but can't seem to find any information about them.

If anyone wouldn't mind giving me some pointers, I would really appreciate it.

Here is the code for the app (it's very sloppy and unfinished, since it's only meant for testing battery life):

public class MainActivity extends ActionBarActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        if (savedInstanceState == null) {
            getSupportFragmentManager().beginTransaction()
                    .add(R.id.container, new PlaceholderFragment())
                    .commit();
        }
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        // Inflate the menu; this adds items to the action bar if it is present.
        getMenuInflater().inflate(R.menu.menu_main, menu);
        return true;
    }
    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        // Handle action bar item clicks here. The action bar will
        // automatically handle clicks on the Home/Up button, so long
        // as you specify a parent activity in AndroidManifest.xml.
        int id = item.getItemId();
        //noinspection SimplifiableIfStatement
        if (id == R.id.action_settings) {
            return true;
        }
        return super.onOptionsItemSelected(item);
    }
    /**
     * A placeholder fragment containing a simple view.
     */
    public static class PlaceholderFragment extends Fragment {
        public PlaceholderFragment() {
        }
        @Override
        public View onCreateView(LayoutInflater inflater, ViewGroup container,
                                 Bundle savedInstanceState) {
            View rootView = inflater.inflate(R.layout.fragment_main, container, false);
            return rootView;
        }
    }
    public void play(View view) {
        Toast.makeText(this, "play", Toast.LENGTH_SHORT).show();
        // Get the file we want to playback.
        File file = new File(Environment.getExternalStorageDirectory() + File.separator + "ACS.pcm");
        // Get the length of the audio stored in the file (16 bit so 2 bytes per short)
        // and create a short array to store the recorded audio.
        int musicLength = (int)(file.length()/2);
        short[] music = new short[musicLength];

        try {
            // Create a DataInputStream to read the audio data back from the saved file.
            InputStream is = new FileInputStream(file);
            BufferedInputStream bis = new BufferedInputStream(is);
            DataInputStream dis = new DataInputStream(bis);
            // Read the file into the music array.
            int i = 0;
            while (dis.available() > 0) {
                music[musicLength-1-i] = dis.readShort();
                i++;
            }

            // Close the input streams.
            dis.close();

            // Create a new AudioTrack object using the same parameters as the AudioRecord
            // object used to create the file.
            AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
                    11025,
                    AudioFormat.CHANNEL_OUT_MONO,
                    AudioFormat.ENCODING_PCM_16BIT,
                    musicLength,
                    AudioTrack.MODE_STREAM);
            // Start playback
            audioTrack.play();
            // Write the music buffer to the AudioTrack object
            audioTrack.write(music, 0, musicLength);

        } catch (Throwable t) {
            Log.e("AudioTrack","Playback Failed");
        }
    }
    public void record(View view){
        Toast.makeText(this, "record", Toast.LENGTH_SHORT).show();
        Log.v("ACS", "OnCreate called");
        Intent intent = new Intent(this, ACS.class);
        startService(intent);
    }
    public void stop(View view){
        Toast.makeText(this, "stop", Toast.LENGTH_SHORT).show();
        Intent intent = new Intent(this, ACS.class);
        stopService(intent);
    }
}

public class ACS extends IntentService {
    AudioRecord audioRecord;
    public ACS() {
        super("ACS");
    }
    @Override
    protected void onHandleIntent(Intent intent) {
        Log.v("ACS", "ACS called");
        record();
    }
    public void record() {
        Log.v("ACS", "Record started");
        int frequency = 11025;
        int channelConfiguration = AudioFormat.CHANNEL_IN_MONO;
        int audioEncoding = AudioFormat.ENCODING_PCM_16BIT;
        File file = new File(Environment.getExternalStorageDirectory() + File.separator + "ACS.pcm");
        // Delete any previous recording.
        if (file.exists())
            file.delete();

        // Create the new file.
        try {
            file.createNewFile();
        } catch (IOException e) {
            throw new IllegalStateException("Failed to create " + file.toString());
        }
        try {
            // Create a DataOutputStream to write the audio data into the saved file.
            OutputStream os = new FileOutputStream(file);
            BufferedOutputStream bos = new BufferedOutputStream(os);
            DataOutputStream dos = new DataOutputStream(bos);
            // Create a new AudioRecord object to record the audio.
            int bufferSize = AudioRecord.getMinBufferSize(frequency, channelConfiguration, audioEncoding);
            audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
                    frequency, channelConfiguration,
                    audioEncoding, bufferSize);
            short[] buffer = new short[bufferSize];
            audioRecord.startRecording();

            while (audioRecord.getRecordingState() == AudioRecord.RECORDSTATE_RECORDING) {
                int bufferReadResult = audioRecord.read(buffer, 0, bufferSize);
                for (int i = 0; i < bufferReadResult; i++)
                    dos.writeShort(buffer[i]);
            }

            audioRecord.stop();
            dos.close();
        } catch (Throwable t) {
            Log.e("AudioRecord", "Recording Failed");
        }
        Log.v("ACS", "Record stopped");
    }
    public void onDestroy(){
        audioRecord.stop();
        Log.v("ACS", "onDestroy called, Record stopped");
    }
}

Thanks in advance

Corey :)

I had the same error message "android.media.AudioTrack: Front channels must be present in multichannel configurations".

The error message went away when I changed the audio setting from AudioFormat.CHANNEL_OUT_MONO to AudioFormat.CHANNEL_IN_MONO. (Or you can try a different configuration, such as AudioFormat.CHANNEL_IN_STEREO.)

AudioTrack audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC,
                    11025,
                    AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT,
                    musicLength,
                    AudioTrack.MODE_STREAM);

But I don't know why this happens. An explanation would be appreciated.
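
One possible explanation, assuming the stock AOSP values of these constants: the CHANNEL_IN_* and CHANNEL_OUT_* masks share a single integer namespace, so AudioFormat.CHANNEL_IN_MONO happens to alias a valid output position (CHANNEL_OUT_FRONT_CENTER), and AudioFormat.CHANNEL_IN_STEREO is numerically identical to CHANNEL_OUT_STEREO. A minimal sketch that just logs the raw values so you can verify this on your own device:

    import android.media.AudioFormat;
    import android.util.Log;

    // Sketch only: log the raw channel-mask constants to see how the IN and OUT
    // flavours overlap numerically (values assumed from stock AOSP).
    public final class ChannelMaskDump {
        private ChannelMaskDump() {}

        public static void dump() {
            Log.d("ChannelMaskDump", "CHANNEL_OUT_MONO         = 0x"
                    + Integer.toHexString(AudioFormat.CHANNEL_OUT_MONO));         // 0x4 (front left only)
            Log.d("ChannelMaskDump", "CHANNEL_IN_MONO          = 0x"
                    + Integer.toHexString(AudioFormat.CHANNEL_IN_MONO));          // 0x10, same bit as FRONT_CENTER
            Log.d("ChannelMaskDump", "CHANNEL_OUT_FRONT_CENTER = 0x"
                    + Integer.toHexString(AudioFormat.CHANNEL_OUT_FRONT_CENTER)); // 0x10
            Log.d("ChannelMaskDump", "CHANNEL_IN_STEREO        = 0x"
                    + Integer.toHexString(AudioFormat.CHANNEL_IN_STEREO));        // 0xC
            Log.d("ChannelMaskDump", "CHANNEL_OUT_STEREO       = 0x"
                    + Integer.toHexString(AudioFormat.CHANNEL_OUT_STEREO));       // 0xC (front left | front right)
        }
    }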

A mono audio stream needs to be routed to both the left and right speakers. OR the two front-channel flags together to set up this routing:

    final int frontPair = AudioFormat.CHANNEL_OUT_FRONT_LEFT | AudioFormat.CHANNEL_OUT_FRONT_RIGHT;
    AudioFormat audioFormat = new AudioFormat.Builder()
            .setEncoding(AudioFormat.ENCODING_PCM_8BIT)
            .setSampleRate(audioSamplingRate)
            .setChannelMask(frontPair)
            .build();
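
For completeness, a minimal sketch of feeding such an AudioFormat into AudioTrack.Builder (API 23+); the sample rate and PCM buffer are placeholders you would supply from your own recording code, and the data must be interleaved to match the two-channel mask:

    import android.media.AudioAttributes;
    import android.media.AudioFormat;
    import android.media.AudioTrack;

    // Sketch only: play 16-bit PCM through an explicit front-left/front-right mask.
    // audioSamplingRate and pcmData are placeholders; pcmData must hold interleaved
    // frames matching the channel mask (two shorts per frame here).
    public final class FrontPairPlayback {
        private FrontPairPlayback() {}

        public static void play(int audioSamplingRate, short[] pcmData) {
            final int frontPair =
                    AudioFormat.CHANNEL_OUT_FRONT_LEFT | AudioFormat.CHANNEL_OUT_FRONT_RIGHT;

            AudioFormat format = new AudioFormat.Builder()
                    .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                    .setSampleRate(audioSamplingRate)
                    .setChannelMask(frontPair)
                    .build();

            AudioTrack track = new AudioTrack.Builder()
                    .setAudioAttributes(new AudioAttributes.Builder()
                            .setUsage(AudioAttributes.USAGE_MEDIA)
                            .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                            .build())
                    .setAudioFormat(format)
                    .setTransferMode(AudioTrack.MODE_STREAM)
                    .setBufferSizeInBytes(pcmData.length * 2) // size is in bytes, not shorts
                    .build();

            track.play();
            track.write(pcmData, 0, pcmData.length);
            track.stop();
            track.release();
        }
    }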
