How to Play Audio Using the Agora SDK on Android

The Agora SDK includes audio playback features that let you play songs, recordings, and sound effects for entertainment or education during a video call. This tutorial shows you how to use the Agora SDK to play audio files during a video call on an Android device.


Prerequisites

  • An Agora developer account (see this article for detailed steps).
  • An understanding of how to build a live streaming Android app with Agora.
  • Basic knowledge of Android development.
  • Android Studio.
  • An Android device.


Add the Dependencies in Gradle

First, add the following dependencies to the app module's build.gradle file to pull in the Agora SDK and the third-party permission library. Be sure to use the latest version of the Agora SDK.

implementation 'com.yanzhenjie:permission:2.0.3'
implementation 'io.agora.rtc:full-sdk:3.5.0'
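If Gradle cannot resolve these artifacts, make sure the repository that hosts the Agora SDK is declared at the project level. A minimal sketch, assuming recent Agora releases are fetched from Maven Central:

// Project-level build.gradle: declare the repositories Gradle searches
allprojects {
    repositories {
        google()
        mavenCentral()
    }
}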

Add Permissions in the AndroidManifest.xml File

Add the following permissions to the AndroidManifest.xml file:

<uses-permission android:name="android.permission.CAMERA" />
 <uses-permission android:name="android.permission.INTERNET" />
 <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
 <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
 <uses-permission android:name="android.permission.RECORD_AUDIO" />

Create an RtcEngine Instance

Now we create an RtcEngine instance by initializing RtcEngine, passing the IRtcEngineEventHandler and the App ID to the create method. IRtcEngineEventHandler is an abstract class that provides default implementations:

try {
    mRtcEngine = RtcEngine.create(getBaseContext(), getString(R.string.agora_app_id), mRtcEventHandler);
} catch (Exception e) {
    Log.e(LOG_TAG, Log.getStackTraceString(e));
    throw new RuntimeException("fatal error\n" + Log.getStackTraceString(e));
}
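Because IRtcEngineEventHandler already provides default (empty) implementations, you only override the callbacks you need. A minimal sketch of a handler that just logs a successful join, using the standard 3.x callback:

private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {
    // Invoked when the local user joins a channel successfully
    @Override
    public void onJoinChannelSuccess(String channel, int uid, int elapsed) {
        Log.i(LOG_TAG, "Joined channel " + channel + " as uid " + uid);
    }
};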

Preload the Audio File

We preload the audio file to speed up audio processing. The sound ID and the audio file path are passed into the preloadEffect() function, which we call before the user joins the video call; it then preloads the specified audio file:

// Gets the global audio effect manager.
audioEffectManager = engine.getAudioEffectManager();
int id = 0;
// Add the file path for the audio you want to play.
audioEffectManager.preloadEffect(id++, Environment.getExternalStorageDirectory().getPath() + "/Song/Caiiro.mp3");
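Preloaded effects occupy memory until they are released, so unload them once you are done with them. A minimal sketch, assuming sound ID 0 was preloaded as above:

// Release the preloaded effect when it is no longer needed
audioEffectManager.unloadEffect(0);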


We use the playEffect() function to play the audio file passed in through its parameters. The parameters playEffect() takes are listed below:

  1. soundId: the sound ID of the audio effect file to play.
  2. filePath: the file path of the audio effect file.
  3. loopCount: the number of times the audio effect loops (-1 means it loops indefinitely).
  4. pitch: the pitch of the audio effect.
  5. pan: the spatial position of the audio effect.
  6. volume: the playback volume, as a percentage.
  7. publish: whether the audio effect is also published to remote users.
  8. startPos: the position (in milliseconds) at which playback starts.

Next, we call the playEffect() function to play the audio effect file.

// Play the audio effect: loop indefinitely, at normal pitch, centered,
// at full volume, published to remote users, starting from the beginning.
audioEffectManager.playEffect(
        0,
        Environment.getExternalStorageDirectory().getPath() + "/Song/Caiiro.mp3",
        -1, 1.0, 0.0, 100, true, 0
);
// Pauses all audio effects.
audioEffectManager.pauseAllEffects();
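To undo the pause, or to stop effects entirely, IAudioEffectManager also exposes resume and stop methods. A minimal sketch:

// Resume all paused audio effects
audioEffectManager.resumeAllEffects();
// Stop a single effect by its sound ID, or stop all of them
audioEffectManager.stopEffect(0);
audioEffectManager.stopAllEffects();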


Note: the Agora SDK for Android supports only the following audio file formats:

  • MP3
  • AAC
  • M4A
  • 3GP
  • WAV


Adjust the Volume

We can set the audio volume of the video call by calling the adjustPlaybackSignalVolume() function and passing it a volume value as a percentage. The default volume of a video call is 100. To adjust the volume of the audio that is currently playing, use the following function:

mRtcEngine.adjustPlaybackSignalVolume(55);
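Note that adjustPlaybackSignalVolume() scales the local playback volume of the whole call. If you only want to change the volume of the music file started with startAudioMixing() (used in the next step), the SDK provides adjustAudioMixingVolume(), which the full example later in this article also uses:

// Scale only the music file mixed in with startAudioMixing()
mRtcEngine.adjustAudioMixingVolume(55);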

Play the Audio

The following code takes the audio file path and plays the audio when the user taps the play button:

engine.startAudioMixing("add your file path here", false, false, -1, 0);
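If you want to react when the mixing actually starts, pauses, or stops, you can override the onAudioMixingStateChanged callback in your IRtcEngineEventHandler. A minimal sketch, assuming the two-int-parameter signature of the 3.x SDK:

// Inside your IRtcEngineEventHandler subclass
@Override
public void onAudioMixingStateChanged(int state, int reason) {
    Log.i(LOG_TAG, "Audio mixing state changed: state=" + state + ", reason=" + reason);
}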

Stop the Audio

When the user taps the stop button, we stop the audio playback with the stopAudioMixing() function.

engine.stopAudioMixing();
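stopAudioMixing() ends playback entirely. If you only want to pause the audio and pick it up again later (see the next step), use pauseAudioMixing():

// Pause the mixed audio without discarding the playback position
engine.pauseAudioMixing();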

Resume the Audio

We use the resumeAudioMixing() function to resume the audio playback.

engine.resumeAudioMixing();
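You can also query and change the playback position of the mixed file. A minimal sketch using the position getter and setter, both of which work in milliseconds:

// Read the current playback position of the music file (in ms)
int positionMs = engine.getAudioMixingCurrentPosition();
// Skip forward ten seconds
engine.setAudioMixingPosition(positionMs + 10000);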

Integrating Audio Playback into an Agora Video Call App

Now that you know how to play audio files with the Agora SDK methods, you'll also want the following code, which integrates Agora's audio playback features into a video streaming app:

public class MainActivity extends AppCompatActivity{
    private RtcEngine mRtcEngine;
    private IAudioEffectManager audioEffectManager;

    // Permissions
    private static final int PERMISSION_REQ_ID = 22;
    private static final String[] REQUESTED_PERMISSIONS = {Manifest.permission.RECORD_AUDIO, Manifest.permission.CAMERA};

    private static final String LOG_TAG = MainActivity.class.getSimpleName();

    // Handle SDK Events
    private final IRtcEngineEventHandler mRtcEventHandler = new IRtcEngineEventHandler() {
        @Override
        public void onUserJoined(final int uid, int elapsed) {
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    // set first remote user to the main bg video container
                    setupRemoteVideoStream(uid);
                }
            });
        }

        // remote user has left channel
        @Override
        public void onUserOffline(int uid, int reason) {
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    onRemoteUserLeft();
                }
            });
        }

        // remote user has toggled their video
        @Override
        public void onRemoteVideoStateChanged(final int uid, final int state, int reason, int elapsed) {
            runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    onRemoteUserVideoToggle(uid, state);
                }
            });
        }
    };

    private void preloadAudioEffect(){
        // Gets the global audio effect manager.
        audioEffectManager = mRtcEngine.getAudioEffectManager();
        int id = 0;
        audioEffectManager.preloadEffect(id++, Environment.getExternalStorageDirectory().getPath()+"/Song/Caiiro.mp3");
        audioEffectManager.playEffect(
                0,
                Environment.getExternalStorageDirectory().getPath()+"/Song/Caiiro.mp3",
                -1,
                1,
                0.0,
                100,
                true,
                0
        );
        // Pauses all audio effects.
        audioEffectManager.pauseAllEffects();
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        findViewById(R.id.bass).setVisibility(View.GONE); // hide the audio controls until the user joins
        findViewById(R.id.beautify).setVisibility(View.GONE); // hide the audio controls until the user joins

        if (checkSelfPermission(REQUESTED_PERMISSIONS[0], PERMISSION_REQ_ID) &&
                checkSelfPermission(REQUESTED_PERMISSIONS[1], PERMISSION_REQ_ID)) {
            initAgoraEngine();
        }


    }

    private void initAgoraEngine() {
        try {

            mRtcEngine = RtcEngine.create(getBaseContext(), getString(R.string.agora_app_id), mRtcEventHandler);
            preloadAudioEffect();
        } catch (Exception e) {
            Log.e(LOG_TAG, Log.getStackTraceString(e));

            throw new RuntimeException("NEED TO check rtc sdk init fatal error\n" + Log.getStackTraceString(e));
        }
        setupSession();
    }

    private void setupSession() {
        mRtcEngine.setChannelProfile(Constants.CHANNEL_PROFILE_COMMUNICATION);

        mRtcEngine.enableVideo();

        mRtcEngine.setVideoEncoderConfiguration(new VideoEncoderConfiguration(VideoEncoderConfiguration.VD_640x480, VideoEncoderConfiguration.FRAME_RATE.FRAME_RATE_FPS_30,
                VideoEncoderConfiguration.STANDARD_BITRATE,
                VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_FIXED_PORTRAIT));
    }

    private void setupLocalVideoFeed() {

        // setup the container for the local user
        FrameLayout videoContainer = findViewById(R.id.floating_video_container);
        SurfaceView videoSurface = RtcEngine.CreateRendererView(getBaseContext());
        videoSurface.setZOrderMediaOverlay(true);
        videoContainer.addView(videoSurface);
        mRtcEngine.setupLocalVideo(new VideoCanvas(videoSurface, VideoCanvas.RENDER_MODE_FIT, 0));
    }

    private void setupRemoteVideoStream(int uid) {
        // setup ui element for the remote stream
        FrameLayout videoContainer = findViewById(R.id.bg_video_container);
        // ignore any new streams that join the session
        if (videoContainer.getChildCount() >= 1) {
            return;
        }

        SurfaceView videoSurface = RtcEngine.CreateRendererView(getBaseContext());
        videoContainer.addView(videoSurface);
        mRtcEngine.setupRemoteVideo(new VideoCanvas(videoSurface, VideoCanvas.RENDER_MODE_FIT, uid));
        mRtcEngine.setRemoteSubscribeFallbackOption(io.agora.rtc.Constants.STREAM_FALLBACK_OPTION_AUDIO_ONLY);

    }



    // join the channel when user clicks UI button
    public void onjoinChannelClicked(View view) {
        mRtcEngine.joinChannel(null, "test-channel", "Extra Optional Data", 0); // if you do not specify the uid, Agora will assign one.
        setupLocalVideoFeed();
        findViewById(R.id.joinBtn).setVisibility(View.GONE); // hide the join button
        findViewById(R.id.bass).setVisibility(View.VISIBLE); // show the audio controls
        findViewById(R.id.beautify).setVisibility(View.VISIBLE); // show the audio controls

    }


    private void leaveChannel() {
        mRtcEngine.leaveChannel();
    }

    private void removeVideo(int containerID) {
        FrameLayout videoContainer = findViewById(containerID);
        videoContainer.removeAllViews();
    }

    private void onRemoteUserVideoToggle(int uid, int state) {
        FrameLayout videoContainer = findViewById(R.id.bg_video_container);

        SurfaceView videoSurface = (SurfaceView) videoContainer.getChildAt(0);
        videoSurface.setVisibility(state == 0 ? View.GONE : View.VISIBLE);

        // add an icon to let the other user know remote video has been disabled
        if(state == 0){
            ImageView noCamera = new ImageView(this);
            noCamera.setImageResource(R.drawable.video_disabled);
            videoContainer.addView(noCamera);
        } else {
            ImageView noCamera = (ImageView) videoContainer.getChildAt(1);
            if(noCamera != null) {
                videoContainer.removeView(noCamera);
            }
        }
    }

    private void onRemoteUserLeft() {
        removeVideo(R.id.bg_video_container);
    }



    public boolean checkSelfPermission(String permission, int requestCode) {
        Log.i(LOG_TAG, "checkSelfPermission " + permission + " " + requestCode);
        if (ContextCompat.checkSelfPermission(this,
                permission)
                != PackageManager.PERMISSION_GRANTED) {

            ActivityCompat.requestPermissions(this,
                    REQUESTED_PERMISSIONS,
                    requestCode);
            return false;
        }
        return true;
    }


    @Override
    public void onRequestPermissionsResult(int requestCode,
                                           @NonNull String permissions[], @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        Log.i(LOG_TAG, "onRequestPermissionsResult " + grantResults[0] + " " + requestCode);

        switch (requestCode) {
            case PERMISSION_REQ_ID: {
                if (grantResults[0] != PackageManager.PERMISSION_GRANTED || grantResults[1] != PackageManager.PERMISSION_GRANTED) {
                    Log.i(LOG_TAG, "Need permissions " + Manifest.permission.RECORD_AUDIO + "/" + Manifest.permission.CAMERA);
                    break;
                }

                initAgoraEngine();
                break;
            }
        }
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();

        leaveChannel();
        RtcEngine.destroy();
        mRtcEngine = null;
    }

    public void playAudio(View v) {
        mRtcEngine.startAudioMixing(Environment.getExternalStorageDirectory().getPath() + "/Song/Caiiro.mp3",
                false, false, -1, 0);
        // Adjust the volume of the music file.
        mRtcEngine.adjustAudioMixingVolume(90);
        Toast.makeText(getApplicationContext(),
                "just played the song",
                Toast.LENGTH_LONG).show();
    }

    public void stopAudio(View v) {
        mRtcEngine.stopAudioMixing();
        Toast.makeText(getApplicationContext(),
                "stopped playing music",
                Toast.LENGTH_LONG).show();
    }

}
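The playAudio() and stopAudio() methods above are wired to buttons through android:onClick. A minimal sketch of what the corresponding buttons in activity_main.xml might look like; the IDs and labels here are assumptions, since the original layout file is not shown:

<!-- Hypothetical buttons; adapt the IDs and labels to your own layout -->
<Button
    android:id="@+id/playBtn"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="Play"
    android:onClick="playAudio" />
<Button
    android:id="@+id/stopBtn"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="Stop"
    android:onClick="stopAudio" />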

If you don't yet know how to build a one-to-one video call app with the Agora SDK, check out the tutorial written by Hermes on GitHub. The code above follows the same approach as that tutorial.


Summary

This tutorial showed how to use the Agora SDK to do the following:

  • Preload audio
  • Play and pause audio
  • Adjust the audio volume


Conclusion

You now know how to use the Agora SDK to play audio files during a video call.

Thanks for reading! You can learn more about playing audio with the Agora SDK here, and you can explore more Agora features in the Agora API Examples on GitHub. If you want to copy or reference the SDK code I used, check out the Agora API Examples source code on GitHub.


Other Resources

If you run into problems along the way, check out the official Agora documentation.


Original author: Boemo Wame Mmopelwa. Boemo is a software developer who enjoys exploring innovative methods. He likes digging into complex ideas and then presenting them in a way that is simple, fun, and easy to understand.
Original article: How to Play Audio Using the Agora SDK in Android