Upload live Android webcam video to an RTP/RTSP server



I have done the appropriate research, but I am still lacking information on what I want to achieve.

I want to build an app with which the user can record a video and upload it immediately (live) to an RTP/RTSP server. The server side will not be a problem. What I am not clear about is how to achieve this on the phone side.

My research so far suggests that I have to write the video to a local socket while recording, rather than to a file, because a 3gp file written to disk cannot be accessed until it is finalized (when the recording stops and the header information about length and so on has been written to the video).

As the socket receives the continuous data, I will need to wrap it into RTP packets and send them to the remote server. I may also have to do basic encoding first (which is less important for now).

Does anyone know whether this theory is correct so far? I would also like to know whether someone could point me to some code snippets for a similar approach, especially for sending the video to the server. I do not yet know how to do that.

Many thanks and best regards.

Your overall approach sounds right, but there are a couple of things you will need to consider.

I want to build an app with which the user can record a video and upload it immediately (live) to an RTP/RTSP server.

  • I assume you want to upload to an RTSP server so that it can redistribute the content to multiple clients.
  • How will you handle the RTP session with the RTSP server? You need to somehow notify the RTSP server that a user is about to upload live media so that it can open the appropriate RTP/RTCP sockets, etc. (a sketch of such an exchange follows this list).
  • How will you handle authentication, with multiple client devices?
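
For reference, a publish-style RTSP handshake (the mode used by servers such as Wowza; exact headers vary per server, and the URL and session id below are made up for illustration) looks roughly like this:

ANNOUNCE rtsp://server/live/stream1 RTSP/1.0
Cseq: 1
Content-Type: application/sdp
Content-Length: 460

<SDP describing the H.264/AAC tracks>

SETUP rtsp://server/live/stream1/trackid=1 RTSP/1.0
Cseq: 2
Transport: RTP/AVP/TCP;unicast;mode=record;interleaved=0-1

RECORD rtsp://server/live/stream1 RTSP/1.0
Cseq: 3
Session: 12345678
Range: npt=0.000-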

My research so far suggests that I have to write the video to a local socket while recording, rather than to a file, because a 3gp file written to disk cannot be accessed until it is finalized (when the recording stops and the header information about length and so on has been written to the video).

Sending the frames live over RTP/RTCP is the correct approach. As the capture device captures each frame, you need to encode/compress it and send it over the socket. 3gp, like mp4, is a container format used for file storage. For live capture there is no need to write to a file; writing only makes sense in approaches such as HTTP Live Streaming or DASH, where the media is written into transport-stream or mp4 segments before being served over HTTP.
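
To make "wrap it into RTP packets" concrete: every RTP packet starts with the 12-byte fixed header of RFC 3550. A minimal sketch of filling it in (the payload type, SSRC and the helper name are placeholders you choose yourself):

public final class RtpHeader {
    /** Fill in the 12-byte RTP fixed header (RFC 3550) at the start of pkt. */
    public static void write(byte[] pkt, int payloadType, int seq, long ts, int ssrc, boolean marker) {
        pkt[0] = (byte) 0x80;                                  // V=2, P=0, X=0, CC=0
        pkt[1] = (byte) ((marker ? 0x80 : 0) | (payloadType & 0x7f));
        pkt[2] = (byte) (seq >> 8);   pkt[3] = (byte) seq;     // sequence number
        pkt[4] = (byte) (ts >> 24);   pkt[5] = (byte) (ts >> 16);
        pkt[6] = (byte) (ts >> 8);    pkt[7] = (byte) ts;      // media timestamp
        pkt[8] = (byte) (ssrc >> 24); pkt[9] = (byte) (ssrc >> 16);
        pkt[10] = (byte) (ssrc >> 8); pkt[11] = (byte) ssrc;   // stream source id
    }
}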

As the socket receives the continuous data, I will need to wrap it into RTP packets and send them to the remote server. I may also have to do basic encoding first (which is less important for now).

I disagree: the encoding is very important. Without it you will likely never manage to send the video at all, and you will have to deal with issues such as cost (over mobile networks) and the sheer amount of media data, which depends on resolution and framerate.
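
To put numbers on that: raw 640x480 YUV420 video at 30 fps is 640 × 480 × 1.5 bytes × 30 ≈ 13.8 MB/s, i.e. roughly 110 Mbit/s, which no mobile uplink will carry. The same stream H.264-encoded at, say, 0.5-1 Mbit/s is two orders of magnitude smaller, which is what makes live upload feasible at all.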

Does anyone know whether this theory is correct so far? I would also like to know whether someone could point me to some code snippets for a similar approach, especially for sending the video to the server. I do not yet know how to do that.

Take the spydroid open source project as a starting point. It covers many of the necessary steps, including how to configure the encoder, packetize into RTP, send RTCP, as well as some RTSP server functionality. spydroid sets up an RTSP server, so the media is encoded and sent once an RTSP client such as VLC sets up an RTSP session. Since your application is driven by the phone user who wants to send media to a server, you may need a different way to initiate the sending, e.g. by having the phone send some kind of message to the server to set up the RTSP session, rather than the other way around as in spydroid.

A year ago I created an Android application that streams the camera/microphone to a Wowza media server using RTSP over TCP.

The general approach is to create a unix socket, get its file descriptor, and hand it to the Android MediaRecorder component. You then tell MediaRecorder to record the camera video in mp4/h264 format into that file descriptor. Your application reads the client side of the socket, parses the mp4 stream to strip the headers and extract the frames, and packetizes them into an RTSP/RTP stream on the fly.
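
A minimal sketch of that trick, assuming the Camera is already open and the usual permissions are in place (names are illustrative, error handling omitted):

import java.io.InputStream;
import android.hardware.Camera;
import android.media.MediaRecorder;
import android.net.LocalServerSocket;
import android.net.LocalSocket;
import android.net.LocalSocketAddress;

public class SocketRecorder {
    /** Record H.264/mp4 into a local socket instead of a file. */
    public static InputStream startRecording(Camera camera) throws Exception {
        LocalServerSocket server = new LocalServerSocket("camera-stream");
        LocalSocket sender = new LocalSocket();
        sender.connect(new LocalSocketAddress("camera-stream"));

        camera.unlock();                         // hand the camera over to MediaRecorder
        MediaRecorder recorder = new MediaRecorder();
        recorder.setCamera(camera);
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
        recorder.setOutputFile(sender.getFileDescriptor());  // the socket, not a file
        recorder.prepare();
        recorder.start();

        // Read this stream, strip the mp4 headers, extract the frames, packetize to RTP.
        return server.accept().getInputStream();
    }
}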

A similar approach is used for the sound (AAC in this case). Of course, you have to handle the timestamps yourself, and the trickiest part of the whole approach is video/audio synchronization.
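
On the timestamp point: RTP timestamps are not in milliseconds but in units of the payload's clock rate, which is 90 kHz for H.264 video and the sample rate (e.g. 44100 Hz) for AAC audio, so the same wall-clock instant maps to different values per track. A trivial helper for the conversion:

// Convert a capture timestamp in milliseconds to RTP clock units.
// clockRate: 90000 for H.264 video, the sample rate for AAC audio.
static long toRtpUnits(long millis, int clockRate) {
    return millis * clockRate / 1000;
}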

Here is the first part of it, the RtspSocket class. It negotiates with the media server when you connect, and afterwards you can write the stream itself into it. The rest follows further below.

package com.example.android.streaming.streaming.rtsp;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.UnsupportedEncodingException;
import java.math.BigInteger;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.net.SocketException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Locale;
import java.util.concurrent.ConcurrentHashMap;
import android.util.Base64;
import android.util.Log;
import com.example.android.streaming.StreamingApp;
import com.example.android.streaming.streaming.Session;
import com.example.android.streaming.BuildConfig;
public class RtspSocket extends Socket {
    public static final int RTSP_HEADER_LENGTH = 4;
    public static final int RTP_HEADER_LENGTH = 12;
    public static final int MTU = 1400;
    public static final int PAYLOAD_OFFSET = RTSP_HEADER_LENGTH + RTP_HEADER_LENGTH;
    public static final int RTP_OFFSET = RTSP_HEADER_LENGTH;
    private ConcurrentHashMap<String, String> headerMap = new ConcurrentHashMap<String, String>();
    static private final String kCRLF = "\r\n";
    // RTSP request format strings
    static private final String kOptions = "OPTIONS %s RTSP/1.0\r\n";
    static private final String kDescribe = "DESCRIBE %s RTSP/1.0\r\n";
    static private final String kAnnounce = "ANNOUNCE %s RTSP/1.0\r\n";
    static private final String kSetupPublish = "SETUP %s/trackid=%d RTSP/1.0\r\n";
    @SuppressWarnings("unused")
    static private final String kSetupPlay = "SETUP %s/trackid=%d RTSP/1.0\r\n";
    static private final String kRecord = "RECORD %s RTSP/1.0\r\n";
    static private final String kPlay = "PLAY %s RTSP/1.0\r\n";
    static private final String kTeardown = "TEARDOWN %s RTSP/1.0\r\n";
    // RTSP header format strings
    static private final String kCseq = "Cseq: %d\r\n";
    static private final String kContentLength = "Content-Length: %d\r\n";
    static private final String kContentType = "Content-Type: %s\r\n";
    static private final String kTransport = "Transport: RTP/AVP/%s;unicast;mode=%s;%s\r\n";
    static private final String kSession = "Session: %s\r\n";
    static private final String kRange = "Range: %s\r\n";
    static private final String kAccept = "Accept: %s\r\n";
    static private final String kAuthBasic = "Authorization: Basic %s\r\n";
    static private final String kAuthDigest = "Authorization: Digest username=\"%s\",realm=\"%s\",nonce=\"%s\",uri=\"%s\",response=\"%s\"\r\n";
    // RTSP header keys
    static private final String kSessionKey = "Session";
    static private final String kWWWAuthKey = "WWW-Authenticate";
    static private final int RTSP_MAX_HEADER = 4095;
    static private final int RTSP_MAX_BODY = 4095;
    byte header[] = new byte[RTSP_MAX_HEADER + 1];
    static private final int RTSP_RESP_ERR = -6;
    // static private final int RTSP_RESP_ERR_SESSION = -7;
    static public final int RTSP_OK = 200;
    static private final int RTSP_BAD_USER_PASS = 401;
    static private final int SOCK_ERR_READ = -5;
    /* Number of channels including control ones. */
    private int channelCount = 0;
    /* RTSP negotiation cmd seq counter */
    private int seq = 0;
    private String authentication = null;
    private String session = null;
    private String path = null;
    private String url = null;
    private String user = null;
    private String pass = null;
    private String sdp = null;
    private byte[] buffer = new byte[MTU];
    public RtspSocket() {
        super();
        try {
            setTcpNoDelay(true);
            setSoTimeout(60000);
        } catch (SocketException e) {
            Log.e(StreamingApp.TAG, "Failed to set socket params.");
        }
        /* Pre-set RTP version 2 (binary 10000000) in the first RTP header byte. */
        buffer[RTSP_HEADER_LENGTH] = (byte) Integer.parseInt("10000000", 2);
    }
    public byte[] getBuffer() {
        return buffer;
    }
    public static final void setLong(byte[] buffer, long n, int begin, int end) {
        for (end--; end >= begin; end--) {
            buffer[end] = (byte) (n % 256);
            n >>= 8;
        }
    }
    public void setSequence(int seq) {
        setLong(buffer, seq, RTP_OFFSET + 2, RTP_OFFSET + 4);
    }
    public void setSSRC(int ssrc) {
        setLong(buffer, ssrc, RTP_OFFSET + 8, RTP_OFFSET + 12);
    }
    public void setPayload(int payload) {
        buffer[RTP_OFFSET + 1] = (byte) (payload & 0x7f);
    }
    public void setRtpTimestamp(long timestamp) {
        setLong(buffer, timestamp, RTP_OFFSET + 4, RTP_OFFSET + 8);
    }
    /** Sends the RTP packet over the network */
    private void send(int length, int stream) throws IOException {
        buffer[0] = '$';
        buffer[1] = (byte) stream;
        setLong(buffer, length, 2, 4);
        OutputStream s = getOutputStream();
        s.write(buffer, 0, length + RTSP_HEADER_LENGTH);
        s.flush();
    }
    public void sendReport(int length, int ssrc, int stream) throws IOException {
        /* RTCP SR packet type is 200 and occupies the full byte (no marker bit). */
        buffer[RTP_OFFSET + 1] = (byte) 200;
        setLong(buffer, ssrc, RTP_OFFSET + 4, RTP_OFFSET + 8);
        send(length + RTP_HEADER_LENGTH, stream);
    }
    public void sendData(int length, int ssrc, int seq, int payload, int stream, boolean last) throws IOException {
        setSSRC(ssrc);
        setSequence(seq);
        setPayload(payload);
        buffer[RTP_OFFSET + 1] |= (((last ? 1 : 0) & 0x01) << 7);
        send(length + RTP_HEADER_LENGTH, stream);
    }
    public int getChannelCount() {
        return channelCount;
    }
    private void write(String request) throws IOException {
        try {
            OutputStream out = getOutputStream();
            out.write(request.getBytes("US-ASCII"));
        } catch (IOException e) {
            throw new IOException("Error writing to socket.");
        }
    }
    private String read() throws IOException {
        String response = null;
        try {
            InputStream in = getInputStream();
            int i = 0, len = 0, crlf_count = 0;
            boolean parsedHeader = false;
            for (; i < RTSP_MAX_BODY && !parsedHeader && len > -1; i++) {
                len = in.read(header, i, 1);
                if (header[i] == '\r' || header[i] == '\n') {
                    crlf_count++;
                    if (crlf_count == 4)
                        parsedHeader = true;
                } else {
                    crlf_count = 0;
                }
            }
            if (len != -1) {
                len = i;
                header[len] = '\0';
                response = new String(header, 0, len, "US-ASCII");
            }
        } catch (IOException e) {
            throw new IOException("Connection timed out. Check your network settings.");
        }
        return response;
    }
    private int parseResponse(String response) {
        String[] lines = response.split(kCRLF);
        String[] items = response.split(" ");
        String tempString, key, value;
        headerMap.clear();
        if (items.length < 2)
            return RTSP_RESP_ERR;
        int responseCode = RTSP_RESP_ERR;
        try {
            responseCode = Integer.parseInt(items[1]);
        } catch (Exception e) {
            Log.w(StreamingApp.TAG, e.getMessage());
            Log.w(StreamingApp.TAG, response);
        }
        if (responseCode == RTSP_RESP_ERR)
            return responseCode;
        // Parse response header into key value pairs.
        for (int i = 1; i < lines.length; i++) {
            tempString = lines[i];
            if (tempString.length() == 0)
                break;
            int idx = tempString.indexOf(":");
            if (idx == -1)
                continue;
            key = tempString.substring(0, idx);
            value = tempString.substring(idx + 1);
            headerMap.put(key, value);
        }
        tempString = headerMap.get(kSessionKey);
        if (tempString != null) {
            // Parse session
            items = tempString.split(";");
            tempString = items[0];
            session = tempString.trim();
        }
        return responseCode;
    }
    private void generateBasicAuth() throws UnsupportedEncodingException {
        String userpass = String.format("%s:%s", user, pass);
        /* NO_WRAP: a line break inside the header value would corrupt the request. */
        authentication = String.format(kAuthBasic, Base64.encodeToString(userpass.getBytes("US-ASCII"), Base64.NO_WRAP));
    }
    public static String md5(String s) {
        try {
            MessageDigest digest = MessageDigest.getInstance("MD5");
            digest.update(s.getBytes(), 0, s.length());
            /* %032x left-pads to 32 hex chars; BigInteger drops leading zero bytes. */
            return String.format("%032x", new BigInteger(1, digest.digest()));
        } catch (NoSuchAlgorithmException e) {
            e.printStackTrace();
        }
        return "";
    }
    private String md5HexDigest(String input) {
        /* md5() already returns the hex form of the 16-byte digest. */
        return md5(input);
    }
    private void generateDigestAuth(String method) {
        String nonce, realm;
        String ha1, ha2, response;
        // WWW-Authenticate: Digest realm="Streaming Server",
        // nonce="206351b944cb28fe37a0794848c2e36f"
        String wwwauth = headerMap.get(kWWWAuthKey);
        int idx = wwwauth.indexOf("Digest");
        String authReq = wwwauth.substring(idx + "Digest".length() + 1);
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, String.format("Auth Req: %s", authReq));
        String[] split = authReq.split(",");
        realm = split[0];
        nonce = split[1];
        split = realm.split("=");
        realm = split[1];
        realm = realm.substring(1, 1 + realm.length() - 2);
        split = nonce.split("=");
        nonce = split[1];
        nonce = nonce.substring(1, 1 + nonce.length() - 2);
        if (BuildConfig.DEBUG) {
            Log.d(StreamingApp.TAG, String.format("realm=%s", realm));
            Log.d(StreamingApp.TAG, String.format("nonce=%s", nonce));
        }
        ha1 = md5HexDigest(String.format("%s:%s:%s", user, realm, pass));
        ha2 = md5HexDigest(String.format("%s:%s", method, url));
        response = md5HexDigest(String.format("%s:%s:%s", ha1, nonce, ha2));
        authentication = String.format(kAuthDigest, user, realm, nonce, url, response);
    }
    private int options() throws IOException {
        seq++;
        StringBuilder request = new StringBuilder();
        request.append(String.format(kOptions, url));
        request.append(String.format(kCseq, seq));
        request.append(kCRLF);
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- OPTIONS Request ---nn" + request);
        write(request.toString());
        String response = read();
        if (response == null)
            return SOCK_ERR_READ;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- OPTIONS Response ---nn" + response);
        return parseResponse(response);
    }
    @SuppressWarnings("unused")
    private int describe() throws IOException {
        seq++;
        StringBuilder request = new StringBuilder();
        request.append(String.format(kDescribe, url));
        request.append(String.format(kAccept, "application/sdp"));
        request.append(String.format(kCseq, seq));
        request.append(kCRLF);
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- DESCRIBE Request ---nn" + request);
        write(request.toString());
        String response = read();
        if (response == null)
            return SOCK_ERR_READ;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- DESCRIBE Response ---nn" + response);
        return parseResponse(response);
    }
    private int recurseDepth = 0;
    private int announce() throws IOException {
        seq++;
        recurseDepth = 0;
        StringBuilder request = new StringBuilder();
        request.append(String.format(kAnnounce, url));
        request.append(String.format(kCseq, seq));
        request.append(String.format(kContentLength, sdp.length()));
        request.append(String.format(kContentType, "application/sdp"));
        request.append(kCRLF);
        if (sdp.length() > 0)
            request.append(sdp);
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- ANNOUNCE Request ---nn" + request);
        write(request.toString());
        String response = read();
        if (response == null)
            return SOCK_ERR_READ;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- ANNOUNCE Response ---nn" + response);
        int ret = parseResponse(response);
        if (ret == RTSP_BAD_USER_PASS && recurseDepth == 0) {
            String wwwauth = headerMap.get(kWWWAuthKey);
            if (wwwauth != null) {
                if (BuildConfig.DEBUG)
                    Log.d(StreamingApp.TAG, String.format("WWW Auth Value: %s", wwwauth));
                int idx = wwwauth.indexOf("Basic");
                recurseDepth++;
                if (idx != -1) {
                    generateBasicAuth();
                } else {
                    // We are assuming Digest here.
                    generateDigestAuth("ANNOUNCE");
                }
                ret = announce();
                recurseDepth--;
            }
        }
        return ret;
    }
    private int setup(int trackId) throws IOException {
        seq++;
        recurseDepth = 0;
        StringBuilder request = new StringBuilder();
        request.append(String.format(kSetupPublish, url, trackId));
        request.append(String.format(kCseq, seq));
        /* One channel for rtp (data) and one for rtcp (control) */
        String tempString = String.format(Locale.getDefault(), "interleaved=%d-%d", channelCount++, channelCount++);
        request.append(String.format(kTransport, "TCP", "record", tempString));
        request.append(kCRLF);
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- SETUP Request ---nn" + request);
        write(request.toString());
        String response = read();
        if (response == null)
            return SOCK_ERR_READ;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- SETUP Response ---nn" + response);
        int ret = parseResponse(response);
        if (ret == RTSP_BAD_USER_PASS && recurseDepth == 0) {
            String wwwauth = headerMap.get(kWWWAuthKey);
            if (wwwauth != null) {
                if (BuildConfig.DEBUG)
                    Log.d(StreamingApp.TAG, String.format("WWW Auth Value: %s", wwwauth));
                int idx = wwwauth.indexOf("Basic");
                recurseDepth++;
                if (idx != -1) {
                    generateBasicAuth();
                } else {
                    // We are assuming Digest here.
                    generateDigestAuth("SETUP");
                }
                ret = setup(trackId);
                authentication = null;
                recurseDepth--;
            }
        }
        return ret;
    }
    private int record() throws IOException {
        seq++;
        recurseDepth = 0;
        StringBuilder request = new StringBuilder();
        request.append(String.format(kRecord, url));
        request.append(String.format(kCseq, seq));
        request.append(String.format(kRange, "npt=0.000-"));
        if (authentication != null)
            request.append(authentication);
        if (session != null)
            request.append(String.format(kSession, session));
        request.append(kCRLF);
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- RECORD Request ---nn" + request);
        write(request.toString());
        String response = read();
        if (response == null)
            return SOCK_ERR_READ;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- RECORD Response ---nn" + response);
        int ret = parseResponse(response);
        if (ret == RTSP_BAD_USER_PASS && recurseDepth == 0) {
            String wwwauth = headerMap.get(kWWWAuthKey);
            if (wwwauth != null) {
                if (BuildConfig.DEBUG)
                    Log.d(StreamingApp.TAG, String.format("WWW Auth Value: %s", wwwauth));
                int idx = wwwauth.indexOf("Basic");
                recurseDepth++;
                if (idx != -1) {
                    generateBasicAuth();
                } else {
                    // We are assuming Digest here.
                    generateDigestAuth("RECORD");
                }
                ret = record();
                authentication = null;
                recurseDepth--;
            }
        }
        return ret;
    }
    @SuppressWarnings("unused")
    private int play() throws IOException {
        seq++;
        recurseDepth = 0;
        StringBuilder request = new StringBuilder();
        request.append(String.format(kPlay, url));
        request.append(String.format(kCseq, seq));
        request.append(String.format(kRange, "npt=0.000-"));
        if (authentication != null)
            request.append(authentication);
        if (session != null)
            request.append(String.format(kSession, session));
        request.append(kCRLF);
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- PLAY Request ---nn" + request);
        write(request.toString());
        String response = read();
        if (response == null)
            return SOCK_ERR_READ;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- PLAY Response ---nn" + response);
        int ret = parseResponse(response);
        if (ret == RTSP_BAD_USER_PASS && recurseDepth == 0) {
            String wwwauth = headerMap.get(kWWWAuthKey);
            if (wwwauth != null) {
                if (BuildConfig.DEBUG)
                    Log.d(StreamingApp.TAG, String.format("WWW Auth Value: %s", wwwauth));
                int idx = wwwauth.indexOf("Basic");
                recurseDepth++;
                if (idx != -1) {
                    generateBasicAuth();
                } else {
                    // We are assuming Digest here.
                    generateDigestAuth("PLAY");
                }
                ret = play();
                authentication = null;
                recurseDepth--;
            }
        }
        return ret;
    }
    private int teardown() throws IOException {
        seq++;
        recurseDepth = 0;
        StringBuilder request = new StringBuilder();
        request.append(String.format(kTeardown, url));
        request.append(String.format(kCseq, seq));
        if (authentication != null)
            request.append(authentication);
        if (session != null)
            request.append(String.format(kSession, session));
        request.append(kCRLF);
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- TEARDOWN Request ---nn" + request);
        write(request.toString());
        String response = read();
        if (response == null)
            return SOCK_ERR_READ;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- TEARDOWN Response ---nn" + response);
        int ret = parseResponse(response);
        if (ret == RTSP_BAD_USER_PASS && recurseDepth == 0) {
            String wwwauth = headerMap.get(kWWWAuthKey);
            if (wwwauth != null) {
                if (BuildConfig.DEBUG)
                    Log.d(StreamingApp.TAG, String.format("WWW Auth Value: %s", wwwauth));
                int idx = wwwauth.indexOf("Basic");
                recurseDepth++;
                if (idx != -1) {
                    generateBasicAuth();
                } else {
                    // We are assuming Digest here.
                    generateDigestAuth("TEARDOWN");
                }
                ret = teardown();
                authentication = null;
                recurseDepth--;
            }
        }
        return ret;
    }
    public void connect(String dest, int port, Session session) throws IOException {
        int trackId = 1;
        int responseCode;
        if (isConnected())
            return;
        if (!session.hasAudioTrack() && !session.hasVideoTrack())
            throw new IOException("No tracks found in session.");
        InetSocketAddress addr = null;
        try {
            addr = new InetSocketAddress(dest, port);
        } catch (Exception e) {
            throw new IOException("Failed to resolve rtsp server address.");
        }
        this.sdp = session.getSDP();
        this.user = session.getUser();
        this.pass = session.getPass();
        this.path = session.getPath();
        this.url = String.format("rtsp://%s:%d%s", dest, addr.getPort(), this.path);
        try {
            super.connect(addr);
        } catch (IOException e) {
            throw new IOException("Failed to connect rtsp server.");
        }
        responseCode = announce();
        if (responseCode != RTSP_OK) {
            close();
            throw new IOException("RTSP announce failed: " + responseCode);
        }
        responseCode = options();
        if (responseCode != RTSP_OK) {
            close();
            throw new IOException("RTSP options failed: " + responseCode);
        }
        /* Setup audio */
        if (session.hasAudioTrack()) {
            session.getAudioTrack().setStreamId(channelCount);
            responseCode = setup(trackId++);
            if (responseCode != RTSP_OK) {
                close();
                throw new IOException("RTSP video failed: " + responseCode);
            }
        }
        /* Setup video */
        if (session.hasVideoTrack()) {
            session.getVideoTrack().setStreamId(channelCount);
            responseCode = setup(trackId++);
            if (responseCode != RTSP_OK) {
                close();
                throw new IOException("RTSP audio setup failed: " + responseCode);
            }
        }
        responseCode = record();
        if (responseCode != RTSP_OK) {
            close();
            throw new IOException("RTSP record failed: " + responseCode);
        }
    }
    public void close() throws IOException {
        if (!isConnected())
            return;
        teardown();
        super.close();
    }
}
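
One detail worth pointing out in send() above: because the RTP packets are interleaved over the RTSP TCP connection (RFC 2326, section 10.12), every packet is prefixed with a 4-byte frame consisting of a '$' byte, the channel id negotiated in the SETUP Transport header, and a 16-bit big-endian length. That is exactly what the first RTSP_HEADER_LENGTH bytes of the buffer hold.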

I was trying to achieve the same thing (but gave up due to lack of experience). My approach was to use ffmpeg and/or avlib, since it already has a working rtmp stack. So, in theory, all you need is to route the video stream to an ffmpeg process, which then streams it to the server.

Is there a reason you have to use 3gp on the client side? With mp4 (with the moov atom placed at the head of the file), you could read the temporary file in chunks and send them to the server, with perhaps a slight time lag depending on your connection speed. Your RTSP server should be able to re-encode the mp4 back to 3gp for low-bandwidth viewing.
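
A minimal sketch of that chunked-upload idea, assuming the recorder writes into tmpFile and the server end accepts a plain byte stream (all names here are illustrative):

import java.io.File;
import java.io.FileInputStream;
import java.io.OutputStream;
import java.util.concurrent.atomic.AtomicBoolean;

public class ChunkPump {
    /** Tail the growing temp file and push its bytes to the server. */
    public static void pump(File tmpFile, OutputStream serverOut, AtomicBoolean recording) throws Exception {
        byte[] chunk = new byte[8192];
        FileInputStream in = new FileInputStream(tmpFile);
        try {
            while (recording.get() || in.available() > 0) {
                int n = in.read(chunk);
                if (n > 0)
                    serverOut.write(chunk, 0, n);  // forward what the recorder has written
                else
                    Thread.sleep(50);              // wait for the recorder to write more
            }
            serverOut.flush();
        } finally {
            in.close();
        }
    }
}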

At this point, if I had to take a camera's (raw) stream and make it available to a set of clients right away, I would go the Google Hangouts route and use WebRTC. See ondello's "platform section" for toolsets/SDKs. As part of your evaluation you should look at the advantages of WebRTC over RTSP.

IMO, with its statefulness, RTSP is going to be a nightmare behind firewalls and NAT. AFAIK, using RTP in a third-party app over 3G/4G is a bit risky.

That said, I have published an old android/rtp/rtsp/sdp project on git, built on netty's libs and "efflux". I think the project tried to retrieve and play the audio track from the container of Youtube videos (the video track was ignored and never pulled over the network), which at the time were encoded for RTSP. I remember there were some packet- and frame-header issues; I got fed up with RTSP and abandoned it.

If you must pursue RTP/RTSP, then some of the packet- and frame-level details mentioned by the other posters are covered in the classes and test cases that ship with "efflux".

Here is the RTSP session class. It uses the RtspSocket to talk to the media server. Its purpose is also to hold the session parameters, such as which streams it can send (video and/or audio), the queues, and some audio/video synchronization code.

The PacketListener interface it uses:

package com.example.android.streaming.streaming.rtsp;
public interface PacketListener {
    public void onPacketReceived(Packet p);
}

The Session itself:

package com.example.android.streaming.streaming;
import static java.util.EnumSet.of;
import java.io.IOException;
import java.util.EnumSet;
import java.util.concurrent.BlockingDeque;
import java.util.concurrent.LinkedBlockingDeque;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.ReentrantLock;
import android.app.Activity;
import android.content.SharedPreferences;
import android.hardware.Camera;
import android.hardware.Camera.CameraInfo;
import android.os.SystemClock;
import android.preference.PreferenceManager;
import android.util.Log;
import android.view.SurfaceHolder;
import com.example.android.streaming.BuildConfig;
import com.example.android.streaming.StreamingApp;
import com.example.android.streaming.streaming.audio.AACStream;
import com.example.android.streaming.streaming.rtsp.Packet;
import com.example.android.streaming.streaming.rtsp.Packet.PacketType;
import com.example.android.streaming.streaming.rtsp.PacketListener;
import com.example.android.streaming.streaming.rtsp.RtspSocket;
import com.example.android.streaming.streaming.video.H264Stream;
import com.example.android.streaming.streaming.video.VideoConfig;
import com.example.android.streaming.streaming.video.VideoStream;
public class Session implements PacketListener, Runnable {
    public final static int MESSAGE_START = 0x03;
    public final static int MESSAGE_STOP = 0x04;
    public final static int VIDEO_H264 = 0x01;
    public final static int AUDIO_AAC = 0x05;
    public final static int VIDEO_TRACK = 1;
    public final static int AUDIO_TRACK = 0;
    private static VideoConfig defaultVideoQuality = VideoConfig.defaultVideoQualiy.clone();
    private static int defaultVideoEncoder = VIDEO_H264, defaultAudioEncoder = AUDIO_AAC;
    private static Session sessionUsingTheCamera = null;
    private static Session sessionUsingTheCamcorder = null;
    private static int startedStreamCount = 0;
    private int sessionTrackCount = 0;
    private static SurfaceHolder surfaceHolder;
    private Stream[] streamList = new Stream[2];
    protected RtspSocket socket = null;
    private Activity context = null;
    private String host = null;
    private String path = null;
    private String user = null;
    private String pass = null;
    private int port;
    public interface SessionListener {
        public void startSession(Session session);
        public void stopSession(Session session);
    };
    public Session(Activity context, String host, int port, String path, String user, String pass) {
        this.context = context;
        this.host = host;
        this.port = port;
        this.path = path;
        this.user = user;
        this.pass = pass;
    }
    public boolean isConnected() {
        return socket != null && socket.isConnected();
    }
    /**
     * Connect to rtsp server and start new session. This should be called when
     * all the streams are added so that proper sdp can be generated.
     */
    public void connect() throws IOException {
        try {
            socket = new RtspSocket();
            socket.connect(host, port, this);
        } catch (IOException e) {
            socket = null;
            throw e;
        }
    }
    public void close() throws IOException {
        if (socket != null) {
            socket.close();
            socket = null;
        }
    }
    public static void setDefaultVideoQuality(VideoConfig quality) {
        defaultVideoQuality = quality;
    }
    public static void setDefaultAudioEncoder(int encoder) {
        defaultAudioEncoder = encoder;
    }
    public static void setDefaultVideoEncoder(int encoder) {
        defaultVideoEncoder = encoder;
    }
    public static void setSurfaceHolder(SurfaceHolder sh) {
        surfaceHolder = sh;
    }
    public boolean hasVideoTrack() {
        return getVideoTrack() != null;
    }
    public MediaStream getVideoTrack() {
        return (MediaStream) streamList[VIDEO_TRACK];
    }
    public void addVideoTrack(Camera camera, CameraInfo info) throws IllegalStateException, IOException {
        addVideoTrack(camera, info, defaultVideoEncoder, defaultVideoQuality, false);
    }
    public synchronized void addVideoTrack(Camera camera, CameraInfo info, int encoder, VideoConfig quality,
            boolean flash) throws IllegalStateException, IOException {
        if (isCameraInUse())
            throw new IllegalStateException("Camera already in use by another client.");
        Stream stream = null;
        VideoConfig.merge(quality, defaultVideoQuality);
        switch (encoder) {
        case VIDEO_H264:
            if (BuildConfig.DEBUG)
                Log.d(StreamingApp.TAG, "Video streaming: H.264");
            SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context.getApplicationContext());
            stream = new H264Stream(camera, info, this, prefs);
            break;
        }
        if (stream != null) {
            if (BuildConfig.DEBUG)
                Log.d(StreamingApp.TAG, "Quality is: " + quality.resX + "x" + quality.resY + "px " + quality.framerate
                        + "fps, " + quality.bitrate + "bps");
            ((VideoStream) stream).setVideoQuality(quality);
            ((VideoStream) stream).setPreviewDisplay(surfaceHolder.getSurface());
            streamList[VIDEO_TRACK] = stream;
            sessionUsingTheCamera = this;
            sessionTrackCount++;
        }
    }
    public boolean hasAudioTrack() {
        return getAudioTrack() != null;
    }
    public MediaStream getAudioTrack() {
        return (MediaStream) streamList[AUDIO_TRACK];
    }
    public void addAudioTrack() throws IOException {
        addAudioTrack(defaultAudioEncoder);
    }
    public synchronized void addAudioTrack(int encoder) throws IOException {
        if (sessionUsingTheCamcorder != null)
            throw new IllegalStateException("Audio device is already in use by another client.");
        Stream stream = null;
        switch (encoder) {
        case AUDIO_AAC:
            if (android.os.Build.VERSION.SDK_INT < 14)
                throw new IllegalStateException("This device does not support AAC.");
            if (BuildConfig.DEBUG)
                Log.d(StreamingApp.TAG, "Audio streaming: AAC");
            SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context.getApplicationContext());
            stream = new AACStream(this, prefs);
            break;
        }
        if (stream != null) {
            streamList[AUDIO_TRACK] = stream;
            sessionUsingTheCamcorder = this;
            sessionTrackCount++;
        }
    }
    public synchronized String getSDP() throws IllegalStateException, IOException {
        StringBuilder sdp = new StringBuilder();
        sdp.append("v=0\r\n");
        /*
         * RFC 4566 (5.2) suggests using an NTP timestamp here, but we will
         * simply use a UNIX timestamp.
         */
        //sdp.append("o=- " + timestamp + " " + timestamp + " IN IP4 127.0.0.1\r\n");
        sdp.append("o=- 0 0 IN IP4 127.0.0.1\r\n");
        sdp.append("s=Vedroid\r\n");
        sdp.append("c=IN IP4 " + host + "\r\n");
        sdp.append("i=N/A\r\n");
        sdp.append("t=0 0\r\n");
        sdp.append("a=tool:Vedroid RTP\r\n");
        int payload = 96;
        int trackId = 1;
        for (int i = 0; i < streamList.length; i++) {
            if (streamList[i] != null) {
                streamList[i].setPayloadType(payload++);
                sdp.append(streamList[i].generateSDP());
                sdp.append("a=control:trackid=" + trackId++ + "\r\n");
            }
        }
        return sdp.toString();
    }
    public String getDest() {
        return host;
    }
    public int getTrackCount() {
        return sessionTrackCount;
    }
    public static boolean isCameraInUse() {
        return sessionUsingTheCamera != null;
    }
    /** Indicates whether or not the microphone is being used in a session. **/
    public static boolean isMicrophoneInUse() {
        return sessionUsingTheCamcorder != null;
    }
    private SessionListener listener = null;
    public synchronized void prepare(int trackId) throws IllegalStateException, IOException {
        Stream stream = streamList[trackId];
        if (stream != null && !stream.isStreaming())
            stream.prepare();
    }
    public synchronized void start(int trackId) throws IllegalStateException, IOException {
        Stream stream = streamList[trackId];
        if (stream != null && !stream.isStreaming()) {
            stream.start();
            if (BuildConfig.DEBUG)
                Log.d(StreamingApp.TAG, "Started " + (trackId == VIDEO_TRACK ? "video" : "audio") + " channel.");
            //            if (++startedStreamCount == 1 && listener != null)
            //                listener.startSession(this);
        }
    }
    public void startAll(SessionListener listener) throws IllegalStateException, IOException {
        this.listener = listener;
        startThread();
        for (int i = 0; i < streamList.length; i++)
            prepare(i);
        /*
         * Important to start video capture before audio capture. This makes
         * audio/video de-sync smaller.
         */
        for (int i = 0; i < streamList.length; i++)
            start(streamList.length - i - 1);
    }
    public synchronized void stopAll() {
        for (int i = 0; i < streamList.length; i++) {
            if (streamList[i] != null && streamList[i].isStreaming()) {
                streamList[i].stop();
                if (BuildConfig.DEBUG)
                    Log.d(StreamingApp.TAG, "Stopped " + (i == VIDEO_TRACK ? "video" : "audio") + " channel.");
                if (--startedStreamCount == 0 && listener != null)
                    listener.stopSession(this);
            }
        }
        stopThread();
        this.listener = null;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "Session stopped.");
    }
    public synchronized void flush() {
        for (int i = 0; i < streamList.length; i++) {
            if (streamList[i] != null) {
                streamList[i].release();
                if (i == VIDEO_TRACK)
                    sessionUsingTheCamera = null;
                else
                    sessionUsingTheCamcorder = null;
                streamList[i] = null;
            }
        }
    }
    public String getPath() {
        return path;
    }
    public String getUser() {
        return user;
    }
    public String getPass() {
        return pass;
    }
    private final static int MAX_QUEUE_SIZE = 1000;
    private BlockingDeque<Packet> audioQueue = new LinkedBlockingDeque<Packet>(MAX_QUEUE_SIZE);
    private BlockingDeque<Packet> videoQueue = new LinkedBlockingDeque<Packet>(MAX_QUEUE_SIZE);
    private void sendPacket(Packet p) {
        try {
            MediaStream channel = (p.type == PacketType.AudioPacketType ? getAudioTrack() : getVideoTrack());
            p.packetizer.send(p, socket, channel.getPayloadType(), channel.getStreamId());
            getPacketQueue(p.type).remove(p);
        } catch (IOException e) {
            Log.e(StreamingApp.TAG, "Failed to send packet: " + e.getMessage());
        }
    }
    private final ReentrantLock queueLock = new ReentrantLock();
    private final Condition morePackets = queueLock.newCondition();
    private AtomicBoolean stopped = new AtomicBoolean(true);
    private Thread t = null;
    private final void wakeupThread() {
        queueLock.lock();
        try {
            morePackets.signalAll();
        } finally {
            queueLock.unlock();
        }
    }
    public void startThread() {
        if (t == null) {
            t = new Thread(this);
            stopped.set(false);
            t.start();
        }
    }
    public void stopThread() {
        stopped.set(true);
        if (t != null) {
            t.interrupt();
            try {
                wakeupThread();
                t.join();
            } catch (InterruptedException e) {
            }
            t = null;
        }
        audioQueue.clear();
        videoQueue.clear();
    }
    private long getStreamEndSampleTimestamp(BlockingDeque<Packet> queue) {
        long sample = 0;
        try {
            sample = queue.getLast().getSampleTimestamp() + queue.getLast().getFrameLen();
        } catch (Exception e) {
        }
        return sample;
    }
    private PacketType syncType = PacketType.AnyPacketType;
    private boolean aligned = false;
    private final BlockingDeque<Packet> getPacketQueue(PacketType type) {
        return (type == PacketType.AudioPacketType ? audioQueue : videoQueue);
    }
    private void setPacketTimestamp(Packet p) {
        /* Don't sync on SEI packet. */
        if (!aligned && p.type != syncType) {
            long shift = getStreamEndSampleTimestamp(getPacketQueue(syncType));
            Log.w(StreamingApp.TAG, "Set shift +" + shift + "ms to "
                    + (p.type == PacketType.VideoPacketType ? "video" : "audio") + " stream ("
                    + (getPacketQueue(syncType).size() + 1) + ") packets.");
            p.setTimestamp(p.getDuration(shift));
            p.setSampleTimestamp(shift);
            if (listener != null)
                listener.startSession(this);
            aligned = true;
        } else {
            p.setTimestamp(p.packetizer.getTimestamp());
            p.setSampleTimestamp(p.packetizer.getSampleTimestamp());
        }
        p.packetizer.setSampleTimestamp(p.getSampleTimestamp() + p.getFrameLen());
        p.packetizer.setTimestamp(p.getTimestamp() + p.getDuration());
        //        if (BuildConfig.DEBUG) {
        //            Log.d(StreamingApp.TAG, (p.type == PacketType.VideoPacketType ? "Video" : "Audio") + " packet timestamp: "
        //                    + p.getTimestamp() + "; sampleTimestamp: " + p.getSampleTimestamp());
        //        }
    }
    /*
     * Drop first frames if len is less than this. First sync frame will have
     * frame len >= 10 ms.
     */
    private final static int MinimalSyncFrameLength = 15;
    @Override
    public void onPacketReceived(Packet p) {
        queueLock.lock();
        try {
            /*
             * We always synchronize on video stream. Some devices have video
             * coming faster than audio, this is ok. Audio stream time stamps
             * will be adjusted. Other devices that have audio come first will
             * see all audio packets dropped until first video packet comes.
             * Then upon first video packet we again adjust the audio stream by
             * time stamp of the last video packet in the queue.
             */
            if (syncType == PacketType.AnyPacketType && p.type == PacketType.VideoPacketType
                    && p.getFrameLen() >= MinimalSyncFrameLength)
                syncType = p.type;
            if (syncType == PacketType.VideoPacketType) {
                setPacketTimestamp(p);
                if (getPacketQueue(p.type).size() > MAX_QUEUE_SIZE - 1) {
                    Log.w(StreamingApp.TAG, "Queue (" + p.type + ") is full, dropping packet.");
                } else {
                    /*
                     * Wakeup sending thread only if channels synchronization is
                     * already done.
                     */
                    getPacketQueue(p.type).add(p);
                    if (aligned)
                        morePackets.signalAll();
                }
            }
        } finally {
            queueLock.unlock();
        }
    }
    private boolean hasMorePackets(EnumSet<Packet.PacketType> mask) {
        boolean gotPackets;
        if (mask.contains(PacketType.AudioPacketType) && mask.contains(PacketType.VideoPacketType)) {
            gotPackets = (audioQueue.size() > 0 && videoQueue.size() > 0) && aligned;
        } else {
            if (mask.contains(PacketType.AudioPacketType))
                gotPackets = (audioQueue.size() > 0);
            else if (mask.contains(PacketType.VideoPacketType))
                gotPackets = (videoQueue.size() > 0);
            else
                gotPackets = (videoQueue.size() > 0 || audioQueue.size() > 0);
        }
        return gotPackets;
    }
    private void waitPackets(EnumSet<Packet.PacketType> mask) {
        queueLock.lock();
        try {
            do {
                if (!stopped.get() && !hasMorePackets(mask)) {
                    try {
                        morePackets.await();
                    } catch (InterruptedException e) {
                    }
                }
            } while (!stopped.get() && !hasMorePackets(mask));
        } finally {
            queueLock.unlock();
        }
    }
    private void sendPackets() {
        boolean send;
        Packet a, v;
        /*
         * Wait for any type of packet and send asap. With time stamps correctly
         * set, the real send moment is not important and may be quite
         * different. Media server will only check for time stamps.
         */
        waitPackets(of(PacketType.AnyPacketType));
        v = videoQueue.peek();
        if (v != null) {
            sendPacket(v);
            do {
                a = audioQueue.peek();
                if ((send = (a != null && a.getSampleTimestamp() <= v.getSampleTimestamp())))
                    sendPacket(a);
            } while (!stopped.get() && send);
        } else {
            a = audioQueue.peek();
            if (a != null)
                sendPacket(a);
        }
    }
    @Override
    public void run() {
        Log.w(StreamingApp.TAG, "Session thread started.");
        /*
         * Wait for both types of front packets to come and synchronize on each
         * other.
         */
        waitPackets(of(PacketType.AudioPacketType, PacketType.VideoPacketType));
        while (!stopped.get())
            sendPackets();
        Log.w(StreamingApp.TAG, "Flushing session queues.");
        Log.w(StreamingApp.TAG, "    " + audioQueue.size() + " audio packets.");
        Log.w(StreamingApp.TAG, "    " + videoQueue.size() + " video packets.");
        long start = SystemClock.elapsedRealtime();
        while (audioQueue.size() > 0 || videoQueue.size() > 0)
            sendPackets();
        Log.w(StreamingApp.TAG, "Session thread stopped.");
        Log.w(StreamingApp.TAG, "Queues flush took " + (SystemClock.elapsedRealtime() - start) + " ms.");
    }
}
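
To tie the two classes together, usage from an Activity would look roughly like this (a sketch based on the signatures above; the host, path and credentials are placeholders, error handling omitted):

Session session = new Session(activity, "media.example.com", 554, "/live/stream1", "user", "pass");
Session.setSurfaceHolder(surfaceView.getHolder());  // preview surface for the camera
session.addVideoTrack(camera, cameraInfo);          // H.264 by default
session.addAudioTrack();                            // AAC by default
session.connect();                                  // ANNOUNCE/SETUP/RECORD negotiation
session.startAll(listener);                         // start capture and the sender thread
// ... later, when the user stops:
session.stopAll();
session.close();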

Check out this answer: Video streaming over WIFI?

Then, if you want to watch the live stream on an Android phone, include the VLC plugin in your application and connect via RTSP (real time streaming protocol):

Intent i = new Intent("org.videolan.vlc.VLCApplication.gui.video.VideoPlayerActivity");
i.setAction(Intent.ACTION_VIEW);
i.setData(Uri.parse("rtsp://10.0.0.179:8086/")); 
startActivity(i);

If VLC is installed on the Android phone, you can start the stream with an intent, passing the IP address and port number as shown above.
