NMS v3.21 WHIP Protocol WebRTC Publishing Guide
1. Protocol Implementation
NMS v3.21 reimplements WebRTC publishing on top of the WHIP protocol (WebRTC-HTTP Ingestion Protocol).
2. Features
- Open client side: any custom client implementation can be used
- Stream capture capabilities (illustrated in the sketch after this list):
  - Camera/microphone capture
  - Desktop/application window capture
  - Canvas/Video element capture
  - Video filters, AI effects, and more
- Full WebRTC feature support (see the WebRTC Samples)
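The capture sources listed above all come from standard browser APIs rather than anything NMS-specific. As a minimal sketch (the function names here are illustrative, not part of NMS), a screen share or a processed canvas can feed the same publishing flow shown in section 6 simply by swapping the source MediaStream:

```javascript
// Sketch: alternative capture sources. Whatever MediaStream they return can be
// attached to the RTCPeerConnection exactly like the camera stream in section 6.

// Desktop / application window capture (the browser prompts the user to pick one).
async function captureScreen() {
  return navigator.mediaDevices.getDisplayMedia({ video: { frameRate: 30 }, audio: true });
}

// Canvas capture, e.g. after drawing filtered or AI-processed frames onto it.
function captureCanvas(canvas) {
  return canvas.captureStream(30); // 30 fps MediaStream taken from the canvas
}
```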
3. Configuration
```ini
[webrtc]
# Use the standard WHIP and WHEP protocols for direct WebRTC publishing and playback
# ip: if the server is on the public internet, set its public IP address; on a LAN, set the LAN IP.
#     This address is used for direct communication; if left empty, STUN is used for discovery.
ip =
# port: listening port; the WebRTC service is enabled once this is set
port = 10000
# ice_tcp: whether ICE traffic is carried over TCP. Supported by Chrome, not supported by OBS
ice_tcp = 0
# stun: STUN server address; separate multiple servers with |
stun = stun:stun.l.google.com:19302
```
Key parameters:
- `ip`: when a fixed public IP address is explicitly set, WebRTC communicates through 1:1 NAT and connects directly and quickly, just like RTMP and similar protocols.
- `port`: the WebRTC communication port. Transport currently goes directly over TCP, so public-network traffic does not suffer packet loss, reordering, frame corruption, or stuttering.
- `stun`: the STUN server. If no fixed public IP is configured, the public address is discovered via STUN.
- Note: WHIP exchanges SDP signaling via HTTP POST, so the HTTP or HTTPS port must be open. The stream URL uses the HTTP/HTTPS port; the WebRTC port is used internally for media and never appears in the stream URL.
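As a minimal sketch of that note (the host and ports are illustrative), the SDP offer is POSTed to the HTTP/HTTPS port, while the media itself then flows over the WebRTC port configured above:

```javascript
// Sketch only: signaling goes to the HTTP port (8000/8443 here), never to the
// [webrtc] port (10000), which carries media internally and never appears in the URL.
async function exchangeSdp(offerSdp) {
  const response = await fetch('http://192.168.0.2:8000/live/demo.whip', { // illustrative address
    method: 'POST',
    headers: { 'Content-Type': 'application/sdp' },
    body: offerSdp
  });
  return {
    answerSdp: await response.text(),           // server SDP in the response body
    location: response.headers.get('Location')  // resource ID used later for DELETE
  };
}
```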
4. Stream URL Format
HTTPS (a valid certificate is required)
```bash
https://server_name:8443/live/stream.whip
```
- Most web applications are reached by domain name, and a page must be served over HTTPS before it may open the camera and microphone; the certificate has to be issued by a CA, otherwise a self-signed certificate must be accepted in advance. When the page is served over HTTPS, the publishing URL must also be an HTTPS address reached by domain name, and NMS must be configured with a valid certificate.
HTTP (local testing)
```bash
http://server_ip:8000/live/stream.whip
```
- Alternatively, when the page is opened via 127.0.0.1 or localhost, the camera can be captured without HTTPS, so the publishing URL may be an HTTP address with a plain IP (a capability check is sketched below).
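A small sketch of the capability check implied by the two cases above, using standard browser properties (nothing here is NMS-specific):

```javascript
// Camera/microphone capture requires a secure context: an HTTPS origin,
// or http://localhost / http://127.0.0.1 for local testing.
function canCapture() {
  return window.isSecureContext && !!(navigator.mediaDevices && navigator.mediaDevices.getUserMedia);
}

if (!canCapture()) {
  console.warn('getUserMedia unavailable: serve the page over HTTPS or open it via localhost.');
}
```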
5. Publishing Control
- Ending the stream: as defined by the protocol, after the local SDP is POSTed, the server returns its own SDP in the response body along with a `Location` header whose value is the resource ID. Sending an HTTP DELETE to that address ends the stream immediately (a minimal teardown sketch follows this list).
- Timeout: if the DELETE is never sent, the server still detects the disconnect from the ICE state, but not immediately; the timeout is typically around 10 seconds.
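A minimal teardown sketch, assuming the `Location` value may come back either as an absolute URL or as a path relative to the WHIP endpoint:

```javascript
// Sketch: end the publish immediately by DELETEing the resource the server returned.
// endpointUrl is the original .whip URL; location is the Location header value.
async function stopPublish(endpointUrl, location) {
  const resourceUrl = new URL(location, endpointUrl); // resolves both relative and absolute values
  await fetch(resourceUrl, { method: 'DELETE' });
}
```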
6. Integration Example
1. Define a video element to preview the camera
```html
<video id="localVideo" playsinline autoplay muted></video>
<div class="box">
  <button id="startButton">Start</button>
  <button id="callButton">Call</button>
  <button id="hangupButton">Hang Up</button>
</div>
```
2. Create the media stream
```javascript
// Wire up the three control buttons and the local preview element.
const startButton = document.getElementById('startButton');
const callButton = document.getElementById('callButton');
const hangupButton = document.getElementById('hangupButton');
callButton.disabled = true;
hangupButton.disabled = true;
startButton.addEventListener('click', start);
callButton.addEventListener('click', call);
hangupButton.addEventListener('click', hangup);

const localVideo = document.getElementById('localVideo');
localVideo.addEventListener('loadedmetadata', function () {
  console.log(`Local video videoWidth: ${this.videoWidth}px, videoHeight: ${this.videoHeight}px`);
});

let localStream;
let pc1;
let whipLocation;

const offerOptions = {
  offerToReceiveAudio: 1,
  offerToReceiveVideo: 1
};

const configuration = {
  iceServers: [
    {
      urls: "stun:stun.l.google.com:19302"
    }
  ]
};

// Capture the local camera and microphone and show the preview.
async function start() {
  console.log('Requesting local stream');
  startButton.disabled = true;
  try {
    const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: { width: 1280, height: 720, frameRate: 30 } });
    console.log('Received local stream');
    localVideo.srcObject = stream;
    localStream = stream;
    callButton.disabled = false;
  } catch (e) {
    alert(`getUserMedia() error: ${e.name}`);
  }
}
```
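If a particular camera is needed rather than the browser default, one optional refinement (not part of the original sample) is to pick the device by `deviceId` after enumerating inputs:

```javascript
// Sketch: choose a specific camera by (partial) label instead of the default device.
async function startWithCamera(labelFragment) {
  const devices = await navigator.mediaDevices.enumerateDevices();
  const cam = devices.find(d => d.kind === 'videoinput' && d.label.includes(labelFragment));
  return navigator.mediaDevices.getUserMedia({
    audio: true,
    video: {
      deviceId: cam ? { exact: cam.deviceId } : undefined,
      width: 1280,
      height: 720,
      frameRate: 30
    }
  });
}
```

Note that device labels are only populated after the user has granted media permission at least once.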
3. Start publishing
```javascript
// Create the peer connection, attach the local tracks, and exchange SDP with the WHIP endpoint.
async function call() {
  callButton.disabled = true;
  hangupButton.disabled = false;
  console.log('Starting call');
  const videoTracks = localStream.getVideoTracks();
  const audioTracks = localStream.getAudioTracks();
  if (videoTracks.length > 0) {
    console.log(`Using video device: ${videoTracks[0].label}`);
  }
  if (audioTracks.length > 0) {
    console.log(`Using audio device: ${audioTracks[0].label}`);
  }
  console.log('RTCPeerConnection configuration:', configuration);
  pc1 = new RTCPeerConnection(configuration);
  console.log('Created local peer connection object pc1');
  // pc1.addEventListener('icecandidate', e => onIceCandidate(pc1, e));
  pc1.onicegatheringstatechange = gatheringStateChange;
  pc1.addEventListener('iceconnectionstatechange', e => onIceStateChange(pc1, e));
  localStream.getTracks().forEach(track => pc1.addTrack(track, localStream));
  console.log('Added local stream to pc1');
  try {
    console.log('pc1 createOffer start');
    const offer = await pc1.createOffer(offerOptions);
    await pc1.setLocalDescription(offer);
    // POST the local SDP to the WHIP endpoint and apply the server's answer.
    const answer = await postSDPOffer("https://192.168.0.2:8443/live/demo.whip", pc1.localDescription.sdp);
    await pc1.setRemoteDescription(answer);
    whipLocation = answer.location;
    // await onCreateOfferSuccess(offer);
  } catch (e) {
    onCreateSessionDescriptionError(e);
  }
}

// Exchange SDP with the server: POST the offer, read the answer from the body
// and the resource ID from the Location header.
async function postSDPOffer(url, sdp) {
  let result = {
    status: 0,
    type: 'answer',
  };
  try {
    const response = await fetch(url, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/sdp',
        // Custom headers used by this demo in addition to the standard application/sdp content type.
        "video-bitrate": 2_000_000,
        "video-keyint": 3,
        "audio-bitrate": 64_000
      },
      body: sdp
    });
    result.status = response.status;
    result.sdp = await response.text();
    result.location = response.headers.get("Location");
  } catch (error) {
    result.error = error;
  }
  return result;
}

// Placeholder logging/error helpers referenced above. The full demo source
// provides its own versions; these minimal stand-ins only log state.
function gatheringStateChange() {
  console.log(`ICE gathering state: ${pc1.iceGatheringState}`);
}

function onIceStateChange(pc, event) {
  console.log(`ICE connection state: ${pc.iceConnectionState}`, event);
}

function onCreateSessionDescriptionError(error) {
  console.error(`Failed to create session description: ${error.toString()}`);
}
```
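Since `postSDPOffer` returns the HTTP status alongside the SDP, a caller could also guard against a rejected offer before applying the answer; this check is an assumption, not part of the original demo:

```javascript
// Sketch: only apply the remote description when the server accepted the offer.
const answer = await postSDPOffer("https://192.168.0.2:8443/live/demo.whip", pc1.localDescription.sdp);
if (answer.error || answer.status < 200 || answer.status >= 300) {
  throw new Error(`WHIP POST failed with status ${answer.status}`);
}
await pc1.setRemoteDescription({ type: 'answer', sdp: answer.sdp });
whipLocation = answer.location;
```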
4. Stop publishing
```javascript
function hangup() {
  console.log('Ending call');
  pc1.close();
  pc1 = null;
  hangupButton.disabled = true;
  callButton.disabled = false;
  // DELETE the WHIP resource returned in the Location header so the server stops the stream immediately.
  fetch(whipLocation, { method: "DELETE" });
}
```
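The DELETE above is fire-and-forget; one additional safeguard (an assumption, not part of the original demo) is to end the publish when the tab is closed, using a keepalive request so the browser can still deliver it during unload:

```javascript
// Sketch: best-effort teardown when the page is closed or navigated away from.
window.addEventListener('pagehide', () => {
  if (whipLocation) {
    fetch(whipLocation, { method: 'DELETE', keepalive: true });
  }
});
```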
7. OBS Publishing Configuration

Note:
- The port used here is still the HTTP port, not the WebRTC port from the configuration: port 8000 handles the signaling exchange (explicit in the URL), while port 10000 carries the media (implicit).
8. Online Publishing Demo
NodeMedia WebRTC WHIP Publishing Demo (right-click to view the full source code)