Why Is WebRTC Treated Differently From Regular SIP In Asterisk
I’m learning about WebRTC clients, and am wondering why Asterisk treats them differently from any other SIP client.
The media (RTP) should be no different, so the only difference ought to be on the signaling side. I noticed that the Asterisk wiki mentions the need for res_pjsip_transport_websocket, so does that mean Asterisk requires the signaling to occur over a WebSocket?
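For context, the transport setup the wiki describes looks something like this in pjsip.conf (a rough sketch; the section name is mine, and for wss the TLS certificate lives on the built-in HTTP server in http.conf rather than on the transport itself):

    [transport-wss]
    type=transport
    protocol=wss        ; secure WebSocket (ws for unencrypted)
    bind=0.0.0.0        ; connections actually arrive via the HTTP server configured in http.conf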
If I used a SIP.js fork that places the signaling over UDP (e.g. https://github.com/cwysong85/sipjs-udp), would it just be a regular SIP client, meaning I shouldn’t have to configure anything special in Asterisk, just regular PJSIP?
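By “regular PJSIP” I mean nothing more than a plain transport, e.g.:

    [transport-udp]
    type=transport
    protocol=udp
    bind=0.0.0.0:5060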
One thought on “Why Is WebRTC Treated Differently From Regular SIP In Asterisk”
The signaling can go over whatever transport (UDP, WebSocket, TCP, TLS). WebSockets are commonly used because, as I stated in my other response, that is what the browser provides. At the media level, WebRTC itself is different because it uses additional standards beyond those of a regular SIP client. It does ICE, STUN, TURN, DTLS-SRTP (which makes the SDP incompatible with non-DTLS-SRTP SDP), and others for media streams, packet loss, and more. Could a normal SIP client use those? Yes. Do they? Usually no.
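To make that concrete, on the Asterisk side those standards map to endpoint options in pjsip.conf, roughly like the sketch below (the endpoint name and codecs are illustrative; webrtc=yes is the Asterisk 15+ shorthand that flips the individual options on):

    [webrtc-client]
    type=endpoint
    transport=transport-wss
    context=default
    disallow=all
    allow=opus,ulaw
    webrtc=yes          ; Asterisk 15+: enables use_avpf, ice_support, media_encryption=dtls, rtcp_mux, etc.
    ; On older versions, set the options individually:
    ; use_avpf=yes
    ; media_encryption=dtls
    ; dtls_verify=fingerprint
    ; dtls_setup=actpass
    ; ice_support=yes
    ; rtcp_mux=yes

A plain SIP endpoint simply omits all of that, which is the SDP incompatibility mentioned above.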
All of this isn’t driven by Asterisk, but by WebRTC.