
Parlona Blog
How to Stream Asterisk Audio via WebRTC (2025 Guide)
April 17, 2025
Modern telephony systems increasingly require real-time audio access. Whether you're building:
- live speech-to-text (STT) pipelines
- AI receptionists that respond instantly
- browser-based monitoring dashboards
- or custom voice-driven applications
…the ability to stream audio from Asterisk to a browser or an AI service becomes essential.
This is where WebRTC comes in.
What Is WebRTC?
WebRTC (Web Real-Time Communication) is an open standard that enables:
- Real-time audio & video transport
- Low-latency data channels
- Browser-to-browser or browser-to-server communication
- Built-in NAT traversal, STUN/TURN, DTLS-SRTP encryption
WebRTC is supported by all major browsers without plugins. For Asterisk, it provides a secure, low-latency way to send audio directly to a web browser, Node.js service, or an AI backend.
Why Use WebRTC With Asterisk?
Real-time Speech-to-Text (STT)
If you're building live STT with Whisper, Deepgram, or Google STT, you need audio frames as they happen. WebRTC provides low latency, Opus audio, and reliable jitter buffering.
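Most streaming STT services accept audio as a sequence of fixed-duration frames. The sketch below splits a buffer of decoded 16-bit mono PCM into such frames; the 8 kHz sample rate and 20 ms frame size are illustrative assumptions, not values any particular provider requires:

```javascript
// Split 16-bit mono PCM into fixed-duration frames for a streaming STT API.
// sampleRate and frameMs are assumptions here — match your provider's docs.
function frames(pcm, sampleRate = 8000, frameMs = 20) {
  const bytesPerFrame = (sampleRate * frameMs / 1000) * 2; // 2 bytes per sample
  const out = [];
  for (let i = 0; i + bytesPerFrame <= pcm.length; i += bytesPerFrame) {
    out.push(pcm.subarray(i, i + bytesPerFrame));
  }
  return out; // a trailing partial frame is dropped in this sketch
}
```

Each returned frame can then be written to the recognizer's streaming connection as it is produced, which is what keeps end-to-end transcription latency low.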
AI Receptionists & Voicebots
WebRTC enables AI agents to hear the caller instantly, transcribe intent, decide how to respond, and speak back naturally.
Live Call Dashboards
Many businesses want real-time call monitoring and analytics. WebRTC allows secure, low-latency streaming to supervisor dashboards.
Routing Audio to External Engines
You can stream Opus audio to:
- ASR engines
- ML models
- Compliance systems
- Analytics services
Step-by-Step: Enable WebRTC Audio Streaming in Asterisk
Step 1: Install WebRTC Dependencies
sudo apt install libsrtp2-dev
Ensure your Asterisk build includes:
- res_http_websocket
- chan_pjsip
- codec_opus
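If your modules.conf uses autoload=no, these modules can be loaded explicitly. A minimal sketch (res_pjsip_transport_websocket is assumed here as well, since SIP over WebSocket depends on it):

```ini
; /etc/asterisk/modules.conf — only needed when autoload=no
load => res_http_websocket.so
load => chan_pjsip.so
load => codec_opus.so
load => res_pjsip_transport_websocket.so  ; SIP over WebSocket transport
```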
Step 2: Configure WebRTC Transport in PJSIP
Edit /etc/asterisk/pjsip.conf and define a WSS transport:
[transport-wss]
type=transport
protocol=wss
bind=0.0.0.0
Create a WebRTC endpoint:
[webrtc-endpoint]
type = endpoint
transport = transport-wss
context = internal
disallow = all
allow = opus
webrtc = yes
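A browser client that registers to Asterisk also needs matching aor and auth sections. A minimal sketch (the section names and the credentials are placeholders):

```ini
[webrtc-endpoint]
type = aor
max_contacts = 1
remove_existing = yes

[webrtc-endpoint]
type = auth
auth_type = userpass
username = webrtc-endpoint
password = change-me        ; placeholder — use a real secret

; and in the endpoint section above, reference them:
; aors = webrtc-endpoint
; auth = webrtc-endpoint
```

The webrtc=yes shortcut enables the ICE, AVPF, and DTLS-SRTP settings a browser expects, so they do not need to be set individually.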
Step 3: Enable HTTP + WebSocket
Edit /etc/asterisk/http.conf:
[general]
enabled=yes
bindaddr=0.0.0.0
bindport=8088
No separate WebSocket option is needed: once res_http_websocket is loaded, Asterisk serves the WebSocket endpoint at ws://<host>:8088/ws automatically.
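Browsers will only open wss:// connections over TLS, so in practice the HTTP server also needs a certificate. A sketch of the TLS settings (the certificate paths are placeholders):

```ini
; /etc/asterisk/http.conf — TLS for wss://; paths are placeholders
tlsenable = yes
tlsbindaddr = 0.0.0.0:8089
tlscertfile = /etc/asterisk/keys/asterisk.pem
tlsprivatekey = /etc/asterisk/keys/asterisk.key
```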
Restart Asterisk.
Step 4: Create a Stasis App
In /etc/asterisk/extensions.conf, route a test extension into a Stasis application:
[internal]
exten => 7002,1,Stasis(webrtc_streamer)
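When a call hits Stasis(webrtc_streamer), control passes to an external application over ARI, which delivers events on a WebSocket at /ari/events. The helper below only builds that URL (the host, port, and credentials are placeholders, and ARI must be enabled in ari.conf); the connection itself can be made with any WebSocket client:

```javascript
// Build the ARI events WebSocket URL for a given Stasis app.
// Host, port, and credentials are placeholders — configure them in ari.conf.
function ariEventsUrl({ host, port = 8088, app, user, pass }) {
  const key = encodeURIComponent(`${user}:${pass}`);
  return `ws://${host}:${port}/ari/events?app=${encodeURIComponent(app)}&api_key=${key}`;
}

// Sketch of usage with a WebSocket client library:
//   const ws = new WebSocket(ariEventsUrl({ host: 'localhost', app: 'webrtc_streamer', user: 'ari', pass: 'secret' }));
//   ws.on('message', (m) => {
//     const ev = JSON.parse(m);
//     if (ev.type === 'StasisStart') { /* answer, bridge, start streaming */ }
//   });
```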
Step 5: Forward Audio to a WebRTC Server or Browser
Example Node.js WebRTC handler (addTrack is the current API; addStream is deprecated):
asteriskStream.getTracks().forEach(track => peer.addTrack(track, asteriskStream));
This can forward Opus audio to STT, AI bots, dashboards, or ML pipelines.
Summary
With WebRTC enabled, Asterisk becomes a modern audio streaming platform, enabling:
- Real-time transcription
- AI-powered assistants
- Live call dashboards
- Browser-based softphones
- Low-latency AI-driven telephony
WebRTC transforms Asterisk into a next-generation communication system ready for AI.