How to Stream Asterisk Audio via WebRTC (2025 Guide)

Parlona Blog

April 17, 2025

Asterisk · WebRTC · STT · AI · VoIP · PBX

Modern telephony systems increasingly require real-time audio access. Whether you're building:

  • live speech-to-text (STT) pipelines
  • AI receptionists that respond instantly
  • browser-based monitoring dashboards
  • or custom voice-driven applications

…the ability to stream audio from Asterisk to a browser or an AI service becomes essential.

This is where WebRTC comes in.

πŸ” What Is WebRTC?

WebRTC (Web Real-Time Communication) is an open standard that enables:

  • Real-time audio & video transport
  • Low-latency data channels
  • Browser-to-browser or browser-to-server communication
  • Built-in NAT traversal, STUN/TURN, DTLS-SRTP encryption

WebRTC is supported by all major browsers without plugins. For Asterisk, it provides a secure, low-latency way to send audio directly to a web browser, Node.js service, or an AI backend.

🌟 Why Use WebRTC With Asterisk?

βœ” Real-time Speech-to-Text (STT)

If you're building live STT with Whisper, Deepgram, or Google STT, you need audio frames as they happen. WebRTC provides low latency, Opus audio, and reliable jitter buffering.

βœ” AI Receptionists & Voicebots

WebRTC enables AI agents to hear the caller instantly, transcribe intent, decide how to respond, and speak back naturally.

βœ” Live Call Dashboards

Many businesses want real-time call monitoring and analytics. WebRTC allows secure, low-latency streaming to supervisor dashboards.

βœ” Routing Audio to External Engines

You can stream Opus audio to:

  • ASR engines
  • ML models
  • Compliance systems
  • Analytics services

πŸ›  Step-by-Step: Enable WebRTC Audio Streaming in Asterisk

🧩 Step 1: Install WebRTC Dependencies

sudo apt install libsrtp2-dev

Note that libsrtp must be present when Asterisk is compiled, so re-run ./configure and rebuild if you installed it after building Asterisk. Ensure your Asterisk build includes:

  • res_http_websocket
  • chan_pjsip
  • codec_opus

βš™οΈ Step 2: Configure WebRTC Transport in PJSIP

Edit /etc/asterisk/pjsip.conf:

[transport-wss]
type=transport
protocol=wss
bind=0.0.0.0

Create a WebRTC endpoint:

[webrtc-endpoint]
type = endpoint
transport = transport-wss
context = internal
disallow = all
allow = opus
webrtc = yes

Setting webrtc = yes switches on the DTLS-SRTP, ICE, and AVPF options that browsers require (Asterisk 15 and later). A browser client that registers will also need matching aor and auth sections for this endpoint.
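On the browser side, a SIP-over-WebSocket library negotiates the WebRTC media with this endpoint. Below is a minimal sketch using the JsSIP library (SIP.js works similarly); the host pbx.example.com, extension numbers, and password are placeholders to adapt to your deployment:

```javascript
// Minimal browser-side sketch using JsSIP (loaded via a <script> tag or bundler).
// Host, extensions, and password below are placeholders.

// Build a SIP URI for a given user and host.
function sipUri(user, host) {
  return `sip:${user}@${host}`;
}

// Guarded so this file can also load outside a browser without crashing.
if (typeof JsSIP !== 'undefined') {
  const socket = new JsSIP.WebSocketInterface('wss://pbx.example.com:8089/ws');
  const ua = new JsSIP.UA({
    sockets: [socket],
    uri: sipUri('7001', 'pbx.example.com'),
    password: 'secret', // placeholder credential
  });
  ua.start();

  // Place an audio-only call to the Stasis extension once registered.
  ua.on('registered', () => {
    ua.call(sipUri('7002', 'pbx.example.com'), {
      mediaConstraints: { audio: true, video: false },
    });
  });
}
```

The /ws path is where Asterisk's HTTP server exposes its WebSocket endpoint.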

πŸ“‘ Step 3: Enable HTTP + WebSocket

Edit /etc/asterisk/http.conf:

[general]
enabled=yes
bindaddr=0.0.0.0
bindport=8088
websocket_enabled=yes

For a wss transport you also need TLS here (tlsenable=yes, tlscertfile, tlsprivatekey), since pages served over HTTPS — which browsers require for microphone access outside localhost — can only open wss:// sockets.

Restart Asterisk.

🌐 Step 4: Create a Stasis App

In extensions.conf:

exten => 7002,1,Stasis(webrtc_streamer)

πŸ”Œ Step 5: Forward Audio to a WebRTC Server or Browser

Example WebRTC handler (pseudocode — peer is an RTCPeerConnection and asteriskStream a MediaStream carrying the call audio):

asteriskStream.getTracks().forEach(track => peer.addTrack(track, asteriskStream));

This can forward Opus audio to STT, AI bots, dashboards, or ML pipelines.


🎯 Summary

With WebRTC enabled, Asterisk becomes a modern audio streaming platform, enabling:

  • Real-time transcription
  • AI-powered assistants
  • Live call dashboards
  • Browser-based softphones
  • Low-latency AI-driven telephony

WebRTC transforms Asterisk into a next-generation communication system ready for AI.
