
Commit 9c0d679

Merge pull request #25 from membraneframework/live-view
LiveView for establishing WebRTC connections
2 parents 99dc41a + 72fbce7

File tree

42 files changed: +1821 −72 lines


.credo.exs

Lines changed: 0 additions & 1 deletion

@@ -164,7 +164,6 @@
 {Credo.Check.Design.DuplicatedCode, false},
 {Credo.Check.Readability.AliasAs, false},
 {Credo.Check.Readability.MultiAlias, false},
-{Credo.Check.Readability.Specs, []},
 {Credo.Check.Readability.SinglePipe, false},
 {Credo.Check.Readability.WithCustomTaggedTuple, false},
 {Credo.Check.Refactor.ABCSize, false},

.formatter.exs

Lines changed: 1 addition & 1 deletion

@@ -4,5 +4,5 @@
     ".formatter.exs",
     "*.exs"
   ],
-  import_deps: [:membrane_core]
+  import_deps: [:membrane_core, :phoenix]
 ]

README.md

Lines changed: 117 additions & 4 deletions

@@ -20,16 +20,129 @@ def deps do
 end
 ```
 
-## Usage
+## Demos
 
 The `examples` directory shows how to send and receive streams from a web browser.
-There are following two demos there:
+There are the following three demos:
+* `live_view` - a simple Phoenix LiveView project using `Membrane.WebRTC.Live.Player` and `Membrane.WebRTC.Live.Capture` to echo the video stream
+captured from the user's browser.
 * `phoenix_signaling` - showcasing simple Phoenix application that uses `Membrane.WebRTC.PhoenixSignaling` to echo stream captured
 from the user's browser and sent via WebRTC. See `assets/phoenix_signaling/README.md` for details on how to run the demo.
-* `webrtc_signaling` - it consists of two scripts: `file_to_browser.exs` and `browser_to_file.exs`. The first one display stream from
-the fixture file in the user's browser. The later one captures user's camera input from the browser and saves it in the file.
+* `webrtc_signaling` - it consists of two scripts: `file_to_browser.exs` and `browser_to_file.exs`. The first one displays the stream from
+the fixture file in the user's browser. The latter captures the user's camera input from the browser and saves it to a file.
 To run one of these demos, type: `elixir <script_name>` and visit `http://localhost:4000`.
 
+## Exchanging Signaling Messages
+
+To establish a WebRTC connection, you have to exchange WebRTC signaling messages between peers.
+In `membrane_webrtc_plugin` this can be done by the user with `Membrane.WebRTC.Signaling`, or by passing a WebSocket address to
+`Membrane.WebRTC.Source` or `Membrane.WebRTC.Sink`, but there are two additional ways of doing it, dedicated to `Phoenix` projects:
+- The first one is to use `Membrane.WebRTC.PhoenixSignaling` along with `Membrane.WebRTC.PhoenixSignaling.Socket`.
+- The second one is to use the `Phoenix.LiveView`s `Membrane.WebRTC.Live.Player` or `Membrane.WebRTC.Live.Capture`. These modules expect
+`t:Membrane.WebRTC.Signaling.t/0` as an argument and take advantage of the WebSocket used by `Phoenix.LiveView` to exchange WebRTC
+signaling messages, so there is no need to add any code to handle signaling messages.
+
+### How to use Membrane.WebRTC.PhoenixSignaling in your own Phoenix project?
+
+To see the full example, visit `examples/phoenix_signaling`.
+
+1. Create a new socket in your application endpoint, using `Membrane.WebRTC.PhoenixSignaling.Socket`, for instance at the `/signaling` path:
+```elixir
+socket "/signaling", Membrane.WebRTC.PhoenixSignaling.Socket,
+  websocket: true,
+  longpoll: false
+```
+2. Create a Phoenix signaling channel with the desired signaling ID and use it as `Membrane.WebRTC.Signaling.t()`
+for `Membrane.WebRTC.Source`, `Membrane.WebRTC.Sink` or [`Boombox`](https://github.com/membraneframework/boombox):
+```elixir
+signaling = Membrane.WebRTC.PhoenixSignaling.new("<signaling_id>")
+
+# use it with Membrane.WebRTC.Source:
+child(:webrtc_source, %Membrane.WebRTC.Source{signaling: signaling})
+|> ...
+
+# or with Membrane.WebRTC.Sink:
+...
+|> child(:webrtc_sink, %Membrane.WebRTC.Sink{signaling: signaling})
+
+# or with Boombox:
+Boombox.run(
+  input: {:webrtc, signaling},
+  output: ...
+)
+```
+
+> Please note that `signaling_id` is expected to be globally unique for each WebRTC connection about to be
+> established. You can, for instance:
+> 1. Generate a unique id with the `:uuid` package and assign it to the connection in the page controller:
+> ```elixir
+> unique_id = UUID.uuid4()
+> render(conn, :home, layout: false, signaling_id: unique_id)
+> ```
+>
+> 2. Generate HTML based on a HEEx template, using the previously set assign:
+> ```html
+> <video id="videoPlayer" controls muted autoplay signaling_id={@signaling_id}></video>
+> ```
+>
+> 3. Access it in your client code:
+> ```js
+> const videoPlayer = document.getElementById('videoPlayer');
+> const signalingId = videoPlayer.getAttribute('signaling_id');
+> ```
+
+3. Use the Phoenix Socket to exchange WebRTC signaling data:
+```js
+let socket = new Socket("/signaling", {params: {token: window.userToken}})
+socket.connect()
+let channel = socket.channel('<signaling_id>')
+channel.join()
+  .receive("ok", resp => { console.log("Signaling socket joined successfully", resp)
+    // here you can exchange WebRTC data
+  })
+  .receive("error", resp => { console.log("Unable to join signaling socket", resp) })
+```
+
+Visit `examples/phoenix_signaling/assets/js/signaling.js` to see what the WebRTC signaling message exchange might look like.
+
+## Integrating Phoenix.LiveView with Membrane WebRTC Plugin
+
+`membrane_webrtc_plugin` comes with two `Phoenix.LiveView`s:
+- `Membrane.WebRTC.Live.Capture` - exchanges WebRTC signaling messages between `Membrane.WebRTC.Source` and the browser. It
+expects the same `Membrane.WebRTC.Signaling` that has been passed to the related `Membrane.WebRTC.Source`. As a result,
+`Membrane.WebRTC.Source` will return the media stream captured from the browser where `Membrane.WebRTC.Live.Capture` has been
+rendered.
+- `Membrane.WebRTC.Live.Player` - exchanges WebRTC signaling messages between `Membrane.WebRTC.Sink` and the browser. It
+expects the same `Membrane.WebRTC.Signaling` that has been passed to the related `Membrane.WebRTC.Sink`. As a result,
+`Membrane.WebRTC.Live.Player` will play the media streams passed to the related `Membrane.WebRTC.Sink`. It currently supports
+up to one video stream and up to one audio stream.
+
+### Usage
+
+To use the `Phoenix.LiveView`s from this repository, you have to use the related JS hooks. To do so, add the following snippet to `assets/js/app.js`:
+
+```js
+import { createCaptureHook, createPlayerHook } from "membrane_webrtc_plugin";
+
+let Hooks = {};
+const iceServers = [{ urls: "stun:stun.l.google.com:19302" }];
+Hooks.Capture = createCaptureHook(iceServers);
+Hooks.Player = createPlayerHook(iceServers);
+```
+
+and add `Hooks` to the `LiveSocket` constructor:
+
+```js
+new LiveSocket("/live", Socket, {
+  params: SomeParams,
+  hooks: Hooks,
+});
+```
+
+To see the full usage example, go to the `examples/live_view/` directory in this repository (take a look especially at `examples/live_view/assets/js/app.js` and `examples/live_view/lib/example_project_web/live_views/echo.ex`).
+
 ## Copyright and License
 
 Copyright 2020, [Software Mansion](https://swmansion.com/?utm_source=git&utm_medium=readme&utm_campaign=membrane_webrtc_plugin)
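Throughout this commit, signaling messages travel as plain `{ type, data }` envelopes (`sdp_offer`, `sdp_answer`, `ice_candidate`). A minimal, runnable sketch of how client code might route such envelopes — the helper name `makeSignalingRouter` is hypothetical and not part of the plugin:

```javascript
// Hypothetical helper (not part of membrane_webrtc_plugin): dispatch the
// { type, data } envelopes used in this commit to per-type handlers,
// mirroring the switch statements in capture.js and player.js.
function makeSignalingRouter(handlers) {
  return (message) => {
    const { type, data } = message;
    const handler = handlers[type];
    if (!handler) {
      throw new Error(`Unknown signaling message type: ${type}`);
    }
    return handler(data);
  };
}

// Usage sketch: record which handler fired for each envelope.
const seen = [];
const route = makeSignalingRouter({
  sdp_answer: (data) => seen.push(["sdp_answer", data]),
  ice_candidate: (data) => seen.push(["ice_candidate", data]),
});

route({ type: "sdp_answer", data: { type: "answer", sdp: "v=0" } });
route({ type: "ice_candidate", data: { candidate: "candidate:0" } });
// seen now holds one entry per routed message, in order.
```

In a real hook the handlers would call `setRemoteDescription` and `addIceCandidate` on an `RTCPeerConnection`, as the files below do.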

assets/capture.js

Lines changed: 57 additions & 0 deletions

@@ -0,0 +1,57 @@
+export function createCaptureHook(iceServers = [{ urls: `stun:stun.l.google.com:19302` }]) {
+  return {
+    async mounted() {
+      this.handleEvent(`media_constraints-${this.el.id}`, async (mediaConstraints) => {
+        console.log(`[${this.el.id}] Received media constraints:`, mediaConstraints);
+
+        const localStream = await navigator.mediaDevices.getUserMedia(mediaConstraints);
+        const pcConfig = { iceServers: iceServers };
+        this.pc = new RTCPeerConnection(pcConfig);
+
+        this.pc.onicecandidate = (event) => {
+          if (event.candidate === null) return;
+          console.log(`[${this.el.id}] Sent ICE candidate:`, event.candidate);
+          const message = { type: `ice_candidate`, data: event.candidate };
+          this.pushEventTo(this.el, `webrtc_signaling`, message);
+        };
+
+        this.pc.onconnectionstatechange = () => {
+          console.log(
+            `[${this.el.id}] RTCPeerConnection state changed to`,
+            this.pc.connectionState
+          );
+        };
+
+        this.el.srcObject = new MediaStream();
+
+        for (const track of localStream.getTracks()) {
+          this.pc.addTrack(track, localStream);
+          this.el.srcObject.addTrack(track);
+        }
+
+        this.el.play();
+
+        this.handleEvent(`webrtc_signaling-${this.el.id}`, async (event) => {
+          const { type, data } = event;
+
+          switch (type) {
+            case `sdp_answer`:
+              console.log(`[${this.el.id}] Received SDP answer:`, data);
+              await this.pc.setRemoteDescription(data);
+              break;
+            case `ice_candidate`:
+              console.log(`[${this.el.id}] Received ICE candidate:`, data);
+              await this.pc.addIceCandidate(data);
+              break;
+          }
+        });
+
+        const offer = await this.pc.createOffer();
+        await this.pc.setLocalDescription(offer);
+        console.log(`[${this.el.id}] Sent SDP offer:`, offer);
+        const message = { type: `sdp_offer`, data: offer };
+        this.pushEventTo(this.el, `webrtc_signaling`, message);
+      });
+    },
+  };
+}
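The capture hook is produced by a factory so that `iceServers` can be configured once and closed over by the hook's callbacks. A stripped-down, runnable model of that pattern — the browser and Phoenix APIs are omitted, and `createHook` is a stand-in name, not the plugin's API:

```javascript
// Stand-in model of the factory pattern used by createCaptureHook and
// createPlayerHook: the factory closes over its configuration, and the
// returned hook object reads it later from the closure.
function createHook(iceServers = [{ urls: "stun:stun.l.google.com:19302" }]) {
  return {
    mounted() {
      // The real hook passes this to `new RTCPeerConnection(...)`;
      // here we only expose the captured configuration.
      this.pcConfig = { iceServers: iceServers };
    },
  };
}

const hook = createHook([{ urls: "stun:stun.example.com:3478" }]);
hook.mounted();
// hook.pcConfig now reflects the servers passed to the factory.
```

This is why `app.js` below can build `Hooks.Capture` and `Hooks.Player` with a shared `iceServers` array: each factory call bakes the configuration into the returned hook.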

assets/index.js

Lines changed: 4 additions & 0 deletions

@@ -0,0 +1,4 @@
+import { createCaptureHook } from "./capture.js";
+import { createPlayerHook } from "./player.js";
+
+export { createCaptureHook, createPlayerHook };

assets/player.js

Lines changed: 41 additions & 0 deletions

@@ -0,0 +1,41 @@
+export function createPlayerHook(iceServers = [{ urls: `stun:stun.l.google.com:19302` }]) {
+  return {
+    async mounted() {
+      this.pc = new RTCPeerConnection({ iceServers: iceServers });
+      this.el.srcObject = new MediaStream();
+
+      this.pc.ontrack = (event) => {
+        this.el.srcObject.addTrack(event.track);
+      };
+
+      this.pc.onicecandidate = (ev) => {
+        if (ev.candidate === null) return;
+        console.log(`[${this.el.id}] Sent ICE candidate:`, ev.candidate);
+        const message = { type: `ice_candidate`, data: ev.candidate };
+        this.pushEventTo(this.el, `webrtc_signaling`, message);
+      };
+
+      const eventName = `webrtc_signaling-${this.el.id}`;
+      this.handleEvent(eventName, async (event) => {
+        const { type, data } = event;
+
+        switch (type) {
+          case `sdp_offer`: {
+            console.log(`[${this.el.id}] Received SDP offer:`, data);
+            await this.pc.setRemoteDescription(data);
+
+            const answer = await this.pc.createAnswer();
+            await this.pc.setLocalDescription(answer);
+
+            const message = { type: `sdp_answer`, data: answer };
+            this.pushEventTo(this.el, `webrtc_signaling`, message);
+            console.log(`[${this.el.id}] Sent SDP answer:`, answer);
+
+            break;
+          }
+          case `ice_candidate`:
+            console.log(`[${this.el.id}] Received ICE candidate:`, data);
+            await this.pc.addIceCandidate(data);
+        }
+      });
+    },
+  };
+}
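The two hooks split the handshake asymmetrically: the capture side creates the SDP offer, and the player side answers it. Since `RTCPeerConnection` exists only in browsers, here is a toy, runnable model of that ordering — all function names and SDP payloads are illustrative only:

```javascript
// Toy model (illustrative only; real sessions are browser
// RTCPeerConnection objects): Capture produces an sdp_offer envelope,
// Player accepts it and replies with an sdp_answer envelope, matching
// the message types exchanged by capture.js and player.js.
function captureCreateOffer() {
  return { type: "sdp_offer", data: { type: "offer", sdp: "v=0" } };
}

function playerHandle(message) {
  if (message.type !== "sdp_offer") {
    throw new Error(`player expected an sdp_offer, got: ${message.type}`);
  }
  // A real player would call setRemoteDescription and createAnswer here.
  return { type: "sdp_answer", data: { type: "answer", sdp: "v=0" } };
}

const answerMessage = playerHandle(captureCreateOffer());
```

The ordering matters: the player cannot produce an answer until an offer has arrived, which is why its signaling handler reacts to `sdp_offer` rather than initiating negotiation itself.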

examples/live_view/.gitignore

Lines changed: 37 additions & 0 deletions

@@ -0,0 +1,37 @@
+# The directory Mix will write compiled artifacts to.
+/_build/
+
+# If you run "mix test --cover", coverage assets end up here.
+/cover/
+
+# The directory Mix downloads your dependencies sources to.
+/deps/
+
+# Where 3rd-party dependencies like ExDoc output generated docs.
+/doc/
+
+# Ignore .fetch files in case you like to edit your project deps locally.
+/.fetch
+
+# If the VM crashes, it generates a dump, let's ignore it too.
+erl_crash.dump
+
+# Also ignore archive artifacts (built via "mix archive.build").
+*.ez
+
+# Temporary files, for example, from tests.
+/tmp/
+
+# Ignore package tarball (built via "mix hex.build").
+phoenix_signaling-*.tar
+
+# Ignore assets that are produced by build tools.
+/priv/static/assets/
+
+# Ignore digested assets cache.
+/priv/static/cache_manifest.json
+
+# In case you use Node.js/npm, you want to ignore these.
+npm-debug.log
+/assets/node_modules/

examples/live_view/README.md

Lines changed: 23 additions & 0 deletions

@@ -0,0 +1,23 @@
+# Example Project
+
+An example project showing how `Membrane.WebRTC.Live.Capture` and `Membrane.WebRTC.Live.Player` can be used.
+
+It contains a simple demo in which:
+- the video stream is captured in the browser and sent via WebRTC to the Elixir server using `Membrane.WebRTC.Live.Capture`,
+- then the same video stream is sent back to the browser and displayed using `Membrane.WebRTC.Live.Player`.
+
+This demo also uses [Boombox](https://hex.pm/packages/boombox).
+
+The most important file in the project is `live_view/lib/example_project_web/live/echo.ex`, which
+contains the usage of `Boombox` and the LiveViews defined in the `membrane_webrtc_plugin` package.
+
+You can also take a look at `live_view/assets/js/app.js` to see how to use the JS hooks from `membrane_webrtc_plugin`.
+
+## Run server
+
+To start the Phoenix server:
+
+* Run `mix setup` to install and set up dependencies
+* Start Phoenix with `mix phx.server` or inside IEx with `iex -S mix phx.server`
+
+Now you can visit [`localhost:4000`](http://localhost:4000) from your browser.
examples/live_view/assets/js/app.js

Lines changed: 51 additions & 0 deletions

@@ -0,0 +1,51 @@
+// If you want to use Phoenix channels, run `mix help phx.gen.channel`
+// to get started and then uncomment the line below.
+// import "./user_socket.js"
+
+// You can include dependencies in two ways.
+//
+// The simplest option is to put them in assets/vendor and
+// import them using relative paths:
+//
+// import "../vendor/some-package.js"
+//
+// Alternatively, you can `npm install some-package --prefix assets` and import
+// them using a path starting with the package name:
+//
+// import "some-package"
+//
+
+// Include phoenix_html to handle method=PUT/DELETE in forms and buttons.
+import "phoenix_html";
+// Establish Phoenix Socket and LiveView configuration.
+import { Socket } from "phoenix";
+import { LiveSocket } from "phoenix_live_view";
+import topbar from "../vendor/topbar";
+
+import { createCaptureHook, createPlayerHook } from "membrane_webrtc_live";
+
+let Hooks = {};
+const iceServers = [{ urls: "stun:stun.l.google.com:19302" }];
+Hooks.Capture = createCaptureHook(iceServers);
+Hooks.Player = createPlayerHook(iceServers);
+
+let csrfToken = document.querySelector("meta[name='csrf-token']").getAttribute("content");
+let liveSocket = new LiveSocket("/live", Socket, {
+  longPollFallbackMs: 2500,
+  params: { _csrf_token: csrfToken },
+  hooks: Hooks,
+});
+
+// Show progress bar on live navigation and form submits
+topbar.config({ barColors: { 0: "#29d" }, shadowColor: "rgba(0, 0, 0, .3)" });
+window.addEventListener("phx:page-loading-start", (_info) => topbar.show(300));
+window.addEventListener("phx:page-loading-stop", (_info) => topbar.hide());
+
+// connect if there are any LiveViews on the page
+liveSocket.connect();
+
+// expose liveSocket on window for web console debug logs and latency simulation:
+// >> liveSocket.enableDebug()
+// >> liveSocket.enableLatencySim(1000) // enabled for duration of browser session
+// >> liveSocket.disableLatencySim()
+window.liveSocket = liveSocket;
