How to implement Android screen recording with Chrome
Published 2017-02-15 15:32:01
I've long wanted to be able to record the Android screen directly and encode the result in a variety of formats, so that it can be embedded anywhere without having to install any software.
Now we are close to that goal: the Chrome team is adding a feature that lets you share the screen of your Android device via getUserMedia. I created a prototype that captures the screen and streams it to another device, which records it to a file and adds a device frame around it.
While WebRTC has a lot of complicated details, the infrastructure needed here is not as complex as WebRTC in general.
The recording process is divided into two stages:
1. Capture locally (and optionally record locally);
2. Transfer to a remote desktop.
Capture Screen
getUserMedia is a very useful API: it gives you real-time access to any camera or microphone, inline, directly on the web. The getUserMedia API can also be asked to connect only to certain types of devices. For example, the constraint {audio: true} requests a device that can deliver audio, and {video: {'mandatory': {width: 1920, height: 1080}}} indicates that only an HD camera should be connected.
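For illustration, here is a minimal sketch of requesting such a stream, using the same callback form of the API as the rest of this post; the constraint keys mirror the example above and may differ between Chrome versions:

// A minimal sketch: request an HD camera stream with the legacy
// callback form of getUserMedia used throughout this post.
navigator.getUserMedia(
  {audio: false, video: {'mandatory': {width: 1920, height: 1080}}},
  function (stream) { /* use the camera stream */ },
  function (err) { console.log(err); }
);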
Chrome is about to launch a new constraint, {'chromeMediaSource': 'screen'}, which tells Chrome to use the screen as the source of the stream.
It is currently behind a flag and is completely experimental. On Android, you need to turn on chrome://flags#enable-usermedia-screen-capturing to enable it. You can also track the implementation in Chrome bug 487935.
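Because the feature is experimental, it's worth checking that getUserMedia is available at all before asking for the screen. A small sketch, allowing for the vendor-prefixed forms that older browser builds expose:

// Hedged sketch: normalize the vendor-prefixed getUserMedia variants
// before using the API, and bail out if none is available.
navigator.getUserMedia = navigator.getUserMedia ||
    navigator.webkitGetUserMedia ||
    navigator.mozGetUserMedia;
if (!navigator.getUserMedia) {
  console.log('getUserMedia is not supported in this browser');
}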
const constraints = {
  audio: false, // mandatory.
  video: {'mandatory': {'chromeMediaSource': 'screen'}}
};

const successCallback = (stream) => {
  // Do something with the stream:
  // * attach it to a WebRTC connection;
  // * record it via MediaRecorder.
};

const errorCallback = () => {
  // We don't have access to the API.
};

navigator.getUserMedia(constraints, successCallback, errorCallback);
That's all the code.
Of course, that's just the technical side. In practice, you don't get access straight away. Users need to grant access to the getUserMedia stream (as usual), and because this API is so powerful, they must explicitly opt in to sharing their screen. Once they do, the system clearly indicates that the screen is being shared.
Now that you have a stream of the screen, you can store it locally, and you can also transfer it to an external location via WebRTC.
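Streaming to the other device is standard WebRTC. A rough sketch of attaching the captured stream to a peer connection follows; the signaling channel and its sendToRemotePeer helper are hypothetical, app-specific stand-ins, and exchanging the offer/answer and ICE candidates is left to the application:

// Hedged sketch: attach the screen stream to an RTCPeerConnection.
// sendToRemotePeer is a hypothetical signaling helper.
const pc = new RTCPeerConnection({iceServers: [{urls: 'stun:stun.l.google.com:19302'}]});
pc.addStream(stream); // the stream from successCallback above (legacy API)
pc.onicecandidate = (e) => {
  if (e.candidate) sendToRemotePeer({candidate: e.candidate});
};
pc.createOffer()
  .then((offer) => pc.setLocalDescription(offer))
  .then(() => sendToRemotePeer({sdp: pc.localDescription}))
  .catch((err) => console.log(err));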
With MediaRecorder, you can record the screen locally, as I did in my WebGL recording snippet. I also created a simple demo that records the screen for 10 seconds and then downloads the recording to your device.
(function () {
  // Download the recording locally.
  function download(blob) {
    var url = window.URL.createObjectURL(blob);
    var a = document.createElement('a');
    a.style.display = 'none';
    a.href = url;
    a.download = 'test.webm';
    document.body.appendChild(a);
    a.click();
    setTimeout(function () {
      document.body.removeChild(a);
      window.URL.revokeObjectURL(url);
    }, 100);
  }

  const successCallback = (stream) => {
    // Set up the recorder.
    let blobs = [];
    let recorder = new MediaRecorder(stream, {mimeType: 'video/webm; codecs=vp9'});
    recorder.ondataavailable = (e) => { if (e.data && e.data.size > 0) blobs.push(e.data); };
    recorder.onstop = (e) => download(new Blob(blobs, {type: 'video/webm'}));

    // Record for ten seconds.
    setTimeout(() => recorder.stop(), 10000);

    // Start recording.
    recorder.start(10); // collect 10ms chunks of data
  };

  const errorCallback = (err) => {
    // We don't have access to the API.
    console.log(err);
  };

  navigator.getUserMedia({
    audio: false,
    video: {'mandatory': {'chromeMediaSource': 'screen'}}
  }, successCallback, errorCallback);
})();
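Instead of downloading, the same recorded blob can also be previewed in place. A small sketch of an alternative to the download() helper above:

// Hedged alternative to download(): play the recording back in-page.
function play(blob) {
  const video = document.createElement('video');
  video.controls = true;
  video.src = window.URL.createObjectURL(blob);
  document.body.appendChild(video);
  video.play();
}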
From: http://www.jianshu.com/p/5055c29173df