Scripts for generating a personal video-streaming site from DASH-format videos
Video Site

This is a set of scripts for generating a personal video website, which will stream videos using DASH and Google's Shaka Player. It will produce a set of HTML, CSS, and JS suitable for serving (with your video files) from any HTTP server, including Amazon S3.

To get started using it, you first need to transcode your video files and arrange them in the way expected by the scripts here.

Input Directory Structure

Video files have to be arranged in a particular way for this script to work, but it's pretty simple. First, all videos need to be placed into "categories," which are just a single layer of folder structure. Let's consider the following example, which we'll use in the rest of the README:

  • category-1
    • video-a
    • video-b
  • category-2
    • video-c

Only one level of categories is supported. Each category is described by a metadata file named _metadata.json, placed in that category's folder. This file should contain:

{
  "id": "category-1",
  "title": "name of category"
}

Each video within a category is described by a metadata file. Assume that we have a video called video-a, as above. The metadata file describing that video will be called video-a_meta.json, and it should contain:

{
  "id": "video-a",
  "category": "category-1",
  "title": "short",
  "description": "long",
  "date": "2012-04-23T18:25:43.511Z",
  "tags": ["tag", "tag"],
  "aspect": "16by9"
}

The id value should match whatever occurs in the filename before _meta.json. The category value is the name of the folder in which the video sits. The aspect value should be set to 16by9, 4by3, or 1by1, as appropriate. The rest of the metadata values should be self-explanatory; they will be displayed on individual video pages and used for sorting.

In addition to the _meta.json file, two more files are required. A thumbnail of the video (preferably at 360p size) must be present as video-a_thumb.png. (At the moment, we only support PNG thumbnails.) The video content itself must be present as a complete DASH package, found at video-a_manifest.mpd. These can be prepared, for example, by ffmpeg. (For more information on transcoding to DASH packages with ffmpeg, see the bottom of this README.)

Putting all of that together: if video-a were made from 1080p source material, its directory might look like this:

  • category-1
    • _metadata.json
    • video-a_360p.webm
    • video-a_480p.webm
    • video-a_720p.webm
    • video-a_1080p.webm
    • video-a_audio.webm
    • video-a_manifest.mpd
    • video-a_meta.json
    • video-a_thumb.png
    • ...

In that listing, _metadata.json is the category metadata file described above. The five .webm files are the individual elements of the DASH package, and should have been created by your DASH packager along with the manifest. The last two files, video-a_meta.json and video-a_thumb.png, are the ones you create yourself. All of these files sit directly in the category-1 folder.
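
Not part of the scripts in this repository, but a quick sanity check before generating the site is to confirm that every DASH manifest in the input folder has its companion metadata and thumbnail files. A minimal sketch of such a check:

# Illustrative only; run from the root of your input folder.
for manifest in */*_manifest.mpd; do
  base="${manifest%_manifest.mpd}"
  [ -f "${base}_meta.json" ] || echo "missing ${base}_meta.json"
  [ -f "${base}_thumb.png" ] || echo "missing ${base}_thumb.png"
done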

Lastly, there is some site-wide metadata that you must configure with a _metadata.json file in the root of your input folder. That file should contain:

{
  "title": "site title",
  "lead": "a lead bit of text for the index page",
  "copyright": "a copyright/license claim for the site footer"
}

Usage

Run the script to generate HTML, CSS, and JS with:

grunt --input=path/to/videos/
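
The generated site is static, so deploying it is just a copy. If you host on Amazon S3, for example, something like the following sketch would work; note that the dist/ output directory and bucket name here are assumptions, not part of this project's configuration, so check the Gruntfile for the real output path:

# Sketch only: the output directory and bucket name are placeholders.
aws s3 sync dist/ s3://example-bucket/ --acl public-read

# The video files (manifests, .webm renditions, thumbnails) are served
# alongside the generated pages, so sync those as well.
aws s3 sync path/to/videos/ s3://example-bucket/ --acl public-read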

License

Copyright 2020-2021, Charles H. Pence, and released under the MIT License.

Appendix: Transcoding to DASH Packages

This really lies beyond the scope of this README, but if you're interested in how I create DASH packages from the videos I get from my cell phone or camera: I first transcode the files to WebM using the following two-pass encoding commands:

  ffmpeg -i $input \
    -an \
    -vf "scale=-2:$size" -sws_flags sinc \
    -c:v libvpx-vp9 \
    -keyint_min 72 -g 72 \
    -tile-columns 4 -frame-parallel 1 \
    -auto-alt-ref 1 -lag-in-frames 25 -row-mt 1 \
    -movflags faststart -f webm -dash 1 \
    -b:v $bitrate \
    -max_muxing_queue_size 1024 \
    -y -pass 1 /dev/null

  ffmpeg -i $input \
    -an \
    -vf "scale=-2:$size" -sws_flags sinc \
    -c:v libvpx-vp9 \
    -keyint_min 72 -g 72 \
    -tile-columns 4 -frame-parallel 1 \
    -auto-alt-ref 1 -lag-in-frames 25 -row-mt 1 \
    -movflags faststart -f webm -dash 1 \
    -b:v $bitrate \
    -max_muxing_queue_size 1024 \
    -y -pass 2 "${output}_${size}p.webm"

I use the following sizes and bitrates (a small wrapper script that ties the two passes together for each rendition is sketched after this list):

  • 1080p: 3000k
  • 720p: 1500k
  • 480p: 500k
  • 360p: 300k
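
Here is a minimal sketch of a shell loop that drives both passes for every rendition. The argument handling ($1 as the source file, $2 as the output basename) is illustrative only; the ffmpeg flags are exactly the ones above:

  #!/bin/sh
  # Minimal sketch: run the two-pass encode above for each rendition.
  # $1 is the source file, $2 the output basename (e.g. video-a).
  input="$1"
  output="$2"

  for pair in 1080:3000k 720:1500k 480:500k 360:300k; do
    size="${pair%%:*}"
    bitrate="${pair##*:}"

    for pass in 1 2; do
      # Pass 1 only collects statistics, so its output is discarded.
      if [ "$pass" -eq 1 ]; then
        dest=/dev/null
      else
        dest="${output}_${size}p.webm"
      fi

      ffmpeg -i "$input" \
        -an \
        -vf "scale=-2:$size" -sws_flags sinc \
        -c:v libvpx-vp9 \
        -keyint_min 72 -g 72 \
        -tile-columns 4 -frame-parallel 1 \
        -auto-alt-ref 1 -lag-in-frames 25 -row-mt 1 \
        -movflags faststart -f webm -dash 1 \
        -b:v "$bitrate" \
        -max_muxing_queue_size 1024 \
        -y -pass "$pass" "$dest"
    done
  done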

I extract the audio track using:

ffmpeg -i $input \
  -c:a libvorbis -b:a 192k \
  -vn -f webm -dash 1 \
  -y "${output}_audio.webm"

Lastly, I build the DASH package itself with:

ffmpeg \
  -f webm_dash_manifest -i ${output}_1080p.webm \
  -f webm_dash_manifest -i ${output}_720p.webm \
  -f webm_dash_manifest -i ${output}_480p.webm \
  -f webm_dash_manifest -i ${output}_360p.webm \
  -f webm_dash_manifest -i ${output}_audio.webm \
  -c copy \
  -map 0 -map 1 -map 2 -map 3 -map 4 \
  -f webm_dash_manifest \
  -adaptation_sets "id=0,streams=0,1,2,3 id=1,streams=4" \
  -y "${output}_manifest.mpd"

As that example shows, the inputs, -map options, and the stream indexes in -adaptation_sets simply shrink if you have fewer video tracks. Finally, ffmpeg has a video filter for extracting its best guess at an "interesting" thumbnail from a video, which you can call with:

ffmpeg -i "${output}_360p.webm" -vf  thumbnail -frames:v 1 "${output}_thumb.png"