Tips for Nadeshiko
deterenkelt edited this page 3 months ago

Nadeshiko is a command-line tool that analyses the source file
and calculates optimal options for ffmpeg.


How to encode faster

Working around it

  • If the problem is that the encoding hogs the processor right now, why not postpone it?
  • Remember that longer videos take longer to encode, and the longer the video, the more it will be compressed. It may be tempting to encode a whole episode (or half of it), but think about it: is it worth showing this to people in 360p? Most often the answer is “no”. Also, the bigger the file, the laggier its playback over the internet will be. Though Nadeshiko allows an “unlimited” size, this doesn’t mean it’s worth uploading the entire Matrix in 480p as an .mp4 or a .webm file. For your information: when a video is streamed to you, it is essentially pre-sliced into little chunks, which are requested continuously, as needed, and sent to you for playback. When the video is a file, the situation is different: it must be downloaded first. Sometimes a small portion must be downloaded first, sometimes larger segments, sometimes the whole file. And for files over 70…100 MiB this is a problem.
Technically… for MP4, only the moov atom has to be downloaded before playback can start. Nadeshiko makes sure that the atom is placed at the beginning of the file, so it is downloaded first. But this moov atom, which holds the “summary” of all frames for the playback device, still has to be downloaded – and downloaded entirely. Its size depends on the size of the video, so large videos have a large moov atom.

For VP9 the situation is slightly better, as this format was created with the intention of making it easier than with MP4 to stream the file “as is” (without “chunking” it first). But VP9 pays for that with larger segments, which have to be downloaded before they can be played. This creates lengthy pauses on oversized files.

Please remember that Nadeshiko is a tool for cutting short videos, that browsers aren’t actually video players, and that playing back a gigabyte over the internet will not be as fast as from the hard disk.


  • Disable Hyper-threading. As ffmpeg is able to fully utilise the cores it takes, there are basically no “free” or “unused” CPU cycles for Hyper-threading to fill. Thus the additional layer that presents each physical core as two virtual ones becomes superfluous overhead. Read also Wikipedia: Drawbacks of Hyper-threading;
  • (for VP9) Try libvpx_pass2_cpu_used=1 – the bigger the number, the less CPU work is allowed. Be aware that reducing the time the codec is allowed to spend on the CPU worsens the quality of the encoded file and increases the probability of getting on a roller coaster.
  • Try hardware encoding on a GPU that supports it:
    1. Buy a GeForce 10 video card.
    2. Compile ffmpeg with nvenc.
    3. Copy modules/nadeshiko-encode-libvpx-vp9.sh encoding module to a separate file and edit it to cover up for all the quirks of the nvenc version of the VP9 codec.
    4. Brace for lost quality, as hardware encoders are said to be less efficient.
      See also: FAQ № 3 and FAQ № 4.
  • Avoid extra work.
    • Don’t forget to turn off hardsubbing when it’s not necessary. Even if the video happens to have no subtitles between Time1 and Time2, if Nadeshiko-mpv detects that subtitles are enabled in the player, it will force Nadeshiko to run the filter that renders them – which is extra work.
    • Use a source of higher quality. (Generally, this concerns animation.) The less noise and the fewer artefacts there are, the faster the complexity checks go (and the better the quality is).

      From most preferable to least…
      • Bluray source (i.e. the disk itself or a 1:1 copy of it);
      • HEVC/AV1 encodes > H.264 encodes;
      • 10 bit > 8 bit;
      • 4:4:4 chroma > 4:2:0 chroma;
      That is, HEVC will always be preferable to H.264; an 8-bit HEVC encode would be better than a 10-bit H.264 one.
      However… that holds only at roughly the same settings, which in the wild you may find only in benchmarks – and only when the people running them know what they are doing and aren’t trying to cheat. In the wild you may see a crappy HEVC encode as well as a decent H.264 one that pays for its quality with gigabytes of size. So when you choose the source, it’s better to look at samples or screenshots.
      This advice doesn’t apply to cinema on Blu-rays, as the film grain in the source is what makes it more complex.

  • Avoid streamed content. Dumped streams are still streams, which are at best averagely transcoded from the source (and may have been transcoded several times already, as the licences passed from one holder to another). No one will ever guarantee even that the average quality is truly average: there will always be broken frames with mosaic pixels, because some chunks were malformed or lost on their way.
  • Avoid rips of other rips. The source of a rip must be the real, original source as available (or as it would be available) in retail.
  • A faster encode may be achieved with a deliberate use of libx264 instead of libvpx-vp9, at the cost of
    • getting a significantly poorer quality at size limits;
    • losing the ability to fit as much time as with VP9 in the same space.
  • (for VP9) Encode on a $2000 Xeon, as it’s intended to be (笑)
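As a sketch of the cpu_used tip above: the value goes into a configuration file. The option name libvpx_pass2_cpu_used is the one this page mentions; which file it lives in depends on your setup, so verify against your per-codec config files.

```shell
#  Speed/quality trade-off for the second pass of libvpx-vp9.
#  Bigger numbers encode faster but degrade quality
#  (see the roller-coaster warning above).
libvpx_pass2_cpu_used=1
```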


How to remove sound without re-encoding

Suppose you have encoded a video clip, but after playing it back you notice that the speech lines are cut abruptly, or that it plays awkwardly when looped, and you decide the video would be better without an audio track. With ffmpeg it’s possible to cut it out without re-encoding:

For .webm files

$ ffmpeg -i video.webm -c copy -an out.webm

For .mp4 files

$ ffmpeg -i video.mp4 -c copy -an -movflags +faststart out.mp4


Encoding options for animu

  • VP9:  use ffmpeg_pix_fmt=yuv444p in the config file – this shouldn’t hurt compatibility for VP9-capable browsers.
  • H.264:  -tune animation adds options that preserve quality on animation better than the film-oriented defaults.
  • H.264:  libx264_keyint=25 will provide 2× more key frames, which results in better quality (important for mostly dynamic videos, or videos with quick scenes that happen within one second). Though 50, the default, should be enough.
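The two config-side tips above can be sketched as a small preset. The variable names are the ones this page mentions; check your config files before relying on them:

```shell
#  Anime-friendly tweaks in the configuration file
ffmpeg_pix_fmt=yuv444p    # VP9: keep full chroma resolution
libx264_keyint=25         # H.264: a key frame every 25 frames instead of 50
```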


Encoding long videos

  • H.264, VP9: The option unlimited lifts the restriction on maximum file size.
  • H.264, VP9: With the option vb= you can raise the bitrate to improve quality.
  • H.264, VP9: Read descriptions in the per-codec configuration files – some options may be worth putting in a separate preset for encoding long files.
  • H.264: This codec needs an increased number of key frames to maintain quality under strict size requirements, so when you lift those restrictions with unlimited, you may as well set libx264_keyint to 450–500 to increase the distance between one key frame and the next.
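For the libx264_keyint tip above, a sketch of what a separate preset file for long clips could contain (assuming presets are plain config files like the ones described below; the value 480 is picked from the 450–500 range suggested above):

```shell
#  Preset for long clips: with size restrictions lifted (unlimited),
#  key frames may be placed farther apart
libx264_keyint=480
```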


Adding custom options to the ffmpeg command

First, you must know whether it is an input option or an output option.

If this question confuses you, read the docs on encoding. When you know what you want to place, where, and for what purpose, continue here.

An input option may be added to the ffmpeg_input_options array in the config file:

#  Input file options for encoding.
#  Extend this array with strings, one option or key per string.
#  Default value: an empty array
ffmpeg_input_options=( '-some' 'input_parameter' )

Where the output options go depends on the codec set in question. For the set based on libx264, the appropriate variables would be

#  Place for user-specified ffmpeg options
#  These will be applied ONLY when used with libx264 as an encoder.
#  Array of strings!  I.e. =(-key value  -other-key "value with spaces")

And for libvpx you should edit the following ones

#  Place for user-specified ffmpeg options
#  These will be applied ONLY when libvpx is used as an encoder.
#  Array of strings!  I.e. =(-key value  -other-key "value with spaces")
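As a sketch, assuming the actual variable names follow the <codec>_pass<number>_extra_options pattern referred to elsewhere on this page (check the comments in your per-codec config files for the real names):

```shell
#  Output options for the libx264-based set
libx264_pass1_extra_options=( -tune animation )
libx264_pass2_extra_options=( -tune animation )

#  Output options for the libvpx-based set
libvpx_pass2_extra_options=( -af 'aresample=async=1' )
```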


Calling Nadeshiko with a shorter name

For those who use Nadeshiko in a terminal, typing the whole path to nadeshiko.sh takes a good chunk of time. Making a short command that runs Nadeshiko is more convenient.

There are two methods to create a short name:

  • a shell alias would be quick to set up, but it won’t work with launchers like dmenu;
  • creating a small wrapper and local directory would take more time, but this would work everywhere.

The next section describes setting up a shell alias. If you’d prefer doing it with a wrapper, jump one section ahead.

Setting up a shell alias

Let’s say, that we want to shorten the command

$ ~/path/to/nadeshiko.sh

to one word

$ nadeshiko

Put the following line in your ~/.bashrc; it will create a shell alias for the long command:

alias nadeshiko="$HOME/path/to/nadeshiko.sh"

Tilde expansion won’t work inside quotes, so $HOME must be used to specify the home directory. The next time you launch a terminal, the new alias nadeshiko will work. To make the new command accessible in already started terminals, re-source ~/.bashrc in them:

$ . ~/.bashrc

If the alias doesn’t work, check that your ~/.bash_profile sources ~/.bashrc. You can also place an alias directly in ~/.bash_profile.

Setting up a wrapper

Some launchers, dmenu for example, do not recognise aliases, for a launcher is not a shell. To make a command that a launcher can find, a separate file must be created. The file name will serve as the new command, so, if we name it “nadeshiko” and put

#! /usr/bin/env bash

#  nadeshiko
#  A wrapper script for nadeshiko.sh

#  Change the path to the real one!
$HOME/path/to/nadeshiko.sh "$@"

in this file, it will contain the command that we need to launch. However, our new script nadeshiko must be an executable file in order to be a command, so as the next step, run

$ chmod +x nadeshiko

What’s left is to put the new script somewhere where shells and launchers look for executable files.

Creating a local /bin

In the operating system, executable files usually reside in directories like

  • /bin;
  • /usr/bin;
  • /usr/local/bin;

and the list of these directories is always present in the environment variable PATH

$ echo $PATH

He-ere they are! A new script could be placed directly in any of them, but that would be bad, because system directories should be used only by the operating system. So let’s create a local /bin for ourselves. In this example it will be ~/bin.

$ mkdir ~/bin

To add ~/bin to the list of directories where shells and launchers look for commands, add these two lines to ~/.bashrc

#  Add my $HOME/bin to PATH, if it’s not there yet
[ "${PATH//*$HOME\/bin*/}" ] && export PATH="$HOME/bin:$PATH"

The last command looks complex because it needs to verify that the new directory is added only once. Shells often run one inside another, and ~/.bashrc is re-read each time. Without this check, ~/bin could be added to PATH multiple times – which would only slow down shells and launchers, as they have to search every directory in the PATH list.
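The idempotence of that check can be rehearsed with a throwaway variable instead of the real PATH. PATH_DEMO and add_dir are illustrative names; the test condition is the same as in the snippet above:

```shell
#!/usr/bin/env bash
#  Simulate re-reading ~/.bashrc several times: the directory
#  must end up in the list exactly once.
PATH_DEMO='/usr/local/bin:/usr/bin:/bin'
add_dir='/home/user/bin'    # in the real check this is "$HOME/bin"
for i in 1 2 3; do
    #  A non-empty result means $add_dir is not in the list yet
    [ "${PATH_DEMO//*$add_dir*/}" ] && PATH_DEMO="$add_dir:$PATH_DEMO"
done
echo "$PATH_DEMO"    # /home/user/bin:/usr/local/bin:/usr/bin:/bin
```

Despite three iterations, the directory is prepended only on the first one; afterwards the substitution empties the string and the test fails, skipping the assignment.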

Now move the new script to ~/bin/

$ mv nadeshiko ~/bin/

Almost done. As with shell aliases, programs in the running session do not yet know that PATH has changed. You need to log out and log in again. Then try running Nadeshiko with

$ nadeshiko

in your terminal or your launcher. It should work.
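The whole procedure can be rehearsed end to end in a throwaway directory, with a dummy script standing in for nadeshiko.sh (a sketch; all paths here are temporary and illustrative):

```shell
#!/usr/bin/env bash
set -e
tmp=$(mktemp -d)

#  A dummy "nadeshiko.sh" that just prints its arguments
printf '#! /usr/bin/env bash\necho "args: $*"\n' > "$tmp/nadeshiko.sh"
chmod +x "$tmp/nadeshiko.sh"

#  The wrapper: its file name becomes the command name
printf '#! /usr/bin/env bash\n%s "$@"\n' "$tmp/nadeshiko.sh" > "$tmp/nadeshiko"
chmod +x "$tmp/nadeshiko"

#  Pretend $tmp is our local ~/bin
PATH="$tmp:$PATH"
out=$(nadeshiko clip.mkv 00:00 00:05)
echo "$out"    # args: clip.mkv 00:00 00:05
rm -rf "$tmp"
```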


DVD subtitles: how to change colours

This section refers to the pixel-based VobSub subtitles, which are themselves a video stream. It doesn’t relate to ASS/SSA, SubRip or any other text subtitles.

The 16 colours must be specified with a -palette input parameter to ffmpeg, so you need to alter ffmpeg_input_options in nadeshiko.rc.sh. An example is shown below; the colours must be given as RGB hex codes.

ffmpeg_input_options=(-palette '00ff00,11aacb,…,000000,000000')

There’s no agreement on which of the 16 colours should hold the text and which the outline. A video with a VobSub stream may use the palette freely: it may have several text colours, or different outlines with the same text colour. The right set has to be determined experimentally.


Video, audio or subtitles are out of sync

Sometimes the source video itself isn’t normal, and the resulting clip comes out with issues. There are still some helpful options that you may try to add to ffmpeg_input_options or <codec>_pass<number>_extra_options. See the main config file and the codec-specific files for libvpx-vp9 and libx264.

It is presumed that you know at least the very basics of FFmpeg: where to place input and output options, and what streams, I/P/B frames, PTS, TB, sampling rate etc. are. If not, proceed to this page.

  • -vsync. Input option.
  • -af aresample=async=1. Output option.
  • -af aresample=async=1:first_pts=0. Output option.
  • -fflags +genpts. Input option. Sometimes -fflags +genpts+igndts may help.
  • -copyts. Output option. Makes sense only when you transcode an entire video or from 00:00:00.000 to some position.
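As a sketch of where such options would go (the array name ffmpeg_input_options and the <codec>_pass<number>_extra_options pattern are taken from this page; the option values are picked from the list above purely as examples):

```shell
#  nadeshiko.rc.sh: an input option
ffmpeg_input_options=( -fflags +genpts )

#  Per-codec config: an output option, here for the libvpx set
#  (verify the exact variable name against your config file)
libvpx_pass2_extra_options=( -af 'aresample=async=1' )
```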

Chapters are cursed. It’s easier to download another version of whatever you’re watching than to cut a clip from a video with chapters. PTS can sometimes – but only sometimes! – be deduced from the duration of the preceding chapters. Bugged. Cursed. Forget and never download again.

The options recommended here aren’t used by default, for in most cases they are not needed, or may do harm instead of helping (e.g. add bloat frames). Thus the addition of these options is left to the consideration of the user.


I think that I have a colour space problem…

It may happen that the colours look different in the encoded file, even though the conversion was done with all the required properties present. The reason probably lies in a mistagged source file. That is, it has either the matrix coefficients (sometimes referred to as just “colorspace”), the primaries, the transfer characteristics or the colour range set to a different value than what the video was actually encoded with.

One common case is tied to misuse of the colour range attribute: either the software didn’t place the tag, or placed the wrong tag, or a human ticked the checkbox without thinking, assuming that if they do an encode on a PC, then the range must also be “pc” (it is not necessarily so). If you see that the colours are off, and are sure that the colour space characteristics are alright, try passing -color_range pc or -color_range tv via the ffmpeg_input_options array in the configuration file.
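For example, forcing ffmpeg to interpret the input as limited (“tv”) range would look like this in the configuration file (a sketch based on the array described earlier on this page):

```shell
#  Override the mistagged colour range of the source
ffmpeg_input_options=( -color_range tv )
```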

To check the original value, you may read the Nadeshiko log file, or check it with ffprobe:

$ ffprobe -hide_banner -v error  \
          -select_streams V  \
          -show_entries stream=color_range  \
          -of default=noprint_wrappers=1  \
          video.mp4


All the materials in this wiki are permitted for use under the terms of the Creative Commons BY NC SA 4.0 licence, with the exception of

  • material taken from the sources outside of this wiki;
  • screenshots from films and television series.

The aforementioned exceptions are subject to the licences specified by their respective rights holders.