I am trying to get three things ready to test on other systems -- HDR, libaom 3.8, and x265 -- and when I do, I will check that the multibit compile works there too and make it the default, even if it takes me longer. At one point I thought that rendering 8-bit with the multibit x265 would produce a bigger file than with the non-multibit build, but I did a render test today and the files were exactly the same.
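For anyone who wants to repeat that comparison, a quick way to confirm that two renders are byte-identical is to compare checksums (the file names below are just placeholders for your own test renders):

    md5sum render_std.mp4 render_multibit.mp4
    # identical hashes mean the files are byte-for-byte the same;
    # cmp gives a direct yes/no answer as well
    cmp render_std.mp4 render_multibit.mp4 && echo "files are identical"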
As you said, for me too the size of the files produced by the multibit and std versions is identical, as is the quality of the video (to the eye). The rendering speed is similar (the std version is slightly faster, but only by a little). I would be in favor of dropping the std version and using only the multibit one, but I would like to hear everyone's opinion on this point.
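If it helps to double-check which build is actually in use, the x265 command-line tool reports the bit depths it was compiled with; on a multibit (multilib) build the build info line should list all three depths (the exact wording can vary between versions):

    x265 --version
    # a multilib build prints something similar to:
    #   x265 [info]: build info [Linux][GCC ...][64 bit] 8bit+10bit+12bit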
Compiling with multibit on Ubuntu 16 results in an error in the log file, although it seems to work anyway. The errors are:

./libx265_main12.a(api.cpp.o): In function `x265_12bit::x265_api_get_208(int)':
api.cpp:(.text+0x2004): undefined reference to `dlopen'
api.cpp:(.text+0x201c): undefined reference to `dlsym'
api.cpp:(.text+0x20ad): undefined reference to `dlopen'
./libx265_main12.a(api.cpp.o): In function `x265_12bit::x265_api_query(int, int, int*)':
api.cpp:(.text+0x2167): undefined reference to `dlopen'
api.cpp:(.text+0x217f): undefined reference to `dlsym'
api.cpp:(.text+0x224d): undefined reference to `dlopen'
collect2: error: ld returned 1 exit status
make[5]: *** [x265] Error 1
make[4]: *** [CMakeFiles/cli.dir/all] Error 2
make[3]: *** [all] Error 2

Have not tested Debian 9.1 32-bit yet, as I am still hung up on the cause of this error.
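Those undefined references to dlopen and dlsym usually mean the final link of the x265 cli is missing -ldl (on glibc as old as Ubuntu 16's, libdl is still a separate library). A possible workaround, untested here and depending on how your build script invokes cmake for the 8-bit pass, is to append -ldl to the extra link flags used when linking in the 10/12-bit libraries; the paths below are only a sketch of the usual multilib layout:

    # 8-bit link step of a typical x265 multilib build, with -ldl added
    cmake ../../../source \
        -DEXTRA_LIB="x265_main10.a;x265_main12.a" \
        -DEXTRA_LINK_FLAGS="-L. -ldl" \
        -DLINKED_10BIT=ON -DLINKED_12BIT=ON
    make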