Also, g_settings.core_specific_config needs to be forced back to true
after loading a core-specific config. The per-core config itself could
set the option to false, and the next core load would then silently
fall back to the global config only.
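
A minimal sketch of the idea (the struct layout and helper names below
are stand-ins for illustration, not the actual RetroArch code):

    // Hypothetical stand-ins for the settings blob and the config reader.
    struct settings { bool core_specific_config; };
    static struct settings g_settings;
    static void config_load_file(const char *path) { (void)path; /* parse cfg */ }

    static void config_load_core_specific(const char *path)
    {
       config_load_file(path);

       // The per-core config on disk may itself contain
       // core_specific_config = false; re-assert the flag so the next
       // core switch keeps looking for per-core configs.
       g_settings.core_specific_config = true;
    }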
Merging yesterday was probably a bit premature.
One issue I overlooked was that per-core configs were not flushed to disk
when loading a new core on PC. The per-core flushing only happened in
main_exit(), which is only run on application termination, so in
practice it only worked on consoles with exitspawn.
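
A rough sketch of the fix, assuming core switching has a single entry
point (all names here are hypothetical stubs):

    struct settings { bool core_specific_config; };
    struct global { char core_specific_config_path[256]; };
    static struct settings g_settings;
    static struct global g_extern;
    // Stub: would serialize the current settings out to path.
    static bool config_save_file(const char *path) { (void)path; return true; }

    static void change_core(const char *new_core_path)
    {
       // Flush core A's options *before* switching, instead of relying
       // on main_exit(), which never runs between cores on PC.
       if (g_settings.core_specific_config)
          config_save_file(g_extern.core_specific_config_path);

       (void)new_core_path;
       /* ... unload current core, load new_core_path, reload config ... */
    }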
config_set_defaults() must be called when loading per-core configs as
well, or lots of options silently leak from one core-specific config
into another when cores are changed.
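
For illustration, extending the earlier sketch (config_set_defaults()
is named in this message, but its signature and the other stubs are
assumptions):

    static void config_set_defaults(void) { /* restore built-in defaults */ }
    static void config_load_file(const char *path) { (void)path; }
    struct settings { bool core_specific_config; };
    static struct settings g_settings;

    static void config_load_core_specific(const char *path)
    {
       // Reset *everything* first so options set only by core A cannot
       // leak into core B's config when it is later flushed to disk.
       config_set_defaults();
       config_load_file(path); // then apply core B's overrides on top
       g_settings.core_specific_config = true;
    }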
The handling of g_extern.config_path and original_config_path was
convoluted and very error prone, considering the paths were mutated
arbitrarily by RGUI.
I've removed the original-config-path concept and settled on
config_path being *only* for the global config, with
core_specific_config_path for core-specific configs (resolved during
config load). Saves some memory too,
which is always nice.
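
A sketch of how the per-core path could be resolved once at config-load
time; the "<config dir>/<core name>.cfg" naming scheme is an assumption
for illustration:

    #include <stdio.h>
    #include <string.h>

    struct global
    {
       char config_path[256];               // global config only
       char core_specific_config_path[256]; // resolved here, never mutated by RGUI
    };
    static struct global g_extern;

    static void resolve_core_specific_config_path(const char *core_name)
    {
       // e.g. "/etc/retroarch.cfg" + "snes9x" -> "/etc/snes9x.cfg"
       char dir[256];
       strncpy(dir, g_extern.config_path, sizeof(dir) - 1);
       dir[sizeof(dir) - 1] = '\0';

       char *slash = strrchr(dir, '/');
       if (slash)
          *slash = '\0';

       snprintf(g_extern.core_specific_config_path,
             sizeof(g_extern.core_specific_config_path),
             "%s/%s.cfg", dir, core_name);
    }

If config_path had no directory component, dir would be left holding
the bare filename; a real implementation would handle that case and
platform-specific path separators.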
The block_config_read solution I proposed yesterday turned out to be no
good after all (in fact, it was broken on PC); the current solution
should work better.
"RetroArch Config" option in RGUI now only shows global config.
This is a feature from the ES2_compat extension.
It fixes the speed issue associated with using 16-bit textures on
desktop GL, and improves performance a bit as well since there's less
bandwidth usage during shading.
On my HD3000 laptop, performance improved by ~10%.
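
A hedged sketch of what using it could look like, assuming "ES2_compat"
refers to GL_ARB_ES2_compatibility and its GL_RGB565 sized internal
format (the upload helper is illustrative, not the actual driver code):

    #include <GL/gl.h>
    #include <string.h>

    #ifndef GL_RGB565
    #define GL_RGB565 0x8D62 // from GL_ARB_ES2_compatibility
    #endif
    #ifndef GL_UNSIGNED_SHORT_5_6_5
    #define GL_UNSIGNED_SHORT_5_6_5 0x8363
    #endif

    static bool has_es2_compat(void)
    {
       const char *ext = (const char*)glGetString(GL_EXTENSIONS);
       return ext && strstr(ext, "GL_ARB_ES2_compatibility");
    }

    static void upload_rgb565_frame(const void *frame,
          unsigned width, unsigned height)
    {
       // With ES2_compat the driver can keep the texture as real 16-bit
       // RGB565 instead of expanding it, roughly halving bandwidth
       // versus an 8888 internal format during shading.
       GLenum internal = has_es2_compat() ? GL_RGB565 : GL_RGB8;
       glTexImage2D(GL_TEXTURE_2D, 0, internal, width, height, 0,
             GL_RGB, GL_UNSIGNED_SHORT_5_6_5, frame);
    }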
Use bool instead of int as a return type (many bugs were caused by
that ...). Remove all use of exceptions and use delayed constructors
instead (since there are no exceptions ...). Drop use of unique_ptr in
D3D9 (not really needed).
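
A minimal sketch of the delayed-constructor pattern in question (class
and member names are placeholders):

    // With exceptions removed, the constructor does only trivial setup;
    // anything that can fail moves into init(), which reports failure
    // through bool instead of throwing.
    class d3d_video
    {
    public:
       d3d_video() : dev(0) {} // cannot fail

       bool init(unsigned width, unsigned height)
       {
          dev = create_device(width, height); // stub below
          return dev != 0;
       }

    private:
       void *dev;
       // Stub standing in for Direct3D device creation.
       static void *create_device(unsigned, unsigned) { return 0; }
    };

The caller checks init() and tears down on false, which is also why
consistent bool (not int) return types matter throughout.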
int is not acceptable as a return type for anything regarding sizes.
long is dubious as well, but better (it is 64-bit on sane ABIs, and it
is the return type of ftell()).
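
A sketch of the safer signature, using a fixed 64-bit signed type (the
helper itself is illustrative):

    #include <stdio.h>
    #include <stdint.h>

    // int64_t holds any realistic file size and still has room for -1
    // on error; plain int overflows past 2 GiB, and long is only
    // 32-bit on 64-bit Windows.
    static int64_t file_size(const char *path)
    {
       FILE *file = fopen(path, "rb");
       if (!file)
          return -1;

       fseek(file, 0, SEEK_END);
       int64_t size = ftell(file); // long, widened to 64-bit here
       fclose(file);
       return size;
    }

Note ftell() itself still returns long, so on platforms with 32-bit
long a real implementation would want ftello()/_ftelli64() or similar.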