Mirror of https://github.com/hrydgard/ppsspp.git (synced 2024-11-27 15:30:35 +00:00)

Commit 72029b678a
Basically, software culling fails in some configurations, like the one we end up with on Mali. As noted by unknownbrackets in #15661, the viewport Z scale/offset is -0.0, 0.0. We end up with CalcCullParams computing minZValue == maxZValue == 1.0f, and with the vertices ending up with z,w == 1.0, 1.0. As a result, the inside/outside calculations always decide that the vertices are outside.

Changing the comparisons from >= / <= to > / < fixes the problem, but I don't know if this might break something else.

Anyhow, here's a simple way to repro on PC: change the ending of GPU_Vulkan::CheckFeatures to:

```c
return GPU_USE_LIGHT_UBERSHADER |
       GPU_USE_BLEND_MINMAX |
       GPU_USE_TEXTURE_NPOT |
       GPU_USE_INSTANCE_RENDERING |
       GPU_USE_VERTEX_TEXTURE_FETCH |
       GPU_USE_TEXTURE_FLOAT |
       GPU_USE_16BIT_FORMATS |
       GPU_USE_TEXTURE_LOD_CONTROL |
       GPU_USE_DEPTH_TEXTURE |
       GPU_USE_ACCURATE_DEPTH;
```
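To make the degenerate case concrete, here is a minimal, self-contained sketch. It is not PPSSPP's actual CalcCullParams or cull-test code; the CullRange struct and the Outside* helpers are hypothetical names used only for illustration. It shows how inclusive comparisons classify a vertex as outside once the cull range collapses to minZValue == maxZValue == 1.0f and z lands exactly on that value, while the strict comparisons described above keep it.

```cpp
#include <cstdio>

// Hypothetical cull range, standing in for the minZValue/maxZValue pair
// that CalcCullParams produces in the Mali case described above.
struct CullRange {
    float minZValue;
    float maxZValue;
};

// "Outside" test with inclusive comparisons: when the range degenerates to a
// single value and z is exactly that value, both halves are true, so the
// vertex is always classified as outside.
static bool OutsideInclusive(float z, const CullRange &r) {
    return z <= r.minZValue || z >= r.maxZValue;
}

// The same test with strict comparisons, matching the fix described in the
// commit message: a z exactly on the degenerate bound is no longer outside.
static bool OutsideStrict(float z, const CullRange &r) {
    return z < r.minZValue || z > r.maxZValue;
}

int main() {
    CullRange degenerate{1.0f, 1.0f};  // minZValue == maxZValue == 1.0f
    float z = 1.0f;                    // vertex z also ends up at 1.0

    printf("inclusive comparisons: %s\n",
           OutsideInclusive(z, degenerate) ? "culled" : "kept");
    printf("strict comparisons:    %s\n",
           OutsideStrict(z, degenerate) ? "culled" : "kept");
    return 0;
}
```

Run as written, the inclusive version reports "culled" and the strict version reports "kept", which is the behavioral difference the comparison change is meant to produce in this degenerate configuration.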
Directory listing at this commit:

- Common
- D3D11
- Debugger
- Directx9
- GLES
- Software
- Vulkan
- ge_constants.h
- GeConstants.cpp
- GeDisasm.cpp
- GeDisasm.h
- GPU.cpp
- GPU.h
- GPU.vcxproj
- GPU.vcxproj.filters
- GPUCommon.cpp
- GPUCommon.h
- GPUInterface.h
- GPUState.cpp
- GPUState.h
- Math3D.cpp
- Math3D.h