Fix problem switching from Uint16 to Uint32 indices for outlining. #8820
Adding edge outlines usually requires adding new vertices, and when we're using an `UNSIGNED_SHORT` index buffer and the new vertices push us over 65536 vertices, we need to upgrade to an `UNSIGNED_INT` index buffer. I had previously added this capability and written tests for it, but didn't have any real-world data to test it with at the time. Unfortunately, that code doesn't actually work, and will corrupt rendering of the model when it's triggered. It was all part of my plan to serve as a good illustration for everyone of the dangers of only writing tests for some code and not actually trying it with real data. 😆
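
For context, here's a minimal sketch of what the upgrade amounts to; this is not the code from this PR, and `upgradeIndicesIfNeeded` is a hypothetical helper:

```js
// Hypothetical sketch: promote a 16-bit index buffer to 32-bit once the
// vertex count exceeds what UNSIGNED_SHORT can address (indices 0..65535).
function upgradeIndicesIfNeeded(indices, vertexCount) {
  if (vertexCount <= 65536 || !(indices instanceof Uint16Array)) {
    // 16-bit indices can still address every vertex; keep them.
    return indices;
  }
  // The Uint32Array constructor copies the source array, widening each
  // element, so existing indices stay valid and new ones can go past 65535.
  return new Uint32Array(indices);
}
```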
The problem is that `Model` also squirrels away the bufferView's `componentType` in `ModelLoadResources.indexBuffersToCreate` and then uses that to create the buffer instead of the accessor's `componentType` in the glTF. So I'm also helping to illustrate the dangers of denormalization. 👍
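
To illustrate the idea behind the fix (not Cesium's actual internals): resolve the datatype from the accessor in the glTF right before creating the GPU buffer, rather than from a value cached at parse time, so an upgrade performed by the outlining pass is honored. `resolveIndexDatatype` is a hypothetical helper; the constants are the glTF `componentType` enums.

```js
// glTF componentType enum values (shared with WebGL):
// 5123 = UNSIGNED_SHORT, 5125 = UNSIGNED_INT.
function resolveIndexDatatype(gltf, accessorId) {
  // Read from the accessor itself, not from a copy squirrelled away when
  // the model was first parsed; the cached copy goes stale if outlining
  // later widens the indices.
  return gltf.accessors[accessorId].componentType;
}

// Example: after outlining widens the indices, the accessor reflects it.
const gltf = {
  accessors: [{ componentType: 5125 /* UNSIGNED_INT after the upgrade */ }],
};
console.log(resolveIndexDatatype(gltf, 0)); // 5125, not a stale 5123
```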