this post was submitted on 24 Sep 2023
958 points (97.7% liked)
Dank Memes
you are viewing a single comment's thread
view the rest of the comments
3D models consist of images, right? The coordinates for the image don't take up much space, do they?
Yes, but more detailed models also mean more polygons. Which... is more space, but it can't be that much lol, idk tho
It depends on how they store the models. If they're using normal mapping (which they probably are), they need to store the following for each vertex: position (x, y, z), normal (x, y, z), texture coordinates (u, v), tangent (x, y, z), and bitangent (x, y, z). Assuming a custom binary format with 32-bit (4-byte) floats, that's 56 bytes per vertex.

The Sponza model, which is commonly used for testing, has around 1.9 million vertices: in our hypothetical format that's at least 106.4 MB for the vertices alone. But we also have to store the indices, which are an optimisation to avoid repeating shared vertices. Sponza has 3.9 million triangles, and at three 32-bit integers per triangle that adds another 46.8 MB.

So with that naive format, which should be extremely fast to load, and a lot of models, 3D model data is no insignificant contributor to file size.
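If you want to play with the numbers yourself, here's a rough sketch in C of that hypothetical binary vertex layout and the back-of-the-envelope maths. The struct name and the Sponza counts are just the approximations from above, not any real engine's format:

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical per-vertex layout for a normal-mapped mesh,
 * matching the fields listed above (all 32-bit floats). */
typedef struct {
    float position[3];   /* x, y, z -> 12 bytes */
    float normal[3];     /* x, y, z -> 12 bytes */
    float texcoord[2];   /* u, v    ->  8 bytes */
    float tangent[3];    /* x, y, z -> 12 bytes */
    float bitangent[3];  /* x, y, z -> 12 bytes */
} Vertex;                /* total:     56 bytes */

int main(void) {
    /* Approximate Sponza figures used above. */
    uint64_t vertex_count   = 1900000;   /* ~1.9 million vertices  */
    uint64_t triangle_count = 3900000;   /* ~3.9 million triangles */

    uint64_t vertex_bytes = vertex_count * sizeof(Vertex);
    uint64_t index_bytes  = triangle_count * 3 * sizeof(uint32_t);

    printf("vertex data: %.1f MB\n", vertex_bytes / 1e6);  /* ~106.4 MB */
    printf("index data:  %.1f MB\n", index_bytes / 1e6);   /* ~46.8 MB  */
    printf("total:       %.1f MB\n", (vertex_bytes + index_bytes) / 1e6);
    return 0;
}
```

That's roughly 153 MB for a single test scene before any textures, and real games ship many such models (usually with some compression or quantisation on top of a layout like this).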