NVil Forum
General Category => Community Help => Topic started by: midnitecarnival on October 30, 2020, 07:04:38 pm
-
Sorry, I'm starting to discover questions I probably should've been asking twenty years ago.
Let's say I have a 12 x 12 sphere that's the same size as the default plane, so the plane looks like a plate cutting through the center of the sphere...
Is that going to require more system resources to work with than if I had scaled the 12 x 12 sphere down so that it appeared as a ball resting in the center of a large field?
I had always assumed it was just poly count affecting system resources.
-
My guess is that scale is just an array of 3 floats (4 bytes per float) in memory, so it shouldn't allocate any additional resources whether the number is 1000 or 1; it's still a float array. But IStonia can clarify :D
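To illustrate the point (a quick Python sketch, not anything from NVil itself): packing a scale of (1, 1, 1) and a scale of (1000, 1000, 1000) into raw 32-bit floats produces exactly the same number of bytes, because the storage size of a float is fixed regardless of its value.

```python
import struct

# A scale is just three 32-bit floats; the values stored make no
# difference to the memory footprint.
for scale in [(1.0, 1.0, 1.0), (1000.0, 1000.0, 1000.0)]:
    packed = struct.pack("3f", *scale)
    print(scale, "->", len(packed), "bytes")
# Both tuples pack to 12 bytes (3 floats x 4 bytes each).
```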
-
Correct.
-
Cool. Thanks, Peeps
-
Yes, that most definitely shouldn't have any impact on memory.
Each vertex has a position defined by a 3D vector, which consists of three floating-point numbers defining that vertex's position in space. Each of those values has a constant number of bits allocated in your computer's memory. It's predefined, so it doesn't matter if you scale your object by 0.0001 or 10,000 - the memory allocated for that variable (say, a vertex position on the X axis) will always remain the same.
The only problem you may encounter with scaling is loss of precision when you push past the limits of 32-bit or 64-bit floating-point numbers - you will start getting strange "artifacts" all over your mesh (it will deform in unexpected ways).
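That precision loss is easy to demonstrate in a few lines of Python (using NumPy; this is a general float32 illustration, not specific to any 3D package). The gap between adjacent representable float32 values grows with magnitude, so small vertex offsets simply vanish on a hugely scaled mesh:

```python
import numpy as np

# The spacing between adjacent representable float32 values grows
# with magnitude: tiny near 1.0, much coarser near 1,000,000.
print(np.spacing(np.float32(1.0)))          # ~1.19e-07
print(np.spacing(np.float32(1_000_000.0)))  # 0.0625

# Consequence: a 0.01-unit vertex offset is smaller than the float32
# grid at this magnitude, so it rounds away entirely.
v = np.float32(1_000_000.0)
print(v + np.float32(0.01) == v)  # True - the offset is lost
```

This is why meshes modeled at extreme scales (or far from the origin) start snapping and deforming: vertex coordinates quantize onto an increasingly coarse grid.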