perf(ini): Improve INI parsing performance by using std::from_chars #2532
xezon merged 4 commits into TheSuperHackers:main
Conversation
| Filename | Overview |
|---|---|
| Core/GameEngine/Source/Common/INI/INI.cpp | Replaces sscanf with std::from_chars for int, unsigned int, and float parsing; uses a non-standard __cplusplus constant (201611L instead of 201703L) and exposes a latent portability risk for float from_chars on older C++17 toolchains. |
Flowchart
```mermaid
%%{init: {'theme': 'neutral'}}%%
flowchart TD
A["INI::scanInt / scanUnsignedInt / scanReal(token)"] --> B{"USE_STD_FROM_CHARS_PARSING == 1?"}
B -- Yes --> C["scanType<T>(token)\nstd::from_chars → Int64 or Real"]
C --> D{"ec != std::errc{}?"}
D -- Error --> E["throw INI_INVALID_DATA"]
D -- OK --> F["static_cast<T>(result)"]
B -- No --> G["sscanf(%d/%u/%f)"]
G --> H{"sscanf result != 1?"}
H -- Error --> E
H -- OK --> I["value"]
I --> J{"USE_STD_FROM_CHARS_PARSING == -1?"}
J -- Yes --> K["Compare sscanf vs from_chars result"]
K --> L{"mismatch?"}
L -- Yes --> E
L -- No --> M["return value"]
J -- No --> M
F --> M
```
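The `scanType<T>` helper referenced in the flowchart could look roughly like this (a sketch reconstructed only from the flowchart's labels; the function name, the intermediate types, and the thrown error are assumptions, not the PR's actual code). Note that the `float` overload of `std::from_chars` needs a recent standard library (GCC 11+, Clang 12+, MSVC VS 2019 16.4+):

```cpp
#include <charconv>
#include <cstdint>
#include <cstring>
#include <stdexcept>
#include <type_traits>

// Hypothetical reconstruction of the scanType<T> helper from the flowchart:
// parse into a wide intermediate (Int64 for integer targets, a float "Real"
// otherwise), throw on any parse error, then narrow to the requested type.
template <typename T>
T scanType(const char* token)
{
    using Intermediate =
        std::conditional_t<std::is_integral_v<T>, std::int64_t, float>;

    Intermediate result{};
    const auto [ptr, ec] =
        std::from_chars(token, token + std::strlen(token), result);
    (void)ptr;

    if (ec != std::errc{}) // invalid or out-of-range input
        throw std::runtime_error("INI_INVALID_DATA"); // stand-in for the engine's error value

    return static_cast<T>(result);
}
```

Unlike `sscanf`, `std::from_chars` is locale-independent and never allocates, which is where the speedup comes from.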
This is a comment left during a code review.
Path: Core/GameEngine/Source/Common/INI/INI.cpp
Line: 62
Comment:
**Non-standard `__cplusplus` version constant**
`201611L` is not a defined value of `__cplusplus` for any C++ standard; the value for C++17 is `201703L`. In practice the check still works (no compiler emits `201611L`, so `201703L >= 201611L` is true for C++17 builds), but the magic number is misleading and should use the canonical constant.
```suggestion
#if __cplusplus >= 201703L
```
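Incidentally, `201611L` happens to be the value of the library feature-test macro `__cpp_lib_to_chars`, which may be what the author had in mind. A guard on that macro (a suggestion, not the PR's code) is more robust than either `__cplusplus` check, because the macro is only defined by standard libraries that actually ship `std::from_chars`:

```cpp
// Sketch: gate the new parser on the feature-test macro rather than on
// __cplusplus. __cpp_lib_to_chars (value 201611L) is defined by <charconv>
// only when the implementation provides std::from_chars.
#if defined(__has_include) && __has_include(<charconv>)
#  include <charconv>
#endif

#if defined(__cpp_lib_to_chars) && __cpp_lib_to_chars >= 201611L
#  define USE_STD_FROM_CHARS_PARSING 1
#else
#  define USE_STD_FROM_CHARS_PARSING 0
#endif
```

If the project must support toolchains whose macro reporting is unreliable, a configure-time compile probe is the belt-and-braces alternative.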
---
Path: Core/GameEngine/Source/Common/INI/INI.cpp
Line: 1646
Comment:
**`std::from_chars` for floats not universally available in C++17 implementations**
The floating-point overloads of `std::from_chars` were added to the C++17 standard but compiler support lagged significantly: GCC added it in 11.1 (2021), Clang in 12.0 (2021), and MSVC in VS 2019 16.4. Any C++17 toolchain older than those thresholds will fail to compile this line because `<charconv>` won't provide a `std::from_chars` overload for `float`. If the project already requires a minimum compiler version that covers these, a comment documenting that requirement here would prevent future confusion.
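One way to keep such older toolchains building (a sketch under the assumption that a fallback is acceptable; `HAVE_FLOAT_FROM_CHARS` and `parseFloat` are illustrative names, not PR code) is to fall back to `strtof` when the float overload is unavailable:

```cpp
#include <cstdlib>
#include <cstring>
#include <stdexcept>
#include <system_error>

// Detect float-capable std::from_chars via the feature-test macro.
// libstdc++, for example, only defines __cpp_lib_to_chars once the
// floating-point overloads landed (GCC 11).
#if defined(__has_include) && __has_include(<charconv>)
#  include <charconv>
#endif
#if defined(__cpp_lib_to_chars) && __cpp_lib_to_chars >= 201611L
#  define HAVE_FLOAT_FROM_CHARS 1
#else
#  define HAVE_FLOAT_FROM_CHARS 0
#endif

float parseFloat(const char* token)
{
    float value = 0.0f;
#if HAVE_FLOAT_FROM_CHARS
    const auto [ptr, ec] =
        std::from_chars(token, token + std::strlen(token), value);
    (void)ptr;
    if (ec != std::errc{})
        throw std::runtime_error("INI_INVALID_DATA"); // stand-in for the engine's error
#else
    char* end = nullptr;
    value = std::strtof(token, &end);
    if (end == token) // nothing was consumed: not a number
        throw std::runtime_error("INI_INVALID_DATA");
#endif
    return value;
}
```

Note that `strtof` is locale-sensitive and skips leading whitespace, so the fallback is not byte-for-byte identical to `std::from_chars`; for ASCII INI data in the "C" locale the results should agree.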
Reviews (6): Last reviewed commit: "Simplified macro."
```cpp
#if USE_STD_FROM_CHARS_PARSING == -1
    if (value != scanType<Real>(token))
        throw INI_INVALID_DATA;
```
**Float `!=` comparison in validation mode may produce false positives** (Core/GameEngine/Source/Common/INI/INI.cpp, lines 1706-1708)
`sscanf("%f")` and `std::from_chars` are two independently implemented parsers. For most INI values this is fine, but for decimal strings near a floating-point midpoint (halfway rounding boundary) the two parsers can legitimately round in opposite directions and produce results that differ by 1 ULP — causing the validation mode to throw `INI_INVALID_DATA` even though both parsed the same string. If this mode is used to validate a full game initialization, those spurious exceptions will hide real data rather than proving parity.
Consider using an epsilon comparison or checking that the two results round-trip to the same string rather than requiring bitwise identity.
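For the tolerance route, one concrete sketch (illustrative only; `ulpDistance` and `validationEqual` are hypothetical helpers, and IEEE-754 binary32 floats are assumed) accepts a 1-ULP difference by comparing the floats' bit patterns on a monotonic integer scale:

```cpp
#include <cstdint>
#include <cstring>

// Map a float's bit pattern onto a scale where adjacent integers are
// adjacent floats, then measure the distance. Assumes IEEE-754 binary32.
std::int64_t ulpDistance(float a, float b)
{
    std::int32_t ia, ib;
    std::memcpy(&ia, &a, sizeof ia);
    std::memcpy(&ib, &b, sizeof ib);
    if (ia < 0) ia = INT32_MIN - ia; // fold negative floats below zero
    if (ib < 0) ib = INT32_MIN - ib; // so the scale is monotonic
    const std::int64_t d = std::int64_t(ia) - std::int64_t(ib);
    return d < 0 ? -d : d;
}

// Accept the sscanf and from_chars results as "equal" for validation
// purposes when they are at most 1 ULP apart.
bool validationEqual(float sscanfResult, float fromCharsResult)
{
    return ulpDistance(sscanfResult, fromCharsResult) <= 1;
}
```

A genuine parser disagreement on typical INI data would differ by far more than 1 ULP, so this still catches real mismatches while ignoring midpoint rounding noise.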
I realized too late that
This PR replaces 3 cases of `sscanf` that are used to parse integers and floats from ini files. They are replaced with `std::from_chars`, which is a C++17 replacement for C functions like `sscanf`. `std::from_chars` can be a good deal faster: I checked the duration of the engine initialization, and it improved by ~2% for debug and release builds.

For convenience I made it so that `#define USE_STD_FROM_CHARS_PARSING -1` compares the old and new methods, and throws if there's a difference. I didn't come across a difference for the standard ini files.
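The three-way switch described above could look something like this for the integer case (a simplified sketch; function names and the thrown error are assumptions rather than the PR's exact code):

```cpp
#include <charconv>
#include <cstdio>
#include <cstring>
#include <stdexcept>

// USE_STD_FROM_CHARS_PARSING:
//    1 -> parse with std::from_chars
//    0 -> parse with sscanf (the old behaviour)
//   -1 -> parse with sscanf and cross-check against std::from_chars
#ifndef USE_STD_FROM_CHARS_PARSING
#  define USE_STD_FROM_CHARS_PARSING 1
#endif

static int fromCharsInt(const char* token)
{
    int value = 0;
    const auto [ptr, ec] =
        std::from_chars(token, token + std::strlen(token), value);
    (void)ptr;
    if (ec != std::errc{})
        throw std::runtime_error("INI_INVALID_DATA"); // stand-in for the engine's error
    return value;
}

int scanInt(const char* token)
{
#if USE_STD_FROM_CHARS_PARSING == 1
    return fromCharsInt(token);
#else
    int value = 0;
    if (std::sscanf(token, "%d", &value) != 1)
        throw std::runtime_error("INI_INVALID_DATA");
#  if USE_STD_FROM_CHARS_PARSING == -1
    if (value != fromCharsInt(token)) // old and new parser must agree
        throw std::runtime_error("INI_INVALID_DATA");
#  endif
    return value;
#endif
}
```

Keeping the `sscanf` branch behind the macro means the old behaviour can be restored with a one-line change if a platform's `std::from_chars` misbehaves.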