Remove dependencies on FP16 support status. #2403
base: main
Conversation
📝 TAOS-CI Version: 1.5.20200925. Thank you for submitting PR #2403. Please follow the 1 commit/1 PR (one commit per PR) policy to get comments from reviewers quickly. Your PR must pass all verification processes of cibot before a review by reviewers can start. If you are a new member joining this project, please read the manuals in the documentation folder and the wiki page. To monitor the progress of your PR in more detail, visit http://ci.nnstreamer.ai/.
@@ -178,11 +178,9 @@ calc_iou(nntrainer::Tensor &bbox1_x1, nntrainer::Tensor &bbox1_y1,
   if (type_intersection_width == ml::train::TensorDim::DataType::FP32) {
     intersection_width.apply_i<float>(nntrainer::ActiFunc::relu<float>);
   } else if (type_intersection_width == ml::train::TensorDim::DataType::FP16) {
-#ifdef ENABLE_FP16
+    if (!is_fp16_enabled())
+      throw std::runtime_error("Not supported data type: fp16");
Or maybe we can remove these checks and let them be checked at apply_i() or its underlying function.
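For illustration, a minimal sketch of what such a centralized check inside apply_i() could look like; the placement, the type dispatch, and the return code are assumptions, not the actual nntrainer implementation:

// Hypothetical sketch only: reject fp16 work in one place so call sites such
// as calc_iou() no longer need their own is_fp16_enabled() checks.
// Requires <functional>, <type_traits>, and <stdexcept>.
template <typename T>
int Tensor::apply_i(std::function<T(T)> f) {
  if (std::is_same<T, _FP16>::value && !is_fp16_enabled())
    throw std::runtime_error("Not supported data type: fp16");
  // ... existing element-wise application over the tensor buffer ...
  return 0; /* assumed success code; the real API may differ */
}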
cibot: @myungjoo, A builder checker could not be completed because one of the checkers is not completed. In order to find out a reason, please go to http://ci.nnstreamer.ai/nntrainer/ci/repo-workers/pr-checker/2403-202401101715550.16663599014282-0711bc2e4ff8ce50b473fa280b69e77e844ee6f7/.
cibot: @myungjoo, A builder checker could not be completed because one of the checkers is not completed. In order to find out a reason, please go to http://ci.nnstreamer.ai/nntrainer/ci/repo-workers/pr-checker/2403-202401111518590.69792294502258-b39f102221f8a4befd4dc0ee4a8dda8418e13f35/.
@myungjoo, 💯 All CI checkers are successfully verified. Thanks.
cibot: @myungjoo, nntrainer/tensor/half_tensor.cpp includes bug(s). Please fix incorrect coding constructs in your commit before entering a review process.
cibot: @myungjoo, A builder checker could not be completed because one of the checkers is not completed. In order to find out a reason, please go to http://ci.nnstreamer.ai/nntrainer/ci/repo-workers/pr-checker/2403-202401111710550.024688005447388-97bb166b953093a350a4980e2b70622fdd78320d/.
LGTM!
cibot: @myungjoo, nntrainer/tensor/blas_interface.cpp includes bug(s). Please fix incorrect coding constructs in your commit before entering a review process.
cibot: @myungjoo, A builder checker could not be completed because one of the checkers is not completed. In order to find out a reason, please go to http://ci.nnstreamer.ai/nntrainer/ci/repo-workers/pr-checker/2403-202401121455320.91482901573181-8af9d00cb4e30c4c2823d225cf18c7b9f29e5d6c/.
cibot: @myungjoo, nntrainer/tensor/blas_interface.cpp includes bug(s). Please fix incorrect coding constructs in your commit before entering a review process.
cibot: @myungjoo, A builder checker could not be completed because one of the checkers is not completed. In order to find out a reason, please go to http://ci.nnstreamer.ai/nntrainer/ci/repo-workers/pr-checker/2403-202401121635590.17495489120483-a658336e6aa8f0a78564d7f1927a7569bf024b9a/.
api/ccapi/include/tensor_dim.h
#ifdef ENABLE_FP16
#ifdef USE__FP16
#define _FP16 __fp16
#else
#define _FP16 _Float16
#endif
#else /* !ENABLE_FP16 */
#define _FP16 uint16_t /* Keep the FP16 programming interface, but don't allow using it in run-time */
After this fix, I think we'll need to add test cases to certify that every part is well covered.
In the case of tensors, something like the following:
Tensor fp32_tensor(..., Tdatatype::FP32);
EXPECT_THROW({ fp32_tensor.getValue<_FP16>(1); }, std::invalid_argument);
int idx = 0;
EXPECT_THROW(
  {
    fp32_tensor.apply(
      (std::function<_FP16(_FP16)>)[&](_FP16 in) { return idx++ % 2; });
  },
  std::invalid_argument);
Previously, there was no need to check, because _FP16 could not be used at all when fp16 was not enabled.
If everyone agrees, I'll work on adding test cases at the tensor level.
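If it helps, a rough GTest sketch of such a negative case could look like this; the constructor arguments, test name, and the thrown exception type are assumptions based on the snippet above, not existing test code:

#include <gtest/gtest.h>
#include <tensor.h>

// Hypothetical negative test: using the _FP16 interface of an FP32 tensor
// should be rejected when fp16 support is unavailable at run time.
TEST(nntrainer_Tensor, fp16_interface_rejected_n) {
  nntrainer::Tensor fp32_tensor(1, 1, 2, 4); // FP32 by default (assumed)
  EXPECT_THROW(fp32_tensor.getValue<_FP16>(1), std::invalid_argument);

  int idx = 0;
  EXPECT_THROW(
    {
      fp32_tensor.apply(
        (std::function<_FP16(_FP16)>)[&](_FP16 in) { return idx++ % 2; });
    },
    std::invalid_argument);
}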
a658336
to
6f56848
Compare
cibot: @myungjoo, nntrainer/tensor/blas_interface.cpp includes bug(s). Please fix incorrect coding constructs in your commit before entering a review process.
cibot: @myungjoo, A builder checker could not be completed because one of the checkers is not completed. In order to find out a reason, please go to http://ci.nnstreamer.ai/nntrainer/ci/repo-workers/pr-checker/2403-202401121720560.46521592140198-6f56848e31db42dc880f2a6a3451c6a92614e81f/.
_FP16 is the macro to unify different fp16 typenames across different architectures or libraries. Note that util_simd.cpp has the correct name (_FP16) while the header has the incorrect naming (__fp16). Although this does not break the build or execution, it is not good for readability and dependency clean-ups.
CC: @skykongkong8
Signed-off-by: MyungJoo Ham <[email protected]>
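To illustrate the point (hypothetical code, not part of the PR): a routine written once against _FP16 compiles with __fp16 on armv8.2, with _Float16 elsewhere, and even with the uint16_t placeholder when fp16 is disabled, where misuse is meant to be rejected at run time rather than fenced off with #ifdef at every call site:

// Hypothetical helper written against the unified _FP16 typename.
// With ENABLE_FP16 off, _FP16 is only a uint16_t placeholder and callers are
// expected to be rejected at run time before reaching code like this.
static _FP16 half_sum(const _FP16 *v, unsigned int n) {
  _FP16 acc = static_cast<_FP16>(0);
  for (unsigned int i = 0; i < n; ++i)
    acc = static_cast<_FP16>(acc + v[i]);
  return acc;
}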
cibot: @myungjoo, A builder checker could not be completed because one of the checkers is not completed. In order to find out a reason, please go to http://ci.nnstreamer.ai/nntrainer/ci/repo-workers/pr-checker/2403-202401171150420.69172310829163-015380d503d4d3deb9daf81de023a08c987485c8/.
Most modules should NOT depend on the existence of FP16 support. Let a single (or minimal) entity (a single class?) know and handle it. This is a suggestion commit that shows an example of how to refactor it. Most modules are not yet touched by this commit.
Signed-off-by: MyungJoo Ham <[email protected]>
cibot: @myungjoo, A builder checker could not be completed because one of the checkers is not completed. In order to find out a reason, please go to http://ci.nnstreamer.ai/nntrainer/ci/repo-workers/pr-checker/2403-202401171414420.47200703620911-44e38f40192ea522591d310ed5126b7e32a84cce/.
LGTM 👍
We can consider this PR a proposal for better readability and maintainability. We are in the middle of refactoring Tensor, BLAS, and Multi-Type Support, and the code will change a lot.
This allows writing FP16-independent code without breaking the current code. It is a non-invasive subset of nnstreamer#2403 that addresses the concern of nnstreamer#2403 (comment). I believe nntrainer can use this during the refactoring so that we do not introduce additional code divergence.
Signed-off-by: MyungJoo Ham <[email protected]>
Most modules should NOT depend on the existence of FP16 support. Let a single (or minimal) entity (a single class?) know and handle it.
This is a suggestion commit that shows an example of how to refactor it. Most modules are not yet touched by this commit.
CC: @skykongkong8 (I've seen some #if/#endif in your commits.)
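As a rough sketch of that idea, a single header could own the ENABLE_FP16 knowledge while every other module only queries it; the file names and function placement below are hypothetical, not the actual layout of this PR:

// fp16_support.h (hypothetical): the only place that looks at ENABLE_FP16.
namespace nntrainer {
constexpr bool is_fp16_enabled() {
#ifdef ENABLE_FP16
  return true;
#else
  return false;
#endif
}
} // namespace nntrainer

// some_module.cpp (hypothetical): no #ifdef needed, only a run-time query.
#include <stdexcept>
void require_fp16_support() {
  if (!nntrainer::is_fp16_enabled())
    throw std::runtime_error("Not supported data type: fp16");
}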