Use markdown formatting to make it easier for new users to read
Signed-off-by: Hongxu Jia <hongxu.jia@windriver.com>
- Move the future plan from README to TODO
- Offline build is already supported, so remove it from the future plan
Signed-off-by: Hongxu Jia <hongxu.jia@windriver.com>
Move the limitations section from README to LIMITATION.md
Signed-off-by: Hongxu Jia <hongxu.jia@windriver.com>
Move the maintenance section from README to MAINTAINERS.md
Signed-off-by: Hongxu Jia <hongxu.jia@windriver.com>
Move `build and run' section from README to BUILD.md
Signed-off-by: Hongxu Jia <hongxu.jia@windriver.com>
Set the title of the README to meta-tensorflow
Signed-off-by: Hongxu Jia <hongxu.jia@windriver.com>
Follow tensorflow brand guidelines:
`Please only use the TensorFlow name and marks when accurately
referencing this software distribution... When referring to our
marks, please include the following attribution statement:'
https://www.tensorflow.org/extras/tensorflow_brand_guidelines.pdf
Signed-off-by: Hongxu Jia <hongxu.jia@windriver.com>
Signed-off-by: Hongxu Jia <hongxu.jia@windriver.com>
The openjdk-8-native 212b04 is a pre-built binary from ojdkbuild, a
community build that uses source code from the OpenJDK project.
[https://github.com/ojdkbuild/ojdkbuild]
The reasons to use it rather than building from source with meta-java:
- meta-java introduces 38 extra dependency recipes for openjdk-8-native;
  building without meta-java saves about 20% of the build time, and the
  build cannot be broken by a meta-java build failure
- openjdk-8-native is only used to build bazel-native, and bazel-native
  then builds everything else; no target recipe/package requires
  openjdk-8-native directly
The pre-built binary is only supported on x86-64; for other hosts, use
the openjdk-8-native provided by meta-java. All you need to do is add
the meta-java layer to your build.
The idea comes from meta-renesas-ai:
https://github.com/renesas-rz/meta-renesas-ai/blob/master/meta-tensorflow/recipes-devtools/openjdk/openjdk-8-native_151-1.b12.bb
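For a non-x86-64 host, wiring in meta-java can be sketched like this
(the `../meta-java` path is a placeholder for your actual checkout, not
part of this change):

```shell
# Add meta-java to the build so its openjdk-8-native recipe is used
# instead of the pre-built x86-64 binary.
# Run from inside an initialized build directory; adjust the path to
# wherever meta-java is checked out.
bitbake-layers add-layer ../meta-java
```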
Signed-off-by: Hongxu Jia <hongxu.jia@windriver.com>
Once bazel fetches tarballs from the internet, it saves them to distdir,
and subsequent builds can then be performed offline.
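Reusing the saved tarballs relies on bazel's standard `--distdir`
option; a minimal sketch (the directory path and build target below are
illustrative examples, not taken from this change):

```shell
# Point bazel at the directory of previously fetched tarballs so it
# resolves external dependencies from disk instead of the network.
# /path/to/distdir is a placeholder for the saved tarball directory.
bazel build --distdir=/path/to/distdir \
    //tensorflow/tools/pip_package:build_pip_package
```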
Signed-off-by: Hongxu Jia <hongxu.jia@windriver.com>
Signed-off-by: Hongxu Jia <hongxu.jia@windriver.com>
The previous model was not accurate; refer to upstream:
https://www.tensorflow.org/lite/models/image_classification/overview
Update to the latest model.
Signed-off-by: Hongxu Jia <hongxu.jia@windriver.com>
Signed-off-by: Hongxu Jia <hongxu.jia@windriver.com>
Add the required BSP layers; beagleboard, raspberrypi,
genericx86 (including atom), genericx86-64, and
imx6sxsabresd now build successfully.
So remove beagleboard, raspberrypi, and atom from the future plan.
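Selecting one of the verified machines is just a local.conf setting; a
minimal sketch (the machine and image names are examples, assuming the
corresponding BSP layer is already in bblayers.conf):

```shell
# conf/local.conf: pick one of the machines verified above, e.g.
MACHINE = "raspberrypi"
# then build an image, for example:
#   bitbake core-image-minimal
```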
Signed-off-by: Hongxu Jia <hongxu.jia@windriver.com>
Signed-off-by: Hongxu Jia <hongxu.jia@windriver.com>