Add flax and friends for training neural networks
This also updates our formatter, which slightly changed the style it
expects, and updates all the affected code to match.
optax provides the actual optimizers, jax is the underlying accelerated
linear algebra library, and tensorflow is used for loading datasets and
exporting models.
Change-Id: Ic4c3b425cda74267e1d0ad1615c42452cbefab8a
Signed-off-by: Austin Schuh <austin.linux@gmail.com>
diff --git a/motors/pistol_grip/generate_cogging.py b/motors/pistol_grip/generate_cogging.py
index 5b0a6e6..dc9f87d 100644
--- a/motors/pistol_grip/generate_cogging.py
+++ b/motors/pistol_grip/generate_cogging.py
@@ -8,7 +8,7 @@
 def main(argv):
     if len(argv) < 4:
-        print 'Args: input output.cc struct_name'
+        print('Args: input output.cc struct_name')
         return 1
     data_sum = [0.0] * 4096
     data_count = [0] * 4096