diff --git a/android/README.md b/android/README.md
index 8c185a030..980de432f 100755
--- a/android/README.md
+++ b/android/README.md
@@ -9,8 +9,8 @@
 Click on the links below to read about the features of the apps.
 
-- [OpenBot app](app/README.md)
-- [Controller app](controller/README.md)
+- [Robot App](app/README.md)
+- [Controller App](controller/README.md)
 
 ## Install the apps
 
diff --git a/android/app/README.md b/android/app/README.md
index f7e134e8e..18e925c48 100644
--- a/android/app/README.md
+++ b/android/app/README.md
@@ -1,4 +1,4 @@
-# OpenBot App
+# Robot App
 
 ## DISCLAIMERS
 
@@ -6,24 +6,25 @@
 care is necessary when using automated control (e.g. person following or driving policy). Make sure you always have a game controller connected and are familiar with the key mapping so you can stop the vehicle at any time. Use at your own risk!
 2. **App under development:** The application is under development and may crash or exhibit unexpected behaviour depending on your phone model and version of the operating system. Make sure to test all functionalities with no wheels connected. Use at your own risk!
 
-## Default Activity
+## App Screens
 
-The app starts with the [DefaultActivity](src/main/java/org/openbot/robot/DefaultActivity.java) which includes the most important features of the OpenBot app in a single screen. It displays the connection status to the vehicle and reports measurements from vehicle sensors. The robot can be controlled by standard BT game controllers or another smartphone running the OpenBot [controller app](../controller). We have also implemented a data logger to collect datasets with the robot. Currently, we record readings from following sensors: camera, gyroscope, accelerometer, magnetometer, ambient light sensor, and barometer. Using the Android API, we are able to obtain the following sensor readings: RGB images, angular speed, linear acceleration, gravity, magnetic field strength, light intensity, atmospheric pressure, latitude, longitude, altitude, bearing, and speed. In addition to the phone sensors, we record body sensor readings (wheel odometry, obstacle distance and battery voltage), which are transmitted via the serial link. We also record and timestamp control signals received from a connected controller, if present. Lastly, we integrate several neural networks for person following and autonomous navigation.
+### Main Menu
 
-### Code Structure
+The app starts with a menu screen that shows all available screens. The settings screen can be opened by tapping the icon in the top right corner. Tapping any of the other icons opens the corresponding screen; the functionalities of these screens are explained below.
 
-The [TensorFlow Lite Object Detection Android Demo](https://github.com/tensorflow/examples/tree/master/lite/examples/object_detection/android) was used as starting point to integrate TFLite models and obtain the camera feed. The [DefaultActivity](src/main/java/org/openbot/robot/DefaultActivity.java) runs the main thread and inherits from the [CameraActivity](src/main/java/org/openbot/robot/CameraActivity.java) to manage the camera and UI. The [SensorService](src/main/java/org/openbot/robot/SensorService.java) reads all other phone sensors and logs them. The [ServerService](src/main/java/org/openbot/robot/ServerService.java) and [NsdService](src/main/java/org/openbot/robot/NsdService.java) establish a connection to a local [Python server](../../policy/README.md#web-app) with a React frontend. If you collect data it can be uploaded automatically for visualization, training ML models and downloading trained models to the robot. The [env](src/main/java/org/openbot/env) folder contains utility classes such as the [Vehicle](src/main/java/org/openbot/env/Vehicle.java) interface, [GameController](src/main/java/org/openbot/env/GameController.java) interface, [PhoneController](src/main/java/org/openbot/env/PhoneController.java) interface and an [AudioPlayer](src/main/java/org/openbot/env/AudioPlayer.java) for the audible feedback. The [tflite](src/main/java/org/openbot/tflite) folder contains the model definitions for the [Autopilot](src/main/java/org/openbot/tflite/Autopilot.java) and [Detector](src/main/java/org/openbot/tflite/Detector.java) networks.
+<p align="center">
+  <img src="../../docs/images/screen_main.jpg" alt="Main Menu" />
+  <img src="../../docs/images/screen_settings.jpg" alt="Settings Menu" />
+</p>
-### User Interface
+### Default
 
-In the following we briefly explain the user interface.
+The [DefaultActivity](src/main/java/org/openbot/original/DefaultActivity.java) includes the most important features of the OpenBot app in a single screen. It displays the connection status to the vehicle and reports measurements from vehicle sensors. The robot can be controlled by standard BT game controllers or another smartphone running the OpenBot [controller app](../controller). We have also implemented a data logger to collect datasets with the robot. Currently, we record readings from the following sensors: camera, gyroscope, accelerometer, magnetometer, ambient light sensor, and barometer. Using the Android API, we are able to obtain the following sensor readings: RGB images, angular speed, linear acceleration, gravity, magnetic field strength, light intensity, atmospheric pressure, latitude, longitude, altitude, bearing, and speed. In addition to the phone sensors, we record body sensor readings (wheel odometry, obstacle distance and battery voltage), which are transmitted via the serial link. We also record and timestamp control signals received from a connected controller, if present. Lastly, we integrate several neural networks for person following and autonomous navigation. A minimal sketch of how the phone sensors can be read is shown after the screenshot below.
 
-<p align="center">
-  <img src="../../docs/images/app_gui_1.jpg" alt="App GUI" />
-  <img src="../../docs/images/app_gui_2.jpg" alt="App GUI" />
+<p align="center">
+  <img src="../../docs/images/screen_default.jpg" alt="App GUI" />
 </p>
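+For illustration, here is a minimal sketch of reading two of these sensors through the standard Android `SensorManager` API. This is the general pattern only, not the actual `SensorService` implementation; the `ImuReader` class and its logger are placeholders:
+
+```java
+// Illustrative sketch: reading the phone IMU with the standard Android API.
+import android.hardware.Sensor;
+import android.hardware.SensorEvent;
+import android.hardware.SensorEventListener;
+import android.hardware.SensorManager;
+
+public class ImuReader implements SensorEventListener {
+
+  public void start(SensorManager manager) {
+    // SENSOR_DELAY_GAME is roughly 50 Hz; pick a rate that matches your logging needs.
+    manager.registerListener(
+        this, manager.getDefaultSensor(Sensor.TYPE_GYROSCOPE), SensorManager.SENSOR_DELAY_GAME);
+    manager.registerListener(
+        this, manager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_GAME);
+  }
+
+  @Override
+  public void onSensorChanged(SensorEvent event) {
+    // event.timestamp is in nanoseconds; logging it with the values allows the
+    // sensor streams to be synchronized with the camera frames later.
+    if (event.sensor.getType() == Sensor.TYPE_GYROSCOPE) {
+      logLine("gyro", event.timestamp, event.values); // hypothetical logger
+    } else if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
+      logLine("acc", event.timestamp, event.values);
+    }
+  }
+
+  @Override
+  public void onAccuracyChanged(Sensor sensor, int accuracy) {}
+
+  private void logLine(String tag, long timestampNs, float[] values) {
+    android.util.Log.d(tag, timestampNs + " " + java.util.Arrays.toString(values));
+  }
+}
+```
+
+The `SensorManager` instance is obtained from a `Context` via `getSystemService(Context.SENSOR_SERVICE)`.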
 
 #### USB Connection
 
 The drop-down menu is used to set the baud rate. The default is 115200 and you should not need to change this unless you mess with the Arduino firmware. The app will attempt to connect automatically, but in case you encounter issues you can use this switch to disconnect/connect.
 
@@ -34,7 +35,10 @@ The field **Battery** displays the battery voltage as measured by the Arduino vi
 
 #### Control
 
-The first button is for selecting the **control mode**. Currently, the only control mode is **Gamepad**. In the future, it will be possible to control the robot with another smartphone, i.e. **Phone** or via **WebRTC**.
+The first button is for selecting the **control mode**. There are two different control modes:
+
+- **Gamepad**: The app receives controls from a connected BT controller.
+- **Phone**: The robot can be controlled via another smartphone with the controller app installed or through a Python script running on a computer connected to the same network (see the sketch below).
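+
+For illustration, a minimal sketch of the second option: sending a drive command to the robot from a computer on the same network. The IP address, port and JSON message format here are assumptions made for the sketch, not the documented protocol; consult the [controller app](../controller) documentation and the Python examples for the real interface:
+
+```java
+// Illustrative sketch: sending a differential drive command over a plain TCP socket.
+// "192.168.0.42", port 19400 and the message format are placeholders.
+import java.io.OutputStreamWriter;
+import java.io.Writer;
+import java.net.Socket;
+import java.nio.charset.StandardCharsets;
+
+public class DriveCommandClient {
+  public static void main(String[] args) throws Exception {
+    try (Socket socket = new Socket("192.168.0.42", 19400); // hypothetical phone IP and port
+        Writer out = new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.UTF_8)) {
+      // Left and right wheel values in [-1, 1]; here: drive straight at half speed.
+      out.write("{\"driveCmd\": {\"l\": 0.5, \"r\": 0.5}}\n");
+      out.flush();
+    }
+  }
+}
+```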
 
 The second button is for selecting the **drive mode**. There are three different drive modes when using a game controller (e.g. PS4):
 
@@ -63,15 +67,16 @@ The switch on the right is used to toggle logging on and off. On the game contro
 
 #### Camera
 
-The first item shows the preview resolution. The second item shows the crop resolution. This is the image that is used as input to the neural networks. You will notice that this resolution changes depending on which model you select below. If you train your own autopilot, make sure to select the `AUTOPILOT_F` model. The crop resolution should show `256x96`. The switch on the right is used to toggle between the rear and the front camera.
+The first item shows the preview resolution. The second item shows the crop resolution. This is the image that is used as input to the neural networks. You will notice that this resolution changes depending on which model you select below. If you train your own autopilot, make sure to select the `AUTOPILOT_F` model. The crop resolution should show `256x96`. The switch on the right is used to toggle between the rear and the front camera.
 
 #### Model
 
-There are three models that come with the app:
+There are two models that come with the app:
 
-- **DETECTOR_V1_1_0_Q**: This model is used for person following. It uses a SSD object detector with MobileNet V1 backbone. The model is quantized for better performance on embedded devices.
-- **DETECTOR_V3_S_Q**: This model is used for person following. It uses a SSD object detector with MobileNet V3 backbone. The model is quantized for better performance on embedded devices.
-- **AUTOPILOT_F**: This model is used for autonomous navigation. It will predict controls directly from the camera input. Chances are that it will not work in your environment. You should follow our instructions to train your own [Driving Policy](../../policy) and replace it.
+- **MobileNetV1-300**: This model is used for person following. It uses an SSD object detector with MobileNet V1 backbone. The model is quantized for better performance on embedded devices.
+- **CIL-Mobile**: This model is used for autonomous navigation. It will predict controls directly from the camera input. Chances are that it will not work in your environment. You should follow our instructions to train your own [Driving Policy](../../policy) and replace it.
+
+Additional models can be downloaded from the Model Management screen.
 
 The switch on the right is used to turn the network on and off. When the network is running, it produces the controls for the robot and the game controller is disabled. However, you may still use the buttons on the game controller, for example to toggle this switch with the R1 trigger button to regain control of the robot.
 
@@ -85,20 +90,12 @@ Use the drop-down menu to select the device on which the neural network should b
 
 If a model is active, the inference speed in [ms] will be displayed next to the device which is running the model.
 
-## Additional fragments
-
-Pressing the back button opens up the menu screen that shows all available activities and fragments. Users may choose a custom screen to use specific features rather than the default activity.
-
-<p align="center">
-<img src="../../docs/images/menu_selector.jpg" alt="App GUI" />
-</p>
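+For illustration, a minimal sketch of loading a TFLite model and choosing the inference device, assuming the `org.tensorflow:tensorflow-lite` and `tensorflow-lite-gpu` dependencies; the file path and thread count are placeholders:
+
+```java
+// Illustrative sketch: load a .tflite model and run it on the GPU or the CPU.
+import java.io.FileInputStream;
+import java.nio.MappedByteBuffer;
+import java.nio.channels.FileChannel;
+import org.tensorflow.lite.Interpreter;
+import org.tensorflow.lite.gpu.GpuDelegate;
+
+public class ModelRunner {
+  public static Interpreter load(String modelPath, boolean useGpu) throws Exception {
+    try (FileInputStream input = new FileInputStream(modelPath);
+        FileChannel channel = input.getChannel()) {
+      // Memory-map the model so the interpreter can read it without copying.
+      MappedByteBuffer model = channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());
+      Interpreter.Options options = new Interpreter.Options();
+      if (useGpu) {
+        options.addDelegate(new GpuDelegate()); // unsupported ops fall back to the CPU
+      } else {
+        options.setNumThreads(4); // multi-threaded CPU inference
+      }
+      return new Interpreter(model, options);
+    }
+  }
+}
+```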
-
-### Free Roam
+### Free Roam
 
 Free Roam offers simple robot control with real time updates and information about battery, speed and distance from surfaces.
 
 <p align="center">
-<img src="../../docs/images/game_screen.jpg" alt="Alt text" />
+<img src="../../docs/images/screen_free_roam.jpg" alt="Alt text" />
 </p>
 
 - **Battery**: The battery icon shows realtime battery levels of the connected robot.
 
@@ -115,36 +112,69 @@ Free Roam offers simple robot control with real time updates and information abo
 
 - **Sonar**: The sonar view shows the distance of the robot from an oncoming object in cm.
 
-- **Control**: Controller, Drive Mode and Speed are used to control robot settings as described above in Control section.
+- **Control**: Controller, Drive Mode and Speed are used to control robot settings as described above in the Control section. A sketch of a typical joystick-to-wheel mapping follows this list.
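+
+For illustration, one common way a joystick position can be mapped to left and right wheel commands on a differential drive robot. This is a generic sketch, not necessarily the mapping the app's drive modes use:
+
+```java
+// Illustrative sketch: joystick (x, y) to differential drive wheel commands.
+public final class DifferentialDrive {
+  /** x: steering in [-1, 1] (right positive); y: throttle in [-1, 1] (forward positive). */
+  public static float[] fromJoystick(float x, float y) {
+    float left = y + x;
+    float right = y - x;
+    // Normalize so that neither wheel command leaves the [-1, 1] range.
+    float max = Math.max(1f, Math.max(Math.abs(left), Math.abs(right)));
+    return new float[] {left / max, right / max};
+  }
+}
+```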
 
 ### Data Collection
 
 Simple UI for collection of data sets.
 
 <p align="center">
-<img src="../../docs/images/logger_screen.jpg" alt="Alt text" />
+<img src="../../docs/images/screen_data_collection.jpg" alt="Alt text" />
 </p>
 
 - **Preview Resolution**: Used to switch between resolutions of the camera preview. There are 3 settings:
     - ***FULL_HD*** (1920x1080p)
     - ***HD*** (1280x720p)
     - ***SD*** (640x360)
-- **Training Resolution**: Used to switch between resolutions of images saved for training different models.
+- **Model Resolution**: Used to switch between resolutions of images saved for training different models (a sketch of the frame-saving step follows below).
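+
+For illustration, a minimal sketch of the core of such a data-collection loop: saving a camera frame under a timestamped file name so it can later be matched with the logged sensor and control records. The `FrameLogger` class, paths and JPEG quality are placeholders:
+
+```java
+// Illustrative sketch: persist one camera frame with a timestamp in its file name.
+import android.graphics.Bitmap;
+import java.io.File;
+import java.io.FileOutputStream;
+import java.io.IOException;
+
+public class FrameLogger {
+  public static void saveFrame(Bitmap frame, File datasetDir) throws IOException {
+    // SystemClock.elapsedRealtimeNanos() would match sensor timestamps more closely,
+    // but currentTimeMillis keeps this sketch dependency-free.
+    File out = new File(datasetDir, System.currentTimeMillis() + ".jpeg");
+    try (FileOutputStream stream = new FileOutputStream(out)) {
+      frame.compress(Bitmap.CompressFormat.JPEG, 90, stream);
+    }
+  }
+}
+```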
+
+### Controller Mapping
+
+Simple UI to check the button and joystick mapping of a connected BT controller.
+
+<p align="center">
+<img src="../../docs/images/screen_controller_mapping.jpg" alt="Alt text" />
+</p>
+
+### Autopilot
+
+Simple UI for running autopilot models.
+
+<p align="center">
+<img src="../../docs/images/screen_autopilot.jpg" alt="Alt text" />
+</p>
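+
+For illustration, a minimal sketch of preparing a camera frame for an autopilot model: scaling it to the `256x96` crop resolution mentioned in the Camera section and packing it into a float input buffer. The normalization to `[0, 1]` is an assumption; match whatever your trained policy expects:
+
+```java
+// Illustrative sketch: Bitmap -> 256x96 float RGB buffer for model input.
+import android.graphics.Bitmap;
+import java.nio.ByteBuffer;
+import java.nio.ByteOrder;
+
+public class AutopilotInput {
+  public static ByteBuffer fromBitmap(Bitmap frame) {
+    Bitmap crop = Bitmap.createScaledBitmap(frame, 256, 96, true);
+    int[] pixels = new int[256 * 96];
+    crop.getPixels(pixels, 0, 256, 0, 0, 256, 96);
+    // 256 * 96 pixels, 3 channels, 4 bytes per float.
+    ByteBuffer buffer = ByteBuffer.allocateDirect(256 * 96 * 3 * 4).order(ByteOrder.nativeOrder());
+    for (int pixel : pixels) {
+      buffer.putFloat(((pixel >> 16) & 0xFF) / 255.0f); // R
+      buffer.putFloat(((pixel >> 8) & 0xFF) / 255.0f); // G
+      buffer.putFloat((pixel & 0xFF) / 255.0f); // B
+    }
+    buffer.rewind();
+    return buffer;
+  }
+}
+```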
+
+### Object Tracking
 
-### AI Models
+Simple UI for tracking objects of 80 different classes. A sketch of the person-following logic is shown after the screenshots.
 
-Simple UI for running AI models.
+<p align="center">
+<img src="../../docs/images/screen_object_tracking_1.jpg" alt="Alt text" />
+<img src="../../docs/images/screen_object_tracking_2.jpg" alt="Alt text" />
+</p>
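+
+For illustration, the basic idea behind person following: steer so that the tracked bounding box stays horizontally centered in the frame. The gain and the fixed throttle are placeholders, not the app's actual tracker logic:
+
+```java
+// Illustrative sketch: proportional steering toward a detected bounding box.
+import android.graphics.RectF;
+
+public class FollowController {
+  /** Returns {left, right} wheel commands in [-1, 1] for a detection in a frame of given width. */
+  public static float[] follow(RectF box, int frameWidth) {
+    float centerX = box.centerX() / frameWidth; // 0..1 across the image
+    float error = centerX - 0.5f; // negative: person is to the left
+    float turn = 0.8f * error; // proportional steering gain (placeholder)
+    float forward = 0.4f; // constant cruise throttle (placeholder)
+    return new float[] {clamp(forward + turn), clamp(forward - turn)};
+  }
+
+  private static float clamp(float v) {
+    return Math.max(-1f, Math.min(1f, v));
+  }
+}
+```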
+
+### Model Management
 
 <p align="center">
-<img src="../../docs/images/ai_screen.jpg" alt="Alt text" />
+<img src="../../docs/images/screen_model_management.jpg" alt="Alt text" />
 </p>
+The following AI models can be downloaded:
+
+- **MobileNetV3-320**: This model is used for person following. It uses an SSD object detector with MobileNet V3 backbone. The model is quantized for better performance on embedded devices.
+- **YoloV4-tiny-224**: This model is used for person following. It uses the tiny version of YoloV4, a state-of-the-art object detector. The model is quantized for better performance on embedded devices.
+- **YoloV4-tiny-416**: This model is used for person following. It uses the tiny version of YoloV4, a state-of-the-art object detector. The model is quantized for better performance on embedded devices.
+- **YoloV4-224**: This model is used for person following. It uses the state-of-the-art object detector YoloV4. The model is quantized for better performance on embedded devices.
+
 ## Add your own fragment
 
 Please refer to the [ContributionGuide](ContributionGuide.md) to learn how to add your own fragments to the OpenBot app.
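+
+For illustration, the bare skeleton of a new screen as an AndroidX fragment. The class name and view contents are placeholders; the actual base classes, layouts and navigation wiring are described in the guide:
+
+```java
+// Illustrative sketch: a minimal fragment that could back a custom screen.
+import android.os.Bundle;
+import android.view.LayoutInflater;
+import android.view.View;
+import android.view.ViewGroup;
+import android.widget.TextView;
+import androidx.annotation.NonNull;
+import androidx.annotation.Nullable;
+import androidx.fragment.app.Fragment;
+
+public class MyRobotFragment extends Fragment {
+  @Nullable
+  @Override
+  public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container,
+      @Nullable Bundle savedInstanceState) {
+    // A real screen would inflate an XML layout here; a plain view keeps the sketch self-contained.
+    TextView view = new TextView(requireContext());
+    view.setText("Hello from a custom OpenBot screen");
+    return view;
+  }
+}
+```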
 
+## Code Structure
+
+The [TensorFlow Lite Object Detection Android Demo](https://github.com/tensorflow/examples/tree/master/lite/examples/object_detection/android) was used as a starting point to integrate TFLite models and obtain the camera feed. The [DefaultActivity](src/main/java/org/openbot/robot/DefaultActivity.java) runs the main thread and inherits from the [CameraActivity](src/main/java/org/openbot/robot/CameraActivity.java) to manage the camera and UI. The [SensorService](src/main/java/org/openbot/robot/SensorService.java) reads all other phone sensors and logs them. The [ServerService](src/main/java/org/openbot/robot/ServerService.java) and [NsdService](src/main/java/org/openbot/robot/NsdService.java) establish a connection to a local [Python server](../../policy/README.md#web-app) with a React frontend. If you collect data, it can be uploaded automatically for visualization and for training ML models, and the trained models can be downloaded back to the robot. The [env](src/main/java/org/openbot/env) folder contains utility classes such as the [Vehicle](src/main/java/org/openbot/env/Vehicle.java) interface, the [GameController](src/main/java/org/openbot/env/GameController.java) interface, the [PhoneController](src/main/java/org/openbot/env/PhoneController.java) interface and an [AudioPlayer](src/main/java/org/openbot/env/AudioPlayer.java) for the audible feedback. The [tflite](src/main/java/org/openbot/tflite) folder contains the model definitions for the [Autopilot](src/main/java/org/openbot/tflite/Autopilot.java) and [Detector](src/main/java/org/openbot/tflite/Detector.java) networks.
+
 ## Next (optional)
 
 Train your own [Driving Policy](../../policy/README.md)
 
diff --git a/docs/images/ai_screen.jpg b/docs/images/ai_screen.jpg
deleted file mode 100644
index dfb92ef69..000000000
Binary files a/docs/images/ai_screen.jpg and /dev/null differ
diff --git a/docs/images/app_gui_1.jpg b/docs/images/app_gui_1.jpg
deleted file mode 100644
index 6268c67e4..000000000
Binary files a/docs/images/app_gui_1.jpg and /dev/null differ
diff --git a/docs/images/app_gui_2.jpg b/docs/images/app_gui_2.jpg
deleted file mode 100644
index 72ba36e07..000000000
Binary files a/docs/images/app_gui_2.jpg and /dev/null differ
diff --git a/docs/images/logger_screen.jpg b/docs/images/logger_screen.jpg
deleted file mode 100644
index 2fb6b3489..000000000
Binary files a/docs/images/logger_screen.jpg and /dev/null differ
diff --git a/docs/images/menu_selector.jpg b/docs/images/menu_selector.jpg
deleted file mode 100644
index 069ce30dd..000000000
Binary files a/docs/images/menu_selector.jpg and /dev/null differ
diff --git a/docs/images/screen_autopilot.jpg b/docs/images/screen_autopilot.jpg
new file mode 100644
index 000000000..3ca906967
Binary files /dev/null and b/docs/images/screen_autopilot.jpg differ
diff --git a/docs/images/screen_controller_mapping.jpg b/docs/images/screen_controller_mapping.jpg
new file mode 100644
index 000000000..da6c02ab6
Binary files /dev/null and b/docs/images/screen_controller_mapping.jpg differ
diff --git a/docs/images/screen_data_collection.jpg b/docs/images/screen_data_collection.jpg
new file mode 100644
index 000000000..ed442eedd
Binary files /dev/null and b/docs/images/screen_data_collection.jpg differ
diff --git a/docs/images/screen_default.jpg b/docs/images/screen_default.jpg
new file mode 100644
index 000000000..0ffa56d84
Binary files /dev/null and b/docs/images/screen_default.jpg differ
diff --git a/docs/images/game_screen.jpg b/docs/images/screen_free_roam.jpg
similarity index 100%
rename from docs/images/game_screen.jpg
rename to docs/images/screen_free_roam.jpg
diff --git a/docs/images/screen_main.jpg b/docs/images/screen_main.jpg
new file mode 100644
index 000000000..7630ad9aa
Binary files /dev/null and b/docs/images/screen_main.jpg differ
diff --git a/docs/images/screen_model_management.jpg b/docs/images/screen_model_management.jpg
new file mode 100644
index 000000000..40dfdf0b0
Binary files /dev/null and b/docs/images/screen_model_management.jpg differ
diff --git a/docs/images/screen_object_tracking_1.jpg b/docs/images/screen_object_tracking_1.jpg
new file mode 100644
index 000000000..fe9eb5269
Binary files /dev/null and b/docs/images/screen_object_tracking_1.jpg differ
diff --git a/docs/images/screen_object_tracking_2.jpg b/docs/images/screen_object_tracking_2.jpg
new file mode 100644
index 000000000..6937bd447
Binary files /dev/null and b/docs/images/screen_object_tracking_2.jpg differ
diff --git a/docs/images/screen_settings.jpg b/docs/images/screen_settings.jpg
new file mode 100644
index 000000000..17d245f52
Binary files /dev/null and b/docs/images/screen_settings.jpg differ