A Python implementation of the server that handles the client data and application logic
- A backend server that handles data processing and runs various services.
- It takes input from a variety of sources, including video streams and the WebSocket Server.
- Designed to make it easy for developers to implement new services and to support real-time data processing.
- Make sure `python3` is installed
- Install `conda` (Miniconda). Note that some packages may not work with Anaconda.
  - Upgrade pip, using `pip install --upgrade pip setuptools wheel`
- Create new conda environment `tom` using `conda env create -f environment-cpu.yml`
  - If you have a previous environment, update it with `conda env update --file environment-cpu.yml --prune`.
  - To completely remove the previous environment, run `conda remove -n tom --all`, then recreate it.
  - For ARM Macs (M1-Mn chips):
    - If the installation fails due to `pyaudio`, please follow this
    - If the installation fails due to `egg_info`, change the dependency `psycopg2` to `psycopg2-binary` in `environment-cpu.yml`
    - If the installation fails due to `googlemaps`, either remove it from `environment-cpu.yml` or install it separately using `pip install --use-pep517 googlemaps` after activating the `tom` environment.
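  A minimal sketch of the environment commands above, run from a terminal in the repository root (assuming Miniconda is already installed):

  ```bash
  # Upgrade pip tooling
  pip install --upgrade pip setuptools wheel

  # Create the `tom` environment from the CPU environment file
  conda env create -f environment-cpu.yml

  # If the environment already exists, update it in place instead
  conda env update --file environment-cpu.yml --prune

  # If the environment is broken, remove it completely and recreate it
  # conda remove -n tom --all
  ```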
- Activate `tom` environment, `conda activate tom`
- Download the pretrained weights for YOLOv8 from Ultralytics (e.g., `yolov8n.pt`).
  - Copy the downloaded file to the `Processors/Yolov8/weights` directory and rename it as `model.pt` (i.e., `Processors/Yolov8/weights/model.pt`).
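  A sketch of the download-and-copy step, assuming the weights are fetched from the Ultralytics GitHub releases (verify the current download URL on the Ultralytics site before using it):

  ```bash
  # Download the YOLOv8 nano weights (URL is an assumption; check Ultralytics releases)
  wget https://github.com/ultralytics/assets/releases/download/v8.2.0/yolov8n.pt

  # Place the weights where the server expects the model file
  mkdir -p Processors/Yolov8/weights
  cp yolov8n.pt Processors/Yolov8/weights/model.pt
  ```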
- Create the environment files:
  - For development environment: Copy `.sample_env` to `.env.dev` and [optional] update the values if needed.
    - e.g., `CAMERA_VIDEO_SOURCE = 0` uses the default camera and can be changed to any video stream/URL/file source.
    - [Optional] To use the HoloLens camera, uncomment the following lines in `main.py` and update the IP address in `credential/hololens_credential.json`:
      ```python
      # from APIs.hololens import hololens_portal
      # hololens_portal.set_api_credentials()
      # hololens_portal.set_hololens_as_camera()
      ```
  - For testing environment: Copy `.sample_env` to `.env.test` and [optional] update the values if needed.
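  For example, both files can be created from the sample in one step:

  ```bash
  # Create the development and testing environment files from the sample
  cp .sample_env .env.dev
  cp .sample_env .env.test
  ```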
- [Optional] Create the required credential files inside the newly created `credential` folder, ONLY for third-party libraries you use. Please refer to the Third-party libraries section to obtain credentials. (Note: JSON format must be correct.)
  - Create a file `credential/hololens_credential.json` with HoloLens credentials such as `{"ip": "IP", "username": "USERNAME", "password": "PASSWORD"}`
    - Configure the HoloLens Device Portal. Save your credentials to `credential/hololens_credential.json`
  - Create a file `credential/google_cloud_credentials.json` with Google Cloud API credentials.
    - Follow authentication to get the JSON key file and rename it to `google_cloud_credentials.json`
  - Create a file `credential/openai_credential.json` with OpenAI credentials such as `{"openai_api_key": "KEY"}`
  - Create a file `credential/gemini_credential.json` with Gemini credentials such as `{"gemini_api_key": "KEY"}`
  - Create a file `credential/anthropic_credential.json` with Anthropic credentials such as `{"anthropic_api_key": "KEY"}`
  - Create a file `credential/google_maps_credential.json` with Google Maps credentials such as `{"map_api_key": "KEY"}`
    - Current Google Maps APIs used:
    - To use the Google Maps API, you need a Google Cloud Platform (GCP) account to get an API key and enable the APIs shown above.
  - Create a file `credential/ors_credential.json` with Openrouteservice credentials such as `{"map_api_key": "KEY"}`
  - Create a file `credential/geoapify_credential.json` with Geoapify credentials such as `{"map_api_key": "KEY"}`
  - Create a file `credential/fitbit_credential.json` with Fitbit credentials such as `{"client_id": "ID", "client_secret": "SECRET"}`
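  As an example, a credential file can be written from the shell; the key value is a placeholder that you replace with your own:

  ```bash
  # Create the credential folder and an example OpenAI credential file
  mkdir -p credential
  echo '{"openai_api_key": "KEY"}' > credential/openai_credential.json
  ```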
- [Optional] If you want to simulate running assistance on a treadmill, follow the steps in Running Demo Service
- [Optional] To use `APIs/local_yyy` (e.g., `local_vector_db`), please follow the `README.md` inside those local APIs.
  - Note: Certain services (e.g., `memory_assistance_service`) depend on those local APIs.
- Make sure the clients (e.g., HoloLens, Xreal, WearOS Watch) are connected to the same Wi-Fi network as the Server. Use a private network, as public networks may block certain ports used by WebSocket communication (e.g., 8090). Note: Campus networks may not work.
- Use `ipconfig`/`ifconfig` in your terminal to get the Server IP address. Look for the IPv4 address under the Wi-Fi section.
- Set up TOM-Client-Unity on the HoloLens/Xreal and make sure to update the IP address in `Videos/TOM/tom_config.json`.
- [Optional] Set up TOM-Client-WearOS on the Android smartwatch and make sure to update the IP address in `app/src/main/java/com/hci/tom/android/network/Credentials.kt`.
- [Optional] Set up TOM-Client-Web on a computer and make sure to update the IP address in `src/constant/config.ts`.
- [Troubleshooting] If clients cannot connect to the server via WebSocket, try the following steps:
  - Ensure that all clients are on the same network as the server. For devices running Windows OS, such as PCs or HoloLens, set the network connection to private.
  - Check the firewall settings on the server machine and allow the server application to communicate through the firewall.
  - To test if the server is reachable, use another computer on the same network to run `WebSocketClientTester.html`. This test attempts to open port 8090 on the server, confirming whether it is accessible from another device.
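  As an alternative quick check from a terminal, a plain TCP probe of the WebSocket port can confirm reachability (assuming netcat is available; the IP address below is a placeholder for your Server's IPv4 address):

  ```bash
  # Probe port 8090 on the server from another machine on the same network
  nc -zv 192.168.1.100 8090
  ```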
- Use the `tom` environment:
  - Activate it via the command line: `conda activate tom` (for Conda users) or through your IDE.
- Export the environment variable `ENV`:
  - For Windows Command Prompt: `set ENV=dev`
  - For Windows PowerShell: `$env:ENV = "dev"`
  - For Linux/Mac: `export ENV=dev`
- Run the application:
  - Execute `main.py` using `python main.py` (avoid using `py main.py`).
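  Putting the steps above together on Linux/Mac (a minimal sketch; use the Windows variants of the `ENV` step listed above if needed):

  ```bash
  # Activate the environment, select the dev configuration, and start the server
  conda activate tom
  export ENV=dev
  python main.py
  ```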
- [Optional] Configure your IDE (e.g., VSCode, PyCharm) to run the server with the environment variable `ENV=dev`.
- [Optional] Run the clients after the server has started.
- Run `pytest` via `python -m pytest` (or `python -m pytest Tests\...\yy.py` or `python -m pytest Tests\...\yy.py::test_xx` to run specific tests)
- See the implemented services
- Examples
- Download the first person video here (you can download `fpv_short.mp4` or `fpv.mp4`).
  - Copy the video/s to `Tests/RunningFpv/`.
  - Configure which video to use (short/full) in the `.env` file (`FPV_OPTION`).
- Set up the Unity and WearOS clients as mentioned in the Setup the Clients section.
- Ensure that `DemoRunningCoach.yaml` is set in `/Config`, and `RunningCoach.yaml` is in `/Config/Ignore` on the Python server.
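  A minimal sketch of the file placement, assuming `fpv_short.mp4` was downloaded to the current directory and that the repository's `Config` layout matches the paths above:

  ```bash
  # Place the first-person video where the demo expects it
  cp fpv_short.mp4 Tests/RunningFpv/

  # Verify the demo config is active and the default config is parked in Ignore
  ls Config/DemoRunningCoach.yaml
  ls Config/Ignore/RunningCoach.yaml
  ```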
- See DeveloperGuide.md for more details on development guidelines and adding new services/components.
- Structuring Your Project
- Modules
- python-testing
- To find/update dependencies, use `conda env export > environment.yml` (or `pip3 freeze > requirements.txt`) ref
- YOLO-WORLD on WSL - Guide for setting up YOLO-WORLD on a WSL machine.
- cloud-vision-api-python, google-cloud-vision
- Detic
- google-maps, google-maps-services-python
- OpenAI
- Gemini
- Anthropic
- Nominatim OpenStreetMap
- Openrouteservice, OpenStreetMap
- Geoapify
- GeographicLib
- Shapely
- Ultralytics YOLOv8
- python-fitbit
- python-vlc
- See other dependencies in `environment-cpu.yml`