PHINIX (Perceptive Helper with Intelligent Navigation and Intuitive eXperience) is a wearable system that helps people who are blind or visually impaired perceive their surroundings. PHINIX uses AI-enabled cameras to capture the environment in three dimensions, analyzes the scene using Artificial Intelligence (AI), and communicates with the user through audio and/or haptic feedback. To learn more about PHINIX, please visit Ximira's Product page.
PHINIX is developed by Ximira, a not-for-profit organization with a mission to develop an innovative, wearable system that gives blind users more freedom to participate in all aspects of society. To learn more about our story, please visit Ximira's About Us page.
Demo videos: pathway recognition (pathway_rec_compressed.mp4), object detection (obj_det_compressed.mp4), and face recognition (face_rec_blur_compressed.mp4).
Here are some useful resources to get started with PHINIX:
- To set up a local machine to run PHINIX inside Docker, see Setting up PHINIX locally for development.
- To set up the code and run PHINIX using Docker, see our guide on Setting up PHINIX using Docker.
- To learn more about the hardware PHINIX requires, see Hardware Requirements for PHINIX.
- To learn more about our organization and product, we recommend you check out our official website.
All of our documentation lives inside the `/docs` directory in this repository. If anything is missing from our documentation, please let us know by creating an issue.
We invite you to contribute and help improve PHINIX 💚
Here are a few ways you can get involved:
- Reporting Bugs: If you come across any bugs or issues, please create an issue and follow the default issue template.
- Suggestions: Have ideas or improvements that you think would make PHINIX better? We'd love to hear from you! Check out our contribution guide to learn how to contribute to our community and codebase.
- Questions: If you have questions or need to get in touch with our team, you can contact us at [email protected].