AI-powered navigation assistant using TensorFlow Lite for real-time object detection, multimodal feedback, and enhanced environmental awareness on Android devices.
- Real-Time Perception
  - 80+ object classes detected with TensorFlow Lite (EfficientDet-Lite0 model)
  - Adaptive ambient-light adjustment (auto-flash)
  - 16-direction spatial localization
- Smart Feedback System
  - Multi-level voice alerts (Danger/Caution/Info)
  - Configurable vibration patterns
  - Dynamic feedback-frequency adaptation
- Accessibility Optimized
  - Full voice-interaction support
  - Low-power sensor fusion
  - Huawei/Xiaomi device optimizations
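The 16-direction localization can be thought of as bucketing a detection's horizontal center into one of 16 sectors. A minimal sketch, assuming a normalized center x in [0, 1] and sectors spanning the camera's horizontal field of view (function names and the 60° FOV are illustrative, not the project's real API):

```kotlin
// Bucket a detection's normalized horizontal center into one of 16 sectors.
fun sectorFor(normalizedX: Float, sectors: Int = 16): Int {
    require(normalizedX in 0f..1f) { "center x must be normalized to [0, 1]" }
    // Clamp so normalizedX == 1.0 lands in the last sector, not index `sectors`.
    return minOf((normalizedX * sectors).toInt(), sectors - 1)
}

// A spoken label relative to straight ahead, assuming sectors span a nominal
// horizontal field of view (the real app may use clock-style bearings instead).
fun sectorLabel(sector: Int, sectors: Int = 16, fovDegrees: Float = 60f): String {
    val center = (sector + 0.5f) / sectors * fovDegrees - fovDegrees / 2
    return if (center < 0) "%.0f° left".format(-center) else "%.0f° right".format(center)
}
```

An object centered in the frame falls into sector 8 of 16, which reads as roughly straight ahead.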
- Core Frameworks
  - TensorFlow Lite 2.8+
  - Android CameraX
  - AndroidX Lifecycle
- Key Components
  - `CameraManager`: camera control & image pipeline
  - `ObjectDetectorHelper`: TFLite inference engine
  - `FeedbackManager`: multimodal feedback system
  - `OverlayView`: detection visualization
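The components above form a simple frame-to-feedback pipeline. A minimal sketch of that hand-off, using illustrative stand-in types (the real `CameraManager`, `ObjectDetectorHelper`, and `FeedbackManager` APIs may differ):

```kotlin
// Illustrative stand-ins for the project's detection and feedback roles.
data class Detection(val label: String, val score: Float, val normalizedX: Float)

fun interface Detector { fun detect(frameId: Long): List<Detection> }
fun interface Feedback { fun announce(detection: Detection) }

// CameraManager would call onFrame() once per analyzed frame;
// ObjectDetectorHelper plays Detector, FeedbackManager plays Feedback.
class Pipeline(
    private val detector: Detector,
    private val feedback: Feedback,
    private val minScore: Float = 0.5f, // "Balanced" confidence from settings
) {
    fun onFrame(frameId: Long) {
        detector.detect(frameId)
            .filter { it.score >= minScore } // drop low-confidence detections
            .forEach(feedback::announce)     // speech / vibration downstream
    }
}
```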
- Android 7.0+ (API 24+)
- Camera2 API support
- Recommended: light sensor & vibration motor
- Clone the repository:

  ```bash
  git clone https://github.com/yourusername/guide.git
  ```

- Import into Android Studio:
  - Use Android Studio Arctic Fox or newer
  - Gradle 7.0+ & Android Gradle Plugin 7.0+
- Model deployment:
  - Place `efficientdet_lite0.tflite` in `app/src/main/assets/`
- First launch:
  - Grant camera permission
  - Allow TTS engine initialization
- Basic operations:
  - Automatic environment scanning starts
  - Tap settings (bottom-right) for preferences
  - Two-finger swipe down for emergency stop
- Feedback modes:
  - Danger alerts: continuous vibration + priority speech
  - Regular warnings: single vibration + standard speech
  - Environment updates: speech-only notifications
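The three feedback modes can be encoded as one enum that pairs each level with its vibration pattern and speech priority. A sketch, assuming Android's `Vibrator` pattern convention of alternating off/on durations in milliseconds (the enum and field names are illustrative, not the project's real API):

```kotlin
// Hypothetical encoding of the three feedback modes.
enum class AlertLevel(val vibrationPattern: LongArray?, val interruptsSpeech: Boolean) {
    DANGER(longArrayOf(0, 400, 100, 400, 100, 400), interruptsSpeech = true), // continuous buzz + priority speech
    CAUTION(longArrayOf(0, 250), interruptsSpeech = false),                   // single vibration + standard speech
    INFO(null, interruptsSpeech = false),                                     // speech-only notification
}
```

Keeping the pattern nullable makes the speech-only case explicit: `INFO` simply skips the vibrator.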
```xml
<!-- app/src/main/res/values/arrays.xml -->
<resources>
    <string-array name="pref_confidence_entries">
        <item>High Accuracy (0.7)</item>
        <item>Balanced (0.5)</item>
        <item>Sensitive (0.3)</item>
    </string-array>
    <string-array name="pref_feedback_frequency_entries">
        <item>Realtime</item>
        <item>Power Saver</item>
        <item>Emergency Only</item>
    </string-array>
</resources>
```
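On the code side, the selected `pref_confidence_entries` label has to be translated into the score threshold handed to the detector. A minimal sketch (the function name is illustrative, not the project's real API):

```kotlin
// Hypothetical mapping from the preference entry labels to detector thresholds.
fun confidenceThreshold(entry: String): Float = when (entry) {
    "High Accuracy (0.7)" -> 0.7f
    "Sensitive (0.3)" -> 0.3f
    else -> 0.5f // "Balanced (0.5)" and any unrecognized value
}
```

Falling back to the balanced default keeps the detector usable even if the stored preference value is stale or missing.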
- Custom detection model:

  ```kotlin
  objectDetectorHelper = ObjectDetectorHelper(
      context = this,
      modelName = "custom_model.tflite",
      labelPath = "labels.txt"
  )
  ```
- Add a new feedback pattern:

  ```kotlin
  feedbackManager.registerFeedbackProfile(
      profileName = "door_alert",
      vibrationPattern = longArrayOf(0, 200, 100, 300),
      ttsTemplate = "Door detected \${distance}m ahead" // placeholder filled at speak time
  )
  ```
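The `ttsTemplate` carries a `${distance}` placeholder that is resolved when the alert is spoken. A hypothetical helper showing that substitution (the function is illustrative, not the project's API):

```kotlin
// Fill the ${distance} placeholder in a TTS template at announcement time.
fun fillTemplate(template: String, distanceMeters: Double): String =
    template.replace("\${distance}", distanceMeters.toString())
```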
We welcome contributions through:
- Issue reporting:
  - Use the [Issue Template]
  - Include device model & reproduction steps
- Code contributions:
  - Fork the repository and create a feature branch
  - Follow the [Kotlin Style Guide]
  - Submit a PR with a linked issue
- Localization support:
  - Add translations in `feedback_labels.csv`
  - Test multilingual TTS compatibility
Apache License 2.0