A Wearable Intelligent Guide Device

⚠ Please credit the source when reposting. Author: ZobinHuang, last updated: Mar. 31 2021


Creative Commons License

    This work by ZobinHuang is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Please review the permission requirements before using or sharing it. Any infringement will be pursued through legal means to protect the author's legitimate rights and interests. Thank you for your cooperation.

This is a project aimed at helping blind people regain their freedom of travel.

    This introduction will be divided into three parts:
    1. The first part briefly introduces the composition of the device developed by our team and related technologies. 
    2. The second part mainly introduces my work. 
    3. The third part introduces the test results and application prospects of our device.

Part 1: Introduction of the Device

    In the daily life of blind people, indoor and outdoor travel is a major problem; people with severe blindness cannot even move around normally indoors. To address this, our team designed an intelligent guide device. The device is mainly composed of a 3D infrared imaging camera, an SoC (system-on-chip), an ultrasonic matrix radar, a GPS module, and a natural language interaction processor. It uses key technologies such as SLAM (simultaneous localization and mapping) scanning and imaging, convolutional neural networks, natural language interaction, and ultrasonic matrix radar to provide navigation for users.

    When the device is enabled, user positioning is performed first: the GPS module and the inertial navigation unit work together to achieve precise positioning. After positioning is completed, the device recognizes the user's voice instruction and calls the mapping software to plan an optimal route. During the journey, the SoC system analyzes the data transmitted by the lidar and cameras, processes the surrounding environment information, performs real-time 3D modeling of the surroundings, and plans paths through complex environments. In addition, the device gives real-time voice prompts to help the user avoid obstacles. The final design of the device is shown below:

    As the picture shows, parts 1-10 form the head-mounted sensor assembly. 4 and 9 are ultrasonic matrix radars. 5, 7, and 8 are three-dimensional depth-of-field cameras. 1, 2, and 3 are the head-mounted frame. 10 is a data cable that transmits the data collected by the sensors to the SoC processor system. 11, 14, 15, and 16 constitute the main control system of the guide device. Among them, 11 is a relay module that controls the power supply of the system. 15 is a speech synthesis module used for speech synthesis and recognition during natural language interaction. 16 is a ZYNQ system-on-chip used to process the video stream data and the ultrasonic matrix radar data. 14 is a nine-axis gyroscope for indoor positioning. 12 is a headphone cable, and 13 is a microphone.

Part 2: My Work

    My main work was the development of a positioning algorithm based on the Kalman filter, which is briefly introduced below.
    GPS is a radio-based global positioning and navigation system, while inertial navigation is an autonomous navigation method: the gyroscope measures the three-axis angular velocity and the accelerometer measures the three-axis acceleration. The disadvantage of inertial navigation is that its positioning error accumulates over time. GPS navigation has a small positioning error, but it is susceptible to interference from the external environment. Therefore, our device combines the two navigation methods and uses a Kalman filtering algorithm to correct the positioning errors, achieving accurate indoor and outdoor positioning. The implementation is briefly introduced below.

(A) The Estimation Phase

According to the Kalman state estimation equation, we can get:

`\hat{X}_k^- = F_k · \hat{X}_{k-1} + B_k · \vec{u}_k` ...... ①
`\hat{P}_k^- = F_k · \hat{P}_{k-1} · F_k^T + Q` ...... ②

    In formula ①, `\hat{X}_k^-` represents the preliminary estimate of the position in the k-th round, and `\hat{X}_{k-1}` represents the final estimate of the position in the (k-1)-th round. The meaning of the formula is that the current position (`\hat{X}_k^-`) is initially estimated from the precise position of the previous round (`\hat{X}_{k-1}`). `F_k` is the system state transition matrix, whose coefficients are closely related to the measured forward speed; `B_k` is called the control matrix, and `\vec{u}_k` is called the control vector. The product of `B_k` and `\vec{u}_k` represents the contribution of the measured acceleration.
    In formula ②, `\hat{P}_k^-` represents the preliminary estimation variance of the k-th round, `\hat{P}_{k-1}` represents the final estimation variance of the (k-1)-th round, and `Q` represents the variance of the independent noise applied to the system from outside. It is worth noting that the Kalman filter algorithm assumes that both the system noise and the measurement noise are Gaussian white noise.
    That is to say, in our project, according to formula ①, the inertial navigation system initially estimates the position at the current time (`\hat{X}_k^-`) from the measured acceleration data (`B_k · \vec{u}_k`) combined with the estimated position of the last round (`\hat{X}_{k-1}`), and according to formula ② we can initially estimate the variance of that position estimate. The following figure illustrates this process.
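The estimation phase above can be sketched in code. This is a minimal sketch, assuming a one-axis position/velocity state with a fixed sampling interval; the matrices `F`, `B`, and `Q` and the measured values are illustrative, not the device's actual calibration.

```python
import numpy as np

dt = 0.1  # sampling interval in seconds (assumed)

F = np.array([[1.0, dt],
              [0.0, 1.0]])          # state transition: position += velocity * dt
B = np.array([[0.5 * dt**2],
              [dt]])                # maps measured acceleration into the state
Q = np.eye(2) * 1e-3                # process noise covariance (assumed)

def predict(x, P, u):
    """Formulas 1 and 2: preliminary state and covariance estimates."""
    x_pred = F @ x + B @ u          # X_k^- = F_k * X_{k-1} + B_k * u_k
    P_pred = F @ P @ F.T + Q        # P_k^- = F_k * P_{k-1} * F_k^T + Q
    return x_pred, P_pred

# One round: previous estimate at position 0 m with velocity 1 m/s,
# and the accelerometer reports 0.2 m/s^2.
x, P = np.array([[0.0], [1.0]]), np.eye(2)
u = np.array([[0.2]])
x_pred, P_pred = predict(x, P, u)
print(x_pred.ravel())  # predicted position 0.101 m, velocity 1.02 m/s
```

Note that `P_pred` is strictly larger than `P`: while dead-reckoning on inertial data alone, the uncertainty can only grow, which is exactly why the update phase below is needed.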

(B) The Update Phase

    Let `N_0` be the distribution of position estimates obtained in the estimation stage above, `N_1` be the distribution of the GPS positioning measurements (which is also a Gaussian white noise distribution), and `N'` be the final Kalman position estimation distribution; `μ` and `σ^2` denote their means and variances.
    In Kalman filter theory, `N'` is the product of `N_0` and `N_1`, so the mean and variance of the best estimated distribution can be obtained as:

`μ' = μ_0 + (σ_0^2 · (μ_1 - μ_0))/(σ_0^2 + σ_1^2)` ...... ①
`(σ')^2 = σ_0^2 - (σ_0^4)/(σ_0^2 + σ_1^2)` ...... ②

    Taking `K = (σ_0^2)/(σ_0^2 + σ_1^2)` as the Kalman gain and substituting it into the formulas above, we get:
`μ' = μ_0 + K · (μ_1 - μ_0)` ...... ③
`(σ')^2 = σ_0^2 - K · σ_0^2` ...... ④

    Applying this correction to the preliminary estimate gives the final position estimate. My understanding is that the final result is a weighted combination of the inertial navigation data and the GPS measurement data. If the GPS position is more accurate, the value of K becomes larger, and the final estimate is closer to the GPS measurement. Conversely, if the inertial navigation data is more accurate, K becomes smaller, and the final estimate is closer to the inertial navigation estimate. The result is shown below:
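For a scalar position, the update phase reduces to a few lines. The sketch below assumes Gaussian estimates `(μ_0, σ_0^2)` from the inertial prediction and `(μ_1, σ_1^2)` from GPS; the numeric values are illustrative only.

```python
def kalman_update(mu0, var0, mu1, var1):
    """Formulas 3 and 4: fuse the inertial prediction with a GPS measurement."""
    K = var0 / (var0 + var1)        # Kalman gain
    mu = mu0 + K * (mu1 - mu0)      # mu' = mu0 + K * (mu1 - mu0)
    var = var0 - K * var0           # (sigma')^2 = sigma0^2 - K * sigma0^2
    return mu, var, K

# An accurate GPS fix (small var1) pulls the estimate toward the GPS reading:
# inertial prediction 10.0 m (variance 4.0), GPS measurement 12.0 m (variance 1.0)
mu, var, K = kalman_update(10.0, 4.0, 12.0, 1.0)
print(mu, var, K)  # 11.6, 0.8, 0.8 (approximately)
```

Note that the fused variance (0.8) is smaller than either input variance, which is the weighted-combination behavior described above: the correction always reduces uncertainty, and K controls which source the result leans toward.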

(C) The Optimal Estimation

    In research on the Kalman filtering algorithm, it can be proved that introducing such a correction mechanism, combining inertial navigation data with GPS positioning data, largely overcomes both the gyro drift of inertial navigation and GPS signal interference. In summary, this scheme provides the algorithmic basis for the precise navigation and positioning of the device.
    Eventually, our positioning & navigation algorithm flow is shown below:
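As a rough end-to-end sketch of this flow (not the device's actual implementation), the loop below dead-reckons a scalar position from accelerometer samples and applies a Kalman-style correction whenever a GPS fix is available; the noise parameters `q` and `r` are assumed values.

```python
def fuse(accels, gps_fixes, dt=0.1, q=1e-2, r=0.5):
    """Dead-reckon from accelerometer samples; correct with GPS fixes.

    accels:    one acceleration sample per step (m/s^2)
    gps_fixes: one GPS position per step, or None when no fix is available
    q, r:      assumed process and GPS measurement noise variances
    """
    x, P = 0.0, 1.0          # scalar position estimate and its variance
    v = 0.0                  # integrated velocity
    estimates = []
    for a, z in zip(accels, gps_fixes):
        # Estimation phase: integrate the inertial data (formulas 1/2)
        v += a * dt
        x += v * dt
        P += q               # uncertainty grows while dead-reckoning
        # Update phase: fuse the GPS fix when one arrives (formulas 3/4)
        if z is not None:
            K = P / (P + r)  # Kalman gain
            x += K * (z - x)
            P -= K * P
        estimates.append(x)
    return estimates

# With zero acceleration, the estimate stays at 0 m until a GPS fix at 5 m
# arrives and pulls it most of the way toward the fix.
path = fuse([0.0, 0.0, 0.0], [None, None, 5.0])
```

This mirrors the flow described above: inertial navigation carries the estimate between GPS fixes (indoors, or under signal interference), and each fix resets the accumulated drift.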

    In addition, I also wrote the driver code for the speech recognition module, the image processing module, and the ultrasonic matrix module. Finally, we integrated all the modules and debugged the device.

Part 3: Test Results of the Device

    In the spring of 2019, we visited the Chengdu Special Education School several times and brought the device to blind children there. The actual test results were satisfactory: the children were able to use the device to walk freely around the school by following its instructions. The teachers at the school thanked us for the intention behind this work and for its results, and we are also very happy that the children regained their freedom to travel.

    Our device is continually being improved. We obtained a national patent certificate for it in July 2019, and it has been designated for testing in several schools.