Smartphone based Approximate Localization using Highlighted Texts from Images
Pervasive and Mobile Computing
In many application scenarios, an approximate location suffices in place of the high-accuracy positioning provided by GPS or other network-infrastructure-enabled localization. This allows the design of localization systems that consume fewer resources and return results faster. In this work, we design and implement a lightweight localization system, called WhereAmI, that performs coarse localization with low resource requirements. The key intuition behind this work is that a collection of nearby textual signs in an image of a user's surroundings forms a bag-of-words that provides a unique signature for her location. Because Optical Character Recognition (OCR) engines perform poorly in outdoor settings, we develop a keyword-based positioning algorithm that works even with partial errors in the detected texts representing business names. These partial errors in recognized business names are handled using an n-gram-based text correction approach. We use a cloud-based web service to intelligently offload parts of the application workload, saving resources such as energy and network cost. The Android-based prototype of WhereAmI is tested in uncontrolled environments. The experimental results show that WhereAmI achieves 95% accuracy while consuming 20% less power than GPS. The proposed keyword-based positioning algorithm takes about 59 ms on average to return a location.
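The abstract does not specify the exact correction algorithm; as a minimal sketch of the general idea of n-gram-based text correction, a noisy OCR string can be matched against a directory of known business names by comparing sets of character n-grams. The function names (`char_ngrams`, `correct_name`), the Dice-coefficient scoring, and the threshold value below are illustrative assumptions, not the paper's implementation.

```python
def char_ngrams(text, n=3):
    """Return the set of character n-grams of a lowercased string."""
    text = text.lower()
    return {text[i:i + n] for i in range(len(text) - n + 1)}

def ngram_similarity(a, b, n=3):
    """Dice coefficient over character n-grams; 1.0 means identical gram sets."""
    ga, gb = char_ngrams(a, n), char_ngrams(b, n)
    if not ga or not gb:
        return 0.0
    return 2 * len(ga & gb) / (len(ga) + len(gb))

def correct_name(ocr_text, directory, n=3, threshold=0.4):
    """Map a noisy OCR string to the most similar known business name.

    Returns None when no candidate clears the threshold (illustrative value).
    """
    best, best_score = None, threshold
    for name in directory:
        score = ngram_similarity(ocr_text, name, n)
        if score > best_score:
            best, best_score = name, score
    return best

# Example: OCR misreads a few characters of a storefront sign.
directory = ["Starbucks Coffee", "Subway", "Walgreens"]
print(correct_name("Starbcks Cofee", directory))  # -> Starbucks Coffee
```

Because n-gram overlap degrades gracefully as characters are dropped or substituted, this style of matching tolerates the partial OCR errors the abstract describes, at the cost of a linear scan over the candidate directory.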
Im, Taeyu, Darius Coelho, Klaus Mueller, and Pradipta De.
"Smartphone based Approximate Localization using Highlighted Texts from Images."
Pervasive and Mobile Computing, 46: 1-17.
doi: https://doi.org/10.1016/j.pmcj.2018.02.004