ISSN No:-2456-2165
Tagging Module:

3. Image Processing:
In this stage, the whole building is tagged in terms of various image checkpoints. Every image is mapped to particular coordinates. All this image data is processed and stored in a database.

4. Adding a place to the map in the coordinate system:
In addition to the image processing system, if we want to add any custom checkpoint, we can add it manually; it will be labelled using user input and stored in the database the same way that other checkpoints are stored.

5. Map Generation:
All the details related to the checkpoints, such as images, coordinates, and relative position to the reference point, are stored in one database and act as the map.

User Module:
All other phases of the user module are common with the tagging module.

V. CONCLUSION

In this paper, we have proposed a complete framework specially designed for an Augmented Reality based indoor navigation system. After evaluating various approaches, we concluded that a QR-based approach combined with a vision-based approach is preferable due to its higher accuracy, efficiency, and moderate cost compared to other approaches. The system will help users tag any indoor location and get directions to various indoor places by using the sensors in their smartphone devices and Augmented Reality projections. In the future, the scope can be extended to a QR-less mechanism, which will also work for outdoor places.

REFERENCES
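The checkpoint storage described in the tagging module (steps 3 to 5) can be sketched as a simple data model: each checkpoint pairs an image with coordinates, and the collection of stored checkpoints acts as the map. The class and field names below (`Checkpoint`, `IndoorMap`, `add_custom_checkpoint`) are illustrative assumptions, not part of the proposed system's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Checkpoint:
    """One tagged location: an image mapped to particular coordinates."""
    name: str
    image_path: str   # photo captured at this checkpoint ("" if added manually)
    x: float          # position relative to the reference point
    y: float
    floor: int = 0

class IndoorMap:
    """All checkpoint records stored together act as the map (step 5)."""
    def __init__(self):
        self._checkpoints = {}

    def add_checkpoint(self, cp: Checkpoint):
        # Step 3: store a processed image checkpoint in the database.
        self._checkpoints[cp.name] = cp

    def add_custom_checkpoint(self, name: str, x: float, y: float, floor: int = 0):
        # Step 4: a manually labelled checkpoint is stored the same way
        # as image-derived checkpoints, just without an associated photo.
        self._checkpoints[name] = Checkpoint(name, image_path="", x=x, y=y, floor=floor)

    def get(self, name: str):
        return self._checkpoints.get(name)

# Usage: tag one image checkpoint and one custom checkpoint.
m = IndoorMap()
m.add_checkpoint(Checkpoint("entrance", "imgs/entrance.jpg", 0.0, 0.0))
m.add_custom_checkpoint("cafeteria", 12.5, 4.0, floor=1)
print(m.get("cafeteria").x)  # 12.5
```

In a deployed system the in-memory dictionary would be replaced by a persistent database, but the record shape stays the same, which is why manually added checkpoints need no special handling.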
[1]. A. Satan, "Bluetooth-based indoor navigation mobile system," 2018 19th International Carpathian Control Conference (ICCC), Szilvasvarad, 2018, pp. 332-337, doi: 10.1109/CarpathianCC.2018.8399651.
[2]. D. Mamaeva, M. Afanasev, V. Bakshaev and M. Kliachin, "A multi-functional method of QR code used during the process of indoor navigation," 2019 International Conference on Engineering and Telecommunication (EnT), Dolgoprudny, Russia, 2019, pp. 1-4, doi: 10.1109/EnT47717.2019.9030587.
[3]. G. Gerstweiler, E. Vonach and H. Kaufmann, "HyMoTrack: A Mobile AR Navigation System for Complex Indoor Environments," in Sensors, vol. 16, no. 1, p. 17, December 2015.
[4]. J. Dong, M. Noreikis, Y. Xiao and A. Ylä-Jääski, "ViNav: A Vision-Based Indoor Navigation System for Smartphones," in IEEE Transactions on Mobile Computing, vol. 18, no. 6, pp. 1461-1475, 1 June 2019, doi: 10.1109/TMC.2018.2857772.
Fig 3:- User Module