Sizhuo Ma, Jian Wang, Wenzheng Chen, Suman Banerjee, Mohit Gupta, Shree Nayar
Proceedings of the 29th Annual International Conference on Mobile Computing and Networking
This research paper presents QfaR, a system that enables mobile devices to scan visual codes (such as QR codes) from significantly longer distances than traditional scanning allows.
Key Innovations in Edge Computing Context:
- Location-Guided Scanning: QfaR leverages a crowdsourced database of the physical locations of visual codes. Using the device's location, the system narrows the candidate set to the codes near the camera, which makes recognition tractable even when the captured code is too small or degraded to decode directly. This reduces the computational burden on the device, enabling faster and more efficient code scanning.
- Edge-Based Processing: The core of QfaR’s functionality, including location-based filtering and code recognition, can be implemented on the edge device itself. This minimizes the need for constant communication with a central server, reducing latency and improving privacy.
- Enhanced Applications: QfaR unlocks a wide range of new applications in edge computing environments:
  - Smart Cities: Scanning visual codes on distant objects for real-time information, such as traffic updates, environmental data, or points of interest.
  - Industrial Automation: Monitoring and controlling equipment from a distance using visual code-based identifiers.
  - Augmented Reality: Enhancing AR experiences by overlaying digital information onto real-world objects identified through long-distance visual code scanning.
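The location-guided pruning idea above can be sketched in a few lines. This is a minimal illustration, not QfaR's actual pipeline: the database entries, function names, and the bit-string Hamming-distance matcher are all assumptions made for clarity (the real system matches against distorted camera imagery, and the database schema is crowdsourced at scale).

```python
import math

# Hypothetical crowdsourced database: code ID -> (latitude, longitude, payload bits).
# Entries and schema are illustrative only.
CODE_DB = {
    "door-17":  (43.0731, -89.4012, "1011001110100101"),
    "kiosk-02": (43.0735, -89.4018, "1110010010110001"),
    "sign-88":  (40.7128, -74.0060, "0101101001011100"),  # far away (NYC)
}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def shortlist(device_lat, device_lon, radius_m=100.0):
    """Location-guided pruning: keep only codes physically near the device."""
    return {
        cid: bits
        for cid, (lat, lon, bits) in CODE_DB.items()
        if haversine_m(device_lat, device_lon, lat, lon) <= radius_m
    }

def best_match(noisy_bits, candidates):
    """Match a noisy long-distance capture against the shortlist
    by minimum Hamming distance."""
    return min(candidates, key=lambda cid: sum(
        a != b for a, b in zip(noisy_bits, candidates[cid])))

# A device in Madison, WI captures a degraded code (2 bits flipped vs "door-17").
nearby = shortlist(43.0732, -89.4013)
print(best_match("1011001110100110", nearby))  # → door-17
```

Because the distant "sign-88" entry is pruned before matching, the noisy capture only has to be disambiguated among two nearby candidates, which is what lets an otherwise undecodable code be identified reliably.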
In essence, QfaR demonstrates how edge computing principles, combined with innovative computer vision techniques, can revolutionize the use of visual codes in a variety of applications. By enabling long-distance scanning, QfaR expands the reach and functionality of visual codes in edge environments, opening up new possibilities for information access, automation, and enhanced user experiences.
Read the paper here.
Contact us with questions.