If AR becomes more popular in the future, people will be able to identify objects on the fly. For example, Google and a few other companies are working on AI object recognition that identifies everyday things like cars and trees. Apps like SkyView, on the other hand, augment the world based on position and orientation inputs rather than visual ones.
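The position-and-orientation approach can be sketched roughly like this: given the direction the device is pointing (azimuth and altitude from the compass and accelerometer), find the catalog object with the smallest angular separation. This is only a minimal illustration, not SkyView's actual implementation; the mini-catalog and its azimuth/altitude values are hypothetical (a real app would derive them from RA/Dec plus GPS and the clock).

```python
import math

def to_unit_vector(az_deg, alt_deg):
    """Convert azimuth/altitude (degrees) to a 3D unit vector."""
    az, alt = math.radians(az_deg), math.radians(alt_deg)
    return (math.cos(alt) * math.sin(az),
            math.cos(alt) * math.cos(az),
            math.sin(alt))

def angular_separation(a, b):
    """Angle in degrees between two unit vectors."""
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    return math.degrees(math.acos(dot))

def nearest_object(az_deg, alt_deg, catalog):
    """Return the catalog entry closest to where the device points."""
    pointing = to_unit_vector(az_deg, alt_deg)
    return min(catalog,
               key=lambda o: angular_separation(
                   pointing, to_unit_vector(o["az"], o["alt"])))

# Hypothetical mini-catalog; values are for illustration only.
catalog = [
    {"name": "Vega",     "az": 80.0,  "alt": 60.0},
    {"name": "Arcturus", "az": 220.0, "alt": 45.0},
    {"name": "Polaris",  "az": 0.0,   "alt": 40.0},
]

print(nearest_object(78.0, 58.0, catalog)["name"])  # → Vega
```

Because this uses only orientation sensors, it works in the dark, indoors, or even pointed through the ground at objects below the horizon, which is exactly what camera-based recognition cannot do.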
After a short time using the SkyView app on an Android phone, one variant I'd like to see is an app that can view the bottom of the ocean, although the seafloor is one of the least explored areas, so a lot of data is missing. Integration with geocaching would also be potentially great if the app could track objects in 3D space rather than just on the inside surface of a sphere.
One feature I'd like to change about this app is the ability to turn off the visual clutter and have the app directly point at each element, like how the Terminator visually marks its targets. This means the app would need to detect faint lights, so it wouldn't work in areas with heavy light pollution. Perhaps the app should also be able to identify the amount of light pollution based on location data!
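A location-based light pollution estimate could be as simple as a nearest-neighbour lookup in a sky-brightness table. This sketch uses a tiny, hypothetical sample list (a real app could draw on published sky-brightness atlases) and reports a Bortle class, the common 1-to-9 scale of night-sky darkness:

```python
import math

# Hypothetical sample points: (latitude, longitude, Bortle class),
# where 1 = pristine dark sky and 9 = inner-city sky.
SAMPLES = [
    (35.68, 139.69, 9),   # Tokyo
    (36.25, 137.87, 3),   # rural Nagano
    (34.69, 135.50, 8),   # Osaka
]

def estimate_bortle(lat, lon):
    """Return the Bortle class of the nearest sample point.
    A flat Euclidean distance is good enough for this rough sketch."""
    def dist(sample):
        s_lat, s_lon, _ = sample
        return math.hypot(lat - s_lat, lon - s_lon)
    return min(SAMPLES, key=dist)[2]

print(estimate_bortle(35.7, 139.7))  # → 9 (near Tokyo)
```

With an estimate like this, the app could warn the user up front that the Terminator-style faint-light mode won't work at their current location.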
By learning about light pollution, people may become more aware of a problem that can be fixed by using IDA (International Dark-Sky Association) approved bulbs on most street lamps rather than old sodium or current LED bulbs. Pollution comes in more forms than simply carbon and waste! I know Japan has a park that works to prevent it.