

How does the construction app know what needs to be constructed and how?
How does the waiter app know which table ordered what, needs attention, etc?
How does the IT app know which port every device is connected to?
These things are all really hard to know. Having glasses that display that knowledge could be really nice, but for all these magic future apps, having a display is only part of the need.
I figure they have AI chatbots making the decisions, and they can't tell the difference between talking about violence and advocating for violence.