Meta-sensing (also known as metaveillance) is the sensing of sensing (veillance of veillance).
"Meta" is a Greek word that means "beyond". For example, a meta-conversation is a conversation about conversations. A meta-argument is an argument about arguments. Metadata is data about data. An example of meta-data is the GPS coordinates where a photograph was taken. This meta-data is appended into the header information of the picture.
An example of meta-sensing is metavision.
Metavision is the vision of vision, i.e. seeing sight, and visualizing vision.
This year (2020), we have secured $200,000 in funding (from Ford Motor Company of Canada) to hire students to work on meta-sensing for autonomous vehicles. We have positions in Toronto, Silicon Valley, Shenzhen, and Xiamen for people skilled in the art of meta-sensing.
Metavision makes it possible to photograph the sensory capacity of a self-driving car, and to provide photographic evidence (to a courtroom, judge, jury, etc.) that the car's sensors were in good working order when, for example, it left the assembly plant.
Metavision is the name of the AR (Augmented Reality) company that my PhD student Raymond Lo, I, and others founded in Silicon Valley, California (we raised US$75,000,000) to manufacture computerized eyeglasses.
Raymond Lo, the "brains" behind the company, is now at Harvard University, and is visiting Toronto for the next two months.
The kinds of things all of us really like are things that are extremely simple yet profoundly deep conceptually. Linking the previous labs, we connect the 1-pixel camera to the 1-pixel display, giving rise to a simple example of meta-sensory 3D AR (Augmented Reality): an overlay of virtual information in exact alignment with some form of physical reality. In a sense you now have a true and accurate scientific "outstrument™", i.e. a scientific instrument that has a "readout" that reads "out" into the real world.
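To make the idea concrete, here is a minimal simulation sketch (in Python with NumPy and Matplotlib; my own illustrative assumption rather than the official lab code) of how a 1-pixel camera wired to a 1-pixel display can reveal the camera's own sensitivity pattern. A probe light is swept across a grid of positions, the co-located display glows in proportion to what the sensor reports at each position, and accumulating those brightnesses (as a long-exposure photograph would) produces a picture of the sensing itself.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical sensitivity pattern of a 1-pixel camera: a Gaussian "field of view"
# centred slightly off-axis. In the real lab this would be the physical sensor.
def sensor_response(x, y):
    return np.exp(-((x - 0.2) ** 2 + (y + 0.1) ** 2) / (2 * 0.15 ** 2))

# Sweep a probe light over a grid of positions in front of the sensor.
xs = np.linspace(-1, 1, 400)
ys = np.linspace(-1, 1, 400)
X, Y = np.meshgrid(xs, ys)

# The 1-pixel display (e.g. an LED moved along with the probe light) glows in
# proportion to the sensor's output at each probe position. A long-exposure
# photograph of that glowing LED traces out the sensor's sensitivity,
# i.e. a picture of the sensing itself (metavision).
long_exposure = sensor_response(X, Y)

plt.imshow(long_exposure, extent=[-1, 1, -1, 1], origin="lower", cmap="inferno")
plt.title("Simulated metavision: sensitivity of a 1-pixel camera")
plt.xlabel("probe x position")
plt.ylabel("probe y position")
plt.colorbar(label="accumulated LED brightness")
plt.show()
```

The same sweep-and-accumulate idea, carried out with real hardware instead of a simulation, is what lets metavision photograph the sensory capacity of larger systems such as a self-driving car's sensors.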
See also Augmented Reality robotics