Behind all this scientific bureaucratese (via MERX):
“The objective is the design and development of a decision support system through the fusion of biometrics signals for behavioural diagnostic applications. The design and development of the system will be completed over 3 phases. The system will be based on the automatic processing and the classification of data stemming mainly from Electroencephalogram (EEG), Electrocardiogram (ECG) recordings and tracking of 3D stereoscopic facial characteristics. This processing should provide the integrated system capability, including vital signs, stereoscopic cameras and EEG, to quantify and provide automated diagnosis of the stressors that may affect the operational readiness of Canadian Forces (CF) operators and identify behavioural patterns with the aim of detecting hostile intend.” (I’m guessing they mean “intent” here)
and this tidbit from the Statement of Work (downloadable here):
“Over the past 10 years, (Defence Research and Development Canada) Toronto has developed a number of proprietary medical diagnostic technologies that include both 3D volumetric imaging and vital signs monitoring capabilities. Most recently, however, it has been identified that the above technologies have the potential to address:
• medical diagnostic applications,
• assessment of the type of stressors that may influence the operational readiness of CF operators, and
• identification of behavioural patterns with the aim of detecting hostile intend.”
is an interesting research question:
How can we “read” faces (electronically), correlate facial expressions with different states of mind, and figure out how to “read” either hostile intent or how a CF member is doing (fatigue, anxiety, etc.)?
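To picture what “fusing” biometric signals into a single automated judgment might look like, here is a deliberately toy sketch: it combines three normalized indicators (an ECG-derived heart rate, an EEG-derived band-power ratio, and a facial-tension score from 3D tracking) into one weighted stress index. Every name, weight, range, and threshold below is an illustrative assumption of mine, not anything from the actual DRDC system.

```python
# Toy decision-level fusion of biometric stress indicators.
# All ranges, weights, and thresholds are illustrative assumptions,
# not the actual DRDC method.

def normalize(value, low, high):
    """Clamp a raw reading into [0, 1] relative to an expected range."""
    return max(0.0, min(1.0, (value - low) / (high - low)))

def stress_index(heart_rate_bpm, eeg_beta_alpha_ratio, facial_tension):
    """Fuse three normalized indicators into one weighted score in [0, 1]."""
    hr = normalize(heart_rate_bpm, 60, 180)         # ECG-derived
    eeg = normalize(eeg_beta_alpha_ratio, 0.5, 3.0)  # EEG-derived
    face = normalize(facial_tension, 0.0, 1.0)       # 3D-face-derived
    return 0.4 * hr + 0.4 * eeg + 0.2 * face

def assess(index, threshold=0.7):
    """Turn the fused score into a binary flag (the hypothetical 'diagnosis')."""
    return "flag for review" if index >= threshold else "no flag"

print(assess(stress_index(170, 2.8, 0.9)))  # elevated readings across the board
print(assess(stress_index(65, 0.6, 0.1)))   # resting readings
```

Even this cartoon version makes the ethical question concrete: the weights and the threshold are where the machine’s opinion of Cpl. Bloggins actually gets decided, and someone has to choose them.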
I look forward to interesting things coming from this – not to mention discussion of the ethics of, say, a machine saying Cpl. Bloggins or Capt. Smith is too tired/anxious/whatever to work.
Update (1): Who won the contract, and for how much? (PDF)
Update (2): An American variation on the research theme.