EDUCATION

PhD in Image and Signal Processing and Cognitive Science.
Master in Artificial Intelligence and Algorithmics (Specialization: Image Processing).
Engineering degree in Computer Science (Specialization: Computer Systems).

NEWS

2019

We are organizing the "International Workshop on Automated Assessment of Pain" in conjunction with IEEE FG 2020, May 18th - May 22nd, Buenos Aires, Argentina, http://aap-2020.net/
I am the host of a Methods Event on “Computational Approaches to Pain Detection” at the 2020 Society of Affective Science annual conference.
Demo Chair for the ACM International Conference on Multimodal Interaction (ICMI 2020), Amsterdam, Netherlands.
Our Paper titled “Gram Matrices Formulation of Body Shape Motion: An Application for Depression Severity Assessment” was accepted for Machine Learning for the Diagnosis and Treatment of Affective Disorders @IEEE ACII 2019, Cambridge, UK.
Associate Editor for Frontiers in Computer Science.
Our Paper titled “Categorical timeline allocation for diagnostic head movement tracking feature analysis” was accepted for FGAHI@CVPR, 2019.
Senior Program Committee for the 8th International Conference on Affective Computing & Intelligent Interaction ACII 2019.
Grace Hopper Celebration GHC 2019 AI Program Committee Member!
We are organizing the “2nd International Workshop on Face and Gesture Analysis for Health Informatics” in conjunction with CVPR 2019, June 16th - June 21st, Long Beach, CA, http://cvpr2019.thecvf.com/. Call for papers soon.
Our Paper titled “Automated measurement of head movement synchrony during dyadic depression severity interviews” was accepted for IEEE International Conference on Automatic Face and Gesture Recognition, Lille, France, 2019.

2018

We are organizing the “6th International Workshop on Context Based Affect Recognition (CBAR 2019)” in conjunction with FG 2019 in May 2019 in Lille, France (http://fg2019.org/). Call for papers soon.
Our Paper titled “Dynamics of Face and Head Movement in Infants with and without Craniofacial Microsomia: An Automatic Approach“ was accepted for Plastic and Reconstructive Surgery-Global Open Journal, 2018.
Social Media Chair for ICMI 2019 in Suzhou, China (Call coming soon!)
Group effort “Abstract Animations for the Communication and Assessment of Pain in Adults: Cross-Sectional Feasibility Study” was accepted for Journal of Medical Internet Research.
Our Paper titled “Detecting Depression Severity by Interpretable Representations of Motion Dynamics” was accepted for FGAHI at FG 2018.
Our Paper titled “Objective Measurement of Head Movement Differences in Children With and Without Autism Spectrum Disorder” was accepted for Molecular Autism Journal.
Area Chair for IEEE Automatic Face and Gesture Recognition 2019 (FG 2019).
Our Paper titled “Facial Expressiveness in Infants With and Without Craniofacial Microsomia: Preliminary Findings” was accepted for Cleft Palate-Craniofacial Journal.
We are organizing the 1st International Workshop on Face and Gesture Analysis for Health Informatics to be held in conjunction with IEEE FG 2018 on May 15-19, 2018, Xi’an, China.

2016-2017

Our Paper titled “Automatic AU detection in infants using convolutional neural network” was accepted for ACII 2017.
I am the Co-PI of an NIH Research Grant Award “Craniofacial Microsomia: Facial Expression from Ages 1 to 3 Years”.
I am the recipient of an NIH Research Grant Award “Multimodal Assessment of Occurrence and Intensity of Pain for Research and Clinical Use”.
We are organizing the 5th International Workshop on “Context Based Affect Recognition (CBAR 2017)” to be held in conjunction with ACII 2017 in October 2017 in San Antonio, Texas.
Depression Severity Interviews released in June 2017 (http://www.pitt.edu/~emotion/depression.htm).
Our Paper titled “Dynamic Multimodal Measurement of Depression Severity Using Deep Autoencoding” was accepted for IEEE Journal of Biomedical and Health Informatics, 2016.
Our Book Chapter “Automatic, Objective, and Efficient Measurement of Pain Using Automated Face Analysis” is in press in Ken Prkachin, Zina Trost, and Kai Karos (Eds.), Handbook of Social and Interpersonal Processes in Pain: We Don’t Suffer Alone, Springer (2016-2017).
Area Chair for IEEE Automatic Face and Gesture Recognition 2017 (FG 2017).
We are organizing the 4th International Workshop on "Context Based Affect Recognition CBAR2016" @ CVPR2016.

2015

Our Paper titled “Automatic Measurement of Head and Facial Movement for Analysis and Detection of Infants’ Positive and Negative Affect” was accepted for Frontiers in ICT - Human-Media Interaction, 2015.
Our Paper titled “Multimodal Detection of Depression in The Context of Clinical Interview” was accepted for ICMI 2015.
Our Paper titled “What Can Head and Facial Movements Convey about Positive and Negative Affect?” was accepted for IEEE ACII 2015 and won the Best Paper Award.
Our Team won the 30, 300, 3000 Pain Research Challenge, funded by the Virginia Kaufman Endowment Fund.
Outstanding Reviewer Award at IEEE FG 2015.
We are organizing the 1st International Workshop on “Interpersonal Synchrony and Influence: INTERPERSONAL” at the 17th ACM international conference on multimodal interaction ICMI, 9-13 November 2015, Seattle, Washington, USA.
Our Paper titled “Head Movement Dynamics During Normal and Perturbed Mother-Infant Interaction” was accepted for IEEE Transactions on Affective Computing, 2015.
Our Collective Paper titled “Open Challenges in Modeling, Analysis and Synthesis of Human Behaviour in Human-Human and Human-Machine Interactions” was accepted for Cognitive Computation, Springer, 2015.
Merlin Teodosia Suarez and I are organizing the 3rd International Workshop on “Context Based Affect Recognition (CBAR 2015)” at the 11th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2015).

2014

Our Paper titled “Log-Normal and Log-Gabor Descriptors for Expressive Events Detection and Facial Features Segmentation” was accepted for Information Sciences, Elsevier, 2014.
Our Paper titled “Interpersonal Coordination of Head Motion in Distressed Couples” was accepted for IEEE Transactions on Affective Computing, 2014.
Two Papers titled “Towards Multimodal Pain Assessment for Research and Clinical Use” and “Intra- and Interpersonal Functions of Head Motion in Emotion Communication” were accepted for RFMI/ICMI 2014.
Our Paper titled “Dyadic Behavior Analysis in Depression Severity Assessment Interviews” was accepted for ICMI 2014.
Co-Publication Chair of ACM ICMI 2014, the 16th ACM International Conference on Multimodal Interaction, 12-16 November 2014, Istanbul, Turkey.

2012-2013

Merlin Teodosia Suarez and I are organizing the 2nd International Workshop on "Context Based Affect Recognition CBAR2013” at the 2013 AAAI/IEEE Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII 2013).
Our Paper titled “Automatic Detection of Pain Intensity” won the prize of Outstanding Paper at ACM ICMI 2012.

PROFESSIONAL ACTIVITIES

Editorial Board

Workshop/Conference Organization

Program Committee Member

Society/Network Membership

Member of IEEE. Member of ACM.

PUBLICATIONS

EDITORIALS

Hammal, Z., & Suarez, M. T. Towards Context Based Affective Computing: Introduction to the Third International CBAR 2015 Workshop.
Proceedings of the 11th IEEE International Conference on Automatic Face and Gesture Recognition Conference and Workshops (IEEE FG’15), Ljubljana, Slovenia, May 2015.
Hammal, Z., & Suarez, M. T. Towards Context Based Affective Computing. Proceedings of the 5th Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII’13), Geneva, Switzerland, September 2013.

BOOK CHAPTERS

Hammal, Z., Cohn, J. F. (2018).
Automatic, Objective, and Efficient Measurement of Pain Using Automated Face Analysis.
In Ken Prkachin, Zina Trost & Kai Karos (Eds.), Handbook of Social and Interpersonal Processes in Pain: We Don’t Suffer Alone, pp. 121-146, Springer.
Cohn, J. F., Onal, I., Chu, W. S., Girard, J. M., Jeni, L. A., Hammal, Z. (2018).
Affective facial computing: Generalizability across domains.
In X. Alameda-Pineda, E. Ricci & N. Sebe (Eds.), Multimodal behavioral analysis in the wild: Advances and challenges, pp. 407-441. New York, NY: Elsevier.
Girard, J. M., Cohn, J. F., Mahoor, M. H., Mavadati, S. M., Hammal, Z., Rosenwald, D.P. (2016).
Non-verbal social withdrawal in depression: Evidence from manual and automatic analyses.
In P. Ekman & E. Rosenberg (Eds.), What the face reveals: Basic and applied studies of spontaneous expression using the facial action coding system (FACS) (3rd ed.). New York, NY: Oxford University Press.
Hammal, Z., Massot C. (2011).
Gabor-like image filtering for transient feature detection and global energy estimation applied to multi-expression classification.
In Book Communications in Computer and Information Science (CCIS 229), pp. 135-153, ISBN 978-3-642-25382-9.
Hammal, Z. (2010).
From face to facial expression.
In Book Advances in Face Image Analysis: Techniques and Technologies, pp. 217-238, Yu-Jin Zhang, eds. IGI Global.

JOURNAL PAPERS

Hammal, Z., Wallace, E., Speltz, M. L., Heike, C. L., Birgfeld, C. B., Cohn, J. F. (2019)
Dynamics of Face and Head Movement in Infants with and without Craniofacial Microsomia: An Automatic Approach.
Plastic and Reconstructive Surgery Global Open Journal, vol. 7, no. 1, e2081.
https://journals.lww.com/prsgo/Fulltext/2019/01000/Dynamics_of_Face_and_Head_Movement_in_Infants_with.9.aspx
Jonassaint CR, Rao N, Sciuto A, Switzer GE, De Castro L, Kato GJ, Jonassaint JC, Hammal Z., Shah N, Wasan A. (2018)
Abstract Animations for the Communication and Assessment of Pain in Adults: Cross-Sectional Feasibility Study.
Journal of Medical Internet Research, vol. 20, no. 8.
https://www.jmir.org/2018/8/e10056/
Martin, K. B., Hammal, Z., Ren, G., Cohn, J. F., Cassell, J., Ogihara, M., . . . Messinger, D. S. (2018)
Objective measurement of head movement differences in children with and without autism spectrum disorder
Molecular Autism, vol. 9, no. 1, pp. 14.
https://molecularautism.biomedcentral.com/articles/10.1186/s13229-018-0198-4
Hammal, Z., Cohn, J-F., Wallace, E-R., Heike, C-L., Birgfeld, C-B., Oster, H., & Speltz, M-L. (2018)
Facial Expressiveness in Infants With and Without Craniofacial Microsomia: Preliminary Findings.
Cleft Palate-Craniofacial Journal, vol. 55, no. 5, pp. 711-720.
https://journals.sagepub.com/doi/10.1177/1055665617753481
Dibeklioglu*, H., Hammal, Z.*, Cohn, J-F. (2018)
Dynamic Multimodal Measurement of Depression Severity Using Deep Autoencoding.
[*Equal contribution], IEEE Journal of Biomedical and Health Informatics, vol. 22, no. 2, pp. 525-536
https://ieeexplore.ieee.org/document/7869262
Hammal, Z., Cohn, J-F, Heike, C., and Speltz, M-L. (2015) Automatic Measurement of Head and Facial Movement for Analysis and Detection of Infants’ Positive and Negative Affect.
In Frontiers in ICT - Human-Media Interaction, vol. 2, no. 21, pp. 397-413.
https://www.frontiersin.org/articles/10.3389/fict.2015.00021/full
Hammal, Z., Cohn, J.F., & Messinger, D. (2015)
Head Movement Dynamics During Normal and Perturbed Mother-Infant Interaction.
IEEE Transactions on Affective Computing, vol. 6, no. 4, pp. 361-370.
https://ieeexplore.ieee.org/document/7084624
A. Vinciarelli, A. Esposito, E. Andre, F. Bonin, M. Chetouani, J-F, Cohn, M. Cristani, F. Fuhrmann, E. Gilmartin, Z. Hammal, D. Heylen, R. Kaiser, M. Koutsombogera, A. Potamianos, S. Renals, G. Riccardi, A. Ali Salah. (2015)
Open Challenges in Modeling, Analysis and Synthesis of Human Behaviour in Human-Human and Human-Machine Interactions. Cognitive Computation, vol. 7, no. 4, pp. 397-413.
http://link.springer.com/article/10.1007/s12559-015-9326-z
Z. Hammal. (2014) "Log-Normal and Log-Gabor Descriptors for Expressive Events Detection and Facial Features Segmentation".
Information Sciences, Elsevier, vol. 288, pp. 462–480.
http://dx.doi.org/10.1016/j.ins.2014.07.002
Hammal, Z., Cohn, J.F., & George, D.T. (2014) Interpersonal Coordination of Head Motion in Distressed Couples.
IEEE Transactions on Affective Computing, vol. 5, no. 2, pp. 155-167.
https://ieeexplore.ieee.org/document/6823675
Girard, J-M, Cohn, J-F, Mahoor, M-H., Mavadati, S-M., Hammal, Z, & Rosenwald, D. (2014)
Nonverbal Social Withdrawal in Depression: Evidence from manual and automatic analysis.
Special Issue on Best papers of the Automatic Face and Gesture Recognition 2013.
International Journal Image and Vision Computing, vol. 32, no. 10, pp. 641-647.
https://www.ncbi.nlm.nih.gov/pubmed/25378765
Z. Hammal and M. Kunz. (2012)
Pain Monitoring: A Dynamic and Context-sensitive System.
Pattern Recognition. vol. 45, no. 4, pp. 1265-1280.
https://www.sciencedirect.com/science/article/pii/S0031320311003931
Hammal Z., M. Arguin, and F. Gosselin. (2009)
Comparing a Novel Model Based on the Transferable Belief Model with Humans During the Recognition of Partially Occluded Facial Expressions.
Journal of Vision, vol. 9, no. 2, pp. 1-19.
http://www.journalofvision.org/9/2/22/
Hammal Z., Couvreur L., Caplier A., Rombaut M. (2007)
Facial Expressions Classification: A new approach based on Transferable Belief Model.
International Journal of Approximate Reasoning, Elsevier, vol. 46, no. 3, pp. 542-567.
https://www.sciencedirect.com/science/article/pii/S0888613X07000187
Z. Hammal, N. Eveno, A. Caplier, and P-Y Coulon. (2006)
Parametric models for facial features segmentation.
Signal processing, Elsevier, vol. 86, no. 2, pp. 399- 413.
https://www.sciencedirect.com/science/article/pii/S0165168405001970
Hammal Z., Eveno N., Caplier A., Coulon P.-Y. (2005)
Extraction des traits caractéristiques du visage à l'aide de modèles paramétriques adaptés.
Traitement du Signal, vol. 22, no. 1, pp. 59-71, 2005. Selected among the best papers of the 19ème colloque sur le traitement du signal et des images (GRETSI 2003).

PROCEEDING PAPERS

Daoudi*, M., Hammal*, Z., et al., (2019).
Gram Matrices Formulation of Body Shape Motion: An Application for Depression Severity Assessment.
Machine Learning for the Diagnosis and Treatment of Affective Disorders @IEEE ACII 2019, Cambridge, UK. [*Equal contribution].
Ogihara, M., Hammal, Z., Martin, K. B., Cohn, J. F., Cassell, J., Ren, G., & Messinger, D. S. (2019).
Categorical timeline allocation for diagnostic head movement tracking feature analysis.
Face and Gesture Analysis for Health Informatics FGAHI@CVPR, 2019, Long Beach, CA.
Bhatia, S., Goecke, R., Hammal, Z., Cohn, J. F. (2019)
Automated measurement of head movement synchrony during dyadic depression severity interviews.
Proceedings of the IEEE International Conference on Automatic Face and Gesture Recognition, Lille, France, 1-8.
Kacem, A., Hammal, Z., Cohn, J.F. & Daoudi, M. (2018).
Detecting depression severity by interpretable representations of motion dynamics.
IEEE International Conference on Automatic Face and Gesture Recognition, Xi’an, China.
Hammal, Z., Chu, W-S., Cohn, J-F, Heike, C-L, and Speltz, M-L. (2017).
Automatic AU detection in infants using convolutional neural network.
Proceedings of the Affective Computing and Intelligent Interactions (ACII 2017), San Antonio, TX.
Dibeklioglu*, H., Hammal*, Z., Yang, Y., Cohn, J. F. (2015).
Multimodal Detection of Depression in the Context of Clinical Interviews.
Proceedings of the ACM International Conference on Multimodal Interaction (ICMI 2015), Seattle, Washington. [*Equal contribution].
Hammal, Z., Cohn, J-F, Heike, C., and Speltz, M-L. (2015)
What Can Head and Facial Movements Convey about Positive and Negative Affect?
The 6th biannual Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII 2015), September 21-24, Xi’an, China.
Best Paper Award
Hammal, Z. and Cohn, J-F. (2014)
Towards Multimodal Pain Assessment for Research and Clinical Use
RFMI in conjunction with the 16th ACM International Conference on Multimodal Interaction ICMI 2014.
12-16 November 2014, Istanbul, Turkey
Hammal, Z. and Cohn, J-F. (2014)
Intra- and Interpersonal Functions of Head Motion in Emotion Communication
RFMI in conjunction with the 16th ACM International Conference on Multimodal Interaction ICMI 2014.
12-16 November 2014, Istanbul, Turkey
Scherer S., Hammal, Z., Yang Y., Morency L. P., and Cohn J. F. (2014)
Dyadic Behavior Analysis in Depression Severity Assessment Interviews
16th ACM International Conference on Multimodal Interaction ICMI 2014. 12-16 November 2014, Istanbul, Turkey
Hammal, Z., Cohn J. F., Messinger D. S., Masson W., & Mahoor M. (2013)
Head Movement Dynamics During Normal and Perturbed Parent-Infant Interaction
The fifth biannual Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII 2013), Geneva, Switzerland, September 2-5, 2013.
Hammal Z., Cohn J-F., Bailie T., George D-T., Saragih, J., Chiquero J-N., & Lucey, S. (2013)
Temporal Coordination of Head Motion in Couples with History of Interpersonal Violence.
IEEE International Conference on Automatic Face and Gesture Recognition (FG 2013), Shanghai, China, April 22-26, 2013.
Hammal, Z., Cohn, J-F, (2012)
Automatic Detection of Pain Intensity.
14th ACM International Conference on Multimodal Interaction ICMI 2012. Oct 22-26, Santa Monica, CA, USA.
Outstanding Paper Award.
Hammal, Z. (2011)
Efficient Detection of Consecutive Facial Expression Apexes Using Biologically Based Log-Normal Filters.
in ADVANCES IN VISUAL COMPUTING. Lecture Notes in Computer Science, 2011, Volume 6939/2011, pp. 586-595, DOI: 10.1007/978-3-642-24028-7_54.
Hammal Z., Massot C. (2010)
Holistic and Feature-Based Information Towards Dynamic Multi-Expressions Recognition.
International Conference on Computer Vision Theory and Applications, Angers, France.
Selected among the best papers of VISAPP 2010
Hammal Z., Massot C. (2009)
Dynamic Facial Expression Classification Based on Human Visual Cues Information.
Workshop series: Emotion and Computing – Current Research and Future Impact, held in the framework of the 32nd Annual Conference on Artificial Intelligence (KI).
Z. Hammal (2009)
Context based Recognition of Pain Expression Intensities.
The 5th Workshop on Emotion in Human-Computer Interaction - Real World Challenges -held at the 23rd BCS HCI Group conference. Cambridge University, Cambridge, UK, September 2009.
Z. Hammal, M. Kunz, M. Arguin, and F. Gosselin. (2008)
Spontaneous Pain Expression Recognition in Video Sequences.
Proc. BCS Int’l Conf. on Visions of Computer Science (BCS-Visions 2008). Imperial College London September 22-24, 2008.
Selected among the best papers of the BCS-Visions 2008.
Z. Hammal, A. Martin and F. Gosselin. (2007)
Comparing a Transferable Belief Model Capable of Recognizing Facial Expressions with the Latest Human Data.
in ADVANCES IN VISUAL COMPUTING. Lecture Notes in Computer Science, 2007, Volume 4841/2007, pp. 509-520, DOI: 10.1007/978-3-540-76858-6_50.
I. Buciu, Z. Hammal, A. Caplier, N. Nikolaidis and I. Pitas (2006)
Enhancing Facial Expression Classification By Information Fusion.
Proc. 14th European Signal Processing Conference (EUSIPCO 2006).
Z. Hammal (2006)
Dynamic facial expression understanding based on temporal modeling of transferable belief model.
Proc. International Conference on Computer Vision Theory and Applications (VISAPP 2006), 25-28 February 2006, Setubal, Portugal.
Hammal Z., Massot C., Bedoya G., Caplier A. (2005)
Eyes Segmentation Applied to Gaze Direction and Vigilance Estimation.
PATTERN RECOGNITION AND IMAGE ANALYSIS. Lecture Notes in Computer Science, 2005, Volume 3687/2005, pp. 236-246, DOI: 10.1007/11552499_27.
Hammal Z., Couvreur L., Caplier A., Rombaut M. (2005)
Facial Expressions Recognition Based on the Belief Theory: Comparison with Different Classifiers.
IMAGE ANALYSIS AND PROCESSING. Lecture Notes in Computer Science, 2005, Volume 3617/2005, pp. 743-752, DOI: 10.1007/11553595_91
Hammal Z., Caplier A., Rombaut M (2005)
Belief Theory Applied to Facial Expressions Classification.
PATTERN RECOGNITION AND IMAGE ANALYSIS. Lecture Notes in Computer Science, 2005, Volume 3687/2005, pp. 183-191, DOI: 10.1007/11552499_21.
Hammal Z., Bozkurt B., Couvreur L., Unay D., Caplier A., Dutoit T. (2005)
Passive versus Active: Vocal Classification System.
Proc. 13th European Signal Processing Conference. (EUSIPCO 2005). Turkey.
Hammal Z., Caplier A., Rombaut M. (2005)
A fusion process based on belief theory for classification of facial basic emotions.
Proc. IEEE Fusion 2005, the 8th International Conference on Information Fusion (ICIF 2005), 8 pages.
Hammal Z., Bozkurt B., Couvreur L., Unay D., Caplier A., Dutoit T (2005)
Classification bimodale d'expressions vocales.
20ème colloque sur le traitement du signal et des images (Gretsi'2005), Louvain-la-Neuve, Belgium, September 2005.
Hammal Z., Caplier A. (2004)
Eyes and eyebrows parametric models for automatic segmentation.
Proc. IEEE Southwest Symposium on Image Analysis and Interpretation (SSIAI 2004).
Hammal Z., Caplier A., Rombaut M. (2004)
Classification d'expressions faciales par la théorie de l'évidence.
Rencontre Francophones sur la Logique Floue et ses Applications (LFA2004), Nantes, 18-19 November.
Hammal Z., Caplier A. (2004)
Analyse dynamique des transformations des traits du visage lors de la production d'une émotion.
Atelier sur l'analyse du geste (RFIA 2004), Toulouse, 2004
Hammal Z., Eveno N., Caplier A., Coulon P.-Y (2003)
Extraction réaliste des traits caractéristiques du visage à l'aide de modèles paramétriques adaptés.
19ème colloque sur le traitement du signal et des images (Gretsi'2003), Paris, September 2003.
Selected among the 10 best papers of GRETSI 2003 to be published in the Journal Traitement du signal.

PUBLISHED ABSTRACTS

Chen, M., Chow, S. M., Hammal, Z., Messinger, D. S., Cohn, J. F. (2019)
Time-series modeling of infant-mother head movement dynamics.
SRCD Biennial Meeting, March 21-23, Baltimore, Maryland.
Messinger, D. S., Martin, K.B., Cohn, J. F., Hammal, Z. (2018)
An early behavioral index of ASD from automated microanalysis of movement dynamics.
Society of Biological Psychiatry Annual Meeting, New York, NY.
Martin, K. B., Messinger, D. S., Hammal, Z., Cohn, J. F. (2017)
Automated measurement of head movement coordination in infant-parent dyads and later ASD outcomes.
The International Meeting for Autism Research, San Francisco, California, USA.
Martin, K. B., Hammal, Z., Cohn, J. F., Cassell, J., Ren, G., Ogihara, M., Britton, J. C., Gutierrez, A., Messinger, D. S. (2016)
Temporal and frequency feature integrations for head movement pattern analysis.
University of Miami Neural Engineering Symposium, Miami, Florida.
Hammal, Z., Cohn, J. F., Messinger, D. S. (2015)
Head movement dynamics during play and perturbed mother-infant interaction.
Association for Psychological Science, May, New York, NY.
Hammal, Z., F. Gosselin, I.Peretz and S. Hebert. (2010)
Spatial Frequencies Mediating Music Reading.
Proc. Vision Science Society (VSS 2010), 7-12 May 2010, Naples Grand Hotel, Florida (JoV 2010).
Hammal, Z., Gosselin, F., Fortin, I. (2009)
How efficient are the recognition of dynamic and static facial expressions?
Proc. Vision Science Society (VSS). 8-13 May, Naples Grand Hotel, Florida.
C. Roy, S. Roy, D. Fiset, Z. Hammal, P. Rainville, F. Gosselin. (2008)
Recognizing static and dynamic facial expressions of pain: Gaze-tracking and Bubbles experiments.
Proc. Vision Science Society (VSS 2008). 9-14 May 2008, Naples Grand Hotel, Florida.
Hammal, Z., Tsuchiya, N., Adolphs, R., Arguin, M., Schyns, P.G., Gosselin, F. (2008)
What does the activity in the Amygdala and the Insula correlate with in fearful and disgusted faces?
Proc. Vision Science Society (VSS). 9-14 May, Naples Grand Hotel, Florida.
Roy, S., Roy, C., Fiset, D., Hammal, Z., Blais, C., Rainville, P., Gosselin, F. (2008)
Recognizing static and dynamic facial expressions of pain: Gaze-tracking and Bubbles experiments.
Proc. Vision Science Society (VSS). 9-14 May, Naples Grand Hotel, Florida.

SELECTED SEMINARS AND INVITED TALKS

Z. Hammal, (2019).
Behavioral AI: Multimodal Human Behavior Analysis and Recognition for Research and Clinical Use.
Rice University, 14th October 2019.
Z. Hammal, (2019).
Behavioral AI: Multimodal Human Behavior Analysis and Recognition for Research and Clinical Use.
The Vector Institute for Artificial Intelligence, Toronto, Canada, 20th September 2019.
Z. Hammal, (2019).
Multimodal Human Behavior Analysis and Modeling: Depression Severity Assessment - A Case Study.
Invited Talk at the Machine Learning for the Diagnosis and Treatment of Affective Disorders @ ACII 2019, 3rd September 2019.
Z. Hammal, (2019).
Automatic Human Behavior Analysis and Recognition for Research and Clinical Use.
Institut des Systèmes Intelligents et de Robotique, Université Pierre et Marie Curie, Paris, France, May 20th, 2019.
Z. Hammal, (2019).
Automatic Human Behavior Analysis and Recognition for Research and Clinical Use.
DIRO, University of Montreal, Canada, March 2019.
Z. Hammal, (2016).
Multimodal Human Behavior Analysis for Research and Clinical Use.
Invited speaker at Brain, Behavior, and Cancer Seminars, University of Pittsburgh Medical Center (UPMC), Pittsburgh, January 2016.
Z. Hammal, (2015).
Towards Automatic Pain Assessment for Research and Clinical Use.
Invited speaker at University of Pittsburgh Medical Center (UPMC), Pittsburgh, January 2015.
Z. Hammal, (2014).
Emotional Experience and Interpersonal Communication in Nonverbal Behavior.
Invited speaker at Florida International University, Miami, 22 March 2014.
Z. Hammal, (2014).
Head Movement Dynamics in the Still-Face Paradigm
Invited speaker at University of Miami, Miami, 20 March 2014.
Z. Hammal, (2014).
Towards Multimodal Assessment of Pain for Research and Clinical Use.
Invited speaker at Children’s Hospital of Pittsburgh, Pittsburgh, 14 March 2014.
Z. Hammal, (2010).
Automatic affect Recognition.
Invited speaker at Bell Canada, Montreal, 20 September 2010.
Z. Hammal, (2010).
Music to my eyes: spatial frequencies mediating music reading.
Invited speaker at the CERNEC seminars, University of Montreal, 17 September 2010.
Z. Hammal, (2010).
Automatic Analysis and Interpretation of Human Facial Behavior.
Invited speaker, McGill University, 23 June 2010, Montréal.
Z. Hammal, (2010).
Segmentation et Interprétation d'indices Visuels pour la Reconnaissance Séquentielle des Expressions Faciales.
Invited speaker, Institut TELECOM - TELECOM ParisTech, 27 May 2010, Paris, France.

DATABASE:

Hammal-Caplier database: 21 subjects with 3 sequences each (Smile, Surprise, and Disgust). Each sequence is recorded over 5 seconds. For each acquisition, the subject is asked to simulate one expression, beginning with the neutral expression, evolving to the target expression, and returning to the neutral expression. Ref: Hammal Z., Couvreur L., Caplier A., Rombaut M. Facial Expressions Classification: A new approach based on Transferable Belief Model. International Journal of Approximate Reasoning, vol. 46, no. 3, pp. 542-567, 2007.