If you open your mouth and lower your jaw, the pitch of the head orientation shifts negatively.
Is the head orientation determined entirely by OpenFace? What kind of data do we have access to in the GazeTracker code?
Yes, the head orientation is determined entirely by OpenFace. I just read what it gives me, adjust it, and send it to opentrack through UDP.
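For anyone curious about the wire format: opentrack's "UDP over network" input reads each packet as six little-endian doubles (x, y, z translation, then yaw, pitch, roll), listening on port 4242 by default. Here's a minimal sketch of what the sending side looks like; the pose values and the helper name are illustrative, not taken from GazeTracker's actual code:

```python
import socket
import struct

OPENTRACK_ADDR = ("127.0.0.1", 4242)  # opentrack's default UDP input port

def send_pose(sock, x, y, z, yaw, pitch, roll):
    """Pack the 6-DOF pose as six little-endian doubles, the layout
    opentrack's UDP input expects (translation in cm, angles in degrees)."""
    packet = struct.pack("<6d", x, y, z, yaw, pitch, roll)
    sock.sendto(packet, OPENTRACK_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# In GazeTracker these values would come from OpenFace's head pose estimate.
send_pose(sock, 0.0, 0.0, 0.0, 15.0, -5.0, 0.0)
```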
Glasses also affect it negatively. I think the problem is that the model the AI uses wasn't trained on many faces of people wearing glasses or making different expressions, so we would need a lot of volunteers to annotate the facial points on faces with glasses or with open mouths to retrain the AI...
So I guess you're only using the not-so-good head pose provided by OpenFace...
Wouldn't it be possible to obtain the 3D landmark points, exclude the jaw points, and compute a more accurate head rotation from the remaining ones?
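To sketch the idea: in the 68-point landmark scheme OpenFace uses, points 0-16 are the jaw line, so a pose solve that only feeds the rigid points (eye corners, nose, brows) to `cv2.solvePnP` should be largely immune to mouth opening. This is just a hedged sketch, the rigid index set and the reference model are assumptions, not GazeTracker's implementation:

```python
import numpy as np
import cv2

# Landmarks that stay rigid when the mouth opens (eye corners, nose,
# brow points); indices 0-16 (the jaw line) are deliberately excluded.
# This particular subset is chosen for illustration.
RIGID_IDX = [17, 21, 22, 26, 27, 30, 36, 39, 42, 45]

def head_rotation(model_points_3d, image_points_2d, camera_matrix):
    """Estimate head rotation from rigid landmarks only.

    model_points_3d: (68, 3) neutral-face reference model
    image_points_2d: (68, 2) detected landmarks for the current frame
    camera_matrix:   (3, 3) camera intrinsics
    """
    obj = model_points_3d[RIGID_IDX].astype(np.float64)
    img = image_points_2d[RIGID_IDX].astype(np.float64)
    ok, rvec, tvec = cv2.solvePnP(obj, img, camera_matrix, None,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return None
    rot_mat, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 matrix
    return rot_mat, tvec
```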
I guess you could get the CSV data and see whether it can be edited with a Python script, provided it's correctly categorized. Quite a lot of work though, as from what I see it uses many datasets...
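If the data is in an OpenFace-style CSV, the editing part at least is simple. A hedged sketch, assuming the usual `x_0`..`x_67` / `X_0`..`Z_67` column naming (file names and layout are guesses, check your actual headers):

```python
import pandas as pd

df = pd.read_csv("processed/face.csv")
df.columns = df.columns.str.strip()  # OpenFace pads headers with a leading space

# Drop the jaw-line landmarks (indices 0-16), both 2D and 3D coordinates.
jaw_cols = [f"{axis}_{i}" for i in range(17)
            for axis in ("x", "y", "X", "Y", "Z")]
df = df.drop(columns=[c for c in jaw_cols if c in df.columns])
df.to_csv("processed/face_no_jaw.csv", index=False)
```

The real effort, as you say, would be doing this consistently across all the datasets the model is trained on.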
Hello Gabriel!