The gaze tracker has become an important human-machine interface, yet free head movement remains a challenge: system performance degrades significantly when the user's head moves. Head movement compensation, however, cannot simply be formulated as an eye detection problem, because even a minor error in the eye corner detection algorithm yields an incorrect eyeball model and thus unacceptable limbus circle matching results. In this paper, we further explore a 3-D eyeball model anchored at the inner eye corner point. The relative position of the eyeball center with respect to the inner eye corner is analyzed, and this relationship is used to guide both the construction of the eyeball model and the limbus circle matching. Experimental results show that the proposed approach tolerates a range of eye detection estimation errors and remarkably improves the final limbus circle matching performance.
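The anchoring idea described above can be illustrated with a minimal sketch. It assumes (this is an illustration, not the paper's exact formulation) that the eyeball center sits at a person-specific offset from the inner eye corner, measured once at calibration and treated as rigid with respect to the head; tracking the corner then re-anchors the 3-D eyeball model under head motion, and the gaze ray runs from the re-anchored center through the fitted limbus circle center. The helper names (`eyeball_center`, `gaze_direction`) are hypothetical.

```python
# Illustrative sketch, assuming the eyeball center lies at a fixed,
# calibration-time offset from the inner eye corner (head-rigid).

def add(a, b):
    # Component-wise vector addition on 3-tuples.
    return tuple(x + y for x, y in zip(a, b))

def sub(a, b):
    # Component-wise vector subtraction on 3-tuples.
    return tuple(x - y for x, y in zip(a, b))

def normalize(v):
    # Scale a vector to unit length.
    n = sum(x * x for x in v) ** 0.5
    return tuple(x / n for x in v)

def eyeball_center(inner_corner, offset):
    """Re-anchor the eyeball center from the tracked inner eye corner.

    `offset` is the corner-to-center vector measured at calibration.
    """
    return add(inner_corner, offset)

def gaze_direction(inner_corner, offset, limbus_center):
    """Gaze ray: from the re-anchored eyeball center through the
    fitted limbus (iris boundary) circle center."""
    center = eyeball_center(inner_corner, offset)
    return normalize(sub(limbus_center, center))

# Example with made-up numbers (units: mm). After the head moves, the
# same calibrated offset still locates the eyeball center relative to
# the newly detected corner position.
offset = (5.0, -2.0, -10.0)   # calibrated corner-to-center vector
corner = (32.0, 10.0, 600.0)  # detected inner eye corner
limbus = (36.0, 8.5, 588.0)   # fitted limbus circle center
print(gaze_direction(corner, offset, limbus))
```

Because the gaze direction depends on the eyeball center, any corner detection error propagates directly into the anchoring step, which is why the approach must tolerate a range of eye detection error.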