Humanoid robots are increasingly popular due to their flexibility and the ease with which humans relate to them. Imitation learning has been used to transfer skills and experience between robots; however, transferring skills from a humanoid demonstrator to a non-humanoid such as a wheeled robot is no simple task. The imitator must be able to abstract the behaviour it observes and, in addition, translate this behaviour into actions that it is able to perform. Since humanoid robots have a much larger range of motion than most robots, this can be quite difficult. This work describes an approach for a non-humanoid robot to learn a task by observing demonstrations of a humanoid robot using global vision. We use a combination of tracking sequences of primitives and predicting future primitive actions from existing combinations using forward models to construct more abstract behaviours that bridge the differences between the robot types. To evaluate the success of learning from a humanoid demonstrator, we also evaluate how well a wheeled robot can learn from a physically identical wheeled robot, as well as from a much smaller wheeled robot.