Scientists from Hong Kong Baptist University have developed an authentication technique that uses a person's lip motions. A team led by Professor Cheung Yiu-ming has created a system that verifies a person's identity by simultaneously matching the password content with the underlying behavioural characteristics of the lip movement. According to the researchers, nobody can mimic a user's lip movement when uttering the password, which can be changed at any time.

Cheung said the new technique has a number of advantages over conventional security access-control methods: "The dynamic characteristics of lip motions are resistant to mimicry, so a lip password can be used singly for speaker verification, as it is able to detect and reject a wrong password uttered by the user or the correct password spoken by an imposter, and verification based on a combination of lip motions and password content ensures that access control is doubly secure."

He also said that, compared with traditional voice-based authentication, the acquisition and analysis of lip movements is less susceptible to background noise and distance; moreover, the system can even be used by a speech-impaired person. Finally, he noted that there is no language barrier: a person from any country can use the lip-password verification system.

Professor Cheung said: "The same password spoken by two persons is different, and a learning system can distinguish them." The study adopted a computational learning model that extracts visual features of lip shape, texture and movement to characterise a lip sequence. Samples of lip sequences are collected and analysed to train the models and to determine the threshold for accepting or rejecting a spoken password.
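The accept/reject step described above can be illustrated with a minimal sketch. This is not the researchers' actual model: the feature vectors, the similarity measure (cosine similarity here) and the threshold value are all assumptions for illustration; in the real system the features would come from a learned model of lip shape, texture and movement, and the threshold would be tuned on training samples.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two feature vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(enrolled, attempt, threshold=0.8):
    """Accept the attempt only if its lip-motion features are
    similar enough to the enrolled template (threshold is a
    placeholder; a real system learns it from training data)."""
    return cosine_similarity(enrolled, attempt) >= threshold

# Hypothetical feature vectors for a genuine attempt and an imposter.
enrolled = [0.9, 0.4, 0.2]
genuine = [0.88, 0.41, 0.19]   # close to the template -> accepted
imposter = [-0.7, 0.1, 0.9]    # dissimilar motion -> rejected

print(verify(enrolled, genuine))   # True
print(verify(enrolled, imposter))  # False
```

Two-factor behaviour as described in the article would combine this motion check with a separate check that the uttered password content itself is correct, accepting only when both pass.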