The competition on gesture recognition was organised in collaboration with the DARPA Deep Learning program. This challenge was part of a series of challenges on the theme of unsupervised and transfer learning. The goal was to push the state of the art in algorithms capable of learning data representations that can be re-used from task to task, using unlabelled data and/or labelled data from similar domains. In this challenge, the competitors were given a large database of videos of American Sign Language performed by native signers and videos of International Sign Language performed by non-native signers, which we collected using Amazon Mechanical Turk. Each entrant developed a system that was tested in a live competition on gesture recognition. The test was carried out on a small but new sign language vocabulary. The platform of the challenge remains open after the end of the competition, and all the datasets are freely available for research in the Pascal 2 repository.