
Reptile, developed by OpenAI, is a simple meta-learning algorithm. Meta-learning is the process of learning how to learn. A meta-learning algorithm takes in a distribution of tasks, where each task is a learning problem, and it produces a quick learner: one that can generalize from a small number of examples.

An example of a meta-learning problem is few-shot classification. Here, each task is a classification problem in which the learner, after seeing only 1–5 input-output examples from each class, must classify new inputs.
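To make the task structure concrete, here is a minimal sketch of how one N-way, K-shot classification task could be sampled. The function name, the dataset layout (a dict of class label to examples), and all parameter values are assumptions made for this illustration, not part of the Reptile codebase:

```python
import random

def sample_few_shot_task(dataset, n_way=5, k_shot=1, n_query=5):
    """Sample one N-way, K-shot classification task.

    dataset: dict mapping class label -> list of examples.
    Returns (support, query): the support set holds k_shot labelled
    examples per class; the query set holds inputs to classify.
    """
    classes = random.sample(sorted(dataset), n_way)
    support, query = [], []
    for new_label, cls in enumerate(classes):
        examples = random.sample(dataset[cls], k_shot + n_query)
        support += [(x, new_label) for x in examples[:k_shot]]
        query += [(x, new_label) for x in examples[k_shot:]]
    return support, query
```

The learner adapts on the small support set and is then evaluated on the query set.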

What Reptile does

Reptile repeatedly samples a task, performs stochastic gradient descent on it, and then updates the initial parameters towards the final parameters learned on that task.
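That loop can be sketched in a few lines of NumPy on a toy family of linear-regression tasks. Everything here, including the task family, step counts, and learning rates, is a hypothetical illustration of the idea, not the official TensorFlow implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
BASE = np.array([1.0, -2.0, 3.0])  # hypothetical shared mean weights

def sample_task(n=20, noise=0.1):
    """Each task: fit y = w . x, with w drawn near a shared mean."""
    w = BASE + noise * rng.normal(size=3)
    X = rng.normal(size=(n, 3))
    return X, X @ w

def sgd_on_task(theta, task, steps=10, lr=0.02):
    """Inner loop: plain SGD on one task (no second derivatives)."""
    X, y = task
    w = theta.copy()
    for _ in range(steps):
        w -= lr * 2 * X.T @ (X @ w - y) / len(y)
    return w

def reptile(theta, meta_steps=1000, eps=0.1):
    """Outer loop: nudge the initialization toward the parameters
    that SGD found on each sampled task."""
    for _ in range(meta_steps):
        phi = sgd_on_task(theta, sample_task())
        theta = theta + eps * (phi - theta)  # the Reptile update
    return theta
```

Because every task's weights sit near a shared mean, the meta-learned initialization drifts toward that mean, from which a few SGD steps solve any new task in the family.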

Any Comparisons?

Reptile performs about as well as MAML, another broadly applicable meta-learning algorithm, while being simpler to implement and more computationally efficient.

Some features of Reptile:

  • Reptile seeks an initialization for the parameters of a neural network, such that the network can be fine-tuned using a small amount of data from a new task.
  • Unlike MAML, Reptile simply performs stochastic gradient descent (SGD) on each task in a standard way. This means it does not unroll a computation graph or calculate any second derivatives. Hence, Reptile takes less computation and memory than MAML.
  • The current Reptile implementation uses TensorFlow for the computations involved, and includes code for replicating the experiments on Omniglot and Mini-ImageNet.
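As a toy illustration of the first point, an initialization near the task family outperforms a cold start given the same handful of SGD steps. The new task, the stand-in "meta-learned" initialization, and all numbers below are made up for this sketch:

```python
import numpy as np

def few_step_sgd(w0, X, y, steps=5, lr=0.05):
    """Adapt parameters to a new task with a handful of SGD steps."""
    w = w0.copy()
    for _ in range(steps):
        w -= lr * 2 * X.T @ (X @ w - y) / len(y)
    return w

# Hypothetical new task: a small linear-regression problem.
rng = np.random.default_rng(0)
w_task = np.array([1.0, -2.0, 3.0])
X = rng.normal(size=(10, 3))
y = X @ w_task

good_init = np.array([0.9, -1.8, 2.7])  # stands in for a meta-learned init
cold_init = np.zeros(3)                 # a random/zero initialization

def loss(w):
    return float(np.mean((X @ w - y) ** 2))
```

Running `few_step_sgd` from `good_init` reaches a much lower loss than running it from `cold_init`, which is exactly the behaviour a meta-learned initialization is meant to buy.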

To read more on how Reptile works, visit the OpenAI blog. To view Reptile implementations, visit its GitHub repository.



