Floating point operations per second (FLOPS) of Machine Learning models

In this article, we take a look at the FLOP counts of various machine learning models such as VGG19, VGG16, GoogLeNet, AlexNet, and the ResNet family (ResNet-18 through ResNet-152). The counts range from 0.72 billion to 19.6 billion FLOPs. Note that these figures are FLOPs (the total number of floating point operations in one forward pass), not FLOPS (operations per second).

The FLOP counts of the models covered in this article are listed below; a short sketch after the list shows how such counts are estimated.

  • VGG19 has 19.6 billion FLOPs
  • VGG16 has 15.3 billion FLOPs
  • ResNet-152 has 11.3 billion FLOPs
  • ResNet-101 has 7.6 billion FLOPs
  • ResNet-50 has 3.8 billion FLOPs
  • ResNet-34 has 3.6 billion FLOPs
  • ResNet-18 has 1.8 billion FLOPs
  • GoogLeNet has 1.5 billion FLOPs
  • AlexNet has 0.72 billion FLOPs
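
These totals come from summing per-layer operation counts, with convolutions dominating. As a minimal sketch (assuming the common convention of counting one FLOP per multiply-accumulate, as the ResNet paper does), the cost of a single convolutional layer can be estimated like this; the dimensions below are those of VGG16's first conv layer:

```python
def conv2d_flops(h_out, w_out, c_in, c_out, k):
    """Multiply-accumulate count for one conv layer: every output
    element needs c_in * k * k multiply-adds."""
    return h_out * w_out * c_out * (c_in * k * k)

# VGG16's first layer: 224x224 output, 3 -> 64 channels, 3x3 kernels
print(conv2d_flops(224, 224, 3, 64, 3))  # 86,704,128 (~0.09 billion)
```

Summing this over all of VGG16's layers yields roughly the 15.3 billion total above.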


This is a companion discussion topic for the original entry at http://iq.opengenus.org/floating-point-operations-per-second-flops-of-machine-learning-models/

An interesting point to note is that ResNet-152, despite having 152 layers, requires fewer FLOPs than VGG19, which has only 19 layers (11.3 billion vs 19.6 billion), and is therefore typically faster to run.
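
You can check this yourself. Below is a sketch using torchvision's pretrained architectures with fvcore's FlopCountAnalysis; fvcore also counts one FLOP per multiply-accumulate, so its totals should land close to, though not exactly on, the figures above:

```python
# Assumes: pip install torch torchvision fvcore
import torch
from torchvision import models
from fvcore.nn import FlopCountAnalysis

inp = torch.randn(1, 3, 224, 224)  # a single 224x224 RGB image

for name, ctor in [("ResNet-152", models.resnet152), ("VGG19", models.vgg19)]:
    model = ctor(weights=None).eval()  # untrained; FLOPs don't depend on weights
    print(f"{name}: {FlopCountAnalysis(model, inp).total() / 1e9:.1f} billion FLOPs")
```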

Points to note:

  • Increasing the number of layers does not mean the model will run slower: FLOPs depend on layer widths and kernel sizes, not just depth
  • Increasing the number of layers does not necessarily increase accuracy, and may even decrease it; this "degradation" problem in very deep plain networks is what motivated ResNet's skip connections