One of the main considerations when it comes to price is your requirements regarding the size of the computer vision model. Model selection in machine learning is about meeting those requirements in the best possible way. Read more about Performance Of Computer Vision Algorithms here.

Different model sizes place different demands on hardware, as illustrated below:

  1. Normal size model: runs on a graphics card (GPU), either through a direct connection or a cloud/WiFi connection to a GPU server.
  2. Small size model: runs on a Central Processing Unit (CPU), either on a CPU server or a Single Board Computer.
  3. Micro size model: runs on an offline small Single Board Computer or microprocessor.

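The mapping above can be sketched as a simple lookup from model footprint to hardware tier. The size thresholds below are illustrative assumptions for the sake of the example, not fixed industry cutoffs:

```python
def suggest_deployment_target(model_size_mb: float) -> str:
    """Map a model's size on disk to a hardware tier.

    The thresholds are hypothetical examples; real cutoffs depend on
    the specific hardware, runtime, and latency requirements.
    """
    if model_size_mb > 100:
        # Normal size model: needs GPU acceleration.
        return "GPU server (direct or cloud/WiFi connection)"
    elif model_size_mb > 10:
        # Small size model: a CPU is sufficient.
        return "CPU server or Single Board Computer"
    else:
        # Micro size model: fits on offline embedded hardware.
        return "offline Single Board Computer or microprocessor"
```

For instance, a 500 MB detection model would land in the GPU tier, while a 2 MB classifier could run fully offline on embedded hardware.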
Implementing solutions on a small embedded device is significantly more complex, and the pricing therefore varies with the size of the model. A smaller model can also lower cloud-server costs, boosting the profit margin of the AI business case. We discuss the different capabilities of Edge Computing vs Cloud Computing here, and you can see our own edge cases here.
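To make the cost argument concrete, a model's memory footprint is roughly its parameter count times the bytes per weight, so quantizing from 32-bit floats to 8-bit integers shrinks it about fourfold. A minimal sketch (the 25-million-parameter figure is just an illustrative example):

```python
def model_footprint_mb(num_params: int, bytes_per_weight: int) -> float:
    """Approximate model size in megabytes: parameters x bytes per weight."""
    return num_params * bytes_per_weight / 1e6

# Hypothetical 25M-parameter model stored as 32-bit floats vs 8-bit integers.
fp32_size = model_footprint_mb(25_000_000, 4)  # 100.0 MB
int8_size = model_footprint_mb(25_000_000, 1)  # 25.0 MB
```

A fourfold reduction like this can move a model down a hardware tier, which is exactly where the cloud-cost savings come from.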