I realize that a useful AI model will spend most of its life being used for inference rather than being trained. Still, the resources consumed by training seem more than significant, especially since models are often trained many times. It would be informative to see some data or ratios, since I've heard the biggest models can consume staggering amounts of power and time. These IBM vids are great, thank you all.
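For a rough sense of the ratio, here is a back-of-envelope sketch in Python. It uses the widely cited approximations that training a transformer costs about 6 FLOPs per parameter per training token, and a single inference forward pass costs about 2 FLOPs per parameter per generated token. The 70B-parameter / 2T-token example model is hypothetical, power only roughly tracks FLOPs (hardware utilization differs between training and serving), and this ignores retraining runs, so treat the numbers as order-of-magnitude only.

```python
# Back-of-envelope: when does cumulative inference compute match training compute?
# Approximations: training ~= 6 * N * D FLOPs, inference ~= 2 * N FLOPs per token,
# for a model with N parameters trained on D tokens.

def training_flops(n_params: float, n_train_tokens: float) -> float:
    """Approximate total training compute: ~6 FLOPs per parameter per token."""
    return 6 * n_params * n_train_tokens

def inference_flops_per_token(n_params: float) -> float:
    """Approximate forward-pass compute: ~2 FLOPs per parameter per token."""
    return 2 * n_params

def breakeven_tokens(n_params: float, n_train_tokens: float) -> float:
    """Inference tokens served before cumulative inference FLOPs equal training FLOPs."""
    return training_flops(n_params, n_train_tokens) / inference_flops_per_token(n_params)

# Hypothetical example: a 70B-parameter model trained on 2T tokens.
n, d = 70e9, 2e12
print(f"training FLOPs:            {training_flops(n, d):.2e}")
print(f"break-even inference tokens: {breakeven_tokens(n, d):.2e}")
```

Under these assumptions the break-even point is always 3 * D: inference compute only overtakes training compute once the model has generated three times as many tokens as it was trained on, which heavily used models can reach but many never do.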