Why more isn't always better


About this listen

In this episode, we tackle AI training data.

More specifically, we examine the assumption that more data will always improve your models. From overfitting and noisy datasets to issues of scale, labeling, and ethics, you'll see why, for AI models, quality is almost always better than quantity. We also discuss cases where a lack of good data leads to real problems, and the ethics of using widely available internet data without consent.


