[Stanford] Weak Supervision: A New Programming Paradigm for Machine Learning
Apr 3, 2019
Alex Ratner, Paroma Varma, Braden Hancock, Chris Ré, and other members of Hazy Lab
… These hand-labeled training sets are expensive and time-consuming to create — often requiring person-months or years to assemble, clean, and debug — especially when domain expertise is required. On top of this, tasks often change and evolve in the real world. For example, labeling guidelines, granularities, or downstream use cases often change, necessitating re-labeling (e.g., instead of classifying reviews only as positive or negative, introducing a neutral category). For all these reasons, practitioners have increasingly been turning to weaker forms of supervision, such as heuristically generating training data with external knowledge bases, patterns/rules, or other classifiers. Essentially, these are all ways of programmatically generating training data—or, more succinctly, programming training data. …
https://ai.stanford.edu/blog/weak-supervision/
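
To make "programming training data" concrete, here is a minimal sketch, not taken from the article: a few rule-based labeling functions (names and keyword lists are purely illustrative) vote on unlabeled reviews, and their votes are combined into noisy training labels. Systems such as Snorkel go further and learn to model each source's accuracy rather than taking a simple majority.

```python
# Illustrative sketch of weak supervision via heuristic labeling functions.
# Each function votes POSITIVE, NEGATIVE, or ABSTAIN on an unlabeled review;
# the votes are then aggregated into a (noisy) training label.

POSITIVE, NEGATIVE, ABSTAIN = 1, 0, -1

def lf_positive_words(review: str) -> int:
    # Pattern/rule heuristic: crude keyword match for positive sentiment.
    return POSITIVE if any(w in review.lower() for w in ("great", "excellent", "love")) else ABSTAIN

def lf_negative_words(review: str) -> int:
    # Pattern/rule heuristic: crude keyword match for negative sentiment.
    return NEGATIVE if any(w in review.lower() for w in ("terrible", "refund", "broken")) else ABSTAIN

def lf_one_star_mention(review: str) -> int:
    # Another hand-written rule acting as a weak labeler.
    return NEGATIVE if "one star" in review.lower() else ABSTAIN

LABELING_FUNCTIONS = [lf_positive_words, lf_negative_words, lf_one_star_mention]

def weak_label(review: str) -> int:
    """Aggregate labeling-function votes by simple majority.
    Real systems (e.g., Snorkel) instead estimate and model the
    accuracies and correlations of the labeling functions."""
    votes = [v for v in (lf(review) for lf in LABELING_FUNCTIONS) if v != ABSTAIN]
    if not votes or votes.count(POSITIVE) == votes.count(NEGATIVE):
        return ABSTAIN
    return POSITIVE if votes.count(POSITIVE) > votes.count(NEGATIVE) else NEGATIVE

if __name__ == "__main__":
    unlabeled = [
        "Excellent product, I love it",
        "Broken on arrival, I want a refund",
        "Arrived yesterday",  # no rule fires -> abstain, left unlabeled
    ]
    training_set = [(x, weak_label(x)) for x in unlabeled if weak_label(x) != ABSTAIN]
    print(training_set)
```

The point of the sketch is that the "supervision" lives in a handful of reusable functions: when labeling guidelines change (say, adding a neutral category), one edits or adds functions and regenerates the training set instead of re-labeling by hand.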