Authors:
Dustin S. Stoltz, Marshall A. Taylor
Abstract
Deductive analysis entails searching our corpora for particular meanings. Researchers commonly hand-label a manageable random sample of documents, using these labels to build classifiers, to validate classifications, or both. Here, we'll discuss using labels to build classifiers (i.e., supervision) and using these labels to validate classifications. We'll also explore using pretrained models to classify documents and using inference with text networks as a way to test relational hypotheses with texts.
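The supervised workflow the abstract describes can be illustrated with a short sketch. The example below is not code from the chapter; it uses scikit-learn with invented toy documents and labels to show the general pattern of hand-labeling a random sample, training a classifier on part of the labeled sample, and validating the classifier's output against the held-out labels.

```python
# A minimal sketch (assumed workflow, not the authors' code): hand-labeled
# documents are split into a training portion used to fit a supervised
# classifier and a held-out portion used to validate its classifications.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical hand-labeled random sample: 1 = about the economy, 0 = not
docs = [
    "unemployment rose sharply last quarter",
    "the senator voted against the budget bill",
    "inflation is eroding household savings",
    "the team won the championship on Sunday",
    "central bank raises interest rates again",
    "new art exhibit opens downtown this weekend",
]
labels = [1, 0, 1, 0, 1, 0]

# Split the labeled sample: train on one part, reserve the rest for validation.
X_train, X_test, y_train, y_test = train_test_split(
    docs, labels, test_size=0.33, random_state=42, stratify=labels
)

vectorizer = TfidfVectorizer()           # simple bag-of-words features
clf = LogisticRegression(max_iter=1000)  # supervised classifier

clf.fit(vectorizer.fit_transform(X_train), y_train)
predictions = clf.predict(vectorizer.transform(X_test))

# Validation: compare predicted labels against the held-out hand labels.
print(classification_report(y_test, predictions, zero_division=0))
```

In practice the labeled sample would be far larger, and the same held-out labels can also be used to validate classifications produced by pretrained models rather than a classifier trained from scratch.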
Publisher
Oxford University Press, New York