Validating high frequency deployment of the Diet Quality Questionnaire

cg.authorship.types: CGIAR multi-centre
cg.contributor.donor: CGIAR Trust Fund
cg.contributor.initiative: Digital Innovation
cg.coverage.country: Guatemala
cg.coverage.country: Rwanda
cg.coverage.iso3166-alpha2: GT
cg.coverage.iso3166-alpha2: RW
cg.coverage.region: Americas
cg.coverage.region: Central America
cg.coverage.region: Eastern Africa
cg.howPublished: Grey Literature
cg.place: Washington, D.C.
cg.reviewStatus: Internal Review
cg.subject.actionArea: Systems Transformation
cg.subject.impactArea: Nutrition, health and food security
cg.subject.impactArea: Poverty reduction, livelihoods and jobs
cg.subject.impactPlatform: Nutrition, Health and Food Security
cg.subject.impactPlatform: Poverty Reduction, Livelihoods and Jobs
dc.contributor.author: Manners, Rhys
dc.contributor.author: International Institute of Tropical Agriculture
dc.date.accessioned: 2023-11-27T18:10:39Z
dc.date.available: 2023-11-27T18:10:39Z
dc.identifier.uri: https://hdl.handle.net/10568/134741
dc.title: Validating high frequency deployment of the Diet Quality Questionnaire
dcterms.abstract: In recent work, Manners et al. (2022) crowdsourced the Diet Quality Questionnaire (DQQ), assessing whether a lean, low-cost data collection system could be deployed to map diet quality. In 52 weeks of data collection, the system generated responses from more than 80,000 unique respondents, at a rate of around 1,800 respondents per week. The preliminary success of the piloted system points towards a viable alternative modality for deploying the DQQ. Crowdsourcing is an attractive option for the DQQ, generating data at relatively low cost. The scaling potential of a high-frequency, crowdsourcing-based system is evidenced by a second pilot launching imminently in Guatemala. However, questions remain regarding the accuracy and reliability of crowdsourced data: respondents may respond inaccurately, either intentionally (for malicious purposes or to game the system) or unintentionally (due to a lack of understanding). Approaches to validating crowdsourced data range from simple phone-based follow-ups to more complex machine learning frameworks. Despite these uncertainties, crowdsourcing may provide respondents with a sense of anonymity, allowing them to respond more accurately without the feeling of enumerator expectations. Enumerator biases are well documented in enumerator-administered data collection, where respondents may adapt their responses to what they perceive the enumerator wants to hear. Enumerator-collected and mobile-phone-generated diet quality data may thus be hindered by different issues of reliability and accuracy. Previous studies have addressed similar problems of comparing data collection technologies through observational benchmarking (e.g. Matthys et al., 2007; Fallaize et al., 2014; Putz et al., 2019). In a recent study, Rogers et al. (2021) assessed the accuracy of two dietary recall data collection methods against a weighed food record. This method permitted a quantitative dietary benchmark to be established through enumerator observation of consumption, which was then used to compare the accuracy and reliability of the data collection methods under study.
dcterms.accessRights: Open Access
dcterms.audience: CGIAR
dcterms.audience: Development Practitioners
dcterms.bibliographicCitation: Manners, Rhys; and International Institute of Tropical Agriculture (IITA). 2023. Validating high frequency deployment of the Diet Quality Questionnaire. Digital Innovation Research Update. https://hdl.handle.net/10568/134741
dcterms.extent: 34 p.
dcterms.issued: 2023-11-22
dcterms.language: en
dcterms.license: Other
dcterms.publisher: International Food Policy Research Institute
dcterms.subject: diet
dcterms.subject: data
dcterms.subject: survey design
dcterms.type: Working Paper

Files

Name: Rhys Manners (2023) -- Working Paper -- Validating DQQ.pdf
Size: 684.55 KB
Format: Adobe Portable Document Format
Description: Working Paper