Effective Quality Control in Crowdsourcing


Explore the challenges and strategies for quality control in crowdsourcing, analyzing state-of-the-art platforms and resources used. Understand the paradigm shift in IT project development towards community-based quality assurance techniques.



Presentation Transcript


  1. Quality Control in Crowdsourcing. EECS/IT811: IT Project Management. Sairath Bhattacharjya, April 18, 2019

  2. Organization
     • Introduction
     • Crowdsourcing
     • Challenges in crowdsourcing
     • Quality control in crowdsourcing
     • Analyzing state-of-the-art platforms
     • Conclusions
     • Questions & Answers

  3. Resources used [1 of 2]
     • Quality Control in Crowdsourcing: A Survey of Quality Attributes, Assessment Techniques, and Assurance Actions
     • Journal: ACM Computing Surveys (CSUR), Volume 51, Issue 1, April 2018, Pages 7:1–7:40
     • Authors: Florian Daniel, Pavel Kucherbaev, Cinzia Cappiello, Boualem Benatallah, Mohammad Allahbakhsh

  4. Why I chose this topic
     • Relatively new topic
     • Paradigm shift in IT project development
     • Community approach
     • Quality control over an unknown group of people is challenging work

  5. Crowdsourcing [1 of 3]
     • The term "crowdsourcing" was coined in 2005 by Jeff Howe and Mark Robinson
     • It means outsourcing work to the crowd
     • Crowdsourcing is the practice of engaging a crowd or group toward a common goal, often innovation, problem solving, or efficiency
     • It can improve cost, speed, quality, flexibility, scalability, and diversity

  6. Crowdsourcing [2 of 3] (figure-only slide)

  7. Crowdsourcing [3 of 3] Different negotiation models:
     • Marketplace model: tasks of limited complexity, e.g. Amazon Mechanical Turk, Microworkers
     • Contest model: for more creative work, e.g. Topcoder, 99designs
     • Auction model: workers bid for work, e.g. Fiverr, Freelancer
     • Volunteering: for community contribution, e.g. Wikipedia, Crowdcrafting

  8. Challenges of crowdsourcing
     • Workers are unknown and have varied skills and motivation
     • Platform providers are unaware of the requestor's task
     • Defending against malpractice and extraction of sensitive information
     • Harmful code injection
     • All of the above result in poor quality and increased cost

  9. Quality in crowdsourcing [1 of 2] (figure-only slide)

  10. Quality in crowdsourcing [2 of 2] (figure-only slide)

  11. Quality model: dimensions
     • Data
     • Tasks: description, user interface, incentives, terms & conditions, performance
     • People: requestor, worker (profile, credentials, experience), group

  12. Quality model: attributes [1 of 3]
     • Data: accuracy, consistency, timeliness, ad-hoc attributes

  13. Quality model: attributes [2 of 3]
     • Tasks
        • Description: clarity, complexity
        • Performance: cost efficiency, time efficiency
        • Terms & conditions: privacy, IP protection, information security, compliance
        • User interface: usability, learnability, robustness
        • Incentives: extrinsic, intrinsic

  14. Quality model: attributes [3 of 3]
     • People
        • Requestor: communicative, generosity, fairness, promptness
        • Worker
           • Profile: age, gender, location
           • Credentials: badges, certificates
           • Experience: reliability, reputation, skills
           • Personality: openness, conscientiousness, extraversion, neuroticism
           • Motivation
        • Group: availability, diversity, non-collusiveness

  15. Quality assessment techniques
     • Individual
     • Group
     • Computation-based

  16. Quality assessment: individual [1 of 2]
     • Rating: the requestor rates the work of the worker
     • Qualification test: the worker must fill in a questionnaire to gain access to the task
     • Self-assessment: the worker is asked to assess the quality of their own work
     • Personality test: tests the actions, attributes, and personality the worker possesses
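The qualification-test idea reduces to a pass/fail gate on a scored quiz. A minimal sketch in Python; the question set, threshold, and function names are illustrative assumptions, not part of any real platform's API:

```python
# Hypothetical qualification test: a worker must score at or above a
# threshold on a small quiz before gaining access to tasks.
PASS_THRESHOLD = 0.8  # assumed cutoff; platform-specific in practice

def grade_quiz(answers, answer_key):
    """Return the fraction of quiz questions answered correctly."""
    correct = sum(1 for q, a in answers.items() if answer_key.get(q) == a)
    return correct / len(answer_key)

def is_qualified(answers, answer_key, threshold=PASS_THRESHOLD):
    """Gate task access on quiz performance."""
    return grade_quiz(answers, answer_key) >= threshold

key = {"q1": "b", "q2": "a", "q3": "d", "q4": "c", "q5": "a"}
worker = {"q1": "b", "q2": "a", "q3": "d", "q4": "c", "q5": "b"}
print(is_qualified(worker, key))  # 4/5 = 0.8, so True
```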

  17. Quality assessment: individual [2 of 2]
     • Referral: a well-known person refers someone else
     • Expert review: the requestor relies on an expert's judgment
     • Usability check: check whether the task design follows best practices

  18. Quality assessment: group [1 of 2]
     • Voting: let the crowd decide the best output
     • Group consensus: similar to voting, except that it is the rating assigned to an item
     • Output agreement: two or more workers produce the same output for the same input
     • Peer review: similar to expert review, except that it is done by multiple peers to avoid bias
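Voting and output agreement both boil down to aggregating redundant answers per item. A minimal majority-vote sketch; the item ids and labels are invented for illustration:

```python
from collections import Counter

def majority_vote(labels):
    """Pick the label most workers agreed on for one item.

    Counter ties break by first occurrence, so in practice an odd
    number of redundant workers per item is preferred.
    """
    return Counter(labels).most_common(1)[0][0]

# Three workers label the same two items (illustrative data).
item_labels = {
    "img-1": ["cat", "cat", "dog"],
    "img-2": ["dog", "dog", "dog"],
}
results = {item: majority_vote(votes) for item, votes in item_labels.items()}
print(results)  # {'img-1': 'cat', 'img-2': 'dog'}
```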

  19. Quality assessment: group [2 of 2]
     • Feedback aggregation: look into the feedback provided on the worker's previous work
     • User study: understand whether the worker knows how to use the UIs for the task

  20. Quality assessment: computation-based [1 of 3]
     • Ground truth: inject questions with known answers into the tasks; the results can be aggregated to estimate worker accuracy
     • Outlier analysis: flag data points that differ significantly from the rest as suspicious
     • Fingerprinting: capture a behavioral trace of the worker during task execution to predict quality, errors, and cheating
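The ground-truth technique is easy to sketch: hide a few "gold" questions with known answers among ordinary tasks and score each worker on just those. All task ids and answers below are invented for illustration:

```python
def gold_accuracy(responses, gold):
    """Accuracy of a worker on the injected gold (ground-truth) questions.

    `responses` maps task id -> worker answer; `gold` maps the subset of
    task ids with known answers to those answers.
    """
    scored = [responses[t] == ans for t, ans in gold.items() if t in responses]
    return sum(scored) / len(scored) if scored else 0.0

# Gold questions are interleaved with ordinary tasks (illustrative ids).
gold = {"t2": "B", "t5": "A"}
worker_responses = {"t1": "A", "t2": "B", "t3": "C", "t4": "A", "t5": "C"}
acc = gold_accuracy(worker_responses, gold)
print(acc)  # 0.5: one of the two gold answers was correct
```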

  21. Quality assessment: computation-based [2 of 3]
     • Achievements: verify that only workers with pre-defined achievements can access the task
     • Implicit feedback: content-based feedback analysis from evaluators
     • Association analysis: analyzing recommendations
     • Task execution log analysis: trace worker interaction and task completion

  22. Quality assessment: computation-based [3 of 3]
     • Content analysis: analyzing the task description and task labels
     • Transfer learning: determine whether the worker has worked on a similar task before
     • Collusion detection: identify groups of colluding workers

  23. Quality assurance strategies
     • Improve data quality
     • Select people
     • Incentivize people: improve extrinsic motivation, improve intrinsic motivation
     • Train people
     • Improve task design
     • Control execution

  24. Improve data quality
     • Cleanse data: ensure the input data is accurate
     • Aggregate output: aggregate the output of multiple workers for the same input to improve quality
     • Filter output: remove bad items and keep only the good ones
     • Iterative improvement: ask one worker to improve the work of another worker (as a separate task)
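The filter-output and aggregate-output actions are often combined: drop answers from workers whose estimated accuracy (e.g. from gold questions) is low, then aggregate what remains. A hedged sketch; the worker ids, accuracy estimates, and threshold are invented:

```python
from collections import Counter

def filter_and_aggregate(answers, worker_accuracy, min_accuracy=0.7):
    """Drop answers from low-accuracy workers, then majority-vote the rest.

    `answers` maps worker id -> answer for a single item;
    `worker_accuracy` maps worker id -> estimated accuracy.
    """
    kept = [a for w, a in answers.items()
            if worker_accuracy.get(w, 0.0) >= min_accuracy]
    if not kept:
        return None  # item should be re-posted to more workers
    return Counter(kept).most_common(1)[0][0]

accuracy = {"w1": 0.95, "w2": 0.55, "w3": 0.80}  # assumed estimates
answers = {"w1": "yes", "w2": "no", "w3": "yes"}
print(filter_and_aggregate(answers, accuracy))  # 'yes' (w2 is filtered out)
```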

  25. Select people [1 of 2]
     • Filter workers: based on the skill set and attitude of the workers
     • Assign workers: proactively push tasks to cherry-picked workers
     • Recommend tasks: provide workers with task recommendations
     • Promote tasks: increase the reach of the task to more workers

  26. Select people [2 of 2]
     • Situated crowdsourcing: bring tasks physically to where the workers are
     • Recruit teams: recruit workers whose profiles match the task specification

  27. Improve extrinsic motivation
     • Tailor rewards: ensure the right rewards for the work
     • Pay bonus: pay additional rewards for good performance
     • Promote workers: raise the worker to a higher position

  28. Improve intrinsic motivation
     • Share purpose: understanding the bigger picture motivates workers
     • Self-monitoring: allow workers to compare their work with others'
     • Social transparency: build trust and bonds between workers and the requestor
     • Gamify tasks: induce curiosity in the task

  29. Train people
     • Prime workers: induce a positive effect on task performance
     • Teach workers: provide workers with suitable instructions
     • Provide feedback: workers who get feedback on their work improve in quality
     • Team work: allow workers to communicate with each other

  30. Improve task design [1 of 2]
     • Lower complexity: identify the right granularity of the task
     • Decompose tasks: decompose tasks into smaller sub-tasks
     • Separate duties: organize work such that one worker has one task
     • Validate worker input: check the values provided by workers in the task UI
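Validating worker input can be done in the task UI or server-side on submission. A minimal server-side sketch; the field names and rules are assumptions, not from any specific platform:

```python
def validate_submission(submission):
    """Return a list of validation errors for a worker's submission.

    The required fields and rules here are illustrative only.
    """
    errors = []
    label = submission.get("label", "")
    if label not in {"positive", "negative", "neutral"}:
        errors.append("label must be one of: positive, negative, neutral")
    rationale = submission.get("rationale", "").strip()
    if len(rationale) < 10:
        errors.append("rationale must be at least 10 characters")
    return errors

# A submission that passes both checks yields no errors.
print(validate_submission({"label": "positive",
                           "rationale": "clear upbeat tone"}))  # []
```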

  31. Improve task design [2 of 2]
     • Improve usability: ensure that the task UI follows usage guidelines
     • Prompt for rationale: collect rationale from workers on their own work
     • Introduce breaks: provide breaks in monotonous work
     • Embrace errors: accept errors that can be rectified in post-processing

  32. Control execution
     • Reserve workers: maintain a pool of workers to minimize wait time
     • Flood task list: keep tasks at the top of the list to attract workers' attention
     • Dynamically instantiate tasks: identify quality issues while the task is executing
     • Control task order: helps avoid useless cost
     • Inter-task coordination: manage the workflow of inter-related tasks

  33. Analyzing state-of-the-art platforms
     • Analyzed 14 platforms: Mechanical Turk, CrowdFlower, MobileWorks, Crowdcrafting, TurKit, Jabberwocky, CrowdWeaver, AskSheet, Turkomatic, CrowdForge, Upwork, 99designs, Topcoder, InnoCentive

  34. Analyzing the quality model [1 of 2]
     • The core concern is accuracy of output (13/14)
     • Platforms allow requestors to fine-tune quality by fine-tuning extrinsic incentives (13/14)
     • Selection of workers is mainly based on location (9/14) and skills (8/14)
     • Half of the platforms offer reliability tracking and a reputation management system

  35. Analyzing the quality model [2 of 2] Points for future research:
     • Personality: the characteristics and behavior of requestors and workers are generally neglected
     • Transparency: requestors and workers are generally anonymous
     • Group work: the quality and benefits of group work are not fully realized
     • User interface quality: very little attention is paid to the platform UI

  36. Analyzing quality assessment
     • Rating (9/14) and voting (6/14) are the most common methods
     • Current challenges:
        • Self-assessment is underestimated
        • There is no proper measure for user interface assessment
        • Assessments happen mainly after task execution, not during the process
        • Computation-based assessments are limited

  37. Analyzing quality assurance [1 of 2]
     • Tailoring rewards dominates (11/14)
     • Filter output (8/14) and filter workers (6/14) are widely adopted
     • Rationale is supported where there are interactions with workers (8/14)
     • There is strong support for inter-task coordination and task ordering

  38. Analyzing quality assurance [2 of 2] Crucial aspects to approach next:
     • Task recommendation will help workers find the right work and improve quality
     • Long-term relationships will benefit quicker availability and a better understanding of the requestor's environment
     • Workflow integration will help manage the increasing complexity of crowdsourced work

  39. Conclusions
     • Crowdsourcing can solve problems that neither individuals nor computers can solve on their own
     • We analyzed quality control for crowdsourced work
     • Further work needs to be done on domain-specific services and on crowd work regulation and ethics

  40. Questions and Answers

  41. Other Resources
     • https://www.designhill.com/design-blog/best-crowdsourcing-sites-for-your-business/
     • https://en.wikipedia.org/wiki/Crowdsourcing
     • https://crowdsourcingweek.com/what-is-crowdsourcing/
     • https://ieeexplore.ieee.org/abstract/document/7888402
     • https://youtu.be/Z_8vWQjU7iY
