Crowdsourcing

From CURATEcamp

Revision as of 05:08, 12 May 2012

1. Who's using user-added metadata?

  • Successes and Failures
  • Importing (Ingesting) or Exporting (see the export sketch below)
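
The import/export point has a practical side: crowd-contributed values usually need to travel alongside curated records without being silently merged into them. A minimal Python sketch is below, assuming an invented JSON layout and field names ("item_id", "user_added", and so on); it is not any particular repository's ingest format.

  import json
  from datetime import datetime, timezone

  def export_user_tags(item_id, curated_record, user_tags):
      """Bundle user-added tags with the curated record for export.

      Crowd-sourced values stay in their own block so the receiving
      system can decide how (or whether) to ingest them.
      """
      return {
          "item_id": item_id,
          "curated": curated_record,  # e.g. title, creator, date
          "user_added": [
              {"field": t["field"], "value": t["value"], "contributor": t["user"]}
              for t in user_tags
          ],
          "exported_at": datetime.now(timezone.utc).isoformat(),
      }

  record = export_user_tags(
      "photo-0042",
      {"title": "Main Street, 1923", "creator": "unknown"},
      [{"field": "subject", "value": "streetcars", "user": "volunteer17"}],
  )
  print(json.dumps(record, indent=2))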

2. What types of metadata fields are good candidates for crowd-sourced metadata?

3. What effective incentives can be provided for metadata entry?

4. Mechanical Turk

  • Used by Amazon
  • Looks computer-generated, but is human-created (see the task-posting sketch below)
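
Mechanical Turk is Amazon's paid micro-task marketplace, reachable through the standard AWS SDKs. As a rough sketch of how a metadata-entry task could be posted programmatically, the snippet below uses Python with boto3's MTurk client against the requester sandbox; the title, reward, timing values, and the example.org form URL are invented placeholders, and the endpoint and question schema should be checked against current AWS documentation before use.

  import boto3

  # Sandbox endpoint so test tasks cost no real money; the production
  # endpoint drops "-sandbox".  Assumes AWS credentials are configured.
  MTURK_SANDBOX = "https://mturk-requester-sandbox.us-east-1.amazonaws.com"
  mturk = boto3.client("mturk", region_name="us-east-1", endpoint_url=MTURK_SANDBOX)

  # An ExternalQuestion points workers at a form we host ourselves
  # (a placeholder URL here) where they enter the metadata.
  question_xml = """
  <ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
    <ExternalURL>https://example.org/describe?item=photo-0042</ExternalURL>
    <FrameHeight>600</FrameHeight>
  </ExternalQuestion>
  """

  hit = mturk.create_hit(
      Title="Describe a historical photograph",
      Description="Add subject keywords for one digitized photograph.",
      Keywords="metadata, tagging, photographs",
      Reward="0.05",                    # USD per assignment
      MaxAssignments=3,                 # ask several workers about the same item
      LifetimeInSeconds=3 * 24 * 3600,  # task visible for three days
      AssignmentDurationInSeconds=600,  # ten minutes per assignment
      Question=question_xml,
  )
  print("HIT created:", hit["HIT"]["HITId"])

Asking several workers about the same item (MaxAssignments above) is one way to enable the agreement-based quality checks raised under question 6.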

5. What possible legal issues might there be with crowd-sourced metadata?

6. What quality control or authority control systems can be implemented?

  • What reputation systems might be employed to handle quality / authority issues? (see the weighted-voting sketch below)
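
The notes do not record an answer, but one common pattern for both quality control and reputation is agreement-based acceptance, optionally weighted by how reliable each contributor has been. The Python sketch below is hypothetical: the threshold, default weight, and reputation mapping are arbitrary illustrations, not a recommended policy.

  from collections import defaultdict

  def accept_tags(submissions, reputation, threshold=2.0):
      """Reputation-weighted voting over crowd-submitted tags.

      submissions: (contributor, tag) pairs for one item.
      reputation:  contributor -> weight (e.g. past acceptance rate);
                   unknown contributors default to 1.0.
      A tag is accepted once its combined weight reaches the threshold.
      """
      scores = defaultdict(float)
      for contributor, tag in submissions:
          scores[tag] += reputation.get(contributor, 1.0)
      return {tag for tag, score in scores.items() if score >= threshold}

  votes = [
      ("volunteer17", "streetcars"),
      ("volunteer02", "streetcars"),
      ("driveby_user", "cars"),
  ]
  trust = {"volunteer17": 1.5, "volunteer02": 1.0}
  print(accept_tags(votes, trust))  # {'streetcars'}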

7. What methods of integration could there be with non-user-generated metadata?

8. Controlled Vocabularies (see the vocabulary-check sketch after the bullets below)

  • There must be some consideration of domain-specific vocabularies
  • One size does not fit all
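
One concrete way the domain-specific point plays out is validating crowd-sourced terms against whichever vocabulary fits a given collection, rather than a single global list. The Python sketch below uses tiny invented term lists as stand-ins for real schemes (LCSH, AAT, a local thesaurus); the function name and return shape are assumptions made for illustration.

  # Per-domain controlled vocabularies: one size does not fit all, so each
  # collection declares which list its crowd-sourced subject terms must match.
  VOCABULARIES = {
      "local_history": {"streetcars", "storefronts", "parades"},
      "natural_history": {"lepidoptera", "herbarium sheets", "field notes"},
  }

  def validate_term(domain, term):
      """Return (accepted, suggestions): exact matches pass, otherwise
      the domain's full term list is offered back to the contributor."""
      vocab = VOCABULARIES.get(domain, set())
      normalized = term.strip().lower()
      if normalized in vocab:
          return True, None
      return False, sorted(vocab)

  print(validate_term("local_history", "Streetcars"))  # (True, None)
  print(validate_term("local_history", "trolleys"))    # (False, ['parades', ...])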

9. Awareness - Users must be aware of crowd-sourcing features

  • Marketing
  • Advertising
  • Public Relations