Crowdsourcing

Discussions revolving around the crowd-sourcing of metadata and tag/label schemes.
 


  • Who's using user-added metadata?
    • Successes and Failures
    • Importing (Ingesting) or Exporting
  • What types of metadata fields are good candidates for crowd-sourced metadata?
  • What effective incentives can be provided for metadata entry?
  • Mechanical Turk (see the HIT-posting sketch after this list)
    • Operated by Amazon
    • Output looks computer-generated but is human-created
  • What possible legal issues might there be with crowd-sourced metadata?
  • What quality control or authority control systems can be implemented?
    • What reputation systems might be employed to handle quality / authority issues? (see the reputation sketch after this list)
  • What methods of integration could there be with non-user-generated metadata? (see the provenance sketch after this list)
  • Controlled Vocabularies (see the vocabulary-validation sketch after this list)
    • There must be some consideration of domain-specific vocabularies
    • One size does not fit all
  • Awareness - Users must be aware of crowd-sourcing features
    • Marketing
    • Advertising
    • Public Relations
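
A hedged sketch of the Mechanical Turk item: posting a metadata-entry task (a HIT) with the current boto3 client against Amazon's requester sandbox. The question URL, reward, timings, and item identifier are hypothetical choices, not a recommended setup; a real run needs AWS credentials and a hosted question form.

import boto3

# Point at the requester sandbox so experiments cost nothing; a production
# client would omit endpoint_url. Region and endpoint per Amazon's docs.
client = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# An ExternalQuestion sends workers to a form we host ourselves.
# The URL and item id are hypothetical.
question_xml = """
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.org/describe-photo?item=rec-42</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>
"""

response = client.create_hit(
    Title="Describe a historical photograph",
    Description="Add subject keywords for one digitized photograph.",
    Keywords="metadata, tagging, photographs",
    Reward="0.05",                    # US dollars, passed as a string
    MaxAssignments=3,                 # three workers per item for cross-checking
    LifetimeInSeconds=3 * 24 * 3600,  # task visible for three days
    AssignmentDurationInSeconds=600,  # ten minutes per assignment
    Question=question_xml,
)
print(response["HIT"]["HITId"])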
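
For the quality / authority-control questions, a minimal sketch of one possible reputation system, assuming nothing beyond the Python standard library: each contributor carries a weight, a tag is accepted once enough weighted support accrues, and confirmed work raises the contributor's weight. The threshold and starting weight are hypothetical.

from collections import defaultdict

ACCEPT_THRESHOLD = 2.0  # hypothetical: weighted support needed to accept a tag

reputation = defaultdict(lambda: 1.0)  # every contributor starts at weight 1.0
votes = defaultdict(float)             # (item_id, tag) -> accumulated weight

def submit_tag(item_id, tag, user):
    """Record one user's tag; report whether it has enough support to accept."""
    votes[(item_id, tag)] += reputation[user]
    return votes[(item_id, tag)] >= ACCEPT_THRESHOLD

def confirm(user, delta=0.1):
    """Raise a contributor's weight when their tag is later verified."""
    reputation[user] += delta

print(submit_tag("rec-42", "daguerreotype", "alice"))  # False: 1.0 < 2.0
print(submit_tag("rec-42", "daguerreotype", "bob"))    # True: 2.0 >= 2.0
confirm("alice")  # alice's future tags now carry weight 1.1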
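
For integrating crowd-sourced terms with non-user-generated metadata, one plausible approach is to keep both in a single record but label every value with its provenance, so curated and crowd-contributed fields can be displayed, filtered, or exported independently. A minimal sketch; the field names and records are hypothetical.

def merge_metadata(curated, user_tags):
    """Combine curated fields and user tags into one record with provenance."""
    record = {field: {"value": value, "source": "curated"}
              for field, value in curated.items()}
    record["user_tags"] = [{"value": tag, "source": "crowd", "user": user}
                           for tag, user in user_tags]
    return record

curated = {"title": "Mill workers, 1908", "creator": "Lewis Hine"}
user_tags = [("child labor", "alice"), ("textile mill", "bob")]

merged = merge_metadata(curated, user_tags)
print(merged["title"])         # {'value': 'Mill workers, 1908', 'source': 'curated'}
print(merged["user_tags"][0])  # {'value': 'child labor', 'source': 'crowd', 'user': 'alice'}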
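
For the controlled-vocabularies item, a minimal sketch of validating user-entered tags against a domain-specific term list, with near-miss suggestions from Python's standard difflib. The sample vocabulary is hypothetical; a real one would come from the relevant domain, since one size does not fit all.

import difflib

VOCABULARY = {"albumen print", "cyanotype", "daguerreotype", "tintype"}

def validate_tag(tag):
    """Return (accepted_term, suggestions); suggestions are non-empty on a miss."""
    normalized = tag.strip().lower()
    if normalized in VOCABULARY:
        return normalized, []
    close = difflib.get_close_matches(normalized, sorted(VOCABULARY), n=3, cutoff=0.6)
    return None, close

print(validate_tag("Daguerreotype"))  # ('daguerreotype', [])
print(validate_tag("dagerotype"))     # (None, ['daguerreotype'])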