ruochenx's Collections
DPO dataset

updated Sep 10, 2024
  • argilla/ultrafeedback-binarized-preferences-cleaned

    Viewer • Updated Dec 11, 2023 • 60.9k • 2.91k • 157

  • mlabonne/orpo-dpo-mix-40k

    Viewer • Updated Oct 17, 2024 • 44.2k • 410 • 297

  • zake7749/kyara-chinese-preference-rl-dpo-s0-30K

    Viewer • Updated Sep 7, 2024 • 30.2k • 9 • 3
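The datasets above are preference datasets for DPO (Direct Preference Optimization) training. While exact column names vary per dataset, DPO data commonly follows a prompt/chosen/rejected record shape. Below is a minimal sketch of that shape in plain Python; the helper name and example strings are illustrative, not taken from any of these datasets.

```python
def make_preference_record(prompt: str, chosen: str, rejected: str) -> dict:
    """Return one DPO training example as a plain dict.

    `chosen` is the preferred response to `prompt`;
    `rejected` is the dispreferred one.
    """
    return {"prompt": prompt, "chosen": chosen, "rejected": rejected}


# Illustrative example record (not drawn from the listed datasets).
record = make_preference_record(
    prompt="Explain what a preference dataset is.",
    chosen="A preference dataset pairs a prompt with a preferred and a "
           "dispreferred response, used to align models with human choices.",
    rejected="No idea.",
)
print(sorted(record.keys()))
```

In practice, any of the listed datasets can be fetched with the Hugging Face `datasets` library, e.g. `load_dataset("mlabonne/orpo-dpo-mix-40k")`; check each dataset card for its actual schema before training.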