LHK_DPO_v1 was trained via Direct Preference Optimization (DPO), starting from https://huggingface.co/TomGrc/FusionNet_7Bx2_MoE_14B.
The original model is available at https://huggingface.co/HanNayeoniee/LHK_DPO_v1.
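
For context, below is a minimal sketch of how a DPO fine-tune like this might be produced with Hugging Face TRL's `DPOTrainer`. Only the base checkpoint name comes from the link above; the preference dataset, hyperparameters, and output directory are illustrative placeholders, not the settings actually used to train LHK_DPO_v1.

```python
# Minimal DPO fine-tuning sketch using Hugging Face TRL (recent API).
# The dataset, beta, batch size, and output path are illustrative only,
# not the actual LHK_DPO_v1 training configuration.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

base_id = "TomGrc/FusionNet_7Bx2_MoE_14B"  # base checkpoint named above
model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained(base_id)

# Any preference dataset with "prompt", "chosen", and "rejected" columns
# works; this one is just an example taken from the TRL documentation.
train_dataset = load_dataset("trl-lib/ultrafeedback_binarized", split="train")

args = DPOConfig(
    output_dir="lhk-dpo-sketch",     # hypothetical output directory
    beta=0.1,                        # DPO temperature (illustrative value)
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    num_train_epochs=1,
)

# With ref_model omitted, DPOTrainer keeps a frozen copy of the base model
# as the reference policy for the DPO loss.
trainer = DPOTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    processing_class=tokenizer,
)
trainer.train()
```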