Llama-3-Taiwan-8B is an 8B-parameter model based on the Llama-3 architecture, finetuned on a large corpus of Traditional Mandarin and English data. It achieves state-of-the-art performance on a range of Traditional Mandarin NLP benchmarks.