Supports translation between English, French, Chinese (Mandarin) and Japanese.
534 Pulls · Updated 3 months ago
55228e97cdb9 · 6.3GB

model: arch llama · parameters 8.03B · quantization Q4_K_M · 6.3GB
params: {"stop": ["<|start_header_id|>", "<|end_header_id|>", "<|eot_id|>", … (128B)
template: {{ if .System }}<|start_header_id|>system<|end_header_id|> {{ .System }}<|eot_id|>{{ end }}{{ if … (260B)
system: You are a highly skilled professional translator. You are a native speaker of English, Japanese, Fre… (926B)
Readme
This model is a fine-tuned version of meta-llama/Meta-Llama-3.1-8B-Instruct, AWQ quantized and converted to GGUF so that it can run even without a GPU.
Supports translation between English, French, Chinese (Mandarin) and Japanese.
There are many varieties of Chinese, with Mandarin being the most commonly used, so use "Mandarin" rather than "Chinese" in your prompts.
Original
https://huggingface.co/dahara1/llama-translate-gguf
Examples
>>> Translate English to Japanese: Heaven helps those who help themselves.
天は自助の人を助ける。
>>> /clear
Cleared session context
>>> Translate English to Japanese: Rome wasn’t built in a day.
ローマは一日にして成らず。
Python
import ollama

model = "7shi/llama-translate:8b-q4_K_M"

def translation(instruction, input_text):
    prompt = f"""### Instruction:
{instruction}
### Input:
{input_text}
### Response:
"""
    messages = [{"role": "user", "content": prompt}]
    try:
        # Send the POST request and capture the response
        response = ollama.chat(model=model, messages=messages)
        # print(response)
    except ollama.ResponseError as e:
        # If the request failed, report the error and bail out
        print("Error:", e.error)
        return None
    # Extract the 'content' field from the response
    response_content = response["message"]["content"].strip()
    return response_content

if __name__ == "__main__":
    translated_line = translation("Translate Japanese to English.", "アメリカ代表が怒涛の逆転劇で五輪5連覇に王手…セルビア下し開催国フランス代表との決勝へ")
    print(translated_line)
    translated_line = translation("Translate Japanese to Mandarin.", "石川佳純さんの『中国語インタビュー』に視聴者驚き…卓球女子の中国選手から笑顔引き出し、最後はハイタッチ「めちゃ仲良し」【パリオリンピック】")
    print(translated_line)
    translated_line = translation("Translate Japanese to French.", "開催国フランス すでに史上最多のメダル数に パリオリンピック")
    print(translated_line)
    translated_line = translation("Translate English to Japanese.", "U.S. Women's Volleyball Will Try For Back-to-Back Golds After Defeating Rival Brazil in Five-Set Thriller")
    print(translated_line)
    translated_line = translation("Translate Mandarin to Japanese.", "2024巴黎奥运中国队一日三金!举重双卫冕,花游历史首金,女曲再创辉煌")
    print(translated_line)
    translated_line = translation("Translate French to Japanese.", "Handball aux JO 2024 : Laura Glauser et Hatadou Sako, l’assurance tous risques de l’équipe de France")
    print(translated_line)
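If you want to see the translation appear as it is generated, the same request can be streamed. Below is a minimal sketch assuming the ollama Python package's stream=True option; translation_stream is a hypothetical helper name, not part of the original example.

import ollama

model = "7shi/llama-translate:8b-q4_K_M"

def translation_stream(instruction, input_text):
    # Hypothetical streaming variant of translation() above; same prompt format
    prompt = f"""### Instruction:
{instruction}
### Input:
{input_text}
### Response:
"""
    messages = [{"role": "user", "content": prompt}]
    # With stream=True, ollama.chat returns an iterator of partial responses
    for chunk in ollama.chat(model=model, messages=messages, stream=True):
        print(chunk["message"]["content"], end="", flush=True)
    print()

if __name__ == "__main__":
    translation_stream("Translate English to French.", "Rome wasn't built in a day.")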
Modelfile
FROM llama-translate.f16.Q4_K_M.gguf
SYSTEM "You are a highly skilled professional translator. You are a native speaker of English, Japanese, French and Mandarin. Translate the given text accurately, taking into account the context and specific instructions provided. Steps may include hints enclosed in square brackets [] with the key and value separated by a colon:. If no additional instructions or context are provided, use your expertise to consider what the most appropriate context is and provide a natural translation that aligns with that context. When translating, strive to faithfully reflect the meaning and tone of the original text, pay attention to cultural nuances and differences in language usage, and ensure that the translation is grammatically correct and easy to read. For technical terms and proper nouns, either leave them in the original language or use appropriate translations as necessary. Take a deep breath, calm down, and start translating."
TEMPLATE """{{ if .System }}<|start_header_id|>system<|end_header_id|>
{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>
{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>
{{ .Response }}<|eot_id|>"""
PARAMETER stop "<|start_header_id|>"
PARAMETER stop "<|end_header_id|>"
PARAMETER stop "<|eot_id|>"
PARAMETER stop "<|reserved_special_token"
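For reference, here is a rough sketch of what the TEMPLATE above produces before the text is handed to the model. The render() function is a hypothetical Python illustration of the template logic, not how Ollama actually evaluates its Go templates, and SYSTEM_PROMPT stands in for the full SYSTEM string from the Modelfile.

def render(system, prompt):
    # Mirrors the TEMPLATE: optional system block, then the user prompt,
    # then the assistant header where the model begins generating
    text = ""
    if system:
        text += f"<|start_header_id|>system<|end_header_id|>\n{system}<|eot_id|>"
    if prompt:
        text += f"<|start_header_id|>user<|end_header_id|>\n{prompt}<|eot_id|>"
    text += "<|start_header_id|>assistant<|end_header_id|>\n"
    return text

SYSTEM_PROMPT = "You are a highly skilled professional translator. ..."  # abbreviated here
print(render(SYSTEM_PROMPT, "Translate English to Japanese: Rome wasn't built in a day."))

The PARAMETER stop lines tell Ollama to cut generation at <|eot_id|> and the header markers, so the reply contains only the translation itself.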
More details (in Japanese):