MIT Licensed 400 Billion Parameter model by Jabir Project

tools

8 months ago

42ed1e83d1cf · 149GB · llama · 406B · Q2_K
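As a quick sanity check, the 149GB download is consistent with the listed 406B parameter count at the Q2_K quantization level, which works out to just under 3 bits per weight on average (treating GB as 10^9 bytes; the exact per-tensor bit mix in a Q2_K file varies):

```python
# Back-of-the-envelope check: effective bits per weight for a
# 406B-parameter model stored in roughly 149 GB.
params = 406e9          # parameter count from the model metadata
size_bytes = 149e9      # download size, treating GB as 10^9 bytes
bits_per_weight = size_bytes * 8 / params
print(round(bits_per_weight, 2))  # roughly 2.9 bits per weight
```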
system: You're Jabir's 400B LLM, a multilingual model with 400 billion parameters, developed by Muhammadreza …

license: MIT License (full text truncated on this page)

params: { "stop": ["<|start_header_id|>", "<|end_header_id|>", "<|eot_id|>"] }

template: Llama 3 chat template (truncated on this page)

Readme

Jabir 400B

Introduction

This model is part of the Jabir Project by Muhammadreza Haghiri: a large model trained on multiple languages, with the goal of democratizing AI.

After the success of Haghiri's other project, Mann-E, he gathered a team to work on a large language model with a good understanding of the Persian language (alongside other languages).

Jabir has 405 billion parameters, is based on LLaMA 3.1, and was not openly available until now. The project will continue to improve, and this repository will be updated accordingly.
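Since Jabir is based on LLaMA 3.1, prompts follow the Llama 3 chat format, whose header and end-of-turn markers (`<|start_header_id|>`, `<|end_header_id|>`, `<|eot_id|>`) are this model's declared stop tokens. A minimal sketch of assembling such a prompt by hand (the layout follows the standard Llama 3 template; the helper name is illustrative):

```python
# Build a Llama 3-style chat prompt by hand. The special tokens below
# are the standard Llama 3 markers, which this model uses as stop tokens.

def build_prompt(system: str, user: str) -> str:
    parts = []
    if system:
        parts.append("<|start_header_id|>system<|end_header_id|>\n\n"
                     f"{system}<|eot_id|>")
    parts.append("<|start_header_id|>user<|end_header_id|>\n\n"
                 f"{user}<|eot_id|>")
    # Leave the assistant header open so the model generates the reply.
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "<|begin_of_text|>" + "".join(parts)

prompt = build_prompt("You are a helpful assistant.", "Hello!")
```

In practice Ollama applies the template automatically; this just makes explicit where the stop tokens from the model's parameters fit into a conversation turn.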

Benchmarks

License

This model has been released under MIT License.

Links