Issues: hiyouga/LLaMA-Factory
Command 'llamafactory-cli webui' fails to start when HTTP_PROXY is set
Labels: bug (Something isn't working), pending (This problem is yet to be addressed)
#6739 · opened Jan 22, 2025 by qiangyt · 1 task done

Request: support qwen2-audio
Labels: enhancement (New feature or request), pending (This problem is yet to be addressed)
#6738 · opened Jan 22, 2025 by superFilicos · 1 task done

Question: is fine-tuning of the DeepSeek-R1-Distill series models supported?
Labels: enhancement (New feature or request), pending (This problem is yet to be addressed)
#6737 · opened Jan 22, 2025 by CsBoBoNice · 1 task done

After LoRA training, the model produces repetitive output
Labels: bug (Something isn't working), pending (This problem is yet to be addressed)
#6736 · opened Jan 22, 2025 by VectorGready · 1 task done

How to skip a single sample?
Labels: bug (Something isn't working), pending (This problem is yet to be addressed)
#6732 · opened Jan 21, 2025 by sxj1215 · 1 task done

qwen2.5-32B-Instruct: saving the model fails after training finishes on two NPU 910B machines
Labels: bug (Something isn't working), npu (This problem is related to NPU devices), pending (This problem is yet to be addressed)
#6726 · opened Jan 21, 2025 by chriszhangmq · 1 task done

Fine-tuning llama3.1-instruction on Windows: an error is reported after training starts and fine-tuning cannot launch
Labels: bug (Something isn't working), pending (This problem is yet to be addressed)
#6725 · opened Jan 21, 2025 by LJXCMQ · 1 task done

Support for label smoothing
Labels: enhancement (New feature or request), pending (This problem is yet to be addressed)
#6719 · opened Jan 20, 2025 by 652994331 · 1 task done

Please add bitune support
Labels: enhancement (New feature or request), pending (This problem is yet to be addressed)
#6718 · opened Jan 20, 2025 by chaitu0032 · 1 task done

qwen2_vl: video file inputs are not handled when running inference with the vllm engine
Labels: bug (Something isn't working), enhancement (New feature or request), pending (This problem is yet to be addressed)
#6708 · opened Jan 20, 2025 by imwxc · 1 task done

Parameter size mismatch after fine-tuning Llama-3-8B-Instruct
Labels: bug (Something isn't working), pending (This problem is yet to be addressed)
#6700 · opened Jan 18, 2025 by bw-wang19 · 1 task done

Size mismatch error in the middle of training
Labels: bug (Something isn't working), pending (This problem is yet to be addressed)
#6699 · opened Jan 18, 2025 by NicoZenith · 1 task done

Process is automatically killed with no error message
Labels: bug (Something isn't working), pending (This problem is yet to be addressed)
#6687 · opened Jan 17, 2025 by duyu09 · 1 task done

deepspeed: multi-node, multi-GPU training without SSH in a container environment
Labels: bug (Something isn't working), pending (This problem is yet to be addressed)
#6685 · opened Jan 17, 2025 by Justin-12138 · 1 task done

Fine-tuning with PiSSA fails in LLaMA Board, but works from the command line
Labels: bug (Something isn't working), pending (This problem is yet to be addressed)
#6656 · opened Jan 15, 2025 by chuangzhidan · 1 task done

HCCL error when fine-tuning on two NPU machines with multiple cards
Labels: bug (Something isn't working), npu (This problem is related to NPU devices), pending (This problem is yet to be addressed)
#6646 · opened Jan 15, 2025 by AlbertWang001 · 1 task done

[Feature] Add option to run llamafactory-cli webui in background
Labels: enhancement (New feature or request), pending (This problem is yet to be addressed)
#6644 · opened Jan 14, 2025 by steveepreston

Even when run inside a venv, it still only runs in the default environment
Labels: bug (Something isn't working), pending (This problem is yet to be addressed)
#6604 · opened Jan 11, 2025 by cbb2625274797 · 1 task done

mistral-of-depth fine-tuning error: forward() takes from 7 to 8 positional arguments but 9 were given
Labels: pending (This problem is yet to be addressed)
#6557 · opened Jan 7, 2025 by jingtian11 · 1 task done

Kernel and pip compatibility issue under docker
Labels: pending (This problem is yet to be addressed)
#6554 · opened Jan 7, 2025 by SDAIer · 1 task done

Custom loss function raises an error
Labels: pending (This problem is yet to be addressed)
#6534 · opened Jan 6, 2025 by rocket2q19 · 1 task done

resume_from_checkpoint: OOM killed
Labels: pending (This problem is yet to be addressed)
#6486 · opened Dec 30, 2024 by sunrise224 · 1 task done