[Bugfix][Kernel] Fix incorrect output tokens when running ChatGLM-9b inference on MI250 #1121
CI status: the job succeeded with 1 warning.