local chatglm api #3
Conversation
apis/tests/chatglm_test.py
Outdated
@@ -0,0 +1,13 @@
import sys
sys.path.append('/home/jerry_kon/my_chatglm/ChatLLM')
Remove this.
apis/tests/chatglm_test.py
Outdated
@@ -0,0 +1,13 @@
import sys
sys.path.append('/home/jerry_kon/my_chatglm/ChatLLM')
from apis.api import ChatLLM
Format this with Python lint tools.
apis/tests/chatglm_test.py
Outdated
import sys
sys.path.append('/home/jerry_kon/my_chatglm/ChatLLM')
from apis.api import ChatLLM
chat = ChatLLM(
As a unit test, this won't work because we have no model in this project. I'm still figuring out the proper approach, so for now just remove this test.
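One possible direction, offered only as a sketch, is to stub the model so the test never needs real weights. The snippet below assumes ChatLLM exposes a `completion(system_prompt, user_prompt)` method, which is not confirmed by this diff:

```python
# Hedged sketch: assumes ChatLLM has a completion(system_prompt, user_prompt)
# method; that method is stubbed so no model weights are ever loaded.
from unittest import mock

from apis.api import ChatLLM


def test_completion_without_a_real_model():
    with mock.patch.object(ChatLLM, "completion", return_value="stubbed reply"):
        chat = ChatLLM.__new__(ChatLLM)  # bypass __init__ so no model is loaded
        assert chat.completion("system prompt", "user prompt") == "stubbed reply"
```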
apis/tests/llama_test.py
Outdated
@@ -0,0 +1,13 @@
import sys
sys.path.append('/home/jerry_kon/my_chatglm/ChatLLM')
Remove this.
llms/chatglm.py
Outdated
@@ -0,0 +1,22 @@
from transformers import AutoTokenizer, AutoModel

#if not history:
Remove this, it's ugly.
llms/chatglm.py
Outdated
self.tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, trust_remote_code=True)
self.model = AutoModel.from_pretrained(model_name_or_path, trust_remote_code=True).half().cuda()
self.model = self.model.eval()
def completion(self,system_prompt, user_prompt):
Please format this code.
llms/chatglm.py
Outdated
@@ -0,0 +1,20 @@
from transformers import AutoTokenizer, AutoModel


class ChatglmChat:
This should inherit from APIChat.
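For illustration, a minimal sketch of the requested inheritance; the APIChat import path and its interface are assumptions, since neither appears in this diff:

```python
# Sketch only: the location and interface of APIChat are assumed, not taken from this PR.
from apis.api import APIChat  # assumed import path
from transformers import AutoModel, AutoTokenizer


class ChatglmChat(APIChat):
    def __init__(self, model_name_or_path):
        super().__init__()
        self.tokenizer = AutoTokenizer.from_pretrained(
            model_name_or_path, trust_remote_code=True
        )
        # Load the local ChatGLM weights in half precision on the GPU.
        self.model = (
            AutoModel.from_pretrained(model_name_or_path, trust_remote_code=True)
            .half()
            .cuda()
            .eval()
        )
```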
llms/chatglm.py
Outdated
self.model = (
    AutoModel.from_pretrained(model_name_or_path, trust_remote_code=True)
    .half()
    .cuda()
)
Suggested change:
- self.model = (
-     AutoModel.from_pretrained(model_name_or_path, trust_remote_code=True)
-     .half()
-     .cuda()
- )
+ self.model = AutoModel.from_pretrained(model_name_or_path, trust_remote_code=True).half().cuda()
apis/tests/llama_test.py
Outdated
@@ -0,0 +1,11 @@
from apis.api import ChatLLM
Format this.
apis/tests/llama_test.py
Outdated
@@ -0,0 +1,11 @@
from apis.api import ChatLLM
chat = ChatLLM(
Please refer to the Python testing documentation for how to write a test. Basically, it should look like:

class TestXXX:
    def test_xxx(self):
        pass
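For example, a minimal pytest-style sketch; the ChatLLM constructor arguments and the completion signature below are illustrative assumptions, not taken from this PR:

```python
# Illustrative sketch: the ChatLLM constructor argument is hypothetical,
# and completion(system_prompt, user_prompt) is assumed from the ChatglmChat class.
from apis.api import ChatLLM


class TestChatLLM:
    def test_completion_returns_text(self):
        chat = ChatLLM(model_name_or_path="llama")  # hypothetical argument name
        result = chat.completion("You are a helpful assistant.", "Hello")
        assert isinstance(result, str)
```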
Please move this to a top-level ChatLLM/integration_tests/ directory, which means you should create a new folder.
Also add a test for ChatGLM.
llms/chatglm.py
        )
        self.model = self.model.eval()

    def completion(self, system_prompt, user_prompt):
Add a warning here if we do provide the system_prompt.
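A minimal sketch of such a warning; the method body below is an assumption about how completion forwards the prompt to ChatGLM's chat interface, not the actual implementation in this PR:

```python
import warnings


def completion(self, system_prompt, user_prompt):
    # The local ChatGLM chat interface takes no separate system prompt,
    # so warn the caller if one is passed (assumed behavior, per the review comment).
    if system_prompt:
        warnings.warn("system_prompt is ignored by the local ChatGLM backend.")
    response, _history = self.model.chat(self.tokenizer, user_prompt, history=[])
    return response
```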
Add a feature for a local ChatGLM API.
Add the relevant test program.