For the implementation of LLM core(s), we design LLMCore as the base class shown below, which defines several key functionalities.
from abc import ABC, abstractmethod

class LLMCore(ABC):
    def __init__(self,
                 llm_name: str,
                 max_gpu_memory: dict = None,
                 eval_device: str = None,
                 max_new_tokens: int = 256,
                 log_mode: str = "console",
                 use_context_manager: bool = False):
        """ Initialize LLMCore with model configurations """
        pass

    @abstractmethod
    def load_llm_and_tokenizer(self) -> None:
        """ Load the LLM model and tokenizer """
        pass

    def tool_calling_input_format(self, prompt: list, tools: list) -> list:
        """ Format prompts to include tool information """
        pass

    def parse_tool_calls(self, tool_call_str):
        """ Parse and add tool call identifiers for models without tool support """
        pass

    @abstractmethod
    def address_syscall(self, llm_syscall, temperature=0.0):
        """ Process the syscall sent to the LLM """
        pass

    @abstractmethod
    def llm_generate(self, prompt, temperature=0.0):
        """ Generate a response based on the provided prompt """
        pass
Different instances are implemented by inheriting this LLMCore class and overriding its abstract methods, as sketched below.
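The following is a minimal sketch of such a concrete instance, assuming a Hugging Face transformers backend. The subclass name HfLLMCore, the assumption that __init__ stores llm_name and max_new_tokens as attributes, and the llm_syscall.query field are illustrative and not part of the original implementation.

# Hypothetical concrete core; illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

class HfLLMCore(LLMCore):
    def load_llm_and_tokenizer(self) -> None:
        # Load the model weights and tokenizer named by llm_name
        # (assumes the base __init__ stored llm_name on self).
        self.tokenizer = AutoTokenizer.from_pretrained(self.llm_name)
        self.model = AutoModelForCausalLM.from_pretrained(self.llm_name)

    def llm_generate(self, prompt, temperature=0.0):
        # Tokenize the prompt, generate a continuation, and decode it.
        inputs = self.tokenizer(prompt, return_tensors="pt")
        if temperature > 0.0:
            outputs = self.model.generate(
                **inputs,
                max_new_tokens=self.max_new_tokens,
                do_sample=True,
                temperature=temperature,
            )
        else:
            # Greedy decoding when temperature is zero.
            outputs = self.model.generate(
                **inputs, max_new_tokens=self.max_new_tokens
            )
        return self.tokenizer.decode(outputs[0], skip_special_tokens=True)

    def address_syscall(self, llm_syscall, temperature=0.0):
        # Route the syscall's prompt (assumed to be llm_syscall.query)
        # to the generator and return the response text.
        return self.llm_generate(llm_syscall.query, temperature=temperature)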