Files
assist/config/__pycache__/llm_config.cpython-311.pyc


# Bytecode dump of D:\code\assist\config\llm_config.py (CPython 3.11).
# Source reconstructed from the strings in the .pyc; values marked
# "not recoverable" or "assumed" could not be read back from the dump.

"""LLM configuration file - Qwen model configuration."""
from src.agent.llm_client import LLMConfig

# Qwen via DashScope's OpenAI-compatible endpoint
QWEN_CONFIG = LLMConfig(
    provider="qwen",
    api_key="sk-c0dbefa1718d46eaa897199135066f00",
    base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
    model="qwen-plus-latest",
    temperature=0.7,
    max_tokens=...,  # numeric value not recoverable from the dump
)

OPENAI_CONFIG = LLMConfig(
    provider="openai",
    api_key="sk-your-openai-api-key-here",  # placeholder key
    model="gpt-3.5-turbo",
)

ANTHROPIC_CONFIG = LLMConfig(
    provider="anthropic",
    api_key="sk-ant-your-anthropic-api-key-here",  # placeholder key
    model="claude-3-sonnet-20240229",
)

# Qwen is the default provider
DEFAULT_CONFIG = QWEN_CONFIG


def get_default_llm_config() -> LLMConfig:
    """
    Get the default LLM configuration.

    Prefer the unified configuration manager; if that fails, fall back
    to the local configuration.
    """
    try:
        from src.config.unified_config import get_config
        config = get_config()
        llm_dict = config.get_llm_config()
        # .get() fallback values assumed to mirror DEFAULT_CONFIG;
        # the original default constants are not recoverable.
        return LLMConfig(
            provider=llm_dict.get("provider", DEFAULT_CONFIG.provider),
            api_key=llm_dict.get("api_key", DEFAULT_CONFIG.api_key),
            base_url=llm_dict.get("base_url", DEFAULT_CONFIG.base_url),
            model=llm_dict.get("model", DEFAULT_CONFIG.model),
            temperature=llm_dict.get("temperature", DEFAULT_CONFIG.temperature),
            max_tokens=llm_dict.get("max_tokens", DEFAULT_CONFIG.max_tokens),
        )
    except Exception:
        return DEFAULT_CONFIG