Remove `max_input_tokens=40k` from the LLM config and set `max_prompt_token=50k` as the default for our agents.
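A minimal sketch of the requested change, assuming the config is a plain key-value mapping; the function name and config structure here are hypothetical, not the actual schema:

```python
def update_llm_config(config: dict) -> dict:
    """Hypothetical helper: drop the max_input_tokens cap and
    default max_prompt_token to 50k for agent configs."""
    updated = dict(config)
    updated.pop("max_input_tokens", None)        # remove the 40k input-token cap
    updated.setdefault("max_prompt_token", 50_000)  # new default for our agents
    return updated

cfg = update_llm_config({"model": "example-model", "max_input_tokens": 40_000})
# cfg no longer has max_input_tokens; max_prompt_token defaults to 50_000
```

Using `setdefault` keeps any explicitly configured `max_prompt_token` intact, so only agents without an existing value pick up the 50k default.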