Show HN: My LLM CLI tool can run tools now, from Python code or plugins
- #Python
- #Tools
- #LLM
- LLM 0.26 introduces support for tools, allowing models to call Python functions from both the command-line interface and the Python library.
- Tools can be installed via plugins and work across models from OpenAI, Anthropic, and Gemini, as well as local models run through Ollama.
- The update adds --tool/-T for loading plugin tools by name and --functions for passing Python function code directly on the command line.
- The Python API supports tools in both sync and async contexts, with a new chain() method that executes tool calls and feeds the results back to the model.
- Built-in example tools include llm_version and llm_time, while more complex capabilities ship as plugins such as llm-tools-simpleeval for evaluating mathematical expressions.
- Ad-hoc tools can be defined using --functions, enabling custom functionality like blog searches directly from the CLI.
- The update addresses LLMs' limitations in tasks like mathematics by integrating external tools for accurate calculations.
- Future plans include Model Context Protocol (MCP) support and improvements to tool execution logs and plugin development.
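The Python-library side of the release can be sketched roughly as follows. This is a minimal illustration, not code from the release notes: the model name, prompt, and `multiply` tool are assumptions, and actually running it requires the `llm` package plus a configured API key.

```python
def multiply(x: int, y: int) -> int:
    """Multiply two integers.

    The function's docstring and type hints describe the tool
    to the model, which decides when to call it.
    """
    return x * y

def run_demo() -> str:
    # Imported lazily so the sketch can be read and its pure parts
    # exercised without the llm package installed.
    import llm

    model = llm.get_model("gpt-4.1-mini")  # illustrative model name
    # chain() runs any tool calls the model requests and feeds each
    # result back until the model produces a final answer.
    response = model.chain("What is 34234 * 213345?", tools=[multiply])
    return response.text()

# print(run_demo())  # uncomment once an API key is configured
```

The same idea works from the CLI, e.g. by naming a plugin tool with `-T` or passing the function source via `--functions`; wiring the multiplication through a tool sidesteps the model's tendency to get large arithmetic wrong.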