Software & Apps

JosefAlbers / VimLM: VimLM is a Vim plugin that provides an LLM-powered editing assistant, letting users prompt a local LLM from within Vim while automatically ingesting code context.

vimlm

An LLM-powered coding companion for Vim, inspired by GitHub Copilot / Cursor. It integrates contextual code understanding, summarization, and AI assistance directly into your workflow.

  • Model agnostic – use any MLX-compatible model via a configuration file
  • Vim-native UX – intuitive keybindings and split-window responses
  • Deep context – understands code context from:
    • Current file
    • Visual selections
    • Referenced files
    • Project directory structure
  • Conversational – iterative refinement with follow-up queries
  • Air-gapped security – 100% offline; no APIs, no tracking, no data leaks
Requirements:

  • Apple M-series chip
  • Python 3.12.8
1. From Normal Mode

  • Ctrl-l: Add current line + file to context
    Example prompt: "Regex for removing HTML tags from item.content"

2. From Visual Mode

  • Select code → Ctrl-l: Add selected block + current file to context
    Example prompt: "Convert this to async/await syntax"

3. Follow-up conversations

  • Ctrl-j: Continue the current thread
    Example follow-up: "Use Manifest V3 instead"

4. Code extraction and replacement

  • Ctrl-p: Insert code blocks from the response into:
    • Last visual selection (Normal mode)
    • Active selection (Visual mode)

Example workflow:

  1. Select a block of code in Visual mode
  2. Press Ctrl-l and prompt: "Convert this to async/await syntax"
  3. Press Ctrl-p to replace the selection with the generated code

!include – Add external context

!include (PATH)  # Add files/folders to context
  • !include (no path): current folder
  • !include ~/projects/utils.py: specific file
  • !include ~/docs/api-specs/: entire folder
    Example: "AJAX-ify this app !include ~/scrap/hypermedia-applications.summ.md"
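To make the !include behavior concrete, here is a minimal sketch of how such an expansion could work: each directive is replaced with the contents of the named file, or the concatenated files of a named folder. The function name expand_includes is invented for illustration; this is not VimLM's actual implementation.

```python
import os
import re

def expand_includes(prompt):
    """Replace each '!include PATH' in a prompt with file contents.

    Hypothetical sketch of the behavior described above; VimLM's real
    implementation may differ.
    """
    def load(match):
        path = os.path.expanduser(match.group(1))
        if os.path.isdir(path):
            # Folder: concatenate every regular file inside it.
            parts = []
            for name in sorted(os.listdir(path)):
                full = os.path.join(path, name)
                if os.path.isfile(full):
                    with open(full) as f:
                        parts.append(f.read())
            return "\n".join(parts)
        with open(path) as f:
            return f.read()
    # A function replacement keeps file contents literal (no backslash escaping).
    return re.sub(r"!include\s+(\S+)", load, prompt)
```

A folder include simply concatenates its files, which matches the "entire folder" case above.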

!deploy – Create project files

!deploy (DEST_DIR)  # Extract code blocks to directory
  • !deploy (no path): current directory
  • !deploy ./src: Specific directory
    Example: "Create REST API endpoint !deploy ./api"
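As a sketch of what a !deploy-style command involves, the snippet below extracts fenced code blocks from an LLM response and writes them into a destination directory. The '<lang> <filename>' tag on the opening fence is an invented convention for this example, not necessarily VimLM's.

```python
import os
import re

# Fence string built indirectly so this sketch can mention markdown fences.
FENCE = chr(96) * 3

def deploy_blocks(response, dest_dir="."):
    """Write fenced code blocks from an LLM response into dest_dir.

    Hypothetical sketch of a !deploy-style command; assumes each block
    is tagged '<lang> <filename>' on its opening fence.
    """
    os.makedirs(dest_dir, exist_ok=True)
    pattern = re.compile(FENCE + r"\w+\s+(\S+)\n(.*?)" + FENCE, re.S)
    written = []
    for name, body in pattern.findall(response):
        path = os.path.join(dest_dir, name)
        with open(path, "w") as f:
            f.write(body)
        written.append(path)
    return written
```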

!continue – Continue generation

!continue (MAX_TOKENS)  # Continue a stopped response
  • !continue: default 2000 tokens
  • !continue 3000: custom token limit
    Example: "tl;dr !include large-file.txt !continue 5000"
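Parsing this command is straightforward; here is a small sketch (the function parse_continue is invented for illustration; the 2000-token default comes from the list above):

```python
import re

def parse_continue(command, default_tokens=2000):
    """Parse a '!continue [MAX_TOKENS]' command.

    Hypothetical sketch of the syntax described above; returns None for
    anything that is not a !continue command.
    """
    m = re.fullmatch(r"!continue(?:\s+(\d+))?", command.strip())
    if m is None:
        return None
    return int(m.group(1)) if m.group(1) else default_tokens
```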

!followup – Continue the thread

!followup  # Equivalent to Ctrl-j

Example:
Initial: "Create Chrome extension"
Follow-up: "Add dark mode support !followup"

Combining multiple commands in a single prompt:

"Create HTMX component !include ~/lib/styles.css !deploy ./components !continue 4000"
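A prompt like the one above mixes free text with several commands. A minimal sketch of splitting the two apart might look like this (the function split_commands is invented; the sep parameter mirrors the SEP_CMD config value shown later, and VimLM's real parser may behave differently):

```python
import re

def split_commands(prompt, sep="!"):
    """Split a prompt into its free-text part and its embedded commands.

    Hypothetical sketch: recognizes the four commands from the doc
    (include, deploy, continue, followup). Returns (text, commands)
    where commands is a list of (name, argument-or-None) pairs.
    """
    pattern = re.compile(
        re.escape(sep) + r"(include|deploy|continue|followup)(?:\s+(\S+))?"
    )
    commands = [(m.group(1), m.group(2)) for m in pattern.finditer(prompt)]
    # Remove the command directives and collapse leftover whitespace.
    text = re.sub(r"\s+", " ", pattern.sub("", prompt)).strip()
    return text, commands
```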

Key binding  Mode           Action
Ctrl-l       Normal/Visual  Send current file + selection to LLM
Ctrl-j       Normal         Continue the conversation
Ctrl-p       Normal/Visual  Replace selection with generated code
Esc          Prompt         Cancel input

VimLM uses a JSON config file with the following configurable parameters:

{
  "DEBUG": true,
  "LLM_MODEL": null,
  "NUM_TOKEN": 2000,
  "SEP_CMD": "!",
  "USE_LEADER": false
}
  1. Browse models: MLX Community models on Hugging Face
  2. Edit the config file:
{
  "LLM_MODEL": "mlx-community/DeepSeek-R1-Distill-Qwen-7B-4bit",
  "NUM_TOKEN": 9999
}
  3. Save to ~/vimlm/config.json
  4. Restart VimLM
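The configuration flow above can be sketched as a merge of the user's file over the defaults. The defaults and file location come from this doc; the merge behavior itself and the function load_config are assumptions for illustration, not VimLM's documented logic.

```python
import json
import os

def load_config(path="~/vimlm/config.json"):
    """Load the config file, falling back to defaults for missing keys.

    Hypothetical sketch: default values are those shown in the doc;
    any key present in the file overrides its default.
    """
    defaults = {
        "DEBUG": True,
        "LLM_MODEL": None,
        "NUM_TOKEN": 2000,
        "SEP_CMD": "!",
        "USE_LEADER": False,
    }
    full = os.path.expanduser(path)
    if os.path.exists(full):
        with open(full) as f:
            defaults.update(json.load(f))
    return defaults
```

With this scheme a file containing only {"NUM_TOKEN": 9999} still yields a complete configuration.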

To use <Leader>-based key mappings instead of Ctrl for VimLM's keybindings, set "USE_LEADER": true in the config file.

VimLM is licensed under the Apache-2.0 License.



2025-02-15 02:34:00
