How to Install

Prerequisites

Before starting, ensure you have the following installed:

Required (All Users)

  • Neovim v0.11+ (earlier versions may work but are untested)
  • Git - for plugin management
  • ripgrep (rg) - required by Telescope for live grep and text search
  • fd - required by Telescope for fast file finding
  • A Nerd Font installed and configured in your terminal
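
A quick way to confirm the required tools are on your PATH (a sketch for POSIX shells; the Nerd Font has to be checked in your terminal settings instead):

```shell
# Report the location of each required tool, or MISSING if it is not installed.
for tool in nvim git rg fd; do
  if command -v "$tool" >/dev/null 2>&1; then
    printf '%s: %s\n' "$tool" "$(command -v "$tool")"
  else
    printf '%s: MISSING\n' "$tool"
  fi
done
```

Any tool reported as MISSING should be installed with your system package manager before continuing.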

Language-Specific

For Java Development:

  • JDK 11 or later - SDKMAN! is recommended for managing Java installations
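
To see whether a JDK is already available (a sketch; the SDKMAN! install command shown in the hint is the standard one from sdkman.io):

```shell
# Check for a JDK on PATH; suggest SDKMAN! if none is found.
if command -v java >/dev/null 2>&1; then
  java -version 2>&1 | head -n 1
else
  echo 'No JDK found - install SDKMAN! with: curl -s "https://get.sdkman.io" | bash, then run: sdk install java'
fi
```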

For Rust Development:

  • Rust toolchain - Install via rustup
  • rust-analyzer - Will be auto-installed via Mason on first use
  • codelldb - Will be auto-installed via Mason for debugging
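
You can verify the Rust toolchain the same way (a sketch; the install hint is the standard rustup bootstrap from rustup.rs):

```shell
# Check for rustup; print the active toolchain if present, or an install hint if not.
if command -v rustup >/dev/null 2>&1; then
  rustup show active-toolchain
else
  echo "rustup not found - install with: curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh"
fi
```

rust-analyzer and codelldb do not need to be installed by hand - Mason handles both on first use.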

First Time Setup

  1. Clone this configuration to your Neovim config directory:
git clone https://github.com/sshaaf/neovim.git ~/.config/nvim
  2. Open Neovim for the first time:
nvim
  3. Lazy.nvim will automatically:

    • Install all plugins
    • Java: Install nvim-jdtls and the jdtls language server via Mason
    • Rust: Install rustaceanvim, crates.nvim, rust-analyzer, and codelldb via Mason
    • Set up debugging support with nvim-dap for both languages
    • Configure neotest for testing

    Wait for all installations to complete (you’ll see progress notifications).

  4. Verify the installation by running:

:checkhealth

All checks should pass; warnings about optional dependencies can be ignored.
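
The health check can also be run non-interactively and saved for later review (a sketch; assumes nvim is on PATH, and `/tmp/nvim-health.txt` is just an example path):

```shell
# Run :checkhealth headlessly and write the report to a file.
if command -v nvim >/dev/null 2>&1; then
  nvim --headless "+checkhealth" "+write! /tmp/nvim-health.txt" +qa
  echo "Health report written to /tmp/nvim-health.txt"
else
  echo "nvim not found on PATH"
fi
```

This is handy for sharing the report when filing an issue, or for checking a fresh install over SSH.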

Optional: AI Assistant Setup

This configuration includes CodeCompanion.nvim for AI-assisted coding using self-hosted LLMs via Ollama.

Installing Ollama

  1. Install Ollama (choose your platform):

    macOS:

    brew install ollama

    Linux:

    curl -fsSL https://ollama.com/install.sh | sh

    Or download from: ollama.com

  2. Start Ollama service:

    ollama serve

  3. Pull a code model (recommended for Java development):

    ollama pull deepseek-coder:6.7b

    Alternative models:

    • ollama pull qwen2.5-coder:7b - Excellent code understanding
    • ollama pull codellama:13b - Solid all-around model
    • ollama pull starcoder2:15b - Good for completions

  4. Verify Ollama is running:

    curl http://localhost:11434/api/tags
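
The verification step above can be wrapped into a small check that also lists which models are installed (a sketch; assumes Ollama's default port 11434, and the `grep` pattern simply pulls model names out of the JSON response):

```shell
# Check the local Ollama API and list installed models.
if ! command -v curl >/dev/null 2>&1; then
  echo "curl not found on PATH"
elif curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "Ollama is up; installed models:"
  curl -fsS http://localhost:11434/api/tags | grep -o '"name":"[^"]*"' || echo "(no models pulled yet)"
else
  echo "Ollama is not reachable on localhost:11434 - is 'ollama serve' running?"
fi
```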

Using the AI Assistant

Once Ollama is running with a model downloaded:

  • <Space>ac - Open AI chat
  • <Space>ae - Edit selection with AI (visual mode)
  • <Space>ax - Explain code (visual mode)
  • <Space>aa - Show AI actions menu

See Quick Reference for all AI keybindings.