ahaggard2013/binaryninja-ollama

Binary Ninja Ollama (v1.0.8)

Author: Austin Haggard

Binary Ninja Ollama integrates with your own locally hosted Ollama server to rename functions and variables with AI.

Description:

Ollama is a tool that allows you to pull open-source AI models and run them locally. Some models require extensive computing power, while others can be run on your personal laptop. Results will vary greatly depending on the model you choose :).

Why use this over sidekick/openai?

  1. It's FREE and easy to set up locally.
  2. Did I say it's FREE?
  3. It can be run anywhere, with no internet required!
  4. Your data stays between you and your Ollama server, with no third party involved.

Features

This plugin integrates Ollama with Binary Ninja and supports the actions listed below:

  • Setting which server/port Binary Ninja should use to connect to Ollama.
    • Have a (very) high-powered gaming PC? Use it to host Ollama and point Binja at llama3:70b/gemma2:27b.
    • Running this on a laptop? Host Ollama locally, set the server to localhost, and run gemma2:latest/llama3:latest.
    • There are tons of other models to try, but I've primarily tested this with variants of llama3/gemma2 with decent results.
  • Query your locally hosted Ollama server to determine what a given function does.
    • This can be used to rename all functions in bulk, or individually targeted functions.
  • Allows users to rename variables in HLIL using Ollama.
    • This can be used to rename individual variables within an instruction.
    • This can be used to rename all variables within a function.

Installation

If you're installing this as a standalone plugin, you can place (or symlink) it in Binary Ninja's plugin path. The default paths are detailed in Vector 35's documentation.

Dependencies

  • Python 3.10+
  • ollama installed with pip3 install ollama

Ollama Server

This plugin requires access to an Ollama server (your own or one you host) with any models you would like to use already pulled down.

Follow the instructions at https://ollama.com to set up your server and pull any models you would like to try. Once this is done, a server should start automatically and be accessible at localhost:11434.
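Before pointing the plugin at a server, it can help to confirm the endpoint actually answers. The sketch below checks Ollama's model-listing endpoint (/api/tags) with only the standard library; the helper names are illustrative, not part of the plugin:

```python
import json
import urllib.request

def tags_url(host: str = "localhost", port: int = 11434) -> str:
    """Build the URL for Ollama's model-listing endpoint (/api/tags)."""
    return f"http://{host}:{port}/api/tags"

def server_is_up(host: str = "localhost", port: int = 11434) -> bool:
    """Return True if the Ollama server answers on /api/tags with valid JSON."""
    try:
        with urllib.request.urlopen(tags_url(host, port), timeout=2) as resp:
            json.load(resp)  # a parseable body means a healthy server
            return True
    except OSError:
        return False
```

Calling `server_is_up("localhost", 11434)` against a freshly installed Ollama should return True; pointing it at the wrong host or port returns False instead of raising.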

Usage

Rename all function variables

The rename all function variables option will parse all variables within a function and attempt to rename them based on the following prompt:

prompt = (
    f"In one word, what should the variable '{variable}' be named in the below Function? "
    f"The name must meet the following criteria: all lowercase letters, usable in Python code"
)
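Model replies rarely come back as a clean identifier, so the reply needs normalizing before it can be applied. A minimal sketch of that cleanup, assuming a helper of my own naming (not the plugin's actual code):

```python
import re

def clean_name(reply: str) -> str:
    """Reduce a model reply to a single lowercase Python identifier.

    Strips quotes, punctuation, and trailing prose; returns "" when the
    reply contains nothing usable as a name.
    """
    match = re.search(r"[A-Za-z_][A-Za-z0-9_]*", reply)
    return match.group(0).lower() if match else ""
```

For example, a reply of `'"Counter."'` normalizes to `counter`, which meets the prompt's all-lowercase, Python-usable criteria.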

(Screenshots: variables before and after renaming)

Rename all functions

The rename all functions option will loop through all functions, smallest to largest, within a binaryview and rename them based on the prompt:

prompt = (
    f"Given the following HLIL decompiled code snippet, provide a Python-style function name that describes what the code is doing. "
    f"The name must meet the following criteria: all lowercase letters, usable in Python code, with underscores between words. "
    f"Only return the function name and no other explanation or text data included."
)

(Screenshots: functions before and after renaming)

Rename target function

Renaming a target function uses the same prompt as renaming all functions, but limits it to the function selected when triggering the plugin.

Rename target function variable

Renaming a target variable uses the same prompt as renaming all variables, but limits it to the function selected when triggering the plugin.

Settings

Settings are triggered on the first call to any renaming operation after Binary Ninja starts, or by opening them manually. The applied settings persist within a Binary Ninja session.

The settings window allows you to set the IP, port, and model to use with Ollama. Only downloaded models are selectable.

(Screenshots: plugin settings option, connection options, and model options)
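The set of downloaded models can be read from the server's /api/tags response, which returns a JSON body with a "models" array (payload shape per Ollama's API documentation; the helper name here is illustrative):

```python
def installed_models(tags_payload: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in tags_payload.get("models", [])]

# Example payload shaped like an /api/tags response:
sample = {"models": [{"name": "llama3:latest"}, {"name": "gemma2:27b"}]}
```

Feeding the sample payload through `installed_models` yields the names a settings dropdown could present, and an empty payload yields an empty list rather than an error.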

Known Issues

  • On larger functions the AI will ignore the prompt and return large blocks of text describing the function. This is mitigated by ignoring the returned value and logging "can't rename function", but could be investigated further.
  • A non-existent or unreachable server could be handled more gracefully.

Feature Request

  • Anything you are interested in that is not included?

    • Open an issue!
    • Make a pull request.
  • Ideas:

    • Target single-variable naming instead of all variables on a line
    • Generate AI comments describing code
    • Structure recovery

License

This plugin is released under an MIT license.