
transformer-debugger (public) · 4,050 stars · 241 forks · 9 issues

Commits (branch main)

  • 87e6db7b7e73ded5037eeeff05deb5e81548a10a: Add TopK activation function (#29). TomDLT committed 7 months ago. Verified.
  • dc1f898725113bec6cf1006e48f9c5219f8fbdde: Merge pull request #26 from machina-source/main. WuTheFWasThat committed 8 months ago. Verified.
  • 5952c9ecc667fa6324402a382c45635d90b22e5a: Update model to gpt-4o. machina-source committed 8 months ago. Unverified.
  • 5dd31890f63410a920161edad7e0f815777decc7: Merge pull request #12 from eltociear/patch-1. WuTheFWasThat committed 9 months ago. Verified.
  • 547c254d9c375e2cb9a83f743973d1a8bbb9702a: Merge pull request #23 from machina-source/update-model. WuTheFWasThat committed 9 months ago. Verified.
  • 6181543162c3f9516ef6f26f0d40ef072ca94ca9: Update model to latest version. machina-source committed 9 months ago. Unverified.

README

Transformer Debugger

Transformer Debugger (TDB) is a tool developed by OpenAI's Superalignment team with the goal of supporting investigations into specific behaviors of small language models. The tool combines automated interpretability techniques with sparse autoencoders.

TDB enables rapid exploration before needing to write code, with the ability to intervene in the forward pass and see how it affects a particular behavior. It can be used to answer questions like, "Why does the model output token A instead of token B for this prompt?" or "Why does attention head H attend to token T for this prompt?" It does so by identifying specific components (neurons, attention heads, autoencoder latents) that contribute to the behavior, showing automatically generated explanations of what causes those components to activate most strongly, and tracing connections between components to help discover circuits.
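On the sparse-autoencoder side, one common way to enforce sparsity is a TopK activation (the commit history above includes "Add TopK activation function"): only the k largest pre-activations are kept, and the rest are zeroed, so at most k latents are active for any input. Below is a minimal sketch in plain Python; the function name and signature are illustrative, not the repository's actual API.

```python
def topk_activation(x, k):
    """Keep the k largest values of x and zero out the rest.

    Illustrative sketch of a TopK sparsity mechanism: at most k
    latents remain active for a given input.
    """
    if k <= 0:
        return [0.0] * len(x)
    # The k-th largest value serves as the cutoff.
    threshold = sorted(x, reverse=True)[min(k, len(x)) - 1]
    kept = 0
    out = []
    for v in x:
        # Keep values at or above the cutoff, up to k of them
        # (ties beyond the k-th slot are zeroed).
        if v >= threshold and kept < k:
            out.append(v)
            kept += 1
        else:
            out.append(0.0)
    return out

print(topk_activation([0.1, 3.0, -1.2, 2.5, 0.7], 2))  # [0.0, 3.0, 0.0, 2.5, 0.0]
```

A real implementation would operate on tensors and be differentiable through the kept values, but the selection rule is the same.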

These videos give an overview of TDB and show how it can be used to investigate indirect object identification in GPT-2 small.

What's in the release?

  • Neuron viewer: A React app that hosts TDB as well as pages with information about individual model components (MLP neurons, attention heads and autoencoder latents for both).
  • Activation server: A backend server that performs inference on a subject model to provide data for TDB. It also reads and serves data from public Azure buckets.
  • Models: A simple inference library for GPT-2 models and their autoencoders, with hooks to grab activations.
  • Collated activation datasets: top-activating dataset examples for MLP neurons, attention heads and autoencoder latents.
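The "hooks to grab activations" pattern mentioned above can be sketched generically: a layer exposes a registration point, and callers attach functions that receive intermediate outputs during the forward pass. The class and method names below are illustrative, not the repository's actual interface.

```python
class HookedLayer:
    """A toy layer that lets callers register forward hooks,
    mimicking how inference libraries expose intermediate
    activations (names here are illustrative)."""

    def __init__(self, weight):
        self.weight = weight
        self.hooks = []

    def register_forward_hook(self, fn):
        # fn is called with the layer's output on every forward pass.
        self.hooks.append(fn)

    def forward(self, xs):
        out = [x * self.weight for x in xs]
        for fn in self.hooks:
            fn(out)
        return out

# Capture activations without modifying the layer itself.
captured = []
layer = HookedLayer(weight=2.0)
layer.register_forward_hook(lambda out: captured.append(list(out)))
layer.forward([1.0, 2.0, 3.0])
print(captured)  # [[2.0, 4.0, 6.0]]
```

The same idea lets a debugger both read activations and, by having hooks return modified values, intervene in the forward pass as described above.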

Setup

Follow these steps to set up the repo. You'll need Python with pip, as well as Node.js with npm.

Though optional, we recommend you use a virtual environment or equivalent:

# If you're already in a venv, deactivate it.
deactivate
# Create a new venv.
python -m venv ~/.virtualenvs/transformer-debugger
# Activate the new venv.
source ~/.virtualenvs/transformer-debugger/bin/activate

Once your environment is set up, run the following:

git clone git@github.com:openai/transformer-debugger.git
cd transformer-debugger

# Install neuron_explainer
pip install -e .

# Set up the pre-commit hooks.
pre-commit install

# Install neuron_viewer.
cd neuron_viewer
npm install
cd ..

To run the TDB app, you'll then need to follow the instructions to set up the activation server backend and neuron viewer frontend.

Making changes

To validate changes:

  • Run pytest
  • Run mypy --config=mypy.ini .
  • Run the activation server and neuron viewer, and confirm that basic functionality (TDB, neuron viewer pages) still works

Links

How to cite

Please cite as:

Mossing, et al., “Transformer Debugger”, GitHub, 2024.

BibTeX citation:

@misc{mossing2024tdb,
  title={Transformer Debugger},
  author={Mossing, Dan and Bills, Steven and Tillman, Henk and Dupré la Tour, Tom and Cammarata, Nick and Gao, Leo and Achiam, Joshua and Yeh, Catherine and Leike, Jan and Wu, Jeff and Saunders, William},
  year={2024},
  publisher={GitHub},
  howpublished={\url{https://github.com/openai/transformer-debugger}},
}