Efficient Automated Circuit Discovery in Transformers using Contextual Decomposition

๐Ÿ“… 2024-07-01
๐Ÿ“ˆ Citations: 2
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Existing circuit discovery methods for large language models (LLMs) suffer from high computational overhead, substantial approximation error, and reliance on non-vanishing gradients. To address these limitations, the authors propose Contextual Decomposition for Transformers (CD-T), a recursive contribution-decomposition and pruning framework that requires neither activation patching nor gradient constraints. CD-T enables fully automated circuit discovery at arbitrary levels of abstraction, down to individual attention heads at specific sequence positions. On three benchmark tasks, CD-T achieves an average ROC AUC of 97%, reduces circuit discovery runtime from hours to seconds, recovers circuits that are 80% more faithful than random circuits, and perfectly replicates the original model's behavior (faithfulness = 1.0) using fewer nodes than baseline methods.

๐Ÿ“ Abstract
Automated mechanistic interpretation research has attracted great interest due to its potential to scale explanations of neural network internals to large models. Existing automated circuit discovery work relies on activation patching or its approximations to identify subgraphs in models for specific tasks (circuits). These methods often suffer from slow runtime, approximation errors, and specific metric requirements, such as non-zero gradients. In this work, we introduce contextual decomposition for transformers (CD-T) to build interpretable circuits in large language models. CD-T can produce circuits at an arbitrary level of abstraction, and is the first method able to efficiently produce circuits as fine-grained as attention heads at specific sequence positions. CD-T consists of a set of mathematical equations to isolate the contribution of model features. By recursively computing the contribution of all nodes in a model's computational graph using CD-T, followed by pruning, we reduce circuit discovery runtime from hours to seconds compared to state-of-the-art baselines. On three standard circuit evaluation datasets (indirect object identification, greater-than comparisons, and docstring completion), we demonstrate that CD-T outperforms ACDC and EAP by better recovering the manual circuits, with an average of 97% ROC AUC under low runtimes. In addition, we provide evidence that the faithfulness of CD-T circuits is not due to random chance by showing our circuits are 80% more faithful than random circuits of up to 60% of the original model size. Finally, we show CD-T circuits are able to perfectly replicate the original models' behavior (faithfulness $=1$) using fewer nodes than the baselines for all tasks. Our results underscore the great promise of CD-T for efficient automated mechanistic interpretability, paving the way for new insights into the workings of large language models.
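As a rough illustration of the core idea (not the paper's exact equations), contextual decomposition splits each activation into a "relevant" part, carrying the contribution of the features under study, and an "irrelevant" remainder, and propagates that split through the network. For a single linear layer, a minimal sketch might look like this; the proportional bias-attribution rule is a common heuristic from prior contextual decomposition work, and the paper's precise rules for attention and nonlinearities will differ:

```python
import numpy as np

def cd_linear(beta, gamma, W, b):
    """Propagate a contextual decomposition through a linear layer.

    The input activation x = beta + gamma is split into a relevant part
    `beta` (contribution of the features of interest) and an irrelevant
    part `gamma` (everything else). The linear map is applied to each
    part separately, and the bias is attributed in proportion to each
    part's magnitude (a heuristic; the paper's exact rule may differ).
    The invariant is that the two parts always sum to the ordinary
    forward pass: rel + irrel == W @ x + b.
    """
    rel = W @ beta
    irrel = W @ gamma
    total = np.abs(rel) + np.abs(irrel) + 1e-12  # avoid division by zero
    rel = rel + b * np.abs(rel) / total
    irrel = irrel + b * np.abs(irrel) / total
    return rel, irrel
```

Because the decomposition is exact at every layer, the relevant part `rel` at the output can be read directly as the contribution score of the chosen features, without any activation patching.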
Problem

Research questions and friction points this paper is trying to address.

Efficiently discover interpretable circuits in large language models.
Overcome slow runtime and approximation errors in circuit discovery.
Improve faithfulness and accuracy of automated mechanistic interpretation.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Contextual decomposition for transformers (CD-T)
Efficient circuit discovery via recursive computation
High-fidelity circuits with reduced runtime
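The recursive-computation-then-pruning pipeline can be caricatured in a few lines: once every node (e.g. every attention head) has a contribution score, keep the smallest set of nodes whose scores account for most of the total. This interface is hypothetical and only gestures at the idea; the paper's actual pruning criterion is based on circuit faithfulness, not a fixed coverage threshold:

```python
def prune_by_contribution(scores, coverage=0.95):
    """Greedy pruning sketch (hypothetical interface, not the paper's
    exact procedure): keep the smallest set of nodes whose absolute
    contribution scores cover `coverage` of the total.

    `scores` maps node names (e.g. attention heads) to contribution
    values computed by a decomposition method such as CD-T.
    """
    total = sum(abs(v) for v in scores.values())
    kept, acc = [], 0.0
    # Visit nodes from largest to smallest absolute contribution.
    for node, v in sorted(scores.items(), key=lambda kv: -abs(kv[1])):
        kept.append(node)
        acc += abs(v)
        if acc >= coverage * total:
            break
    return kept
```

In the paper's setting, the kept nodes form the candidate circuit, which is then evaluated for faithfulness against the full model's behavior on the task.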
๐Ÿ”Ž Similar Papers
No similar papers found.
Aliyah R. Hsu
Department of EECS, UC Berkeley, CA, USA; Center for Computational Biology, UC Berkeley, CA, USA
Yeshwanth Cherapanamjeri
Postdoctoral Associate, MIT
High Dimensional Statistics; Algorithms
A. Odisho
Department of Urology, UCSF, CA, USA; Department of Epidemiology and Biostatistics, UCSF, CA, USA
Peter R. Carroll
Department of Urology, UCSF, CA, USA
Bin Yu
Department of Statistics, UC Berkeley, CA, USA; Department of EECS, UC Berkeley, CA, USA