TIES-Merging: Resolving Interference When Merging Models

Link
Abstract

Transfer learning – i.e., further fine-tuning a pre-trained model on a downstream task – can confer significant advantages, including improved downstream performance, faster convergence, and better sample efficiency. These advantages have led to a proliferation of task-specific fine-tuned models, which typically can only perform a single task and do not benefit from one another. Recently, model merging techniques have emerged as a solution to combine multiple task-specific models into a single multitask model without performing additional training. However, existing merging methods often ignore the interference between parameters of different models, resulting in large performance drops when merging multiple models. In this paper, we demonstrate that prior merging techniques inadvertently lose valuable information due to two major sources of interference: (a) interference due to redundant parameter values and (b) disagreement on the sign of a given parameter’s values across models. To address this, we propose our method, TRIM, ELECT SIGN & MERGE (TIES-MERGING), which introduces three novel steps when merging models: (1) resetting parameters that only changed a small amount during fine-tuning, (2) resolving sign conflicts, and (3) merging only the parameters that are in alignment with the final agreed-upon sign. We find that TIES-MERGING outperforms several existing methods in diverse settings covering a range of modalities, domains, number of tasks, model sizes, architectures, and fine-tuning settings. We further analyze the impact of different types of interference on model parameters, and highlight the importance of resolving sign interference.

Synth

Problem:: Performance degradation from parameter interference when merging models / Useful information lost to redundant parameters and sign disagreements / Interference worsens as the number of tasks grows

Solution:: Trim redundant parameters to preserve the influential ones / Resolve sign conflicts via voting-based sign election / Merge only the parameters that agree with the elected sign
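The three steps above (trim, elect sign, disjoint merge) can be sketched on flat task vectors as follows. This is a minimal illustration, not the authors' code; the function name `ties_merge` and hyperparameter names `k` (fraction kept) and `lam` (scaling) are my own choices.

```python
import numpy as np

def ties_merge(task_vectors, k=0.2, lam=1.0):
    """Sketch of the three TIES-MERGING steps.

    task_vectors: list of 1-D arrays, each being
    (fine-tuned params - pre-trained params) for one task.
    Returns the merged task vector (to be added to the pre-trained params).
    """
    tvs = np.stack(task_vectors)  # shape (n_tasks, n_params)

    # 1) Trim: keep only the top-k fraction of each task vector by magnitude,
    #    resetting the rest to zero (parameters that changed little).
    trimmed = np.zeros_like(tvs)
    n_keep = int(k * tvs.shape[1])
    for i, tv in enumerate(tvs):
        idx = np.argsort(np.abs(tv))[-n_keep:]
        trimmed[i, idx] = tv[idx]

    # 2) Elect sign: per parameter, pick the sign with the larger
    #    total magnitude across tasks.
    elected = np.sign(trimmed.sum(axis=0))

    # 3) Disjoint merge: average only the values that agree with
    #    the elected sign; zeros and conflicting values are ignored.
    agree = (np.sign(trimmed) == elected) & (trimmed != 0)
    counts = np.maximum(agree.sum(axis=0), 1)  # avoid division by zero
    return lam * (trimmed * agree).sum(axis=0) / counts
```

For example, with two task vectors that conflict on a parameter's sign, only the values matching the higher-magnitude sign contribute to the merge, so the weaker, conflicting update no longer drags the merged value toward zero.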

Novelty:: Applies the idea of accounting for parameter importance to the task-vector approach

Note:: Seems to demonstrate the method's effectiveness well from several angles

Summary

Motivation

Method

file-20250401201012019.png

Method validation

A large mean magnitude → the parameter is influential
(a): 0, 1, >1 denote the number of tasks a parameter influences; 0 means the parameter is redundant
(b): the number indicates the degree of sign agreement; 0.5 means half the signs are + and half are -, while 1 means all signs are identical
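The sign-agreement measure in (b) can be computed as the fraction of task vectors whose sign matches the majority sign at each parameter. A minimal sketch (my own helper, `sign_agreement`; zeros are simply counted on the negative side here for brevity):

```python
import numpy as np

def sign_agreement(task_vectors):
    """Per-parameter fraction of models sharing the majority sign.

    0.5 means an even +/- split across models; 1.0 means all models agree.
    """
    signs = np.sign(np.stack(task_vectors))  # (n_tasks, n_params)
    frac_pos = (signs > 0).mean(axis=0)
    return np.maximum(frac_pos, 1.0 - frac_pos)
```

With four models, a parameter where two are positive and two negative scores 0.5 (full conflict), while one where all four share a sign scores 1.0.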