Abstract:
Tree of Thoughts (ToT) has emerged as a powerful framework for complex reasoning with Large Language Models (LLMs). However, existing ToT methods often generate highly similar paths, leading to inefficient resource utilization. We present Entropy-Guided Tree of Thoughts, a novel approach that enhances the diversity and efficiency of path generation in LLM reasoning. Our method introduces three key innovations: (1) a dynamic entropy evaluation mechanism that combines branch-specific and global probability distributions, (2) a path similarity penalty that prevents redundant exploration, and (3) an adaptive diversity constraint that dynamically adjusts the search strategy. Preliminary experiments on the Game of 24 showed promising results in promoting solution diversity while maintaining reasoning quality. © 2025 IEEE.
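The abstract's first two components could be combined into a single candidate score: an entropy term blending branch-specific and global distributions, minus a penalty for similarity to already-selected paths. The sketch below is an illustrative interpretation only; the function names, the Jaccard similarity measure, and the weights `alpha` and `beta` are assumptions, not the paper's actual formulation.

```python
import math

def entropy(probs):
    """Shannon entropy of a probability distribution (natural log)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def jaccard_similarity(path_a, path_b):
    """Token-overlap similarity between two reasoning paths
    (a stand-in for whatever path-similarity measure the paper uses)."""
    a, b = set(path_a.split()), set(path_b.split())
    return len(a & b) / len(a | b) if a | b else 0.0

def score_candidate(branch_probs, global_probs, candidate, selected_paths,
                    alpha=0.5, beta=1.0):
    """Score a candidate reasoning step: reward entropy blended from the
    branch-specific and global distributions (weight alpha), and penalize
    overlap with paths already chosen (weight beta)."""
    diversity = (alpha * entropy(branch_probs)
                 + (1 - alpha) * entropy(global_probs))
    penalty = max((jaccard_similarity(candidate, p) for p in selected_paths),
                  default=0.0)
    return diversity - beta * penalty
```

Under this reading, a candidate identical to an already-selected path is penalized by the full similarity weight, steering the search toward unexplored branches.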
Year: 2025
Page: 132-136
Language: English