AJIBOLA, Ifedayo Olabode (2024) ENHANCED IN-CONTEXT LEARNING FOR CODE ANALYSIS WITH COMPACT LARGE LANGUAGE MODELS AND MONTE CARLO TREE SEARCH. Masters thesis, Covenant University.
Abstract
Large Language Models (LLMs) demonstrate impressive reasoning abilities, yet their performance can falter over long contexts. Techniques such as Retrieval Augmented Generation (RAG) and Chain-of-Thought prompting aim to bridge this gap, but they face limitations when applied to large code-based contexts because inter-object relationships are difficult to represent. Monte Carlo Tree Search (MCTS), a heuristic search algorithm, offers a potential solution by helping LLMs identify the crucial parts of a code repository, thus facilitating downstream tasks. This research applies MCTS to enhance the performance of "compact LLMs": models small enough to run inference on consumer-grade GPUs. Our findings confirm that MCTS boosts performance over the baseline compact LLM; however, even with MCTS, these compact models still lag behind larger models in performance.
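The abstract describes using MCTS to pick out the crucial parts of a code repository so they fit in a compact LLM's context window. A minimal sketch of how such a search could look is below; the file names, relevance scores, and `reward` function are all hypothetical stand-ins (in the thesis setting, the reward would come from evaluating the LLM on the downstream task), not the author's actual implementation.

```python
import math
import random

# Hypothetical toy setting: choose which repository files to place in a
# limited context window. Scores are made-up relevance values.
FILES = {"main.py": 0.9, "utils.py": 0.6, "tests.py": 0.3, "docs.md": 0.1}
BUDGET = 2  # max number of files that fit in the context window

def reward(selection):
    # Stand-in reward: average relevance of the chosen files. A real system
    # would instead score the LLM's output on the downstream task.
    return sum(FILES[f] for f in selection) / BUDGET

class Node:
    def __init__(self, selection, parent=None):
        self.selection = selection  # frozenset of files chosen so far
        self.parent = parent
        self.children = {}          # file added -> child Node
        self.visits = 0
        self.value = 0.0

    def untried(self):
        if len(self.selection) == BUDGET:
            return []  # terminal: context budget exhausted
        return [f for f in FILES
                if f not in self.selection and f not in self.children]

def uct_child(node, c=1.4):
    # Upper Confidence bound for Trees: exploitation + exploration term.
    return max(node.children.values(),
               key=lambda ch: ch.value / ch.visits
               + c * math.sqrt(math.log(node.visits) / ch.visits))

def mcts(iterations=500, seed=0):
    random.seed(seed)
    root = Node(frozenset())
    for _ in range(iterations):
        node = root
        # 1. Selection: descend while fully expanded and non-terminal.
        while not node.untried() and node.children:
            node = uct_child(node)
        # 2. Expansion: add one untried file as a new child.
        untried = node.untried()
        if untried:
            f = random.choice(untried)
            child = Node(node.selection | {f}, node)
            node.children[f] = child
            node = child
        # 3. Simulation: randomly fill the remaining budget.
        sel = set(node.selection)
        while len(sel) < BUDGET:
            sel.add(random.choice([f for f in FILES if f not in sel]))
        r = reward(sel)
        # 4. Backpropagation: update statistics up to the root.
        while node:
            node.visits += 1
            node.value += r
            node = node.parent
    # Read off the most-visited path as the final selection.
    node = root
    while node.children:
        node = max(node.children.values(), key=lambda n: n.visits)
    return set(node.selection)

print(mcts())  # converges to the highest-relevance pair of files
```

With these toy scores the search converges on `{"main.py", "utils.py"}`, the pair with the highest combined relevance, illustrating how MCTS trades off exploring untried file combinations against exploiting ones that have scored well.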
| Field | Value |
|---|---|
| Item Type | Thesis (Masters) |
| Uncontrolled Keywords | large language model, compact LLM, chain-of-thought, monte carlo tree search, in-context learning, code analysis |
| Subjects | Q Science > QA Mathematics > QA75 Electronic computers. Computer science; Q Science > QA Mathematics > QA76 Computer software |
| Divisions | Faculty of Engineering, Science and Mathematics > School of Electronics and Computer Science |
| Depositing User | Patricia Nwokealisi |
| Date Deposited | 30 Oct 2024 11:54 |
| Last Modified | 30 Oct 2024 11:54 |
| URI | http://eprints.covenantuniversity.edu.ng/id/eprint/18544 |