Created at 8pm, Feb 15
Hybrid Self-Attention NEAT: A novel evolutionary approach to improve the NEAT algorithm
xIPgE6hvTFoMFzr6E26Yrt1EOhv9DqHdPfYd0ZnpH2k
File Type: CUSTOM
Entry Count: 876
Embed. Model: BAAI/bge-base-en-v1.5
Index Type: hnsw

This article presents a "Hybrid Self-Attention NEAT" method that improves the original NeuroEvolution of Augmenting Topologies (NEAT) algorithm on high-dimensional inputs. Although NEAT has shown significant results on a range of challenging tasks, it cannot produce a well-tuned network when input representations are high-dimensional. Our study addresses this limitation by using self-attention as an indirect encoding method to select the most important parts of the input. In addition, we improve overall performance with a hybrid method for evolving the final network weights. The main conclusion is that Hybrid Self-Attention NEAT eliminates this restriction of the original NEAT. The results indicate that, compared with other evolutionary algorithms, our model achieves comparable scores on Atari games from raw-pixel input with far fewer parameters.
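The abstract describes using self-attention as an indirect encoding that picks out the most important parts of a high-dimensional input before it reaches the evolved network. A minimal illustrative sketch of that idea (not the paper's exact implementation; all dimensions, the patch count, and the top-k selection rule are assumptions) scores image patches by the attention they receive and keeps only the top-k:

```python
# Illustrative sketch: a single self-attention layer scores image patches and
# keeps only the top-k most important ones, shrinking the input the NEAT
# network has to handle. Dimensions and patch count here are arbitrary.
import numpy as np

def attention_importance(patches, w_q, w_k):
    """Score each patch by the total attention it receives from all patches."""
    q = patches @ w_q                               # (n, d_k) queries
    k = patches @ w_k                               # (n, d_k) keys
    scores = q @ k.T / np.sqrt(k.shape[1])          # (n, n) scaled dot products
    scores -= scores.max(axis=1, keepdims=True)     # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)         # row-wise softmax
    return attn.sum(axis=0)                         # column sums: attention received

rng = np.random.default_rng(0)
n_patches, d_in, d_k, top_k = 36, 16, 8, 10
patches = rng.normal(size=(n_patches, d_in))        # stand-in for image patches
# In a hybrid scheme these projection weights would themselves be evolved.
w_q = rng.normal(size=(d_in, d_k))
w_k = rng.normal(size=(d_in, d_k))

importance = attention_importance(patches, w_q, w_k)
selected = np.argsort(importance)[-top_k:]          # indices of the top-k patches
reduced_input = patches[selected].ravel()           # compact input for the network
```

Because each softmax row sums to 1, the importance scores over all patches sum to the number of patches; the selection simply redistributes that fixed attention budget onto the few patches that matter.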

How to Retrieve?

# Query

curl -X POST "https://search.dria.co/hnsw/query" \
-H "x-api-key: <YOUR_API_KEY>" \
-H "Content-Type: application/json" \
-d '{"vector": [0.123, 0.5236], "top_n": 10, "contract_id": "xIPgE6hvTFoMFzr6E26Yrt1EOhv9DqHdPfYd0ZnpH2k", "level": 2}'
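The same query can be issued from Python using only the standard library. This sketch mirrors the curl example above (same endpoint, headers, and body); note that the 2-element vector is only a placeholder, and a real query against this index would need an embedding of the dimensionality produced by BAAI/bge-base-en-v1.5, the index's embedding model. The network call itself is left commented out so the snippet runs offline:

```python
# Python equivalent of the curl query above, built with the standard library.
# The endpoint, headers, and JSON body mirror the documented example; the
# placeholder vector must be replaced by a real BAAI/bge-base-en-v1.5 embedding.
import json
import urllib.request

def build_query(vector, top_n=10, level=2):
    """Assemble the JSON body expected by the /hnsw/query endpoint."""
    return {
        "vector": vector,
        "top_n": top_n,
        "contract_id": "xIPgE6hvTFoMFzr6E26Yrt1EOhv9DqHdPfYd0ZnpH2k",
        "level": level,
    }

payload = build_query([0.123, 0.5236])   # placeholder vector from the example
req = urllib.request.Request(
    "https://search.dria.co/hnsw/query",
    data=json.dumps(payload).encode("utf-8"),
    headers={"x-api-key": "<YOUR_API_KEY>", "Content-Type": "application/json"},
)
# Uncomment to actually send the request (requires a valid API key):
# with urllib.request.urlopen(req) as resp:
#     results = json.load(resp)
```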