Neural and Evolutionary Computing Papers
@TAL
New Neural and Evolutionary Computing submissions to https://t.co/PCh9ZUDtib (not affiliated with https://t.co/PCh9ZUDtib)
Joined November 2010
SNN-Based Online Learning of Concepts and Action Laws in an Open World.
arxiv.org
We present the architecture of a fully autonomous, bio-inspired cognitive agent built around a spiking neural network (SNN) implementing the agent's semantic memory. This agent explores its...
Brain-inspired Computational Intelligence via Predictive Coding.
arxiv.org
Artificial intelligence (AI) is rapidly becoming one of the key technologies of this century. The majority of results in AI thus far have been achieved using deep neural networks trained with the...
Explainable Framework for Swarm Intelligence Based on Fitness Landscape Features and Machine Learning.
arxiv.org
Swarm-based optimization algorithms have demonstrated remarkable success in solving complex optimization problems. However, practitioners remain sceptical about their widespread adoption due to limited transparency...
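For context, one classic fitness-landscape feature used in this line of work is fitness distance correlation (FDC): the correlation between sampled fitness values and their distance to a known optimum. The sketch below computes it for a toy sphere function; the feature choice and test function are illustrative assumptions, not the feature set used in the paper.

```python
import numpy as np

def fitness_distance_correlation(points, fitnesses, optimum):
    """Classic FDC: correlation between fitness and distance to a known optimum."""
    distances = np.linalg.norm(points - optimum, axis=1)
    return np.corrcoef(fitnesses, distances)[0, 1]

# Toy example: sphere function, whose optimum is the origin.
rng = np.random.default_rng(0)
X = rng.uniform(-5, 5, size=(500, 10))          # random sample of the search space
f = np.sum(X**2, axis=1)                        # sphere fitness (minimisation)
fdc = fitness_distance_correlation(X, f, np.zeros(10))
print(f"FDC on the sphere function: {fdc:.3f}") # close to +1: distance predicts fitness
```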
Bridging Expressivity and Scalability with Adaptive Unitary SSMs.
arxiv.org
Recent work has revealed that state space models (SSMs), while efficient for long-sequence processing, are fundamentally limited in their ability to represent formal languages, particularly due to...
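As background, the plain linear state space recurrence this line of work builds on is x_{t+1} = A x_t + B u_t with readout y_t = C x_t. The sketch below runs that scan with a toy diagonal SSM; it illustrates only the vanilla recurrence, not the adaptive unitary construction proposed in the paper.

```python
import numpy as np

def ssm_scan(A, B, C, u):
    """Run the linear recurrence x_{t+1} = A x_t + B u_t, y_t = C x_t over a sequence."""
    x = np.zeros(A.shape[0])
    outputs = []
    for u_t in u:                       # sequential scan over the input sequence
        x = A @ x + B @ u_t             # state update
        outputs.append(C @ x)           # readout
    return np.array(outputs)

# Toy diagonal SSM: a bank of leaky integrators driven by a scalar input.
A = np.diag([0.9, 0.5, 0.1])            # decay rate per state channel
B = np.ones((3, 1))                     # broadcast the scalar input to every channel
C = np.ones((1, 3)) / 3                 # average the channels into a scalar output
u = np.sin(np.linspace(0, 4 * np.pi, 64)).reshape(-1, 1)
print(ssm_scan(A, B, C, u).shape)       # (64, 1)
```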
Position: Biology is the Challenge Physics-Informed ML Needs to Evolve.
arxiv.org
Physics-Informed Machine Learning (PIML) has successfully integrated mechanistic understanding into machine learning, particularly in domains governed by well-known physical laws. This success has...
Dynamically Weighted Momentum with Adaptive Step Sizes for Efficient Deep Network Training.
arxiv.org
In current deep learning research, despite the extensive application of optimization algorithms such as Stochastic Gradient Descent (SGD) and Adaptive Moment Estimation (Adam),...
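For reference, the heavy-ball momentum update underlying SGD-style optimizers is v <- beta*v - lr*grad, theta <- theta + v. The sketch below applies it to a toy quadratic; it shows the standard baseline the abstract refers to, with arbitrary hyperparameters, not the dynamically weighted scheme proposed in the paper.

```python
import numpy as np

def sgd_momentum(grad_fn, theta, lr=0.1, beta=0.9, steps=100):
    """Classic heavy-ball momentum: v <- beta*v - lr*grad, theta <- theta + v."""
    v = np.zeros_like(theta)
    for _ in range(steps):
        g = grad_fn(theta)
        v = beta * v - lr * g
        theta = theta + v
    return theta

# Toy objective: f(theta) = 0.5 * ||theta||^2, whose gradient is theta itself.
theta0 = np.array([5.0, -3.0])
print(sgd_momentum(lambda t: t, theta0))   # converges towards the minimiser at the origin
```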
A Benchmark Suite for Multi-Objective Optimization in Battery Thermal Management System Design.
arxiv.org
Synthetic Benchmark Problems (SBPs) are commonly used to evaluate the performance of metaheuristic algorithms. However, these SBPs often contain various unrealistic properties, potentially leading...
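For context, a typical synthetic benchmark problem (SBP) of the kind the abstract critiques is ZDT1, a classic two-objective test function with a known Pareto front. The minimal version below is purely illustrative and is not part of the proposed battery thermal management benchmark suite.

```python
import numpy as np

def zdt1(x):
    """ZDT1 synthetic two-objective benchmark, defined for x in [0, 1]^n."""
    f1 = x[0]
    g = 1.0 + 9.0 * np.sum(x[1:]) / (len(x) - 1)
    f2 = g * (1.0 - np.sqrt(f1 / g))
    return np.array([f1, f2])

# Points on the true Pareto front have x[1:] = 0, giving f2 = 1 - sqrt(f1).
x = np.zeros(30)
x[0] = 0.25
print(zdt1(x))   # [0.25, 0.5]
```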
Socio-cognitive agent-oriented evolutionary algorithm with trust-based optimization.
arxiv.org
This paper introduces Trust-Based Optimization (TBO), a novel extension of the island model in evolutionary computation that replaces conventional periodic migrations with a flexible,...
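As a point of reference, the conventional island model that TBO departs from evolves several subpopulations independently and exchanges individuals on a fixed migration schedule. The sketch below shows that baseline on the OneMax toy problem; the ring topology, rates, and problem are illustrative assumptions, not the trust-based mechanism introduced in the paper.

```python
import random

def onemax(bits):
    return sum(bits)                                   # fitness: number of ones

def evolve(pop, mutation_rate=0.02):
    """One generation of a simple EA: binary tournament selection + bit-flip mutation."""
    new_pop = []
    for _ in pop:
        parent = max(random.sample(pop, 2), key=onemax)
        child = [b ^ (random.random() < mutation_rate) for b in parent]
        new_pop.append(child)
    return new_pop

def island_model(n_islands=4, pop_size=20, genome_len=64,
                 generations=200, migration_interval=25):
    islands = [[[random.randint(0, 1) for _ in range(genome_len)]
                for _ in range(pop_size)] for _ in range(n_islands)]
    for gen in range(1, generations + 1):
        islands = [evolve(pop) for pop in islands]
        if gen % migration_interval == 0:              # conventional periodic migration
            for i, pop in enumerate(islands):
                migrant = max(pop, key=onemax)          # best individual emigrates...
                dest = islands[(i + 1) % n_islands]     # ...to the next island in a ring
                dest[random.randrange(len(dest))] = list(migrant)
    return max((max(pop, key=onemax) for pop in islands), key=onemax)

random.seed(1)
print(onemax(island_model()), "ones out of 64")
```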
Exponential Dynamic Energy Network for High Capacity Sequence Memory.
arxiv.org
The energy paradigm, exemplified by Hopfield networks, offers a principled framework for memory in neural systems by interpreting dynamics as descent on an energy surface. While powerful for...
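For background, the classical Hopfield picture invoked here stores patterns in a Hebbian weight matrix and retrieves them through asynchronous updates that never increase the energy E(x) = -1/2 x^T W x. The sketch below shows that plain baseline, not the exponential dynamic energy network proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Store a few random bipolar patterns with the Hebbian rule (zero diagonal).
n, n_patterns = 64, 3
patterns = rng.choice([-1, 1], size=(n_patterns, n))
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0.0)

def energy(x):
    """Hopfield energy; asynchronous sign updates can only decrease it."""
    return -0.5 * x @ W @ x

def retrieve(x, sweeps=5):
    x = x.copy()
    for _ in range(sweeps):
        for i in rng.permutation(n):            # asynchronous, random-order updates
            x[i] = 1 if W[i] @ x >= 0 else -1   # descend the energy surface
    return x

# Corrupt a stored pattern and let the dynamics fall back into its basin.
probe = patterns[0].copy()
flip = rng.choice(n, size=10, replace=False)
probe[flip] *= -1
recalled = retrieve(probe)
print("energy before/after:", energy(probe), energy(recalled))
print("overlap with stored pattern:", recalled @ patterns[0] / n)
```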
Do Language Models Use Their Depth Efficiently?
arxiv.org
Modern LLMs are increasingly deep, and depth correlates with performance, albeit with diminishing returns. However, do these models use their depth efficiently? Do they compose more features to...
Discovering Heuristics with Large Language Models (LLMs) for Mixed-Integer Programs: Single-Machine Scheduling.
arxiv.org
Our study contributes to the scheduling and combinatorial optimization literature with new heuristics discovered by leveraging the power of Large Language Models (LLMs). We focus on the...
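For context, a classic hand-crafted heuristic that LLM-discovered rules are typically compared against is weighted shortest processing time (WSPT), which sequences jobs by the ratio w_j / p_j and is optimal for minimizing total weighted completion time on a single machine (Smith's rule). The job data below are made up, and the rule is not one of the heuristics discovered in the paper.

```python
def wspt_schedule(jobs):
    """Weighted Shortest Processing Time: sort by weight / processing-time ratio.

    Optimal (Smith's rule) for the single-machine problem 1 || sum w_j C_j.
    """
    return sorted(jobs, key=lambda j: j["weight"] / j["proc_time"], reverse=True)

def total_weighted_completion(schedule):
    t, total = 0, 0
    for job in schedule:
        t += job["proc_time"]            # completion time of this job
        total += job["weight"] * t
    return total

# Hypothetical jobs, each with a processing time and a weight.
jobs = [{"proc_time": 4, "weight": 1},
        {"proc_time": 2, "weight": 3},
        {"proc_time": 5, "weight": 2}]
order = wspt_schedule(jobs)
print([(j["proc_time"], j["weight"]) for j in order], total_weighted_completion(order))
```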
HyperGraphX: Graph Transductive Learning with Hyperdimensional Computing and Message Passing.
arxiv.org
We present a novel algorithm, HyperGraphX, that marries graph convolution with binding and bundling operations in hyperdimensional computing for transductive graph learning. For prediction accuracy...
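For background, the two hyperdimensional computing primitives named here are binding (elementwise multiplication of bipolar hypervectors, yielding a vector dissimilar to both inputs) and bundling (elementwise addition followed by a sign, yielding a vector similar to each input). The sketch below only illustrates these primitives, not how HyperGraphX combines them with graph convolution.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000                                    # hypervector dimensionality

def random_hv():
    return rng.choice([-1, 1], size=D)        # random bipolar hypervector

def bind(a, b):
    return a * b                              # binding: elementwise multiply (self-inverse)

def bundle(*hvs):
    return np.sign(np.sum(hvs, axis=0))       # bundling: elementwise majority vote

def similarity(a, b):
    return a @ b / D                          # normalised dot product in [-1, 1]

role, filler = random_hv(), random_hv()
pair = bind(role, filler)                     # the bound pair is dissimilar to both inputs
memory = bundle(pair, random_hv(), random_hv())

print(round(similarity(pair, role), 3))                  # ~0.0
print(round(similarity(memory, pair), 3))                # ~0.5: bundling preserves members
print(round(similarity(bind(memory, role), filler), 3))  # ~0.5: unbinding recovers filler
```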
All in one timestep: Enhancing Sparsity and Energy efficiency in Multi-level Spiking Neural Networks.
arxiv.org
Spiking Neural Networks (SNNs) are one of the most promising bio-inspired neural network models and have drawn increasing attention in recent years. The event-driven communication mechanism of...
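For background, the event-driven behaviour of SNNs comes from neurons such as the leaky integrate-and-fire (LIF) unit, which communicates only when its membrane potential crosses a threshold. The single-neuron sketch below uses arbitrary constants for illustration and is unrelated to the paper's multi-level, single-timestep scheme.

```python
import numpy as np

def lif_neuron(input_current, threshold=1.0, decay=0.9, v_reset=0.0):
    """Leaky integrate-and-fire: leak the membrane, add input, spike on threshold."""
    v, spikes = 0.0, []
    for i_t in input_current:
        v = decay * v + i_t                 # leaky integration of the input current
        if v >= threshold:
            spikes.append(1)                # event: emit a binary spike...
            v = v_reset                     # ...and reset the membrane potential
        else:
            spikes.append(0)                # no event: nothing is communicated
    return spikes

rng = np.random.default_rng(0)
current = rng.uniform(0.0, 0.4, size=50)    # weak noisy drive produces sparse spiking
spike_train = lif_neuron(current)
print(spike_train, f"firing rate: {np.mean(spike_train):.2f}")
```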
Connectome-Guided Automatic Learning Rates for Deep Networks.
arxiv.org
The human brain is highly adaptive: its functional connectivity reconfigures on multiple timescales during cognition and learning, enabling flexible information processing. By contrast, artificial...
A Neuroscience-Inspired Dual-Process Model of Compositional Generalization.
arxiv.org
Deep learning models struggle with systematic compositional generalization, a hallmark of human cognition. We propose Mirage, a neuro-inspired dual-process model that offers a processing...
Graph Neural Network Assisted Genetic Algorithm for Structural Dynamic Response and Parameter Optimization.
arxiv.org
The optimization of structural parameters, such as mass (m), stiffness (k), and damping coefficient (c), is critical for designing efficient, resilient, and stable structures. Conventional numerical...
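For context, mass (m), stiffness (k), and damping coefficient (c) define the classical single-degree-of-freedom equation of motion m*x'' + c*x' + k*x = f(t). The sketch below turns its free-vibration response into a scalar fitness that a genetic algorithm could minimize; the parameter values, fitness definition, and explicit time stepping are illustrative assumptions, not the paper's GNN-assisted formulation.

```python
def settling_fitness(m, c, k, x0=1.0, dt=1e-3, t_end=10.0):
    """Integrate m*x'' + c*x' + k*x = 0 from an initial displacement and score
    how quickly the free vibration dies out (smaller is better)."""
    steps = int(t_end / dt)
    x, v = x0, 0.0
    accumulated = 0.0
    for _ in range(steps):
        a = -(c * v + k * x) / m          # acceleration from the equation of motion
        v += a * dt                       # semi-implicit Euler keeps the scheme stable
        x += v * dt
        accumulated += abs(x) * dt        # integral of |displacement| over time
    return accumulated

# Two candidate designs: the better-damped one should score lower.
print(settling_fitness(m=1.0, c=0.2, k=10.0))
print(settling_fitness(m=1.0, c=2.0, k=10.0))
```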
Engram Memory Encoding and Retrieval: A Neurocomputational Perspective.
arxiv.org
Despite substantial research into the biological basis of memory, the precise mechanisms by which experiences are encoded, stored, and retrieved in the brain remain incompletely understood. A...
Sequential Multi-Agent Dynamic Algorithm Configuration.
arxiv.org
Dynamic algorithm configuration (DAC) is a recent trend in automated machine learning, which can dynamically adjust the algorithm's configuration during the execution process and relieve users...
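For context, a minimal single-agent form of dynamic algorithm configuration is a controller that re-tunes one hyperparameter of a running optimizer, for example the mutation rate of a (1+1)-EA, based on recent progress. The sketch below uses a simple success-rate rule on OneMax; the controller, thresholds, and problem are illustrative assumptions, not the sequential multi-agent formulation proposed in the paper.

```python
import random

def onemax(bits):
    return sum(bits)

def one_plus_one_ea_with_dac(genome_len=100, generations=2000, window=50):
    """(1+1)-EA whose mutation rate is reconfigured online from recent success rates."""
    parent = [random.randint(0, 1) for _ in range(genome_len)]
    mutation_rate = 1.0 / genome_len
    improvements = 0
    for gen in range(1, generations + 1):
        child = [b ^ (random.random() < mutation_rate) for b in parent]
        if onemax(child) > onemax(parent):
            improvements += 1
        if onemax(child) >= onemax(parent):       # standard (1+1)-EA acceptance
            parent = child
        if gen % window == 0:                     # DAC step: adjust the config mid-run
            if improvements / window > 0.2:       # improving often: mutate more boldly
                mutation_rate = min(0.5, mutation_rate * 1.5)
            else:                                 # improving rarely: mutate more gently
                mutation_rate = max(1e-4, mutation_rate / 1.5)
            improvements = 0
    return onemax(parent), round(mutation_rate, 5)

random.seed(0)
print(one_plus_one_ea_with_dac())
```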
Towards Scaling Deep Neural Networks with Predictive Coding: Theory and Practice.
arxiv.org
Backpropagation (BP) is the standard algorithm for training the deep neural networks that power modern artificial intelligence, including large language models. However, BP is energy inefficient...