This repository contains the source code and results for the experiments presented in Evaluating Parameter-Based Training Performance of Neural Networks and Variational Quantum Circuits.
CRS-LM: Structure-aware context reduction for tiny language models under Parameter Golf constraints
Zero-hidden-layer neural networks that solve non-linear problems through temporal depth rather than spatial layers, reaching 90.14% on MNIST with 480 parameters. Intelligence is not depth but resonance; time is the ultimate hidden layer. OdyssNet proves it.
This repository contains the source code and results for the experiments presented in The Impact of Parameter Count on Sarcasm Detection Using BERT-Based Models.
🧬 Neuro-Symbolic Activation Discovery: using Genetic Programming to discover domain-specific activation functions and transfer them across scientific domains. Achieves 18-21% higher parameter efficiency with 5-6× fewer parameters.