New Results in Neuroscience-Based AI for Memory and Learning
In our own work on NeuroAI, we aim to leverage knowledge about the brain — here the cortex — for structural learning, i.e. for learning that is fast, efficient and successful because it uses the pre-existing neuronal structures which have evolved over millions of years. In terms of classical AI, what we have done […]
Check out our new video about AI and language (raw cut)
Check out the raw cut of our new video podcast about philosophy, AI, and language. And here is the full YouTube link.
Press Release: Ensemble Models of Memory
Here is a write-up for the general public of some of our work on neural plasticity and memory in ensembles of neurons. Please have a look, and respond with questions and support. We will continue with our scientific work, and attempt to finance part of it from a commercial spin-off. Any help is greatly appreciated. […]
The convergence
There used to be the concept of a “singularity”. The idea was that computers would become smarter than humans and start to replace them. It was even suggested that humanity would be substituted by silicon-based computing machines (robots). Against this, two years ago we set out the concept of “The convergence”. This assumes that biological […]
Interview with Gabriele Scheler: Neuro AI. Will it be the future?
Here is an interview concerning the current AI and generative AI waves and their relation to neuroscience. We propose solutions based on new technology from neuroAI, which addresses human abilities for reasoning, thought, logic, mathematics, proof, etc. – abilities that are poorly modeled by data analysis on its own. Some of our work – […]
In Vivo Analysis of Heterogeneous Extracellular Vesicles Using a Red-Shifted Bioluminescence Resonance Energy Transfer Reporter Protein
Check out a new paper from scientific director Michael Bachmann, MD.
Another blog entry on Medium
Another blog entry on Medium: “Engineering the brain”. There was no intelligent design, and as a result, body organs do not resemble machines. Once we start building machines like body organs, with utility functions, self-organization, and cells as building blocks, we can mesh engineering and evolutionary principles to arrive at better organisms.
Sketch of a novel approach to a neural model
We present a novel model of neuroplasticity in the form of a horizontal-vertical integration model. The horizontal plane consists of a network of neurons connected by adaptive transmission links. This fits with standard computational neuroscience approaches. Each individual neuron also has a vertical dimension with internal parameters steering the external membrane-expressed parameters. These determine neural […]
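To make the horizontal-vertical idea concrete, here is a minimal sketch in Python. It is an illustration only, not the published model: the parameter names (gain, threshold, internal) and all update rules are assumptions, chosen to show how per-neuron internal parameters can steer externally expressed membrane parameters while a synaptic weight matrix carries the horizontal dynamics.

```python
# Minimal sketch (not the published model) of the horizontal-vertical idea:
# horizontal = synaptic weight matrix between neurons; vertical = per-neuron
# internal parameters steering externally expressed gain and threshold.
import numpy as np

rng = np.random.default_rng(0)
n = 50
W = rng.normal(0.0, 0.1, size=(n, n))          # horizontal: adaptive transmission links
internal = rng.uniform(0.5, 1.5, size=n)       # vertical: internal per-neuron parameters

def membrane_params(internal):
    """Internal parameters steer the externally expressed parameters (illustrative mapping)."""
    gain = internal                             # excitability scales with internal state
    threshold = 1.0 / internal                  # higher internal state -> lower threshold
    return gain, threshold

x = rng.uniform(0.0, 1.0, size=n)               # neuron activations
for _ in range(100):
    gain, threshold = membrane_params(internal)
    x = np.tanh(gain * (W @ x) - threshold)     # horizontal dynamics shaped by vertical parameters
    internal += 0.01 * (np.abs(x) - internal)   # slow vertical adaptation toward recent activity
    internal = np.clip(internal, 0.1, 2.0)

print("mean activity:", float(np.mean(np.abs(x))))
```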
Determining optimal combination regimens for patients with multiple myeloma
Here is an interesting paper from our board member Dr. Helen Moore.
Cortical Models
We started a collaboration with Fred Narcross on the topic of cortical microcircuits, with the goal of investigating functional principles independently of their biological implementation. This contributed to a new paper on neural models.
Models of Neural Plasticity
Our work on basing models of neural plasticity on cellular principles continues to advance. With the help of Michael Wheeldon, B.Sc., the current recipient of a research scholarship, we anticipate two publications at the beginning of the new year. One publication will focus on outlining a new type of memory model using both horizontal […]
Pharmacodynamics: Problems and Pitfalls
A systematic overview of qualitative and quantitative model evaluation methods with many detailed references. This is applied and substantiated with case studies and, most interestingly, with an analysis of what can go wrong. Dynamic models are highly sensitive to uncertainties, and we need to be aware of the difficulties that can arise from that. S. […]
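As a toy illustration of that sensitivity (not taken from the cited review, and with invented parameter values): in a one-compartment elimination model C(t) = (D/V)·exp(-k·t), a 10% error in the rate constant k compounds over time into a much larger prediction error.

```python
# Toy illustration of parameter sensitivity in a dynamic model (all values invented):
# a 10% error in the elimination rate k grows into a large concentration error at later times.
import numpy as np

D, V = 100.0, 10.0            # dose (mg) and volume of distribution (L)
k_true, k_est = 0.10, 0.11    # true vs. estimated elimination rate (1/h), 10% off

for t in (1.0, 12.0, 24.0, 48.0):
    c_true = D / V * np.exp(-k_true * t)
    c_est = D / V * np.exp(-k_est * t)
    print(f"t={t:5.1f} h  true={c_true:7.3f}  est={c_est:7.3f}  rel.err={abs(c_est - c_true) / c_true:.1%}")
```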
Personalized Classifiers from Ensemble Learning with Gaussian Processes
Personalized Federated Learning with Gaussian Processes, by Idan Achituve, Aviv Shamsian, Aviv Navon, Gal Chechik, and Ethan Fetaya. This is a theoretical work on personalized learning with limited data. It shows that the disadvantage of restricted data exposure for each “person” or client can be remedied by learning a shared kernel function across all clients. This is […]
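A minimal sketch of the shared-kernel idea, assuming a plain RBF kernel and a grid search over a single shared hyperparameter (this illustrates the principle only, not the authors' implementation): kernel hyperparameters are chosen by summing the per-client GP log marginal likelihoods, and each client then keeps a personalized posterior built from its own few data points.

```python
# Sketch: one shared GP kernel across data-poor clients, personalized posteriors per client.
import numpy as np

rng = np.random.default_rng(1)

def rbf(x1, x2, ls):
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def log_marginal(x, y, ls, noise=0.1):
    K = rbf(x, x, ls) + noise ** 2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * len(x) * np.log(2 * np.pi)

# Each "client" sees only 8 points of its own shifted sine function.
clients = []
for shift in (0.0, 0.5, 1.0, 1.5, 2.0):
    x = rng.uniform(0, 6, 8)
    y = np.sin(x + shift) + 0.1 * rng.normal(size=8)
    clients.append((x, y))

# Shared kernel: pick the lengthscale that maximizes the sum of client likelihoods.
lengthscales = np.linspace(0.2, 3.0, 30)
totals = [sum(log_marginal(x, y, ls) for x, y in clients) for ls in lengthscales]
ls_shared = lengthscales[int(np.argmax(totals))]
print("shared lengthscale:", round(float(ls_shared), 2))

# Personalized prediction: each client uses the shared kernel with its own data.
x0, y0 = clients[0]
xs = np.linspace(0, 6, 5)
K = rbf(x0, x0, ls_shared) + 0.01 * np.eye(len(x0))
mean = rbf(xs, x0, ls_shared) @ np.linalg.solve(K, y0)
print("client-0 predictions:", np.round(mean, 2))
```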
Here is an important short entry.
Here is an important short entry: “An outline for Brain Plasticity: It is not just synapses”. This entry outlines the rationale for a vertical internal-external integration theory of memory. Excerpt: We suggest leading with the hypothesis that a neuronal cell operates with a central storage element, located in the nucleus, and a […]
There is a new blog on Medium
There is a new blog on Medium, started to cover topics in computational neuroscience and our own work, as well as general topics in AI and neuroscience. Check it out!
Research Stipend on Neural Plasticity
We are offering a research stipend to investigate theories of memorization in neural plasticity. The focus is a critical evaluation of the role of LTP/LTD and synaptic plasticity in memory. This position is virtual and could be done part-time, or full-time for three months. The ideal candidate should have solid knowledge of neurobiology, especially plasticity […]
Percolation on an autonomous network
Percolation on the gene regulatory network, by Giuseppe Torrisi, Reimer Kühn, and Alessia Annibale (King’s College London, UK). A theoretical analysis adapting percolation theory to directed bipartite graphs (such as the coupled dynamics of transcription factors and genes), investigating the conditions under which genetic networks can support a multiplicity of stable gene expression patterns, as required in stable […]
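A toy percolation sketch in the same spirit (this is not the paper's bipartite-graph calculation, and the graph sizes and degrees below are invented): build a random directed bipartite TF-gene graph, keep each edge with probability p, and watch the largest connected component emerge as p grows.

```python
# Toy percolation on a random directed bipartite TF-gene graph (illustrative only).
import random
from collections import defaultdict

random.seed(0)
n_tf, n_gene, k = 200, 800, 3          # 200 TFs, 800 genes, ~k random gene targets per TF
edges = [(f"tf{i}", f"g{random.randrange(n_gene)}") for i in range(n_tf) for _ in range(k)]
edges += [(f"g{j}", f"tf{random.randrange(n_tf)}") for j in range(n_gene)]   # genes feed back on TFs

def giant_component_fraction(p):
    kept = [e for e in edges if random.random() < p]
    adj = defaultdict(set)
    for u, v in kept:                   # weak connectivity: ignore edge direction
        adj[u].add(v)
        adj[v].add(u)
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        stack, comp = [start], 0
        seen.add(start)
        while stack:
            node = stack.pop()
            comp += 1
            for nxt in adj[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        best = max(best, comp)
    return best / (n_tf + n_gene)

for p in (0.05, 0.15, 0.3, 0.6, 1.0):
    print(f"p={p:.2f}  giant component fraction={giant_component_fraction(p):.2f}")
```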
Neural ensembles – local information compression
A bioRxiv preprint by Johann Schumann and Gabriele Scheler. The issue of memory is difficult for standard neural network models: ubiquitous synaptic plasticity introduces the problem of interference, which limits pattern recall and introduces conflation errors. We present a lognormal recurrent neural network, load patterns (MNIST) into it, and test the resulting neural representation for information […]
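A heavily simplified sketch of the flavor of such a network (not the preprint's model): synaptic weights are drawn from a lognormal distribution, a few random binary patterns stand in for MNIST, and the imprinting and recall rules below are plain Hebbian/sign-threshold assumptions chosen only to make the example run.

```python
# Sketch: recurrent network with lognormally distributed weights, simple Hebbian imprint,
# recall from noisy cues. Not the preprint's model; all constants are illustrative.
import numpy as np

rng = np.random.default_rng(2)
n, n_patterns = 200, 5
W = rng.lognormal(mean=-2.0, sigma=1.0, size=(n, n))    # heavy-tailed, lognormal weight magnitudes
W *= rng.choice([-1.0, 1.0], size=(n, n))               # mix excitation and inhibition

patterns = rng.choice([-1.0, 1.0], size=(n_patterns, n))
for p in patterns:                                       # weak Hebbian imprint on top of the baseline
    W += 0.1 * np.outer(p, p)
np.fill_diagonal(W, 0.0)                                 # no self-coupling

def recall(cue, steps=20):
    x = cue.copy()
    for _ in range(steps):
        x = np.sign(W @ x)
    return x

for i, p in enumerate(patterns):
    cue = p * np.where(rng.random(n) < 0.15, -1.0, 1.0)  # flip 15% of bits
    overlap = float(recall(cue) @ p) / n
    print(f"pattern {i}: overlap after recall = {overlap:.2f}")
```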
Boolean Neural Networks
Neural networks, which are the foundation of every human brain, are very peculiar structures. Their functioning can produce amazing results. Learning about them and the way they work, however, is an uphill task. To respond to this challenge, several models that represent the functionality of neural networks have been created, for instance [6] and [14]. […]
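For readers new to the topic, here is a minimal threshold-logic (Boolean) neuron in Python. It is a generic textbook construction, not the specific models referenced above or the thesis model mentioned below: the neuron outputs 1 when the weighted sum of its binary inputs reaches a threshold, which already suffices to implement AND, OR, and majority.

```python
# A generic Boolean (threshold-logic) neuron, for illustration only.
def boolean_neuron(inputs, weights, threshold):
    """Return 1 if the weighted sum of binary inputs reaches the threshold, else 0."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

AND = lambda a, b: boolean_neuron((a, b), (1, 1), 2)
OR = lambda a, b: boolean_neuron((a, b), (1, 1), 1)
MAJ3 = lambda a, b, c: boolean_neuron((a, b, c), (1, 1, 1), 2)

for a in (0, 1):
    for b in (0, 1):
        print(f"AND({a},{b})={AND(a, b)}  OR({a},{b})={OR(a, b)}")
print("MAJ3(1,0,1) =", MAJ3(1, 0, 1))
```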
Design and Analysis of a novel Boolean neuron model
Sergey Nasonov’s master’s thesis on Boolean dendrites has been finished and will soon appear.
Two-Step Adaptation as a Learning Principle
Our work with Carl Correns Foundation researcher Florian Dietz has resulted in the specification of a new two-step adaptation algorithm for neural tissue models, which is highly compatible with biological observations. This is a major advance beyond current synaptic plasticity models, in which each processing step produces a learning event. We are now implementing and experimenting […]
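The algorithm itself is not spelled out here, so the following is only a generic illustration of what a two-step scheme can look like; every rule and constant below is an assumption, not the algorithm developed in this project. Step one accumulates a fast, decaying trace during processing; step two consolidates that trace into lasting weights only at occasional consolidation events, rather than producing a learning event at every processing step.

```python
# Generic two-step adaptation sketch (illustrative assumptions only):
# step 1 accumulates a transient trace; step 2 consolidates it into lasting weights.
import numpy as np

rng = np.random.default_rng(3)
n = 20
w = rng.normal(0, 0.1, size=(n, n))     # lasting weights
trace = np.zeros((n, n))                # fast, transient trace (step 1)

for step in range(1, 101):
    x = rng.random(n)
    y = np.tanh(w @ x)
    trace += np.outer(y, x)             # step 1: accumulate correlations, no weight change yet
    trace *= 0.9                        # the trace decays; only sustained structure survives
    if step % 25 == 0:                  # step 2: occasional consolidation into lasting weights
        w += 0.01 * trace
        trace[:] = 0.0
        print(f"step {step}: consolidated, mean |w| = {np.abs(w).mean():.3f}")
```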
Thesis on novel neuron model
The CCF sponsors a thesis at the Technical University of Munich to develop a new neuron model. The goal of the model is to better capture dendritic properties. Many neuron models are point models, that is, they do not model dendritic and axonal branching. It is an open research question how to model […]
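To illustrate what a point model leaves out, here is a minimal sketch (not the thesis model; all constants are illustrative): a point neuron lumps all input into a single leaky integrator, while adding even one dendritic compartment changes how the same input current reaches the soma.

```python
# Point neuron vs. a tiny two-compartment (dendrite + soma) neuron, illustrative only.
import numpy as np

dt, T = 0.1, 200
t = np.arange(0, T, dt)
I = (t > 50) * 1.0                      # step input current

def point_neuron(I, tau=10.0):
    v = np.zeros(len(I))
    for k in range(1, len(I)):
        v[k] = v[k-1] + dt * (-v[k-1] + I[k]) / tau
    return v

def two_compartment(I, tau_d=30.0, tau_s=10.0, g=0.3):
    vd = np.zeros(len(I))               # dendritic compartment receives the input
    vs = np.zeros(len(I))               # somatic compartment is driven through coupling g
    for k in range(1, len(I)):
        vd[k] = vd[k-1] + dt * (-vd[k-1] + I[k] + g * (vs[k-1] - vd[k-1])) / tau_d
        vs[k] = vs[k-1] + dt * (-vs[k-1] + g * (vd[k-1] - vs[k-1])) / tau_s
    return vs

print("point-model steady state   :", round(float(point_neuron(I)[-1]), 3))
print("two-compartment soma state :", round(float(two_compartment(I)[-1]), 3))
```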
First CCF Grant Proposal on Mathematical Oncology
First CCF Grant Proposal on Mathematical Oncology: Tiling of RNAseq-derived graphs. PI: Bachmann, Scheler.
Sponsored Lecture Series at the Technical University of Munich Starts
Sponsored Lecture Series at the Technical University of Munich starts with talks about Neurorobotics and the Sense of Gravity, Big Data and Ion Channels, and Network Theory in Deep Learning.
Paper on Logarithmic Distribution of Neuronal Gain published
Paper published: Logarithmic distributions prove that intrinsic learning is Hebbian. Scheler, G. https://www.ncbi.nlm.nih.gov/pubmed/29071065.2
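A toy numerical illustration of one statistical argument consistent with the title (this is not the paper's actual analysis, and all constants are invented): multiplicative updates to a neuron's intrinsic gain, i.e. updates proportional to its current value as a Hebbian-style intrinsic rule would produce, drive the gain distribution toward a lognormal shape, whereas purely additive updates produce a Gaussian.

```python
# Toy demonstration (not from the paper): multiplicative vs. additive updates of an
# intrinsic gain parameter across a population of model neurons.
import numpy as np

rng = np.random.default_rng(4)
n_neurons, n_updates = 10_000, 200

gain_mult = np.ones(n_neurons)   # updated multiplicatively (proportional to current value)
gain_add = np.ones(n_neurons)    # updated additively
for _ in range(n_updates):
    gain_mult *= np.exp(rng.normal(0.0, 0.05, n_neurons))
    gain_add += rng.normal(0.0, 0.05, n_neurons)

def skew(x):
    return float(np.mean((x - x.mean()) ** 3) / x.std() ** 3)

print("skew(gain), multiplicative    :", round(skew(gain_mult), 2))           # strongly right-skewed
print("skew(log gain), multiplicative:", round(skew(np.log(gain_mult)), 2))   # ~0, i.e. gain is lognormal
print("skew(gain), additive          :", round(skew(gain_add), 2))            # ~0, i.e. gain is Gaussian
```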
Paper on Astrocyte-Neuron Interactions published
Paper published: From in silico astrocyte cell models to neuron-astrocyte network models: A review. Obermayer, K. https://www.ncbi.nlm.nih.gov/pubmed/28189516
Event: The self-organizing cell and problems in cancer biology
Hacker Dojo, 3350 Thomas Rd, Santa Clara, 7pm-9pm, December 27th, 2016. Two presentations on cutting-edge computational biology. Bioinformatics approaches the question: Is cancer inevitable? Or is it a preventable disease and maybe even reversible? The self-organizing cell: computational models of drug resistance. The presentations are led by Dr. Michael Bachmann from Stanford University and Dr. Gabriele Scheler […]
Scientific Adviser Dr. Moore elected to Council of SIAM.
Following SIAM’s fall elections, these are the new Board and Council members. Board of Trustees: Margot Gerritsen (Stanford University), Tim Kelley (North Carolina State University), Randy LeVeque (University of Washington). Council: Liliana Borcea (University of Michigan), Per Christian Hansen (Technical University of Denmark), Helen Moore (Bristol-Myers Squibb), Felix Otto (Max Planck Institute for Mathematics in […]
November 18th, 2016
Starting to work on cellular intelligence and memory with Sayanti Banerjee.
November 5th, 2016
Ashwin Rammohan and Louise Cabansay join as interns. They are working on MAPK signaling models and databases on carcinogens, respectively.
ECML-PKDD 2016
European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases
International Conference on Systems Biology 2016
International Conference on Systems Biology 2016
2nd International Conference on Mathematical Neuroscience
2nd International Conference on Mathematical Neuroscience
August 13th, 2016
An interesting article on innovation vs. incremental progress in science.
January 19th, 2016
U.S. Patent No. 9239903, “Determination of Output of Biochemical Reaction Networks”, granted to Dr. Scheler.