I'll describe the problem once again. Basically, I want to examine how a population of sensory neurons encodes information about tactile stimuli. One line of questions relates to how sensory neurons discriminate different textures - a rough surface feels very different from a smooth surface, obviously. However, even two smooth surfaces can feel very different when made of different materials. So we know that the brain is able to detect small differences in the firing patterns of these sensory neurons. I'd like to use this model to test the limits of discrimination.

To do this, I thought of varying the parameters of the stimulus (amplitude, frequency, duration) to generate simulated neural output for each set of stimuli. For example, I could start with 2 stimuli having different frequency and/or amplitude parameters, then build a classifier to see if it can discriminate between the neural activity patterns generated by the pair of stimuli. Usually, neural activity is represented as a time series of firing rates, and I came to know that there are several methods for converting spike times to firing rates. The easiest method is to simply count the number of spikes that occur in a narrow window of time (~10 ms) and divide by the window size. I also thought that I could convolve the spike train with a Gaussian or triangle-shaped kernel. Once we convert the spike times to spike rates for each neuron, we can then create a matrix of firing rates over time, which will serve as the input to the classifier, to see if the firing-rate matrices are discriminable. I thought of this approach, but I'm not quite sure how to program it. And I don't know whether I'm supposed to use machine learning for this.

If you're not very familiar with machine learning, you need to know that there are several types of models you can use to classify 2 stimuli based on neural activity. It all depends on the quality of your data (what technique are you using to record the firing rate?), the quality of the classification you want to obtain, and the explanation you want out of your model. A first thing I would do is to display the neural activity over time with a PCA: if the two stimuli have very different activity patterns, you should see it there. If you just want to classify your stimuli based on the activity, I would recommend a classic SVM. It will be efficient even with a low number of trials, and you'll be able to extract quite some information out of your model (e.g. which time windows or cell populations are important for the discrimination). Another trick you can use is correlation analysis, to better understand your activity patterns. If you need more info, don't hesitate to ask your questions on the thread.

Ok, so if I understood well, you want to create a simulation or to record neurons, but you don't have such data just yet. I did my PhD on exactly these types of data (from the mouse auditory cortex, with calcium imaging). So you'll have the firing rate over time of a bunch of neurons in response to different stimuli. I'll let you decide how you want to transform your spikes to get the firing rate, but to me the most elegant technique is the Gaussian kernel method. What I did to discriminate two sounds based on the neural activity was to transform my matrices into the Bathellier visualization described here (Temporal asymmetries in auditory coding and perception reflect multi-layered nonlinearities). The goal of this technique is to tell you exactly where and when some specific population of neurons activates; it assumes that you have several trials for the same stimulus. Given this information, you can then select specific time windows where you average your firing rate over time for X neurons. Btw, Bathellier was my PhD supervisor, so if you have any question about the technique just tell me (I'm one of the first authors of the paper).

Can you tell me whether this approach makes sense? Or if this can be used in place of the method you suggested? Also, I'm not quite familiar with machine learning, so I think I'll have to get acquainted with it if I were to do an SVM.
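The spike-count method mentioned in the thread (count spikes in a ~10 ms window, divide by the window size) can be sketched in a few lines of NumPy. The spike times and trial length below are illustrative, not values from the thread:

```python
import numpy as np

def bin_firing_rate(spike_times, t_start, t_stop, bin_width=0.010):
    """Firing-rate estimate for one neuron on one trial: count spikes in
    fixed windows of bin_width seconds and divide by the window size.
    Returns (bin_centers, rates) with rates in spikes/s."""
    n_bins = int(round((t_stop - t_start) / bin_width))
    edges = np.linspace(t_start, t_stop, n_bins + 1)  # avoids float drift
    counts, _ = np.histogram(spike_times, bins=edges)
    rates = counts / bin_width                        # counts -> spikes/s
    centers = edges[:-1] + bin_width / 2
    return centers, rates

# 5 spikes in a 100 ms trial, 10 ms bins
centers, rates = bin_firing_rate([0.012, 0.015, 0.043, 0.044, 0.081], 0.0, 0.1)
```

The bin width trades resolution for noise: narrower bins track fast rate changes but give spikier estimates.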
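The Gaussian-kernel method mentioned in the thread replaces the hard window with a smooth kernel: bin the spike train finely, then convolve with a unit-area Gaussian so the result stays in spikes/s. A minimal sketch; the 1 ms resolution and 10 ms sigma are arbitrary choices, not values from the thread:

```python
import numpy as np

def gaussian_rate(spike_times, t_start, t_stop, dt=0.001, sigma=0.010):
    """Firing-rate estimate from spike times (seconds): bin the spikes at
    resolution dt, then convolve with a unit-area Gaussian of width sigma.
    Returns (t, rate) with rate in spikes/s."""
    t = np.arange(t_start, t_stop, dt)
    train = np.zeros_like(t)
    idx = ((np.asarray(spike_times) - t_start) / dt).astype(int)
    idx = idx[(idx >= 0) & (idx < t.size)]
    np.add.at(train, idx, 1.0)                 # spike counts per fine bin

    half = int(4 * sigma / dt)                 # truncate kernel at +/- 4 sigma
    k_t = np.arange(-half, half + 1) * dt
    kernel = np.exp(-k_t**2 / (2 * sigma**2))
    kernel /= kernel.sum() * dt                # unit area -> output in spikes/s
    return t, np.convolve(train, kernel, mode="same")

# One spike in the middle of a 100 ms trial
t, rate = gaussian_rate([0.05], 0.0, 0.1)
```

Because the kernel has unit area, the rate trace integrates to the spike count (away from the trial edges), which makes estimates comparable across bin widths.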
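Putting the pieces together - simulated trials, a trials x (neurons x time) matrix, and the SVM suggested in the reply - might look like the sketch below. Everything here (the neuron and trial counts, Poisson spiking, the small rate difference between the two stimuli) is a toy stand-in for real data:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Two "stimuli" = two mean-rate profiles over 30 neurons x 50 time bins;
# Poisson counts stand in for binned spikes on each trial
n_neurons, n_bins, n_trials = 30, 50, 20
profile_a = rng.uniform(1, 5, size=(n_neurons, n_bins))
profile_b = np.clip(profile_a + rng.normal(0, 0.5, size=(n_neurons, n_bins)),
                    0.1, None)                     # keep rates positive

def simulate(profile, n_trials):
    # One flattened neurons-x-time firing-rate matrix per trial
    return np.stack([rng.poisson(profile).ravel() for _ in range(n_trials)])

X = np.vstack([simulate(profile_a, n_trials), simulate(profile_b, n_trials)])
y = np.array([0] * n_trials + [1] * n_trials)

# Linear SVM on z-scored features; cross-validation guards against the
# small number of trials
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())   # chance level is 0.5
```

A linear kernel also lets you inspect the fitted weights afterwards, which is one way to see which neurons and time bins drive the discrimination.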
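The PCA check suggested in the reply can be done by flattening each trial into one vector and projecting all trials onto the first two principal components; if the stimuli drive different population patterns, the trials form two visible clusters. The data below are synthetic placeholders:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Toy data: one flattened neurons-x-time response vector per trial,
# 20 trials per stimulus, with the two stimuli differing in their mean
n_trials, n_features = 20, 300
mean_a = rng.normal(0, 1, n_features)
mean_b = mean_a + rng.normal(0, 0.6, n_features)
X = np.vstack([mean_a + rng.normal(0, 1, (n_trials, n_features)),
               mean_b + rng.normal(0, 1, (n_trials, n_features))])

# Project every trial onto the first two principal components
pca = PCA(n_components=2)
Z = pca.fit_transform(X)

# The first 20 rows of Z (stimulus A) and the last 20 (stimulus B) can be
# scatter-plotted, e.g. with matplotlib; separate clusters mean the two
# stimuli evoke visibly different activity patterns.
```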
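The correlation-analysis trick mentioned in the reply can be as simple as comparing within-stimulus and between-stimulus trial correlations: if two trials of the same stimulus correlate more strongly than trials of different stimuli, the population carries stimulus information. A toy sketch on synthetic response vectors:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy population responses: one flattened response vector per trial,
# 20 trials for each of two stimuli
n_trials, n_features = 20, 300
base_a = rng.normal(0, 1, n_features)
base_b = base_a + rng.normal(0, 0.8, n_features)     # stimulus B pattern
trials_a = base_a + 0.5 * rng.normal(size=(n_trials, n_features))
trials_b = base_b + 0.5 * rng.normal(size=(n_trials, n_features))

# Pearson correlation between every pair of trials
C = np.corrcoef(np.vstack([trials_a, trials_b]))

within_a = C[:n_trials, :n_trials][np.triu_indices(n_trials, k=1)]
between = C[:n_trials, n_trials:].ravel()

# Stimulus information shows up as within-stimulus correlations exceeding
# between-stimulus ones
print(within_a.mean(), between.mean())
```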
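Once an analysis like the one described in the thread has told you where and when a population responds, the time-window averaging step reduces each trial to one number per neuron. A sketch, with an assumed 50-150 ms response window and 10 ms bins (both illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)

# rates: trials x neurons x time-bins array of firing rates (toy data here;
# in practice this comes from the binning or kernel-smoothing step)
rates = rng.poisson(5, size=(10, 30, 100)).astype(float)

bin_width = 0.010                      # 10 ms bins
window = (0.050, 0.150)                # assumed response window, in seconds
i0 = round(window[0] / bin_width)      # round() avoids float edge errors
i1 = round(window[1] / bin_width)

# Average rate inside the window: one number per trial and neuron, i.e. a
# compact trials x neurons feature matrix for classification or plotting
window_rate = rates[:, :, i0:i1].mean(axis=2)
```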