Tag Archives: NeuroPhysiology

The Chain Program

Decoding Music Attention from “EEG Headphones”: A User-Friendly Auditory Brain-Computer Interface

Transcranial Evoked Potentials can be reliably recorded with active electrodes

Combining EEG and eye tracking: a workflow for your mobile experiment

A guide to peripheral physiology measurements using the BrainAmp ExG MR – Part 1: Let’s focus on EMG

Combining Brain Products and MindAffect for a fast, robust and reliable BCI speller

Thermal testing: what is the norm? A guide to using normative data

Intro to Thermal Quantitative Sensory Testing (QST)

Joint recording of EEG and audio signals in hyperscanning and pseudo-hyperscanning experiments

The modulation of neural insular activity by a brain computer interface differentially affects pain discrimination

Lifelines Trackit T4A

Smaller, lighter & more powerful than ever!

The Lifelines Neuro trackit™ T4A is the latest ambulatory EEG amplifier, built to a robust design for the rigors of home use. It is capable of five days of continuous recording and three days of Bluetooth recording.

 

All new CGX Quick headsets with live impedance measurement by means of LEDs now available

Since we introduced the CGX Quick systems into our portfolio in 2020, several updates have been made to improve your overall experience with this dry electrode headset. Whether you are conducting research in neuromarketing, neuroergonomics, mobile applications or other fields where an easy-to-apply headset is needed, the updates recently made to the Quick systems are sure to enhance the experience of both the researcher and the participant.

What’s new? 

The new Quick-32r and Quick-20r v2 have been updated to include on-board impedance checking by means of LEDs, a technology patented by Brain Products and also implemented in our actiCAP slim electrodes. This handy feature eases your set-up, as you can see each electrode’s impedance range in real time at the electrode site itself, without having to check the recording software. BrainVision Recorder for CGX not only allows online impedance checking and has an LSL outlet, but is also compatible with RecView to perform online analysis.


CGX Quick-20r v2 (left) and Quick-32r (right)

Powered by AA batteries, the new CGX Quick systems provide up to 8 hours of recording time. These attributes help reduce your set-up time and provide you with all the tools necessary to conduct your research studies.

Moreover, with the participants’ experience in mind, the sensors were redesigned for faster set-up through the hair and increased comfort during recording sessions of up to 60 minutes. This new and improved design benefits both the research technician and the participant. As we’ve shown in previous webinars, the CGX headset can be self-donned, meaning that it is easy enough for participants to apply on their own, without the assistance of a research technician.

For a closer look at getting started with the new Quick Systems, check out this video.

Are you thinking of upgrading?

Whether you recently purchased a Quick system from CGX or your local distributor, or were one of the early adopters of these headsets, we have attractive loyalty and trade-in offers to facilitate your upgrade to the newest product bundle. If your system (Quick-30 or Quick-20r) is less than 3 years old and you wish to upgrade, your newly purchased Quick-32r or Quick-20r v2 will be discounted. Similarly, you can trade in any Quick-30 or Quick-20r, regardless of condition, if you are ordering a new Quick-32r or Quick-20r v2. Trade-in pricing is determined based on the age of the Quick system.

If you’d like to know more about this product and these exceptional upgrade options, please contact us (via email, contact form or chat) or your local distributor for more information or a product demo. Be sure to register for our upcoming webinar introducing the new Quick systems and stay tuned for other upcoming online events.

Linking the neural basis of hierarchical prediction with statistical learning: The paradox of attention

How do you collect responses?

Very preterm birth and cognitive control: The mediating roles of motor skills and physical fitness

R-Nets with infants: a walkthrough

What happens in the brain of infants is especially interesting to developmental and neurocognitive psychologists. Until now, however, recording EEG from infants was a scientifically risky process with high dropout rates, due to the long and – for the infant – uncomfortable preparation of the measurement. With the new R-Net system, the preparation time can be reduced to about five minutes, giving scientists more time to collect good data.

In this guide, we provide a detailed walkthrough showing how to use the R-Net in studies with infants.

Overview

1. Before the participant arrives
a. Provide sufficient information to the caregivers 
b. Preparation of materials 

2. When the participant arrives
a. Inform the caregivers and make sure the infant is fine 
b. Fit the R-Net 
c. Adjust the cables 
d. Check impedances 

3. Starting the measurement
a. Instruct caregivers 
b. Record video of the experiment 

4. After the experiment
a. Show signal to caregiver 
b. Clean the equipment 


1. Before the participant arrives

a. Provide sufficient information to the caregivers

Often caregivers are not familiar with EEG measurements and are afraid to participate in EEG experiments with their infants. When inviting families, always make sure to provide an information sheet in which you explain the technique, state possible risks such as skin irritations, and list possible contraindications. You can also include pictures of an infant wearing the cap (make sure you have the family’s consent for this) so the caregivers know what to expect.

Some caregivers expect that bringing their infant to an EEG measurement will provide them with medical information about the infant. Thus, it is important to state that the measurement does not have a diagnostic purpose and that you cannot deliver any medical information.

In contrast to EEG measurements with adults, infants do not need to come with their hair washed. As infants have little, thin hair, the measurement works well without washing. With the R-Nets, the hair also does not need to be washed after the experiment: it only gets a little wet, and you can easily rub it dry with a towel or use a hair dryer.

If you cannot measure the infant’s head circumference yourself before the EEG measurement, you can ask for it in the pre-testing information. Infants regularly visit the doctor and have their head circumference measured there – if the last measurement is no more than four weeks old, you can use that number. Heads do not grow as fast as the rest of the body.


b. Preparation of materials

Make sure to have the materials you need during the testing prepared. This should include:

  • an interesting toy for the infant in case you need distraction
  • a selection of infant-friendly videos you can play during the preparation
  • hook and loop fasteners or other material to fix the cap’s cables
  • a measuring tape

You can also prepare the R-Net electrolyte solution in advance. For this, fill the measurement cup labelled “electrolyte/water” with 1.5 liters of distilled water and add 1 teaspoon of potassium chloride (KCl) per liter (i.e. 1.5 teaspoons). Also add a couple of drops of baby shampoo and mix the solution.

 Tip: This video shows how to prepare the solution.

If you know the head circumference of the infant, you can already put the R-Net of the correct size into the solution. You can read the size of the net from a small label close to the white plastic bar (the “clamping block”). Take the size closest to the infant’s head circumference; if the circumference falls between two cap sizes, take the larger one so that the infant’s ears fit nicely into the cap. When you put the net in the solution, cover the splitter boxes with a towel so they do not get wet – you can also store them on a shelf above the measurement cup to make sure no water drips on them. Make sure that the whole R-Net is submerged in the solution. The net needs to soak for a minimum of 15 minutes but no more than 30 minutes. For example, you can put it into the solution 15 minutes before the scheduled appointment and set a timer for 30 minutes so that you know when to take it out if the family is running late. If you have other experimental tasks planned before the EEG measurement (something we would not recommend), plan the soaking time accordingly.

You can also prepare the disinfection solution in advance. For this and for cleaning the R-Net, Brain Products officially recommends using distilled water; we have been using filtered water instead, and so far everything works fine in our lab with this alternative (be aware that it may increase corrosion of the material). Fill 1.5 liters of distilled/filtered water into the measurement cup labelled “disinfection”, add the appropriate amount of your disinfectant, and stir the solution.


2. When the participant arrives

a. Inform the caregivers and make sure the infant is fine

When the families arrive, bring them to the experimental room as soon as possible and inform them there. This way, the infant can get used to the experimental conditions, which might be very unfamiliar (many distracting cables and pieces of equipment, different lighting conditions, etc.).

Put a nice, comfortable chair for the caregiver in front of the screen. Letting the infant sit on the caregiver’s lap might increase movement artifacts compared to an infant’s chair; however, the infant might feel more comfortable on the lap and be less fussy. Decide how to arrange infant and caregiver depending on your specific experiment. If the infant sits on the caregiver’s lap, have some pillows available to make it more comfortable for the caregiver if necessary – this way you can at least reduce movements from the caregiver.

If you do not have the head circumference of the infant yet, make sure to measure it and prepare the R-Net. Ideally, you have two experimenters available for the whole experiment: one can continue informing the caregivers while the other prepares the cap.

We usually bring a cap of a different size to show the caregivers and explain again how it works. In the meantime, you can give the infant an interesting toy.

Ask the caregiver whether they think the infant is fine and offer the possibility to feed the infant. Caregivers often assume the infant will make it through the experiment and plan to feed it afterwards; however, it is better to have the infant as happy as possible before the experiment.


b. Fit the R-Net

After 15 minutes of soaking, take the net out of the electrolyte solution and blot off excess water. Do not worry that the cap will not work if it seems dry from the outside – as long as the sponges are soaked with the solution, it will. On the contrary, a cap that is too wet easily leads to bridges between electrodes. Wring out the chinstraps of the cap: they are usually also soaked with the solution, and wet chinstraps are uncomfortable for the infant, so get them as dry as possible.

During the preparation, play some infant-friendly videos so the infant is distracted and does not notice the cap immediately. This also keeps the infant looking straight ahead, which makes it easier to fit the cap correctly.

Ideally, two people fit the cap: one kneels in front of the infant and fits the cap close to the infant’s eyebrows, while the other holds the back of the cap and places it over the infant’s head. It is particularly important to fit the cap as fast as possible so the process does not bother the infant. It is worth training the team of experimenters on a Styrofoam head a couple of times before they start testing real infants, to make sure everyone knows exactly what to do. Changes in the team always worsen the results for some time, as insecurities easily transfer to the mood of the infant.

Close the chinstraps of the cap and make sure that the white plastic bar (i.e. the clamping block) sits approximately at the infant’s jaw. You might need to cut the foremost tube on the plastic bar, as it might tickle the infant near the mouth and draw its attention to the cap.

 Tip: This video shows how to exchange tubes of the R-Net to optimize its fit.

After fitting the R-Net, make sure that all the electrodes are straight and in contact with the head, and that the cap is symmetrical on both sides. Cz should be centered between nasion and inion and between the preauricular points. The electrodes on top of the head in particular are prone to twisting; you will probably need to adjust them manually.


c. Adjust the cables

In your lab environment, make sure you can place the cables and boxes of the cap behind the infant: anything in the infant’s sight will provoke the infant to grab it. For example, you can lay the cable over the caregiver’s shoulder and fix it there. Fixing the cable also limits the infant’s range of movement, leading to fewer movement artifacts.


Image showing the typical setup in our lab at LMU Munich


d. Check impedances

Plug the amplifier cables into the boxes and start the impedance measurement. The R-Nets can tolerate high impedances of up to 150 kOhm; the BrainAmp amplifiers, however, can only measure up to 100 kOhm. Therefore, work on the impedances until they are below 100 kOhm.

To work on the impedances, you can massage the cap with your hands. This way the infant does not attend to the cap as much as when you apply additional solution – and as infants have little hair, massaging is often enough.


3. Starting the measurement

a. Instruct caregivers

Before finally starting the measurement, instruct the caregiver about their expected behavior during the measurement. You probably cannot stop the infant from moving but ask the caregiver to remain as still as possible. If the infant does not need their hands during the task, ask the caregiver to gently hold onto the infant’s hands. This way you reduce the possibility for the infant to grab and pull the R-Net.


b. Record video of the experiment


Screenshot from Video Recording

Make sure to record a video of the infant during the experiment. In the best case, the video is already time-locked to your EEG recording.

For example, you can use BrainVision Recorder with a video recording add-on to simultaneously record a video and the EEG signal. This way you can easily code whether the infant was attentive to the screen in the respective trials. You can also monitor the infant’s attention live and trigger attention getters or breaks during the experiment.

The regular use of attention getters during the experiment is extremely helpful. Usually, right after the attention getters, the infant is attentive and still for a couple of seconds, giving you a much better signal for this period.


4. After the experiment

a. Show signal to caregiver

If possible, show the EEG signal to the caregiver after the experiment. They will of course not see much in it, but caregivers are usually interested in what it looks like. You can also offer screenshots of the video you recorded so that they have a picture of the infant with the cap. This will increase compliance with your study, your lab, and further experiments.


b. Clean the equipment

After taking the cap off the infant, wrap the splitter boxes in a towel and put the cap into the measurement cup labelled “disinfection” that you prepared before the experiment. Make sure the whole cap is completely under water and let it sit for about 10 minutes. In the meantime, you can clean the measurement cup labelled “electrolyte/water” and fill it with one liter of distilled water. After the disinfection, place the cap into this measurement cup and wait for one more minute, moving the cap around a little to make sure that all disinfectant solution is washed out – this reduces the risk of skin irritations at the next use. Repeat this one-minute rinse in fresh water two more times. Afterwards, hang the cap up to dry, making sure that the boxes are stored higher than the cap so that no water can drip into them.

Extend your BrainVision Analyzer 2 to its full potential with Solutions

Are you looking for extensions for BrainVision Analyzer 2? They are called Solutions! Scientists from various fields of research use them to tweak Analyzer to their needs. Analysis of non-EEG sensor data, sleep data, single trials and time-frequency domain exports are only some examples where users can benefit from our solutions.

In the following we present our most popular solutions and show how they add valuable functionality to Analyzer 2. We will start with general remarks and installation instructions and continue with some selected use-cases dedicated to specific analysis needs. At the end we present generally useful solutions that many of our users can profit from.


General remarks and installation instructions

BrainVision Analyzer 2 is appreciated for its easy-to-use yet powerful signal processing. Analysis pipelines created from the rich collection of Transformations cover most analysis needs while being extremely memory efficient. As scientific methodology is rapidly evolving, researchers will occasionally miss a function or method that is not implemented as a transformation. At Brain Products Scientific Support, we offer a variety of extensions to Analyzer 2 that fill this gap. We call them Solutions. They are usually created in response to a frequently needed functionality, and over time we have grown a significant library of them. Solutions are free of charge for any Analyzer 2 user. Once installed, you can use them almost like any other transformation.

Most popular solutions can be directly downloaded from our website. You have the option to download all of them at once or only individual ones. Either way you only need to run the installer and open Analyzer 2 to have them in your Solutions ribbon menu (see below).

Solutions Ribbon Menu in BrainVision Analyzer

If Analyzer 2 was already open, click Solutions > Help > Refresh Solutions to see them. Under the Solutions Help menu you will find the documentation for each of them. You can read more about how to use the Solutions Help Explorer in the Support Tip “Have you located the Solutions help documentation for Analyzer 2?”. If you are struggling with a certain task in Analyzer 2, our Scientific Support is always happy to help. We might send you a solution that is not available on our website; in this case you receive a solutions file (*.vaso) that you need to add to a subfolder of the Solutions directory on your Analyzer 2 installation path. The default path is: C:\Vision\Analyzer2\Solutions.

One word on macros – yes, solutions are basically compiled macros. You can add your own functions to Analyzer 2 by writing a Sax Basic macro and running it through the Macro ribbon menu. This topic will not be covered in this article. You can find more information on it on our website.

Solutions to help with specific analyses

Sensor Data

If you are working with signals from non-EEG sensors, we offer a range of solutions that you might find useful. For instance, you can analyze acceleration data, ECG profiles, EMG or GSR Peaks, Pulse transit times and width with the help of solutions. We have recently described how you can do that in our Support Tip “Offline analysis of sensor data in BrainVision Analyzer”.

Sleep

If you are a sleep researcher, you might be interested to know how to score and use sleep stages in Analyzer 2. Our Sleep Scoring solution allows you to manually score your data or to inspect and edit imported sleep scores. For this purpose, we introduced the SleepStage marker dedicated to sleep research. Its description indicates the physiological state (sleep stage N1, N2, N3, N4, REM, or Wake) or the absence of a score (None). The Sleep Scoring solution recognizes these markers and allows you to edit them.

You can navigate the sleep data in steps of the desired scoring interval (typically 30 seconds). The frequency-spectra or topography can be displayed simultaneously to support the scorer. A full night hypnogram displaying all scores can be opened in a Microsoft Excel® sheet. Once scoring is finished, a Sleep Report can be generated. It summarizes important sleep parameters such as sleep latency, sleep efficiency, duration of stages and composition of sleep cycles. The approved scores remain as markers in the dataset and can be used for Segmentation in a sleep-informed analysis in Analyzer 2. This solution is available on request.

If you are interested in a demonstration of the solution, feel free to watch the recording of our webinar about  “Sleep Research and Sleep Scoring Solution”.


Figure 1. Sleep Scoring in Analyzer 2 with the display of the frequency data of the current scoring interval.

Single trial analysis

Analyzer 2 was originally designed for ERP analysis. Many ERP studies need to extract features of ERP components only after performing an Average across trials. Analyzer’s transformations and exports are designed for this approach and offer feature extraction from averaged data. Solutions add the possibility to also perform single trial analysis.


Figure 2. Stacked Plot view: the solution utilizes the time-frequency view in Analyzer 2 to show multiple stacked trials. For this reason, the label of the ordinate is showing Hz instead of trial number.

The Peak Detection transformation, for example, detects peaks on averaged data, while the MinMax Marker solution does so on the single-trial level. It allows you to place Peak markers at the maximum and/or minimum within a certain interval of each segment.
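
To illustrate the idea behind single-trial peak detection, here is a minimal numpy sketch (our own illustration, not the solution's actual code; the function and variable names are invented):

```python
import numpy as np

def minmax_in_interval(segments, t, t_start, t_end):
    """For each trial, find the maximum amplitude and its latency
    within [t_start, t_end]. segments has shape (n_trials, n_samples),
    t holds the time axis (one value per sample)."""
    mask = (t >= t_start) & (t <= t_end)
    window = segments[:, mask]
    idx = window.argmax(axis=1)                       # sample index of the max, per trial
    amplitudes = window[np.arange(window.shape[0]), idx]
    latencies = t[mask][idx]                          # latency in the units of t
    return amplitudes, latencies
```

A Peak marker per segment would then sit at each returned latency; the minimum works the same way with `argmin`.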

Likewise, where Area Information and Peak Information Export modules work on averaged datasets, you can export data from single trials with the solutions described in the section Solutions for exports below. There are solutions for time, frequency, or time-frequency domain exports.

Before the detection and export of peaks, it often makes sense to assess the ERP on the single-trial level – for example, to estimate the variability of components across segments or to visually inspect a set of trials at once. You can do this with the Stacked Plot solution. It will display all segments stacked on top of each other (see Figure 2). The solution uses the time-frequency view in Analyzer 2 to display trial number on the ordinate and time on the abscissa. Amplitude is shown on a color scale that can be configured through the view settings.

Solutions for exports

Our most popular solutions are exports. These provide exports of time, frequency or time-frequency domain data from averaged or single-trial history nodes. For a detailed overview about exports please read our Support Tip “Exports for all occasions – A selective overview of Analyzer 2’s most useful export options”.

Peak Export: Despite its name, the solution exports not only various peak measures such as amplitude, latency, peak-to-peak distance and area under the curve, but also average amplitudes within a time interval. This is the go-to solution if you need to export single-trial, time domain data.


Figure 3. Interface of the Wavelet Data Export solution.

FFT Band Export: Like the Peak Export, this solution can be used to export single-trial data from a selected frequency range. It allows you to export individual values as well as many aggregations such as area-under-curve measures, the average or the raw sum.

Wavelet Data Export: If you are using wavelets to decompose your data into the time-frequency domain, this solution is essential. You can specify a time and frequency range (see Figure 3) and then export the sum or average of this area to a text file. It is applicable to both real (e.g. power) and complex values. It is now also possible to export to a comma-separated values file (*.csv), making the transfer to other software even easier.
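
Conceptually, the export reduces a rectangular time-frequency window to a single number. A minimal numpy sketch of that reduction (our own illustration, not the solution's code):

```python
import numpy as np

def tf_window_average(tfr, times, freqs, t_range, f_range):
    """Average a time-frequency matrix (rows = frequencies, columns =
    time points) over a rectangular window -- the kind of value a
    time-frequency export writes out per channel."""
    t_mask = (times >= t_range[0]) & (times <= t_range[1])
    f_mask = (freqs >= f_range[0]) & (freqs <= f_range[1])
    return tfr[np.ix_(f_mask, t_mask)].mean()
```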

Create MAT File: Complementary to the MATLAB transformation, which interfaces directly with the application, this solution allows you to create a MATLAB® compatible file (*.mat) with the data of the current history node. For each channel, a separate variable is created. Note that time-frequency domain data cannot be exported with this solution.
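
For readers who continue in Python rather than MATLAB®, a one-variable-per-channel *.mat file of this kind can be read with SciPy. The sketch below writes a stand-in export and loads it back; the channel names and file name are invented examples:

```python
import os
import tempfile
import numpy as np
from scipy.io import loadmat, savemat

# Write a stand-in for a "Create MAT File" export: one variable per channel.
path = os.path.join(tempfile.mkdtemp(), "export.mat")
savemat(path, {"Cz": np.array([0.1, 0.2, 0.3]),
               "Pz": np.array([0.4, 0.5, 0.6])})

# Read it back; loadmat adds a few "__header__"-style keys we skip.
mat = loadmat(path)
channels = {name: data.ravel() for name, data in mat.items()
            if not name.startswith("__")}
```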

Solutions for parameter extraction


Figure 4. Example output of the Write Markers solution.

The Write Markers solution has found many useful applications despite its simplicity. It collects basic information of selected markers and writes them to an external text file. Marker information includes Description, Type, Position, Duration and Amplitude at the marker position. An example file is shown in Figure 4. It is generated from a continuous dataset and only includes selected output information. If the dataset is segmented, each row in the file corresponds to a segment. If it is continuous, you can specify a marker that will trigger a new line or row. In this example the “S_20” marker was used. In the exported file, the position of this marker is reset to zero. The position of all other markers in the same row is exported as the distance to the previous “S_20” marker. This feature allows you to inspect marker placement and to plan a marker-based segmentation before actually implementing it.

This export is also quite useful to extract:

Reaction time from markers

Often the distance between a Stimulus marker and a following Response Marker indicates reaction time. In the example export in Figure 4 the Stimulus marker triggers a new row and the position of the following Response marker indicates reaction time. Note that the Position can be exported in milliseconds or in data points.
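
As a sketch of this computation, the following Python function derives reaction times from an ordered list of (description, position) tuples, as you might obtain after parsing a Write Markers export; the marker descriptions are example values, not fixed names:

```python
def reaction_times(markers, stim="S 20", resp="R 1"):
    """Compute stimulus-to-response distances from an ordered list of
    (description, position) tuples. Positions can be in milliseconds
    or data points, matching the export setting."""
    rts, stim_pos = [], None
    for desc, pos in markers:
        if desc == stim:
            stim_pos = pos                  # a stimulus starts a new "row"
        elif desc == resp and stim_pos is not None:
            rts.append(pos - stim_pos)      # distance = reaction time
            stim_pos = None                 # count one response per stimulus
    return rts
```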

Peak frequency


Figure 5. Example of the MinMax Marker solution applied to FFT (frequency domain) data to identify peak frequencies.

For most spectral analyses, frequencies of interest must be defined. For example, when the individual alpha frequency (iAF) is of interest, the peak of the alpha band of each subject needs to be detected and exported. The peak of the alpha band – or of any other frequency band – can be exported when Write Markers is used in combination with the MinMax Marker solution (see Figure 5): the MinMax Marker solution finds the largest (or smallest) value in a dedicated frequency band and inserts a Peak marker, and the Write Markers solution then exports the Magnitude and Frequency of that Peak marker. Both solutions can be applied to segmented or averaged frequency domain data.
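
The underlying computation is simple; here is a minimal numpy sketch of finding a band-limited spectral peak (our own illustration, not the solution itself):

```python
import numpy as np

def peak_frequency(spectrum, freqs, band=(8.0, 13.0)):
    """Return the frequency of the largest spectral value inside `band`,
    e.g. the individual alpha frequency (iAF) of one subject."""
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return freqs[mask][np.argmax(spectrum[mask])]
```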

List rejected segments

A common preprocessing step for EEG or ERP analysis is the detection and rejection of data containing artifacts. In Analyzer 2, Bad Interval markers are used to indicate them. You can detect artifacts automatically with Raw Data Inspection or Artifact Rejection. Segments that contain Bad Interval markers can be rejected directly within Segmentation or automatically ignored by other transformations such as the Average. Researchers often need an overview of the segments that contain artifacts and were rejected from the result; such a list can be exported with the Write Markers solution by exporting the Bad Interval markers.

Other popular solutions

Recode Markers: If you are interested in exploring the relationship between the behavioral response and the EEG, this solution is worth noting. It allows you to select a group of segments based on the temporal distance between markers. Typically, the distance between a Stimulus and a Response marker is used to reflect the reaction time. Available groupings are Median/Mean Split, Mean ± SD, Upper/Lower percentages, Middle fraction and more. It is also possible to define your own fractions in percentage or time range. The inserted marker (Type “Comment”) can be used within Segmentation to create a group ERP. Additionally, statistics such as the Mean, Median and Standard Deviation (SD) of the marker distances (e.g. reaction time) are reported in the Operation Infos of the Marker Recode history node.

Set Markers: Have you ever needed to add a marker to your dataset – maybe because you realized only after recording that it was needed, or because it was simply forgotten? Of course, you can add or edit markers with the Edit Markers transformation, but if you need to place a marker at a fixed distance from another existing marker, this solution will help. It allows you to insert markers at a fixed or randomized temporal distance from all markers of a selected Type and Description. This solution is available on request.

Read Coordinates: If your dataset is lacking electrode coordinates but they are available in an external file, you can load them with this solution. You can specify the type of coordinates used (Cartesian or spherical) and tell the solution where to find the information in the file. This makes it possible to read any electrode coordinate file in a compatible text format. The solution also converts coordinates: Analyzer 2 uses only spherical coordinates, so if yours are specified in Cartesian form, they will be converted. This solution is available on request.

Moving Average: Some analyses require estimating the envelope of the signal, for example in EMG analysis. The Moving Average solution can be used to smooth the data, similar to a low-pass filter: each value is replaced with the average of a time window centered on the current data point. It offers extra options such as rectifying the signal beforehand, or subtracting the average instead of replacing each value with it.
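
The principle can be sketched in a few lines of numpy (an illustration of the technique, not the solution's implementation):

```python
import numpy as np

def moving_average(signal, window, rectify=False, subtract=False):
    """Centered moving average as a smoothing/envelope estimate.
    rectify: take the absolute value first (typical for EMG envelopes).
    subtract: return the signal minus its moving average instead."""
    x = np.abs(signal) if rectify else np.asarray(signal, dtype=float)
    kernel = np.ones(window) / window
    smoothed = np.convolve(x, kernel, mode="same")   # note: values near the edges are attenuated
    return x - smoothed if subtract else smoothed
```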

Concluding remarks

To exploit the full processing power of Analyzer 2, it is good to know Solutions and the range of functionality they add. This article provides a glimpse of the full spectrum available, focusing on the solutions that are useful for most researchers. If you are stuck with your analysis and need to advance your pipeline beyond what you can do with transformations, our tip is: browse through our solutions and find out whether they can help you. If you don’t find anything that fits your pipeline, contact us – we have more! Please get in touch via support@brainproducts.com and we will do our best to find a solution for you.

The sound of silence: an EEG study of how musicians time pauses in individual and joint music performance

Combining EEG and eye tracking: a workflow for your lab experiment

Combining EEG and eye tracking can open new possibilities for your EEG analysis. If you would like to add eye tracking to your EEG setup but are unsure how to implement this, we have great news for you: Thanks to our new cooperation with Tobii Pro, Brain Products now offers complete out-of-the-box solutions for simultaneous EEG and eye tracking!

Abstract

This article introduces how you can combine your EEG measurements with simultaneous eye tracking. We offer a full example workflow for a specific lab-based setup, while pointing out generally important aspects for a successful combination of EEG and eye tracking. For our setup, we are using the software Tobii Pro Lab for experimental control, the Tobii Pro Spectrum for recording eye tracking data, and the actiCHamp Plus to record EEG data in combination with our Photo Sensor. In the workflow, we describe how you can design your experiment while setting up shared event markers, how to perform the combined recordings, and how to merge both data streams in BrainVision Analyzer 2.

Boost your EEG research with simultaneous eye tracking!

In the last decade, the combination of eye tracking with measures of brain activity like EEG or fMRI has increased. But why would we want to take a closer look at the eyes when investigating brain activity?

Eye tracking offers two major sources of information:

The position of a person’s gaze gives us insight into the overt focus of their attention, and this information can be highly valuable for EEG research. With this gaze information, you will be able to tell if participants are focusing their attention on a target, when exactly their focus arrives, and how long it stays before shifting elsewhere. This will let you identify trials in which the participant was not paying attention and discard them from your analyses. Most importantly though, you gain precise timing information for your EEG analysis. Event-related potentials (ERPs) can, for example, be calculated with respect to the fixation on a stimulus (i.e., Fixation-Related Potentials), instead of the mere stimulus appearance on the screen. Research addressing topics like attentional processes, visual search, reading or social perception can highly benefit from gaze information.

Changes in pupil size can inform about cognitive and emotional experiences. The pupil reacts to different stimuli with short dilations (in the range of seconds). These “pupil responses” are a very sensitive physiological measure, and their magnitude reflects the intensity of the ongoing cognitive/emotional processes. Thus, stimuli that are more emotionally arousing, or that demand higher cognitive effort, cause larger pupil responses. By analyzing them, you may be able to check whether your experimental manipulation was successful, or even follow cognitive or emotional processes dynamically throughout your experiment. In combination with EEG, you could, for example, use the magnitude of pupil responses to weight or categorize different trials in your experiment.

Adding eye tracking to your EEG setup will hence open a range of new possibilities for your research!

Figure 1. Combined EEG and eye tracking setup in a laboratory setting

A workflow for your lab-based EEG & eye tracking experiment

Combining two different measures like EEG and eye tracking can be technically challenging, especially if they are to be temporally aligned and analyzed together. The key here is setting up shared event markers/triggers that appear in both the EEG and the eye tracking data at the same time. Note that not all trigger signals need to be shared: it is enough to have a few (at least two) shared event markers to align the data sets after recording. It is however crucial that an equal number of the shared events appear in both data sets, and that they mark common points in time. Therefore, we need to plan our setup and experiment with these shared trigger events in mind.

Event markers are usually generated by the software used for experimental control (like E-Prime®, Presentation®, Psychtoolbox, or Tobii Pro Lab‘s Designer module). There are many ways to pass them on to your EEG and eye tracking recordings. The best setup for you will depend on your experimental software and the properties of the computer, EEG amplifier and eye tracker you are using.

Here, we want to show you one concrete example for setting up simultaneous EEG and eye tracking recordings. We are going to explain how you can record high-quality data in a lab-based setup using:

Figure 2. Example workflow for simultaneous EEG and eye tracking recordings

1. Design your experiment and set up shared event markers

Naturally, the workflow needs to start with designing and planning your experiment. If you use the Tobii Pro Lab software for the experimental design, it will allow you to set up the timeline of your experiment in a very intuitive way. Make sure the timeline always starts with a calibration and validation routine to accurately map and record gaze data. Next, you can add all sorts of stimuli to the timeline, e.g., pictures, text elements, videos, or groups of stimuli. You can find an introduction video on how to create a screen-based study with Tobii Pro Lab here, and further useful information here.

When designing your experiment, you need to set up shared event markers that will allow you to temporally align EEG and eye tracking data after recording. Note that you will need at least two markers of the same type appearing at the same time in both data streams. For example, you can send the first synchronization marker a few seconds before your task begins, and the last one a few seconds after the task finishes. This way your synchronization markers span the whole experiment, and you can align the EEG and eye tracking data sets completely.

(a) Marking events in the eye tracking data

The Pro Spectrum eye tracker can receive TTL trigger signals. However, in this specific example, we are using Tobii Pro Lab not only to present stimuli, but also to record eye tracking data. Therefore, all presented stimuli will be marked automatically as “Events” in the eye tracking data and you don’t need to worry about triggers.

(b) Marking events in the EEG data

To mark stimulus events in the EEG data, TTL hardware triggers are usually the preferred solution because they offer highly accurate timing. Tobii Pro Lab can send TTL pulses to mark stimulus events if your computer has a parallel port card available. However, for this scenario we will assume that you are working with a laptop that has no parallel port.

With a small workaround, you can still precisely record the stimulus timing in your EEG data by using a Photo Sensor. This small accessory detects changes in brightness that can be recorded alongside your EEG data. Simply attach the Photo Sensor to one corner of the presentation screen and modify your stimuli in a way that they differ in brightness in this very corner (see Figure 3). This way, the photo sensor will detect a change in brightness every time the next stimulus is presented. During later analysis, you can identify the stimulus onsets from the Photo Sensor signal. The timing of this solution is very precise because stimuli are detected by the Photo Sensor exactly when they appear on the screen.

 Tip: If you want to directly generate trigger events from your Photo Sensor, you can combine it with the Brain Products StimTrak! The StimTrak can convert the Photo Sensor signal into trigger pulses and pass them on to your EEG recording where they will appear as event markers. See this article for more information.

 Tip: If you want to identify different kinds of stimuli from your Photo Sensor signal, you can modify your stimuli with different shades of grey (this article offers a more detailed description).

Figure 3. Using a Photo Sensor to detect stimulus onsets. In this example, two checkerboard stimuli (A and B) are shown alternatingly on the presentation screen. Only Stimulus A displays a bright square in one corner. If the Photo Sensor is attached in this corner of the presentation screen, it will detect the change in brightness at every onset and offset of Stimulus A. During later data analysis, the Photo Sensor signal can be used to derive stimulus markers with very precise timing.

Here are a few additional options for alternative setups:

 Find a support article here about sending TTL trigger pulses, or read about our TriggerBox for sending trigger signals via USB port.

 If your EEG amplifier and your eye tracker have trigger ports and you can send TTL pulses, you can share the exact same triggers among both devices. Either split the trigger signal with a Y-cable, or use the practical trigger mirroring function of our actiCHamp Plus: this amplifier can receive 8-bit triggers and can immediately pass them on to your eye tracker!

 If you are using E-Prime® for presenting your experiment and have a screen-based Tobii eye tracker, you may be interested in the E-Prime extension for Tobii Pro Lab.

2. Prepare the eye tracking recordings

To set up your eye tracking recording, your Spectrum eye tracker needs to be connected and correctly set up in Tobii Pro Lab (find more information here). Once this is done, you will find everything you need in the “Record” tab of Tobii Pro Lab. Here, you should pay special attention to the sampling rate (or “sampling frequency”) with which you are recording the eye tracking data (click on the eye tracker symbol in the top left corner). Higher sampling rates not only allow you to assess fixations, saccades and even micro-saccades (see this article), but also let you record the stimulus events with more temporal precision. Therefore, higher sampling rates enable a more precise synchronization with the EEG data.

It is also important to set up the stimulus markers in Pro Lab with the highest temporal precision. You may encounter delays between the stimulus marker being registered in Pro Lab and the stimulus actually appearing on the presentation screen. To reduce such delays, please make sure that the computer running Pro Lab matches the required specifications, and carefully follow these important tips to optimize your stimulus timing in Pro Lab. To find out how you can determine this delay in your setup, and how you can account for it during recording, you can take a look at this Timing Guide.

Before recording data with an actual participant, you will need to run at least one test recording of your final task and make sure your current setup and the available stimulus events let you analyze everything of interest in Pro Lab’s “Analyze” tab. If all events are marked in Pro Lab and you are satisfied with their timing, you are all set for the eye tracking recordings.


3. Prepare the EEG Recordings

To prepare your EEG recordings, you will need to set up the actiCHamp Plus with the PowerUnit, and connect the Photo Sensor to one of the amplifier’s AUX channels. When preparing your workspace in BrainVision Recorder, make sure to also set up the respective AUX channel for recording the Photo Sensor signal. For the EEG data, we can use a higher sampling rate (for example 2000 Hz) to achieve high temporal precision of the signal and good synchronization with the eye tracking data.

When everything is set up, you will need to identify the correct position for the Photo Sensor on the presentation screen. For this, briefly start a test run of your experiment and attach the Photo Sensor to the monitor with an adhesive ring. Next, run a test EEG recording to make sure you can identify all necessary stimulus events in the recorded Photo Sensor signal. Present the full experiment while recording, then load the data in BrainVision Analyzer 2. If your setup contains the Photo Sensor in combination with a StimTrak, the stimulus events should already be marked in your EEG data. Otherwise, you can now use the “Level Trigger” transformation. Here, you can identify the optimal threshold value for your Photo Sensor data and extract the stimulus events from the Photo Sensor channel (see Figure 4).

Figure 4. Identify the stimulus onsets from the Photo Sensor channel with the Level Trigger transformation.
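The idea behind this onset extraction is simple threshold-crossing detection. As a minimal, hypothetical illustration (the function name, threshold and sampling rate below are assumptions for the example, not Analyzer's actual implementation):

```python
import numpy as np

def level_trigger(channel, threshold, direction="rising"):
    """Return the sample indices where the signal crosses a threshold,
    one marker per rising (or falling) crossing."""
    above = channel > threshold
    if direction == "rising":
        return np.flatnonzero(~above[:-1] & above[1:]) + 1
    return np.flatnonzero(above[:-1] & ~above[1:]) + 1

# Example: a Photo Sensor trace that goes bright twice
fs = 1000                      # Hz, assumed AUX sampling rate
trace = np.zeros(5 * fs)
trace[1000:1200] = 1.0         # stimulus on screen from 1.0 s to 1.2 s
trace[3000:3200] = 1.0         # stimulus on screen from 3.0 s to 3.2 s
onsets = level_trigger(trace, threshold=0.5)
print(onsets / fs)             # stimulus onset times in seconds: [1. 3.]
```

Choosing the threshold roughly halfway between the "dark" and "bright" levels of your recorded Photo Sensor signal keeps the detection robust against noise.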

Keep in mind that the shared synchronization events need to appear at the same time in both EEG and eye tracking data, and that there need to be an equal number of synchronization events present in both data sets. If necessary, you can use the “Edit Markers” transformation to rename or modify some events in your EEG data.

4. Record EEG and eye tracking data simultaneously

Now you are ready for the real data acquisition! Set up the EEG system and cap, use the prepared workspace and the Photo Sensor. To get ready for the eye tracking recordings, load the correct experiment in Tobii Pro Lab. Then have the participant sit in front of the eye tracker and presentation screen at the optimal distance. After double-checking that all settings are correct (see section “2. Prepare the eye tracking recordings” above), you can enter a name for your participant, and the “Record data” button will become available in Tobii Pro Lab.

When starting the recording in Tobii Pro Lab, follow the calibration and validation procedure until you are satisfied with accuracy and precision. Before you start the actual task, make sure to start your EEG recordings in time for the Photo Sensor to capture the first synchronization marker. Always keep an eye on the data streams in BrainVision Recorder and Tobii Pro Lab to make sure all data is recorded smoothly. When the task is finished, again make sure the Photo Sensor captured the last synchronization event before stopping the EEG recording.

 Tip: Make sure you do not stop or pause the EEG recordings before the task is fully finished, so you can later align the EEG and eye tracking data sets!

5. Analyze the eye tracking data

Now it’s time to analyze your eye tracking data in Tobii Pro Lab’s “Analyze” tab. It is good practice to start with some quality control (reviewing the recording and checking for data loss). Then you will be able to perform all kinds of analyses, export metrics or create graphics from your recorded gaze data. What may be most relevant for your combined EEG and eye tracking analysis is to identify times of interest or fixations in areas of interest in your eye tracking data.

When you are done with your eye tracking analysis, you can export the gaze and pupil data together with all identified event markers and import them into your EEG data. For this, use the “Data Export” option in Pro Lab and export the data in the Pro Lab Output File (PLOF) format.

6. Identify the event markers in your EEG data

After a brief quality control, you can extract all stimulus events from the Photo Sensor channel by using the “Level Trigger” transformation with the previously tested settings (see section “3. Prepare the EEG recordings” above). If necessary, modify the resulting markers so you can clearly identify the synchronization events that should be shared with the eye tracking recording.

 Tip: Be careful with segmenting your EEG data before importing the eye tracking data to make sure you don’t lose important synchronization markers!

7. Merge both data sets for combined EEG and eye tracking analysis

Finally, you can import the eye tracking data and the events you identified in your eye tracking analysis into your EEG recordings. At this point, the sampling rates and the lengths of both data sets will likely differ, but BrainVision Analyzer 2 will use the shared synchronization events to bring both data streams onto the same timeline.

To merge the EEG and eye tracking data, open the EEG data containing the identified synchronization events. Next, use Analyzer’s Add Channels transform and select the previously exported eye tracking file under Import files. In the next window, you will need to select the shared synchronization markers which will be used to align both data sets. For the EEG data, they can be selected from the Markers in Active Node list, for the eye tracking data from the Markers in Import File list. If you click on the Details button, you will see if there is an equal number of synchronization markers in both data sets.

In the following dialogs, you will be able to select the specific channels and markers you would like to import. Finally, when you finish the Transformation, the eye tracking channels will appear underneath your EEG channels, and all selected event markers will be imported.
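Conceptually, the alignment amounts to fitting the eye-tracking timeline to the EEG timeline using the shared markers and resampling the imported channels. A minimal sketch of that idea, assuming two shared synchronization events (all names and sync times are hypothetical, and Analyzer's actual algorithm may differ):

```python
import numpy as np

def align_to_eeg(et_times, et_values, et_sync, eeg_sync, eeg_times):
    """Map an eye-tracking channel onto the EEG timeline.

    et_sync / eeg_sync hold the times of the same shared synchronization
    events in each recording's own clock. A least-squares linear fit
    absorbs clock offset and drift; the channel is then interpolated
    at the EEG sample times.
    """
    slope, offset = np.polyfit(et_sync, eeg_sync, 1)
    et_on_eeg_clock = slope * np.asarray(et_times) + offset
    return np.interp(eeg_times, et_on_eeg_clock, et_values)

# Example: the eye tracker's clock runs 2 s ahead of the EEG clock
et_t = np.arange(0, 10, 0.01)      # 100 Hz eye-tracking samples
pupil = np.sin(et_t)               # hypothetical pupil-size channel
eeg_t = np.arange(0, 7.5, 0.002)   # 500 Hz EEG samples
aligned = align_to_eeg(et_t, pupil, et_sync=[2.0, 9.0],
                       eeg_sync=[0.0, 7.0], eeg_times=eeg_t)
```

This also shows why the synchronization markers should span the whole experiment: the linear fit is only an interpolation between them, and any clock drift outside that span would be extrapolated.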

 Tip: You can find a full description of how to use the Add Channels transform in the BrainVision Analyzer 2 User Manual, and more information about its latest enhanced features here. However, you can always contact our Scientific Support team if you need help with BrainVision Analyzer 2.

Now that both data streams are temporally aligned, you can start analyzing them together! As mentioned in the introduction, you can discard data during which the participant was not focusing on areas of interest. You can also segment your EEG data based on fixations or other events you identified in your eye tracking analysis, and calculate Fixation-Related Potentials.

Conclusion

We hope this article provided you with helpful guidelines for your lab-based EEG and eye tracking setup, and that we could walk you through the most important steps for your recordings and analysis. Keep your eyes open for more articles as well as dedicated online events about our new eye tracking solutions!

Introducing Tobii hardware and software: the perfect complement to your EEG and eye tracking research

To provide solutions for neurophysiological researchers, we always stay up to date on integrating EEG with complementary methods, equipping scientists with the most comprehensive tools for understanding the relationship between brain and behaviour. These combined methods have long been a focus for Brain Products, whether via extra physiological measures, EEG-fMRI solutions or EEG-fNIRS combinations. As an extension of our multimodal offering, we’ve partnered with Tobii Pro to offer you high-grade screen-based and wearable eye tracking systems for your combined EEG & eye tracking research.

Are you interested in adding eye tracking to your EEG experiments? Whether you are conducting research in cognitive psychology, vision sciences or real-world applications, we offer a range of devices to fit your research needs.

Screen-based eye trackers

For stationary experiments with the actiCHamp Plus or BrainAmp, we are pleased to offer a range of screen-based eye trackers. Capturing gaze data at up to 1200 Hz, the Tobii Pro Spectrum offers advanced triggering options with superior data quality. It is designed for lab-based research in the vision sciences, and for studying eye movements from fixation-based studies to micro-saccades. Another high-precision eye tracker, the Tobii Pro Fusion, can track the pupil in both light and dark conditions and is designed to collect data in a variety of environments (e.g. hospitals, libraries, schools). A much smaller and lighter screen-based eye tracker, the Tobii Pro Nano, is designed for fixation-based studies; it is fully portable and provides an ideal setup for educational and teaching purposes.

Tobii Pro screen-based eye trackers

From left to right: Tobii Pro Spectrum, Tobii Pro Fusion and Tobii Pro Nano

Wearables

Perfectly paired with our mobile LiveAmp, the wearable Tobii Pro Glasses 3 allow you to conduct behavioural EEG and eye tracking research in a variety of real-world settings. Delivering accurate gaze data from naturally moving participants, these glasses come equipped with 4 cameras, 16 illuminators, and a full HD scene camera with a 106° field of view. This sleek setup, together with our low-profile actiCAP slim electrodes, provides an outstanding solution for all of your mobile EEG and eye tracking research applications, whether it be MoBI, neuromarketing or sports psychophysiology.

Tobii Pro wearable eye tracker

Tobii Pro Glasses 3

Software

Together with Tobii Pro eye trackers, Tobii Pro Lab software provides a complete solution for researching human behaviour. A user interface and dedicated software features efficiently guide and support you through all phases of an eye tracking experiment, from test design to recording and subsequent analysis. Once your EEG and eye tracking data are recorded, use the new Add Channels transform in BrainVision Analyzer 2.2.1 to synchronize and align your data streams before further analysis.

Tobii Pro Lab Software

Effects of transcranial static magnetic stimulation over the primary motor cortex on local and network spontaneous electroencephalogram oscillations

The neuronal associations of respiratory-volume variability in the resting state

BESA Research 7.1 March 2021 released

BESA Research 7.1 March 2021 is a maintenance release. All customers with a valid license for BESA Research version 7.1 are eligible for a free update to this version.

This release features many improvements and bug fixes. Please make sure to update to this version as soon as possible at www.besa.de/downloads/besa-research/besa-research-7-1/.

Data review and pre-processing:

  • Batch commands – Many new or enhanced batch commands are available, including improvements for automated pattern search, visualizing results, drawing maps, saving screen shots, scaling of data, etc.
  • Data export improvements
  • New source montages including the new 25 source standard (cf. Scherg et al., Front. Neurol., 20 August 2019, https://doi.org/10.3389/fneur.2019.00855), and atlas-based source montages (see picture above for an example of atlas region sources).

Source Analysis:

  • The time-domain beamformer can now be used to compare two conditions. The target and control conditions can be selected in the dialog of the ERP module that initiates the beamformer calculation.
  • A single dipole fit can now be started directly using the Start Fit button, without having to place a dipole source first.
  • The Bayesian source imaging method SESAME is now available for all head models. Before, it was restricted to spherical models.
  • Beamformer and DICS can now be used with MEG finite element and boundary element models.
  • It is now possible to add noise sources to a solution, in order to generate source montages. They can be selected from several pre-defined source configurations, and only sources with a certain distance from existing sources will be added in order to describe brain activity that is unrelated to the activity of interest. The functionality is available from the Solution menu.

The full list of improvements and bug fixes can be seen on https://www.besa.de/downloads/besa-research/besa-research-7-1/ in the section on New Features and Bug Fixes.

Is it all in the knee?

Patellofemoral pain (PFP) is considered a mechanistic pain syndrome, originating from kinetic, anatomic or biomechanical dysfunction leading to nociceptive pain.

However, some data shows that not all pain expressions in patients with PFP can be causatively connected to a biomechanical impairment.

Researchers from the School of Health and Rehabilitation Sciences, The University of Queensland, Australia, have endeavored to clarify whether patients suffering from PFP have local or centrally altered sensory profiles.

Profiling patients vs. controls

One hundred and fifty patients with PFP were recruited along with sixty-one controls.
Quantitative sensory testing (QST) was performed on the most painful knee and on a remote site: the contralateral lateral epicondyle of the elbow. QST consisted of mechanical and thermal sensory and pain thresholds, pressure pain thresholds (PPT), as well as mechanical temporal summation and conditioned pain modulation (CPM), with PPTs as test stimuli and a cold pressor as the conditioning stimulus.

Medoc’s Pathway ATS, TSA2’s predecessor, was utilized for all thermal thresholds.

Questionnaires on kinesiophobia (TSK), self-efficacy (FESQ), catastrophizing (PCS), and anxiety and depression (HADS) were administered.

What was found

Interestingly, cold and heat pain thresholds were significantly lower for the patient group compared to the controls, both at the knee and the elbow, hinting at central sensitization. There were similar findings for the mechanical pain and pressure pain thresholds, but not for the sensory thermal/mechanical thresholds.
Of the pain modulation measures (temporal summation and CPM), only temporal summation was significantly increased in the patient group.

In addition, a higher prevalence of anxiety, depression and pain catastrophizing was found in the patient group as compared to the controls.

To conclude

The authors conclude that “Our discovery of thermal hyperalgesia offers new insight in terms of PFP mechanisms. Multi-modal hyperalgesia locally and at a remote site (elbow), reflected by greater sensitivity to heat, cold and pressure pain in our PFP group, could be construed as evidence of nociplastic pain.”
Physicians, physiotherapists and other clinicians treating patients with patellofemoral pain should take into account physiological, pain modulatory, and psychological changes, in order to holistically treat their patients.

Reference:

Maclachlan, L. R., Collins, N. J., Hodges, P. W., & Vicenzino, B. (2020). Psychological and pain profiles in persons with patellofemoral pain as the primary symptom. European Journal of Pain, 24(6), 1182-1196.

Touching to Feel: Brain Activity During In-Store Consumer Experience

Top-down control of visual cortex by the frontal eye fields through oscillatory realignment

Individual Differences in Working Memory and the N2pc

Artifact Reduction in Simultaneous EEG-fMRI: A Systematic Review of Methods and Contemporary Usage

Inverse effects of time‐on‐task in task‐related and task‐unrelated theta activity

Anodal tDCS modulates specific processing codes during conflict monitoring associated with superior and middle frontal cortices

Cerebral functional networks during sleep in young and older individuals

The challenge of learning a new language in adulthood: Evidence from a multi-methodological neuroscientific approach

Decoding Neural Representations of Affective Scenes in Retinotopic Visual Cortex

The relationship between EEG and fMRI connectomes is reproducible across simultaneous EEG-fMRI studies from 1.5T to 7T

Disturbed temporal dynamics of episodic retrieval activity with preserved spatial activity pattern in amnestic mild cognitive impairment: A simultaneous EEG-fMRI study

Excitatory–inhibitory balance within EEG microstates and resting-state fMRI networks: assessed via simultaneous trimodal PET–MR–EEG imaging

Neurophysiological correlates of interference control and response inhibition processes in children and adolescents engaging in open- and closed-skill sports

Modulation of epileptic networks by transient interictal epileptic activity: A dynamic approach to simultaneous EEG-fMRI

BESA Research workshop scheduled

Tapentadol treatment results in long-term pain relief in patients with chronic low back pain and associates with reduced segmental sensitization

Nociception testing during fixed-wing ambulance flights.

Do you know what your thermal taster status is?

Thermal taster status (TTS) is a phenomenon in which thermal stimulation of specific areas of the tongue causes the sensation of a distinct taste in the absence of a gustatory stimulus. Reports vary on what percentage of the general population are thermal tasters; occurrence rates between 20% and 50% have been reported in research cohorts.

Not all Thermal Tasters taste alike

Even within the group of thermal tasters, there are subgroups. These groups differ from one another in responsiveness to thermal stimuli in different areas of the tongue, and in the phantom taste that each type of stimulation arouses. Green and George report that “thermal sweetness” is a common taste, occurring in half of thermal tasters in response to warming after the tongue was cooled, while Skinner et al. reported that 25% of tasters tasted “bitter” and another 25% tasted “sour” in cooling trials.

How to assess Thermal Taste

In general, TTS is assessed by applying a thermode with a warming and a cooling stimulus, as each direction of temperature change, and the specific temperatures used, elicit different taste sensations in thermal tasters. Thermal taste is classically tested on the tip of the tongue, though some studies report findings from areas lateral to the tip or at the back of the tongue. Several studies on thermal tasters have used Medoc’s Pathway 16 × 16 mm thermode or the Intra-oral thermode. An example of a testing protocol for TTS can be found in Eldeghaidy et al.’s study, in which both warming trials and cooling trials were applied. A warming trial would start at 35°C, cool down to 15°C, then rise to 40°C and be held there for 10 sec, with a ramp of 1°C/sec.
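For orientation, the timing of such a warming trial can be worked out directly: cooling from 35 °C to 15 °C at 1 °C/s takes 20 s, warming from 15 °C to 40 °C takes 25 s, plus the 10 s hold. A small sketch generating this piecewise-linear profile (the function and sampling rate are illustrative only, not Medoc software):

```python
import numpy as np

def tts_warming_trial(start=35.0, low=15.0, high=40.0,
                      hold_s=10.0, ramp=1.0, fs=10):
    """Piecewise-linear temperature profile (in °C) for the warming
    trial described above; fs = samples per second of the trace."""
    cool = np.linspace(start, low, int((start - low) / ramp * fs) + 1)
    warm = np.linspace(low, high, int((high - low) / ramp * fs) + 1)[1:]
    plateau = np.full(int(hold_s * fs), high)
    return np.concatenate([cool, warm, plateau])

profile = tts_warming_trial()
# 20 s cooling + 25 s warming + 10 s hold ≈ 55 s of stimulation
```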

Tongue taste innervation

Cruz and Green found that in thermal tasters, the anterior part of the tongue, innervated by the chorda tympani nerve, shows a typical reaction to heating and cooling, while the posterior part of the tongue, innervated by the glossopharyngeal nerve, reacts less typically. Thermal taster status, along with another measure, 6-n-propylthiouracil (PROP) taster status, forms the taste phenotype.

Do fungiform papillae matter?

A hypothesis existed that the fungiform papillae of the tongue would be responsible for thermal taste because of their high density at the tip of the tongue and their dual role: they contain both taste buds and mechanoreceptors, innervated by gustatory and trigeminal nerve fibers. Eldeghaidy et al. found that TTS did not seem to be correlated with fungiform papillae density, in contrast to PROP taster status, and thus must have a different mechanism. The taste phenotype as a whole, and thermal taster status specifically, increasingly attract both neurology researchers and the food and beverage industry alike. Temperature may be actively integrated as a contributor to the totality of the gustatory experience when new taste products are planned for release to market.

The Association Between Preoperative Pain Catastrophizing and Chronic Pain After Hysterectomy – Secondary Analysis of a Prospective Cohort Study

    Hon Sen Tan,1 Rehena Sultana,2 Nian-Lin Reena Han,3 Chin Wen Tan,1,4 Alex Tiong Heng Sia,1,4 Ban Leong Sng1,4
    1Department of Women’s Anaesthesia, KK Women’s and Children’s Hospital, Singapore; 2Centre for Quantitative Medicine, Duke-NUS Medical School, Singapore; 3Division of Clinical Support Services, KK Women’s and Children’s Hospital, Singapore; 4Anesthesiology and Perioperative Sciences Academic Clinical Program, SingHealth-Duke-NUS Medical School, Singapore
    Correspondence: Ban Leong Sng
    Department of Women’s Anaesthesia, KK Women’s and Children’s Hospital, 100 Bukit Timah Road 229899, Singapore
    Tel +65 6394 1077
    Email sng.ban.leong@singhealth.com.sg
    Purpose: Hysterectomy is associated with a high incidence of chronic post-hysterectomy pain (CPHP). Pain catastrophizing, a negative cognitive-affective response to pain, is associated with various pain disorders but its role in CPHP is unclear. We aimed to determine the association of high preoperative pain catastrophizing with CPHP development and functional impairment 4 months after surgery.
    Patients and Methods: Secondary analysis of a prospective cohort study of women undergoing abdominal/laparoscopic hysterectomy to investigate the association between high pain catastrophizing (pain catastrophizing scale, PCS ≥ 20) and CPHP and its associated functional impairment (defined as impairment with standing for ≥ 30 minutes, sitting for ≥ 30 minutes, or walking up or down stairs). CPHP and functional impairment were assessed via 4- and 6-month phone surveys.
    Results: Of 216 patients, 72 (33.3%) had high PCS, with mean (SD) of 30.0 (7.9). In contrast, 144 (66.7%) patients had low PCS, with mean (SD) of 9.0 (4.7). At 4 months, 26/63 (41.3%) patients in the high PCS group developed CPHP, compared to 24/109 (22.0%) in the low PCS group. At 6 months, 14/53 (26.4%) high PCS patients developed CPHP, compared to 10/97 (10.3%) patients with low PCS. High PCS was independently associated with CPHP at 4 months (OR 2.49 [95% CI 1.27 to 4.89], p=0.0082) and 6 months (OR 3.12 [95% CI 1.28 to 7.64], p=0.0126) but was not associated with functional impairment. High PCS≥ 20, presence of evoked mechanical temporal summation (MTS), and history of abdominal/pelvic surgery predict CPHP at 4 months with area under the curve (AUC) of 0.69. Similarly, PCS≥ 20 and increasing MTS magnitude predicted CPHP at 6 months with AUC of 0.76.
    Conclusion: High PCS was independently associated with CPHP. Future studies should identify other CPHP associated factors to formulate a risk-prediction model and investigate the effectiveness of early intervention for pain catastrophizers in improving pain-related outcomes.
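The odds-ratio arithmetic behind these results can be illustrated directly from the 2×2 counts in the abstract. This is a minimal sketch of the crude (unadjusted) calculation only; the paper's reported ORs come from its own regression analysis, although here the crude values happen to come out very close to the reported ones:

```python
# Crude (unadjusted) odds ratios from the 2x2 counts reported in the abstract.
# Illustration only: the paper's ORs come from its own multivariable analysis.

def odds_ratio(cases_exposed, total_exposed, cases_unexposed, total_unexposed):
    """OR = (a*d) / (b*c) for a 2x2 exposure-by-outcome table."""
    a = cases_exposed
    b = total_exposed - cases_exposed        # exposed non-cases
    c = cases_unexposed
    d = total_unexposed - cases_unexposed    # unexposed non-cases
    return (a * d) / (b * c)

# 4 months: high PCS 26/63 with CPHP vs low PCS 24/109
or_4m = odds_ratio(26, 63, 24, 109)
# 6 months: high PCS 14/53 with CPHP vs low PCS 10/97
or_6m = odds_ratio(14, 53, 10, 97)

print(f"4-month crude OR: {or_4m:.2f}")  # ~2.49
print(f"6-month crude OR: {or_6m:.2f}")  # ~3.12
```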

Neuronavigation based 10 sessions of repetitive transcranial magnetic stimulation therapy in chronic migraine: an exploratory study

    Abstract

    Introduction: Chronic migraine is a disease of altered cortical excitability. Repetitive transcranial magnetic stimulation provides a novel non-invasive method to target the nociceptive circuits in the cortex. Motor cortex is one such potential target. In this study, we targeted the left motor cortex using fMRI-guided neuronavigation.

    Materials and methods: Twenty right-handed patients were randomized into real and sham rTMS groups. Baseline subjective pain assessments were done using the visual analog scale (VAS) and questionnaires: the State-Trait Anxiety Inventory, the Beck Depression Inventory, and the Migraine Disability Assessment (MIDAS) questionnaire. Objectively, pain was assessed by means of thermal pain thresholds using quantitative sensory testing. For corticomotor excitability parameters, resting motor thresholds and motor-evoked potentials were mapped. For rTMS, a total of 600 pulses in 10 trains at 10 Hz with an intertrain interval of 60 s were delivered in each session. Ten such sessions were given, 5 days per week over 2 consecutive weeks, each lasting 10 min. Real rTMS was administered at 70% of the resting motor threshold. All the tests were repeated post-intervention and after 1 month of follow-up. There are no studies reporting the use of fMRI-based TMS for targeting the motor cortex in CM patients.
    Results: We observed a significant reduction in the mean VAS rating, headache frequency, and MIDAS questionnaire in real rTMS group which was maintained after 1 month of follow-up.
    Conclusion: Ten sessions of fMRI-based rTMS over the left motor cortex may provide long-term pain relief in CM, but further studies are warranted to confirm our preliminary findings.
    Keywords: Chronic pain; Cortical excitability; Headache; Motor cortex stimulation; Neuromodulation; Quantitative Sensory test.

Stepwise increasing sequential offsets cannot be used to deliver high thermal intensities with little or no perception of pain

Abstract

Offset analgesia (OA) is the disproportionate decrease in pain experience following a slight decrease in noxious heat stimulus intensity. We tested whether sequential offsets would allow noxious temperatures to be reached with little or no perception of pain. Forty-eight participants continuously rated their pain experience during trials containing trains of heat stimuli delivered by Peltier thermode. Stimuli were adjusted through either stepwise sequential increases of 2°C and decreases of 1°C or direct step increases of 1°C up to a maximum of 46°C. Step durations (1, 2, 3, or 6 s) varied by trial. Pain ratings generally followed presented temperature, regardless of step condition or duration. For 6-s steps, OA was observed after each decrease, but the overall pain trajectory was unchanged. We found no evidence that sequential offsets could allow for little pain perception during noxious temperature presentation.

NEW & NOTEWORTHY Offset analgesia is the disproportionate decrease in pain experience following a slight decrease in noxious heat stimulus intensity. We tested whether sequential offsets would allow noxious temperatures to be reached with little or no perception of pain. We found little evidence of such overall analgesia. In contrast, we observed analgesic effects after each offset with long-duration stimuli, even with relatively low-temperature noxious stimuli.
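The two heating schedules compared in this study can be sketched as simple step sequences. This is a minimal illustration of the two designs only; the 40°C starting temperature is an assumption for the sketch, not a value taken from the paper:

```python
# Sketch of the two heat-step schedules compared above: "sequential offset"
# steps (+2 C then -1 C, net +1 C per pair) versus direct +1 C steps, both
# capped at the 46 C maximum. The 40 C baseline is an illustrative assumption.

def sequential_offset_steps(start=40.0, ceiling=46.0):
    temps, t = [start], start
    while t < ceiling:
        t = min(t + 2.0, ceiling)   # 2 C increase
        temps.append(t)
        if t < ceiling:
            t -= 1.0                # 1 C decrease (the "offset")
            temps.append(t)
    return temps

def direct_steps(start=40.0, ceiling=46.0):
    temps, t = [start], start
    while t < ceiling:
        t = min(t + 1.0, ceiling)
        temps.append(t)
    return temps

print(sequential_offset_steps())
# [40.0, 42.0, 41.0, 43.0, 42.0, 44.0, 43.0, 45.0, 44.0, 46.0]
print(direct_steps())
# [40.0, 41.0, 42.0, 43.0, 44.0, 45.0, 46.0]
```

Both schedules end at the same noxious ceiling; the question tested in the study was whether the interleaved 1°C decreases (each a potential offset-analgesia event) would keep pain ratings low on the way up.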

INTRODUCTION

Offset analgesia (OA) was first described by Grill and Coghill (2002) and was defined as a disproportionate decrease in pain experience following a slight decrease in heat stimulus intensity. In a typical OA experiment, three successive periods (T1, T2, T3) each contain a continuous noxious stimulus. The first and last stimuli are of equal intensity, but the middle stimulus is slightly more intense (e.g., 45°C, 46°C, 45°C). The OA effect is revealed by a greater fall in reported pain intensity following a step back to the original noxious stimulus temperature compared with delivery of a continuous noxious stimulus temperature (e.g., 45°C, 45°C, 45°C).

Pain relief for patients with chronic low back pain

CHEPS for TSA2

CHEPS FOR TSA 2

Medoc Thermodes

Fit to a T(hermode)

Medoc Thermodes

We are often asked by our customers: “what thermode should I use?” Our answer is usually: “it depends”.

This is one of the most common questions we are asked when a customer approaches us, intending to buy a thermal quantitative sensory testing (QST) device.

The thermode is the probe attached to the participant's skin which, on command from the computer program, changes its temperature to hot or cold.

There are several types of thermodes; which one fits you best depends mostly on your intended use.

Let’s start with the basics:

Comparing and contrasting

The classic thermode size is the 30 mm by 30 mm contact surface thermode, or for short: the 30*30. This thermode size has been around for decades and has therefore gathered quite the following.

Most of the normative data that has been gathered with Medoc devices around the world, and specifically by the German Research Network on Neuropathic Pain, the DFNS, has been gathered with this 30*30 thermode[1],[2],[3]. If you intend to compare your QST results to normative values that have been collected from healthy participants, you may want to consider using the 30*30.

Another quite common thermode size is the 16*16. This thermode has been in use with researchers and clinicians who wish to stimulate smaller areas, like the face[4] or the tongue[5], or perform QST on children[6].

Need for speed

One of the most asked-about thermodes is the CHEPS thermode. This thermode is special, because its technology allows working at very high speeds, for both heat and cold stimulation.

These high speeds are especially important for researchers who want to use a fast thermal stimulation in order to record Contact Heat Evoked Potentials (CHEPs)[7],[8],[9] or Cold Evoked Potentials (CEPs)[10]. Others may be interested in an application called: phasic heat temporal summation, in which very fast noxious heat pulses are applied in order to test for the wind-up phenomenon[11],[12].

Visualizing pain

The above thermode types (30*30, 16*16, CHEPS) are also available in fMRI versions. fMRI thermodes differ from standard thermodes in having an additional 10 meters of cable length, allowing the device to be placed outside the magnet room while only the thermode passes through the waveguide, reducing noise artifacts and ensuring safety. These thermodes have undergone thorough testing and validation in different MRI environments.

Thermal stimulation is used in many trials examining psychological processes (including reward processing, mindfulness, and more)[13],[14] and pain neurophysiology[15],[16].

Not your run-of-the-mill thermode…

Then there are the specialized thermodes. Some quantitative sensory testing has been conducted in the most uncommon places on the body, to elucidate specific issues.

Intra-oral testing is conducted with a small-diameter Intraoral thermode for varying purposes such as tooth sensitivity[17],[18], pain disorders involving the mouth or the face[19], and thermal taster status.

Medoc’s Intravaginal thermode, formerly known as the Genito-sensory-analyzer (GSA) is utilized in studies which seek to assess somatosensory function and pain of the genital area in women[20],[21],[22] and men[23].

 

References:

[1] Hafner, J., Lee, G., Joester, J., Lynch, M., Barnes, E. H., Wrigley, P. J., & Ng, K. (2015). Thermal quantitative sensory testing: a study of 101 control subjects. Journal of Clinical Neuroscience, 22(3), 588-591.
[2] Blankenburg, M., Boekens, H., Hechler, T., Maier, C., Krumova, E., Scherens, A., … & Zernikow, B. (2010). Reference values for quantitative sensory testing in children and adolescents: developmental and gender differences of somatosensory perception. PAIN, 149(1), 76-88.
[3] Yarnitsky, D., & Sprecher, E. (1994). Thermal testing: normative data and repeatability for various test algorithms. Journal of the Neurological Sciences, 125(1), 39-45.
[4] Sampaio, F. A., Sampaio, C. R., Cunha, C. O., Costa, Y. M., Alencar, P. N., Bonjardim, L. R., … & Conti, P. C. (2019). The effect of orthodontic separator and short-term fixed orthodontic appliance on inflammatory mediators and somatosensory function. Journal of Oral Rehabilitation, 46(3), 257-267.
[5] Yang, Q., Dorado, R., Chaya, C., & Hort, J. (2018). The impact of PROP and thermal taster status on the emotional response to beer. Food Quality and Preference, 68, 420-430.
[6] Hainsworth, K. R., Simpson, P. M., Ali, O., Varadarajan, J., Rusy, L., & Weisman, S. J. (2020). Quantitative sensory testing in adolescents with co-occurring chronic pain and obesity: a pilot study. Children, 7(6), 55.
[7] Rosner, J., Hostettler, P., Scheuren, P. S., Sirucek, L., Rinert, J., Curt, A., … & Hubli, M. (2018). Normative data of contact heat evoked potentials from the lower extremities. Scientific Reports, 8(1), 1-9.
[8] Jutzeler, C. R., Rosner, J., Rinert, J., Kramer, J. L., & Curt, A. (2016). Normative data for the segmental acquisition of contact heat evoked potentials in cervical dermatomes. Scientific Reports, 6, 34660.
[9] Granovsky, Y., Anand, P., Nakae, A., Nascimento, O., Smith, B., Sprecher, E., & Valls-Solé, J. (2016). Normative data for Aδ contact heat evoked potentials in adult population: a multicenter study. Pain, 157(5), 1156-1163.
[10] Hüllemann, P., Nerdal, A., Binder, A., Helfert, S., Reimer, M., & Baron, R. (2016). Cold-evoked potentials – ready for clinical use? European Journal of Pain, 20(10), 1730-1740.
[11] Staud, R., Weyl, E. E., Riley III, J. L., & Fillingim, R. B. (2014). Slow temporal summation of pain for assessment of central pain sensitivity and clinical pain of fibromyalgia patients. PLoS ONE, 9(2), e89086.
[12] Bar-Shalita, T., Vatine, J. J., Yarnitsky, D., Parush, S., & Weissman-Fogel, I. (2014). Atypical central pain processing in sensory modulation disorder: absence of temporal summation and higher after-sensation. Experimental Brain Research, 232(2), 587-595.
[13] Elman, I., Upadhyay, J., Langleben, D. D., Albanese, M., Becerra, L., & Borsook, D. (2018). Reward and aversion processing in patients with post-traumatic stress disorder: functional neuroimaging with visual and thermal stimuli. Translational Psychiatry, 8(1), 1-15.
[14] Harrison, R., Zeidan, F., Kitsaras, G., Ozcelik, D., & Salomons, T. V. (2019). Trait mindfulness is associated with lower pain reactivity and connectivity of the default mode network. The Journal of Pain, 20(6), 645-654.
[15] Russo, A., Tessitore, A., Esposito, F., Di Nardo, F., Silvestro, M., Trojsi, F., … & Tedeschi, G. (2017). Functional changes of the perigenual part of the anterior cingulate cortex after external trigeminal neurostimulation in migraine patients. Frontiers in Neurology, 8, 282.
[16] Grahl, A., Onat, S., & Büchel, C. (2018). The periaqueductal gray and Bayesian integration in placebo analgesia. eLife, 7, e32930.
[17] Baad-Hansen, L., Lu, S., Kemppainen, P., List, T., Zhang, Z., & Svensson, P. (2015). Differential changes in gingival somatosensory sensitivity after painful electrical tooth stimulation. Experimental Brain Research, 233(4), 1109-1118.
[18] Rahal, V., Gallinari, M. D. O., Barbosa, J. S., Martins-Junior, R. L., Santos, P. H. D., Cintra, L. T. A., & Briso, A. L. F. (2018). Influence of skin cold sensation threshold in the occurrence of dental sensitivity during dental bleaching: a placebo controlled clinical trial. Journal of Applied Oral Science, 26.
[19] Mo, X., Zhang, J., Fan, Y., Svensson, P., & Wang, K. (2015). Thermal and mechanical quantitative sensory testing in Chinese patients with burning mouth syndrome – a probable neuropathic pain condition? The Journal of Headache and Pain, 16(1), 84.
[20] Gruenwald, I., Mustafa, S., Gartman, I., & Lowenstein, L. (2015). Genital sensation in women with pelvic organ prolapse. International Urogynecology Journal, 26(7), 981-984.
[21] Reed, B. D., Sen, A., Harlow, S. D., Haefner, H. K., & Gracely, R. H. (2017). Multimodal vulvar and peripheral sensitivity among women with vulvodynia: a case-control study. Journal of Lower Genital Tract Disease, 21(1), 78.
[22] Lesma, A., Bocciardi, A., Corti, S., Chiumello, G., Rigatti, P., & Montorsi, F. (2014). Sexual function in adult life following Passerini-Glazel feminizing genitoplasty in patients with congenital adrenal hyperplasia. The Journal of Urology, 191(1), 206-211.
[23] Chen, X., Wang, F. X., Hu, C., Yang, N. Q., & Dai, J. C. (2018). Penile sensory thresholds in subtypes of premature ejaculation: implications of comorbid erectile dysfunction. Asian Journal of Andrology, 20(4), 330.

BESA statistics

BESA Statistics 2.1 released!

The successor to the ground-breaking BESA Statistics program is here! BESA Statistics 2.1 greatly enhances the options of the previous version 2.0. As before, dedicated workflows allow you to perform t-test, one-way ANOVA, and correlation analyses of your data using parameter-free cluster permutation statistics, which elegantly solve the multiple-testing problem. We have added several input data types to this pipeline to ensure that time-frequency and connectivity analyses are now fully supported.

The main highlights of the new release are:

  • In all workflows, the data type Connectivity can now be used. This enables direct import of results obtained by BESA Connectivity for group statistics on connectivity results in sensor space or source space.
  • For Image data, a configurable slice view is available that displays sequences in one of three available orthogonal orientations.
  • The color theme can be adjusted between BESA White and the previous BESA Standard.
  • Several new color maps are available.
  • The data values are displayed on mouse-over in the detail windows.
  • Time-frequency data stored by BESA Connectivity with wavelet analysis can now be read with the correct (logarithmic) frequency spacing.
  • Single-trial time-frequency data can now be read in the t-test workflow (.tfcs data format).
  • There is no upper limit on the number of data files imported into the workflow.
  • A new image export format is available (.svg).
  • Screenshots and cluster summary results can now be copied to the clipboard using the right mouse popup menu.
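The cluster permutation statistics mentioned above can be illustrated with a minimal sketch for a paired design. This is a generic illustration of the method only, not BESA's actual implementation; the cluster-forming threshold and permutation count are arbitrary choices for the example:

```python
# Minimal sketch of cluster-based permutation statistics for a paired design:
# contiguous supra-threshold t-values form clusters, and the maximum cluster
# mass under random sign flips builds the null distribution. Because only the
# maximum cluster per permutation enters the null, multiple-testing correction
# across timepoints is implicit.
import math
import random

def paired_t(diffs):
    n = len(diffs)
    m = sum(diffs) / n
    var = sum((d - m) ** 2 for d in diffs) / (n - 1)
    return m / math.sqrt(var / n) if var > 0 else 0.0

def cluster_masses(tvals, thresh):
    """Sum |t| over each contiguous run of supra-threshold timepoints."""
    masses, cur = [], 0.0
    for t in tvals:
        if abs(t) > thresh:
            cur += abs(t)
        elif cur:
            masses.append(cur)
            cur = 0.0
    if cur:
        masses.append(cur)
    return masses

def cluster_perm_test(data, thresh=2.0, n_perm=500, seed=0):
    """data: per-subject lists of condition differences, one value per timepoint."""
    rng = random.Random(seed)
    n_sub, n_time = len(data), len(data[0])
    obs = [paired_t([s[j] for s in data]) for j in range(n_time)]
    obs_mass = max(cluster_masses(obs, thresh), default=0.0)
    null = []
    for _ in range(n_perm):
        signs = [rng.choice((-1, 1)) for _ in range(n_sub)]
        tv = [paired_t([sg * s[j] for sg, s in zip(signs, data)])
              for j in range(n_time)]
        null.append(max(cluster_masses(tv, thresh), default=0.0))
    p = sum(m >= obs_mass for m in null) / n_perm
    return obs_mass, p

# Example: 20 subjects, 30 timepoints, a genuine effect in timepoints 10-19.
data = [[(1.0 if 10 <= j < 20 else 0.0) + (0.01 if i % 2 == 0 else -0.01)
         for j in range(30)] for i in range(20)]
mass, p = cluster_perm_test(data)
print(f"cluster mass {mass:.1f}, p = {p}")
```

With a real effect spanning a band of timepoints, the observed cluster mass far exceeds anything the sign-flip null produces, so the p-value is small without any per-timepoint correction.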
Medoc TSA2 QST

New CHEPS for TSA 2 – Fast. Precise. Easy.

Advanced Thermosensory Stimulator  (TSA)

  • Precise stimulation temperature control.
  • External Control programming capability.
  • Rapid thermal stimulation rates – up to 13°C/sec.
  • Single and Dual Thermode configurations.
  • Upgradeable for the fMRI imaging environment.
  • Add the new CHEPS for TSA 2 thermode for rapid and precise heat and cold stimulation.
Carbon Wire Loops for MR EEG

State of the art MR artifact handling with Carbon Wire Loops – a true market innovation!

High Density R-Net

Update to the R-Net – high-density montage including face and neck electrodes

BrainVision Analyzer

BrainVision Analyzer 2.2.1 – Integration of Tobii Pro Lab data in Add Channels & more

Simultaneous TMS & EEG

Methodology for characterizing network activations with neuro-navigated TMS and EEG

wireless stimtracker

Wireless Triggering Now Available

Melbourne Visit 2020

Feasibility of an Ambulatory HD EEG system for Home Monitoring in Epilepsy Patients

Motor neuroprosthesis implanted with neurointerventional surgery improves capacity for activities of daily living tasks in severe paralysis

Abstract

Background Implantable brain–computer interfaces (BCIs), functioning as motor neuroprostheses, have the potential to restore voluntary motor impulses to control digital devices and improve functional independence in patients with severe paralysis due to brain, spinal cord, peripheral nerve or muscle dysfunction. However, reports to date have had limited clinical translation.

Methods Two participants with amyotrophic lateral sclerosis (ALS) underwent implant in a single-arm, open-label, prospective, early feasibility study. Using a minimally invasive neurointervention procedure, a novel endovascular Stentrode BCI was implanted in the superior sagittal sinus adjacent to primary motor cortex. The participants undertook machine-learning-assisted training to use wirelessly transmitted electrocorticography signal associated with attempted movements to control multiple mouse-click actions, including zoom and left-click. Used in combination with an eye-tracker for cursor navigation, participants achieved Windows 10 operating system control to conduct instrumental activities of daily living (IADL) tasks.

Results Unsupervised home use commenced from day 86 onwards for participant 1, and day 71 for participant 2. Participant 1 achieved a typing task average click selection accuracy of 92.63% (100.00%, 87.50%–100.00%) (trial mean (median, Q1–Q3)) at a rate of 13.81 (13.44, 10.96–16.09) correct characters per minute (CCPM) with predictive text disabled. Participant 2 achieved an average click selection accuracy of 93.18% (100.00%, 88.19%–100.00%) at 20.10 (17.73, 12.27–26.50) CCPM. Completion of IADL tasks including text messaging, online shopping and managing finances independently was demonstrated in both participants.

Conclusion We describe the first-in-human experience of a minimally invasive, fully implanted, wireless, ambulatory motor neuroprosthesis using an endovascular stent-electrode array to transmit electrocorticography signals from the motor cortex for multiple command control of digital devices in two participants with flaccid upper limb paralysis.

Spike2

The latest Spike2 updates for V10, V9 and V8 for Windows are available now

Features of version 10.07 include:

  • Video recording has a new option to fix timing problems with some cameras. It now compensates for time delays when starting to record video. It also can be used across a remote desktop. Video review has frame accurate video stepping for both MP4 and AVI files.
  • You can display axes in the data area of Time, Result and XY views. This is expected to be useful when generating figures for publication.
  • In a time view you can add channels without a y axis to a group (as long as the group head has an axis). This allows you to colour the background of areas of a waveform with states and to superimpose TextMark data.
  • Many useful small improvements and fixes.

“It’s so Cute I Could Crush It!”: Understanding Neural Mechanisms of Cute Aggression

  • Graduate School of Education, University of California, Riverside, Riverside, CA, United States

The urge people get to squeeze or bite cute things, albeit without desire to cause harm, is known as “cute aggression.” Using electrophysiology (ERP), we measured components related to emotional salience and reward processing. Participants aged 18–40 years (n = 54) saw four sets of images: cute babies, less cute babies, cute (baby) animals, and less cute (adult) animals. On measures of cute aggression, feeling overwhelmed by positive emotions, approachability, appraisal of cuteness, and feelings of caretaking, participants rated more cute animals significantly higher than less cute animals.

There were significant correlations between participants’ self-report of behaviors related to cute aggression and ratings of cute aggression in the current study.

N200: A significant effect of “cuteness” was observed for animals such that a larger N200 was elicited after more versus less cute animals. A significant correlation between N200 amplitude and the tendency to express positive emotions in a dimorphous manner (e.g., crying when happy) was observed.

RewP: For animals and babies separately, we subtracted the less cute condition from the more cute condition. A significant correlation was observed between RewP amplitude to cute animals and ratings of cute aggression toward cute animals. RewP amplitude was used in mediation models.

Mediation Models: Using PROCESS (Hayes, 2018), mediation models were run. For both animals and babies, the relationship between appraisal and cute aggression was significantly mediated by feeling overwhelmed. For cute animals, the relationship between N200 amplitude and cute aggression was significantly mediated by feeling overwhelmed. For cute animals, there was significant serial mediation for RewP amplitude through caretaking, to feeling overwhelmed, to cute aggression, and RewP amplitude through appraisal, to feeling overwhelmed, to cute aggression. Our results indicate that feelings of cute aggression relate to feeling overwhelmed and feelings of caretaking. In terms of neural mechanisms, cute aggression is related to both reward processing and emotional salience.
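The difference-wave computation described above (more cute minus less cute) can be sketched as follows. The helper names, toy data, and window indices are illustrative only, not taken from the study's pipeline:

```python
# Sketch of the difference-wave approach: average the single-trial ERPs per
# condition, subtract the less cute condition from the more cute condition,
# then take the mean amplitude in a component window (e.g., for the RewP).

def erp_average(trials):
    """Average a list of equal-length single-trial voltage lists."""
    n = len(trials)
    return [sum(vals) / n for vals in zip(*trials)]

def difference_wave(more_cute_trials, less_cute_trials):
    more = erp_average(more_cute_trials)
    less = erp_average(less_cute_trials)
    return [m - l for m, l in zip(more, less)]

def mean_amplitude(wave, start_idx, end_idx):
    """Mean amplitude over a sample window (window bounds are illustrative)."""
    window = wave[start_idx:end_idx]
    return sum(window) / len(window)

# Toy example: two "trials" per condition, 10 samples each.
more_cute = [[1.0] * 10, [3.0] * 10]
less_cute = [[0.0] * 10, [2.0] * 10]
dw = difference_wave(more_cute, less_cute)
print(mean_amplitude(dw, 2, 5))  # 1.0
```

Per-subject window amplitudes computed this way are the kind of values that then feed correlation and mediation analyses like those reported above.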

Introduction

Cute aggression is defined as the urge some people get to squeeze, crush, or bite cute things, albeit without any desire to cause harm. Aragón et al. (2015) initially operationalized the phenomenon of “cute aggression” through individual self-reports while viewing cute stimuli. The authors investigated cute aggression using pictures of baby humans and animals via an online survey. Findings indicated that for infantile babies (e.g., images that had been altered to have large eyes and chubby cheeks; Sherman et al., 2013) and baby animals, there was a relationship between being overwhelmed by positive feelings and the expression of cute aggression (Aragón et al., 2015).

Acute Exercise as an Intervention to Trigger Motor Performance and EEG Beta Activity in Older Adults

Anodal transcranial patterned stimulation of the motor cortex during gait can induce activity-dependent corticospinal plasticity to alter human gait

State Anxiety Down-Regulates Empathic Responses: Electrophysiological Evidence

Age-Related Alterations in Electroencephalography Connectivity and Network Topology During n-Back Working Memory Task