Neuroimage. 2013 Mar;68:39-48. doi: 10.1016/j.neuroimage.2012.11.047.
</note>

==== Acquiring the head shape for visualization ====
Monitoring the head position can be done by visualizing the head shape in three different ways: as a sphere, as a set of Polhemus-acquired points representing the head surface, or as a realistic head shape acquired with a 3D scanner. The sphere needs no further preparation, whereas the latter two need to be recorded prior to the MEG measurement.

=== Polhemus ===
During the preparation for the MEG measurement, the fiducials and additional points on the head surface are measured with the Polhemus. We suggest acquiring the additional points on the brow ridge, the cheekbones and along the nose, as this helps to visualize the head shape more realistically. In the case of combined EEG/MEG, the electrode locations can also be measured and used for the visualization. Together, these points can be used to visualize the head shape during the online and offline monitoring of the head movements.
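
As a quick check, the Polhemus-acquired points can be read and plotted with FieldTrip before the MEG measurement starts. This is only a minimal sketch; the filename 'subject01.pos' is an example and depends on how your Polhemus recording is stored:
<code>
% read the fiducials and additional head surface points recorded with the Polhemus
% (example filename, adjust to your own recording)
headshape_polhemus = ft_read_headshape('subject01.pos');

% plot the points to verify that the acquired head shape looks reasonable
figure
ft_plot_headshape(headshape_polhemus)
</code>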

=== 3D scanner ===
The head shape can also be measured with a 3D scanner (e.g. the structure.io sensor) to acquire a realistic representation of the subject. Before we can use the measured head shape, we have to preprocess the data: the structure.io stores the head shape in its own device coordinate system, which therefore needs to be realigned to the respective head coordinate system. In the first step we localize the fiducials on the head shape:

<code>
% read the head shape produced by the 3D scanner
headshape = ft_read_headshape('Model.obj');

% interactively mark the NAS, LPA and RPA fiducials on the head shape
cfg = [];
cfg.method = 'headshape';
fid = ft_electrodeplacement(cfg, headshape);
</code>

After the localization of the fiducials, we realign the head shape to the respective coordinate system:
<code>
cfg = [];
cfg.coordsys     = 'ctf';              % or 'neuromag'
cfg.fiducial.nas = fid.elecpos(1,:);   % position of the nasion
cfg.fiducial.lpa = fid.elecpos(2,:);   % position of the LPA
cfg.fiducial.rpa = fid.elecpos(3,:);   % position of the RPA
headshape_ctf = ft_meshrealign(cfg, headshape);
</code>
Now we have the head shape in the correct coordinate system and can use it for online and offline head localization.
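
To verify that the realignment worked as intended, the realigned head shape can be plotted together with the axes of the coordinate system. This is a minimal sketch using the standard FieldTrip plotting functions:
<code>
% plot the realigned head shape together with the coordinate axes;
% the axes should now correspond to the CTF (or Neuromag) head coordinate system
figure
ft_plot_mesh(headshape_ctf)
ft_plot_axes(headshape_ctf)
</code>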

==== Monitor a subject's head position during a MEG session ====
  
After initializing the MEG system, one starts the **acq2ftx/neuromag2ft application**. When subsequently starting Acquisition, the data is transferred in realtime to the FieldTrip buffer, which can be read from any computer connected through a network. Point to the location of the buffer by correctly specifying cfg.dataset:
  cfg.dataset = 'buffer://hostname:1972';     % get data from buffer
  ft_realtime_headlocalizer(cfg)
</code>

To improve the real-time head movement compensation, we can also specify a realistic head shape and a realistic model of the dewar:
<code>
cfg = [];
cfg.dataset = 'buffer://hostname:1972';   % get data from buffer
cfg.dewar   = ctf_dewar;                  % mesh of the dewar, loaded or constructed beforehand
cfg.head    = headshape_ctf;              % realigned head shape from the 3D scanner
ft_realtime_headlocalizer(cfg)
</code>
  
**Repositioning within a recording session** can be achieved by marking the head position indicator (HPI) coil positions at an arbitrary point in time, operationalized through clicking the 'Update' button. Black unfilled markers should appear, indicating the positions of the coils at the moment of the button press. Distances to these marked positions then become color-coded.
  
**Repositioning between recording sessions**, i.e. relative to a previous recording session, can be achieved by specifying cfg.template: either by pointing to another dataset, e.g. cfg.template = 'subject01xxx.ds' (CTF275 systems only), or by pointing to a text file created by clicking the Update button during a previous recording session, e.g. cfg.template = '29-Apr-2013-xxx.txt' (CTF275 and Neuromag systems). See the example below.
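
For example, repositioning to a previous session could be configured as follows; the dataset and text file names are only placeholders:
<code>
  cfg = [];
  cfg.dataset  = 'buffer://hostname:1972';   % get data from buffer
  cfg.template = 'subject01xxx.ds';          % previous CTF dataset (CTF275 systems only)
  % or point to a text file saved with the 'Update' button in a previous session,
  % which also works for Neuromag systems:
  % cfg.template = '29-Apr-2013-xxx.txt';
  ft_realtime_headlocalizer(cfg)
</code>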
  
{{:faq:anims1.gif?direct&600|}}
  
//Figure 1: Top (left plot) and back view (right plot) of the subject's head. The nasion is represented by a triangular marker and both auricular points by circular markers. To aid the subject with repositioning, the real-time fiducial positions are color-coded to indicate the distances to the targets (green < 1.5 mm, orange < 3 mm, and red > 3 mm). If all three markers are within limits, the head turns light blue (CTF only). Click on the image for the animation.//
  
==== Replaying a subject's recorded head position ====
  ft_realtime_headlocalizer(cfg)
</code>
Before we can replay data acquired with the Elekta Neuromag, the data has to be preprocessed with maxfilter. The first possibility is to add the relevant information to the .fif file with MaxMove (see also under further reading).
The other option is to use maxfilter to create an ascii file containing the relevant information about the head movement: under 'Head position estimation', the button 'Save head positions in an ascii file' just needs to be pressed (see also under further reading).

<code>
  cfg = [];
  cfg.bufferdata   = 'first';             % read data from the first until the last segment
  cfg.template     = 'previousdataset';
  cfg.dataset      = 'previousdataset';
  cfg.headmovement = 'maxfilter.pos';     % ascii file with the head positions saved by maxfilter
  ft_realtime_headlocalizer(cfg)
</code>

==== CTF specific protocol ====
  
1) 'Initialize the MEG system'.
Keep in mind that Odin's data directory is automatically cleaned every now and then. If your template dataset has been removed, you can still read it from your own M disk in case you have backed it up there. Log out the meg user on the dedicated headlocalizer computer and log in as yourself. Now run the headlocalizer, specifying the file location on your M disk (e.g. cfg.template = '/home/action/arjsto/MEG/ArjSto_1200hz_20100812_01.ds').
  
==== Elekta specific protocol ====

Currently, the option for online monitoring is only available for the CTF system. The Elekta real-time data stream can already be processed in FieldTrip; however, the relevant information is currently missing from the real-time data stream. In principle, the procedure would look similar to the CTF specific protocol:

1) 'Initialize the MEG system'.

2) 'Start neuromag2ft for real-time head localization'.

3) 'Start Acq'. You should see activity in the terminal in which neuromag2ft is running.

4) Start MATLAB on the 'real-time computer'.

5) Visualize the subject's head in real-time.

<code>
  cfg = [];
  cfg.dataset = 'buffer://server:port';     % hostname and port of the FieldTrip buffer
  ft_realtime_headlocalizer(cfg)
</code>

==== Further reading ====
For more background on the real-time head localizer, please read {{:faq:stolkneuroimage2013.pdf|this paper}} by Stolk A, et al.

The above online head localization procedure can substantially reduce the influence of head movement within a session, e.g. using short repositioning instructions between experimental blocks, and also allows for accurate repositioning between sessions. However, residual head movement is likely to negatively impact statistical sensitivity and one may want to consider incorporating information about these head movements into the offline analysis. For instance, incorporating the head position time series into the general linear model, using [[reference:ft_regressconfound|ft_regressconfound]], has been found to improve statistical sensitivity by up to 30%.
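
As a minimal sketch of that approach, assuming that single-trial data are available (e.g. from ft_timelockanalysis with cfg.keeptrials = 'yes') and that per-trial head position parameters (e.g. translations and rotations derived from the HPI coils) have already been computed in an Ntrials-by-Nparameters matrix called confounds, the regression could look like this:
<code>
% remove the variance explained by the head position parameters from the
% single-trial data; 'confounds' and 'timelock' are assumed to exist already
cfg = [];
cfg.confound   = confounds;
timelock_clean = ft_regressconfound(cfg, timelock);
</code>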

Furthermore, instead of using the Polhemus to localize the electrode positions, we can also use the structure.io to localize them; you can find the tutorial [[tutorial/electrode|here]]. This means we do not need the Polhemus for our experimental procedure, which reduces the preparation time by having less to measure.

For the Elekta Neuromag system, the maxfilter [[https://www.google.nl/search?hl=nl&dcr=0&source=hp&ei=HtczWtaeGMbawAKP0JiYBg&q=maxfilter+user%E2%80%99s+guide&oq=maxfilter+user%E2%80%99s+guide&gs_l=psy-ab.3...708.708.0.1007.1.1.0.0.0.0.81.81.1.1.0....0...1c.2.64.psy-ab..0.0.0....0.PPP2C6Blbso|User's Guide, Chapter 4 MaxMove]] provides further information on offline head movement visualization and compensation.

For more information about the CTF head localization, we recommend the [[https://www.google.nl/search?ei=htczWqiUCs2VsAefoZP4BA&q=Head+Localization+Guide+CTF+MEGTM+Software&oq=Head+Localization+Guide+CTF+MEGTM+Software&gs_l=psy-ab.3...665.2032.0.2495.2.2.0.0.0.0.127.197.1j1.2.0....0...1c.1.64.psy-ab..0.0.0....0.S5__Ll6gens|Head Localization Guide CTF MEGTM Software]].