[RESOLVED] AP-PA and gradient distortion correction

Hi,
I have some questions regarding how to set the parameters for AP-PA distortion correction and for gradient distortion correction:

  1. I’ve seen your example using run_turnkey, and in the example mapping.txt file only rfMRI_REST_AP and rfMRI_Task_AP are mentioned, while the data contain both PA and AP DICOM files. Is the correction done automatically, or is a different mapping file needed?

  2. Regarding this correction, is it possible to do it if I have fewer volumes for PA? And if I don’t have a PA acquisition but only AP, is that a problem for the pipeline?

  3. Regarding gradient distortion correction, where can I set the gradient coefficient file? Inside the batch file?

Thanks for the help,

Chiara

Dear Chiara,

Excellent questions! In fact, the mapping file HCPA001_mapping.txt contains the following info:

#  HCP mapping file
#  ----------------
191                  => T1w
231                  => T2w
SpinEchoFieldMap_AP  => SE-FM-AP
SpinEchoFieldMap_PA  => SE-FM-PA
rfMRI_REST_AP_SBRef  => boldref:rest
rfMRI_REST_AP        => bold:rest
rfMRI_Task_AP_SBRef  => boldref:task
rfMRI_Task_AP        => bold:task
dMRI_dir98_AP        => DWI:dir98_AP
dMRI_dir98_PA        => DWI:dir98_PA
dMRI_dir99_AP        => DWI:dir99_AP
dMRI_dir99_PA        => DWI:dir99_PA

And, as you noted, the Quick Start Data contains the following DICOMs:

1-Localizer
2-AAHScout
3-AAHScout_MPR_sag
4-AAHScout_MPR_cor
5-AAHScout_MPR_tra
6-Localizer_aligned
7-SpinEchoFieldMap_AP
8-SpinEchoFieldMap_PA
9-rfMRI_REST_AP_SBRef
10-rfMRI_REST_AP
11-rfMRI_REST_AP_PhysioLog
12-rfMRI_REST_PA_SBRef
13-rfMRI_REST_PA
14-rfMRI_REST_PA_PhysioLog
15-T1w_setter
16-T1w_setter
17-T1w_MPR_vNav_4e
18-T1w_MPR_vNav_4e
19-T1w_MPR_vNav_4e_RMS
20-T1w_MPR_vNav_4e_RMS
21-T2w_setter
22-T2w_setter
23-T2w_SPC_vNav
24-T2w_SPC_vNav
25-TSE_HiResHp
26-TSE_HiResHp
27-SpinEchoFieldMap_AP
28-SpinEchoFieldMap_PA
29-dMRI_dir98_AP_SBRef
30-dMRI_dir98_AP
31-dMRI_dir98_AP_PhysioLog
32-dMRI_dir98_PA_SBRef
33-dMRI_dir98_PA
34-dMRI_dir98_PA_PhysioLog
35-dMRI_dir99_AP_SBRef
36-dMRI_dir99_AP
37-dMRI_dir99_AP_PhysioLog
38-dMRI_dir99_PA_SBRef
39-dMRI_dir99_PA
40-dMRI_dir99_PA_PhysioLog
99-PhoenixZIPReport

So to address your questions:

  1. The mapping file actually needed adjusting on our end to include both the AP and PA directions. The provided version will work, but we are updating the mapping file to showcase the example you asked about. We also noticed a discrepancy thanks to your question: there is no rfMRI_Task_AP_SBRef in the data, so that entry would have been ignored.

  2. No, it is not a problem; the pipeline can handle any combination.

  3. The gradient distortion correction file is set in the batch file in the following way: _hcp_bold_gdcoeffs : NONE. With this setting no gradient coefficient file will be used. However, if you have your vendor-specific gradient coefficient file, you can set it in this variable like so:

_hcp_bold_gdcoeffs : <path_to_your_file>/<gradient_correction_file>.grad

For our acceptance test scenario with the HCP dataset, for example, this is set as:

_hcp_bold_gdcoeffs : /gpfs/loomis/pi/n3/Studies/QuNexAcceptTest/SpecFiles/hcp_gradient_coefficient_files/Prisma_3T_coeff_AS82.grad

We will update the mapping and the batch files shortly, run a few tests and notify you so you can re-run the Quick Start.

Thank you for using QuNex!


Dear Alan,

thank you for your answers, they were very useful for understanding things better. I hope to hear from you soon about the update!

I have another question, if I may. I recently received data to preprocess with the HCP pipelines, but they were all in NIfTI format. Is there a way to use them as input? I’ve tried creating a study and copying them into the nii folder in the study folder (simulating the Quick Start organization example) and writing a session.txt, but I had no luck with this.

Chiara

Dear Chiara,

we fixed the Quick Start so it now properly uses both the PA and AP DICOM files. All you have to do is update your batch and mapping files. You can get the updated ones by running:

# -- Download the batch file
wget --no-check-certificate -r 'https://drive.google.com/uc?id=16FePg7JoQo2jqWTYoI8-sZPEmPaCzZNd&export=download' -O HCPA001_batch.txt

# -- Download the mapping specification file
wget --no-check-certificate -r 'https://drive.google.com/uc?id=1HtIm0IR7aQc8iJxf29JKW846VO_CnGUC&export=download' -O HCPA001_mapping.txt

Cheers, Jure


Dear Jure,

thank you so much!

Have a nice day,

Chiara

Dear Chiara,

Let me get back to your second question, about the case in which you already have the data in NIfTI format. There are two ways to go about it, depending on whether the files are organized in a BIDS compliant manner or not.

Importing a BIDS compliant NIfTI dataset
In the case of a BIDS compliant NIfTI dataset, you can use the import_bids command. A detailed description of the onboarding of BIDS compliant datasets is available on the wiki: https://bitbucket.org/oriadev/qunex/wiki/UsageDocs/OnboardingBIDSData.md. You can also refer to the inline help by running qunex import_bids.
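For illustration, an invocation might look like the sketch below. The parameter names (--sessionsfolder, --inbox) and paths are my assumptions for this example, so please double check them against the inline help; the snippet only assembles and prints the command, so it is safe to run and inspect without QuNex installed:

```shell
# Hypothetical sketch: parameter names and paths are assumptions on my part,
# verify them with `qunex import_bids` inline help. The command is only
# printed, not executed, so you can review it before running it for real.
MYSTUDY="/path/to/my_study"
BIDS_DATA="/path/to/bids_dataset"
CMD="qunex import_bids --sessionsfolder=${MYSTUDY}/sessions --inbox=${BIDS_DATA}"
echo "$CMD"
```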

Staging existing NIfTI files
If the dataset is not BIDS compliant, you can stage the files yourself, as you mentioned you tried to do. Let’s say that you have data for three sessions; each session has a T1w and a T2w structural image, a pair of AP/PA acquired spin echo field map images that should be used for both structural and BOLD distortion correction, and three resting state BOLD images. In this case you would follow the steps below. I am writing the steps as a bash script, but you can do the same using a GUI, an FTP client, or any other method.

First, you would create a study folder by running:

MYSTUDY="<path to the relevant folder>/my_study"
container create_study $MYSTUDY

Next, you would set up folders for sessions 01-03, each including a nii subfolder and the session.txt and session_hcp.txt files:

for session in 01 02 03
do
  mkdir -p $MYSTUDY/sessions/s$session/nii
  echo "id: s$session
subject: s$session
raw_data: $MYSTUDY/sessions/s$session/nii
hcp: $MYSTUDY/sessions/s$session/hcp

10: T1w image
20: T2w image
30: AP Spin echo FM image
40: PA Spin echo FM image
50: BOLD resting state image
60: BOLD resting state image
70: BOLD resting state image" > $MYSTUDY/sessions/s$session/session.txt
  echo "id: s$session
subject: s$session
raw_data: $MYSTUDY/sessions/s$session/nii
hcp: $MYSTUDY/sessions/s$session/hcp

hcpready: true
10: T1w         : T1w image : se(1)
20: T2w         : T2w image : se(1)
30: SE-FM-AP    : AP Spin echo FM image : se(1)
40: SE-FM-PA    : PA Spin echo FM image : se(1)
50: bold1:rest  : BOLD resting state image : se(1)
60: bold2:rest  : BOLD resting state image : se(1)
70: bold3:rest  : BOLD resting state image : se(1)" > $MYSTUDY/sessions/s$session/session_hcp.txt
done

You should now have the following folder structure (currently irrelevant subfolders are excluded):

my_study/
├── analysis
├── info
├── processing
└── sessions
    ├── archive
    ├── inbox
    ├── QC
    ├── s01
    │   ├── nii
    │   ├── session.txt
    │   └── session_hcp.txt
    ├── s02
    │   ├── nii
    │   ├── session.txt    
    │   └── session_hcp.txt
    ├── s03
    │   ├── nii
    │   ├── session.txt    
    │   └── session_hcp.txt
    └── specs
        └── batch_example.txt

Notice that we already prepared both the session.txt and session_hcp.txt files. You could also skip the session_hcp.txt files and instead prepare a related hcp_mapping.txt file. In this case the mapping file would be:

T1w image                   => T1w
T2w image                   => T2w
AP Spin echo FM image       => SE-FM-AP
PA Spin echo FM image       => SE-FM-PA
BOLD resting state image    => bold:rest

The next step is to copy the NIfTI images into the relevant $MYSTUDY/sessions/s$session/nii folders. The images should have the following file names:

10.nii.gz   # T1w image
20.nii.gz   # T2w image
30.nii.gz   # AP Spin echo FM image
40.nii.gz   # PA Spin echo FM image
50.nii.gz   # BOLD resting state image
60.nii.gz   # BOLD resting state image
70.nii.gz   # BOLD resting state image
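If your source files have descriptive names, this copy-and-rename step can be scripted. Below is a minimal sketch; the source file names (s01_T1w.nii.gz and so on) are made up for illustration, so adjust the name map to whatever your files are actually called. To be runnable as-is, the sketch points MYSTUDY and SRC at throwaway directories populated with empty placeholder files; in practice you would set them to your real study and source folders.

```shell
#!/usr/bin/env bash
# Sketch only: MYSTUDY and SRC are throwaway directories here so the loop
# can be tried out safely; in practice point them at your real folders.
MYSTUDY=$(mktemp -d)
SRC=$(mktemp -d)

# Create the session nii folders and empty placeholder source files with
# made-up descriptive names (replace with your actual file names).
for session in 01 02 03; do
  mkdir -p "$MYSTUDY/sessions/s$session/nii"
  for name in T1w T2w fmap_AP fmap_PA rest1 rest2 rest3; do
    touch "$SRC/s${session}_${name}.nii.gz"
  done
done

# Map descriptive names to the numeric names the session.txt files expect
declare -A NUM=([T1w]=10 [T2w]=20 [fmap_AP]=30 [fmap_PA]=40
                [rest1]=50 [rest2]=60 [rest3]=70)

# Copy each source image into its session's nii folder under the numeric name
for session in 01 02 03; do
  for name in "${!NUM[@]}"; do
    cp "$SRC/s${session}_${name}.nii.gz" \
       "$MYSTUDY/sessions/s$session/nii/${NUM[$name]}.nii.gz"
  done
done
```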

From here you should be ready to continue with the setup_hcp RunTurnkey step (or rather with the create_session_info step if you skipped the generation of the session_hcp.txt files above). Do note that you still need to prepare your correct batch parameter file!

Do let me know if you get stuck at any point.

With kind regards,

Grega


Dear Grega,

thank you for the detailed answer. I’ve tried the first approach (the BIDS one) successfully, but I think I’ll also try the second one, just to be ready in case future data require it. I’ll let you know if everything goes well with this one too.

Kind regards,

Chiara