HCP TransmitBias pipeline

Hi everyone,

I’ve been trying to use QuNex to reconstruct T1/T2 myelin maps within the HCP pipeline, and I’m wondering if anyone already has a complete batch script for running a standard structural HCP pipeline + HCP TransmitBias pipeline (hcp_transmit_bias_individual).

I’ve gathered some helpful information from the HCP Users Group discussion here, including:

  • The RF_MAP sequence (the default B1 map on Siemens scanners) can be used as the transmit field map.
  • If the RF map is provided, it’s not necessary to process the structural scans alongside fieldmaps or SBRefs from functional data.
  • The required group template is the transmit-corrected group myelin file:
    MNINonLinear/fsaverage_LR32k/Partial.MyelinMap_GroupCorr_MSMAll.32k_fs_LR.dscalar.nii
    (following Tim Coalson’s post)

If anyone has a working batch or any advice, I’d really appreciate your input!

Thanks!

Hi!

Welcome to the QuNex forum. Can you tell us a bit more about the data you have? What format is it currently in (raw DICOMs, BIDS, …)? How many sessions are we talking about?

Based on this I can see what steps are needed, and then we can take it from there. Once we onboard the data into a QuNex study, it will be clear what kind of imaging data you have, and then we can spec out commands that will get you through structural processing and the TransmitBias pipeline.

A good starting point if you have DICOM data is the QuNex quickstart (QuNex quick start using a container — QuNex documentation).
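For orientation, the first steps there look roughly like this (a sketch, with ${STUDY_FOLDER}, ${RAW_DATA} and ${QUNEX_CONTAINER} standing in for your study path, raw data location and container image; see the quickstart for the exact flags):

# create the study folder structure, then onboard the DICOMs
qunex_container create_study \
  --studyfolder="${STUDY_FOLDER}" \
  --container="${QUNEX_CONTAINER}"

qunex_container import_dicom \
  --sessionsfolder="${STUDY_FOLDER}/sessions" \
  --masterinbox="${RAW_DATA}" \
  --archive='leave' \
  --container="${QUNEX_CONTAINER}"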

Best, Jure

Dear Jure,

Thank you for your quick response.

Indeed, I’ve just started using QuNex and haven’t yet had the chance to explore all of its functionality. For now, our goal is to preprocess the T1w and T2w images and generate myelin maps with transmit field bias correction, following the recent implementation.

We are conducting a longitudinal study involving children aged 5–8, with a planned total of over 300 MRI sessions (currently ~150 completed). Our data is organized according to the BIDS specification, as in the following example:

KJ23a/sub-KJ007/
└── ses-01/
    ├── anat/
    │   ├── sub-KJ007_ses-01_T2w.json
    │   ├── sub-KJ007_ses-01_T2w.nii.gz
    │   ├── sub-KJ007_ses-01_run-01_T1w.json
    │   ├── sub-KJ007_ses-01_run-01_T1w.nii.gz
    │   ├── sub-KJ007_ses-01_run-02_T1w.json
    │   ├── sub-KJ007_ses-01_run-02_T1w.nii.gz
    │   ├── sub-KJ007_ses-01_run-03_T1w.json
    │   └── sub-KJ007_ses-01_run-03_T1w.nii.gz
    ├── fmap/
    │   ├── sub-KJ007_ses-01_acq-hcp_dir-AP_epi.json
    │   ├── sub-KJ007_ses-01_acq-hcp_dir-AP_epi.nii.gz
    │   ├── sub-KJ007_ses-01_acq-hcp_dir-PA_epi.json
    │   ├── sub-KJ007_ses-01_acq-hcp_dir-PA_epi.nii.gz
    │   ├── sub-KJ007_ses-01_acq-std_dir-AP_epi.json
    │   ├── sub-KJ007_ses-01_acq-std_dir-AP_epi.nii.gz
    │   ├── sub-KJ007_ses-01_acq-std_dir-PA_epi.json
    │   └── sub-KJ007_ses-01_acq-std_dir-PA_epi.nii.gz
    └── func/
        ├── sub-KJ007_ses-01_task-alicja1_run-01_bold.json
        ├── sub-KJ007_ses-01_task-alicja1_run-01_bold.nii.gz
        ├── sub-KJ007_ses-01_task-alicja1_run-02_bold.json
        ├── sub-KJ007_ses-01_task-alicja1_run-02_bold.nii.gz
        ├── sub-KJ007_ses-01_task-alicja2_bold.json
        ├── sub-KJ007_ses-01_task-alicja2_bold.nii.gz
        ├── sub-KJ007_ses-01_task-rest_bold.json
        ├── sub-KJ007_ses-01_task-rest_bold.nii.gz
        ├── sub-KJ007_ses-01_task-rest_sbref.json
        └── sub-KJ007_ses-01_task-rest_sbref.nii.gz

As you may notice, I haven’t yet added the RF MAP scans to the repository — these are two files: the magnitude and phase maps.

So far, I’ve been running QuNex commands manually for individual subjects. Here’s what I’ve done:

qunex_container import_bids \
  --sessionsfolder="${STUDY_FOLDER}/sessions" \
  --inbox="${RAW_DATA}" \
  --action='copy' \
  --archive='leave' \
  --overwrite=no \
  --fileinfo=full \
  --container="${QUNEX_CONTAINER}"

At this stage, I had to manually edit /sessions/session_hcp.txt to add :T1w, as it wasn’t automatically labeled.
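Concretely, the edit was along these lines (illustrative; the unlabeled form is what the import generated):

# as generated:
1   :                :run-01_T1w: DwellTime(9.8e-06)
# after my manual edit:
1   :T1w             :run-01_T1w: DwellTime(9.8e-06)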

Then I proceeded with the preprocessing steps:

qunex_container hcp_pre_freesurfer \
  --sessionsfolder="${STUDY_FOLDER}/sessions" \
  --batchfile="${STUDY_FOLDER}/processing/batch.txt" \
  --container="${QUNEX_CONTAINER}" \
  --hcp_avgrdcmethod=NONE

This was followed by the FreeSurfer and post-FreeSurfer steps — all of which ran very smoothly.
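For completeness, those were invoked the same way (reconstructed; same batch file and container as above):

qunex_container hcp_freesurfer \
  --sessionsfolder="${STUDY_FOLDER}/sessions" \
  --batchfile="${STUDY_FOLDER}/processing/batch.txt" \
  --container="${QUNEX_CONTAINER}"

qunex_container hcp_post_freesurfer \
  --sessionsfolder="${STUDY_FOLDER}/sessions" \
  --batchfile="${STUDY_FOLDER}/processing/batch.txt" \
  --container="${QUNEX_CONTAINER}"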

I then attempted the transmit field bias correction:

qunex_container hcp_transmit_bias_individual \
  --sessionsfolder="${STUDY_FOLDER}/sessions" \
  --batchfile="${STUDY_FOLDER}/processing/batch.txt" \
  --container="${QUNEX_CONTAINER}" \
  --hcp_b1tx_magnitude=/home/bkossows/qunex/b1tx/mag \
  --hcp_b1tx_phase=/home/bkossows/qunex/b1tx/pha \
  --hcp_group_corrected_myelin=/home/bkossows/qunex/Partial.MyelinMap_GroupCorr_MSMAll.32k_fs_LR.dscalar.nii \
  --hcp_transmit_mode=B1Tx

This last step was just a test and did not succeed. I hope this message clarifies what I am trying to accomplish, and I would greatly appreciate any suggestions you have for improving the pipeline.

Best regards,
Bartosz Kossowski

Could you upload the batch.txt file so I can check whether the B1Tx data was onboarded and labeled properly?

T1w was probably not marked because there are three of them (run-01, run-02, run-03) and there is no way of knowing which one should be used. We will discuss internally what the best way to proceed would be in such cases.

Also, can you upload the error log from your hcp_transmit_bias_individual run? It should be in /processing/logs/comlogs, or perhaps in runlogs if the run exited before even triggering the HCP Pipelines command.

Best, Jure

Hi again,

Here are the files you asked for. I’m certain that b1tx wasn’t addressed at all in batch.txt; I’ve only added the command-line parameters for it.

As for the T1w scans, we want to keep two runs per subject; in this case I included even more, just for testing purposes. The HCP pre_freesurfer pipeline automatically registers and averages the multiple runs.

error_import_bids_2025-05-21_12.08.11.516674.log (1.1 KB)
batch.txt (5.1 KB)

Hi,

So, I think we need to resolve two things here.

  1. The first one is the T1w labelling. You said that you needed to label something manually to get:
1   :T1w                :run-01_T1w: DwellTime(9.8e-06)
2   :T1w                :run-02_T1w: DwellTime(9.8e-06)
3   :T1w                :run-03_T1w: DwellTime(9.8e-06)
4   :T1w                :run-04_T1w: DwellTime(9.8e-06)
5   :T2w             :T2w: UnwarpDir(x): DwellTime(2.7e-06): phenc(RL)

Did you have to add T1w manually to all T1w scans, or was it added to a single one and you needed to add it to the others to achieve averaging? It might be that our import_bids does not support scans named run-xy_T1w, but only a single, exact T1w. If that is the case, it is easy to fix on our end.

  2. For the BIDS import, I think your B1 data is not named per the BIDS standard (Quantitative MRI - Brain Imaging Data Structure 1.10.0). I think in your case you would need TB1TFL (see the naming sketch below)? That is the tag carried by the data we used when developing the TransmitBias feature, in a case similar to yours. Maybe there is also a TransmitBias use case that relies on “regular” fieldmaps; I would need to consult the HCP folks on this. I am also not sure why the fieldmaps or bolds were not onboarded. I am not a BIDS expert, so I will ask someone more familiar with this to take a look as well.
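If I read the qMRI appendix of the spec correctly, the two images would be named roughly like this (a sketch; acq-anat for the anatomical-like magnitude image and acq-famp for the flip-angle map, but please verify against the spec):

fmap/
├── sub-KJ007_ses-01_acq-anat_TB1TFL.json
├── sub-KJ007_ses-01_acq-anat_TB1TFL.nii.gz
├── sub-KJ007_ses-01_acq-famp_TB1TFL.json
└── sub-KJ007_ses-01_acq-famp_TB1TFL.nii.gz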

Best, Jure

@bkossows Can you please send me the .json files for subject KJ007 so that I can fully reproduce the import?
How did you prepare the session_hcp.txt files? Did you use create_session_info? Can you paste your hcp_mapping.txt file here?

Thank you again for your responses and helpful suggestions!

1. T1w Labeling and session_hcp.txt

Onboarding multiple T1w scans seems to work correctly. I used import_bids followed by create_session_info to generate the session_hcp.txt file as follows:

1   :T1w             :run-01_T1w: DwellTime(9.8e-06)
2   :T1w             :run-02_T1w: DwellTime(9.8e-06)
3   :T1w             :run-03_T1w: DwellTime(9.8e-06)
4   :                :run-04_T1w: DwellTime(9.8e-06)
5   :T2w             :T2w: UnwarpDir(x): DwellTime(2.7e-06): phenc(RL)

Here is the hcp_mapping.txt I used:

run-01_T1w      => T1w
run-02_T1w      => T1w
run-03_T1w      => T1w
T2w             => T2w

So, at least in this setup, having multiple run-xy_T1w files is handled correctly. Only run-04 wasn’t mapped, which I left out intentionally.
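For completeness, the create_session_info call was along these lines (reconstructed from memory; please double-check the flag names against the QuNex docs):

qunex_container create_session_info \
  --sessionsfolder="${STUDY_FOLDER}/sessions" \
  --mapping="${STUDY_FOLDER}/sessions/specs/hcp_mapping.txt" \
  --container="${QUNEX_CONTAINER}"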

2. B1 Mapping and RF Map Import

Regarding B1 mapping—this part is indeed a bit tricky. I’ve noticed that dcm2niix doesn’t properly convert my RF maps. Specifically, it fails to sort slices and frames, resulting in a single image that contains what looks like “four heads” (i.e., incorrectly stacked frames). So I can’t use my BIDS repository to load B1 maps. However, converting the same DICOM file using SPM12 (MATLAB) yields a proper two-frame NIfTI, with both the magnitude and flip-angle images correctly separated.
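In case it helps anyone else, the bad stacking is easy to spot in the header dimensions (filename illustrative):

# in-plane dims come out multiplied when frames are stacked side by side
fslhd rf_map.nii.gz | grep '^dim'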

I’ve attached the anonymized DICOM file. Could you check whether this file can be imported correctly into QuNex directly, without preprocessing it in SPM?

Also, does it make sense to mix BIDS and DICOM imports in the same QuNex pipeline, or should everything go through one consistent format?

I’m hopeful that once the RF map is imported correctly, it will integrate smoothly with the rest of the pipeline.

Kind regards,
Bartosz Kossowski

1.3.12.2.1107.5.2.43.167119.30000025060209575526500000011-20-1-dpveah.dcm.log (487.1 KB)

Re point 2: it looks like dicm2nii does the same shuffling as dcm2niix :confused:

Hi,
sorry for the late reply. While mixing BIDS and DICOM imports is usually not recommended, I think you can mix them here. As long as the nii folder and the session.txt file are consistent, later functions should work fine. For now, I suggest you use the SPM import and simply replace the improperly imported .nii files with the SPM versions.
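For example (all paths and image numbers are illustrative; the numbers must match the session.txt entries for those scans):

# overwrite the badly converted volumes with the SPM-converted ones
cp spm_converted/rf_map_magnitude.nii.gz "${STUDY_FOLDER}/sessions/KJ007/nii/10.nii.gz"
cp spm_converted/rf_map_phase.nii.gz "${STUDY_FOLDER}/sessions/KJ007/nii/11.nii.gz"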

We will look into the issue with the B1 import! However, Jure is currently on vacation, and I don’t have enough experience with the TransmitBias pipeline to offer appropriate help myself.

Like I said above, I think that in your case the images in question should be imported as TB1TFL-Magnitude and TB1TFL-Phase. That is how similar data, which we received from HCP while coding QuNex support for the TransmitBias HCP pipelines, was onboarded and then used for processing. Maybe the easiest way, one that does not conflate everything, would be to:

  • Properly convert whatever DCMs you have to nii.
  • Sort everything out into a proper BIDS structure.
  • Onboard into QuNex with import_bids.
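The last step would then be the same import you already ran, pointed at the corrected BIDS tree (a sketch; ${BIDS_FIXED} is a placeholder for that tree):

qunex_container import_bids \
  --sessionsfolder="${STUDY_FOLDER}/sessions" \
  --inbox="${BIDS_FIXED}" \
  --action='copy' \
  --archive='leave' \
  --overwrite=no \
  --container="${QUNEX_CONTAINER}"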

The issues with dcm2niix could be out of our hands, as they seem to arise in that third-party tool. If it is simply a parametrization issue, we can fix it, but it does not look that way. We support multiple DICOM-to-NIfTI conversion tools for exactly this reason; none of them seems to work well in all scenarios.

Hi,

Any updates here or was this resolved?

Once the data is onboarded properly, all you need to do is prepare the correct mapping file and you should be good to go.
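For example, extending your earlier mapping file could look something like this (the left-hand names are placeholders that depend on how the B1 images are named after import; the TB1TFL-Magnitude/TB1TFL-Phase labels are what we used during development):

run-01_T1w        => T1w
run-02_T1w        => T1w
T2w               => T2w
acq-anat_TB1TFL   => TB1TFL-Magnitude
acq-famp_TB1TFL   => TB1TFL-Phase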

Best, Jure