QMRITools`
TrainSegmentationNetwork
TrainSegmentationNetwork[{inFol,outFol}]
trains a segmentation network. The correctly prepared training data should be stored in inFol. The progress of each training round is saved in outFol.
TrainSegmentationNetwork[{inFol,outFol},netCont]
does the same but defines how to continue with netCont. If netCont is "Start", training is restarted. If netCont is an initialized network or a network file (".wlnet"), it is used as the starting point. If netCont is an outFol, the last network saved there is used. Possible loss functions are {"SoftDice", "SquaredDiff", "Tversky", "CrossEntropy", "Jaccard"}.
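A minimal usage sketch based on the signatures above (the folder paths are hypothetical and assume correctly prepared training data):

```wolfram
(* Start a fresh training run; inFol holds the prepared training data,
   outFol receives the per-round progress. Paths are illustrative only. *)
trained = TrainSegmentationNetwork[{"data/train", "data/out"}];

(* Resume training from the last network saved in the output folder. *)
trained = TrainSegmentationNetwork[{"data/train", "data/out"}, "data/out"];
```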
Details
- The following options can be given:
- LoadTrainingData (default: True): If set to True, the training data is loaded from disk.
- MonitorInterval (default: 1): Defines how often the training is monitored.
- PatchSize (default: {32, 112, 112}): Defines the patch size used in network training.
- PatchesPerSet (default: 1): Option for GetTrainData. Defines how many random patches per dataset are created within the batch.
- BatchSize (default: 4): Option for NetTrain and related functions; specifies the size of a batch of examples to process together.
- RoundLength (default: 512): Defines how many batches are seen during each training round.
- MaxTrainingRounds (default: 150): Option for NetTrain and related functions; specifies the maximum number of training rounds.
- BlockType (default: "ResNet"): Option for MakeUnet. Specifies the type of block used in the network: "Conv", "UNet", "ResNet", "DenseNet", "Inception", or "U2Net".
- NetworkArchitecture (default: "UNet"): Option for MakeUnet. Defines the architecture of the network: "UNet", "UNet+", or "UNet++". For "UNet+" or "UNet++" it can also be {arch, i}, where i specifies how many of the top layers are connected to the mapping layer.
- ActivationType (default: "GELU"): Option for MakeUnet. Specifies which activation layer is used in the network: "LeakyRELU" or any type allowed by a "name" definition in ElementwiseLayer.
- DownsampleSchedule (default: 2): Option for MakeUnet. Defines how the data is downsampled in each of the deeper layers of the UNet. By default it is a factor of two per layer. A custom schedule for a five-layer 3D UNet could be {{2,2,2}, {1,2,2}, {2,2,2}, {1,2,2}, 1}. The deepest layer is always downsampled by 1 and therefore does not need to be specified.
- SettingSchedule (default: Automatic): Option for MakeUnet. Defines the settings for the UNet blocks. If a single setting is given, it is applied to all layers; if a list of settings is given, the settings can differ per layer. The defaults are: "UNet" -> convblock repetitions, 2; "ResNet" -> convblock repetitions, 2; "DenseNet" -> {dense depth, block repetitions}, {4, 2}; "Inception" -> {inception width, block repetitions}, {4, 2}; "U2Net" -> {UNet depth, downscale}, {5, True}.
- FeatureSchedule (default: 32): Option for MakeUnet. Defines how the number of features increases in each of the deeper layers of the UNet. By default the number of features increases by a factor of two per layer, i.e. {1, 2, 4, 8, 16}.
- NetworkDepth (default: 5): Option for MakeUnet. Specifies how deep the UNet will be.
- AugmentData (default: True): Option for GetTrainData and TrainSegmentationNetwork. If set to True, the training data is augmented.
- LossFunction (default: All): Option for NetTrain; specifies how to compare actual and requested outputs from a neural net.
- DropoutRate (default: 0.2): Option for MakeUnet. Specifies how much dropout is used after each block; a value between 0 and 1, default 0.2.
- LearningRate (default: 0.001): Option for NetTrain; specifies the rate at which neural net weights are adjusted to minimize the training loss.
- L2Regularization (default: 0.0001): Defines the L2 regularization factor.
- MonitorCalc (default: False): Option for many processing functions. When True, the progress of the calculation is shown.
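A hedged sketch of overriding a few of the defaults listed above; the option names come from the table, but the paths and chosen values are illustrative only:

```wolfram
(* Train with a handful of non-default options; "data/train" and
   "data/out" are hypothetical folders. Values shown are examples,
   not recommendations. *)
trained = TrainSegmentationNetwork[{"data/train", "data/out"},
   PatchSize -> {32, 112, 112},
   BatchSize -> 4,
   MaxTrainingRounds -> 150,
   BlockType -> "ResNet",
   NetworkArchitecture -> "UNet",
   LossFunction -> "SoftDice",
   LearningRate -> 0.001
];
```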