Classification parameter optimisation for the KNN implementation of Wu and Dietterich's transfer learning schema
knntlOptimisation(
primary,
auxiliary,
fcol = "markers",
k,
times = 50,
test.size = 0.2,
xval = 5,
by = 0.5,
length.out,
th,
xfolds,
BPPARAM = BiocParallel::bpparam(),
method = "Breckels",
log = FALSE,
seed
)
primary: An instance of class "MSnSet".

auxiliary: An instance of class "MSnSet".
fcol: The feature meta-data column containing the marker definitions. Default is "markers".
k: Numeric vector of length 2, containing the best k parameters to use for the primary (k[1]) and auxiliary (k[2]) datasets. See knnOptimisation for generating the best k values.
times: The number of times cross-validation is performed. Default is 50.

test.size: The size of the test (validation) data. Default is 0.2 (20 percent).

xval: The number of rounds of cross-validation to perform.
by: The increment for theta; must be one of c(1, 0.5, 0.25, 0.2, 0.15, 0.1, 0.05).
length.out: Alternative to the by parameter. Specifies the desired length of the sequence of theta values to test.
th: A matrix of theta values to test for each class, as generated by the function thetas. The number of columns should equal the number of classes contained in fcol. Note: columns will be ordered according to getMarkerClasses(primary, fcol). This argument is only valid if the default method 'Breckels' is used.
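The theta grid passed via th can be generated with the thetas helper. A minimal sketch, assuming pRoloc is installed and that thetas takes the number of classes and a by increment; the 3-class setting is purely illustrative:

```r
library("pRoloc")

## All combinations of theta values in seq(0, 1, by = 0.5) for a
## hypothetical 3-class problem: one column per class, one row per
## theta combination to be tested by knntlOptimisation.
th <- thetas(3, by = 0.5)
dim(th)
```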
xfolds: Option to pass specific folds for the cross-validation.
BPPARAM: Required for parallelisation. If not specified, a default BiocParallelParam is selected from the global options or, if that fails, the most recently registered() back-end.
method: The k-NN transfer learning method to use. The default is 'Breckels', as described in Breckels et al. (2016). If 'Wu' is specified, the original method of Wu and Dietterich (2004) is used.
log: A logical defining whether logging should be enabled. Default is FALSE. Note that logging produces considerably bigger objects.
seed: The optional random number generator seed.
A list containing the theta combinations tested and, for each combination over each round (specified by times), the associated macro F1 score and accuracy.
knntlOptimisation implements a variation of Wu and Dietterich's transfer learning schema: P. Wu and T. G. Dietterich. Improving SVM accuracy by training on auxiliary data sources. In Proceedings of the Twenty-First International Conference on Machine Learning, pages 871-878. Morgan Kaufmann, 2004. A grid search for the best theta is performed.
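An end-to-end sketch of the optimisation described above. This is illustrative only: the datasets (andy2011 and andy2011goCC from pRolocdata), the marker column and the k values are assumptions, and in practice the k values would be estimated beforehand with knnOptimisation:

```r
library("pRoloc")
library("pRolocdata")

data(andy2011)       ## assumed primary MSnSet (quantitative proteomics)
data(andy2011goCC)   ## assumed auxiliary MSnSet (GO CC terms)

## Grid search over theta combinations; k = c(5, 5) is a placeholder
## for values obtained via knnOptimisation on each dataset.
topt <- knntlOptimisation(andy2011, andy2011goCC,
                          k = c(5, 5),
                          by = 0.5,
                          times = 5,
                          fcol = "markers")
```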
Breckels LM, Holden S, Wonjar D, Mulvey CM, Christoforou A, Groen AJ, Kohlbacher O, Lilley KS, Gatto L. Learning from heterogeneous data sources: an application in spatial proteomics. bioRxiv. doi: http://dx.doi.org/10.1101/022152
Wu P, Dietterich TG. Improving SVM Accuracy by Training on Auxiliary Data Sources. Proceedings of the 21st International Conference on Machine Learning (ICML); 2004.
See knntlClassification and the example therein.