package linwrap
Sources
sha256=d662173151e3ffed59ba4ac89f21a8d28507ba81ed8a96cf44a0334afcc1de96
md5=fa236a7e11393a35a358fd3c89bb2d40
Description
Linwrap can be used to train an L2-regularized logistic regression classifier or a linear Support Vector Regressor (SVR). You can optimize C (the L2 regularization parameter), w (the class weight) or k (the number of bags, i.e. use bagging). You can also find the optimal classification threshold using MCC maximization, use k-fold cross validation, parallelization, etc. In the regression case, only C and epsilon can be optimized.
When using bagging, each model is trained on balanced bootstraps drawn from the training set (one bootstrap for the positive class, one for the negative class). The size of each bootstrap is the size of the smaller (under-represented) class.
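As an illustration, bagging could be enabled or tuned from the command line as follows (a minimal sketch: 'train.liblin' is a hypothetical training file in the liblinear input format, and the flags are those documented in the usage below):

    # train with 50 bags and 5-fold cross validation on 8 cores
    linwrap -i train.liblin -k 50 -n 5 -np 8

    # or let linwrap search for a good number of bags
    linwrap -i train.liblin --scan-k -n 5 -np 8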
usage: linwrap
  -i <filename>: training set or DB to screen
  [-o <filename>]: predictions output file
  [-np <int>]: ncores
  [-c <float>]: fix C
  [-e <float>]: fix epsilon (for SVR); (0 <= epsilon <= max_i(|y_i|))
  [-w <float>]: fix w1
  [--no-plot]: no gnuplot
  [-k <int>]: number of bags for bagging (default=off)
  [{-n|--NxCV} <int>]: folds of cross validation
  [--mcc-scan]: MCC scan for a trained model (requires n > 1); also requires (c, w, k) to be known
  [--seed <int>]: fix random seed
  [-p <float>]: training set portion (in [0.0:1.0])
  [--pairs]: read from .AP files (atom pairs; will offset feat. indexes by 1)
  [--train <train.liblin>]: training set (overrides -p)
  [--valid <valid.liblin>]: validation set (overrides -p)
  [--test <test.liblin>]: test set (overrides -p)
  [{-l|--load} <filename>]: prod. mode; use trained models
  [{-s|--save} <filename>]: train. mode; save trained models
  [-f]: force overwriting existing model file
  [--scan-c]: scan for best C
  [--scan-e <int>]: epsilon scan #steps for SVR
  [--regr]: regression (SVR); also implied by -e and --scan-e
  [--scan-w]: scan weight to counter class imbalance
  [--w-range <float>:<int>:<float>]: specific range for w (semantic=start:nsteps:stop)
  [--e-range <float>:<int>:<float>]: specific range for e (semantic=start:nsteps:stop)
  [--c-range <float,float,...>]: explicit scan range for C (example='0.01,0.02,0.03')
  [--k-range <int,int,...>]: explicit scan range for k (example='1,2,3,5,10')
  [--scan-k]: scan number of bags (advice: optim. k rather than w)
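For a typical train-then-screen workflow, invocations might look like this (a sketch only; all file names are hypothetical, and the flags are the ones listed above):

    # scan C and the class weight w with 5-fold cross validation,
    # then save the trained model to 'model.bin'
    linwrap -i train.liblin --scan-c --scan-w -n 5 -np 8 -s model.bin

    # production mode: screen a database with the saved model
    linwrap -i db_to_screen.liblin -l model.bin -o preds.txt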
Published: 06 Apr 2021
README
linwrap
Wrapper on top of liblinear-tools.
Linwrap can be used to train an L2-regularized logistic regression classifier or a linear Support Vector Regressor (SVR). You can optimize C (the L2 regularization parameter), w (the class weight) or k (the number of bags, i.e. use bagging). You can also find the optimal classification threshold using MCC maximization, use k-fold cross validation, parallelization, etc. In the regression case, only C and epsilon can be optimized.
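For the regression case, a plausible SVR run scans both C and epsilon (again a sketch with hypothetical file names; per the usage above, --regr is implied by --scan-e):

    # SVR: scan C and epsilon (20 steps) with 5-fold cross validation
    linwrap -i train.liblin --regr --scan-c --scan-e 20 -n 5 -np 8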
Bibliography
[1] Fan, R. E., Chang, K. W., Hsieh, C. J., Wang, X. R., & Lin, C. J. (2008). LIBLINEAR: A library for large linear classification. Journal of Machine Learning Research, 9(Aug), 1871-1874.
[2] Hsu, C. W., Chang, C. C., & Lin, C. J. (2003). A practical guide to support vector classification.
[3] Hsia, J. Y., & Lin, C. J. (2020). Parameter selection for linear support vector regression. IEEE Transactions on Neural Networks and Learning Systems.
[4] Breiman, L. (1996). Bagging predictors. Machine Learning, 24(2), 123-140.