[Overview] Learning from Imbalanced Classes

Original article: Learning from Imbalanced Classes

Class imbalance is a classic problem that comes up constantly in data mining, computational advertising, NLP, and similar work. The article summarizes approaches that may be effective and is worth referencing:

  • Do nothing. Sometimes you get lucky and nothing needs to be done. You can train on the so-called natural (or stratified) distribution and sometimes it works without need for modification.
  • Balance the training set in some way (a small sketch combining several of these options appears after this list):
    • Oversample the minority class.
    • Undersample the majority class.
    • Synthesize new minority classes.
  • Throw away minority examples and switch to an anomaly detection framework.
  • At the algorithm level, or after it:
    • Adjust the class weight (misclassification costs).
    • Adjust the decision threshold.
    • Modify an existing algorithm to be more sensitive to rare classes.
  • Construct an entirely new algorithm to perform well on imbalanced data.
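As a rough illustration of several of the options above, here is a minimal sketch using scikit-learn on a synthetic 95:5 dataset. The dataset, the logistic regression model, and the 0.2 threshold are arbitrary choices for demonstration and are not from the original article; in practice the threshold and weights would be tuned on a validation set.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Synthetic binary problem with a 95:5 class ratio.
X, y = make_classification(n_samples=5000, n_features=20,
                           weights=[0.95, 0.05], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=0)

# (1) "Do nothing": train on the natural (stratified) distribution.
baseline = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# (2) Oversample the minority class by resampling it with replacement
#     until both classes have the same number of training examples.
rng = np.random.default_rng(0)
minority = np.where(y_train == 1)[0]
majority = np.where(y_train == 0)[0]
extra = rng.choice(minority, size=len(majority) - len(minority), replace=True)
idx = np.concatenate([majority, minority, extra])
oversampled = LogisticRegression(max_iter=1000).fit(X_train[idx], y_train[idx])

# (3) Adjust misclassification costs via class weights instead of resampling.
weighted = LogisticRegression(max_iter=1000,
                              class_weight="balanced").fit(X_train, y_train)

# (4) Keep the baseline model but lower the decision threshold from 0.5.
threshold = 0.2  # illustrative value; tune on a validation set in practice
probs = baseline.predict_proba(X_test)[:, 1]
thresholded_pred = (probs >= threshold).astype(int)

for name, pred in [("baseline", baseline.predict(X_test)),
                   ("oversampled", oversampled.predict(X_test)),
                   ("class_weight", weighted.predict(X_test)),
                   ("threshold=0.2", thresholded_pred)]:
    print(name)
    print(classification_report(y_test, pred, digits=3))
```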