Acronym: | TSF | Type: | Intervals | Year: | 2013 | Publication: | Information Sciences |

Description: | Deng *et al.* overcome the problem of the huge interval feature space by employing a random forest approach, using summary statistics of each interval as features. Training a single tree involves selecting $\sqrt{m}$ random intervals, generating the mean, standard deviation and slope of each random interval for every series, then creating and training a tree on the resulting $3\sqrt{m}$ features. Classification is by a majority vote of all the trees in the ensemble. The classification tree has two bespoke characteristics. Firstly, rather than evaluating all possible split points to find the best information gain, a fixed number of evaluation points is pre-defined. We assume this is an expedient to make the classifier faster, as it removes the need to sort the cases by each attribute value. Secondly, a refined splitting criterion is introduced to choose between features with equal information gain. This is defined as the distance between the splitting margin and the closest case. The intuition behind the idea is that if two splits have equal entropy gain, then the split that is furthest from the nearest case should be preferred. This measure would have no value if all possible intervals were evaluated, because by definition the split points are taken as equidistant between cases. We experimented with including these two features, but found the effect on accuracy was, if anything, negative. We found the computational overhead of evaluating all split points acceptable, hence we had no need to include the margin-based tie-breaker. We used the built-in Weka RandomTree classifier (which is the basis for the Weka RandomForest classifier) with default parameters. This means there is no limit to the depth of the tree and no minimum number of cases per leaf node. A more formal description is given in Algorithm 6. |
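The interval feature extraction described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the Weka implementation used in our experiments: the function and variable names (`interval_features`, `random_intervals`) are ours, and the full TSF would additionally grow one tree per random feature set and combine the trees by majority vote.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_intervals(m, n_intervals, min_len=3):
    """Draw n_intervals random (start, end) intervals from a series of length m."""
    out = []
    for _ in range(n_intervals):
        length = rng.integers(min_len, m + 1)
        start = rng.integers(0, m - length + 1)
        out.append((start, start + length))
    return out

def interval_features(X, intervals):
    """Mean, standard deviation and slope of each interval, per series."""
    feats = []
    for start, end in intervals:
        seg = X[:, start:end]            # shape (n_series, interval_length)
        t = np.arange(seg.shape[1])
        mean = seg.mean(axis=1)
        std = seg.std(axis=1)
        # least-squares slope of each segment regressed on time
        slope = np.polyfit(t, seg.T, 1)[0]
        feats.extend([mean, std, slope])
    return np.column_stack(feats)        # shape (n_series, 3 * n_intervals)

# toy data: 10 series of length m = 64, so one tree sees sqrt(64) = 8 intervals
X = rng.normal(size=(10, 64))
intervals = random_intervals(X.shape[1], int(np.sqrt(X.shape[1])))
F = interval_features(X, intervals)
print(F.shape)  # (10, 24): three summary statistics per interval
```

Each tree in the ensemble would be trained on a different draw of `intervals`, which is what keeps the trees decorrelated despite the shared training series.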

Source Code: | Time Series Forest Code |

Published Results: |

Dataset: | Result: |

Adiac | 0.261 |

Beef | 0.3 |

CBF | 0.039 |

ChlorineConcentration | 0.26 |

CinCECGtorso | 0.069 |

Coffee | 0.071 |

CricketX | 0.287 |

CricketY | 0.2 |

CricketZ | 0.239 |

DiatomSizeReduction | 0.101 |

ECGFiveDays | 0.07 |

FaceAll | 0.231 |

FaceFour | 0.034 |

FacesUCR | 0.109 |

FiftyWords | 0.277 |

Fish | 0.154 |

GunPoint | 0.047 |

Haptics | 0.565 |

InlineSkate | 0.675 |

ItalyPowerDemand | 0.033 |

Lightning2 | 0.18 |

Lightning7 | 0.263 |

Mallat | 0.072 |

MedicalImages | 0.232 |

MoteStrain | 0.118 |

NonInvasiveFatalECGThorax1 | 0.103 |

NonInvasiveFatalECGThorax2 | 0.094 |

OliveOil | 0.1 |

OSULeaf | 0.426 |

SonyAIBORobotSurface1 | 0.235 |

SonyAIBORobotSurface2 | 0.177 |

StarlightCurves | 0.036 |

SwedishLeaf | 0.109 |

Symbols | 0.121 |

SyntheticControl | 0.023 |

TwoLeadECG | 0.112 |

TwoPatterns | 0.053 |

UWaveGestureLibraryX | 0.213 |

UWaveGestureLibraryY | 0.288 |

UWaveGestureLibraryZ | 0.267 |

Wafer | 0.047 |

WordSynonyms | 0.381 |

Yoga | 0.157 |

Algorithm: |