Detecting Rare Actions and Events from Surveillance Big Data with Bag of Dynamic Trajectories

Surveillance video is increasingly becoming the "biggest big data". This presents an unprecedented challenge for analyzing and mining meaningful information (e.g., rare actions or events) in such a huge volume of video. Recent studies have shown that feature-trajectory-based methods are effective at encoding motion information in video, and consequently demonstrate superior performance in action and event detection. However, in existing methods, the distance between two trajectories is often measured by linear models, which may not be robust enough when the lengths of the trajectories vary. Moreover, because target actions or events are rare, a traditional classifier tends to label all samples as negative, producing a heavy performance bias. To address both issues, this paper proposes a trajectory descriptor, BoDT (Bag of Dynamic Trajectories), and a multi-channel uneven SVM. By using the DTW (dynamic time warping) algorithm to measure the similarity between two trajectories, BoDT is robust for variable-length trajectory representation. Meanwhile, as an extension of SVM with uneven margins, the proposed multi-channel uneven SVM can successfully identify rare events by adjusting a margin parameter so that the classification boundary is moved properly away from the positive training examples. Extensive experiments on several benchmark datasets, including KTH, YouTube, Olympic, MIT, QMUL and TRECVid, demonstrate that our approach is feasible and effective.
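To illustrate the DTW-based similarity the abstract refers to, the following is a minimal sketch of dynamic time warping between two variable-length trajectories. The point representation (2-D coordinates) and the Euclidean point-to-point distance are illustrative assumptions, not the paper's exact formulation.

```python
import math

def dtw_distance(traj_a, traj_b):
    """DTW distance between two trajectories of possibly different lengths.

    Each trajectory is a sequence of (x, y) points; DTW finds the
    minimum-cost monotonic alignment between the two sequences, so
    trajectories of different lengths can still be compared directly.
    """
    n, m = len(traj_a), len(traj_b)
    inf = float("inf")
    # cost[i][j]: minimum cost of aligning the first i points of traj_a
    # with the first j points of traj_b.
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(traj_a[i - 1], traj_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a point in traj_a
                                 cost[i][j - 1],      # skip a point in traj_b
                                 cost[i - 1][j - 1])  # match both points
    return cost[n][m]
```

For example, a trajectory compared against a copy of itself with a repeated point still yields zero distance, which is the robustness to variable lengths that a fixed-length linear distance lacks.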