PP: Learning representations for time series clustering

Problem: time series clustering (TSC), an unsupervised learning task in which category information is not available.

Time-series clustering is used for anomaly detection and pattern detection.

Feature-based time series clustering methods typically rely on domain knowledge to manually construct high-quality features.

Deep Temporal Clustering Representation (DTCR): integrates temporal reconstruction and a k-means objective into the seq2seq model.

Introduction:

Time-series clustering is a data mining technique: it goes from data to knowledge by extracting valuable information.

Feature-based methods: extract features first, then cluster. Such methods are robust to noise and outliers, and the feature extraction performs dimension reduction, which improves efficiency.
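As a minimal sketch of this feature-based pipeline (the statistical feature set and the k-means initialization below are hypothetical choices for illustration, not taken from the paper):

```python
import numpy as np

def extract_features(series_batch):
    """Map each time series to a small statistical feature vector
    (mean, std, min, max) -- a hypothetical hand-crafted feature set."""
    x = np.asarray(series_batch, dtype=float)
    return np.stack([x.mean(axis=1), x.std(axis=1),
                     x.min(axis=1), x.max(axis=1)], axis=1)

def kmeans(X, k, n_iter=20):
    """Plain Lloyd's k-means with deterministic farthest-point init
    (keeps this toy demo reproducible without a random seed)."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = ((X[:, None, :] - np.array(centers)[None, :, :]) ** 2).sum(-1).min(axis=1)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(n_iter):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Two clearly different groups of toy series: near-flat lines vs. sine waves.
t = np.linspace(0, 2 * np.pi, 100)
flat = [np.full(100, c) for c in (0.0, 0.1, -0.1)]
waves = [a * np.sin(t) for a in (3.0, 3.5, 4.0)]
labels = kmeans(extract_features(flat + waves), k=2)
```

Clustering happens in the low-dimensional feature space (4 numbers per series instead of 100 samples), which is exactly the dimension-reduction benefit the note mentions.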

However, most existing methods require domain knowledge to construct high-quality features manually.

The goal: learn discriminative features automatically.

Seq2seq Model: it can learn general representations from sequential data.

We aim to learn a non-linear temporal representation for TSC using a seq2seq model.

When a seq2seq model is used, the lack of labels means there is nothing to guide the learning process toward cluster-specific representations. So how does this paper solve that problem?

Answer: it generates cluster-specific temporal representations.

DTCR = temporal reconstruction + k-means + seq2seq model
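The k-means component in DTCR is the spectral relaxation of the k-means objective, Tr(H^T H) - Tr(F^T H^T H F), where the columns of H are the learned representations and the orthonormal F has a closed-form optimum (the top-k eigenvectors of H^T H, by the Ky Fan theorem). A simplified numpy sketch of the loss structure (variable names and the lambda value are illustrative; the paper optimizes this inside the seq2seq encoder/decoder with periodic updates of F):

```python
import numpy as np

def kmeans_regularizer(H, k):
    """Spectral relaxation of k-means: Tr(H^T H) - Tr(F^T H^T H F),
    with F the top-k eigenvectors of the Gram matrix H^T H."""
    G = H.T @ H                       # N x N Gram matrix of representations
    eigvals, eigvecs = np.linalg.eigh(G)  # ascending eigenvalues
    F = eigvecs[:, -k:]               # closed-form optimal F (top-k)
    return np.trace(G) - np.trace(F.T @ G @ F)

def dtcr_style_loss(X, X_rec, H, k, lam=0.5):
    """Total loss = temporal reconstruction error + lambda * k-means term.
    A flat numpy sketch; in DTCR, X_rec comes from the seq2seq decoder
    and H from the encoder."""
    rec = np.mean((X - X_rec) ** 2)
    return rec + lam * kmeans_regularizer(H, k)
```

Since H^T H is positive semi-definite, the regularizer equals the sum of the N-k smallest eigenvalues, so it is always non-negative and vanishes when k covers the full spectrum; minimizing it pushes the representations toward a rank-k, cluster-friendly structure.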

Personal opinion: this is unlikely to become a mainstream method, and the result figures include both classification and clustering comparisons; the results do not feel convincing.

For TSC, one should not simply stitch methods together. Moreover, the paper relies on an auxiliary classification task; in the real world, no auxiliary classification is available to help you cluster.

Supplementary knowledge:

1. Other TSC methods:

  • Encode time series into images and then apply CNNs etc., using encodings such as recurrence plots, Gramian angular summation fields, Gramian angular difference fields, and Markov transition fields.

2. PP: Motif difference field (MDF): A simple and effective image representation of time series for classification.

encode time series into MDF images.

This paper tries to include temporal information while encoding.
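For reference, the Gramian angular summation field (GASF) mentioned above can be computed in a few lines. This is a sketch of the standard GASF encoding, not the MDF variant proposed in this paper:

```python
import numpy as np

def gramian_angular_summation_field(x):
    """Encode a 1-D time series as a GASF image: rescale to [-1, 1],
    take phi = arccos(x), and set GASF[i, j] = cos(phi_i + phi_j)."""
    x = np.asarray(x, dtype=float)
    # min-max rescale into [-1, 1] so arccos is defined
    x = 2 * (x - x.min()) / (x.max() - x.min()) - 1
    phi = np.arccos(np.clip(x, -1.0, 1.0))
    # outer sum of angles, then cosine -> an n x n symmetric image
    return np.cos(phi[:, None] + phi[None, :])
```

Because cos(phi_i + phi_j) = x_i * x_j - sqrt(1 - x_i^2) * sqrt(1 - x_j^2), the image preserves temporal order along its diagonal, which is what makes these encodings usable as CNN inputs.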

Original source: https://www.cnblogs.com/dulun/p/12271672.html

Date: 2024-10-13 02:09:26
