A more detailed Readme. #19

Closed
tcapelle opened this issue Aug 15, 2020 · 1 comment

tcapelle commented Aug 15, 2020

Hello,
I am also interested in Transformers applied to time series, but I am having a hard time understanding the repo.
I would suggest adding more info to the Readme to show the capabilities of the library.

  • Is it a forecasting model? Multi-step ahead, single step?
  • Is it a classification model (like InceptionTime), or can it be adapted to do this?
  • Is it a regression model?

It would also be nice to have a small graph on the Readme, like the ones in the training notebooks, to show the potential.
Great work btw, and I am very interested in collaborating.
We run a Time Series Study group on the fast.ai forums:

  • Here for V1 of the library (soon to be outdated)
  • Here for the updated V2 fastai library.
@maxjcohen (Owner)

Hi, the Transformer implemented in this repo is very similar to the original model described in Attention Is All You Need; I would suggest heading there for more information. To answer your questions:

  • The Transformer is a coherent many-to-many model, i.e. we predict one output for each input (see the shape sketch after this list). With the current architecture, it is not suited for forecasting.
  • We implemented this Transformer with regression in mind, but you should be able to adapt it to classification; see for instance "transformer to be applied to classification" #18 (pinned for more visibility).
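
To make the many-to-many shape contract concrete, here is a minimal sketch. It uses PyTorch's stock nn.TransformerEncoder rather than this repo's classes, and the module name, dimensions, and hyperparameters are made up purely for illustration.

```python
import torch
import torch.nn as nn

class ManyToManyRegressor(nn.Module):
    """Illustrative only: one regression output per input time step."""
    def __init__(self, d_input, d_model, d_output, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Linear(d_input, d_model)  # project input features to model width
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, d_output)  # one output vector per time step

    def forward(self, x):                 # x: (batch, seq_len, d_input)
        z = self.encoder(self.embed(x))   # z: (batch, seq_len, d_model)
        return self.head(z)               # (batch, seq_len, d_output)

x = torch.randn(8, 100, 11)               # 8 series, 100 time steps, 11 features
y = ManyToManyRegressor(d_input=11, d_model=64, d_output=1)(x)
print(y.shape)                             # torch.Size([8, 100, 1])
```

For classification, one would typically pool over the time dimension and swap the regression head for class logits, which is the direction discussed in #18.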

If you want to add some modules, or modify the model itself (for forecasting or classification, for example), don't hesitate to fork and PR. And thanks for the links, I'll be sure to check them out!
