gluonts.mx.model.transformer.trans_decoder module#

class gluonts.mx.model.transformer.trans_decoder.TransformerDecoder(decoder_length: int, config: Dict, **kwargs)[source]#

Bases: HybridBlock

cache_reset()[source]#
hybrid_forward(F, data: Union[NDArray, Symbol], enc_out: Union[NDArray, Symbol], mask: Optional[Union[NDArray, Symbol]] = None, is_train: bool = True) Union[NDArray, Symbol][source]#

A transformer decoder block consists of a self-attention layer, an encoder-decoder attention layer over the encoder output, and a feed-forward layer, with pre/post process blocks in between.
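The data flow described above can be illustrated with a minimal NumPy sketch. This is not the GluonTS/MXNet implementation (which uses `HybridBlock` with learned projections and pre/post process blocks); learned weights are omitted, so each sublayer reduces to a residual connection around scaled dot-product attention or a simple feed-forward step. The function and variable names here are illustrative only.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v, mask=None):
    # Scaled dot-product attention: (batch, q_len, d) x (batch, k_len, d)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(q.shape[-1])
    if mask is not None:
        # Positions where mask is False are suppressed before softmax
        scores = np.where(mask, scores, -1e9)
    return softmax(scores) @ v

def decoder_block(x, enc_out, mask=None):
    # 1) Masked self-attention over the decoder inputs (residual connection)
    x = x + attention(x, x, x, mask)
    # 2) Encoder-decoder attention: queries from the decoder,
    #    keys/values from the encoder output
    x = x + attention(x, enc_out, enc_out)
    # 3) Position-wise feed-forward sublayer (ReLU stand-in, no weights)
    x = x + np.maximum(x, 0.0)
    return x

# Shapes: batch=2, decoder_length=4, encoder_length=5, model dim=8
x = np.random.rand(2, 4, 8)
enc_out = np.random.rand(2, 5, 8)
# Causal mask so position t only attends to positions <= t
causal = np.tril(np.ones((4, 4), dtype=bool))[None]
out = decoder_block(x, enc_out, causal)
```

During training (`is_train=True`) the whole target sequence is processed at once under the causal mask; at inference the real decoder instead decodes step by step, caching previous keys/values, which is what `cache_reset()` clears between forecasts.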