This is the official source code of "BiCAT: Sequential Recommendation with Bidirectional Chronological Augmentation of Transformer".

Overview

[Figure: BiCAT model overview]

This is our TensorFlow implementation for the paper "BiCAT: Sequential Recommendation with Bidirectional Chronological Augmentation of Transformer". Our code builds on the TensorFlow implementation of SASRec and on ASReP.

Environment

  • TensorFlow 1.12
  • Python 3.6.*
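
One way to set up this environment, assuming conda and pip (the environment name and the exact pins are our suggestion, not from the repo):

conda create -n bicat python=3.6
conda activate bicat
pip install tensorflow-gpu==1.12.0   # or: pip install tensorflow==1.12.0 for CPU-only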

Dataset Preparation

Benchmarks: the Amazon Review datasets Beauty and Cell_Phones_and_Accessories, and MovieLens. The data is split with the leave-one-out protocol. Make sure you download the datasets from the link. Use DataProcessing.py under data/, change the DATASET variable to your dataset name, and then run:

python DataProcessing.py

You will find the processed dataset in the directory with the name of your input dataset.
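
For reference, here is a minimal sketch of the leave-one-out split that the processed files follow (our illustration; the actual logic lives in DataProcessing.py):

from collections import defaultdict

def leave_one_out(interactions):
    """interactions: list of (user, item) pairs in chronological order."""
    seqs = defaultdict(list)
    for user, item in interactions:
        seqs[user].append(item)

    train, valid, test = {}, {}, {}
    for user, items in seqs.items():
        if len(items) < 3:  # too short to split; keep everything for training
            train[user], valid[user], test[user] = items, [], []
        else:
            train[user] = items[:-2]   # all but the last two items
            valid[user] = [items[-2]]  # second-to-last item for validation
            test[user] = [items[-1]]   # last item for testing
    return train, valid, test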

Beauty

1. Reverse Pre-training and Short-Sequence Augmentation

Pre-train the model on reversed sequences and generate 20 items for each sequence with length <= 20; a conceptual sketch of this augmentation step follows the command.

python main.py \
       --dataset=Beauty \
       --train_dir=default \
       --lr=0.001 \
       --hidden_units=128 \
       --maxlen=100 \
       --dropout_rate=0.7 \
       --num_blocks=2 \
       --l2_emb=0.0 \
       --num_heads=4 \
       --evalnegsample 100 \
       --reversed 1 \
       --reversed_gen_num 20 \
       --M 20
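
Conceptually, this step trains a Transformer on reversed sequences so it learns to predict earlier items, then prepends generated pseudo-prior items to short sequences. A rough sketch of the augmentation loop (reverse_model.predict_prior is a hypothetical placeholder, not the repo's actual API):

def augment_short_sequences(user_seqs, reverse_model, gen_num=20, threshold=20):
    """Prepend up to `gen_num` pseudo-prior items to each sequence of
    length <= `threshold`, using a model pre-trained on reversed sequences."""
    augmented = {}
    for user, seq in user_seqs.items():
        if len(seq) > threshold:
            augmented[user] = seq
            continue
        pseudo = []
        context = list(reversed(seq))  # feed the sequence newest-to-oldest
        for _ in range(gen_num):
            # hypothetical call: predict the most likely item preceding the context
            item = reverse_model.predict_prior(context + pseudo)
            pseudo.append(item)
        # generated items come out newest-to-oldest; flip back to chronological
        augmented[user] = list(reversed(pseudo)) + seq
    return augmented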

2. Next-Item Prediction with the Reverse-Pre-Trained Model and Augmented Dataset

python main.py \
       --dataset=Beauty \
       --train_dir=default \
       --lr=0.001 \
       --hidden_units=128 \
       --maxlen=100 \
       --dropout_rate=0.7 \
       --num_blocks=2 \
       --l2_emb=0.0 \
       --num_heads=4 \
       --evalnegsample 100 \
       --reversed_pretrain 1 \
       --aug_traindata 15 \
       --M 18
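
Here --evalnegsample 100 means the held-out item is ranked against 100 sampled negatives rather than the full catalog. A minimal sketch of that evaluation protocol (model_scores is a placeholder for the trained model's scoring function):

import math
import random

def eval_sampled(model_scores, target, all_items, seen, k=10, num_neg=100):
    """Rank `target` against `num_neg` sampled unseen items; return HR@k, NDCG@k."""
    negatives = set()
    while len(negatives) < num_neg:
        cand = random.choice(all_items)
        if cand != target and cand not in seen:
            negatives.add(cand)
    candidates = [target] + list(negatives)
    scores = model_scores(candidates)
    # position of the target (index 0) after sorting by descending score
    rank = sorted(range(len(candidates)), key=lambda i: -scores[i]).index(0)
    hr = 1.0 if rank < k else 0.0
    ndcg = 1.0 / math.log2(rank + 2) if rank < k else 0.0
    return hr, ndcg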

Cell_Phones_and_Accessories

1. Reverse Pre-training and Short-Sequence Augmentation

Pre-train the model on reversed sequences and generate 20 items for each sequence with length <= 20 (same procedure as for Beauty).

python main.py \
       --dataset=Cell_Phones_and_Accessories \
       --train_dir=default \
       --lr=0.001 \
       --hidden_units=32 \
       --maxlen=100 \
       --dropout_rate=0.5 \
       --num_blocks=2 \
       --l2_emb=0.0 \
       --num_heads=2 \
       --evalnegsample 100 \
       --reversed 1 \
       --reversed_gen_num 20 \
       --M 20

2. Next-Item Prediction with the Reverse-Pre-Trained Model and Augmented Dataset

python main.py \
       --dataset=Cell_Phones_and_Accessories \
       --train_dir=default \
       --lr=0.001 \
       --hidden_units=32 \
       --maxlen=100 \
       --dropout_rate=0.5 \
       --num_blocks=2 \
       --l2_emb=0.0 \
       --num_heads=2 \
       --evalnegsample 100 \
       --reversed_pretrain 1 \
       --aug_traindata 17 \
       --M 18

Citation

@misc{jiang2021sequential,
      title={Sequential Recommendation with Bidirectional Chronological Augmentation of Transformer}, 
      author={Juyong Jiang and Yingtao Luo and Jae Boum Kim and Kai Zhang and Sunghun Kim},
      year={2021},
      eprint={2112.06460},
      archivePrefix={arXiv},
      primaryClass={cs.IR}
}