Multi-Granularity Time-Based Transformer for Knowledge Tracing

EasyChair Preprint 9950, version 3
5 pages • Date: September 16, 2023

Abstract

In this paper, we present a transformer architecture for predicting student performance on standardized tests. Specifically, we leverage students' historical data, including their past test scores, study habits, and other relevant information, to create a personalized model for each student. We then use these models to predict their future performance on a given test. Applying this model to the RIIID dataset, we demonstrate that using multiple granularities for temporal features as the decoder input significantly improves model performance. Our results show substantial improvements over the LightGBM baseline. Our work contributes to the growing field of AI in education, providing a scalable and accurate tool for predicting student outcomes.

Keyphrases: Education, RIIID, deep learning, multi-granularity, transformer
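The abstract states that temporal features encoded at multiple granularities are fed to the transformer decoder. As an illustration only, the sketch below shows one way such a multi-granularity time embedding could be built in PyTorch; the granularities (seconds, minutes, hours, days), bucket counts, and class names are assumptions for illustration, not details taken from the paper.

import torch
import torch.nn as nn


class MultiGranularityTimeEmbedding(nn.Module):
    """Embed an elapsed-time signal (in seconds) at several temporal granularities.

    Hypothetical sketch: each granularity gets its own embedding table, and the
    per-granularity embeddings are summed into one vector per time step, which
    could then be added to the decoder's token embeddings.
    """

    def __init__(self, d_model: int, num_buckets: int = 64):
        super().__init__()
        # Assumed granularities: seconds, minutes, hours, days.
        self.granularities = (1, 60, 3600, 86400)
        self.num_buckets = num_buckets
        self.embeddings = nn.ModuleList(
            [nn.Embedding(num_buckets, d_model) for _ in self.granularities]
        )

    def forward(self, elapsed_seconds: torch.Tensor) -> torch.Tensor:
        # elapsed_seconds: (batch, seq_len) of non-negative values
        out = 0.0
        for divisor, emb in zip(self.granularities, self.embeddings):
            # Coarsen the raw signal, then clamp into a fixed number of buckets.
            bucket = (elapsed_seconds / divisor).long().clamp(0, self.num_buckets - 1)
            out = out + emb(bucket)
        return out  # (batch, seq_len, d_model)


if __name__ == "__main__":
    batch, seq_len, d_model = 2, 10, 128
    time_emb = MultiGranularityTimeEmbedding(d_model)
    elapsed = torch.randint(0, 100_000, (batch, seq_len)).float()
    print(time_emb(elapsed).shape)  # torch.Size([2, 10, 128])

In this sketch, summing the per-granularity embeddings keeps the decoder input dimensionality fixed regardless of how many granularities are used; concatenation followed by a projection would be an equally plausible design choice.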