Improving Performance Through Novel Enhanced Hierarchical Attention Neural Network

EasyChair Preprint 3722 • 7 pages • Date: July 3, 2020

Abstract

Big data and its classification are a persistent challenge in an evolving world: ever-growing data must be classified effectively. Deep learning and machine learning models have been developed for this task, and the Hierarchical Attention Network (HAN) is one of the most dominant neural network structures for classification. Its major drawbacks are high computation time and a large number of layers. We overcome these drawbacks with an idea drawn from mining methods, yielding a mixed attention network for Android data classification; with this design the network can handle more complex requests beyond the identified concept. The Enhanced Hierarchical Attention Network (EHAN) has two components: an attention model that distinguishes local features and a self-attention model that captures global facts. The demonstrated results show that this partitioning of the task is constructive, and EHAN achieves a significant improvement on the news dataset. In future work, further subnetworks could be added to strengthen this ability.

Keyphrases: Artificial Intelligence, Enhanced Hierarchical Attention Network, deep learning, machine learning
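The abstract does not give equations for the two EHAN components, so the following is only a minimal sketch of the standard HAN-style attention pooling (local features) followed by a scaled dot-product self-attention stage (global context). All dimensions, weights, and the synthetic "document" are hypothetical illustrations, not the authors' actual model.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(H, W, b, u):
    # H: (n, d) word hidden states; u is a learned context vector
    scores = np.tanh(H @ W + b) @ u          # (n,) attention scores
    alpha = softmax(scores)                  # weights sum to 1
    return alpha @ H                         # (d,) weighted sentence vector

def self_attention(X):
    # X: (n, d); scaled dot-product self-attention with Q = K = V = X
    d = X.shape[-1]
    A = softmax(X @ X.T / np.sqrt(d), axis=-1)
    return A @ X

rng = np.random.default_rng(0)
d = 8
# two "sentences" of 5 and 7 word vectors (synthetic data)
doc = [rng.normal(size=(5, d)), rng.normal(size=(7, d))]
W, b, u = rng.normal(size=(d, d)), np.zeros(d), rng.normal(size=d)

# component 1: word-level attention pools each sentence to one vector
S = np.stack([attention_pool(H, W, b, u) for H in doc])   # (2, d)
# component 2: sentence-level self-attention mixes in global context
doc_vec = self_attention(S).mean(axis=0)                  # (d,) document vector
print(doc_vec.shape)
```

The division of labor mirrors the abstract's description: the first stage scores individual features within a sentence, while the self-attention stage lets every sentence representation attend to every other, capturing document-wide facts.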