
List of Articles: Laxmi Ahuja

  • Open Access Article

        1 - Utilizing Gated Recurrent Units to Retain Long Term Dependencies with Recurrent Neural Network in Text Classification
        Nidhi Chandra, Laxmi Ahuja, Sunil Kumar Khatri, Himanshu Monga
        DOI: 10.52547/jist.9.34.89
        DOR: 20.1001.1.23221437.2021.9.34.2.9
        Text classification is one of the key areas of research in natural language processing. Most organizations receive customer reviews and feedback on their products and want to act on them quickly. Manual review takes substantial time and effort and may hurt product sales, so organizations increasingly ask their IT teams to apply machine learning algorithms that process such text in real time. The gated recurrent unit (GRU), an extension of the recurrent neural network that adds a gating mechanism to the network, provides such a capability. Recurrent neural networks (RNNs) have proven to be the main approach to sequence classification, retaining information from past outputs and using those outputs to adjust performance. The GRU model mitigates gradient problems, which lets it learn long-term dependencies in text data; use cases such as sentiment analysis benefit from this, since they require retaining long-term dependencies. This paper presents a text classification technique that processes sequential word embeddings through the sigmoid-gated recurrent units of a recurrent neural network. It focuses on classifying text with the GRU method, which uses a framework of fixed-size embedded text matrices and explicitly informs the network of long-term dependencies. We evaluated the GRU model on the movie review dataset and achieved a classification accuracy of 87%.
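        The gating mechanism the abstract describes can be sketched as a minimal GRU cell in NumPy. This is an illustrative sketch of the standard GRU equations (update gate, reset gate, candidate state), not the authors' implementation; the class names, dimensions, and random weights below are hypothetical:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell: update gate z, reset gate r, candidate state h_tilde."""
    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_dim)
        # one weight matrix per gate, acting on the concatenation [x_t ; h_{t-1}]
        self.Wz = rng.uniform(-s, s, (hidden_dim, input_dim + hidden_dim))
        self.Wr = rng.uniform(-s, s, (hidden_dim, input_dim + hidden_dim))
        self.Wh = rng.uniform(-s, s, (hidden_dim, input_dim + hidden_dim))

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh)                                 # update gate
        r = sigmoid(self.Wr @ xh)                                 # reset gate
        h_tilde = np.tanh(self.Wh @ np.concatenate([x, r * h]))   # candidate state
        # interpolate between the old state and the candidate:
        # z near 0 keeps the old state, preserving long-term dependencies
        return (1.0 - z) * h + z * h_tilde

def encode(cell, sequence, hidden_dim):
    """Run the cell over a sequence of word embeddings; the final
    hidden state is the fixed-size representation of the text."""
    h = np.zeros(hidden_dim)
    for x in sequence:
        h = cell.step(x, h)
    return h

# toy usage: 5 "word embeddings" of dimension 8, hidden size 16
cell = GRUCell(input_dim=8, hidden_dim=16)
seq = [np.random.default_rng(i).normal(size=8) for i in range(5)]
h_final = encode(cell, seq, 16)
```

        In a classifier such as the sentiment model described above, `h_final` would feed a small output layer (e.g. a sigmoid over a linear projection) to produce the class label.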
The rights to this website are owned by the Raimag Press Management System.
Copyright © 2017-2023
