
International Journal on Science and Technology (IJSAT)
E-ISSN: 2229-7677 | Impact Factor: 9.88
A Widely Indexed Open Access Peer Reviewed Multidisciplinary Bi-monthly Scholarly International Journal
Pre-Trained Language Models development using Contrastive Framework for Semi-Supervised Fine-Tuning
Author(s) | Mr. Rohit Singh |
---|---|
Country | India |
Abstract | The rapid advancements in pre-trained language models (PLMs) have revolutionized natural language processing (NLP), delivering strong performance across diverse tasks. However, their efficacy diminishes in low-resource domains with limited labeled data, where extracting task-specific semantics becomes challenging. This limitation is particularly pronounced in mission-critical applications such as military operations, where the availability of labeled datasets is constrained by security and operational restrictions. To address these challenges, this paper proposes a novel Contrastive Framework for Semi-Supervised Fine-Tuning of PLMs. By integrating contrastive learning with semi-supervised techniques, the framework enables PLMs to effectively leverage both labeled and unlabeled data, enhancing their ability to generalize in low-resource settings. The study focuses on creating a customized, domain-specific language model tailored to the unique linguistic and operational requirements of the Indian Army, addressing critical tasks such as secure communication, multilingual processing, and intelligence analysis. |
Keywords | NLP, LLM, machine learning, fine tuning |
Field | Engineering |
Published In | Volume 16, Issue 3, July-September 2025 |
Published On | 2025-07-15 |
DOI | https://doi.org/10.71097/IJSAT.v16.i3.6917 |
Short DOI | https://doi.org/g9s9v6 |
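The abstract combines a supervised objective on labeled data with a contrastive objective on unlabeled data. The paper's exact formulation is not given on this page, so the following is only a minimal sketch of one common instantiation: a SimCLR-style NT-Xent loss over two augmented views of unlabeled examples, added to softmax cross-entropy on the labeled batch. All function names, the temperature, and the mixing weight `lam` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def normalize(x):
    # L2-normalize embedding rows so dot products are cosine similarities
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent contrastive loss over two views (SimCLR-style, an assumption here).

    z1, z2: (N, d) embeddings of two augmentations of the same N examples.
    Row i of z1 and row i of z2 form a positive pair; all other rows are negatives.
    """
    z = normalize(np.concatenate([z1, z2], axis=0))   # (2N, d)
    n = z1.shape[0]
    sim = z @ z.T / temperature                       # pairwise similarities
    np.fill_diagonal(sim, -1e9)                       # exclude self-similarity
    # positive for row i is row i+n (and vice versa)
    pos_idx = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    return float(np.mean(logsumexp - sim[np.arange(2 * n), pos_idx]))

def cross_entropy(logits, labels):
    """Mean softmax cross-entropy on the labeled batch."""
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return float(-log_probs[np.arange(len(labels)), labels].mean())

def semi_supervised_loss(labeled_logits, labels, z1, z2, lam=0.5):
    """Supervised loss on labeled data plus a weighted contrastive term
    on unlabeled views; `lam` is a hypothetical mixing weight."""
    return cross_entropy(labeled_logits, labels) + lam * nt_xent_loss(z1, z2)
```

In a fine-tuning loop, `labeled_logits` would come from the PLM's classification head on the labeled batch, while `z1`/`z2` would be pooled encoder embeddings of two augmentations (e.g. dropout noise or back-translation) of the unlabeled batch; the contrastive term lets the unlabeled data shape the representation space even when labels are scarce.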
A CrossRef DOI is assigned to each research paper published in this journal; the IJSAT DOI prefix is 10.71097/IJSAT.
All research papers published on this website are licensed under Creative Commons Attribution-ShareAlike 4.0 International License, and all rights belong to their respective authors/researchers.
