Self-Supervised Contrastive Embeddings for Semantic Enrichment: Contextual Alignment and Entity Linking

Authors

  • Simona Vasilica, Department of Economic Informatics, Bucharest University of Economic Studies, Bucharest, Romania
  • Adela Bara, Department of Economic Informatics, Bucharest University of Economic Studies, Bucharest, Romania

Keywords:

contrastive learning, semantic enrichment, entity linking, knowledge graphs, contextual alignment, retrieval, candidate generation

Abstract

Semantic enrichment benefits from representations that respect both textual context and knowledge-graph structure. Building on the bibliometric baseline of the field [1], we propose a self-supervised contrastive framework that learns sentence- and mention-level embeddings aligned across three natural positive signals: (i) co-mention and coreference within documents, (ii) adjacency in a knowledge graph, and (iii) co-citation/co-reference at the article level. Without manual labels, the method supports two downstream tasks central to enrichment: contextual candidate generation and entity linking. Across three domains (ontology/linked data, biomedical, social streams), our approach improves candidate recall@50 by 6–12% and end-to-end linking F1 by 3–6% over strong neural baselines. Ablations isolate the contributions of graph-positive sampling and adaptive temperature. We release scripts to reproduce the figures (loss curves, PR curves, embedding scatter) and tables (dataset summary, ablations), designed to compile with this template.
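The abstract's core training signal is a standard contrastive (InfoNCE) objective in which embeddings of an anchor and its positive (a co-mentioned entity, a graph neighbor, or a co-cited article) are pulled together while in-batch negatives are pushed apart, scaled by a temperature parameter. The paper's exact loss and its adaptive-temperature schedule are not given on this page, so the following NumPy sketch shows only the generic fixed-temperature form; the function name and shapes are illustrative assumptions.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.07):
    """Generic InfoNCE contrastive loss.

    anchors, positives: (N, d) arrays; row i of `positives` is the
    positive pair for row i of `anchors`, and all other rows in the
    batch act as negatives.
    """
    # L2-normalize so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # matched (anchor, positive) pairs sit on the diagonal
    return -np.mean(np.diag(log_probs))
```

Lowering the temperature sharpens the softmax, which is presumably what the paper's adaptive schedule modulates during training.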

Published

2024-06-15

Section

Articles

How to Cite

Self-Supervised Contrastive Embeddings for Semantic Enrichment: Contextual Alignment and Entity Linking. (2024). International Journal of Industrial Engineering and Construction Management (IJIECM), 1(1), 14-19. https://www.ijiecm.com/index.php/ijiecm/article/view/58
