DFKI NLP
Evaluating German Transformer Language Models with Syntactic Agreement Tests
Pre-trained transformer language models (TLMs) have recently refashioned natural language processing (NLP): Most state-of-the-art NLP …
Karolina Zaczynska, Nils Feldhus, Robert Schwarzenberg, Aleksandra Gabryszak, Sebastian Möller
PDF · Cite · Project
Abstractive Text Summarization based on Language Model Conditioning and Locality Modeling
We explore to what extent knowledge about the pre-trained language model that is used is beneficial for the task of abstractive …
Dmitrii Aksenov, Julian Moreno Schneider, Peter Bourgonje, Robert Schwarzenberg, Leonhard Hennig, Georg Rehm
PDF · Cite
Layerwise Relevance Visualization in Convolutional Text Graph Classifiers
Representations in the hidden layers of Deep Neural Networks (DNNs) are often hard to interpret since it is difficult to project them …
Robert Schwarzenberg, Marc Hübner, David Harbecke, Christoph Alt, Leonhard Hennig
PDF · Cite · Code
Cross-lingual Neural Vector Conceptualization
Recently, Neural Vector Conceptualization (NVC) was proposed as a means to interpret samples from a word vector space. For NVC, a …
Lisa Raithel, Robert Schwarzenberg
PDF · Cite · Code
Fine-Tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction
We show that generative language model pre-training combined with selective attention improves recall for long-tail relations in distantly supervised neural relation extraction.
Christoph Alt, Marc Hübner, Leonhard Hennig
PDF · Cite · Code · Poster
A Crowdsourcing Approach to Evaluate the Quality of Query-based Extractive Text Summaries
We analyze the feasibility and appropriateness of micro-task crowdsourcing for evaluating different summary quality characteristics and report on ongoing work on the crowdsourced evaluation of query-based extractive text summaries.
Neslihan Iskender, Aleksandra Gabryszak, Tim Polzehl, Leonhard Hennig, Sebastian Möller
PDF · Cite
Enriching BERT with Knowledge Graph Embeddings for Document Classification
In this paper, we focus on the classification of books using short descriptive texts (cover blurbs) and additional metadata. Building …
Malte Ostendorff, Peter Bourgonje, Maria Berger, Julian Moreno Schneider, Georg Rehm, Bela Gipp
PDF · Code
Improving Relation Extraction by Pre-Trained Language Representations
We show that transfer learning through generative language model pre-training improves supervised neural relation extraction, achieving new state-of-the-art performance on TACRED and SemEval 2010 Task 8.
Christoph Alt, Marc Hübner, Leonhard Hennig
PDF · Cite · Code · Poster · Slides
Neural Vector Conceptualization for Word Vector Space Interpretation
Distributed word vector spaces are considered hard to interpret, which hinders the understanding of natural language processing (NLP) …
Robert Schwarzenberg, Lisa Raithel, David Harbecke
PDF · Cite · Code
Train, Sort, Explain: Learning to Diagnose Translation Models
Evaluating translation models is a trade-off between effort and detail. On the one end of the spectrum there are automatic count-based …
Robert Schwarzenberg, David Harbecke, Vivien Macketanz, Eleftherios Avramidis, Sebastian Möller
PDF · Cite · Code