10th Workshop on Automated Knowledge Base Construction
While Large Language Models (LLMs) have revolutionized NLP, they remain prone to hallucinations, reasoning “mode-collapse” in open-ended generation, and a lack of factual provenance. The Automated Knowledge Base Construction (AKBC) workshop addresses a key missing piece of the generative era: structured knowledge. Knowledge Bases (KBs) serve as ground truth for fact verification, as the semantic backbone for constrained decoding in generation, and as a resource behind Retrieval-Augmented Generation (RAG). The workshop contributes to the growing momentum around integrating structured knowledge into generative models, both at training time and at inference time.
The workshop is co-located with EMNLP 2026 in Budapest, Hungary. It follows a successful series of previous editions: as an independent conference in 2022, 2021, and 2020, as a workshop in 2017 at NIPS, in 2016 at NAACL, in 2014 at NIPS, in 2013 at CIKM, in 2012 at NAACL, and in 2010 as a stand-alone event in Grenoble, France.
News
- April 13th, 2026: The AKBC 2026 web page goes live
Important Dates
- Direct submission (research + vision): July 27, 2026 (Monday)
- Direct submission (shared task): August 15, 2026 (Saturday)
- ARR commitment (research): August 25, 2026 (Tuesday)
- Notification: September 1, 2026 (Tuesday)
- Camera-ready due: September 15, 2026 (Tuesday)
- Workshop date: one day during October 24–29, 2026
Call for Papers
The 10th Workshop on Automated Knowledge Base Construction (AKBC) returns in 2026, bringing together researchers and practitioners working on the construction, integration, and use of structured knowledge in the era of large language models (LLMs). As LLMs continue to transform NLP, challenges such as hallucinations, lack of provenance, and limited reasoning reliability highlight the need for robust, explicit, and usable knowledge representations.
AKBC provides a venue at the intersection of natural language processing, knowledge representation, databases, and machine learning, with a particular focus on how symbolic structure can ground, constrain, and enhance generative models.
We invite submissions on topics including, but not limited to:
Knowledge for Generative Models
- Knowledge-aware pretraining and fine-tuning
- Factuality, attribution, and verification in generation
- Neuro-symbolic methods and hybrid models
- Injecting and editing knowledge in LLMs
Building and Maintaining Knowledge
- Knowledge extraction and consolidation from text and multimodal data
- Knowledge graphs, ontologies, and schema alignment
- Knowledge base construction, completion, and continual updates
Retrieval and Reasoning
- Retrieval-augmented generation (RAG) with structured sources
- Graph-based and knowledge-intensive question answering
- Multi-hop reasoning and interaction with KBs
Vision Papers
- New roles for structured knowledge in generative models
- Knowledge-aware training objectives, pretraining, and fine-tuning
- New architectures for combining symbolic knowledge and generative models
- Benchmarks, evaluations, and research agendas for knowledge-grounded generation
- Position papers on the future of knowledge bases, reasoning, and trustworthy AI
Submission Types
We welcome three types of submissions, reflecting both mature research results and forward-looking ideas at the intersection of structured knowledge and generative AI. All submissions must follow the EMNLP formatting instructions.
1. Regular Research Papers (via ARR or direct submission)
For original research contributions. We accept papers reviewed through ACL Rolling Review (ARR): authors may submit to ARR and commit their papers to AKBC, or submit directly to the workshop. Reviewing follows standard ACL/EMNLP policies. The page limit is 8 pages + references.
2. Vision Papers (direct submission)
For bold ideas, emerging directions, and unifying perspectives. We particularly encourage papers that articulate new opportunities and challenges for knowledge base construction in the age of LLMs, and that help define promising research agendas for the field. The page limit is 4 pages + references.
3. Shared Task Papers (direct submission)
For system descriptions and analyses related to the AKBC shared task, co-located with the workshop. These submissions should describe participating systems, methodologies, and lessons learned from the challenge. The page limit is 4 pages + references.
Presentation Formats
Accepted papers will be presented as posters, with a selection invited for lightning talks. The workshop will also feature invited keynotes from leading researchers in academia and industry.
Invited Speakers
Invited speakers will be announced once confirmed. Speakers at previous AKBC workshops and conferences were from Google AI, HuggingFace, DeepMind, Apple, Stanford, the University of Washington, UCSC, CMU, and others.
Organization
Jan-Christoph Kalo, University of Amsterdam, Netherlands
Ndapa Nakashole, University of California, San Diego, USA
Simon Razniewski, ScaDS.AI Dresden/Leipzig & TU Dresden, Germany
Fabian M. Suchanek, Télécom Paris, Institut Polytechnique de Paris, France
Andreas Vlachos, University of Cambridge, United Kingdom
Andrew McCallum, University of Massachusetts Amherst, USA
Contact us at akbc2026@gmail.com for workshop inquiries, or at akbc2026-shared-task@googlegroups.com for shared task inquiries!