Exploiting BERT for End-to-End Aspect-based Sentiment Analysis

Xin Li1, Lidong Bing2, Wenxuan Zhang3, Wai Lam3
1Department of Systems Engineering and Engineering Management, The Chinese University of Hong Kong, 2Alibaba DAMO Academy, 3The Chinese University of Hong Kong


In this paper, we investigate the modeling power of contextualized embeddings from pre-trained language models, e.g., BERT, on the end-to-end aspect-based sentiment analysis (E2E-ABSA) task. Specifically, we build a series of simple yet insightful neural baselines for E2E-ABSA. The experimental results show that, even with a simple linear classification layer, our BERT-based architecture can outperform state-of-the-art models. We also standardize the comparative study by consistently using a hold-out validation set for model selection, a practice largely ignored in previous work. Our work can therefore serve as a BERT-based benchmark for E2E-ABSA.
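To make the "simple linear classification layer" concrete, the sketch below shows greedy per-token tagging over contextualized embeddings with a unified tag set (boundary plus sentiment, e.g. B-POS, plus O for non-aspect tokens). This is a minimal NumPy illustration, not the paper's implementation: the random matrix `H` stands in for actual BERT token representations, and the names `TAGS`, `W`, and `b` are illustrative assumptions.

```python
import numpy as np

# Hypothetical unified tag set: boundary (B/I/E/S) x sentiment (POS/NEG/NEU),
# plus O for tokens outside any aspect term.
TAGS = ["O",
        "B-POS", "I-POS", "E-POS", "S-POS",
        "B-NEG", "I-NEG", "E-NEG", "S-NEG",
        "B-NEU", "I-NEU", "E-NEU", "S-NEU"]

rng = np.random.default_rng(0)

seq_len, hidden = 6, 768  # 768 = BERT-base hidden size
# Stand-in for contextualized token embeddings produced by BERT.
H = rng.standard_normal((seq_len, hidden))

# The linear classification layer: one affine map shared across positions.
W = rng.standard_normal((hidden, len(TAGS))) * 0.02
b = np.zeros(len(TAGS))
logits = H @ W + b                      # shape: (seq_len, num_tags)

# Softmax over tags, then greedy decoding: one tag per token.
probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
probs /= probs.sum(axis=-1, keepdims=True)
pred = [TAGS[i] for i in probs.argmax(axis=-1)]
print(pred)
```

In the full model, decoding the tag sequence directly yields both the aspect term spans and their sentiment polarities in one pass, which is what makes the task end-to-end.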