Amazing Work from GEC Students

Develop. Grow. Succeed.

Title: Disaster Detector on Twitter Using Bidirectional Encoder Representation from Transformers with Keyword Position Information

Abstract: Deep learning, one of the most remarkable machine learning techniques today, has achieved great success in many applications such as image analysis, speech recognition, and text understanding. This work applies the bidirectional encoder representation from Transformers (BERT) model to a particular real-life problem: detecting disasters on Twitter. Specifically, we first validate the practicability of BERT for disaster detection, and then improve the model by extracting keyword information to classify the text. The experimental results show that our method predicts texts announcing a disaster more accurately than models without keyword position information. The results of this research will be helpful for monitoring and tracking social media content and for discovering what people's descriptions of disasters are like.
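
As a rough illustration of the general setup the abstract describes, the minimal sketch below fine-tune-ready BERT tweet classifier uses the Hugging Face transformers library. The example tweet, the keyword, and the choice to pass the keyword as a second input segment are illustrative assumptions for the sketch only, not the authors' exact way of encoding keyword position information.

```python
# Minimal illustrative sketch (not the paper's implementation): a BERT
# sequence classifier over tweets, with the keyword supplied as a second
# segment so the model can use information about the keyword alongside the text.
import torch
from transformers import BertTokenizerFast, BertForSequenceClassification

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

def encode(tweet: str, keyword: str):
    # Encoding the (tweet, keyword) pair gives the keyword its own segment;
    # this is one simple stand-in for keyword position information.
    return tokenizer(tweet, keyword, truncation=True, max_length=64,
                     padding="max_length", return_tensors="pt")

# Hypothetical example tweet and keyword, for illustration only.
inputs = encode("Forest fire near La Ronge Sask. Canada", "wildfire")
model.eval()
with torch.no_grad():
    logits = model(**inputs).logits
# Label 1 = disaster, 0 = not a disaster (meaningful only after fine-tuning).
print("disaster" if logits.argmax(dim=-1).item() == 1 else "not a disaster")
```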
 

Z. Wang, T. Zhu and S. Mai, "Disaster Detector on Twitter Using Bidirectional Encoder Representation from Transformers with Keyword Position Information," 2020 IEEE 2nd International Conference on Civil Aviation Safety and Information Technology (ICCASIT), 2020, pp. 474-477, doi: 10.1109/ICCASIT50869.2020.9368610.