April 22–26, 2019: Short Course on Natural Language Processing — Jing He
Source: School of Computer Science | Published: 2019-04-18
Dr. Jing He is a professor in the School of Software and Electrical Engineering, Swinburne University of Technology. She was awarded a PhD degree from the Academy of Mathematics and Systems Science, Chinese Academy of Sciences, in 2006. Prior to joining Victoria University, she worked at the University of Chinese Academy of Sciences, China, from 2006 to 2008. She has been active in the areas of algorithms and chips, artificial intelligence, data mining, web services/web search, spatial and temporal databases, multiple criteria decision making, intelligent systems, and scientific workflows, as well as industry fields such as e-health, petroleum exploration and development, water resource management, and e-research. She has published over 160 research papers in refereed international journals and conference proceedings, including ACM Transactions on Internet Technology (TOIT), IEEE Transactions on Knowledge and Data Engineering (TKDE), Information Systems, The Computer Journal, Computers and Mathematics with Applications, Concurrency and Computation: Practice and Experience, International Journal of Information Technology & Decision Making, Applied Soft Computing, and Water Resources Management. Since 2008 she has received over 2.5 million Australian dollars in research funding from the Australian Research Council (ARC), including an ARC Early Career Researcher Award (DECRA), an ARC Discovery Project, and an ARC Linkage Project, as well as from the National Natural Science Foundation of China (NSFC).
Natural Language Processing
This class will give an accessible introduction to Natural Language Processing (NLP), a sub-field of Artificial Intelligence. NLP focuses on teaching computers to understand and process human languages, bringing them closer to a human-level understanding of language.
This course covers a wide range of tasks in Natural Language Processing, from basic to advanced: sentiment analysis, summarization, and dialogue state tracking, to name a few. Upon completion, you will be able to recognize NLP tasks in your day-to-day work, propose approaches, and judge which techniques are likely to work well. The final project is devoted to one of the hottest topics in natural language processing.
The project will be based on the practical assignments of the course, which will give you hands-on experience with tasks such as text classification, named entity recognition, and duplicate detection. Throughout the lectures, we will aim to strike a balance between traditional and deep learning techniques in NLP and cover them in parallel. For example, we will discuss word alignment models in machine translation and see how similar they are to the attention mechanism in encoder-decoder neural networks. Core techniques are not treated as black boxes. On the contrary, you will gain an in-depth understanding of what is happening inside. To succeed, we expect familiarity with the basics of linear algebra and probability theory, the general machine learning setup, and deep neural networks. Some materials are based on papers published only a month ago and will introduce you to the very state of the art in NLP research.
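To make the alignment/attention parallel concrete, here is a minimal sketch (not course material, just an illustration in plain Python) of single-query dot-product attention. The softmax weights it produces over the source positions play the same role as the soft word alignments in classical machine translation models: they say how much each source token contributes to the current output.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot_product_attention(query, keys, values):
    """Single-query dot-product attention (illustrative sketch).

    query: list[float]; keys, values: lists of list[float].
    Returns (context_vector, attention_weights). The weights are a
    soft alignment: one probability per source position.
    """
    # Score each key by its dot product with the query.
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    weights = softmax(scores)
    # Context vector = attention-weighted average of the values.
    dim = len(values[0])
    context = [sum(w * v[d] for w, v in zip(weights, values))
               for d in range(dim)]
    return context, weights

# Toy example: the query is most similar to the second key,
# so most of the attention mass lands on the second value.
q = [1.0, 0.0]
K = [[0.0, 1.0], [4.0, 0.0], [0.0, -1.0]]
V = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
context, weights = dot_product_attention(q, K, V)
```

In a real encoder-decoder network the queries, keys, and values are learned vectors, but the mechanics are exactly these few lines.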