- Michal Farkaš: NL-FIIT at SemEval-2019 Task 3: Emotion Detection From Conversational Triplets Using Hierarchical Encoders
- Peter Ocelik: Poetry Generation via Deep Neural Networks
- Samuel Pecár: Automatic Text Summarization of Customer Reviews
- Lenka Pejchalová: Joke Generation Using Deep Neural Networks
- Matúš Pikuliak: Cross-Lingual Learning
NL-FIIT at SemEval-2019 Task 3: Emotion Detection From Conversational Triplets Using Hierarchical Encoders
Michal Farkaš
Abstract: In this paper, we present our system submission for EmoContext, the third task of the SemEval 2019 workshop. Our solution is a hierarchical recurrent neural network with ELMo embeddings, regularized with dropout and Gaussian noise. We decided to focus on the model architecture and training methods rather than on data preprocessing. We experimented mainly with two model architectures, a simple and a hierarchical LSTM network, and also examined ensembling and several ensemble variants. We achieved a micro F1 score of 74.81%, which is significantly higher than the baseline and currently the 19th best submission.
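For illustration, the following is a minimal PyTorch sketch of the hierarchical encoder idea described above: a word-level BiLSTM encodes each utterance of the triplet, an utterance-level BiLSTM encodes the sequence of utterance vectors, and dropout plus Gaussian noise provide regularization. The layer sizes, max-pooling, and the use of pre-computed contextual embeddings in place of ELMo are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of a hierarchical encoder for conversational triplets.
# Dimensions and the use of pre-computed embeddings instead of ELMo are
# illustrative assumptions, not the submitted system's configuration.
import torch
import torch.nn as nn

class HierarchicalEmotionClassifier(nn.Module):
    def __init__(self, emb_dim=1024, word_hidden=256, utt_hidden=256,
                 num_classes=4, dropout=0.3, noise_std=0.1):
        super().__init__()
        self.noise_std = noise_std                    # Gaussian noise regularization
        self.dropout = nn.Dropout(dropout)
        # Word-level encoder: runs over tokens within each utterance.
        self.word_lstm = nn.LSTM(emb_dim, word_hidden, batch_first=True,
                                 bidirectional=True)
        # Utterance-level encoder: runs over the three utterance representations.
        self.utt_lstm = nn.LSTM(2 * word_hidden, utt_hidden, batch_first=True,
                                bidirectional=True)
        self.classifier = nn.Linear(2 * utt_hidden, num_classes)

    def forward(self, triplet):
        # triplet: (batch, 3, seq_len, emb_dim) contextual word embeddings
        if self.training and self.noise_std > 0:
            triplet = triplet + torch.randn_like(triplet) * self.noise_std
        batch, n_utts, seq_len, emb_dim = triplet.shape
        words = triplet.view(batch * n_utts, seq_len, emb_dim)
        word_out, _ = self.word_lstm(self.dropout(words))
        utt_vecs = word_out.max(dim=1).values         # max-pool tokens per utterance
        utt_vecs = utt_vecs.view(batch, n_utts, -1)
        utt_out, _ = self.utt_lstm(self.dropout(utt_vecs))
        return self.classifier(utt_out[:, -1])        # predict from the last utterance state

model = HierarchicalEmotionClassifier()
logits = model(torch.randn(8, 3, 20, 1024))           # e.g. a batch of 8 triplets
```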
Poetry Generation via Deep Neural Networks
Peter Ocelik
Abstract: Natural language generation is one of the many problems that computational linguistics addresses. Applications range from generating CVs, weather forecasts, and dialogues to creative texts such as stories, jokes, and poetry. In recent years, poetry generation has become an interesting research problem. Compared to other texts, poetry relies on various creative features, involving phonetics, lexicology, and syntax, and it requires a lot of input information. Although several evaluation metrics exist, they are ambiguous and correlate poorly with human evaluation. Chinese poetry in particular is widely studied, while work on English lags far behind.
We will analyze the existing approaches, methods, and architectures used to generate poetry today. We will propose a new method, or improve an existing one, for generating English poetry with deep neural networks. We will then implement the solution and compare the results with existing approaches.
Joke Generation Using Deep Neural Networks
Lenka Pejchalová
Abstract: In the last few decades, the development of informatics and information technology has opened new opportunities for exploration. In particular, advances in artificial intelligence and the successful use of neural network models have been a major step. One such area is text generation, which covers a wide range of tasks, from generating reviews, stories, poems, and song lyrics to generating article titles or jokes. Joke generation is the subject of computational humor. Although its direct use may not be obvious, the main goal of this area is to make communication between devices and people more enjoyable.
In recent years, a large number of devices containing intelligent assistants, such as smartphones and laptops, have entered people's lives. Interaction with such non-human assistants still feels very unnatural to ordinary people. Since humor acts as an ice-breaker in ordinary interpersonal communication, its use in intelligent assistants could make the interaction feel more human.
The goal of my paper is to analyze the current state of computational humor, including datasets and the approaches and methods applied, and to design a new approach or extend an existing one. Later stages will cover the implementation itself and verification of how successfully it generates meaningful jokes.
Automatic Text Summarization of Customer Reviews
Samuel Pecár
Abstract: In recent years, the amount of text has grown rapidly. For example, most review-based portals, such as Yelp or Amazon, contain thousands of user-generated reviews. It is impossible for any human reader to process even the most relevant of these documents. The most promising tool to solve this task is text summarization. Most existing approaches, however, work on small, homogeneous, English datasets and do not account for multilinguality, opinion shift, or domain effects. In this paper, we introduce our research plan to use neural networks on user-generated travel reviews to generate summaries that take shifting opinions over time into account. We outline future directions in summarization to address all of these issues. By resolving the existing problems, we will make it easier for users of review sites to make more informed decisions.
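As a rough illustration of the kind of time-aware review summarization this plan describes, the sketch below groups reviews by year and summarizes each group with an off-the-shelf abstractive model. The model choice, the grouping by year, and the example reviews are hypothetical and only show the general idea, not the method proposed here.

```python
# Illustrative sketch only: summarize review groups per year with a generic
# pretrained abstractive model. Not the proposed approach from the abstract.
from collections import defaultdict
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

# Hypothetical user-generated travel reviews with timestamps.
reviews = [
    {"year": 2017, "text": "Great location, but the rooms felt dated."},
    {"year": 2019, "text": "Renovated rooms, friendly staff, excellent breakfast."},
]

by_year = defaultdict(list)
for review in reviews:
    by_year[review["year"]].append(review["text"])

# Summarizing each period separately exposes how opinions shift over time.
for year, texts in sorted(by_year.items()):
    joined = " ".join(texts)
    summary = summarizer(joined, max_length=60, min_length=10, do_sample=False)
    print(year, summary[0]["summary_text"])
```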
Cross-Lingual Learning
Matúš Pikuliak