Overall comparison of concepts from the literature survey.
| Model | Advantages | Suited problem statements | Chatbot applications using the model |
|---|---|---|---|
| ANN [17–19] | Self-organizing in response to changes in information; fault-tolerant to the corruption of cells and to missing input values | Well suited for classification and regression involving a large number of variables; character recognition and image processing are typical applications | eHealth Chatbot, Tourism Chatbot, Quiz Generation and Answering Chatbot |
| RNN [20–22] | Adapts well to quick changes in the input nodes; supports variable-size input and output vectors; works well with contextual input sequences; excels at modelling temporal structure | Appropriate for sequence prediction, classification, natural language processing, and generative modelling; hence useful for text generation and for predicting attribute values in a problem statement | Combating Depression, Emotion Recognition, Technical Support Automation |
| Ensemble Learning [23–25] | Better generalization ability; weak models can be boosted into efficient learners; growing computational power makes ensemble models increasingly practical | Can be used to enhance the performance of existing models such as RNN, LSTM, and GRU | Diabetes Personalised Treatment Chatbot, Fashion Recommender Chatbot |
| LSTM [26–28] | Extended memory capabilities compared with RNN; handles long-term dependencies well; more robust to vanishing gradients than RNN | A natural choice for problems such as time-series forecasting; applied in conversational agents, handwriting generation, language translation, and image captioning | Twitterbot, AgriBot, Midoriko Anime Chatbot |
| GRU [29,30] | Handles long-term dependencies effectively; robust to the vanishing gradient problem; more computationally efficient than LSTM | Used in time-series prediction applications such as text generation and classification | Cultural Heritage website chatbot, Papaya English dialogue chatbot |
| NTM [31,32] | Generalizes to longer inputs better than LSTM; external memory complements the RNN's existing memory | Well suited to models with massive and extended sequences of data; NTM solutions have been shown to generalize well for basic algorithms such as copying and sorting | Agnostic Chatbot, Arabic Chatbot |
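To make the GRU row concrete, the sketch below implements a single GRU step in NumPy: an update gate and a reset gate control how much of the previous hidden state is carried forward, which is what gives GRU (and LSTM) their robustness to vanishing gradients. This is an illustrative toy with random weights and dimensions chosen here, not code from any of the surveyed systems; note the GRU needs only three weight sets per input (update, reset, candidate) versus the LSTM's four, which is why the table calls it more computationally efficient.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU time step (biases omitted for brevity)."""
    z = sigmoid(Wz @ x + Uz @ h_prev)              # update gate: how much new state to admit
    r = sigmoid(Wr @ x + Ur @ h_prev)              # reset gate: how much old state to forget
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))  # candidate hidden state
    return (1 - z) * h_prev + z * h_tilde          # interpolate old and candidate states

# Toy dimensions: input size 3, hidden size 2, random weights
rng = np.random.default_rng(0)
x, h = rng.standard_normal(3), np.zeros(2)
Wz, Wr, Wh = (rng.standard_normal((2, 3)) for _ in range(3))
Uz, Ur, Uh = (rng.standard_normal((2, 2)) for _ in range(3))
h_new = gru_cell(x, h, Wz, Uz, Wr, Ur, Wh, Uh)
```

Because the output is a convex combination of the previous state and a tanh-bounded candidate, each hidden unit stays in (-1, 1); unrolling this cell over a token sequence yields the text-generation and classification uses listed in the table.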