Transformers are a family of deep learning models based on attention mechanisms. First proposed by Vaswani et al. in 2017, these models have achieved state-of-the-art results on many natural language processing tasks. Transformers have outperformed recurrent networks largely by harnessing transfer learning, whereby models are pretrained on data-rich tasks, such as language modelling, and then fine-tuned on downstream tasks of interest, such as summarization. Transfer learning has also allowed multilingual transformer models to generalize across languages and achieve better performance on low-resource languages.
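The attention mechanism at the core of these models can be illustrated with a minimal sketch of scaled dot-product attention, the building block introduced by Vaswani et al. (2017). The implementation below uses NumPy; the function name, shapes, and random inputs are illustrative, not taken from any particular library.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    # Similarity of each query vector to each key vector, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)
    # Numerically stable row-wise softmax over the keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted average of the value vectors
    return weights @ V

# Toy example: 3 query/key/value vectors of dimension 4 (shapes are illustrative)
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)
```

In a full transformer layer this operation is applied in parallel across several "heads", each with its own learned projections of Q, K, and V, but the core computation is the one shown here.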
The Hugging Face Transformers library provides hundreds of pretrained transformer models for natural language processing. …
A literature review is a survey of the published literature on a specific topic. It provides an overview of the current scholarship in the domain and, as such, is the starting point of any research project. The goal of a literature review is to answer the following questions:
Field mapping is a method for systematically surveying a research area and organizing the information for easy retrieval in the future. It involves the identification of:
Android is an open-source mobile operating system built on top of a modified version of the Linux kernel. It was developed by the Open Handset Alliance, a consortium of technology companies led by Google. It is licensed under the Apache License, and its source code is released as part of the Android Open Source Project (AOSP). Unveiled in 2007, Android soon became the best-selling mobile operating system worldwide, owing to its open development model and user-friendly interface. Its latest version, Android 11, was released in 2020.
The Android project was founded in 2003 with the goal of developing smart…