GBT ($GTCH) will be evaluating the T5 pre-trained model with the goal of using it in its Hippocrates healthcare advisory system, handling question answering, text summarization, and compositional commonsense knowledge. The model allows more parallel processing than methods such as Recurrent Neural Networks (RNNs) and Convolutional Neural Networks (CNNs), which significantly increases its data understanding and reasoning capabilities. For example, the T5 model processes the words of a given text together rather than word by word. As global data is estimated to reach the zettabyte range in the near future, deep learning systems will need powerful processing capabilities to comprehend and scrutinize that data, particularly in the huge, unstructured NLP domain.
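The parallelism described above comes from the Transformer's self-attention mechanism, which compares every word to every other word in a single matrix operation instead of stepping through the sequence one token at a time as an RNN does. The following is a minimal illustrative sketch of scaled dot-product self-attention (simplified to a single head with no learned projection matrices, which a full T5 layer would include):

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over a whole sequence at once.

    X: (seq_len, d) array of token embeddings. All pairwise token
    comparisons happen in one matrix product -- no recurrence, so the
    whole sequence is processed in parallel.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                     # (seq_len, seq_len) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax: each row sums to 1
    return weights @ X                                # each output mixes context from all tokens

# Toy "sentence" of 4 tokens with 8-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
out = self_attention(X)
print(out.shape)  # (4, 8)
```

Because the attention step is a dense matrix product rather than a sequential loop, it maps naturally onto GPU/TPU hardware, which is what gives Transformer models like T5 their throughput advantage over RNNs on long texts.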