Natural Language Processing (NLP):

Downstream NLP tasks, fine-tuned with LLMs/Transformers:

    Sequence generation tasks (seq2seq models):
  • Question Answering: Demo
    Fine-tuned a BART+RAG model.
  • Text Summarization: Demo
    Fine-tuned an mT5 model.
  • Machine Translation: Demo
    Fine-tuned a T5 seq2seq model.
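
All three seq2seq demos above share T5's text-to-text convention: every task is turned into plain (input, target) string pairs by prepending a task prefix. A minimal pure-Python sketch of that formatting (the prefixes follow T5's published conventions; the example sentences and answers are made up for illustration):

```python
# Sketch: T5-style text-to-text formatting. Every task becomes a pair of
# plain strings; the task prefix tells the model what to do.
# The sentences/answers below are illustrative, not from a real dataset.

def make_t5_example(task_prefix: str, source: str, target: str) -> dict:
    """Build one (input, target) training pair in text-to-text format."""
    return {"input": f"{task_prefix}: {source}", "target": target}

# Machine translation pair (prefix style from the T5 paper):
mt = make_t5_example(
    "translate English to German", "The house is small.",
    "Das Haus ist klein.",
)

# Summarization pair:
summ = make_t5_example(
    "summarize", "A very long article about transformers ...",
    "Transformers summarized.",
)

print(mt["input"])    # translate English to German: The house is small.
print(summ["input"])  # summarize: A very long article about transformers ...
```

The same pairs feed a BART or mT5 fine-tuning loop unchanged, which is what makes the text-to-text framing convenient across these tasks.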

    Language modeling:
  • Masked Language Model: Demo
    Masked-token prediction task.
  • Causal Language Model: Demo
    Next-token generation task (Python code generation).
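
The two language-modeling objectives above differ only in how training pairs are built. A toy sketch contrasting them, using naive whitespace tokenization and a BERT-style `[MASK]` token (all names here are illustrative, not tied to any library):

```python
import random

def masked_lm_example(tokens, mask_prob=0.15, seed=0):
    """Masked LM: hide random tokens; the model predicts the hidden ones."""
    rng = random.Random(seed)
    inputs, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            inputs.append("[MASK]")
            labels.append(tok)      # loss only on the masked positions
        else:
            inputs.append(tok)
            labels.append(None)     # unmasked positions contribute no loss
    return inputs, labels

def causal_lm_examples(tokens):
    """Causal LM: at each step, predict the next token from the prefix."""
    return [(tokens[:i], tokens[i]) for i in range(1, len(tokens))]

toks = "def add ( a , b ) : return a + b".split()
print(masked_lm_example(toks))
print(causal_lm_examples(toks)[:2])
```

The causal formulation is what the Python code-generation demo uses: the model only ever sees a left-to-right prefix, so it can generate code token by token.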

    Conversational AI:
  • Dialog systems: Demo
    DialoGPT.

Master’s Thesis (USC, August 2022):

  • Lexical complexity-driven representation learning: View
    NLP task: complex English word and phrase identification via token classification (akin to NER or POS tagging). Achieved a state-of-the-art F1 score.
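
Framing complex-phrase identification as token classification means every token gets a label, exactly as in NER. A toy sketch of that labeling with BIO tags (the sentence, span, and `B-CPLX`/`I-CPLX` label names are illustrative assumptions; the thesis's actual label scheme may differ):

```python
# Sketch: BIO tagging for complex-phrase identification, NER-style.
# Tokens inside the annotated span get B-/I- labels; everything else is O.
# Label names and the example span are hypothetical.

LABELS = {"O": 0, "B-CPLX": 1, "I-CPLX": 2}

def bio_tag(tokens, complex_span):
    """Label tokens in the half-open range [start, end) as a complex phrase."""
    start, end = complex_span
    tags = []
    for i, _ in enumerate(tokens):
        if i == start:
            tags.append("B-CPLX")
        elif start < i < end:
            tags.append("I-CPLX")
        else:
            tags.append("O")
    return tags

tokens = "the onerous stipulations of the lease".split()
print(bio_tag(tokens, (1, 3)))  # ['O', 'B-CPLX', 'I-CPLX', 'O', 'O', 'O']
```

Per-token gold/predicted tag pairs in this form are also what an F1 score for the task is computed over.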