Machine Learning Study Group
Welcome! We meet from 4:00-4:45 p.m. CT. Anyone can join. Feel free to attend any or all sessions, or ask to be removed from the invite list; we have no wish to send unneeded emails, of which we all certainly get too many.
Contacts: jdberleant@ualr.edu and mgmilanova@ualr.edu
Agenda & Minutes
100th(!) Meeting, Feb. 9, 2024
🎉🎉
- Updates, announcements, questions, etc.?
MM writes: "Some students asked me to share the link and the [access] code. Using the code, any course will be $0.00. I think I need to show next time how we can use the code. [...] There are many resources (https://www.nvidia.com/en-us/training/online/)."
- Today MM will explain more about this.
- Suggested viewings/readings on LAMs (Large Action Models) we could evaluate:
- https://www.youtube.com/watch?v=22wlLy7hKP4&t=91s&ab_channel=rabbit
- https://www.youtube.com/watch?v=UOZqFMxRpWE&ab_channel=ExitsMedia
- https://www.youtube.com/watch?v=Rqh6fhcAqpw&ab_channel=ColdFusion
- https://www.youtube.com/watch?v=uJnhh7YSr5Q&ab_channel=TheAIGRID
- https://medium.com/version-1/the-rise-of-large-action-models-lams-how-ai-can-understand-and-execute-hum
- Readings: We are reading
- Sparks of Artificial General Intelligence: Early experiments with GPT-4 (https://arxiv.org/abs/2303.12712).
- We can continue where it says "However, impressive outputs are not enough to convince us that"
- We earlier read chapter 1 part 4 (https://huggingface.co/learn/nlp-course/chapter1/4) up to "Note that the first attention layer in a decoder block pays attention to" and we can start from there (see the attention sketch after this list).
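As context for that stopping point, here is a minimal sketch (not from the course; all names, shapes, and weights are illustrative) of the masked self-attention a decoder block uses, where each position can attend only to itself and earlier positions:

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """x: (seq_len, d_model); Wq, Wk, Wv: (d_model, d_head)."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])               # (seq_len, seq_len)
    mask = np.triu(np.ones(scores.shape, dtype=bool), 1)  # True above the diagonal
    scores[mask] = -np.inf                                # hide future positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)        # row-wise softmax
    return weights @ v                                    # one mixed vector per position

rng = np.random.default_rng(0)
seq_len, d_model, d_head = 5, 8, 4
x = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_head)) for _ in range(3))
print(causal_self_attention(x, Wq, Wk, Wv).shape)  # (5, 4)
```

Printing `weights` row by row shows the causal pattern the chapter is about to describe: row i has nonzero entries only in columns 0 through i.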
- Last time
- DA: brainstorming about topics. IK suggests checking https://github.com/meagmohit/EEG-Datasets. Maybe it is time to do a literature search to find similar work, what it achieves, and what the open problems are.