Tuesday 13 February 2024

Multi-lingual versions of natural language processing lectures

Last year I taught an introduction to Natural Language Processing at Macquarie University. I grabbed all the recordings and started editing them so that I could publish them. I’m not even halfway through the first two-hour lecture, but at least I’ve made some progress.
Here’s what’s working so far, and what isn’t.
I have an “intro theme music” clip for the first few seconds of the videos that I publish.

https://soundcloud.com/greg-baker-574386084/another-solresol-company-theme


That theme music consists of several phrases in the musical language Solresol, an artificial language that lasted almost as long as Esperanto has. The alto flute opens with a call-out (in Solresol) of “today; tomorrow”, then there’s a gong (because everyone needs a gong), and it says something too. The timpani then taps out (in Morse code) the name of my company. Meanwhile, the harp is saying (in Solresol) “wisely useful” and then “wisdom - create a model”. The bells sing out (in Solresol) “behold: yours! Behold!”. The orchestral strings just fit all the pieces together in the background to make it sound complete.


This all seems like very appropriate theme music for a series of lectures on natural language processing, particularly the first few, which focus on encoding text in different languages around the world.

I signed up to ElevenLabs (affiliate link http://elevenlabs.io/?from=partnerrobles3623 ) so that I could translate my lectures.
They seem to be using Whisper to do speech-to-text into English. This interacts strangely with the start of my videos: it thinks that I’m saying something (which indeed I am, but in Solresol) and often hallucinates greetings or similar comments. So it tries to modify the Solresol music to fit those words, which just warps the sound and makes it slightly out of tune.


Other issues:

  • I’m finding that its algorithm for identifying the number of speakers is unreliable. It often thinks that there are two of me.
  • Translation into Indonesian and Malay (which are close to the same thing) was not recognisable as being in those languages. It’s not just me thinking “that doesn’t sound like Bahasa”; fluent speakers weren’t even sure if they were listening to Bahasa or random babble.
But overall, I’m impressed. It would be an enormous amount of effort for me to re-record these videos in each of these languages. I’m not sure I even could do it in Japanese or Arabic (which I have studied) let alone Hindi (which I’ve never studied).


English https://www.youtube.com/playlist?list=PLnUusltxXvTdi6q_uTj4zt4ArCRoGBBUA

Chinese https://www.youtube.com/playlist?list=PLnUusltxXvTeryUuujEiLkhPYwRmYyufj

Hindi https://www.youtube.com/playlist?list=PLnUusltxXvTfOXqfpc5aIbrcgjxSZ9CoN

Spanish https://www.youtube.com/playlist?list=PLnUusltxXvTczVNLSAUFwSoyOf-PWZAcl

Japanese https://www.youtube.com/playlist?list=PLnUusltxXvTeRozp7XbEd90Gh32GAgP9H

Arabic https://www.youtube.com/playlist?list=PLnUusltxXvTfUEtbOqGBL6chpMWx1sApM

Korean https://www.youtube.com/playlist?list=PLnUusltxXvTcE4kecg6x0wHpTOdDMoseK

If you or your colleagues or friends speak one of these languages and want to hear a bit about the history of text encoding, pass these links along and let me know whether they are useful.

Friday 2 February 2024

Programming language theology

What does the Bible have to say about different programming languages?

  • Perl (Matthew 13:45-46)
  • Haskell (we are called to a life of purity)
  • OCaml (it’s easier for OCaml to pass through the eye of a needle and all that, Matthew 19:24)
  • Forth (it’s part of the Great Commission in Mark 16:15)
  • Go (again, it’s part of the Great Commission, but also Isaiah 18:2 seems to promote message passing between Golang and Swift)
  • Java (covered in He-brews)
  • Lazarus (which is cheating a bit, because it’s a framework for FreePascal, but John 11 covers it in some detail)
  • SQL (not specifically mentioned, but Proverbs 2:3-5 seems to describe it)
  • C/C++/C# (in Genesis 1:10 God gathered the waters together and called them the seas, but Revelation 21:1 says that there will be no more C, so presumably C++ and C# take over in the new heaven and new earth)
  • Ada (according to Genesis 36:4 Eliphaz was bored by Ada, which does indeed speak holy truth about programming in Ada)
  • Python (this is one of those difficult topics. My best interpretation of John 3:14 is that Jesus should be lifted up like the snake was. I guess that means we should kill Python and see if it comes back to life: presumably we did that with the Python 2 -> 3 transition)
  • Swift (I’m unsure what the Bible says about this generally. Isaiah 5:26 applies to programmers at the ends of the earth, and Romans 3:15 presumably refers to programming with one’s feet so hard that they bleed)
  • Rust (it seems we should avoid Rust, based on Matthew 6:19-20)



Tuesday 9 January 2024

Using AI in Education Part 4

I don't know whether this is a very late advent post or a very early 2024 post.

One of the key themes of 2024 is going to be personalized chatbots or Tutebots in education.

They aren’t very difficult to create. If your students have access to GPT Pro ($20 per month), then this is a trivial task: take the transcripts of your lecture recordings along with the readings, and create a custom GPT from them. If that cost is too great, then things get a little more complicated. Any vendors who want to shill their solutions, please do so in the comment area below. I’ve been working on an email gateway bot for this kind of task.

Ethan Mollick reports here on his experiment measuring how much of a productivity improvement GPT-4 gives professional workers. If you look at the charts, you’ll see that much of the benefit goes to the least able workers. This makes sense: a large language model produces an average, most-predictable output, so its results are going to be kind of average. If you are below average, then average is an improvement!

This also applies to students, it seems. My friend Gordon Freer teaches International Relations at the University of Witwatersrand in Johannesburg. He ran a little experiment last semester and got more data than I did in my own experiment with tutebots, since I was only able to create a custom GPT a week before the final exams.

Gordon and I have been analysing his data in slightly different ways. I looked at the comments made by Gordon’s students and analysed their use of present tense versus past tense, and also their linguistic diversity. This may seem a little odd, but we have good evidence that usage of present tense versus past tense corresponds to an introversion/extroversion divide (highly extroverted and sociable people will talk about all the things that they did with other people, whereas less sociable people will talk about what they’re doing right now). Linguistic diversity is a measure of how many different words you use in a piece of writing of a given length; it provides a measure of verbal flexibility, which is a proxy for verbal IQ. The results were interesting.
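The two measures above can be sketched in a few lines of Python. This is only an illustration of the idea: the tense detector here is a crude word-ending heuristic rather than a proper part-of-speech tagger, and the sample comment is invented.

```python
import re

def type_token_ratio(text: str) -> float:
    """Linguistic diversity: distinct words divided by total words."""
    words = re.findall(r"[a-z']+", text.lower())
    return len(set(words)) / len(words) if words else 0.0

def past_tense_share(text: str) -> float:
    """Crude heuristic: fraction of words ending in -ed.
    A real analysis would use a POS tagger (e.g. spaCy or NLTK)."""
    words = re.findall(r"[a-z']+", text.lower())
    past = [w for w in words if w.endswith("ed") and len(w) > 3]
    return len(past) / len(words) if words else 0.0

# Hypothetical student comment, for illustration only
comment = "We discussed the readings and asked the bot to explain them."
print(round(type_token_ratio(comment), 2))  # → 0.91
print(round(past_tense_share(comment), 2))  # → 0.18
```

Note that the raw type-token ratio falls as texts get longer, which is why comparisons should be made on writing of a similar length.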

Students in Gordon's class whose comments suggested weaker English language skills ended up being far more likely to recommend the use of chatbots in the future. 




It's premature to say that they gained a greater advantage, but it seems likely that being able to ask a chatbot to give you individualised tuition is going to benefit learners who face extra challenges (such as not being a native English speaker).

But other than providing a tutebot, what can we as educators do?

For this I analysed what things Gordon’s students did with the chatbot that were predictive of recommending chatbots. In other words: different students used the chatbots in different ways; which of those gave students the most positive experience?

Cross-validated ridge regression models found that the two strongest factors were:

  • Did the student use the chatbot to help with their tutorials? (0.44)
  • Did the student use the chatbot to help with the readings? (0.74)

Nothing in the students’ backgrounds predicted any of this behaviour. It is up to us, through our teaching practices, to encourage students to interact in different ways; it isn’t driven by background knowledge, familiarity or existing skills. That suggests we need exercises in our courses (perhaps graded exercises) to encourage students to make the most of the chatbots we provide.

Here’s a checklist of things I think we should make sure students do (feel free to suggest more).

  • Get the chatbot to explain a concept that you aren’t familiar with.
  • Handle a text in a language you aren’t familiar with by translating it into one you are familiar with.
  • Given an arcane and difficult-to-read text, get the chatbot to simplify the vocabulary that’s used.
  • Ask the chatbot to make analogies with another field with which you are more familiar.
  • Learn what ELI5, ELI13 and ELIUG mean (“explain it like I’m 5/13/an undergraduate”).
  • Ask the chatbot how you could improve your essay or program.
  • Explain some important concept and get the chatbot to respond with any important ideas you missed in explaining it.
  • Role-play different people, things or participants from a reading.
  • Generate exam questions for your own self-study.
That way, they won’t think of generative AI merely as a way of cheating on essay writing.

Ping me if you want to run a study on this!