
Monday 4 March 2024

Why isn't there a degree called a Bachelor of Contact Centre Administration?

It would cover IP networking, VoIP and telephony. It would teach students about AI and its applications in compliance, optimisation, training, CSAT improvement and translation, and enough maths to understand queue behaviour under pressure (there's a quick sketch of that maths below). Contact centres often have high staff turnover, transient workforces, unique training needs and large headcounts, so there should be a strong HR component. Perhaps there should also be units on contract negotiation, sales techniques and other components of a business degree.
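To give a flavour of the queueing maths: the classic Erlang C formula predicts how likely a caller is to queue, and for how long, given the call volume, the average handle time and the number of agents. Here's a minimal Python sketch (the traffic numbers are made up purely for illustration):

```python
from math import factorial

def erlang_c(calls_per_hour, avg_handle_time_min, agents):
    """Erlang C: probability an arriving call has to queue, and the
    average speed of answer, for an M/M/c queue."""
    # Offered load in Erlangs = arrival rate x average handle time.
    a = calls_per_hour * (avg_handle_time_min / 60.0)
    if agents <= a:
        raise ValueError("need more agents than Erlangs, or the queue grows without bound")
    top = (a ** agents / factorial(agents)) * (agents / (agents - a))
    bottom = sum(a ** k / factorial(k) for k in range(agents)) + top
    p_wait = top / bottom
    asa_min = p_wait * avg_handle_time_min / (agents - a)
    return p_wait, asa_min

# Illustrative only: 100 calls/hour, 6-minute average handle time, 12 agents.
p_wait, asa = erlang_c(100, 6, 12)
print(f"P(caller waits) = {p_wait:.0%}, average speed of answer = {asa:.1f} min")
```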


I was thinking about this yesterday when I was creating an assignment for my students in which they have to write a telephony speech recognition and intent detection system for a contact centre. I realised I would need to explain a few background concepts (e.g. call routing and queueing)... and it quickly snowballed into "there's actually quite a lot of stuff to know in order to understand a modern contact centre".
Everyone I know had to learn everything about contact centre management on-the-job. Every call centre I know of has people in it studying part-time to get a degree in something other than their current occupation.

Why is a Bachelor of Call Centre Administration not a thing?

Tuesday 13 February 2024

Multi-lingual versions of natural language processing lectures

Last year I taught an introduction to Natural Language Processing at Macquarie University. I grabbed all the recordings and started editing them so that I could publish them. I’m not even halfway through the first two-hour lecture, but at least I’ve made some progress.
Here’s what’s working so far, and what isn’t.
I have an “intro theme” that plays for the first few seconds of the videos that I publish.

https://soundcloud.com/greg-baker-574386084/another-solresol-company-theme


That theme music consists of several phrases in the musical language Solresol, an artificial language that lasted almost as long as Esperanto has. The alto flute opens with a call-out (in Solresol) of “today; tomorrow”, then there’s a gong, because everyone needs a gong, and it says something too. The timpani then taps out (in Morse code) the name of my company. Meanwhile, the harp is saying (in Solresol) “wisely useful” and then “wisdom - create a model”. The bells sing out (in Solresol) “behold: yours! Behold!”. The orchestral strings just fit all the pieces together in the background to make it sound complete.


This all seems like very appropriate theme music for a series of lectures on natural language processing, particularly the first few, which focus on encoding text in different languages around the world.

I signed up to ElevenLabs (affiliate link http://elevenlabs.io/?from=partnerrobles3623 ) so that I could translate my lectures.
They seem to be using Whisper to do speech-to-text into English. This interacts strangely with the start of my videos: it thinks that I’m saying something (which indeed I am, but in Solresol) and often hallucinates greetings or similar comments. It then tries to overlay the Solresol music with words, which warps the sound and makes it slightly out of tune.
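If I end up pre-processing the recordings myself, the workaround I'd try (a sketch only, and certainly not what ElevenLabs does internally) is to chop the intro theme off before transcription so Whisper never hears it. The file names and the 12-second intro length are assumptions:

```python
import subprocess
import whisper  # pip install openai-whisper; also needs ffmpeg on the PATH

INTRO_SECONDS = 12  # hypothetical length of the Solresol theme

# Cut off the intro music so Whisper can't hallucinate greetings over it.
subprocess.run(
    ["ffmpeg", "-y", "-ss", str(INTRO_SECONDS), "-i", "lecture01.mp4",
     "-vn", "-acodec", "pcm_s16le", "-ar", "16000", "lecture01_trimmed.wav"],
    check=True,
)

model = whisper.load_model("small")
# condition_on_previous_text=False also stops one hallucinated greeting
# from snowballing through the rest of the transcript.
result = model.transcribe("lecture01_trimmed.wav", condition_on_previous_text=False)
print(result["text"][:500])
```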


Other issues:

  • I’m finding that its algorithm for identifying the number of speakers is unreliable. It often thinks that there are two of me.
  • Translation into Indonesian and Malay (which are close to the same thing) was not recognisable as being in those languages. It’s not just me thinking “that doesn’t sound like Bahasa”; fluent speakers weren’t even sure if they were listening to Bahasa or random babble.
But overall, I’m impressed. It would be an enormous amount of effort for me to re-record these videos in each of these languages. I’m not sure I even could do it in Japanese or Arabic (which I have studied) let alone Hindi (which I’ve never studied).


English https://www.youtube.com/playlist?list=PLnUusltxXvTdi6q_uTj4zt4ArCRoGBBUA

Chinese https://www.youtube.com/playlist?list=PLnUusltxXvTeryUuujEiLkhPYwRmYyufj

Hindi https://www.youtube.com/playlist?list=PLnUusltxXvTfOXqfpc5aIbrcgjxSZ9CoN

Spanish https://www.youtube.com/playlist?list=PLnUusltxXvTczVNLSAUFwSoyOf-PWZAcl

Japanese https://www.youtube.com/playlist?list=PLnUusltxXvTeRozp7XbEd90Gh32GAgP9H

Arabic https://www.youtube.com/playlist?list=PLnUusltxXvTfUEtbOqGBL6chpMWx1sApM

Korean https://www.youtube.com/playlist?list=PLnUusltxXvTcE4kecg6x0wHpTOdDMoseK

If you, your colleagues or your friends speak one of these languages and want to hear a bit about the history of text encoding, forward these on and let me know if they are useful.

Friday 2 February 2024

Programming language theology

What does the Bible have to say about different programming languages?

  • Perl (Matthew 13:45-46)
  • Haskell (we are called to a life of purity)
  • OCaml (it’s easier for OCaml to pass through the eye of a needle and all that, Matthew 19:24)
  • Forth (it's part of the Great Commission in Mark 16:15)
  • Go (again, it's part of the Great Commission, but also Isaiah 18:2 seems to promote message passing between Golang and Swift)
  • Java (covered in He-brews)
  • Lazarus (which is cheating a bit, because it’s a framework for FreePascal, but John 11 covers it in some detail)
  • SQL (it isn't specifically mentioned, but Proverbs 2:3-5 seems to describe it)
  • C/C++/C# (in Genesis 1:10 God gathered the waters together and called them the seas, but Revelation 21:1 says that there will be no more C, so presumably C++ and C# take over in a new heaven and new earth)
  • Ada (according to Genesis 36:4, Eliphaz was bored by Ada, which does indeed speak holy truth about programming in Ada)
  • Python (this is one of those difficult topics; my best interpretation of John 3:14 says that Jesus should be lifted up like the snake was, which I guess means we should kill Python and see if it comes back to life: presumably we did that with the Python 2 -> 3 transition)
  • Swift (I'm unsure what the Bible says about this generally, but Isaiah 5:26 applies to programmers at the ends of the earth, and Romans 3:15 presumably refers to programming with one's feet so hard that they bleed)
  • Rust (it seems we should avoid Rust, based on Matthew 6:19-20)



Tuesday 9 January 2024

Using AI in Education Part 4

I don't know whether this is a very late advent post or a very early 2024 post.

One of the key themes of 2024 is going to be personalized chatbots or Tutebots in education.

They aren’t very difficult to create. If your students have access to GPT Pro ($20 per month), then this is a trivial task: take the transcripts of your lecture recordings and the readings ... and create a custom GPT from them. If that cost is too great, then things get a little more complicated. Any vendors who want to shill their solutions, please do so in the comment area below. I've been working on an email gateway bot for this kind of task.

Ethan Mollick reports here on his experiment to see how much of a productivity improvement GPT-4 gives professional workers. If you look at the charts, you'll see that much of the benefit goes to the least able workers. This makes sense: a large language model that produces the most predictable, average output is going to produce results that are, well, kind of average. If you are below average, then average is an improvement!

This also applies to students, it seems. My friend Gordon Freer teaches International Relations at the University of the Witwatersrand in Johannesburg, and he ran a little experiment last semester. He got more data than I did from my own tutebot experiment, since I was only able to create a custom GPT a week before the final exams.

Gordon and I have been analysing his data in slightly different ways. I looked at the comments made by Gordon’s students and analysed their use of present tense versus past tense, and also their linguistic diversity. This may seem a little odd, but we have good evidence that the balance of present tense versus past tense corresponds to an introversion/extroversion divide (highly extroverted and sociable people will talk about all the things that they did with other people, whereas less sociable people will talk about what they're doing right now). Linguistic diversity measures how many different words you use in a piece of writing of a given length; it indicates verbal flexibility, which is a proxy for verbal IQ. The results were interesting.
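For the curious, both measures are easy to approximate. This isn't the actual analysis code; it's a rough sketch using NLTK's part-of-speech tags, with a made-up comment as input (and note that a raw type-token ratio should really only be compared across samples of similar length):

```python
import nltk

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

def tense_and_diversity(text):
    """Rough proxies: past vs present verb counts, and type-token ratio."""
    tokens = [t.lower() for t in nltk.word_tokenize(text) if t.isalpha()]
    tags = nltk.pos_tag(tokens)
    past = sum(1 for _, tag in tags if tag in ("VBD", "VBN"))
    present = sum(1 for _, tag in tags if tag in ("VBP", "VBZ", "VBG"))
    ttr = len(set(tokens)) / len(tokens) if tokens else 0.0
    return {"past": past, "present": present, "type_token_ratio": round(ttr, 2)}

# Hypothetical student comment, not from the real data.
print(tense_and_diversity("I am using the chatbot right now to check what I am writing."))
```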

Students in Gordon's class whose comments suggested weaker English language skills ended up being far more likely to recommend the use of chatbots in the future. 




It's premature to say that they gained a greater advantage, but it seems likely that being able to ask a chatbot to give you individualised tuition is going to benefit learners who face extra challenges (such as not being a native English speaker).

But other than providing a tutebot, what can we as educators do?

For this, I analysed which of the things Gordon’s students did with the chatbot were predictive of recommending chatbots. In other words: different students used the chatbot in different ways; which of those ways gave students the most positive experience?

Cross-validated Ridge regression models found that the two strongest factors were (a sketch of the modelling approach follows the list):

  • Did the student use the chatbot to help with their tutorials? (0.44)
  • Did the student use the chatbot to help with the readings? (0.74)
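To show the shape of that analysis (not the real data or the real feature set): scikit-learn's RidgeCV cross-validates the regularisation strength, and the coefficients of a fit like this on the real data are the numbers quoted in brackets above. The data frame below is entirely synthetic:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in: one row per student, binary behaviour flags and a
# 1-5 "would you recommend chatbots?" survey score. Not Gordon's data.
X = pd.DataFrame({
    "used_for_tutorials": rng.integers(0, 2, 200),
    "used_for_readings":  rng.integers(0, 2, 200),
    "used_for_exam_prep": rng.integers(0, 2, 200),
})
y = rng.integers(1, 6, 200)

model = RidgeCV(alphas=np.logspace(-3, 3, 13))
print("cross-validated R^2:", cross_val_score(model, X, y, cv=5).mean().round(2))
print(dict(zip(X.columns, model.fit(X, y).coef_.round(2))))
```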

Nothing in the students’ backgrounds predicted any behaviour here. It is up to us, in our teaching practices, to encourage students to interact in different ways; it’s not driven by background knowledge, familiarity or existing skills. That suggests we need exercises in our courses (perhaps graded exercises) to encourage students to make the most of the chatbots we provide.

Here’s a checklist of things I think we should make sure students do (feel free to suggest more).

  • Get the chatbot to explain a concept that you aren’t familiar with
  • Handle a text in a language that you aren’t familiar with by translating it into something you are familiar with.
  • Given an arcane and difficult-to-read text, get the chatbot to simplify the vocabulary that's used
  • Ask the chatbot to make analogies with another field with which you are more familiar
  • Learn what ELI5, ELI13 and ELIUG mean (“explain it like I’m 5/13/an undergraduate”) and ask for one
  • Ask the chatbot how you could improve your essay / program
  • Explain some important concept and get the chatbot to respond with any important ideas that you have missed in explaining it
  • Role-play different people, things or participants from a reading.
  • Generate exam questions for your own self-study.
That way, they won’t think of generative AI merely as a way of cheating on essay writing.

Ping me if you want to run a study on this!

Tuesday 12 December 2023

Using AI in education part 3

Continuing on my AI in education advent calendar...

I stumbled upon a solution to a really annoying problem. You've just gone through a whole pile of marking (the picture below is about a third of the pile from one course)... and then you wonder whether you've gone cross-eyed and entered the wrong numbers into the spreadsheet / learning management system / course management system.

A pile of exam papers needing to be marked

We have essentially flawless speech-to-text now, and we have access to language models that can do tedious tasks.

So at the end of the marking, I open up a sound recorder on my phone or my laptop, flick through the exam papers and narrate the student marks as I go: "Xu Anh student number 23424 scored 41.5, Tom Bowman student 28559 scored 34, Amy Chin student 29912 scored 26, Daniel Davids student id 28588 scored 37...". Run that through Whisper ( https://openai.com/research/whisper or any of the desktop apps that use it, e.g. MacWhisper ) and you'll get what you said in text. It gets even the most complicated names right if you pronounce them correctly.

Now take a download from your marks management system (or the spreadsheet that you were typing them into). If you have access to GPT-4, you can upload the spreadsheet. If you are using GPT-3.5 or a free LLM like Llama, copy and paste from the spreadsheet.

Then the magic prompt: "identify if there are any discrepancies between this spreadsheet and the transcript of me reading out the marks".

Works like a charm, every time.
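If you'd rather do the comparison locally instead of pasting marks into a chatbot, here's a rough sketch of the same idea in Python. The file names, the CSV column names and the regex are all assumptions about how I happen to read the marks out:

```python
import re
import pandas as pd
import whisper  # pip install openai-whisper

# Transcribe the narration ("... student number 23424 scored 41.5, ...").
transcript = whisper.load_model("small").transcribe("marks_readout.m4a")["text"]
# Whisper sometimes writes thousands separators; strip commas inside numbers.
transcript = re.sub(r"(?<=\d),(?=\d)", "", transcript)

# Pull out (student number, mark) pairs, assuming the spoken pattern
# "student [number|id] NNNNN scored XX[.X]".
spoken = {
    sid: float(mark)
    for sid, mark in re.findall(
        r"student(?: number| id)?\s+(\d+)\s+scored\s+(\d+(?:\.\d+)?)",
        transcript, flags=re.IGNORECASE)
}

# Compare against the marks spreadsheet (assumed columns: student_id, mark).
sheet = pd.read_csv("marks.csv", dtype={"student_id": str})
for _, row in sheet.iterrows():
    spoken_mark = spoken.get(row["student_id"])
    if spoken_mark is None:
        print(f"{row['student_id']}: never read out")
    elif abs(spoken_mark - float(row["mark"])) > 0.01:
        print(f"{row['student_id']}: sheet says {row['mark']}, narration says {spoken_mark}")
```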

Saturday 9 December 2023

Cat consulting

Deb Zahn interviewed me on her Craft of Consulting blog: https://www.craftofconsulting.com/podcasts/episode-238

This then inspired Natalie Jacobs ( https://www.linkedin.com/in/nataliekjacobs/ ) to write a poem from the perspective of her cat.

To get a sense of how good voice modding is nowadays, here's my deep Australian basso-profundo voice turned into a feminine cat-like voice:

https://soundcloud.com/greg-baker-574386084/natalie-jacobs-cat-poem


Useful vocabulary for winning awards

  • Humblebrag is when you say that you just won an award, and self-deprecatingly add that the judges might have made a mistake.
  • Stumblebrag is when the first time you learn about the award is when someone asks you why you haven't picked it up.
  • Grumblebrag is when you complain about walking around in a Sydney heatwave to pick up an award that nobody told you about.
  • Mumblebrag is when you realise that the award was for a group you were in, not for you personally.
  • Fumblebrag is when you realise you didn't really do anything for that group and you were kind of the spare wheel.
  • Jumblebrag is the feeling of winning a dean's award for excellence in inter-departmental collaboration for stuff you didn't do.
  • Rumblebrag is when you send out congratulations to the rest of the data science teaching team at Macquarie University for their fine work that you get to take credit for.