Sunday, 26 November 2023

Using AI in education, Part 1

Now that the semester is over, I'm going to start writing up a post each week about an AI-related innovation I tried in my teaching this year. (The real guru on this is Ethan Mollick -- https://substack.com/@oneusefulthing -- but maybe I can be interesting.)


I had a lot of overseas students and other students whose first language wasn't English. So I ran a poll at the start of the semester to ask which languages they would like my lectures deep-faked into. Hindi was the most requested. Here's me speaking fluent, high-class Hindi! https://youtu.be/mFYjjNUa8ho?si=ONUAaEQd0zR3BwrE


Arabic was another request.


I'm vain and wanted it to be my own voice, so I used Eleven Labs (affiliate link here https://elevenlabs.io/?from=partnerrobles3623 ). I could probably have done this cheaper with Whisper + ChatGPT translation + ChatGPT text to speech (or some other alternative text-to-speech solution).
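Here's a sketch of what that cheaper pipeline might look like, using the OpenAI Python SDK (v1). The model names, the `alloy` voice, and the overall three-step flow are my assumptions for illustration, not what Eleven Labs actually does -- and note you'd end up with a stock voice, not a clone of your own.

```python
# Hypothetical dubbing pipeline: Whisper transcription -> chat-model
# translation -> text-to-speech. Requires the `openai` package and an API key;
# model and voice names are assumptions based on the v1 SDK.

def dub_lecture(audio_path: str, target_language: str, out_path: str) -> None:
    """Transcribe a lecture recording, translate it, and synthesise speech."""
    from openai import OpenAI  # imported lazily so the sketch is self-contained
    client = OpenAI()

    # 1. Transcribe the original lecture audio with Whisper.
    with open(audio_path, "rb") as f:
        transcript = client.audio.transcriptions.create(
            model="whisper-1", file=f
        ).text

    # 2. Translate the transcript, asking for casual rather than formal register.
    translation = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": f"Translate into casual, spoken {target_language}."},
            {"role": "user", "content": transcript},
        ],
    ).choices[0].message.content

    # 3. Synthesise speech with a stock voice (no voice cloning here).
    speech = client.audio.speech.create(
        model="tts-1", voice="alloy", input=translation
    )
    speech.write_to_file(out_path)
```

The prompt in step 2 is also where you'd push back against the overly formal, government-proclamation register: you control the translation instructions yourself.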


Issues and observations:

  • For some students, it's a clear issue of access to education, and I consider it a moral obligation to provide my lessons in ways that are accessible to as many as possible. If you are struggling with English, and I can easily provide resources to help you in your native language, I'll do that.
  • For some students, I think it was just the amusement of hearing obviously-Aussie Greg speaking another language. Still, I don't mind: if that's what it takes to get students to listen to lectures, I'm in.
  • Eleven Labs does very formal translations. It seems like they are using ChatGPT for translations, and ChatGPT defaults to more formal language. Apparently my Hindi sounds like a government proclamation. Hopefully they will support more casual language in their translations.
  • It's not cheap for an individual. It begins to make sense at an organisational level, when you have a handful of students wanting the same language.


But I had a surprising amount of pushback. The argument is that part of the experience is for students to study in English so that they can improve their English, English being the world language and all that.


To me, that's last-year thinking: back in 2022 when automated translation wasn't a solved problem, we all needed to agree on a common language for business and education, and the world had chosen English.


But maybe now we don't need a common language any more, so is it still necessary, important and valuable for students to improve their English?


Which is more important now, and which will be more important in the future? Accessibility of education to people in their native languages, or improving English language fluency?

Monday, 6 November 2023

A prayer for debugging

 “God of all wisdom and knowledge, you created the universe and all within it, including logic and computation. I thank you for blessing me to be in this time and this place where I can work to resolve this [bug]. Let me be at peace in the puzzling and confusion, never to rise in anger or let my frustration out on others, who are also part of your beautiful creation. Help me remember the joy that comes from finding a solution, and to look forward in anticipation to this [bug]’s resolution as a tiny reminder of our anticipation of the joy that will come when you come again and make this world right, perfect and whole.”

Wednesday, 24 March 2021

How to make a maths nerd wince in pain

Here's a quick tip if you have some fractions that you need to simplify. Just cancel any digits in common between the numerator and denominator.

Here are a few examples: 16/64 = 1/4 (cancel the sixes), 19/95 = 1/5, 26/65 = 2/5 and 49/98 = 4/8.
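The wince-inducing part is that the trick occasionally works: this is "anomalous cancellation", where a completely wrong method gives the right answer. A short brute-force search with exact fractions finds every two-digit proper fraction where cancelling the shared middle digit happens to be correct:

```python
from fractions import Fraction

def anomalous_cancellations():
    """Find every two-digit proper fraction where 'cancelling' the shared
    digit (numerator's ones digit == denominator's tens digit) still gives
    the correct value."""
    hits = []
    for num in range(10, 100):
        for den in range(num + 1, 100):   # proper fractions only
            a, b = divmod(num, 10)        # num = 10a + b
            c, d = divmod(den, 10)        # den = 10c + d
            if b == c and d != 0 and Fraction(num, den) == Fraction(a, d):
                hits.append((num, den, a, d))
    return hits

print(anomalous_cancellations())
# → [(16, 64, 1, 4), (19, 95, 1, 5), (26, 65, 2, 5), (49, 98, 4, 8)]
```

Only four cases exist for this digit pattern, which is exactly why the "method" is so dangerous: it works just often enough to be memorable.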



Friday, 12 February 2021

Grandma Heidi

Many of the interesting and well-read women in your life (if you are a woman, this probably includes yourself) read Heidi when they were young, or had it read to them. The story of this young orphan girl in Switzerland has captured everyone's hearts for over a century, as have the sequels Heidi's Children and Heidi Grows Up.

But the fourth book in the series -- Grandma Heidi -- was never translated into English until now, when my daughter and I completed it.
 
There’s a bit of a backstory to it...
 
 
Growing up, we had this one mysterious book — written in French — in the household. It had pretty pictures in it, but until I went to high school, I couldn’t even begin to read it.
 
Even when I had studied some French, it was still heavy going.

I was one of the worst French students in my family. My brother is actually quite fluent, having spent some time in New Caledonia, and acquiring a very amusing New Caledonian accent. My mother has spent more time in France than I ever did; and my father studied it for his leaving certificate. I dropped it at the end of year 10 (4th form as it was known at the time) before I started my pipe-organ building apprenticeship. I wrote my end-of-year French exam left-handed just to practice — that’s how little I cared about French. Sorry, Ms Kviz, it wasn’t your teaching.

Anyway, one day in 1990 when I was home from school, I promised my mother that I would do something productive, and not waste the day. I promised her that I would translate that book that she was given when she was a child. 

It was doomed to end badly. I translated two sentences that day in 1990 and got stuck on the third.

But I promised to finish it later.

In my family we have an alphabetic Christmas tradition — each year the presents begin with the next letter of the alphabet, which for the year 2020 was the letter “T”. So I finished the translation (30 years after I promised to do it, and with a lot of help from my daughter, who started French in kindergarten) and gave it to her for Christmas.

At the end of last year as I was wrapping up the first draft, I found that a lot of the women in my life started asking about it. Heidi was a part of their childhood, and they would like to know how it ends, so I’ve decided to publish it. I had some delightful watercolours done — based on the original that I was working from. 


It is available for pre-order for March 1st 2021: https://www.amazon.com/dp/B08WH6774S 

Friday, 6 November 2020

Composing some organ music and some spooky Halloween music, but with a twist and a view of 2030

My first job out of high school was as a pipe organ builder; I was one of the few apprentices to start that year. I’ve lost touch with Peter Jewkes, who was my boss and who taught me. But I’ve got a funny cross-over between my first job and my current work today.
 
I realise that organ music isn’t everyone’s cup of tea, but would you listen to 30 seconds of something original and interesting?
 
 
And as a composition, it’s not exactly Bach, but it’s believably something you might hear during Mass at a traditional Catholic service or at a high-Anglican church.
 
Here’s the punchline: it wasn’t really composed by a human being. I created a one-line melody that had a few random notes and then I told AWS DeepComposer to create a similar melody and fugue based on it. 



 
There’s no musical genius behind this: it’s just an autoregression algorithm. I’m really quite surprised at how well it works.
 
The DeepComposer platform also has a way of creating accompaniments in a variety of styles (rock, pop, orchestral) which is so spectacularly bad it has to be heard to be believed. About the best I could do was to make it create background breathy noises and ocean effects (which sort of came in at the right time):
 
 
 
Incidentally, if you want to upvote that track on SoundCloud I might win their Halloween competition. It’s a spooky enough track, although I cheated a little bit. After each set of iterations I went back into the resulting composition and put it firmly back into F minor. For some reason it couldn’t quite cope with the idea that E natural and A-flat could co-exist in one composition — which is odd because they are both present in the F harmonic-minor scale and the F melodic-minor scale. So I had to go in and turn the A-naturals back into A-flats, and the E-flats back into E-naturals.
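That manual clean-up step is mechanical enough to script. Here's a sketch (my own illustration, nothing to do with the DeepComposer API) that pushes stray A-naturals down to A-flat and stray E-flats up to E-natural, assuming notes are given as MIDI note numbers:

```python
# Snap the two problem accidentals back into F harmonic minor.
# Pitch classes: A natural = 9, A flat = 8, E flat = 3, E natural = 4.

def snap_to_f_minor(midi_notes):
    """Return the notes with A-naturals lowered to A-flat and
    E-flats raised to E-natural; everything else passes through."""
    fixed = []
    for note in midi_notes:
        pc = note % 12
        if pc == 9:       # A natural -> A flat
            note -= 1
        elif pc == 3:     # E flat -> E natural
            note += 1
        fixed.append(note)
    return fixed

print(snap_to_f_minor([69, 63, 60]))  # A4, Eb4, C4 → [68, 64, 60]
```

A real clean-up would need to consider context (a modulation might legitimately use A natural), which is presumably why I did it by hand.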
 
I’m not about to start composing professionally again so I don’t see DeepComposer as some kind of threat. It’s more of a fun toy at the moment. I can easily believe that it could get to the point where it could create an orchestration from a piano accompaniment quite quickly though. And creating a backing track from a pop song melody isn’t something that requires more than a superficial understanding of music theory; it’s more a matter of fashion and adapting to this year’s newest soundscape. I wouldn’t put this beyond about 10 years in the future.
 
So then we have:
  • https://www.vocaloid.com/en/ can do the vocals
  • Garritan and other sound providers who can provide the authentic-sounding instruments to play an arrangement
  • DeepComposer embellishing a melody and creating an accompaniment.
So I’ll predict that by 2030 it should be possible to take a melody line and some lyrics and generate a nice song performance with a nice backing automatically. If you can play Chopsticks on the piano you will probably have enough knowledge to create something good.
 
I’m hoping that that will expand the scope of people who can take some joy in composing. At the moment it’s not something that everyone can do, but creating new music is such a wonderful feeling that I wish more people could experience it.

Sunday, 27 September 2020

Something funny is about to happen to some prices

Landfill is an interesting product. The price fluctuates widely enough that it can swing from positive to negative and back again. If someone is building a tunnel, you can probably get paid to take landfill. If someone is building an island, you can probably get paid for supplying landfill. There are projects that can only go ahead at certain times: the Hornsby quarry remediation project was only viable because NorthConnex was being built.

That's slightly obscure, and there's not much you can do with landfill other than, well, fill land. But we are about to hit a very unusual combination of circumstances which will create some head-scratching economics.

Solar panels are getting cheaper. The learning rate for solar energy is (probably) the steepest of any energy technology yet deployed. So we can be pretty much assured that within the next decade we'll have enough solar power for anything we want (as long as it's daytime). We'll overprovision our houses and factories (and offices if they still exist) with generation capacity so that in the morning or evening, when the sun is hitting at an oblique angle, we still have enough for our needs.

That means that at peak times, we'll have more than enough solar energy. What do we do with that surplus energy?

Actually, it's slightly worse than that, because there are some (non-renewable) power plants that can't fully idle. A nuclear reactor can't shut down for just a few minutes in the middle of the day, and neither can a coal power plant. So they will be running at a time when nobody has a need for power. That may be OK -- if you make a lot of money generating power overnight, you might be prepared to pay some small penalties during the day for overproduction if that's what it takes to keep the system running.

Which means that someone, somewhere is going to get paid to create a dummy electrical load. We've never (as far as I can tell) seen "grid stability" payments for using electricity before. But it's the inevitable conclusion of where we are at the moment: discounted rates already exist for consumers who can change their requirements to help with grid stability; there's no reason that this wouldn't still be necessary when the price of electricity hits zero.

Therefore, there will be times of day (in certain weather conditions) when some companies will get paid to use electricity.

The kinds of consumers that will get paid are likely to be doing something that is energy inefficient (obviously), and not particularly capital intensive (since their equipment will be idle a lot of the time), and making something that can be stored indefinitely (because you definitely can't do just-in-time manufacturing if you are waiting for times when the price for electricity goes negative). But quite a lot of products could be made to fit that.

A possible product that I think is likely to play out this way is water desalination. Desalinated water is very expensive because of the embedded energy cost: it's often described as "bottled electricity". Currently we tend to use reverse osmosis because it's more energy efficient; but if you don't have to worry about energy costs, you might choose instead to boil water and distil it, since the capital costs are lower.

Now things get even weirder. Let's say you are getting paid 0.5c per kWh (a not unreasonable price) that you use. You might find that you can sell the water at -0.01c per kL. Yes, you can make a profit: not just by giving the water away, but while actually paying someone to consume it.
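A back-of-the-envelope check of that claim. The 20 kWh per kL figure for a simple thermal still is my assumption for illustration (real plants vary enormously); the two prices are the ones from the example above:

```python
# Is it profitable to be paid for power while paying buyers to take the water?
energy_per_kl_kwh = 20.0        # assumed energy used to desalinate 1 kL
paid_per_kwh = 0.005            # $0.005 = 0.5c received per kWh consumed
paid_to_buyer_per_kl = 0.0001   # $0.0001 = 0.01c paid per kL of water sold

revenue_per_kl = energy_per_kl_kwh * paid_per_kwh   # income from consuming power
profit_per_kl = revenue_per_kl - paid_to_buyer_per_kl
print(round(profit_per_kl, 4))  # dollars of profit per kL
```

Under these assumptions you clear almost 10c per kL, and notice that the more energy-inefficient the process is, the more you earn -- which is exactly the backwards economics the post is describing.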

If this all plays out, we might have water and electricity (at times) having a negative price. These are common inputs to other processes, so it doesn't end there: there might be a business that requires water and electricity occasionally, but the input costs normally make the business unviable. They in turn might be able to make some money selling the end product at a negative price if their inputs have a negative price.

If this all sounds a bit unbelievable, remember that up until very recently, many economists thought that it was impossible to have negative interest rates... right up until the moment when they started happening. We already have examples of products that oscillate between positive and negative prices; it's just that they were never fundamental inputs to other processes. My prediction is that by 2030 we will have a small part of the economy ticking away profitably making negative-price products.

(Post scriptum: now for the completely weird: how do negative price products interact with negative interest rates?)


Tuesday, 16 June 2020

Micro Focus (HP) Data Protector, Azure backups and archiving

Data Protector can only restore from Azure container blobs that have an access tier of Cool or Hot. This is a pity, because Archive is so much cheaper, particularly if you buy it up-front. If the medium you are trying to restore from is in the Archive tier, you will get an error like this:

[Normal] From: RSM@cellmgr.ifost.org.au ""  Time: 16/06/2020 11:06:26 AM
Restore session 2020/06/16-8 started.

[Normal] From: RMA@cellmgr.ifost.org.au "Azure_gw_via_cellmgr [GW 6588:0:16462780082409836283]"  Time: 16/06/2020 11:06:27 AM
STARTING Media Agent "Azure_gw_via_cellmgr [GW 6588:0:16462780082409836283]"

[Normal] From: RMA@cellmgr.ifost.org.au "Azure_gw_via_cellmgr [GW 6588:0:16462780082409836283]"  Time: 16/06/2020 11:06:28 AM
Loading medium from slot \\hpdparchivelongterm.blob.core.windows.net\general\531504b0_5ee2f861_04e4_038f to device Azure_gw_via_cellmgr [GW 6588:0:16462780082409836283]

[Major] From: RMA@cellmgr.ifost.org.au "Azure_gw_via_cellmgr [GW 6588:0:16462780082409836283]"  Time: 16/06/2020 11:06:29 AM
[90:54]  \\hpdparchivelongterm.blob.core.windows.net\general\531504b0_5ee2f861_04e4_038f
Cannot open device (System error)

[Normal] From: RMA@cellmgr.ifost.org.au "Azure_gw_via_cellmgr [GW 6588:0:16462780082409836283]"  Time: 16/06/2020 11:06:29 AM
Unloading medium to slot \\hpdparchivelongterm.blob.core.windows.net\general\531504b0_5ee2f861_04e4_038f from device Azure_gw_via_cellmgr [GW 6588:0:16462780082409836283]

[Normal] From: RMA@cellmgr.ifost.org.au "Azure_gw_via_dc1hdp01 [GW 6588:0:16462780082409836283]"  Time: 16/06/2020 11:06:29 AM
ABORTED Media Agent "Azure_gw_via_cellmgr [GW 6588:0:16462780082409836283]"
It’s not a big deal: you just have to find the corresponding blob(s) in the Azure portal and bring them out of archive.
If you are reading this blog post because you’ve just seen that error, it’s pretty straightforward to find what you need to fix. The medium it is loading is \\hpdparchivelongterm.blob.core.windows.net\general\531504b0_5ee2f861_04e4_038f, which corresponds to the name of the blob, except for the _0 on the end. (Azure has limitations on the size of a blob, so Data Protector media can span several blobs.)
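If you'd rather script it than click through the portal, the Azure CLI has `az storage blob set-tier` for exactly this. Here's a sketch that only prints the command rather than running it -- the account, container and medium names come from the log above, but the `_0` suffix assumption and your authentication flags are things to check for your own cell:

```shell
# Build the rehydration command for one blob of a Data Protector medium.
# $1 = storage account, $2 = container, $3 = blob name
rehydrate_cmd() {
  echo "az storage blob set-tier --account-name $1 --container-name $2 --name $3 --tier Cool"
}

# Print (don't run) the command for the first blob of the medium in the log:
rehydrate_cmd hpdparchivelongterm general 531504b0_5ee2f861_04e4_038f_0
```

Remember that rehydration from the Archive tier is not instant -- it can take hours before the blob is readable -- so bring the blobs back to Cool well before you start the restore session, and repeat for each `_1`, `_2`, ... blob if the medium spans several.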

 

On the other hand, if you are reading this blog because you are writing procedures for the future, then go to the Media tab, select "All media" and expand the column until you can see the medium id.

Then you can double-check in the Azure portal that it is in the right tier.