I'll never teach the class, so I need to mention it here:
Take "hello world!", encode it into Baudot-Murray code (so, 60 bits out).
Run it through a suitable rate 1/2 error correcting code (so, 120 bits out). "Suitable" probably means a fairly trivial convolutional code (i.e. constraint length 3 or 4) or some fairly short (length 10?) block code.
Ask 30 multiple-choice questions, worth one point each, whose correct answers are given by the nibbles of your encoded "hello world!".
Follow that with a 70-point question to decode the message encoded in the multiple-choice questions, and allow people to go back and correct their multiple-choice answers if they dare.
Your class will be split between the ones who will love you, and the ones who will hate you...
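For anyone who wants to see the construction concretely, here's a minimal sketch. The ITA2 table entries and the exact generator taps are my assumptions (a standard constraint-length-3, rate-1/2 convolutional code with generators 7 and 5 octal); also note that in real ITA2 the "!" needs a figures-shift code, so I drop it here and encode 11 characters rather than 12.

```python
# Sketch: Baudot (ITA2) encode, then a rate-1/2 convolutional code.
# ITA2 codes below are a partial letters-shift table from memory --
# bit-order conventions vary, so treat them as illustrative.
ITA2_LETTERS = {
    'h': '00101', 'e': '10000', 'l': '01001', 'o': '00011',
    'w': '11001', 'r': '01010', 'd': '10010', ' ': '00100',
}

def baudot_bits(text):
    """Concatenate 5-bit ITA2 codes ('!' would need a FIGS shift, omitted)."""
    return [int(b) for ch in text for b in ITA2_LETTERS[ch]]

def conv_encode_r12(bits):
    """Rate-1/2 convolutional encoder, constraint length 3 (taps 0b111, 0b101).
    No tail flush here, so the output is exactly twice the input length;
    a real decoder would usually want two zero tail bits appended."""
    s1 = s2 = 0          # two memory bits, s1 = most recent
    out = []
    for b in bits:
        out.append(b ^ s1 ^ s2)  # generator 111
        out.append(b ^ s2)       # generator 101
        s1, s2 = b, s1
    return out

msg = baudot_bits('hello world')   # 11 chars -> 55 bits
code = conv_encode_r12(msg)        # 110 bits
nibbles = [code[i:i+4] for i in range(0, len(code), 4)]
print(len(msg), len(code), len(nibbles))  # -> 55 110 28
```

With the full "hello world!" (12 five-bit codes, 60 bits) you'd get the 120 coded bits and exactly 30 nibbles described above.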
never learned Baudot-Murray nor Reed-Solomon. more familiar with Linkwitz-Riley or Sallen-Key or Karplus-Strong.
i know a little about error detection, but only a teeny bit about error correction.
Baudot-Murray is the 5-bit Teletype code that preceded ASCII, so you can encode "hello world!" into 60 bits.
so there are 26 letters and 10 numerals. i remember old typewriters had "l" for "1" and "O" for "0". but that only gets you down to 34. what other symbols did they eliminate to get it down to 32?
It had "letters" and "figures" shift modes (everything was upper-case, so no capitals shift was needed). It's in Wikipedia.
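The shift trick is what gets 26 letters plus numerals and punctuation out of only 32 codes: two of the codes switch the meaning of all the others. A toy decoder, assuming the real ITA2 shift codes (LTRS = 11111, FIGS = 11011) and a two-entry illustrative table:

```python
def decode_ita2(codes):
    """Decode a list of 5-bit code strings, honoring LTRS/FIGS shift state."""
    LTRS, FIGS = '11111', '11011'            # actual ITA2 shift codes
    letters = {'00011': 'O', '01001': 'L'}   # tiny illustrative subset
    figures = {'00011': '9', '01001': ')'}   # same positions, figures shift
    table = letters                          # streams start in letters mode
    out = ''
    for c in codes:
        if c == LTRS:
            table = letters
        elif c == FIGS:
            table = figures
        else:
            out += table[c]
    return out

print(decode_ita2(['01001', '11011', '00011']))  # 'L', shift, then '9' -> L9
```

Same 5-bit pattern, different character, depending on the last shift seen.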
Reminds me of a Dilbert comic strip. Something like
First engineer: "Yeah, when I started programming, we didn't have any sissy icons and windows. All we had were ones and zeroes, and sometimes we didn't even have ones. I once wrote an entire database program using nothing but zeroes."
Second engineer: "You had zeroes? We had to use the letter "O"!"
Somebody astute will be able to post a link to the exact comic.
Being the King of careless errors on a test, I would hate you.