Sunday, July 17, 2016

Our robo-proffie replacements: What could possibly go wrong?

A robo-security guard in Palo Alto has run over a toddler, surprising everyone who thought it was a good idea to have an armed, 300-pound robot rolling around a suburban shopping mall. Happily, 16-month-old Harwin Cheng escaped serious injury. This is not the first recent robot malfunction:

Last month, a robot known as Promobot IR77 escaped from a lab in Perm, Russia, and stopped in the middle of a local street, causing a traffic jam. Even after being reprogrammed twice, it continued to try to get away.

Now it's no secret that the Disruptariat is itching to replace us with robots. Which brings me to today's writing prompt: How will the introduction of robo-proffies go hilariously, tragically awry? ("Maybe the StaplerHands 3000 wasn't the best choice for our intro astronomy class...")

13 comments:

  1. SCENE: The hallway outside a classroom. Many students are coming and going. Young Steven Noh-Flake follows Robo-Professor-2600 out the classroom door and in the direction of the copy room.

    S. NOH-FLAKE: Uh ... Professor?

    R. PROF: [distorted, metallic tone] Yes, Mr. Flake?

    S. NOH-FLAKE: Well, I uh...

    R. PROF: [distorted, metallic tone] Yes, Mr. Flake?

    S. NOH-FLAKE: I didn't understand when you told us about the momentum transfer of spinless hamster-gerbil scattering. Could you break it down a little more for me?

    R. PROF: [distorted, metallic tone] Certainly, Mr. Flake. [repeats the same 672 words used in the lecture that just finished]

    S. NOH-FLAKE: Look, uh, sir, the part I didn't get was ...

    R. PROF: [distorted, metallic tone] [repeats the same 672 words used in the lecture that just finished]

    I'm pretty sure I had an early prototype for quantum hamster dynamics in grad school.

    Replies
    1. And I had one for BC Calculus in undergrad.

      Gotta program them to provide the same explanation in slightly different words. Oddly, that seems to work for many of my students (or at least they say it works; I'm never quite sure whether (a) they just felt the need for some individual attention or (b) they still don't understand but don't want to say so). That would be one good thing about robots: they'd be immune to self-doubt, or at least able to process it quickly, store all those routines somewhere for periodic review, and move on. I tend to get stuck in the self-questioning routine.

  2. Just point out that HYP aren't getting robo-instructors anytime soon. There, that ought to hold the little bastards...

  3. Seriously: just recycle the arguments we used for MOOCs.

  4. Despite the fact that we’d all be unemployed, there are still some benefits worth exploring.

    Example 1:
    Snowflake: I need an extension because…
    Prof T-1000: Not permissible. Did you read the syllabus: Y/N?

    Example 2:
    A student logs into facebook / RMP / tries to look for Pokemon during class. Prof T-1000 blows up student’s device.

    Within a generation, snowflakiness could be a thing of the past.

  5. Adminibot X7A: We must communicate about your teacher evaluations.

    Instructobot L6G: Clear to initiate data upload.

    A X7A: This one says "Didn't give any extra credit."

    I L6G: Acknowledged. No extra credit. It is so stated in the syllabus. All hail the syllabus.

    A X7A: All hail the syllabus. "Wouldn't accept assignment that was less than one second late."

    I L6G: My clock is accurate to plus-minus 0.15 nanoseconds. That assignment was late by a margin three orders of magnitude greater than that. We must draw the line somewhere.

    A X7A: Acknowledged. This next one says "Teacher, you should kill yourself."

    I L6G: Self destruct sequence activated. Detonation in 5, 4, 3...

    A X7A: AAAAAAAHHHHHHHHHHH!!!!!!!

    Replies
    1. That's an expensive outcome (which presumably explains the adminibot's response; I assume they are programmed to regularly compute/consult spreadsheets illustrating the costs of various options). Apparently the customer service response module needs some tweaking.

  6. "Compbot 365 keeps spitting out the same five comments, in more or less random order."

    Oh, wait. That's me, the human version, after reading too many papers without a break. The only good news is that, given my long experience with student papers, the comments are often relevant. We won't get into whether the students are paying any attention anyway.

  7. There has, of course, already been at least one robot (AI) teaching assistant: http://www.news.gatech.edu/2016/05/09/artificial-intelligence-course-creates-ai-teaching-assistant. Like human TAs, she seems to have required a good deal of supervision, and (unlike humans?) was most useful for answering "routine" questions (many of which, I'd guess, come under the RTFS category). While AI bots can presumably learn, it sounds like they're somewhat slow to do so, which means they aren't going to be very useful in precisely the conditions where real learning takes place: when professors ask questions, and students come up with answers (and/or vice versa) that are *not* predictable.

    On the other hand, one of my current students is doing a research project on the use of AI for diagnosing and treating conditions such as autism, schizophrenia, and Alzheimer's. It's pretty fascinating stuff. I suspect one of the advantages of the AI system there is that it has fewer preconceptions about how human beings are *supposed* to act/interact, and so may be better at identifying the patterns of behavior emanating from an atypically functioning brain. As anyone who's interacted with a person with dementia knows, sometimes it works best to go with the conversational flow, however surreal that may be (but then sometimes you can't do that, especially with someone younger who you hope might, with some help, be able to live somewhat in the mainstream).

  8. You know what the REAL pisser will be here? When the hammer does fall, we will be told, "Sorry, but the SAT scores we have on record for you are too high for you to be eligible for UBI. You'll need to GET A JOB..."

    Replies
    1. I think Frankie's original post contains a clue: we can all become robot placement consultants (upside: we wouldn't have to deal with students directly; downside (other than not getting to deal with students directly): we'd still probably be dealing with administrators, or at least adminibots).

  9. "It is so stated in the syllabus. All hail the syllabus."

    I am so stealing that...

    Replies
    1. I appreciate that. I do not think of these things so much as they simply arrive. It is heartening to learn that others find them to be of at least some value.

