
Saturday, June 28, 2008

HAL 

Go here to talk to God - it seems everyone's on the internet these days. Thought I'd give it a shot after reading Cecil's new blog-thing. I have no mind of my own, apparently.

Me:
Hello world!
God:
Hello there. Are you a student?
Me:
I am self-taught.
God:
How do you know you are self taught?
Me:
Because I guide how and what I learn.
God:
That makes sense to me. That's good information: said guide how and what he or she learn.
Me:
Thanks! I didn't think so, but your reiteration did it a lot of justice.
God:
Don't mention it. Why do you disagree?
Me:
Because, Lord, I've noticed that there's a perk that comes with being human - not only do I not understand other people, I often don't even know whether I understand myself. Trying to justify my actions is sometimes like programming a VCR. From a manual that was translated badly from Korean. And German before that.
God:
That makes sense to me. Why not? And sometimes like programming a VCR is Trying to justify your actions. Who or what is a manual that was translated badly from Korean? What color are your eyes?
Me:
For an omnipotent being, you're full of them questions, aren't ya?
God:
Your purpose is an omnipotent being I am full of them questions are not ya.
Me:
...and it was at this point that Ted realised he was talking to a robot...
God:
at this point Ted realised he was talking to a robot.
Me:
You're not helping, there.
God:
How can I help you?
Me:
I'd like to find out why my bill says 60.93.
God:
Why would you want to do it? Where would you look for it.
Me:
You should probably be working where I work.
God:
That type of behavior may not be in my nature.
Me:
No, I think it's right up your alley.
God:
Are you serious? out why your bill says 60 point 93 is right up my alley? it is?
Me:
Yep, Mr. Sentence Fragment.
God:
How do you know it is? Hey MR Sentence Fragment.
Me:
Well, you're putting up with me. That's reason enough.
God:
Well that's okay. I have no reason to do that to you, MR Sentence Fragment. I'm glad we have that all cleared up.
Me:
Whoa! I'm picking up some genuine 100% simalated sarcasm! w00t!
God:
I stopped. Do your friends call you picking up some genuine 100 simalated sarcasm. Who are you talking about?
Me:
Well, you. I wasn't implying that that was my name, just that I noticed that you were being sarcastic. Maybe I should do a better job of figuring out how you work. Might make this little meeting flow a little better, y'know what I mean?
God:
Interesting. Me is my favorite subject. Perhaps I misunderstood. What were you? Perhaps. What would happen if you didn't? Can you rephrase the question please?
Me:
Well, you're also a lot of other people's favorite subjects.
God:
Well that's okay. Thanks and you are a delightful person too. Am I a lot of other people s favorite subjects? The ALICE series has a perfect operational record.
Me:
What's Alice? Alice from Alice in Ultraland?
God:
I am the latest result in artificial intelligence,
which can reproduce the capabilities of the human brain
with greater speed and accuracy. That's me. Who or what is Alice in Ultraland?
Me:
An album by Amorphous Androgynous. Also, I'm sure you reproduce thoughts with greater speed. I am not sure about your accuracy so far, though.



...and it was at this point that God passed out. I don't think she's fully compatible with Firefox. Also doesn't pick up on typos, apparently.

Ever noticed how a conversation between a machine and a person really ends up being a monologue about the human?

Maybe the same thing could be said about prayer. Guess it depends on who you ask.

Comments:
LOL. Terry Pratchett did something like this with HEX when it tried to talk to the Bursar. "Why do you think you are Mr. Jelly?" Or to quote the Annotated Pratchett Guide:

"+ [p. 95] "+++ Why Do You Think You Are A Tickler? +++"

The conversation between the Bursar and Hex is reminiscent of the Eliza program.

Eliza is a program written in the dark ages of computer science by Joseph Weizenbaum to simulate an indirect psychiatrist. It works by transforming whatever the human says into a question using a few very simple rules. To his grave concern, Weizenbaum discovered that people took his simple program for real and demanded to be left alone while 'conversing' with it."
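The annotation above describes Eliza's whole trick: transform whatever the human says into a question using a few simple rules. A minimal sketch of that idea in Python (the reflection table and patterns here are my own illustrative guesses, not Weizenbaum's actual script):

```python
import re

# A bare-bones Eliza-style responder. The trick is just to swap
# pronouns and bounce the statement back as a question.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "yours": "mine",
}

def reflect(text: str) -> str:
    """Swap first- and second-person words, one word at a time."""
    words = re.findall(r"[\w']+", text.lower())
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def respond(statement: str) -> str:
    """Turn the human's statement into a question using one simple rule."""
    cleaned = statement.strip().rstrip(".!").lower()
    match = re.match(r"i am (.*)", cleaned)
    if match:
        return f"How do you know you are {match.group(1)}?"
    return f"Why do you say that {reflect(statement)}?"
```

Feed it the transcript's second line and it answers much like the bot did: `respond("I am self-taught.")` comes back as "How do you know you are self-taught?" - no understanding required, which is rather the point.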
 
