xu do sisku lo lojbo tcana ("Are you looking for a Lojban station?")




Can someone translate Asimov's Four Laws of Robotics into Lojban for me? I think that it would really help me with some of the nuances of the grammar.

Thank you.

>> No.964  


That includes the Zeroth Law, by the way.

>> No.965  

Just the Zeroth Law, for now:
nomai lo remymi'i ka'eku na ga selxaigau loi remna gi teri'a lonu cando cu curmi lonu loi remna cu selxai

I chose remymi'i ("human-ish machine(s)") as a translation for "robot", because the robots in Asimov's stories are androids, closely resembling humans. There are lots of possible Lojban words for different types of robots. The most general one may be zmimi'i, "automatic machine(s)".

A question for any experts reading this: is there a better way to get ka'e to apply to both halves of the conjunction?

I used "loi remna" to mean humanity. Perhaps I should say "piro loi remna", "all of the mass of humans", but the shorter version also forces the robots to prevent harm to parts of humanity. That may hurt the accuracy of the translation, though, since it effectively merges the First and Zeroth Laws. Another question for the experts: is "lo'i remna" a better translation in this case?

"teri'a X" means "X is a situation allowing something to physically cause something else."

"lonu cando" means "an event of something being idle", and I left it up to the reader to infer that it is the robot's idleness, although I could easily have been more specific. (I'm assuming the English is a description of the far less ambiguous laws programmed into the robots.)
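For anyone following along word by word, here is a rough gloss of the Zeroth Law translation above, sketched as a Python table. The glosses are my own informal readings, not dictionary definitions:

```python
# Rough word-by-word gloss of the Zeroth Law translation.
# These are informal readings, not authoritative definitions.
gloss = {
    "nomai": "zeroth(ly)",                          # no (zero) + -mai (ordinal)
    "lo remymi'i": "a robot (human-ish machine)",
    "ka'eku": "is innately capable (as a tag)",
    "na": "it is not the case that",
    "ga ... gi": "either ... or",
    "selxaigau": "causes to be harmed",
    "loi remna": "the mass of humans (humanity)",
    "teri'a": "under conditions physically causing",
    "lonu cando": "an event of (something) being idle",
    "cu curmi": "allows",
    "cu selxai": "is harmed",
}

for lojban, english in gloss.items():
    print(f"{lojban:14} {english}")
```

Reading it off the table: "Zeroth: it is not the case that a robot can either harm humanity, or, through idleness, allow humanity to be harmed."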

>> No.974  


I think that a robot bound by the four laws in question could hurt a human if, for example, that human were planning to A-bomb everything. The Zeroth Law, in my eyes, protects humanity as a larger whole. It protects the greater good.

This could create a paradox, though, if someone commanded a robot to kill a human who was going to kill all the others. That would possibly conflict with the First Law, except that the First Law is subservient to the Zeroth.
