Funnily enough, AI was my specialisation in Computer Science. I like your perspective here, Stan.
Genetic algorithms are nice to use when you are actually unable to encode the exact rules into your agent because they are so complex. So one basically just programs a bunch of possible behaviours and leaves their parameters open to the GA forces of selection, mutation and crossover. It's not so much that penalties happen; it's more that maladaptations get pruned out more and more over time.
This path would mean, however, that the 'moral rules' would be evolved and soft-coded, not pre-designed.
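To make that loop concrete, here's a minimal sketch of what I mean. The behaviour parameters ("cooperate_bias", "punish_threshold") and the toy fitness function are purely illustrative assumptions, not anything from a real agent; the point is just that nobody writes the final rule values by hand, they're exposed to selection, mutation and crossover:

```python
import random

# Hypothetical behaviour parameters the GA tunes; names are made up for illustration.
def fitness(genome):
    cooperate_bias, punish_threshold = genome
    # Toy stand-in for "how well the evolved rules perform": favour cooperation,
    # but only lightly penalise deviation from a middling punishment threshold.
    return cooperate_bias * 2.0 - abs(punish_threshold - 0.5)

def crossover(a, b):
    # Single-point crossover: the child takes some parameters from each parent.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(genome, rate=0.1, scale=0.05):
    # Small random perturbations; maladapted values get selected away over generations.
    return [g + random.gauss(0, scale) if random.random() < rate else g for g in genome]

def evolve(pop_size=50, generations=100):
    population = [[random.random(), random.random()] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=fitness, reverse=True)
        survivors = scored[: pop_size // 2]  # selection: keep the fitter half
        children = [
            mutate(crossover(random.choice(survivors), random.choice(survivors)))
            for _ in range(pop_size - len(survivors))
        ]
        population = survivors + children
    return max(population, key=fitness)

if __name__ == "__main__":
    print("evolved parameters:", evolve())
```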
RE: Origin of Morals, Need
Perhaps. I'm just saying that in all 17 of the autonomous air, ground, sea, and space systems I've designed, there was some mix of hard-coded, non-negotiable rules (like "don't point your antenna away from Earth") and other code designed to deal with uncertainties that couldn't be preprogrammed. My only point was to suggest we shouldn't be dogmatic about what a Creator would or wouldn't do.
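In rough terms, that split looks something like this toy sketch; the names and limits are invented for illustration and are not from any actual flight software:

```python
# Assumed hard limit on antenna off-pointing; the exact number is made up.
EARTH_POINTING_LIMIT_DEG = 60.0

def adaptive_pointing(sensor_estimate_deg):
    # Placeholder for the adaptive part that copes with uncertainty
    # (learned, heuristic, or evolved -- whatever couldn't be preprogrammed).
    return sensor_estimate_deg * 0.9

def command_antenna(sensor_estimate_deg):
    target = adaptive_pointing(sensor_estimate_deg)
    # Hard-coded, non-negotiable rule: never point away from Earth,
    # no matter what the adaptive layer proposes.
    if abs(target) > EARTH_POINTING_LIMIT_DEG:
        target = max(-EARTH_POINTING_LIMIT_DEG,
                     min(EARTH_POINTING_LIMIT_DEG, target))
    return target
```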
If my Mars Rover could really think, it would probably be asking, "Why did my creator put me in such a hostile environment?" I'd feel bad if my creation rejected me just because it couldn't figure out what I'm up to and evolved a set of morals that caused it to rebel. Come to think of it, that might be the real reason why Viking 1 permanently cut off all communication by pointing its antenna somewhere else in a huff. :)