Cool post @doitvoluntarily, it's good to see drones being used in this way.
The US Army recently announced it may be developing unmanned aerial drones designed to "automatically Detect, Recognize, Classify, Identify (DRCI) and target personnel and ground platforms or other targets of interest." Other countries will likely research and develop similar technology. This article raises a few interesting points:
Remote operation of drones can still cause psychological trauma to human operators. AI-operated drones may mitigate that trauma, but they may also separate humanity from the decision to kill.

Responsibility for the decision-making algorithms and risk parameters (e.g. the number or rate of acceptable civilian deaths) that shape drone behavior and learning processes will fall on AI scientists and on political, military, and industrial leaders.

Recent incidents involving self-driving cars highlight some of the risks of deploying AI in the real world. Imagine how much higher the stakes become with AI-operated armed drones.