Autonomous weapons - should we go down this path
Super Nova
Moderator
Autonomous weapons - should we go down this path
May 3rd, 2021 at 7:48pm
 
So autonomous weapons are now a reality. Once we create a technology, it is only a matter of time before someone uses it, and then everyone will use it.

Personally I think the genie is out of the bottle, and this will change the whole world.

Robocop, Terminator, Black Mirror here we come.

If you haven't seen Black Mirror on Netflix, I recommend it.

The French could be first, see article below.

French military wants robodogs to round up the enemy

The French armed forces should be authorised to use autonomous weapons, known as killer robots, in certain strict conditions, a defence ministry report has advised.

The Defence Ethics Committee, a military and civilian body, set out one of the most elaborate moral cases by a western state for the artificial intelligence systems that are expected to transform warfare.

https://www.thetimes.co.uk/article/french-military-right-use-killer-robots-warfa...

From Black Mirror.

...
 
Bobby.
Gold Member
Re: Autonomous weapons - should we go down this path
Reply #1 - May 3rd, 2021 at 10:39pm
 
Subscriber-only content - please copy and paste it all.
 
Super Nova
Moderator
Re: Autonomous weapons - should we go down this path
Reply #2 - May 3rd, 2021 at 10:48pm
 
From the article.

The French armed forces should be authorised to use autonomous weapons, known as killer robots, in certain strict conditions, a defence ministry report has advised.

The Defence Ethics Committee, a military and civilian body, set out one of the most elaborate moral cases by a western state for the artificial intelligence systems that are expected to transform warfare.

The report, ordered by Florence Parly, the defence minister, came as debate arose over the French army’s use in battle exercises of Spot, a robot dog used by some US police forces.

The robot produced by Boston Dynamics was used by cadet officers at the Saint-Cyr military college in simulated street fighting. Over the past decade about 30 countries and aid groups have been campaigning for a global ban on the weapons as morally repugnant.

The western powers, including the US, Britain and France, have refused to renounce their development because the Chinese and Russians are racing to put them into service — and terrorists are not far behind. Armed drones already operate under human supervision. The French committee called for a ban on independent systems “programmed to be able to change their rules of operation”. It said, however, that French forces could use “partially autonomous” killing systems to identify and engage with targets while keeping human operators informed. The controllers could stop the devices.

The report said that automation would become essential to cope with the speed of future warfare as missiles were approaching at five times the speed of sound. The French analysis echoes British statements that weapons with autonomous functions must have human oversight. The UK used an early version of autonomy in Libya in 2011. Tornado jets fired Brimstone “fire-and-forget” missiles that chose their own targets and destroyed military vehicles. Experts say that the fine distinction between partial and full autonomy may become blurred in battle. Armed drones can already fire at targets without an operator’s order. Robot naval vessels and tanks are being developed.

France was behind the creation last week of a £6.8 billion European Union fund to increase research in military technology. Thierry Breton, the French commissioner for the internal market, said: “We must increasingly be able to take our security into our own hands.”

Boston Dynamics said that it had not been warned that the 31kg Spot would be used in a battlefield situation. “We do not want any customer using the robot to harm people,” Michael Perry, the company’s vice-president, told The Verge website. “This forward deployment model . . . is something that we need to better understand to determine whether it is actively being used to harm people.”
 
Bobby.
Gold Member
Re: Autonomous weapons - should we go down this path
Reply #3 - May 3rd, 2021 at 10:56pm
 
Thanks SN,
humans are so good at the technology of killing each other.   Cheesy
 
Super Nova
Moderator
Re: Autonomous weapons - should we go down this path
Reply #4 - May 4th, 2021 at 3:16pm
 
Bobby. wrote on May 3rd, 2021 at 10:56pm:
Thanks SN,
humans are so good at the technology of killing each other.   Cheesy


Should we be letting machines that are programmed and cannot think (AI is not thinking) run around autonomously killing people?

Who will be responsible in the event of innocent people being killed?
- Is it the programmer who created a bug?
- Is it the person who set ambiguous directives?
- Is it the operator?
- What will be the legal position?

What will happen when a terrorist plants a few of these in a crowd, and they activate and commit slaughter on a wide scale?

I see some serious issues for humanity in the future.
 
Bobby.
Gold Member
Re: Autonomous weapons - should we go down this path
Reply #5 - May 4th, 2021 at 3:27pm
 
Hi SN - I agree with you.

Also, weapons manufacturers have zero conscience and
zero morality.
That's why there are more guns in America than people.
If there's a buck to be made, they will sell you a gun.

If it's a gun attached to a robot, then they make even more money.

...
 
John Smith
Gold Member
Re: Autonomous weapons - should we go down this path
Reply #6 - May 4th, 2021 at 4:32pm
 
I think the question 'should we?' is irrelevant ... the warmongers will use it anyway
 
Super Nova
Moderator
Re: Autonomous weapons - should we go down this path
Reply #7 - May 4th, 2021 at 5:25pm
 
John Smith wrote on May 4th, 2021 at 4:32pm:
I think the question 'should we?' is irrelevant ... the warmongers will use it anyway


I agree. Once we start down this road, it is difficult for me to see this ending well for humanity.

I think we will now need to develop countermeasures, so the race begins.

We need a localised EMP device to kill them dead. Let the race between weapon and defence begin.
 
Bobby.
Gold Member
Re: Autonomous weapons - should we go down this path
Reply #8 - May 4th, 2021 at 8:22pm
 
Super Nova wrote on May 4th, 2021 at 5:25pm:
John Smith wrote on May 4th, 2021 at 4:32pm:
I think the question 'should we?' is irrelevant ... the warmongers will use it anyway


I agree. Once we start down this road, it is difficult for me to see this ending well for humanity.

I think we will now need to develop countermeasures, so the race begins.

We need a localised EMP device to kill them dead. Let the race between weapon and defence begin.



What we need are arms or troop reduction treaties.
Let's stop it before it goes too far.

https://www.ozpolitic.com/forum/YaBB.pl?num=1619516298/0#0
 
issuevoter
Gold Member
Re: Autonomous weapons - should we go down this path
Reply #9 - May 5th, 2021 at 5:21pm
 
Super Nova wrote on May 4th, 2021 at 3:16pm:
Bobby. wrote on May 3rd, 2021 at 10:56pm:
Thanks SN,
humans are so good at the technology of killing each other.   Cheesy


Should we be letting machines that are programmed and cannot think (AI is not thinking) run around autonomously killing people?

Who will be responsible in the event of innocent people being killed?
- Is it the programmer who created a bug?
- Is it the person who set ambiguous directives?
- Is it the operator?
- What will be the legal position?

What will happen when a terrorist plants a few of these in a crowd, and they activate and commit slaughter on a wide scale?

I see some serious issues for humanity in the future.


First things first: an agreement on the conduct of warfare. This has been tried with limited success. In modern history, the most civilised of nations have made declarations of war before they instigated hostilities. This "fair warning" seemed to end with WW2.

Innocent people are going to get killed in war, whether by robots or military personnel. A remote-controlled drone is no different morally from a long-range artillery shell or rocket. What is immoral is sending troops to fight an enemy who does not wear any clear insignia, and then charging the troops as criminals when something goes wrong.

It is also immoral to lie to the public about the nature of the enemy, whether by the Government or the media.

 