OpenAI wants the money and the military wants to never have to deal with accountability. That way, when they bomb wherever they want, they can just say "it wasn't my decision, it was the AI," and then OpenAI can say "we need more money to make it more reliable. Also, we need more training data from the military so it won't happen again, can we have it all?"