
🟢 Is the war in Gaza already changing what we can accept from AI?
While recently exploring this notion of 80%-OK vs. 99%-OK problems, I dug into how the current wars in Ukraine and Gaza are changing AI. For instance, how much leeway do you give to errors when a drone has to identify and lock on to a specific target?