Deep neural networks have learned to solve a wide range of problems, from identifying objects in images to mastering the game of Go. As these tasks grow more complex, the solutions that neural networks learn become increasingly opaque. The systems are so complicated that even the engineers who design them may be unable to explain why a network issued a particular instruction, and there is no mechanism for forcing a network to justify each of its decisions. This is the well-known "black box" problem in artificial intelligence. As neural networks see growing use in the real world, research on this issue has become critically important. MIT Technology Review addressed it in a feature article, "The Dark Secret at the Heart of AI," exploring the unexplainability of neural networks.

[Figure: cover of the MIT Technology Review feature on the "black box" of artificial intelligence]

As a pioneer in artificial intelligence research, DeepMind is also paying attention to the "black box" problem, and the team is working on developing more tools to explain artificial intelligence systems. In a paper published on June 26th, DeepMind proposed a new method, based on cognitive psychology, for studying deep neural networks.