Journal

Publication Year: 2023

 
Paper Information
Title: [Vol.18, No.2] A Study on the Artificial Intelligence Act and the Basic Principles of Explainable Artificial Intelligence (XAI)
Author: Jae Kwon Bae
Abstract: XAI (eXplainable Artificial Intelligence) refers to technology that explains how an artificial intelligence system arrives at its results, so that users can understand and correctly interpret the system's operation and final output. The black-box nature of artificial intelligence increases opacity and complexity, and XAI was proposed as a way to compensate for this limitation. It is now necessary to clearly understand the content and effects of artificial intelligence technology and to establish an XAI regime grounded in concrete and effective norms, rather than an artificial intelligence industry that relies on self-regulation alone. Likewise, a unified legal framework combining a basic artificial intelligence act with related artificial intelligence legislation is needed to protect the soundness of the artificial intelligence market and the public interest. Deep learning and reinforcement learning, which have recently come into active use across many industries, can produce judgments that are difficult to interpret; because consumers (users) cannot understand the conclusions of these models, a lack of trust results. The ultimate goal of XAI is trustworthy artificial intelligence, which requires that the algorithm be understandable and the output be explainable. Based on this discussion, and drawing on the Artificial Intelligence White Paper and the enactment of the Artificial Intelligence Act, this study proposes five basic principles of XAI: transparency, accountability, fairness, reliability, and result demonstrability. The basic principles of XAI presented in this study are expected to contribute to the establishment of human-centered artificial intelligence and a trustworthy artificial intelligence ecosystem.
Attached Paper:
   18-2-17.pdf (450.1K) DATE: 2023-05-04 16:25:07