UNESCO Draft Recommendations on AI Ethics Include Six Bulgarian Amendments

Sofia, November 9 (BTA) - Six Bulgarian amendments have been included in the final draft of a code of ethics for artificial intelligence which the UNESCO General Conference is to adopt at its 41st Session (November 9-24), BTA learnt from Mariana Todorova, a Doctor of Futures Studies at the Bulgarian Academy of Sciences' Institute of Philosophy and Social Sciences, who took part in drafting the document.


Member States will be recommended to apply the code of ethics when addressing matters related to artificial intelligence. Four key values are set down, including respect for human dignity, rights and freedoms, Todorova said. The draft focuses on education, science, the environment and culture and, expressly at the insistence of Bulgaria and several other countries, includes the labour market as well.


Regarding solutions for people at risk of being replaced entirely by AI, Todorova said some could be reskilled and others trained to cooperate with it. At some stage, accountants, office managers and security guards could be replaced entirely, while lawyers and doctors would have to learn to cooperate with AI, she added. As an example, the expert cited the already existing due diligence platforms.


The recommendation that consumers should be made aware whether a product or service they receive is AI-generated or AI-assisted is also Bulgarian, and Todorova considers it one of the best, given that algorithms can profile users. In such cases, for example AI-based bank lending platforms, the final decision should rest with the consumer.


The second important Bulgarian proposal was that monitoring using so-called intelligent wearables or implantables (implanted devices, such as identification chips or chips that control technological devices) should require the informed consent of the patient or their family. Todorova explained that AI will most probably have a very significant impact on medicine.


AI will also be used in support of education, but in a manner guaranteed not to diminish cognitive capacity or extract sensitive information. This refers to AI-based platforms with facial recognition that can tell whether a student perceives the information adequately, understands it or experiences difficulties. This could both support an individual approach in education and prove a problem, because the personal data involved could be used for discrimination, and even more personal information could be collected on top of that, Todorova commented.


The expert further explained that AI platforms that "take" dictation and "translate" from as many as 180 languages are incapable of comprehending sarcasm or humour, although they perform well at an operational level. If people stop learning languages, writing by hand or composing music, this would impair their development and cognitive capacity, Todorova observed.


According to the fourth Bulgarian amendment, all countries and regions should have equal access to data, because AI evolves on the basis of processing large amounts of data. Unequal access to data would put some countries at a disadvantage compared to others, the expert pointed out.


The fifth Bulgarian contribution is a call to guard against the monopolization of certain services and products, as this would give some states or corporations exclusive competitive advantage and power, Todorova said.


Finally, states should also monitor the influence of artificial intelligence when used by media service providers.


According to Todorova, the future will see a need for regulation, and this will probably apply on a regional scale. It will hardly be possible to guarantee that such regulation is observed globally, she added, citing as an example China, which has been implementing a social credit system for a year now. In her opinion, an AI regulatory framework will probably originate from the EU, but not all countries, including the US and Russia, would want to comply with it. "Now is the time for a global dialogue about what is admissible and what is not," she noted. DT/BR/LG
