
Germany shows how government should steer AI ethics


Governments are beginning to get involved in IT ethics. This is not necessarily a good thing.

People tend to think that if they follow the letter of the law, they are being ethical. Following the law is certainly one requirement of behaving ethically, but laws are often slow to change, and we must all keep thinking about what the right thing to do actually is.

Canada’s Ethics Commissioner Mary Dawson is looking into the ethics of Finance Minister Bill Morneau. She has already said that he met the letter of the law even though he did not place his assets in a blind trust, so to her that matter is settled. I am glad to hear that she is continuing to look into a possible conflict of interest regarding pension legislation. To her credit, she is also asking that the law on blind trusts be strengthened. She must have a wider definition of ethics than simply following the law.

The definition of ethics I like to use is the one from the Markkula Center for Applied Ethics, because it says explicitly that being ethical “is not the same as following the law.” It points to laws that were not ethical, such as the old apartheid laws of South Africa. Laws that do not take ethics far enough are a similar worry. And by extension, the absence of a law covering a situation does not mean it is OK to do whatever you want.

Currently, Canada does not have any rules governing the ethics that programmers must follow when creating artificial intelligence. Germany has just addressed this gap.

Germany formed an Ethics Commission on Automated Driving. In its report, the body of experts headed by former Federal Constitutional Court judge Professor Udo Di Fabio developed guidelines for the programming of automated driving systems. They include rules such as:

- the protection of human life takes precedence over damage to animals or property;
- in unavoidable accident situations, any distinction between people based on personal features such as age, gender, or physical or mental constitution is prohibited;
- it must always be clear, and recorded, whether the human driver or the automated system is in control, so that responsibility can be assigned.

Germany’s cabinet has adopted the guidelines, making it the first government in the world to do so.

The moral foundation of the report is simple: because self-driving vehicles will cause fewer human deaths and injuries, and because governments have a duty of care for their citizens, there is a moral imperative to use such systems.

The other thing I like about the report is that it describes the need for the public to understand the principles upon which autonomous vehicles operate, including the rationale behind those principles. The commission wants this taught in schools. Hopefully, other countries will follow Germany’s lead and a consensus will emerge about what the public should be told about AI rules.

IT professionals should be helping to ensure that the public is protected by measures like this, and that computer code is not just a black box that may produce unforeseen results.
