Citizens should be able to participate in development to determine what the priorities should be and how values should be weighed against each other.

Video focuses on post-hoc contestation, but ex-ante is also important.

It is not the citizen's responsibility to solve all issues with fair AI. It is the city's.

Dislikes contestability as a concept because it implies parties being set against each other, whereas participation implies a dialogue.

Would prefer to see participation as the umbrella term under which contestability falls, with contestation treated as something to be avoided if at all possible.

Key to resolving issues is often making an effort to understand values or problems that inform citizens' statements.

There are different forms of participation. They have experience with a personal data commission and with interest groups, but not with direct participation by citizens.

Personal data commission is independent but does not represent views of all citizens.

Citizen participation struggles with lack of knowledge on part of citizens. Would be great if they get an independent expert that can support them.

Because of recent events (SyRI verdict, benefits scandal), citizens, interest groups and client councils are distrustful of government use of AI.

Their own experience with a client council is that, despite efforts to explain things to its members, the council's advice does not reflect a real understanding of the issue.

For a next project they would try to make a combination of an independent external commission of experts, and a council of citizens.

Lack of trust in government and algorithms makes participation challenging currently.

Next time they would make a description that includes context, aim, legal responsibility, product idea, values and rights that are promoted, values and rights that are "pressured". And ask committee for their perspective, if they would make the same trade-off, or negotiate a different one.

They would want to pay more attention to ensuring a representative sample.

They would want to ensure participants are adequately compensated for their time.

National and local policy frameworks can conflict with each other. The local executive, which commissions a system, is democratically elected. It is possible that an additional layer of participation introduces yet another conflicting viewpoint.

Accountability for choices made happens through algorithm register and privacy statement. 

Information should have layers of abstraction and detail. 

Contact mechanisms should be present for people who want to know more.

Evaluation of the account of design choices happens with personal data commission and alderperson but could also include citizens directly. But for direct participants the issue is again how to get past the suspicion and fear: "I just think what a challenge it is to have a substantive conversation and how do you arrive at that substantive conversation."

Evaluations and participation at scale are costly, and are subject to a cost-benefit calculation as well.

According to one definition of algorithms that the city employs there could be as many as 3000 systems. Arranging evaluations and participation for all would be way too costly.

When systems are purchased those doing the negotiation need to understand things in order to enforce accountability.

Purchased systems sometimes lack transparency. The city often lacks control over external development processes.

Pilots are "in production" but not in the final maintenance phase. Pilots require a go/no-go before going into permanent use. Scrutiny and monitoring during pilot is more intense than regular systems.

Citizens can be concerned about potential future harms. Responses from the city can be about the likelihood of such harms occurring (if at all) and ways they mitigate them. Citizens may not be satisfied by such responses because of lack of trust or lack of understanding or both.

"For example, the concern may be that the city will only use it in neighborhoods where people are very vulnerable financially. Then you can understand as a citizen, for example, if the city says that we are not going to do that here, we use it municipality-wide, for example. You can understand that as a citizen. But if, for example, citizens think of oh yes, but it is all a black box, how do I understand that. Then I can explain, well we chose an algorithm with this name and then you can trace all the way from […] how it came about. In addition, these are the features used, we can also have conversations about that. But that is already very technical, so to speak, so at that point they already give up quite a bit."

Texts explaining systems need to be written in an accessible manner.

They develop frequently asked questions.

For other questions they receive emails and simply answer them.

Contestations can be about the design, or about individual decisions.

The latter must be monitored for changes that should be made to the system's design. They monitor complaints, objections and appeals, and the contact center, and perform qualitative and quantitative analyses of these.

It can be necessary to build custom ways for citizens to appeal decisions.