4 Comments
Dec 1, 2023 · Liked by Helen Beetham

Thanks for the kind words, Helen, and indeed agreeing to come on the podcast!

There's much to be cautious and sad about in what you share above, but this made me laugh: "Maybe with AI interviewing AI, the loop will eventually spit out the most suitable AI for the job."

Hiring is entirely broken, and I can only hope that recent moves to drop a) university degrees as a proxy for 'the right sort of candidate', and b) CVs as the first step of the hiring process, bear fruit. I somewhat doubt they will, though, and in fact what's beginning to replace both is probably worse.

Thanks for continuing to write so eloquently and provide so many links! One post from you adds to my reading list significantly 😅


The idea of one suite of AI software interviewing another set for a job made me laugh, but I guess it's just another step along the road to the promised land of appointing candidates who are so unexceptionable that no-one even needs to know they are there.


The Biobank thing is interesting (and not just because my data's in there too!). I agreed to participate on the same understanding as you, expecting that the game was being played with a straight bat. It's an asymmetrical game.

I suspect that, for many people, the idea of corporations gaining from and profiting by their personal data would be greeted with a shrug. For a percentage of us, this all happens in a seemingly benign environment. If the politics of the last few years tells us anything, it's that the veneer of civil society can be eggshell thin, and the guardrails that prevent truly awful, personally impactful things happening with that data may vanish, leaving us with no protection. We act as if that's just not a possibility.

Of course, for a large number of the disadvantaged and marginalised in our society, that is already their reality.

Author

You're right, Chris, and part of the asymmetry is that corporations can always change the rules - or if they can't, they can absorb any slap-downs that might follow. But you're also right that this isn't a great example. If it were the worst thing to happen to me and my data (it isn't, btw), a shrug might be the best response. The problem is - and perhaps I should be much more explicit and less flippant when I discuss this - that 'awful, personally impactful things' are happening to people all the time, through systems of exactly the kind being integrated into the UK public sector. They just tend not to be the kind of people who work in higher education. They are people who find themselves on the wrong side of the law, or the militarised border, or the occupied zone, or the risk assessment, or the credit rating, or the insurance claim. Worrying about how my data is being misused may be a luxury, but worrying about how vulnerable people's data is being misused is politically necessary, right?

Perhaps the closest we have come to seeing 'our people' become 'those people' is the news that the DfE is monitoring the social media accounts of educators who are critical of government policy. It's difficult to imagine there are enough civil servants left to do this manually, so hello, algorithms [waving]!
