Unlocking the value, and managing the risks, of AI requires input from across an organisation. HR plays a key role. Here are some suggestions on what HR needs to know about AI:
- Make sure HR has a seat at the table: if your people hold the key to your organisation getting the most out of its use of AI, you, as an HR leader, must be at the heart of the strategic discussions. Your strategy should demonstrate an appreciation of the importance of good governance, ethics and transparency in your deployment of AI. Whilst a good AI strategy will need input from all areas of the business, including procurement and operations as well as your technology and legal teams, HR must be involved in the conversation.
- Employee engagement is vital: employees may have legitimate concerns; would you engage with a piece of tech if you thought it was out for your job? You can deploy the most sophisticated, most expensive tech available, but if your workforce won’t engage, the rewards won’t flow. Make sure your employees understand your organisation’s strategy. Explain what you’re aiming to achieve from your use of AI and demonstrate the benefits. Engage with people individually on how AI could enhance their day-to-day tasks. Their suggestions are likely to deliver genuine time and cost savings, as they know their roles best.
- Employees at all levels may need AI training: whilst some AI tools are intuitive, your people may need training to get the best out of the tech – how to prompt generative AI tools effectively is a good example – and to govern it effectively, so make sure training at all levels of ability is available.
- AI can result in skills loss: just because a task can be done by AI doesn’t mean it should be. Whenever AI is deployed, take the time to analyse the human activities the tool will be replacing – are you losing valuable skills as a result? If so, what steps can you take to mitigate that?
- AI risks discrimination: with certain roles attracting hundreds of applications, AI tools abound to help make recruitment processes easier. But beware of inadvertent discrimination. One-way video interviews are not uncommon and can involve analysis of facial expressions; this can leave you exposed to potential disability discrimination claims if the software hasn’t been programmed to take account of certain disabilities. Equally, sifting CVs by algorithm can present discrimination risks depending on how the parameters for which CVs will pass the sift have been set.
- You may need a human ‘backstop’: discrimination risks may also arise when using ‘in work’ AI tools. For example, automated shift programming, designed to save valuable line manager time, may put those with childcare obligations at a disadvantage if there is no facility for the employee to speak to a manager to swap or alter allocated shifts that cannot be accommodated because of those childcare needs.
- Be mindful of the implied duty of trust and confidence: if software is used to allocate workloads and an employee consistently misses out on the high-profile projects, could they argue that duty has been breached by your use of AI? Would you be able to explain how the algorithm had allocated the work? Equally, if a person is selected for redundancy using AI, could you explain and justify that decision sufficiently to satisfy an Employment Tribunal that it was fair? Build in a human ‘backstop’ to check AI-generated decisions.
- Use of AI requires data protection compliance: AI often uses or creates personal data, and data protection law gives individuals various rights where AI is used. The Information Commissioner’s Office (with its powers to fine) is very aware of the risks AI presents in relation to the misuse of personal data.
If you have any questions or would otherwise like to discuss any of the issues raised in this article, please contact Kate Redshaw, Katie Russell, or a member of our Technology Team. For the latest updates on AI law, regulation, and governance, see our AI blog at: AI: Burges Salmon blog (burges-salmon.com)