Using AI for analysis in GOV.UK
These guidelines should be followed when using AI for analysis in GOV.UK.
1. Be careful not to upload sensitive data to AI tools, and do not upload any data into non-enterprise AI tools
Many of our datasets contain personally identifiable information (PII), despite our best efforts to redact and remove it. We do not want users’ information to be uploaded to tools that might retain the data.
Most LLM/AI tools will not retain your data if you have an enterprise account, but they can keep it if you are using a standard consumer account. Gemini is currently our organisation’s approved option: because we pay for it, we have an agreement in place that our data is not retained.
If you wouldn’t share the data with someone outside GDS, don’t share it with a non-enterprise AI tool!
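As a practical safeguard, it can help to check for likely PII before any data is shared with an AI tool. The sketch below is a minimal, illustrative example only: it assumes your data is in a pandas DataFrame and uses a hypothetical list of column-name patterns. It does not replace proper redaction or a data protection review.

```python
import re
import pandas as pd

# Hypothetical patterns for column names that often indicate PII.
# This list is illustrative, not exhaustive.
PII_PATTERNS = [
    r"name", r"email", r"phone", r"postcode", r"address",
    r"national_insurance", r"nino", r"date_of_birth", r"dob", r"ip_address",
]

def flag_possible_pii_columns(df: pd.DataFrame) -> list[str]:
    """Return column names whose labels suggest they may contain PII."""
    flagged = []
    for column in df.columns:
        if any(re.search(pattern, column.lower()) for pattern in PII_PATTERNS):
            flagged.append(column)
    return flagged

# Example usage: review the flagged columns (and the data itself) before
# sharing anything with an AI tool, even an enterprise one.
df = pd.read_csv("survey_responses.csv")  # hypothetical file
suspect_columns = flag_possible_pii_columns(df)
if suspect_columns:
    print(f"Check these columns before sharing: {suspect_columns}")
```

A name-based check like this will miss PII hidden in free-text fields, so treat it as a prompt for manual review rather than a guarantee.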
2. Always check work done using AI tooling
AI-generated analysis is notoriously error-prone, so make sure you check all outputs thoroughly.
For this reason, it may be best not to use an AI/LLM for tasks you are unfamiliar with, as you may not be able to properly check the accuracy of its outputs.
3. Disclose when work has used AI
Just as you would document or disclose your methods for any piece of analysis or reporting, you should note which tasks were completed using AI. This matters so that others know how your analysis was carried out and can replicate it in future.
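One way to do this (an illustrative convention, not a mandated format) is to record AI use alongside your other methodological notes, for example in the header of an analysis script:

```python
# Analysis of contact-form feedback, Q3 (hypothetical example).
#
# AI use disclosure:
# - Initial thematic grouping of free-text comments was drafted with Gemini
#   (enterprise account) and then manually reviewed and corrected.
# - All figures and counts were produced directly from the data without AI.
```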
4. Familiarise yourself with the AI playbook
Use the guidance in the Artificial Intelligence Playbook for the UK Government when carrying out analysis using AI.