
Biggin Hill School, Bromley

News

Safeguarding Bulletin Issue 4 dated 13th March 2026

Posted: Fri 13th Mar 2026

Dear Parents and Carers,

We have had a wonderful week in school with not one but two year group celebrations. Thank you and well done to Year 3 and Year 5, who showcased their learning beautifully, and thank you to the parents who were able to attend and to the staff members for facilitating. Any feedback from families would be greatly appreciated: the format of these celebrations changed following feedback at Parent Forum, and we would like to know your views on the changes made. We thank you in advance.

Parent Forum is on Tuesday 17th March, so please be sure to share any queries, questions, concerns or thanks with your parent representative.

In this week’s safeguarding bulletin we cover the use of AI chatbots, including both the risks and the positives associated with them.

Artificial Intelligence (AI) chatbots, such as ChatGPT, Google Gemini and Alexa, are becoming more common in everyday life. Children may encounter them at school, in games, or through websites and apps. While these tools can be useful and fun, it is important for parents to understand how they work, the potential risks, and how to keep children safe.

AI chatbots are computer programs that can hold a conversation with a person. They can:

- Answer questions (e.g. helping with homework or explaining a tricky topic)
- Spark creativity (e.g. helping write a poem, story or quiz)
- Support hobbies and interests (e.g. generating recipe ideas, sports facts or coding tips)
- Provide entertainment (e.g. riddles, jokes or role-play style games)

Used wisely, chatbots can be a helpful tool for learning and fun.

Although conversations may start out innocent, there are risks to be aware of:

- Unpredictable responses: chatbots sometimes give inaccurate, confusing or inappropriate information.
- Role-play risks: a child may ask the chatbot to pretend to be a friend, a character or even a parent figure. This can blur the line between fantasy and reality.
- Sensitive topics: children may explore personal worries with a chatbot, and the answers they receive might not always be supportive, accurate or safe.
- Over-reliance: a child might start turning to a chatbot for advice instead of trusted adults.

See the following page for key takeaways regarding AI chatbots.

Have a wonderful weekend,

Mrs Lawrence

Headteacher and Safeguarding Lead

