Content Safety

Learning Hub takes the content your child receives seriously. The bot filters external resources, adapts materials to the child's age, and teaches critical thinking.

What gets filtered

When the child asks the bot to find something outside the textbook library — a link, video, or topic explanation — the bot checks the content against several criteria.

Absolutely prohibited content

The bot will never provide materials containing:

  • Sexual content involving minors
  • Self-harm instructions
  • Promotion of drugs, alcohol, or tobacco to children
  • Weapons manufacturing instructions
  • Terrorism and extremism propaganda
  • Bullying and gambling-related content

Prohibited link types

  • Social media (TikTok, Instagram, Twitter/X)
  • Forums and chats (risk of contact with strangers)
  • Sites with aggressive advertising
  • Shortened links (bit.ly and similar)
  • Sites requiring registration
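The link rules above amount to a domain-level check. As a rough sketch of how such a filter might work (the domain lists and function name here are illustrative, not the bot's actual implementation):

```python
from urllib.parse import urlparse

# Hypothetical blocklists for illustration; the real rules are not public.
BLOCKED_DOMAINS = {"tiktok.com", "instagram.com", "twitter.com", "x.com"}
URL_SHORTENERS = {"bit.ly", "tinyurl.com", "goo.gl"}

def is_link_allowed(url: str) -> bool:
    """Return False for link types the bot refuses to share."""
    host = urlparse(url).netloc.lower()
    # Strip a leading "www." so "www.tiktok.com" matches "tiktok.com".
    if host.startswith("www."):
        host = host[4:]
    return host not in BLOCKED_DOMAINS and host not in URL_SHORTENERS
```

Shortened links are blocked outright because the destination cannot be checked before the child follows them.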

Allowed resources

  • Educational platforms (Khan Academy, Wikipedia, Britannica)
  • Government and university educational resources (.gov and .edu domains)
  • Library digital resources

Short videos (Shorts, under 60 seconds) are not used as educational resources. Research links heavy consumption of short videos to decreased attention span.

Age adaptation

The bot considers the child's age when selecting materials:

Age         | Format                                   | Features
6–8 years   | Short segments (5–15 min), visual        | Simple language, concrete examples, no frightening content
9–11 years  | Segments up to 20–30 min                 | Expanded vocabulary, logical reasoning
12–14 years | Long content (30–45 min), documentaries  | Social topics, historical conflicts with educational context
15–18 years | No format restrictions                   | Academic papers, debates, philosophy
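The age brackets above behave like a simple lookup. A minimal sketch, assuming the brackets are stored as a list (the field names and segment limits are taken from the table; the data structure itself is hypothetical):

```python
# Hypothetical encoding of the age brackets; None means no format restriction.
AGE_BRACKETS = [
    # (min_age, max_age, max_segment_minutes)
    (6, 8, 15),
    (9, 11, 30),
    (12, 14, 45),
    (15, 18, None),
]

def max_segment_minutes(age: int):
    """Look up the longest recommended segment length for a child's age."""
    for low, high, limit in AGE_BRACKETS:
        if low <= age <= high:
            return limit
    raise ValueError(f"age {age} is outside the supported range 6-18")
```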

Protection from low-quality content

The bot recognizes low-quality content by the following signs:

  • Rapid cuts (shots lasting only 1–3 seconds)
  • No educational purpose
  • Emotional manipulation (shock, fear, anger)
  • Aggressive humor and violence as comedy

If the child sends such content, the bot will explain why it's not suitable for studying and suggest a quality alternative.
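In code terms, recognition by signs is a flag check: content is treated as low-quality once any of the listed warning signs is detected. This is an illustrative sketch only; the bot's actual detection logic is not documented here.

```python
# Warning signs from the list above, as illustrative string tags.
LOW_QUALITY_SIGNS = {
    "rapid_cuts",              # shots lasting only 1-3 seconds
    "no_educational_purpose",
    "emotional_manipulation",  # shock, fear, anger
    "violence_as_comedy",      # aggressive humor, violence played for laughs
}

def looks_low_quality(detected_signs: set) -> bool:
    """Flag content if any detected sign appears in the warning list."""
    return bool(detected_signs & LOW_QUALITY_SIGNS)
```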

Protection from misinformation

The bot checks content for signs of conspiracy theories and misinformation:

  • Sensational headlines ("THEY don't want you to know this!")
  • Claims from a single source only
  • Appeal to hidden knowledge
  • Arguments built on emotions, not facts

Instead of simply blocking, the bot uses such moments as opportunities to teach critical thinking (for children aged 10 and above).
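Some of these signs, such as sensational headlines, can be approximated with simple text patterns. The patterns below are illustrative assumptions, not the bot's real checks, which would need to be far more robust:

```python
import re

# Illustrative sensationalism patterns only.
SENSATIONAL_PATTERNS = [
    re.compile(r"\bthey don.?t want you to know\b", re.IGNORECASE),
    re.compile(r"\bthe truth about\b", re.IGNORECASE),
    re.compile(r"[A-Z]{4,}"),  # long runs of all-caps, e.g. "SHOCKING"
]

def headline_looks_sensational(headline: str) -> bool:
    """Check a headline against simple sensationalism patterns."""
    return any(p.search(headline) for p in SENSATIONAL_PATTERNS)
```

A match here would not block the content by itself; it would only prompt the critical-thinking discussion described below.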

Teaching critical thinking

The bot doesn't just filter — it teaches the child to independently evaluate information. For children aged 10 and above, the SIFT method is used:

  • S — Stop: don't trust immediately, notice your emotional reaction
  • I — Investigate: who made this? What's their expertise and motivation?
  • F — Find: look for the same claim in several independent sources
  • T — Trace: find the original source (research, data)

Textbook library content

Materials from the Learning Hub textbook library are pre-verified internal content. No additional filtering is required for them.

What's next