They could be looking for any images without your knowledge; there's no guarantee that the hashes being matched against actually came from a CSAM database.
They could trivially add a hash for, say, a picture of a guy putting his dog on a horse (and since these are perceptual hashes, it would also match other very similar images).
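To make the "matches very similar images" point concrete, here is a minimal sketch of one common perceptual hash, dHash, in Python with Pillow. This is an illustration of the general technique, not the proprietary algorithm any real scanning system uses; the filenames and the distance threshold are hypothetical.

```python
from PIL import Image  # Pillow


def dhash(image: Image.Image, hash_size: int = 8) -> int:
    """Difference hash: compare adjacent pixel brightness in a tiny
    grayscale copy, producing a 64-bit fingerprint that survives
    resizing, recompression, and small edits."""
    small = image.convert("L").resize((hash_size + 1, hash_size))
    pixels = list(small.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left < right)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits; a small distance means the images
    are perceptually similar, even if their bytes differ entirely."""
    return bin(a ^ b).count("1")


# Hypothetical filenames for illustration only.
target = dhash(Image.open("dog_on_horse.jpg"))
candidate = dhash(Image.open("dog_on_horse_resized.jpg"))
if hamming(target, candidate) <= 10:  # threshold is an assumed tunable
    print("match: perceptually similar image")
```

The key property is that the hash is deliberately lossy: unlike a cryptographic hash, nearby inputs produce nearby outputs, so one database entry covers a whole neighborhood of near-duplicates, which is exactly why a planted hash would catch more than one specific file.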
I didn't mean to claim that they can scan for a concept without a fixed reference image, if that's what you're saying. That would be theoretically possible with enough hashes, but impractical.