Modern, and even old, sensors absolutely can see stars during the day. It's really a question of magnification. A star is a point source, so when you magnify the image its light still lands on roughly the same few pixels, but the sky background is an extended source, so its light gets spread over more pixels and the per-pixel background gets darker.
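A toy calculation makes the point, using made-up photon rates (the specific numbers are assumptions, not measurements): the star's flux per pixel is roughly fixed, while the sky's per-pixel level scales with the pixel area, so star-to-sky contrast grows as you shrink the pixel scale.

```python
# Toy sketch: why magnification reveals stars in daylight.
# A point source (star) keeps its photons on roughly the same few pixels
# at any image scale; an extended source (sky) spreads its photons over
# more pixels as you magnify, so the per-pixel sky level drops.

star_flux = 1e4      # photons/s from the star (assumed toy value)
sky_surface = 1e6    # photons/s per square arcsec of daytime sky (assumed)

for scale in [2.0, 1.0, 0.5, 0.25]:      # arcsec per pixel; smaller = more magnified
    pixel_area = scale ** 2               # square arcsec covered by one pixel
    sky_per_pixel = sky_surface * pixel_area
    contrast = star_flux / sky_per_pixel  # star counts vs sky counts per pixel
    print(f"{scale:5.2f} arcsec/px  sky/px = {sky_per_pixel:10.0f}  star/sky = {contrast:.4f}")
```

Going from 2 arcsec/px to 0.25 arcsec/px is a 64x improvement in contrast, which is why a magnified view can pull a star out of a bright sky.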
I was talking to a grad student who was having an issue getting decent flat frames (images of a uniform field, used to account for dust and optical train effects). I asked why she didn't just take a picture of the sky in the daytime like most amateurs do, and she said she always ends up with stars in the image doing it that way. They tried finding the least populated part of the sky, but still always picked up stars.
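For anyone unfamiliar with flats: the standard technique (not her specific pipeline, and the numbers here are made up) is to image a uniform field, normalize it, and divide science frames by it, which cancels pixel-to-pixel sensitivity differences and dust shadows. A star in the flat becomes a spurious dark spot in every calibrated frame, which is exactly her problem.

```python
import numpy as np

# Minimal sketch of flat-field calibration with synthetic data.
rng = np.random.default_rng(0)

true_sky = np.full((100, 100), 1000.0)           # perfectly uniform illumination
sensitivity = rng.normal(1.0, 0.02, (100, 100))  # ~2% pixel response variation
sensitivity[40:45, 40:45] *= 0.9                 # a fake "dust shadow"

flat_raw = true_sky * sensitivity                # what a flat frame records
flat_norm = flat_raw / flat_raw.mean()           # normalize to ~1.0

science_raw = 500.0 * sensitivity                # science frame, same optics
science_cal = science_raw / flat_norm            # flat-fielded frame

print("before:", science_raw.std() / science_raw.mean())  # ~2% structure
print("after: ", science_cal.std() / science_cal.mean())  # essentially flat
```

Since the calibration is a straight division, any structure in the flat that isn't really a sensor/optics effect, a star, a bug, a gradient, gets imprinted on every science frame, and at the sub-1% level she needed that's fatal.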
That was the only time I got to talk to her, so I'm not sure how she ended up solving it. She was searching for transiting exoplanets, looking for dips in a star's brightness of less than 1%. Even a bug temporarily flying in front of the star could throw it off.
Some people do use tissue paper or a t-shirt over the lens, and that works for making pretty pictures. But she said none of those come close to 1% accuracy.
But now you have me wondering how the big telescopes do it. Things like the JWST or HST, or even large ground-based scopes, don't really have any of those options.