You can test this yourself easily on Facebook. Just target a blank audience of all users vs. a lookalike audience FB builds for you algorithmically. The latter works much, much better at generating sales for almost everyone.
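If you want to sanity-check the difference yourself, here's a rough sketch of the comparison with made-up numbers (swap in your own campaign results):

```python
# Two-proportion z-test comparing conversion rates of the two campaigns.
from statsmodels.stats.proportion import proportions_ztest

# Made-up example numbers -- substitute your own campaign results.
conversions = [120, 310]        # blank audience, lookalike audience
impressions = [50_000, 50_000]

stat, p_value = proportions_ztest(count=conversions, nobs=impressions)
blank_rate, lookalike_rate = (c / n for c, n in zip(conversions, impressions))
print(f"blank: {blank_rate:.3%}  lookalike: {lookalike_rate:.3%}  p = {p_value:.2g}")
```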
> "The latter works much, much better at generating sales for almost everyone."
This seems like an extraordinary claim that would require extraordinary statistical evidence to believe: evidence that the methodology for comparing whatever Facebook does with blank-audience ad serving to whatever it does with lookalike audiences is sound, as well as evidence for the "much, much better" part and the "almost everyone" part.
For example, with Google and Facebook controlling so much of all advertising traffic, it could easily be the case that 'blank audience' ad purchasing (which is less in Facebook's interest, since it doesn't help highlight their specialized data products) is directed towards users who are less likely to engage in the first place, regardless of algorithmic profile or interests.
Do we have any knowledge of the precise differences between the two treatments (blank vs. digital lookalike)? If not, how could we even begin to attribute success seen in the 'lookalike' category to any value-add from Facebook, as opposed to Facebook gaming the different user groups so that such an A/B test always tilts in its favor?
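To make that concern concrete, here's a toy simulation (all numbers invented) in which targeting adds zero causal lift, but the 'blank' campaign is simply served to lower-propensity users; the naive comparison still shows the lookalike arm winning:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Each user has an underlying purchase propensity; the ad itself adds nothing.
propensity = rng.beta(2, 200, size=n)            # skewed, mean ~1%

# Hypothetical serving bias: the "lookalike" campaign is shown to the
# higher-propensity half of users, the "blank" campaign to the lower half.
order = np.argsort(propensity)
blank_users, lookalike_users = order[: n // 2], order[n // 2 :]

blank_conv = rng.random(n // 2) < propensity[blank_users]
lookalike_conv = rng.random(n // 2) < propensity[lookalike_users]

print(f"blank:     {blank_conv.mean():.3%}")
print(f"lookalike: {lookalike_conv.mean():.3%}")
# The lookalike arm "wins" even though targeting contributed zero causal lift.
```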
Alternatively, you could serve ads on Yahoo, or Reddit, or the whole Google Display Network, etc., and see how much worse those clicks convert for you.
This isn't rocket science. I've managed millions of dollars of ad spend across a dozen companies. I know lots of other people who manage ad spend. The only other digital channel that's anywhere near as targeted and scalable as Facebook is Google search, which is why, by most industry estimates, the two combined make up around 90% of new online ad spend.
> “you could serve ads on Yahoo, or Reddit, or the whole Google Display Network, etc., and see how much worse those clicks convert for you.”
But that experiment could only tell you whether Facebook ads, in general, are more effective than ads on other platforms. And that might be driven by all sorts of confounders, like different demographics or usage patterns on those other platforms, which have no connection to whether anything Facebook does contributes any positive ad effect.
In particular, comparing with other ad platforms could not tell you whether Facebook's blank-audience ads are any better or worse than Facebook's algorithmically targeted audience ads (especially since you still wouldn't know whether Facebook privileges algorithmic ads in some way, simply because it's good for their business if their specialized product appears better than a non-specialized one).
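As an entirely hypothetical illustration of how audience mix alone can drive such a comparison: suppose two platforms convert identically within every demographic segment but differ in who uses them; the overall numbers still make one platform look roughly twice as good:

```python
# Hypothetical within-segment conversion rates (identical on both platforms).
conv_rate = {"18-24": 0.005, "35-54": 0.020}

# Hypothetical audience mix per platform (share of clicks from each segment).
mix = {
    "Facebook": {"18-24": 0.3, "35-54": 0.7},
    "Display network": {"18-24": 0.8, "35-54": 0.2},
}

for platform, shares in mix.items():
    overall = sum(shares[seg] * conv_rate[seg] for seg in shares)
    print(f"{platform}: {overall:.3%} overall conversion")
# "Facebook" looks roughly twice as good purely because of who uses it,
# not because of anything its ad systems did.
```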
> “This isn't rocket science.”
It’s funny to me that a lot of marketing, product, and A/B testing people express this attitude about understanding what succeeds in marketing & product problems.
In reality, those questions require a degree of statistical rigor that is very much like rocket science.
It takes a great deal of advanced econometrics or formal statistical theory to answer ad-spend attribution questions in a way that isn't completely defeated by methodological flaws, poor experimental design, causal indeterminacy, or various statistical fallacies.
Perhaps that's one reason why the claim that digital ads work at all is still so hotly debated, with many arguing that, quantitatively, digital ads (including Facebook's) simply don't work.
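To give a sense of the scale involved, a quick power calculation (with an assumed 1% baseline conversion rate and a 10% relative lift) shows how many users per arm you'd need just to detect a modest effect reliably, before even getting to attribution or serving-bias issues:

```python
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

baseline = 0.010      # assumed 1.0% conversion without the ads
lifted = 0.011        # assumed 1.1% with the ads (a 10% relative lift)

effect = proportion_effectsize(lifted, baseline)
n_per_arm = NormalIndPower().solve_power(effect_size=effect,
                                         alpha=0.05, power=0.8)
print(f"~{n_per_arm:,.0f} users per arm")   # roughly 80k per arm under these assumptions
```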