Perplexity is a fascinating AI application, but some aspects of the app are worrying. It takes a Wikipedia-like approach to news: it goes through verified news sources and compiles the information into an easy-to-understand format with all the key facts. It also lets you ask questions, generating answers on the fly, again based on those sources.
A recent Forbes article calls Perplexity's aggregation and generation 'cynical theft': Perplexity had taken Forbes's work, without permission, and republished it as though it were itself 'a media outlet'.
AI summaries, including Google's feature, pose some key challenges for society and news publishers. For starters, they have problems with accuracy. Unlike Wikipedia, they have no human oversight or verification, and can pick up unreliable or unverified claims from articles and present them as facts. A user who doesn't know better, or can't easily spot the issues, may accept such claims as news.
For news publishers, there is the issue of their work being used as raw material for these summaries, without any compensation. Facts are not protected by copyright because their dissemination is an essential public service, and society benefits when facts can be re-reported. The copyright doctrine of 'fair use' permits derivative works built on facts: essentially, if you copy, you must add your own value as part of a broader composition. Perplexity's and Google's AI summaries add zero original value: no additional reporting, no original context, and