It’s like ChatGPT and Google had a genius baby! 🚀 Just wrote a fun article about why this could make Google sweat. I think Google might need some new tricks!
Sam and the truly talented team at OpenAI innately understand that for AI-powered search to be effective, it must be founded on the highest-quality, most reliable information furnished by trusted sources. For the heavens to be in equilibrium, the relationship between technology and content must be symbiotic and provenance must be protected.
It seems weird for OpenAI to be building what is essentially an OpenAI wrapper. This is costing them extra resources (since they have to build it out themselves), taking money away from other businesses trying to use the API to build similar tools, and reinforcing the idea that one shouldn't build a business on top of their API.
Yeah, you're right, but I think OpenAI realised that you can't make much money by simply selling tokens for others to build applications around. The inference cost of LLMs is about to approach zero (as shown by GPT-4o mini and Gemini Flash), which also drives down the revenue potential of the API business.
Now they're desperately trying to do everything that was supposedly "complementary" to LLMs and would be wiped away by improving model capabilities. I bet they can't come close to Perplexity for at least six months. Not to mention that the prompting strategies Perplexity keeps improving are nowhere to be seen from OpenAI…
Perplexity has an edge in that it ignores robots.txt and lets you pair it with Claude 3.5 Sonnet. It works incredibly well, with very high daily message limits.