First it was the Sora stunt when Gemini 1.5's context window came out. Then it was the CTO acting like a 🤡 when asked if they train on YouTube. Now the COO is talking complete vaporware bullshit: https://twitter.com/ai_for_success/status/1787700287324782756 Looks like GPT-5 isn't going well, things have saturated at the GPT-4 level, and there's tons of competition in that space. AI agents are super difficult because errors get chained. Meanwhile they have lawsuits to deal with.
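The "errors get chained" point is easy to make concrete with a back-of-the-envelope calculation. The 95% per-step success rate below is an illustrative assumption, not a measured figure, and the independence assumption is a simplification:

```python
# If an agent completes each step correctly with probability p, and a
# single bad step derails the task, the chance an n-step task finishes
# cleanly is p**n (assuming steps fail independently).
def chain_success(p: float, n: int) -> float:
    """Probability that all n steps succeed."""
    return p ** n

for n in (1, 5, 10, 20):
    print(n, round(chain_success(0.95, n), 3))
# 1  -> 0.95
# 5  -> 0.774
# 10 -> 0.599
# 20 -> 0.358
```

Even a per-step accuracy that sounds great drops to roughly a coin flip by 10 steps, which is why long-horizon agents are so much harder than single-shot chat.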
yeah, Claude is better anyway / it went under the radar of my company's security team, so I don't have to deal with their security-by-obscurity bullshit rate limiting
There is almost nothing proprietary in AI space. Everyone trains similar model architectures on similar data.
The speed of ChatGPT is going up and so is the context window, but the product isn't much more differentiated than Claude, and they probably care less about data protection.
Wait until they release their own version of search before declaring them dead
Having seen SGE, Bing Copilot, and Perplexity, I don't see how OpenAI can do dramatically better. Search is much broader than genAI, and I'm fairly sure OpenAI doesn't have a Google killer; they can't even do better than Bing.
That Twitter post is garbage. It's nothing but trying to hate on someone.
There's one key difference: OpenAI gives out the complete product on the paid tier, and whatever they give for free is a lesser model but still complete. Google fiddles around unnecessarily. Even comparing Gemini (free) vs GPT-3.5, Gemini has better reasoning but GPT-3.5's code is way better. Pichai just claims they have AlphaCode, which is supposed to be the best coding assistant/AI coder, but people never get their hands on it. Google has deviated from the path they started on, and that won't serve them well in the long run. OpenAI is at least transparent.
AlphaCode isn't actually usable because it requires generating thousands of candidate programs, which then all need to be evaluated. It's not scalable.
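That generate-then-filter loop can be sketched like this. A toy illustration only, not AlphaCode's actual pipeline: `generate_candidate` stands in for an expensive model sampling call, and the "programs" are trivial functions:

```python
import random

def generate_candidate(seed: int):
    """Stand-in for sampling one candidate program from a model.
    Here a 'program' is just a function adding some guessed constant."""
    rng = random.Random(seed)
    k = rng.randint(0, 9)
    return lambda x: x + k

def passes_tests(program) -> bool:
    """Filter step: keep only candidates that pass the example tests.
    The hidden target behavior is x + 3."""
    return program(1) == 4 and program(10) == 13

# Sample many candidates, then evaluate every one against the tests.
# This evaluation cost, repeated thousands of times per problem, is
# what makes the approach hard to scale.
candidates = [generate_candidate(s) for s in range(1000)]
survivors = [p for p in candidates if passes_tests(p)]
print(len(survivors), "of", len(candidates), "candidates survive filtering")
```

Most of the compute goes into candidates that get thrown away, which is the scalability complaint in a nutshell.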
Anything that grows this quickly is always a bubble. We'll only know the real trend ten years from now, when the dust settles.
the fat lady is still singing
the issue is that we're out of good data to train on. GPT-4 maxed it out; there isn't much more for GPT-5 to train on, and the pool is increasingly polluted with AI-generated content.
Nvidia will suffer badly
How? All these players need NVDA. Some may win and some may lose, but NVDA will keep printing money.
All LLMs are failing. No one needs language models