Ask HN: Codex is too slow. Is there any solution?

The Codex backend is high quality and the frontend is average, but most importantly it is too slow. I wonder if OpenAI will improve it.

5 points | by rule2025 3 days ago

4 comments

  • moomoo11 3 days ago
    It seems to work with fewer issues than CC Opus.

    I don’t mind if it takes longer as long as the answer is correct more often.

    You can always be doing other work while one chat is running.

  • naiv 2 days ago
    the new 0.47 has better performance now imho
    • i_have_an_idea 1 day ago
      this seems like a crazy idea as the cli client has nothing to do with how many tokens per second the api streams
      • naiv 1 day ago
        The api is never the bottleneck; it's how fast the cli gathers context. Just by using ripgrep instead of grep it will be faster, and on top of that concurrent code search instead of sequential, etc. (rough sketch of the concurrency point below)
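
        To make the concurrent-vs-sequential point concrete, here is a minimal Python sketch. It is purely illustrative (not the actual Codex CLI code, and the pattern and file glob are placeholders): with a pool of workers the per-file I/O waits overlap instead of adding up.

          # Hypothetical illustration of sequential vs concurrent code search.
          # PATTERN and the *.py glob are placeholders, not Codex internals.
          import re
          from concurrent.futures import ThreadPoolExecutor
          from pathlib import Path

          PATTERN = re.compile(r"TODO")

          def search_file(path: Path) -> list[str]:
              """Return the lines in one file that match PATTERN."""
              try:
                  text = path.read_text(errors="ignore")
              except OSError:
                  return []
              return [line for line in text.splitlines() if PATTERN.search(line)]

          def sequential_search(paths: list[Path]) -> dict[Path, list[str]]:
              # One file at a time: total latency is the sum of per-file waits.
              return {p: search_file(p) for p in paths}

          def concurrent_search(paths: list[Path]) -> dict[Path, list[str]]:
              # Files searched in parallel: wall-clock time approaches the
              # slowest single file rather than the sum of all files.
              with ThreadPoolExecutor(max_workers=8) as pool:
                  return dict(zip(paths, pool.map(search_file, paths)))

          if __name__ == "__main__":
              files = list(Path(".").rglob("*.py"))
              hits = concurrent_search(files)
              print(sum(len(v) for v in hits.values()), "matching lines")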
  • esafak 3 days ago
    Sonnet and Gemini are good and fast. Can't speak for Grok.
    • muzani 2 days ago
      Grok Turbo is fast.