Valid, but for all the crap that LangChain gets, it at least has its own layer for upstream LLM provider calls, which means it isn't affected by this supply-chain compromise (unless you're using the optional langchain-litellm package). DSPy uses LiteLLM as its primary way to call OpenAI and other providers, and CrewAI imports it too, though I believe it prefers the vendor libraries directly before falling back to LiteLLM.
The requests.post advice is right, but it's also kind of depressing that the state-of-the-art recommendation for using LLM APIs safely in 2026 is to just write the HTTP call yourself. We went from "don't reinvent the wheel" to "actually, maybe reinvent it, because the wheel might steal your SSH keys." The abstraction layer that was supposed to save you time just cost an unknown number of people every credential on their machine.
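For what it's worth, the "reinvented wheel" really is small. A minimal sketch of a direct requests.post call against OpenAI's chat completions endpoint (the URL and response shape are OpenAI's documented API; the function names and the OPENAI_API_KEY env var here are just my choices for illustration):

```python
import os
import requests

API_URL = "https://api.openai.com/v1/chat/completions"

def build_payload(prompt: str, model: str = "gpt-4o-mini") -> dict:
    """Assemble the JSON body the chat completions endpoint expects."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def chat(prompt: str, model: str = "gpt-4o-mini") -> str:
    """One provider, one endpoint, zero third-party abstraction layers."""
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json=build_payload(prompt, model),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]
```

That's the whole "client." You lose retries, streaming helpers, and multi-provider routing, but your only dependency is requests itself, and you can read every line that touches your API key.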