How to debug langgraph/agents/leo/nodes.py in a Leonardo project
1. Run docker compose down
2. In docker-compose-dev.yml, comment out command: bash -c "python init_pg_checkpointer.py && uvicorn main:app --host 0.0.0.0 --port 8000" and uncomment # command: tail -f /dev/null.
3. Run bash bin/dev again. This time, the LlamaBot container will start, but Uvicorn (FastAPI) won’t run, so you won’t be able to access localhost:8000 yet.
4. Run docker compose exec -it llamabot uvicorn main:app --host 0.0.0.0 --port 8000 to manually start Uvicorn (FastAPI). You should now be able to access localhost:8000.
5. Now, you can add breakpoint() in langgraph/agents/leo/nodes.py, like so:
# Node
def leo(state: LlamaPressState):
    breakpoint()  # add this line to test that this file is being loaded correctly and that we hit the breakpoint
    llm = ChatOpenAI(model="gpt-4.1")
    llm_with_tools = llm.bind_tools(tools)
    custom_prompt_instructions_from_llamapress_dev = state.get("agent_prompt")
    full_sys_msg = SystemMessage(content=f"""{sys_msg}
Here are additional instructions provided by the developer:
{custom_prompt_instructions_from_llamapress_dev}
""")
    return {"messages": [llm_with_tools.invoke([full_sys_msg] + state["messages"])]}
Now, when you send a message into Leo, you should hit the breakpoint in your terminal.
This will confirm that the file changes you're making in langgraph/agents/leo/nodes.py are actually being loaded into the llamabot docker container properly and that the code is being run.
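For reference, step 2 toggles between two command entries in docker-compose-dev.yml. Assuming the service is named llamabot (matching the exec command in step 4), the relevant section looks roughly like this sketch; the exact surrounding keys in your compose file may differ:

```yaml
services:
  llamabot:
    # Normal startup: initialize the Postgres checkpointer, then serve FastAPI.
    # Comment this out while debugging:
    # command: bash -c "python init_pg_checkpointer.py && uvicorn main:app --host 0.0.0.0 --port 8000"

    # Debug mode: keep the container alive without starting Uvicorn, so you
    # can exec in and run it manually with an interactive terminal attached:
    command: tail -f /dev/null
```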
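The reason Uvicorn must be started manually with docker compose exec -it is that breakpoint() opens an interactive (Pdb) prompt on the process's stdin, which needs a terminal attached. A minimal sketch of the mechanism, using a hypothetical stand-in for the leo node (leo_stub and its state shape are illustrative, not the real code):

```python
import os

# Illustrative stand-in for the leo() node in langgraph/agents/leo/nodes.py.
def leo_stub(state):
    # With a terminal attached (docker compose exec -it ...), this line
    # pauses execution and opens a (Pdb) prompt; type "c" + Enter to resume.
    breakpoint()
    return {"messages": state["messages"] + ["reply"]}

# In non-interactive runs (tests, CI), setting PYTHONBREAKPOINT=0 turns
# breakpoint() into a no-op, so the function runs straight through.
os.environ["PYTHONBREAKPOINT"] = "0"
print(leo_stub({"messages": ["hi"]}))  # {'messages': ['hi', 'reply']}
```

When PYTHONBREAKPOINT is unset, the same call stops at the (Pdb) prompt every time the line runs, which is exactly the pause-and-continue behavior described in the chat below.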
[9/29/25, 1:51:19 PM] ~M: so it’ll stop execution if there is an error and show it in my terminal?
[9/29/25, 1:54:11 PM] Kody: Yes
[9/29/25, 1:54:20 PM] Kody: And you can see the Python logs in real time
[9/29/25, 1:57:26 PM] Kody: breakpoint() will stop execution anytime that line of code is run. Then you can type out variables to inspect them, or just type "c" and hit enter, and it will resume execution
[9/29/25, 1:57:34 PM] Kody: This is helpful to make sure that the code is actually being ran
[9/29/25, 1:58:10 PM] ~M: I figured out that I called the tools before defining them. Currently waiting on Leo to reply, not sure if it should take this long
[9/29/25, 1:58:47 PM] Kody: It might be stuck on the breakpoint. Try hitting enter in the terminal window to see if it’s stopped at the (pdb) breakpoint
[9/29/25, 1:59:04 PM] Kody: If it is, you can type “c” and then hit enter and it will resume execution
[9/29/25, 1:59:31 PM] ~M: INFO:app.websocket.web_socket_handler:Received message from LlamaPress!
INFO:app.websocket.web_socket_handler:PING RECV, SENDING PONG
INFO:app.websocket.web_socket_handler:Waiting for message from LlamaPress
(Pdb) c
(Pdb) INFO:httpx:HTTP Request: POST https://api.openai.com/v1/chat/completions “HTTP/1.1 200 OK”
[9/29/25, 1:59:36 PM] Kody: Perfect
[9/29/25, 2:00:47 PM] ~M: he’s still thinking
[9/29/25, 2:06:10 PM] Kody: It shouldn’t take that long
[9/29/25, 2:06:20 PM] Kody: Try entering c a couple more times
[9/29/25, 2:06:30 PM] Kody: You have to type c and then hit enter
[9/29/25, 2:12:36 PM] Kody: Let me know if you have any other questions 🙂