LLM

If LLM Connections are enabled in GraphXR, you can use gxr.llm() to run chat completions from Grove. To set up a connection:

  1. Visit /admin.
  2. Navigate to the LLM tab.
  3. Add an LLM Connection.
  4. In Grove, try the following code.

Using gxr.llm()

// Cell 1
Button("Summarize (gxr.llm)", async () => {
  const context = JSON.stringify(gxr.nodes("[[selected]]").properties());
  const response = await gxr.llm({messages: [{
    role: "system",
    content: "Summarize the nodes"
  }, {
    role: "user",
    content: context,
  }]});
  mutable answer = response.content;
});
 
// Cell 2
mutable answer = "";
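The messages array passed to gxr.llm() follows the familiar chat-completion shape: an ordered list of {role, content} objects, where the system message sets instructions and the user message carries the data. The helper below is a hypothetical sketch of that shape, not part of the GraphXR API.

```javascript
// Sketch: build the messages array gxr.llm() expects.
// buildMessages is a hypothetical helper for illustration only.
function buildMessages(systemPrompt, userContent) {
  return [
    { role: "system", content: systemPrompt },
    { role: "user", content: userContent },
  ];
}

// Example: the same shape used in the cell above.
const messages = buildMessages(
  "Summarize the nodes",
  '{"name":"Alice","degree":3}'
);
```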

Using gxr.ask()

Here's the same button implemented with gxr.ask(), which takes the user content directly and accepts the system prompt as an option.

// Cell 1
Button("Summarize (gxr.ask)", async () => {
  const context = JSON.stringify(gxr.nodes("[[selected]]").properties());
  const response = await gxr.ask(context, {
    systemPrompt: "Summarize the nodes",
  });
  mutable answer = response.content;
});
 
// Cell 2
mutable answer = "";
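Conceptually, gxr.ask(content, { systemPrompt }) appears to be shorthand for a two-message gxr.llm() call. The mapping below is an assumption sketched for illustration; askAsLlm is a hypothetical helper, not a GraphXR function.

```javascript
// Sketch (assumption): how gxr.ask's arguments presumably map onto the
// messages argument that gxr.llm() receives.
function askAsLlm(content, { systemPrompt } = {}) {
  const messages = [];
  if (systemPrompt) messages.push({ role: "system", content: systemPrompt });
  messages.push({ role: "user", content });
  return { messages }; // the object gxr.llm() would be called with
}

const mapped = askAsLlm('{"name":"Alice"}', {
  systemPrompt: "Summarize the nodes",
});
```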

Structured Outputs

If the LLM supports Structured Outputs, you can pass a JSON Schema and the response will conform to it.

Button("gxr.ask + schema", async () => {
  const result = await gxr.ask(
    JSON.stringify(gxr.nodes("[[selected]]").serialize()),
    {
      model: "gpt-4.1-nano",
      systemPrompt: `Anonymize all properties of all nodes by creating plausible sounding alternatives. Don't anonymize the id of the node, because I will use that to locate the node. Be thorough. If something is unchanged I will lose my job!`,
      schema: {
        type: "object",
        properties: {
          nodes: {
            type: "array",
            items: {
              type: "object",
              properties: {
                id: { type: "string" },
                properties: { type: "string", description: "JSON" },
              },
              required: ["id", "properties"],
              additionalProperties: false,
            },
          },
        },
        required: ["nodes"],
        additionalProperties: false,
      },
    }
  );
  console.log(result);
});
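Note that in the schema above each node's "properties" field is declared as a JSON string, so it must be JSON.parse'd before the values can be used. The sketch below assumes the result arrives in the shape the schema requests; parseAnonymized is a hypothetical helper, not part of the GraphXR API.

```javascript
// Sketch: unpack a structured-output result where each node's properties
// field is a JSON-encoded string, as requested by the schema above.
function parseAnonymized(result) {
  return result.nodes.map((node) => ({
    id: node.id,
    properties: JSON.parse(node.properties),
  }));
}

// Example with a mock result in the requested shape:
const parsed = parseAnonymized({
  nodes: [{ id: "n1", properties: '{"name":"Jordan Doe"}' }],
});
```

From here, each parsed entry's id can be used to locate the original node and write the anonymized properties back.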