David Alonso · 6mo ago

Latency between Convex runtimes

Is the latency of calls from actions (node runtime) to mutations/queries (convex runtime) negligible?
43 Replies
David Alonso (OP) · 6mo ago
idk if this question even makes sense, my mental model might be off
ballingt · 6mo ago
If you can, prefer Convex actions to Node.js actions. AWS Lambda (which powers Convex Node.js actions) sometimes has cold starts; Convex mutations, queries, and actions don't. The network latency is negligible, but AWS Lambda cold starts are not. If you're running into this we want to hear about it; it's not impossible to speed up AWS Lambda by spending more money (keeping runtimes warm for you at a cost), but there's no button on the dashboard you can press to enable this. Ah sorry, I didn't answer your question!
Is the latency of calls from actions (node runtime) to mutations/queries (convex runtime) negligible?
It's pretty small. Calling from Lambda to Convex is cheap, but it's not negligible if you're doing hundreds of these calls. Instead write a mutation that does all hundred things.
David Alonso (OP) · 6mo ago
gotcha, I'm calling LLM flows from this action so I think there's no way for me to speed this up, right? (i.e. I need the node environment) good to know, for now one action will not make more than 5 calls to the db, so I think it should be fine to skip wrapping these calls in a single mutation. the only alternative I can think of is to make these calls from the vercel server, but then I'll probably have to wrap the mutations. What would you recommend? What I'm trying to do is have an agent with tools be able to both query and write to the convex db. I thought the best approach would be to have this agent on convex...
ballingt · 6mo ago
Doing this from a Convex Node.js action sounds great!
good to know, for now one action will not make more than 5 calls to the db, so I think it should be fine to skip the wrapping of these calls in a single mutation
The other thing to think about here is that if you combine these mutations into one, they'll happen in one transaction; they'll either all run or none of them will run. If you run them one at a time, there will be a period of time where one has finished but another hasn't, possibly producing inconsistent database state. Are you hitting an issue, is there a reason you want an alternative? Making these calls from a Vercel server is roughly equivalent to making them in a Convex Node.js action; Vercel also runs on Lambda. But the latency could be quite different, it depends where these lambdas are running. If you use Convex ones we know they'll be in the same AWS region.
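For illustration, a minimal sketch (the table, field, and function names are made up) of folding several writes into one mutation so they commit atomically:

import { mutation } from "./_generated/server";
import { v } from "convex/values";

export const applyAgentWrites = mutation({
  args: {
    blockId: v.id("blocks"),
    messages: v.array(v.string()),
  },
  handler: async (ctx, args) => {
    // One mutation = one transaction: either every write below commits, or none do.
    await ctx.db.patch(args.blockId, { status: "done" });
    for (const text of args.messages) {
      await ctx.db.insert("messages", { blockId: args.blockId, text });
    }
  },
});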
David Alonso (OP) · 6mo ago
okay good to know. There's just more latency than I was expecting but it could be due to an issue in my code so I'll follow up
ballingt · 6mo ago
It'd be useful if you could find where the latency is by console.log-ing Date.now(), which works as expected in a Node.js action.
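Something like this inside the action handler (the mutation name here is a placeholder):

// internal comes from "./_generated/api"; internal.blocks.update is a made-up mutation
const t0 = Date.now();
await ctx.runMutation(internal.blocks.update, { blockId });
console.log(`runMutation took ${Date.now() - t0}ms`);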
David Alonso (OP) · 6mo ago
Is this well set up?
"use node"; // NodeSDK and the auto-instrumentations below are Node.js-only

import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";
import { LangfuseExporter } from "langfuse-vercel";
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";
import { v } from "convex/values";
import { zodToConvex } from "convex-helpers/server/zod";
import { zWorkspaceCollectionsInfo } from "../../../../lib/ai/interface";
import { FireviewAITools } from "../../../../lib/ai/vercel/tools";
import { createBlockFromPromptAls } from "../../../../lib/ai/als";
// authenticatedAction is a local custom action builder (import path omitted here)

const sdk = new NodeSDK({
  traceExporter: new LangfuseExporter({
    debug: process.env.NODE_ENV === "development",
  }),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();

const groq = createOpenAI({
  baseURL: "https://api.groq.com/openai/v1",
  apiKey: process.env.GROQ_API_KEY,
});

export const createBlockFromPromptVercel = authenticatedAction({
  args: {
    userPrompt: v.string(),
    blockId: v.id("blocks"),
    workspaceCollectionInfo: zodToConvex(zWorkspaceCollectionsInfo),
  },
  handler: async (ctx, args) => {
    const result = await createBlockFromPromptAls.run(
      {
        ctx: ctx,
        // note: clerkOrgId isn't declared in args above; presumably the
        // authenticatedAction wrapper injects it
        clerkOrgId: args.clerkOrgId,
        blockId: args.blockId,
        workspaceCollectionInfo: args.workspaceCollectionInfo,
      },
      () =>
        generateText({
          model: groq("llama-3.1-70b-versatile"),
          maxToolRoundtrips: 5, // allow up to 5 tool roundtrips
          experimental_telemetry: {
            isEnabled: true,
            functionId: "createBlockFromPrompt",
            metadata: {
              environment: process.env.NODE_ENV,
            },
          }, // langfuse telemetry
          tools: {
            getFields: FireviewAITools.getFields,
            createTableBlockOneShot: FireviewAITools.createTableBlockOneShot,
          },
        })
    );

    // Save messages, etc

    console.log(result.text);

    await sdk.shutdown(); // Flushes the trace to Langfuse
  },
});
the als (AsyncLocalStorage) is to expose the Convex context to the tools. asking cause we're seeing elevated latency when deploying convex functions, as noted here <#1271428583377600545>
ballingt · 6mo ago
Deploy time is all about bundle size; the code above looks reasonable. Are you using external packages for your big Node.js dependencies? Also just to clarify: at the beginning of this thread you were talking about execution latency, and now you're talking about deploy latency? What kinds of times are you seeing?
David Alonso (OP) · 6mo ago
yep, this is a different type of latency. After integrating with the vercel ai sdk and genkit (from firebase) we've started to notice function deploys taking significantly longer, and how that degrades the DX. We're seeing times of 30s-1m+, but not for every single function update. These are some of the packages that the AI actions are using:
"@genkit-ai/ai": "^0.5.9",
"@genkit-ai/core": "^0.5.9",
"@genkit-ai/dotprompt": "^0.5.9",
"@genkit-ai/firebase": "^0.5.9",
"@genkit-ai/flow": "^0.5.9",
"@genkit-ai/googleai": "^0.5.9",
"@ai-sdk/anthropic": "^0.0.40",
"@ai-sdk/google": "^0.0.30",
"@ai-sdk/openai": "^0.0.40",
"ai": "^3.2.44",
"@genkit-ai/ai": "^0.5.9",
"@genkit-ai/core": "^0.5.9",
"@genkit-ai/dotprompt": "^0.5.9",
"@genkit-ai/firebase": "^0.5.9",
"@genkit-ai/flow": "^0.5.9",
"@genkit-ai/googleai": "^0.5.9",
"@ai-sdk/anthropic": "^0.0.40",
"@ai-sdk/google": "^0.0.30",
"@ai-sdk/openai": "^0.0.40",
"ai": "^3.2.44",
not sure exactly how to measure which of these is causing issues
ballingt · 6mo ago
Are you using external packages for your big Node.js dependencies?
David Alonso (OP) · 6mo ago
not sure I get what you mean. The packages above are from our package.json so they are external packages
ballingt · 6mo ago
Bundling | Convex Developer Hub: Bundling is the process of gathering, optimizing and transpiling the JS/TS…
David Alonso (OP) · 6mo ago
ah that's probably it, we weren't using external packages then. I'm a bit confused though: in what cases wouldn't I want to mark all packages as external? In the example you show:
import SomeModule from "some-module";
const { Foo } = SomeModule;
but I thought all external package imports had to be wrapped in require or import, or is that not the case?
ballingt · 6mo ago
I'd try marking them all as external. It's possible some won't work, and if they don't you'd have to drill down to find which ones are a problem. I don't follow the second question; import SomeModule from "some-module"; looks like an import to me?
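To mark everything external, convex.json takes a wildcard (this is the same form used later in this thread):

{
  "node": {
    "externalPackages": ["*"]
  }
}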
David Alonso (OP) · 6mo ago
ah sorry misunderstood, I'll give this a try and report back!
✖ Error: Unable to push deployment config to https://sensible-salmon-524.convex.cloud
Error fetching POST https://sensible-salmon-524.convex.cloud/api/push_config 400 Bad Request: ModulesTooLarge: Hit an error while pushing:
Total module size exceeded the zipped maximum (141.13 MiB > maximum size 42.92 MiB)
Okay, I guess I'll have to pick... is there an easy way to see the packages required by the node js environment? i guess it's everything imported after "use node" across all files? With this:
{
  "node": {
    "externalPackages": [
      "@ai-sdk/openai",
      "@ai-sdk/anthropic",
      "@ai-sdk/groq",
      "@ai-sdk/gemini",
      "@ai-sdk/core",
      "@clerk/clerk-sdk-node",
      "@clerk/nextjs/server",
      "@genkit-ai/ai",
      "@genkit-ai/core",
      "@genkit-ai/flow",
      "@genkit-ai/firebase",
      "@genkit-ai/googleai",
      "@opentelemetry/sdk-node",
      "@opentelemetry/auto-instrumentations-node",
      "svix",
      "loops",
      "convex-helpers/server/zod",
      "zod",
      "langfuse-vercel",
      "genkitx-openai"
    ]
  }
}
I still hit limits...
✖ Error: Unable to push deployment config to https://sensible-salmon-524.convex.cloud
Error fetching POST https://sensible-salmon-524.convex.cloud/api/push_config 400 Bad Request: ModulesTooLarge: Hit an error while pushing:
Total module size exceeded the zipped maximum (49.67 MiB > maximum size 42.92 MiB)
had to remove quite a lot of packages to get it to work, and it's tricky cause idk how heavy each one is. any ideas on how to improve the DX here? also, when it succeeded I had no idea what the actual zipped size was and whether there was room for more packages. we brought function deployment down to 30s with this, but it's still really hindering our development experience. For reference, this is the subset of packages I was able to add to the external list:
{
  "node": {
    "externalPackages": [
      "@ai-sdk/openai",
      "@ai-sdk/anthropic",
      "@ai-sdk/groq",
      "@ai-sdk/core",
      "@clerk/clerk-sdk-node",
      "@clerk/nextjs/server",
      "svix",
      "loops",
      "langfuse-vercel"
    ]
  }
}
Doing npx convex dev with --typecheck=disable does not improve things btw. our project is not that big, so I'm surprised we're already hitting these limits unless we're doing something horribly wrong somewhere
David Alonso (OP) · 6mo ago
my source map explorer.. This is after removing my convex.json and running:
npx convex dev --once --debug-bundle-path /tmp/myBundle --typecheck=disable
Wrote bundle and metadata to /tmp/myBundle. Skipping rest of push.
✔ 17:23:31 Convex functions ready! (4.52s)
(base) dalonso  ~/code/FireCompany/TheFireCo/fireview/packages/fireview   stripe ±  npx source-map-explorer /tmp/myBundle/**/*.js


/tmp/myBundle/node/actions/genai/vercel/test.js
Your source map refers to generated column 1 on line 4880, but the source only contains 0 column(s) on that line.
Check that you are using the correct source map.
/tmp/myBundle/isolate/_deps/2DMKOZXQ.js
Unable to map 15935/242824 bytes (6.56%)
/tmp/myBundle/isolate/_deps/2YP7PMLX.js
Unable to map 569/4487 bytes (12.68%)
/tmp/myBundle/isolate/_deps/3DSORPNE.js
Unable to map 2427/19162 bytes (12.67%)
/tmp/myBundle/isolate/_deps/3QN3MYJJ.js
Unable to map 133/295 bytes (45.08%)
/tmp/myBundle/isolate/_deps/4IZVGYKB.js
Unable to map 212/2724 bytes (7.78%)
/tmp/myBundle/isolate/_deps/5RLGQERQ.js
Unable to map 160/394 bytes (40.61%)
/tmp/myBundle/isolate/_deps/6QOFN2NB.js
Unable to map 174/774 bytes (22.48%)
/tmp/myBundle/isolate/_deps/6RFDWXD5.js
Unable to map 393/2408 bytes (16.32%)
/tmp/myBundle/isolate/_deps/6WIGGPLA.js
Unable to map 191/891 bytes (21.44%)
/tmp/myBundle/isolate/_deps/7IHYSIOW.js
Unable to map 340/1618 bytes (21.01%)
/tmp/myBundle/isolate/_deps/7J63KEK3.js
Unable to map 402/4172 bytes (9.64%)
Not sure what's going on so I'd appreciate your help!
(screenshot attached)
David Alonso (OP) · 6mo ago
but the total bundle size without external deps (no convex.json) is 17MB, yet when I do:
"externalPackages": ["*"]
"externalPackages": ["*"]
I get the error I shared above:
Total module size exceeded the zipped maximum (141.13 MiB > maximum size 42.92 MiB)
which seems off to me
ballingt · 6mo ago
Big picture, one thing that's going to help here is local development. This is something we're working on now; the idea is you push this stuff locally for faster iteration. We do want these deploys to be as fast as possible though, because there's nothing like having code in a prod-like environment for testing.
David Alonso (OP) · 6mo ago
okay so you'd recommend we run convex locally? And there's nothing fishy going on in our project, you think? wondering if this could be fixed temporarily by increasing the zipped limit, but maybe that's hard for you guys. afaik running convex locally comes with no dashboard, which would hurt our speed in other ways, so I hope we can find a better solution...🥹
ballingt · 6mo ago
oh that could be something we're working on 🤫
David Alonso (OP) · 6mo ago
haha happy to hear that, but not sure how far into the future that is, so what would you recommend as a temporary solution?
ampp · 6mo ago
my impression was in the next month or two.. i hope.. luckily, we aren't focused much on LLMs right now. I was hoping it would be possible to use one LLM sdk so as not to bloat the bundle/convex actions, since all these inputs and outputs are essentially the same format. But your imports are not giving me any faith that will be in my future. 😅 Maybe some proxy to provide one standard data interface?
ballingt · 6mo ago
@ampp are you trying to run the LLMs in Convex? LLM SDKs like OpenAI run great on Convex; it's just when you try to do inference with Tensorflow in an AWS Lambda that you're running into issues, right?
ampp · 6mo ago
I'm not having issues.. yet, i don't want to make my bundling time too long right now. Our long term goal is to allow all our users to "bring your own LLM" and input any api key, ideally we would support every LLM. And i need to decide what the most efficient way to do that is.
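For what it's worth, the Vercel AI SDK is already close to one standard interface; a rough sketch of switching providers on a user-supplied key (the provider and model names are placeholders, not a recommendation):

import { generateText } from "ai";
import { createOpenAI } from "@ai-sdk/openai";
import { createAnthropic } from "@ai-sdk/anthropic";

// Pick a model implementation from a user-supplied provider name + API key.
function modelFor(provider: "openai" | "anthropic", apiKey: string) {
  switch (provider) {
    case "openai":
      return createOpenAI({ apiKey })("gpt-4o-mini"); // placeholder model id
    case "anthropic":
      return createAnthropic({ apiKey })("claude-3-5-sonnet-20240620"); // placeholder model id
  }
}

// The call shape stays the same regardless of provider:
const userApiKey = process.env.USER_PROVIDED_KEY ?? ""; // hypothetical source of the key
const { text } = await generateText({
  model: modelFor("openai", userApiKey),
  prompt: "Hello!",
});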
David Alonso (OP) · 6mo ago
I'm still not sure what the best short-term course of action is, or how long we should wait to be able to have a good DX when running convex locally. would appreciate some more info @ballingt 🙏
David Alonso (OP) · 6mo ago
we're spending 10+hrs a day on convex atm building https://fireview.dev so these time savings are a big deal for us
Fireview - The Firestore Console You Deserve
Fireview helps your team manage and visualize your Firestore data with ease.
ballingt · 6mo ago
One thing I'd try is to stop using Node.js actions for everything you can; the "edge runtime" versions of these libraries are usually lighter. Was there a problem using @ai-sdk libraries with the Convex actions runtime? We have an action item to look into these zipfile sizes, i.e. why the upload gets so much larger when you use "externalPackages": ["*"]. Likely it's because external packages are not bundled, so they're much larger.
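For instance, something like this might run in the default Convex runtime with no "use node" at all (an untested sketch; it assumes @ai-sdk/openai works there, which is plausible since it works in the Vercel edge runtime):

import { action } from "./_generated/server";
import { v } from "convex/values";
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

// No "use node" directive: this runs in the default Convex runtime,
// avoiding Lambda cold starts and keeping the Node.js bundle small.
export const summarize = action({
  args: { prompt: v.string() },
  handler: async (ctx, args) => {
    const openai = createOpenAI({ apiKey: process.env.OPENAI_API_KEY });
    const result = await generateText({
      model: openai("gpt-4o-mini"), // placeholder model id
      prompt: args.prompt,
    });
    return result.text;
  },
});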
jamwt · 6mo ago
@David Alonso dm me or create a support ticket if you all want to Zoom about this some time with the team and brainstorm in a faster-iteration forum. we definitely don't want you all to have a painful dev experience; we're obviously kind of trying to do exactly the opposite 🙂 we can also share more details about the timeline for local dev. the internal alpha just landed for us, so our team is starting to play with it
David Alonso (OP) · 6mo ago
thanks Jamie, just texted you, hopefully we can chat soon! great to hear! shoot, I wish I'd understood this from the get-go. When glancing over the docs, specifically this code snippet:
"use node";

import { action } from "./_generated/server";
import SomeNpmPackage from "some-npm-package";

export const doSomething = action({
args: {},
handler: () => {
// do something with SomeNpmPackage
},
});
"use node";

import { action } from "./_generated/server";
import SomeNpmPackage from "some-npm-package";

export const doSomething = action({
args: {},
handler: () => {
// do something with SomeNpmPackage
},
});
My understanding was that I needed to include "use node" whenever I was using an NPM package, but now i realize it's only for unsupported NPM packages. If I'd known this I'd just never add "use node" unless Convex complains. Maybe this fixes all our issues for now, since I actually haven't tried whether these packages run in Convex's environment. just to have a sense, what rough percent of npm packages are unsupported? this warning also made me think that most actions had to be run in nodejs envs:
actions/auth/genai/genkit/geminiDemo.ts is in /actions subfolder but has no "use node"; directive. You can now define actions in any folder and indicate they should run in node by adding "use node" directive. /actions is a deprecated way to choose Node.js environment, and we require "use node" for all files within that folder to avoid unexpected errors during the migration. See https://docs.convex.dev/functions/actions for more details
ballingt · 6mo ago
Ah, we should remove that! It used to be that if you made a directory called convex/actions it would automatically be a "use node" action. well, for compatibility we can't remove it, but I understand why it would be confusing
David Alonso (OP) · 6mo ago
should i ignore it for now, or does it actually prevent code deployment? or just rename the folder to something else?
ballingt · 6mo ago
sounds like it's just a warning? What percent, let's see: it's anything with a native extension (uses node-gyp or downloads a binary)
David Alonso (OP) · 6mo ago
seems like a blocker... npx convex dev -v:
(screenshot attached)
ballingt · 6mo ago
Ah yeah, sounds like you'll have to move it. I didn't see "require". We should build a database of libraries known not to work, but if the npm name of the library has "node" in it that's often a bad sign. If it works in the Vercel edge runtime it probably works in the Convex runtime; it's a different implementation, but we've implemented most of the same APIs. Most libraries should work in the Convex runtime, and if one doesn't we'd love to hear about it.
David Alonso (OP) · 6mo ago
i get a ton of these issues when I comment out "use node", and it's usually dependencies of packages we're using, which makes it hard to find the files that need the directive and the ones that don't...
(screenshot attached)
David Alonso (OP) · 6mo ago
like here I'm not even sure which of our packages is using agent_base
ballingt · 6mo ago
yeah, it requires closer attention to dependencies as you add them; definitely takes some work to go through them all. this is probably for proxy-agent, which is only for Node.js
David Alonso (OP) · 6mo ago
The AI files typically have these imports, where opentelemetry is used for tracing in Langfuse, which i think requires unsupported packages:
import { generateText } from "ai";
import { zodToConvex } from "convex-helpers/server/zod";
import { createOpenAI, openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";
import { LangfuseExporter } from "langfuse-vercel";
import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";
import { zWorkspaceCollectionsInfo } from "../../../../lib/ai/interface";
import { FireviewAITools } from "../../../../lib/ai/vercel/tools";
import { wrapAISDKModel } from "langsmith/wrappers/vercel";
import { createBlockFromPromptAls } from "../../../../lib/ai/als";
as you said, has node in the name
ballingt · 6mo ago
Let's start a doc: do all of these require node? There are three (four?) categories:
1) libraries known to work with the Convex JS runtime
2) libraries known not to work in the Convex JS runtime
3) libraries that don't work in "use node" files either (usually this is anything that doesn't work on AWS Lambda)
4?) unknown
"ai" should be fine, we've tested most of that. convex-helpers/server/zod is fine. @ai-sdk/openai I hope is fine, curious if you see issues with that. same with the rest of @ai-sdk, since it's known to work in the Vercel edge runtime
David Alonso (OP) · 2mo ago
Issue is with the telemetry files, which i need for tracing, and the genkit packages. I can get rid of genkit if we use the vercel ai sdk, but if I want to do tracing I'd still need "use node", right? still running into this issue btw! we forgot why we named our actions folder "actions" and then we ran into it again. no biggie, though it could cause confusion
ballingt · 2mo ago
what issue, this warning?
David Alonso (OP) · 2mo ago
yep, but it's an error for me
ballingt · 2mo ago
Ah sorry, yeah, error. Yeah, this is still unfortunate; it may be long enough ago that we can change this behavior! I'll file it. it's always risky because someone on an older project could be expecting everything in the actions folder to automatically be a node action, and we don't want it to be confusing when they upgrade. But it's not hard to fix, so probably safe.
