by peterkelly
8 subcomments
- I've always been of the view that a workflow language should be a proper, Turing-complete functional language that gives you the usual flexibility for transformations on intermediate data, while also supporting things like automatic parallelisation of external, compute-intensive tasks.
I recommend checking out https://github.com/peterkelly/rex and also my PhD thesis on the topic https://www.pmkelly.net/publications/thesis.pdf.
The gap in flexibility between a DAG-only system and a full language designed for the task is a significant one.
by kovariance
1 subcomment
- YAML as a programming language is something I consider an anti-pattern (see AWS Step Functions): very difficult to read, debug, and test. It's better to use a real programming language that compiles into a DAG (e.g. Temporal, Dagger.io).
by panda888888
1 subcomment
- How is this different from Airflow or commercial data orchestration tools, like Astronomer, Dagster, Prefect, etc.?
by SkyPuncher
0 subcomments
- I'm working on something similar as a side project. My frustration is the lack of repeatability in my LLM flows. 90% of my code is AI-written, but most of my guidance to LLMs is not particularly specific: "make sure you've read this file", "how does that match against existing patterns", "what's the performance like".
I've ended up building my workflow engine directly in Python, despite YAML being the default choice for LLMs.
I found that YAML had some drawbacks:
* LLMs don't have an inherent understanding of YAML conventions. They tend to be overly verbose. Python code solved this because "good" code is generally as short as you need.
* YAML isn't really composable. Yes, you can technically compose it, but you'll be fighting the LLM the entire time. Python solved this because the LLM knows how to decouple code.
* I want _some_ things to be programmatic still. Having Python solves that.
* Pretty much any programming language would do. Python just feels like the default for LLM-centric code.
by subhobroto
1 subcomment
- This is a good exercise but IMHO, when you really start using a workflow engine for production use cases, you need a proper, Turing-complete programming language as the DSL.
There used to be an amazing project called Benthos (acquired and rebranded by Redpanda in 2024) that you might want to draw some inspiration from.
However, durable workflows have also gained popular acceptance as functional design reaches a wider audience.
While Temporal is the most popular choice when it comes to durable workflows, DBOS (cofounded by the father of PostgreSQL) is my personal favorite.
At the moment, orchestration in DBOS has certain gaps - you might very well consider spending your effort on closing those gaps. The value there would be phenomenal!
- I was expecting to see some verbose LLM output, but actually the code has a distinctly hand-crafted feel. Nice to see! I'm not sure "production ready" is a safe claim seven commits into a project ;)
- It’s interesting to see something new in this space, especially since some people claim that flowcharts will be replaced by AI automation or AI-generated code.
P.S. I'm the author of a similar solution:
* https://github.com/nocode-js/sequential-workflow-designer
* https://github.com/nocode-js/sequential-workflow-machine
- This is a very interesting project, especially since I've been building a similar declarative workflow engine for over 5 years. With a well-designed YAML schema, it's now possible to build workflows with AI agents. I call this "Vibe workflows."
There's no need for humans to write DAGs anymore, yet they remain human-readable. I truly believe this is the future of workflow orchestration.
https://github.com/dagucloud/dagu
by purpleidea
0 subcomments
- Here's a different kind of workflow engine with a proper DSL. It turns out config management is the same problem as workflow engines, if you use my modern definition of config management.
https://github.com/purpleidea/mgmt/
- How does this compare to Temporal? That seems to be the current baseline for application-oriented workflow engines.
by bognition
1 subcomment
- Production Ready?
That's a pretty bold claim for a repo that has existed for a few days and has 0 issues, PRs, etc...
by philipodonnell
0 subcomments
- This particular example aside, I don't think being derivative and simplified is necessarily bad. Libraries that are popular today were written for humans and reinforced by LLMs via training. It's unlikely they represent the ideal interaction surface for an agent.
There was a study recently finding that LLMs prefer resumes written by LLMs rather than by humans. It stands to reason they would prefer APIs written by LLMs too.
These are probably the early days of intentionally simplified agentic semantic primitives like "DAG Workflow", where the answer to "why not Temporal?" is that LLMs prefer different things than humans do.
by zaptheimpaler
2 subcomments
- I have several sources of data I want to fetch, retry, and process periodically: exporting Claude chats into .md files that go to Obsidian, fetching Garmin data from the API and processing it for a custom tool, exporting replays for a game, maybe even running some browser automation to get bank CSVs. I have ad-hoc Python scripts for all of this but no central way to manage them: scheduling, handling errors and retries, storing the original data and processed versions, resuming from the last point, etc. Is a workflow engine useful for something like that?
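The retry and error-handling concerns in this comment are exactly what a workflow engine centralizes. As a rough illustration of what each ad-hoc script currently has to reimplement, here is a minimal retry-with-exponential-backoff wrapper; the function names (`with_retries`, `flaky_fetch`) are invented for the example.

```python
import time

def with_retries(fn, attempts=3, base_delay=1.0):
    """Run fn, retrying with exponential backoff; re-raise on final failure."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** i)  # 1s, 2s, 4s, ...

# Simulated flaky data source that succeeds on the third call.
calls = {"n": 0}
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network error")
    return "ok"

print(with_retries(flaky_fetch, base_delay=0.01))  # ok
```

An engine would additionally persist each step's output so a failed pipeline can resume from the last successful step instead of refetching everything.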
by barelysapient
0 subcomments
- My version of a similar tool, but written in Go with compile-time guarantees.
https://github.com/swetjen/daggo
- What makes it production ready? What's the code coverage on your tests? There are only seven commits in this repo as of this comment.
- I have a project in this space that I've run many thousands of jobs through. It's solid and full featured. Feel free to connect: https://stepwise.run/
- YAML no thanks.
I want something that uses BPML for actual business workflows.
- How does it compare to Airflow?
- DAG Workflow Engine
A production-ready DAG (Directed Acyclic Graph) workflow engine driven by a YAML DSL. Validates, executes, and visualizes workflows with support for parallel execution, retries, conditional branching, batch iteration, and pluggable actions.
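The validate-then-execute pipeline described here can be sketched conceptually in a few lines of Python. This is not the project's actual schema or API — the step names and spec shape are invented — but it shows the core mechanism: a parsed YAML spec becomes a dependency graph, which the standard library's `graphlib.TopologicalSorter` validates (it raises `CycleError` on a non-DAG) and orders for execution.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical spec, roughly as it might look after parsing a YAML DSL.
# Each step names its dependencies and an action to run.
spec = {
    "extract": {"deps": [],          "action": lambda: "raw"},
    "clean":   {"deps": ["extract"], "action": lambda: "cleaned"},
    "load":    {"deps": ["clean"],   "action": lambda: "loaded"},
}

def execute(spec):
    # Build node -> predecessors mapping; static_order() both validates
    # acyclicity (raising CycleError otherwise) and yields a valid order.
    order = TopologicalSorter({k: v["deps"] for k, v in spec.items()})
    return {name: spec[name]["action"]() for name in order.static_order()}

results = execute(spec)
print(results["load"])  # loaded
```

Features like parallel execution fall out of the same graph: `TopologicalSorter` also exposes a `prepare()`/`get_ready()`/`done()` protocol for running independent steps concurrently.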
- These are always a fun couple-day project. :)
- I don't see any references to existing orchestrators, which are way more complete, so I presume you did this as an exercise?
Just seeing YAML used for workflows in this age makes me automatically nope out.