Orchestrating tasks #8
@avico78 glad that you are enjoying the project so far; I wish I could update it more as well. What you are describing should be doable via the following:

```python
import asyncio

from fastapi import FastAPI
from easyjobs.workers.worker import EasyJobsWorker

server = FastAPI()

@server.on_event('startup')
async def setup():
    worker = await EasyJobsWorker.create(
        server,
        server_secret='abcd1234',
        manager_host='0.0.0.0',
        manager_port=8222,
        manager_secret='abcd1234',
        jobs_queue='ETL',
        max_tasks_per_worker=5
    )

    @worker.task(run_after=['task4'])
    async def task1():
        print("task1 - starting")
        await asyncio.sleep(5)
        print("task1 - finished")
        return "task1!"

    @worker.task()
    async def task2():
        print("task2 - starting")
        await asyncio.sleep(5)
        print("task2 - finished")
        return "task2!"

    @worker.task(run_before=['task2'])
    async def task3():
        print("task3 - starting")
        await asyncio.sleep(5)
        print("task3 - finished")
        return "task3!"

    @worker.task(run_before=['task3'])
    async def task4():
        print("task4 - starting")
        await asyncio.sleep(5)
        print("task4 - finished")
        return "task4"
```
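For intuition, the `run_before`/`run_after` declarations above can be thought of as edges in a dependency graph that an orchestrator walks in topological order. Below is a minimal, self-contained sketch of that idea in plain Python — the names `depends_on`, `topo_order`, and `run_flow` are hypothetical, not the EasyJobs API:

```python
import asyncio

# Each entry lists the tasks that must finish before the key may run.
depends_on = {
    'task1': [],
    'task4': ['task1'],
    'task3': ['task4'],
    'task2': ['task3'],
}

def topo_order(depends_on):
    """Return an order in which every task runs after its dependencies."""
    order, done = [], set()
    pending = dict(depends_on)
    while pending:
        # A task is ready once all of its dependencies are done.
        ready = sorted(n for n, deps in pending.items()
                       if all(d in done for d in deps))
        if not ready:
            raise ValueError('cycle in task graph')
        for name in ready:
            order.append(name)
            done.add(name)
            del pending[name]
    return order

async def run_flow(depends_on):
    results = {}
    for name in topo_order(depends_on):
        await asyncio.sleep(0)  # stand-in for real task work
        results[name] = f"{name}!"
    return results

results = asyncio.run(run_flow(depends_on))
print(list(results))  # ['task1', 'task4', 'task3', 'task2']
```

This is only a model for reasoning about the ordering questions below, not a claim about how EasyJobs schedules tasks internally.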
@codemation - thanks so much for your answer. Do both `run_after` and `run_before` provide the same functionality for triggering dependencies / the correct task order?

For `run_after`: I tried triggering the task flow as you advised, but it is not working. Working example for task1 → task2:

For `run_before`: here I couldn't understand whether it requires some parameter (args/kwargs), as it doesn't work properly for two levels (task2 → task1 → task0), while for one level it does work (task2 → task1 && task0). See code for two levels (not working):

```python
import asyncio
import sys

@worker.task()
async def task0(*args, **kwargs):
    func_name = sys._getframe().f_code.co_name
    print(f"{func_name} started")
    await asyncio.sleep(1)
    print(f"{func_name} finished")
    return {'data': None}

@worker.task(run_before=['task0'])
async def task1(*args, **kwargs):
    func_name = sys._getframe().f_code.co_name
    print(f"{func_name} started")
    await asyncio.sleep(1)
    print(f"{func_name} finished")
    return {'data': None}

@worker.task(run_before=['task1'], default_args=default_args)
async def task2(*args, **kwargs):
    func_name = sys._getframe().f_code.co_name
    print(f"{func_name} started")
    await asyncio.sleep(1)
    print(f"{func_name} finished")
    return {'data': None}
```
```python
@worker.task()
async def pipeline():
    print("pipeline started")
    result = await task2(data={'test': 'data'})
    print(f"pipeline - result is {result} - finished")
    return result
```

While running `run_before` with one level of dependencies:
Code:
It does work 1-3 times and then it fails.
Also, sometimes the error below shows up - I couldn't understand why:
Suggestions:
1. Add an endpoint for getting the task workflow tree (even a JSON view).
2. For reloading changes I'm running the APIs as shown, but it seems very slow to reload all changes.
3. It could be really interesting if this were a more generic solution for orchestrating tasks. I really think you came up with a great idea and I hope you continue developing this.
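Suggestion 1 could even be prototyped outside the library: given a dependency registry, the workflow tree can be rendered as JSON in a few lines. This is a hypothetical sketch — `workflow_tree` and `depends_on` are made-up names, and EasyJobs does not expose such an endpoint as far as I know:

```python
import json

# Assumed registry: each task maps to the tasks it depends on.
depends_on = {'task1': [], 'task2': ['task1'], 'task3': ['task1']}

def workflow_tree(depends_on):
    """Nest each task under the task(s) it depends on."""
    # Invert the mapping: for each task, find the tasks it triggers.
    children = {name: [] for name in depends_on}
    for task, deps in depends_on.items():
        for dep in deps:
            children[dep].append(task)

    def node(name):
        return {'task': name,
                'triggers': [node(c) for c in sorted(children[name])]}

    # Roots are tasks with no dependencies.
    roots = sorted(n for n, deps in depends_on.items() if not deps)
    return [node(r) for r in roots]

tree = workflow_tree(depends_on)
print(json.dumps(tree, indent=2))
```

Served from the existing FastAPI `server` (e.g. a GET route returning this structure), that would cover the JSON-view request.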
First, a GREAT(!) project, and I believe it deserves much more attention.
I went through the documentation and the example but I still have questions, as I couldn't make it work as I expected.
So first, for triggering a task flow without scheduling it (i.e. by request):
Let's say I have a basic flow:
task2/task4 depend on task1
task3 depends on task2 and task4
worker:
Based on the task plan I described above, are both `run_after` and `run_before` required?
Since a schedule is not set for any of the tasks,
I expected that triggering task1 would trigger the dependent tasks automatically, but it doesn't;
it triggers just task1.
Questions regarding pipeline tasks:
Is it possible to pipeline data between tasks - for example, can the return value of task1 be used in task2?
Is there an option to reuse a task in different nodes, while providing kwargs dynamically, so I can trigger the same task with different `run_before` and `run_after`?
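Conceptually, the data hand-off asked about here looks like the following plain-asyncio sketch. This is hypothetical code, not the EasyJobs API — whether EasyJobs forwards return values between dependent tasks like this is exactly the open question:

```python
import asyncio

async def task1():
    # Produce some upstream data.
    return {'rows': [1, 2, 3]}

async def task2(upstream):
    # Consume task1's return value.
    return {'total': sum(upstream['rows'])}

async def pipeline():
    first = await task1()
    return await task2(first)

result = asyncio.run(pipeline())
print(result)  # {'total': 6}
```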
I suggest adding a Discussions tab and, if possible, an examples folder - that could be really helpful.
great project!