SiftZendeskTest

Best practices for managing large workflows

Summary

The user is seeking best practices for managing workflows that handle large data sizes, specifically output size limits that cause task failures. They are currently using a FlyteFile and considering a temporary fix of increasing the max-output-size-bytes parameter, but want a more robust way to handle large input and output data. The discussion notes that JSON is passed inline, asks how many items are in the list, and mentions ongoing work on automatic offloading support for large lists as well as a more compact JSON representation. The user also asks about Pydantic data models and where to find documentation on which types are transferred inline versus offloaded.
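A minimal sketch of the offloading pattern discussed above. The helper names (`offload_large_output`, `load_offloaded`) are hypothetical, for illustration only: instead of passing a large JSON payload inline between tasks, the data is written to a file and only a reference is passed. In Flyte, returning a `FlyteFile` from a task achieves the same effect, with the file uploaded to blob storage.

```python
import json
import os
import tempfile


def offload_large_output(items):
    """Write a large payload to a file and return a reference (the path),
    rather than passing the JSON inline between workflow tasks.
    Hypothetical helper, illustrating the pattern only."""
    fd, path = tempfile.mkstemp(suffix=".json")
    with os.fdopen(fd, "w") as f:
        # Compact separators give a smaller on-disk representation.
        json.dump(items, f, separators=(",", ":"))
    return path


def load_offloaded(path):
    """Read the payload back from the reference in a downstream step."""
    with open(path) as f:
        return json.load(f)


items = [{"id": i, "value": i * 2} for i in range(10_000)]
ref = offload_large_output(items)
assert load_offloaded(ref) == items
```

In a real Flyte workflow the equivalent is a task annotated to return `FlyteFile` (from `flytekit.types.file`), which keeps the control plane payload small regardless of the data size.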

Status
resolved
Tags
Source
#ask-the-community