upload instrumentation #443

@ransomw1c

Description

At One Concern, in addition to using the sidecar within Argo workflows, we distribute datamon to the desktop with brew.

Frequently, data scientists need to "ingest," as we say, data into the Argo workflows comprising, for instance, the flood simulation pipeline(s) without running a pre-packaged ingestor workflow. Sometimes there's a 500 error, or bundle upload or bundle mount new fails for one reason or another. This task proposes to begin addressing a pain point that is already solved in part by the fact that duplicate blobs (2k chunks) aren't uploaded twice.

Specifically, the idea is to instrument the paths from desktop to cloud (bundle upload, bundle mount new, etc.), whether via golang in the binary, shell script as in the sidecar, or Python (bindings for which exist in #393, unmerged only because of documentation requirements), in order to provide

  • metrics and usage statistics to improve datamon
  • progress indicators, logging, and a smoother experience for data science
  • any and all additional tracing, timing, and output formatting to ease backpressure on this known issue
