
A question about parameter server training #1

@SearchVera

Description


Hi, your code really helps! I have one question:
In the coordinator's `train_dataset_fn`, you use `shard` to split the data across workers, and the parameter `input_context.input_pipeline_id` indicates the worker's index. I therefore assumed every worker would call `train_dataset_fn` to get its own part of the data, but your code shows only the coordinator using `train_dataset_fn`.
Can you explain how `input_context.input_pipeline_id` works?
Thanks!
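To make the question concrete, here is a plain-Python sketch of the sharding behavior being asked about. The `InputContext` class and the dataset below are stand-ins that mimic `tf.distribute.InputContext` and `tf.data.Dataset.shard`, not the real TensorFlow objects; in actual parameter server training, the coordinator only *defines* the dataset function, and each worker invokes it with an `InputContext` carrying its own `input_pipeline_id`.

```python
class InputContext:
    """Stand-in for tf.distribute.InputContext: each worker receives
    its own instance describing which input pipeline it runs."""
    def __init__(self, num_input_pipelines, input_pipeline_id):
        self.num_input_pipelines = num_input_pipelines
        self.input_pipeline_id = input_pipeline_id

def train_dataset_fn(input_context):
    # Same effect as dataset.shard(num_shards, index) in tf.data:
    # keep every num_input_pipelines-th element, offset by this worker's id.
    full_dataset = list(range(10))
    return [x for x in full_dataset
            if x % input_context.num_input_pipelines == input_context.input_pipeline_id]

# Simulate three workers each calling the same function with its own context.
num_workers = 3
shards = [train_dataset_fn(InputContext(num_workers, i))
          for i in range(num_workers)]

for worker_id, shard in enumerate(shards):
    print(f"worker {worker_id}: {shard}")
# worker 0: [0, 3, 6, 9]
# worker 1: [1, 4, 7]
# worker 2: [2, 5, 8]
```

Together the shards cover the full dataset exactly once, which is the point of sharding by `input_pipeline_id`: one function, different slices per worker.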
