A Survey on Distributed Learning Systems
EasyChair Preprint 2041
5 pages • Date: November 28, 2019

Abstract
Machine learning has recently been widely used in scientific research and industry to extract valuable information. A major challenge comes from the communication cost in distributed computing environments. In particular, the iterative nature of many machine learning algorithms, combined with the size of the models and the training data, requires a large amount of communication among different machines during training. Apart from the actual computational cost, which is shared among multiple machines, distributed computing introduces the additional costs of communication overhead and machine synchronization. Conventional privacy-preserving distributed machine learning approaches focus on simple distributed system architectures, which either require heavy computation loads or can only support learning over restricted scenarios. The proposed scheme not only reduces the overhead of the learning process but also provides comprehensive protection for each layer of the hierarchical distributed system.

Keyphrases: distributed, hierarchical, machine learning
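The communication-versus-computation trade-off described above can be illustrated with a minimal sketch (not the paper's actual scheme): a toy parameter-averaging setup in which each worker runs several local gradient steps between synchronization rounds, so that the number of floats exchanged depends only on the number of rounds, not on the amount of local computation. All function names and the quadratic toy loss are assumptions for illustration.

```python
import numpy as np

def distributed_avg_sgd(n_workers=4, dim=10, rounds=5, local_steps=3,
                        lr=0.1, seed=0):
    """Toy local-SGD-with-averaging sketch (illustrative, not the
    surveyed scheme).  Each worker minimizes its private quadratic loss
    ||w - t_i||^2, takes `local_steps` gradient steps per round, then
    all workers synchronize by averaging their models.  Returns the
    final averaged model and the total number of floats uploaded."""
    rng = np.random.default_rng(seed)
    targets = rng.normal(size=(n_workers, dim))   # private data per worker
    models = np.zeros((n_workers, dim))
    floats_communicated = 0
    avg = models.mean(axis=0)
    for _ in range(rounds):
        for _ in range(local_steps):
            # Gradient of ||w - t||^2 is 2 * (w - t); purely local work.
            models -= lr * 2.0 * (models - targets)
        # Synchronization: each worker uploads its model (dim floats),
        # then receives the average back.  Cost grows with rounds only.
        avg = models.mean(axis=0)
        models[:] = avg
        floats_communicated += n_workers * dim
    return avg, floats_communicated
```

Running with more local steps per round improves each worker's local progress without changing the communication bill, which is the basic reason hierarchical and communication-efficient designs try to do more work between synchronizations.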