R.G.L. D'Oliveira, S. El Rouayheb, D. Karpuk, H. Seferoglu, Y. Xing
Rutgers University, United States
Keywords: distributed computation, privacy, secure computations, machine learning

We consider the task of training a machine learning model in a distributed manner, while keeping both the data and the model information-theoretically private. To do this, we use a recently proposed family of polynomial codes, known as GASP codes, to perform the underlying matrix multiplications. To evaluate our results, we consider several performance metrics, including communication cost, energy cost, and computation time. We also show how these codes can be used to deal with malicious adversaries.
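To illustrate the underlying primitive, the following is a minimal sketch of polynomial-coded secure matrix multiplication, the idea on which GASP codes build. It is not the GASP construction itself (which uses carefully chosen exponent "degree tables" to reduce the number of workers): here we use the simplest one-mask variant tolerating a single curious worker, so the product polynomial has degree 2 and three workers suffice. The field modulus `P`, the evaluation points `xs`, and all variable names are illustrative assumptions.

```python
import numpy as np

P = 2_147_483_647  # prime field modulus (illustrative choice)

def encode(M, R, x):
    """Worker share of matrix M at evaluation point x: f(x) = M + x*R (mod P).
    Since R is uniformly random, a single share reveals nothing about M."""
    return (M + x * R) % P

def interpolate_at_zero(points, values):
    """Lagrange-interpolate a matrix polynomial from its evaluations
    and return its value at 0, working over the field of integers mod P."""
    result = np.zeros_like(values[0])
    for i, xi in enumerate(points):
        lam = 1
        for j, xj in enumerate(points):
            if i != j:
                # modular inverse via Fermat's little theorem
                lam = lam * (-xj) * pow(xi - xj, P - 2, P) % P
        result = (result + lam * values[i]) % P
    return result

rng = np.random.default_rng(0)
A = np.array([[1, 2], [3, 4]], dtype=object)
B = np.array([[5, 6], [7, 8]], dtype=object)
R_A = rng.integers(0, P, size=A.shape).astype(object)  # random mask for A
R_B = rng.integers(0, P, size=B.shape).astype(object)  # random mask for B

xs = [1, 2, 3]  # distinct nonzero evaluation points, one per worker
# each worker multiplies its two masked shares; the product polynomial is
# h(x) = AB + (A R_B + R_A B) x + R_A R_B x^2, so h(0) = AB
answers = [encode(A, R_A, x).dot(encode(B, R_B, x)) % P for x in xs]
C = interpolate_at_zero(xs, answers)  # recovers A @ B (mod P)
```

The master thus learns `A @ B` while each worker sees only one-time-pad-masked versions of `A` and `B`; tolerating `T` colluding workers requires `T` mask terms per matrix and correspondingly more workers.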