CryptoNets and subsequent work have demonstrated the capability of homomorphic encryption (HE) for private artificial intelligence (AI) applications. While convolutional neural networks (CNNs) are primarily composed of linear functions that can be evaluated homomorphically, layers such as the activation layer are non-linear and cannot be. A common workaround is to approximate these non-linear functions with low-degree polynomials; however, efficient approximations are difficult to generate, and dataset-specific tuning is often required. This thesis presents a systematic method for constructing HE-friendly activation functions for CNNs. We first identify the key properties of a good activation function that contribute to performance by analyzing commonly used functions such as the Rectified Linear Unit (ReLU) and Sigmoid. We then analyze the inputs to the activation layer and search for an optimal approximation range for the polynomial activation. Based on our findings, we propose a novel weighted polynomial approximation method tailored to this input distribution. Finally, we demonstrate the effectiveness and robustness of our method on three datasets: MNIST, FMNIST, and CIFAR-10.
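The idea of a weighted polynomial approximation can be sketched as follows. This is an illustrative example, not the thesis's exact method: it fits a low-degree polynomial to ReLU over a bounded interval using weighted least squares, where the weights come from an assumed Gaussian distribution of activation-layer inputs. The interval [-6, 6], degree 4, and the Gaussian parameters are all illustrative assumptions.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def weighted_poly_approx(f, lo, hi, degree, mu=0.0, sigma=1.0, n=2001):
    """Weighted least-squares polynomial fit of f on [lo, hi].

    The fit minimizes sum_i w_i^2 * (p(x_i) - f(x_i))^2, so passing the
    square root of an assumed input density concentrates accuracy where
    activation inputs are most likely to fall.
    """
    x = np.linspace(lo, hi, n)
    density = np.exp(-0.5 * ((x - mu) / sigma) ** 2)  # assumed Gaussian input density
    coeffs = np.polyfit(x, f(x), degree, w=np.sqrt(density))
    return np.poly1d(coeffs)

# Degree-4 polynomial stand-in for ReLU, usable under HE since it needs
# only additions and multiplications.
p = weighted_poly_approx(relu, -6.0, 6.0, degree=4)
xs = np.linspace(-6.0, 6.0, 1001)
print(f"max abs error on [-6, 6]: {np.abs(p(xs) - relu(xs)).max():.3f}")
```

Because the weight decays away from the mean, the approximation is tightest near the center of the assumed input distribution and looser at the interval edges, which is the trade-off the abstract's input-distribution analysis is meant to exploit.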

Library of Congress Subject Headings

Neural networks (Computer science); Homomorphisms (Mathematics); Data encryption (Computer science); Convolutions (Mathematics); Machine learning

Publication Date


Document Type


Student Type


Degree Name

Computer Science (MS)

Department, Program, or Center

Computer Science (GCCIS)

Advisor

Peizhao Hu

Advisor/Committee Member

Stanislaw Radziszowski

Advisor/Committee Member

Yu Kong

Campus

RIT – Main Campus

Plan Codes