Week 11: The end for now
May 17, 2025
Welcome to my final blog post. Here I will discuss the results of my project and some further exploration of the topic. Logistic regression proved to be the simplest algorithm to work with and the most secure and accurate. This is because the sigmoid function is easily approximated by a 3rd-degree polynomial, and a simple standardization keeps the data in the [-4, 4] range where the approximation holds. Since the approximated sigmoid is a polynomial, we can compute its derivatives easily, which lets us use gradient descent and backpropagation without trouble. Running the encrypted sigmoid function on randomly generated data gave accuracy similar to the plaintext version while being about 2 minutes slower per iteration. I believe this was the most successful of all the algorithms.
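To show why this works, here is a minimal plaintext sketch (plain NumPy, no HE library involved) of fitting a degree-3 polynomial to the sigmoid on [-4, 4]. The least-squares fit here is my illustration, not the coefficients from my project code:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Sample the sigmoid on the standardized range [-4, 4] and fit a
# degree-3 polynomial by least squares.  Under HE, a polynomial like
# this (additions and multiplications only) replaces the true sigmoid.
xs = np.linspace(-4.0, 4.0, 401)
coeffs = np.polyfit(xs, sigmoid(xs), deg=3)
approx = np.poly1d(coeffs)

# Maximum error of the approximation on the fitted range.
max_err = np.max(np.abs(approx(xs) - sigmoid(xs)))
```

On this range the cubic stays within a few hundredths of the true sigmoid, which is why standardizing the data into [-4, 4] matters: outside that range a cubic diverges from the sigmoid quickly.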
The next algorithm we got results for was KNN. KNN is quite a simple algorithm to code, as the distances are just sums of squares of the differences between the data points. However, I was unable to find a suitable way to compare the encrypted distances to find the K smallest. The only successful comparison I managed was between two relatively different numbers; with multiple distances I could not find the K smallest homomorphically and was therefore forced to decrypt the data to do so. I personally don't think the 75 seconds it took was worthwhile when we inevitably have to decrypt the final distances anyway, although some will argue that decrypting them doesn't leak any unnecessary information.
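To make that split concrete, here is a minimal plaintext mock of the pipeline (plain NumPy; the encryption and decryption steps are omitted). The function name and toy data are my own illustration, not project code:

```python
import numpy as np

def knn_predict(train_X, train_y, query, k=3):
    # Squared Euclidean distances: sums of squared differences.
    # These use only additions and multiplications, so this part
    # could in principle run directly on ciphertexts.
    dists = np.sum((train_X - query) ** 2, axis=1)
    # Comparison is not HE-friendly: as described above, the
    # distances would be decrypted at this point before taking
    # the k smallest in the clear.
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy example: two clusters around (0, 0) and (5, 5).
X = np.array([[0.0, 0.1], [0.2, 0.0], [5.0, 5.1], [4.9, 5.0]])
y = np.array([0, 0, 1, 1])
pred = knn_predict(X, y, np.array([4.8, 5.2]), k=3)
```

The distance computation is the only piece that maps cleanly onto homomorphic operations; everything after `argsort` happens on decrypted values.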
The neural network proved to be the most interesting. The square activation function gave intriguing results, sometimes on par with the ReLU function but other times 10% less accurate. However, comparing the encrypted neural network to the same square-activation network on plaintext always produced the same results, as expected. One iteration took around 5 minutes to run. I personally consider this a strong success, even reaching 76% accuracy with one iteration and one hidden layer. Future work, in my opinion, involves understanding the ReLU function better and using the definition of the derivative and coordinate descent to find the optimal point. I would write ReLU(x) = f(x) = max(x, 0) = (|x| + x)/2 = (x + sqrt(x^2))/2 = (x + x^2 * 1/sqrt(x^2)) * 0.5. The inverse square root can then be calculated using Newton's method on x^2 = 1/a^2. However, this does require more study and analysis to see whether it's viable.
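The ReLU idea above can be sketched in plain Python. This is a hypothetical illustration, not code from my project: it assumes inputs are scaled so that |x| <= 1, which lets an initial guess of 1 make the multiplication-only Newton iteration for 1/sqrt(x^2) converge:

```python
def relu_approx(x, iters=12):
    # ReLU(x) = (x + |x|) / 2 = (x + x^2 / sqrt(x^2)) / 2.
    # Newton's iteration for y = 1/sqrt(c) is y <- y*(1.5 - 0.5*c*y*y),
    # which uses only additions and multiplications (HE-friendly).
    c = x * x
    y = 1.0  # initial guess; assumes inputs scaled so |x| <= 1
    for _ in range(iters):
        y = y * (1.5 - 0.5 * c * y * y)
    return 0.5 * (x + c * y)  # c * y is approximately |x|
```

For example, relu_approx(-0.7) comes out within floating-point noise of 0 and relu_approx(0.7) close to 0.7. The trade-off is that each Newton step consumes multiplicative depth on the ciphertext, and convergence slows for inputs very close to 0, so whether this beats the square activation in practice is exactly the open question above.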
Thank you so much for reading my blog posts. Hope you learned a lot about homomorphic encryption.
