Week 7: Rough Waters
April 21, 2023
Hi everyone! Welcome to my week 7 blog post. It's a bit delayed, but here it is!
This week I got started on writing the methods that will handle communication for my parallelized code.
My program uses spatial decomposition and MPI (Message Passing Interface). We decompose the whole channel into subdomains, and to do this we use a Cartesian topology. The Create_cart method in MPI helps with this task.
At first, I tried using Google Colab like before. I broke the fluid channel down into 2×1 regions, meaning there would be 2 subsections. I chose this because only 2 cores are available on free Google Colab. When I tried to code this part, I ran into a couple of errors, one of which I could not find a solution to, so I gave up on Colab and moved to VS Code.
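For context, here is a rough sketch of the kind of 2×1 setup I was aiming for, assuming mpi4py (this is not my exact Colab code, and the variable names are my own):

```python
# Rough sketch of a 2x1 decomposition, assuming mpi4py and 2 launched processes.
from mpi4py import MPI

comm = MPI.COMM_WORLD

# Split the channel into 2 subsections along x and 1 along y,
# with no periodic wrap-around in either direction.
cart = comm.Create_cart(dims=[2, 1], periods=[False, False])

# Each process reports which subdomain it owns.
coords = cart.Get_coords(cart.Get_rank())
print(f"Rank {cart.Get_rank()} owns subdomain {coords}")
```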
In VS Code, I set up the default 3D communicator with Create_cart, setting the dimensions to 2×2×2 and periods to False, meaning I don't want my top/bottom and left/right boundaries connected periodically. For example, if processes 1 and 2 were adjacent, one on the left and one on the right, and I set periods to True, then moving right I would get 1, 2, 1, 2, and so on: the boundaries loop around. Next, I created a 2×2 sub-communicator, since I don't actually need a 3D one; the channel of fluid I am dealing with is 2D. Essentially, this cuts the grid into 4 equal chunks along the midpoints of the x and y directions.
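Here is a minimal sketch of that setup, again assuming mpi4py and 8 launched processes (the variable names are mine, not necessarily what's in mpicomm.py):

```python
# Sketch of the 2x2x2 Cartesian topology plus a 2x2 sub-communicator,
# assuming mpi4py and 8 launched processes.
from mpi4py import MPI

comm = MPI.COMM_WORLD

# 3D Cartesian topology: 2 x 2 x 2 = 8 processes, no periodic boundaries.
cart3d = comm.Create_cart(dims=[2, 2, 2],
                          periods=[False, False, False],
                          reorder=False)

# Keep the first two dimensions and drop the third to get a 2 x 2
# sub-communicator, since the fluid channel itself is 2D.
cart2d = cart3d.Sub(remain_dims=[True, True, False])

coords = cart2d.Get_coords(cart2d.Get_rank())
print(f"Global rank {comm.Get_rank()} -> 2D coordinates {coords}")
```

One nice side effect of leaving periods as False is that a process asking for a neighbor past the edge of the grid (with Shift, for example) gets MPI.PROC_NULL back instead of wrapping around to the other side.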
I tried running the code with mpirun -np 8 mpicomm.py, meaning I wanted it to run my program mpicomm.py on 8 processes. However, I kept hitting an error.
Searching through Stack Overflow posts about the same error, I tried many random things, and the one that worked for me was installing MPICH again; it seemed there was something wrong with the way I had installed MPI on my Mac. I ran “conda install mpich” in the terminal, ran my code again, and the error was gone. However, another one appeared!
“Size of the communicator (1) is smaller than the size of the Cartesian topology (8)”
I came across an answer on Stack Overflow suggesting that this error shows up when the mpirun you launch with and the MPI library your Python bindings were built against come from different installations, which makes sense because I did just install MPICH! But that was in an attempt to fix another error; unfortunately, it brought up this one too. The next step is to purge my computer of all MPI implementations and reinstall everything from Open MPI. Starting with a clean slate! The code itself is pretty simple to understand, but the behind-the-scenes work of setting up MPI has been quite the hassle.
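In case anyone else runs into this, here is a small diagnostic sketch (assuming mpi4py; the script name is just a placeholder) that shows whether the launched ranks can actually see each other and which MPI library the bindings are linked against:

```python
# Diagnostic sketch, assuming mpi4py. Run with: mpirun -np 8 python <script>.py
# If every rank reports a world of size 1, mpirun and the linked MPI library
# probably come from different installations.
from mpi4py import MPI

comm = MPI.COMM_WORLD
library = MPI.Get_library_version().splitlines()[0]
print(f"Rank {comm.Get_rank()} sees a world of size {comm.Get_size()}; "
      f"linked against: {library}")
```

If all 8 launched processes each report a world of size 1, the mpirun on the PATH and the library the bindings were built against are almost certainly mismatched, which is exactly what the error above is complaining about.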
Phew, that was a rough week, but see you next week with some (hopefully successful) updates after I reinstall everything.
Thank you for reading!