Gradient descent is a popular optimization algorithm for minimizing (or maximizing) an objective function, widely used in machine learning, data science, and general optimization. It works by iteratively adjusting the parameters of a model, or the variables of an objective function, in the direction of steepest descent (or ascent). The gradient, a vector of partial derivatives, gives the direction of steepest slope, and each step moves proportionally along the negative of the gradient to approach a minimum (or along the gradient itself to approach a maximum).
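The update rule described above can be sketched in a few lines of NumPy. This is a minimal, sequential illustration of the idea, not the repository's parallel implementation; the function names and the example quadratic are ours.

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step opposite the gradient to minimize a function."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)  # move against the direction of steepest ascent
    return x

# Minimize f(x, y) = x^2 + y^2, whose gradient is (2x, 2y); minimum at the origin.
minimum = gradient_descent(lambda x: 2 * x, x0=[3.0, -4.0])
```

With a step size of 0.1 on this quadratic, each iteration shrinks the distance to the origin by a constant factor, so the iterates converge to (0, 0).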
The repository includes a Jupyter Notebook, `notebook.ipynb`, that demonstrates how to use the algorithm. The notebook provides an example problem and walks through the steps needed to apply parallel gradient descent and find the optimal solution.
To use this code locally, you need to have the following dependencies installed:
- Python (version 3.X.X)
- Jupyter Notebook
- NumPy
- Matplotlib
Follow these steps to get started:
- Clone this repository to your local machine: `git clone https://github.com/pratham-404/Parellel-Gradient-Descent.git`
- Navigate to the repository's directory: `cd Parellel-Gradient-Descent`
- Install the required dependencies using pip: `pip install -r requirements.txt`

With the dependencies installed, you're ready to use the code.
- Launch Jupyter Notebook: `jupyter notebook`
- Open the `notebook.ipynb` notebook.
- Follow the instructions provided in the notebook to understand the algorithm and run the code on your specific problem.
- Modify the notebook to suit your specific use case or objective function.
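When swapping in your own objective function, a numerical gradient can stand in until you derive the analytic one. The sketch below is a hypothetical example (the Rosenbrock benchmark and all function names are our own, not part of the notebook) showing a central-difference gradient plugged into a plain descent loop.

```python
import numpy as np

# Example replacement objective: the Rosenbrock function, a standard
# optimization benchmark whose minimum is at (1, 1).
def objective(p):
    x, y = p
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

def numerical_gradient(f, p, h=1e-6):
    """Central-difference gradient estimate, so no hand-derived
    derivative is needed when trying out a new objective."""
    p = np.asarray(p, dtype=float)
    grad = np.zeros_like(p)
    for i in range(p.size):
        step = np.zeros_like(p)
        step[i] = h
        grad[i] = (f(p + step) - f(p - step)) / (2 * h)
    return grad

# Descend from the origin with a small, conservative step size.
p = np.array([0.0, 0.0])
for _ in range(20000):
    p -= 1e-3 * numerical_gradient(objective, p)
```

The Rosenbrock valley is narrow, so plain gradient descent moves slowly here; it mainly illustrates the plumbing of substituting your own `objective`.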
Contributions to this repository are always welcome. If you find any issues or have suggestions for improvements, please feel free to open an issue or submit a pull request.
This project is licensed under the MIT License. See the LICENSE file for more details.