Optimized gradient descent using parallel processing for linear regression.


⚡ Parellel-Gradient-Descent ⚡

Gradient descent is a popular optimization algorithm used to minimize (or maximize) an objective function, and it is widely used in machine learning, data science, and optimization. It works by iteratively adjusting the parameters of a model, or the variables of an objective function, in the direction of steepest descent (or ascent): the gradient (a vector of partial derivatives) gives the direction of steepest slope, and each iteration takes a step proportional to the negative of the gradient until the function reaches a minimum (or maximum).

*Figure: gradient descent illustration*
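As a concrete illustration of the update rule described above, here is a minimal, hypothetical NumPy sketch of batch gradient descent for linear regression (the data, learning rate, and iteration count are illustrative, not taken from the notebook):

```python
import numpy as np

# Synthetic linear-regression problem: y = X @ true_w plus a little noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([3.0, -1.5])
y = X @ true_w + 0.01 * rng.normal(size=100)

w = np.zeros(2)   # initial parameters
lr = 0.1          # learning rate (step size)
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error w.r.t. w
    w -= lr * grad                     # step against the gradient

print(np.round(w, 2))  # ≈ [ 3.  -1.5]
```

Each iteration evaluates the full-batch gradient, which is exactly the per-sample work that a parallel implementation can split across workers.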

The repository includes a Jupyter notebook, `notebook.ipynb`, that demonstrates how to use the algorithm. The notebook provides an example problem and walks through the steps to apply parallel gradient descent to find the optimal solution.

🛠️ Installation

To use this code locally, you need to have the following dependencies installed:

  • Python 3.x
  • Jupyter Notebook
  • NumPy
  • Matplotlib

Follow these steps to get started:

  1. Clone this repository to your local machine using the following command:

```shell
git clone https://github.com/pratham-404/Parellel-Gradient-Descent.git
```

  2. Navigate to the repository's directory:

```shell
cd Parellel-Gradient-Descent
```

  3. Install the required dependencies using pip:

```shell
pip install -r requirements.txt
```

With the dependencies installed, you're ready to use the code.

🚀 Usage

  1. Launch Jupyter Notebook by running the following command:

```shell
jupyter notebook
```

  2. Open the `notebook.ipynb` notebook.

  3. Follow the instructions provided in the notebook to understand the algorithm and run the code on your specific problem.

  4. Modify the notebook to suit your specific use case or objective function.
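Adapting the method to a different objective only requires swapping in that objective's gradient. As a hypothetical example (the function name and penalty strength below are illustrative, not from the notebook), ridge regression just adds an L2 penalty term to the linear-regression gradient:

```python
import numpy as np

def ridge_gradient(X, y, w, lam=0.1):
    """Gradient of ridge regression: MSE gradient plus an L2 penalty term."""
    return X.T @ (X @ w - y) / len(y) + lam * w

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, -1.0])

w = np.zeros(2)
for _ in range(300):
    w -= 0.1 * ridge_gradient(X, y, w)
# The penalty shrinks w toward zero relative to the unregularized solution.
```

The same substitution works inside a parallel loop, since only the gradient computation changes.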

🤝 Contributing

Contributions to this repository are always welcome. If you find any issues or have suggestions for improvements, please feel free to open an issue or submit a pull request.

License

This project is licensed under the MIT License. See the LICENSE file for more details.
