From c6568c9edc3243d72bcea6d6f279e14abc7fb132 Mon Sep 17 00:00:00 2001
From: FANGAreNotGnu
Date: Fri, 15 Nov 2024 19:19:22 +0000
Subject: [PATCH 1/3] update a bit

---
 README.md                                 | 17 ++++++++---------
 .../autogluon-assistant-quick-start.ipynb |  1 +
 2 files changed, 9 insertions(+), 9 deletions(-)

diff --git a/README.md b/README.md
index 697d305..8ea2488 100644
--- a/README.md
+++ b/README.md
@@ -4,20 +4,19 @@
 [![GitHub license](https://img.shields.io/badge/License-Apache_2.0-blue.svg)](./LICENSE)
 [![Continuous Integration](https://github.com/autogluon/autogluon-assistant/actions/workflows/lint.yml/badge.svg)](https://github.com/autogluon/autogluon-assistant/actions/workflows/lint.yml)
 
-AutoGluon Assistant (AG-A) provides users a simple interface where they can input their data, describe their problem, and receive a highly accurate and competitive ML solution — without writing any code. By leveraging the state-of-the-art AutoML capabilities of AutoGluon and integrating them with a Large Language Model (LLM), AG-A automates the entire data science pipeline. AG-A takes AutoGluon’s automation from three lines of code to zero, enabling users to solve new supervised learning tabular problems using only natural language descriptions.
+AutoGluon Assistant (AG-A) provides users with a simple interface where they can input their data, describe their problem, and receive a highly accurate and competitive ML solution — without writing any code. By leveraging the state-of-the-art AutoML capabilities of [AutoGluon](https://github.com/autogluon/autogluon) and integrating them with a Large Language Model (LLM), AG-A automates the entire data science pipeline. AG-A takes [AutoGluon](https://github.com/autogluon/autogluon)'s automation from three lines of code to zero, enabling users to solve new tabular supervised learning problems using only natural language descriptions.
 
-## Setup
+## 💾 Installation
 
-```bash
-# create a conda env
-conda create -n aga python=3.10
-conda activate aga
+AutoGluon Assistant is supported on Python 3.8 - 3.11 and is available on Linux, macOS, and Windows.
+
+You can install AutoGluon Assistant with:
 
-# clone repositories
-git clone https://github.com/autogluon/autogluon-assistant.git
-cd autogluon-assistant && pip install -e ".[dev]" && cd ..
+```bash
+pip install autogluon-assistant
 ```
 
+Visit our [Installation Guide (WIP)](https://auto.gluon.ai/stable/install.html) for detailed instructions, including (TBA).
 ### API Keys

diff --git a/docs/tutorials/autogluon-assistant-quick-start.ipynb b/docs/tutorials/autogluon-assistant-quick-start.ipynb
index ee32d83..0e5fbb5 100644
--- a/docs/tutorials/autogluon-assistant-quick-start.ipynb
+++ b/docs/tutorials/autogluon-assistant-quick-start.ipynb
@@ -58,6 +58,7 @@
    "metadata": {},
    "outputs": [],
    "source": [
+    "#TODO: Add command for configuration tool.\n",
     "# Option A: AWS Bedrock (Recommended)\n",
     "!export AWS_DEFAULT_REGION=''\n",
     "!export AWS_ACCESS_KEY_ID=''\n",

From 43b798848e402347cac2188c4d7bf6d363d4f42b Mon Sep 17 00:00:00 2001
From: FANGAreNotGnu
Date: Tue, 19 Nov 2024 22:03:54 +0000
Subject: [PATCH 2/3] update tutorial

---
 .../autogluon-assistant-quick-start.ipynb | 36 ++++++++++++++-----
 1 file changed, 28 insertions(+), 8 deletions(-)

diff --git a/docs/tutorials/autogluon-assistant-quick-start.ipynb b/docs/tutorials/autogluon-assistant-quick-start.ipynb
index 60f5407..c8ed824 100644
--- a/docs/tutorials/autogluon-assistant-quick-start.ipynb
+++ b/docs/tutorials/autogluon-assistant-quick-start.ipynb
@@ -43,12 +43,29 @@
     "!pip install autogluon.assistant"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "id": "3ea8b014",
+   "metadata": {},
+   "source": [
+    "*Warning: If you are using macOS, you may need to install libomp with*\n",
+    "```bash\n",
+    "brew install libomp\n",
+    "pip install --upgrade lightgbm\n",
+    "```"
+   ]
+  },
   {
    "cell_type": "markdown",
    "id": "8d4f6834",
    "metadata": {},
    "source": [
-    "AutoGluon Assistant supports two LLM providers: AWS Bedrock (default) and OpenAI. Choose one of the following setups:"
+    "AutoGluon Assistant supports two LLM providers: AWS Bedrock (default) and OpenAI. You can configure one with our provided tool:\n",
+    "```bash\n",
+    "wget https://raw.githubusercontent.com/autogluon/autogluon-assistant/refs/heads/main/tools/configure_llms.sh\n",
+    "source ./configure_llms.sh\n",
+    "```\n",
+    "Or choose one of the following setups:"
    ]
   },
   {
@@ -58,7 +75,6 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "#TODO: Add command for configuration tool.\n",
     "# Option A: AWS Bedrock (Recommended)\n",
     "!export AWS_DEFAULT_REGION=''\n",
     "!export AWS_ACCESS_KEY_ID=''\n",
@@ -373,16 +389,18 @@
     "\n",
     "......\n",
-    "Fitting model: WeightedEnsemble_L2 ... Training model for up to 360.0s of the 581.72s of remaining time.\n",
+    "Fitting model: WeightedEnsemble_L2 ... Training model for up to 360.0s of the 556.21s of remaining time.\n",
     " Ensemble Weights: {'LightGBMLarge': 0.4, 'NeuralNetTorch': 0.25, 'NeuralNetFastAI': 0.2, 'CatBoost': 0.15}\n",
     " 0.855 = Validation score (accuracy)\n",
-    " 0.12s = Training runtime\n",
+    " 0.16s = Training runtime\n",
     " 0.0s = Validation runtime\n",
-    "AutoGluon training complete, total runtime = 18.41s ... Best model: WeightedEnsemble_L2 | Estimated inference throughput: 4025.3 rows/s (200 batch size)\n",
-    "TabularPredictor saved. To load, use: predictor = TabularPredictor.load(\"AutogluonModels/ag-20241111_055131\")\n",
+    "AutoGluon training complete, total runtime = 26.47s ... Best model: WeightedEnsemble_L2 | Estimated inference throughput: 2470.3 rows/s (200 batch size)\n",
+    "TabularPredictor saved. To load, use: predictor = TabularPredictor.load(\"AutogluonModels/ag-20241119_214901\")\n",
     "Model training complete!\n",
+    "INFO:root:It took 26.84 seconds training model. Time remaining: 555.67/600.00\n",
     "Prediction starts...\n",
-    "Prediction complete! Outputs written to aga-output-20241111_055149.csv\n",
+    "Prediction complete! Outputs written to aga-output-20241119_214928.csv\n",
+    "INFO:root:It took 0.15 seconds making predictions. Time remaining: 555.52/600.00\n",
     "```"
    ]
   },
@@ -511,7 +529,9 @@
    "id": "b20d780a",
    "metadata": {},
    "source": [
-    "AG-A Web UI should now be accessible in your web browser at http://localhost:8501 or the specified port."
+    "The AG-A Web UI should now be accessible in your web browser at http://localhost:8501 or the specified port.\n",
+    "\n",
+    "*Note: It might take a few minutes to launch the Web UI for the first time, since the sample datasets are downloaded on first launch.*"
    ]
   },
   {
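A note on the credential setup above: the later `aga run` and `aga ui` commands assume that either the AWS Bedrock variables or the OpenAI key are exported in the shell that runs them. The snippet below is an illustrative sanity check only and is not part of these patches; the variable names are the ones used in the tutorial cells above.

```bash
# Illustrative check (not part of the patches): verify that one of the two
# credential sets from the tutorial is present before running `aga`.
if [ -n "${OPENAI_API_KEY:-}" ]; then
  echo "OpenAI credentials detected."
elif [ -n "${AWS_DEFAULT_REGION:-}" ] && [ -n "${AWS_ACCESS_KEY_ID:-}" ] && [ -n "${AWS_SECRET_ACCESS_KEY:-}" ]; then
  echo "AWS Bedrock credentials detected."
else
  echo "No LLM credentials found; run configure_llms.sh or export them first." >&2
fi
```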
From 3050dc8684262c2945249b5f8510ce9492b1570c Mon Sep 17 00:00:00 2001
From: FANGAreNotGnu
Date: Tue, 19 Nov 2024 22:09:35 +0000
Subject: [PATCH 3/3] update bash

---
 .../autogluon-assistant-quick-start.ipynb | 54 ++++++------------
 1 file changed, 16 insertions(+), 38 deletions(-)

diff --git a/docs/tutorials/autogluon-assistant-quick-start.ipynb b/docs/tutorials/autogluon-assistant-quick-start.ipynb
index c8ed824..ba4c94b 100644
--- a/docs/tutorials/autogluon-assistant-quick-start.ipynb
+++ b/docs/tutorials/autogluon-assistant-quick-start.ipynb
@@ -65,23 +65,16 @@
     "wget https://raw.githubusercontent.com/autogluon/autogluon-assistant/refs/heads/main/tools/configure_llms.sh\n",
     "source ./configure_llms.sh\n",
     "```\n",
-    "Or choose one of the following setups:"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "id": "ff904c9d1af0ac39",
-   "metadata": {},
-   "outputs": [],
-   "source": [
+    "Or choose one of the following setups:\n",
+    "```bash\n",
     "# Option A: AWS Bedrock (Recommended)\n",
-    "!export AWS_DEFAULT_REGION=''\n",
-    "!export AWS_ACCESS_KEY_ID=''\n",
-    "!export AWS_SECRET_ACCESS_KEY=''\n",
+    "export AWS_DEFAULT_REGION=''\n",
+    "export AWS_ACCESS_KEY_ID=''\n",
+    "export AWS_SECRET_ACCESS_KEY=''\n",
     "### OR ###\n",
     "# Option B: OpenAI\n",
-    "!export OPENAI_API_KEY='sk-...'"
+    "export OPENAI_API_KEY='sk-...'\n",
+    "```"
    ]
   },
   {
@@ -225,20 +218,13 @@
    "source": [
     "## Using AutoGluon Assistant (via Command Line Interface)\n",
     "\n",
-    "Now that we have our data ready, let's use AutoGluon Assistant to build our ML model. The simplest way to use AutoGluon Assistant is through the command line - no coding required! After installing the package, you can run it directly from your terminal:"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "id": "47bbf825",
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "%%bash\n",
+    "Now that we have our data ready, let's use AutoGluon Assistant to build our ML model. The simplest way to use AutoGluon Assistant is through the command line - no coding required! After installing the package, you can run it directly from your terminal:\n",
+    "\n",
+    "```bash\n",
     "aga run ./toy_data \\\n",
     "    --presets medium_quality # (Optional) Choose prediction quality level:\n",
-    "                             # Options: medium_quality, high_quality, best_quality (default)"
+    "                             # Options: medium_quality, high_quality, best_quality (default)\n",
+    "```"
    ]
   },
   {
@@ -504,24 +490,16 @@
    "id": "e50f3025",
    "metadata": {},
    "source": [
-    "### To run the AG-A Web UI:"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "id": "e609ff9b",
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "%%bash\n",
-    "\n",
+    "### To run the AG-A Web UI:\n",
+    "\n",
+    "```bash\n",
     "aga ui\n",
     "\n",
     "# OR\n",
     "\n",
     "# Launch Web-UI on specific port e.g. 8888\n",
-    "aga ui --port 8888"
+    "aga ui --port 8888\n",
+    "```"
    ]
   },
   {
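Taken together, the README and tutorial changes above describe the following end-to-end workflow. The block below is a consolidated sketch assembled only from commands that appear in the patches themselves; `./toy_data` is the tutorial's example data directory and 8888 is the example port used above.

```bash
# Install AutoGluon Assistant
pip install autogluon-assistant

# Configure an LLM provider (AWS Bedrock by default, or OpenAI)
wget https://raw.githubusercontent.com/autogluon/autogluon-assistant/refs/heads/main/tools/configure_llms.sh
source ./configure_llms.sh

# Run on a data directory from the command line
aga run ./toy_data --presets medium_quality

# Or launch the Web UI (default port 8501)
aga ui --port 8888
```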