Compare commits
No commits in common. "master" and "github-repo-stats" have entirely different histories.
@@ -1,61 +0,0 @@
# Ignore everything by default, selectively add things to context
*

# Platform - Libs
!autogpt_platform/autogpt_libs/autogpt_libs/
!autogpt_platform/autogpt_libs/pyproject.toml
!autogpt_platform/autogpt_libs/poetry.lock
!autogpt_platform/autogpt_libs/README.md

# Platform - Backend
!autogpt_platform/backend/backend/
!autogpt_platform/backend/migrations/
!autogpt_platform/backend/schema.prisma
!autogpt_platform/backend/pyproject.toml
!autogpt_platform/backend/poetry.lock
!autogpt_platform/backend/README.md

# Platform - Market
!autogpt_platform/market/market/
!autogpt_platform/market/scripts.py
!autogpt_platform/market/schema.prisma
!autogpt_platform/market/pyproject.toml
!autogpt_platform/market/poetry.lock
!autogpt_platform/market/README.md

# Platform - Frontend
!autogpt_platform/frontend/src/
!autogpt_platform/frontend/public/
!autogpt_platform/frontend/package.json
!autogpt_platform/frontend/yarn.lock
!autogpt_platform/frontend/tsconfig.json
!autogpt_platform/frontend/README.md
## config
!autogpt_platform/frontend/*.config.*
!autogpt_platform/frontend/.env.*

# Classic - AutoGPT
!classic/original_autogpt/autogpt/
!classic/original_autogpt/pyproject.toml
!classic/original_autogpt/poetry.lock
!classic/original_autogpt/README.md
!classic/original_autogpt/tests/

# Classic - Benchmark
!classic/benchmark/agbenchmark/
!classic/benchmark/pyproject.toml
!classic/benchmark/poetry.lock
!classic/benchmark/README.md

# Classic - Forge
!classic/forge/
!classic/forge/pyproject.toml
!classic/forge/poetry.lock
!classic/forge/README.md

# Classic - Frontend
!classic/frontend/build/web/

# Explicitly re-ignore some folders
.*
**/__pycache__
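The deny-all-then-allowlist file above relies on gitignore-style pattern ordering: every pattern is applied in sequence and the last one that matches a path wins, with `!` re-including. A hedged, simplified sketch of that rule (real gitignore matching also requires parent directories to be re-included, which this ignores):

```python
# Simplified gitignore-style matching: last matching pattern decides,
# '!' re-includes. Illustrative only, not a full gitignore implementation.
from fnmatch import fnmatch

def is_ignored(path, patterns):
    ignored = False
    for pattern in patterns:
        negated = pattern.startswith("!")
        raw = pattern.lstrip("!").rstrip("/")
        # '*' catches everything; a directory pattern matches as a path prefix
        if raw == "*" or path == raw or path.startswith(raw + "/") or fnmatch(path, raw):
            ignored = not negated
    return ignored

patterns = ["*", "!autogpt_platform/backend/backend/", ".*"]
print(is_ignored("autogpt_platform/backend/backend/app.py", patterns))  # False: re-included
print(is_ignored("random/other_file.py", patterns))                     # True: caught by '*'
```

Note the final `.*` pattern re-ignores dotfiles even under re-included directories, because it comes last.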
@@ -1,10 +0,0 @@
classic/frontend/build/** linguist-generated

**/poetry.lock linguist-generated

docs/_javascript/** linguist-vendored

# Exclude VCR cassettes from stats
classic/forge/tests/vcr_cassettes/**/**.y*ml linguist-generated

* text=auto
@@ -1,7 +0,0 @@
* @Significant-Gravitas/maintainers
.github/workflows/ @Significant-Gravitas/devops
classic/forge/ @Significant-Gravitas/forge-maintainers
classic/benchmark/ @Significant-Gravitas/benchmark-maintainers
classic/frontend/ @Significant-Gravitas/frontend-maintainers
autogpt_platform/infra @Significant-Gravitas/devops
.github/CODEOWNERS @Significant-Gravitas/admins
@@ -1,173 +0,0 @@
name: Bug report 🐛
description: Create a bug report for AutoGPT.
labels: ['status: needs triage']
body:
  - type: markdown
    attributes:
      value: |
        ### ⚠️ Before you continue
        * Check out our [backlog], [roadmap] and join our [discord] to discuss what's going on
        * If you need help, you can ask in the [discussions] section or in [#tech-support]
        * **Thoroughly search the [existing issues] before creating a new one**
        * Read our [wiki page on Contributing]
        [backlog]: https://github.com/orgs/Significant-Gravitas/projects/1
        [roadmap]: https://github.com/orgs/Significant-Gravitas/projects/2
        [discord]: https://discord.gg/autogpt
        [discussions]: https://github.com/Significant-Gravitas/AutoGPT/discussions
        [#tech-support]: https://discord.com/channels/1092243196446249134/1092275629602394184
        [existing issues]: https://github.com/Significant-Gravitas/AutoGPT/issues?q=is%3Aissue
        [wiki page on Contributing]: https://github.com/Significant-Gravitas/AutoGPT/wiki/Contributing

  - type: checkboxes
    attributes:
      label: ⚠️ Search for existing issues first ⚠️
      description: >
        Please [search the history](https://github.com/Significant-Gravitas/AutoGPT/issues)
        to see if an issue already exists for the same problem.
      options:
        - label: I have searched the existing issues, and there is no existing issue for my problem
          required: true

  - type: markdown
    attributes:
      value: |
        Please confirm that the issue is described well and precisely in the title above ⬆️.
        A good rule of thumb: What would you type if you were searching for the issue?

        For example:
        BAD - my AutoGPT keeps looping
        GOOD - After performing execute_python_file, AutoGPT goes into a loop where it keeps trying to execute the file.

        ⚠️ SUPER-busy repo, please help the volunteer maintainers.
        The less time we spend here, the more time we can spend building AutoGPT.

        Please help us help you by following these steps:
        - Search for existing issues; adding a comment when you have the same or a similar issue is tidier than opening a new one.
          Newer issues are not reviewed earlier; review order depends on the current priorities set by our wonderful team.
        - Ask on our Discord if you are unsure whether your issue is known (https://discord.gg/autogpt)
        - Provide relevant info:
          - Provide the commit hash (`git rev-parse HEAD` gets it) if possible
          - If it's a pip/packages issue, mention this in the title and provide your pip and Python versions
          - If it's a crash, provide the traceback and describe the error as precisely as possible in the title.

  - type: dropdown
    attributes:
      label: Which Operating System are you using?
      description: >
        Please select the operating system you were using to run AutoGPT when this problem occurred.
      options:
        - Windows
        - Linux
        - MacOS
        - Docker
        - Devcontainer / Codespace
        - Windows Subsystem for Linux (WSL)
        - Other
    validations:
      required: true
    nested_fields:
      - type: text
        attributes:
          label: Specify the system
          description: Please specify the system you are working on.

  - type: dropdown
    attributes:
      label: Which version of AutoGPT are you using?
      description: |
        Please select which version of AutoGPT you were using when this issue occurred.
        If you downloaded the code from the [releases page](https://github.com/Significant-Gravitas/AutoGPT/releases/) make sure you were using the latest code.
        **If you weren't, please try with the [latest code](https://github.com/Significant-Gravitas/AutoGPT/releases/)**.
        If installed with git you can run `git branch` to see which version of AutoGPT you are running.
      options:
        - Latest Release
        - Stable (branch)
        - Master (branch)
    validations:
      required: true

  - type: dropdown
    attributes:
      label: What LLM Provider do you use?
      description: >
        If you are using AutoGPT with `SMART_LLM=gpt-3.5-turbo`, your problems may be caused by
        the [limitations](https://github.com/Significant-Gravitas/AutoGPT/issues?q=is%3Aissue+label%3A%22AI+model+limitation%22) of GPT-3.5.
      options:
        - Azure
        - Groq
        - Anthropic
        - Llamafile
        - Other (detail in issue)
    validations:
      required: true

  - type: dropdown
    attributes:
      label: Which area covers your issue best?
      description: >
        Select the area related to the issue you are reporting.
      options:
        - Installation and setup
        - Memory
        - Performance
        - Prompt
        - Commands
        - Plugins
        - AI Model Limitations
        - Challenges
        - Documentation
        - Logging
        - Agents
        - Other
    validations:
      required: true
    autolabels: true
    nested_fields:
      - type: text
        attributes:
          label: Specify the area
          description: Please specify the area you think is best related to the issue.

  - type: input
    attributes:
      label: What commit or version are you using?
      description: Knowing what version of the software you were using when this happened helps us reproduce the issue. Please run `git log -n 1 --pretty=format:"%H"` to output the full commit hash.
    validations:
      required: true

  - type: textarea
    attributes:
      label: Describe your issue.
      description: Describe the problem you are experiencing. Try to describe only the issue and phrase it briefly but clearly. ⚠️ Provide NO other data in this field
    validations:
      required: true

  # The following are optional file content uploads
  - type: markdown
    attributes:
      value: |
        ⚠️ The following is OPTIONAL; please keep in mind that the log files may contain personal information such as credentials. ⚠️

        "The log files are located in the folder 'logs' inside the main AutoGPT folder."

  - type: textarea
    attributes:
      label: Upload Activity Log Content
      description: |
        Upload the activity log content; this can help us understand the issue better.
        To do this, go to the folder logs in your main AutoGPT folder, open activity.log and copy/paste the contents into this field.
        ⚠️ The activity log may contain personal data given to AutoGPT by you in prompt or input, as well as
        any personal information that AutoGPT collected out of files during the last run. Do not add the activity log if you are not comfortable with sharing it. ⚠️
    validations:
      required: false

  - type: textarea
    attributes:
      label: Upload Error Log Content
      description: |
        Upload the error log content; this will help us understand the issue better.
        To do this, go to the folder logs in your main AutoGPT folder, open error.log and copy/paste the contents into this field.
        ⚠️ The error log may contain personal data given to AutoGPT by you in prompt or input, as well as
        any personal information that AutoGPT collected out of files during the last run. Do not add the error log if you are not comfortable with sharing it. ⚠️
    validations:
      required: false
@@ -1,28 +0,0 @@
name: Feature request 🚀
description: Suggest a new idea for AutoGPT!
labels: ['status: needs triage']
body:
  - type: markdown
    attributes:
      value: |
        First, check out our [wiki page on Contributing](https://github.com/Significant-Gravitas/AutoGPT/wiki/Contributing)
        Please provide a searchable summary of the issue in the title above ⬆️.
  - type: checkboxes
    attributes:
      label: Duplicates
      description: Please [search the history](https://github.com/Significant-Gravitas/AutoGPT/issues) to see if an issue already exists for the same problem.
      options:
        - label: I have searched the existing issues
          required: true
  - type: textarea
    attributes:
      label: Summary 💡
      description: Describe how it should work.
  - type: textarea
    attributes:
      label: Examples 🌈
      description: Provide a link to other implementations, or screenshots of the expected behavior.
  - type: textarea
    attributes:
      label: Motivation 🔦
      description: What are you trying to accomplish? How has the lack of this feature affected you? Providing context helps us come up with a solution that is more useful in the real world.
@@ -1,38 +0,0 @@
<!-- Clearly explain the need for these changes: -->

### Changes 🏗️

<!-- Concisely describe all of the changes made in this pull request: -->

### Checklist 📋

#### For code changes:
- [ ] I have clearly listed my changes in the PR description
- [ ] I have made a test plan
- [ ] I have tested my changes according to the test plan:
  <!-- Put your test plan here: -->
  - [ ] ...

<details>
<summary>Example test plan</summary>

- [ ] Create from scratch and execute an agent with at least 3 blocks
- [ ] Import an agent from file upload, and confirm it executes correctly
- [ ] Upload agent to marketplace
- [ ] Import an agent from marketplace and confirm it executes correctly
- [ ] Edit an agent from monitor, and confirm it executes correctly
</details>

#### For configuration changes:
- [ ] `.env.example` is updated or already compatible with my changes
- [ ] `docker-compose.yml` is updated or already compatible with my changes
- [ ] I have included a list of my configuration changes in the PR description (under **Changes**)

<details>
<summary>Examples of configuration changes</summary>

- Changing ports
- Adding new services that need to communicate with each other
- Secrets or environment variable changes
- New or infrastructure changes such as databases
</details>
@@ -1,175 +0,0 @@
version: 2
updates:
  # autogpt_libs (Poetry project)
  - package-ecosystem: "pip"
    directory: "autogpt_platform/autogpt_libs"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 10
    target-branch: "dev"
    commit-message:
      prefix: "chore(libs/deps)"
      prefix-development: "chore(libs/deps-dev)"
    groups:
      production-dependencies:
        dependency-type: "production"
        update-types:
          - "minor"
          - "patch"
      development-dependencies:
        dependency-type: "development"
        update-types:
          - "minor"
          - "patch"

  # backend (Poetry project)
  - package-ecosystem: "pip"
    directory: "autogpt_platform/backend"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 10
    target-branch: "dev"
    commit-message:
      prefix: "chore(backend/deps)"
      prefix-development: "chore(backend/deps-dev)"
    groups:
      production-dependencies:
        dependency-type: "production"
        update-types:
          - "minor"
          - "patch"
      development-dependencies:
        dependency-type: "development"
        update-types:
          - "minor"
          - "patch"

  # frontend (Next.js project)
  - package-ecosystem: "npm"
    directory: "autogpt_platform/frontend"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 10
    target-branch: "dev"
    commit-message:
      prefix: "chore(frontend/deps)"
      prefix-development: "chore(frontend/deps-dev)"
    groups:
      production-dependencies:
        dependency-type: "production"
        update-types:
          - "minor"
          - "patch"
      development-dependencies:
        dependency-type: "development"
        update-types:
          - "minor"
          - "patch"

  # infra (Terraform)
  - package-ecosystem: "terraform"
    directory: "autogpt_platform/infra"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 5
    target-branch: "dev"
    commit-message:
      prefix: "chore(infra/deps)"
      prefix-development: "chore(infra/deps-dev)"
    groups:
      production-dependencies:
        dependency-type: "production"
        update-types:
          - "minor"
          - "patch"
      development-dependencies:
        dependency-type: "development"
        update-types:
          - "minor"
          - "patch"

  # GitHub Actions
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 5
    target-branch: "dev"
    groups:
      production-dependencies:
        dependency-type: "production"
        update-types:
          - "minor"
          - "patch"
      development-dependencies:
        dependency-type: "development"
        update-types:
          - "minor"
          - "patch"

  # Docker
  - package-ecosystem: "docker"
    directory: "autogpt_platform/"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 5
    target-branch: "dev"
    groups:
      production-dependencies:
        dependency-type: "production"
        update-types:
          - "minor"
          - "patch"
      development-dependencies:
        dependency-type: "development"
        update-types:
          - "minor"
          - "patch"

  # Submodules
  - package-ecosystem: "gitsubmodule"
    directory: "autogpt_platform/supabase"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 1
    target-branch: "dev"
    commit-message:
      prefix: "chore(platform/deps)"
      prefix-development: "chore(platform/deps-dev)"
    groups:
      production-dependencies:
        dependency-type: "production"
        update-types:
          - "minor"
          - "patch"
      development-dependencies:
        dependency-type: "development"
        update-types:
          - "minor"
          - "patch"

  # Docs
  - package-ecosystem: "pip"
    directory: "docs/"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 1
    target-branch: "dev"
    commit-message:
      prefix: "chore(docs/deps)"
    groups:
      production-dependencies:
        dependency-type: "production"
        update-types:
          - "minor"
          - "patch"
      development-dependencies:
        dependency-type: "development"
        update-types:
          - "minor"
          - "patch"
@@ -1,32 +0,0 @@
Classic AutoGPT Agent:
  - changed-files:
      - any-glob-to-any-file: classic/original_autogpt/**

Classic Benchmark:
  - changed-files:
      - any-glob-to-any-file: classic/benchmark/**

Classic Frontend:
  - changed-files:
      - any-glob-to-any-file: classic/frontend/**

Forge:
  - changed-files:
      - any-glob-to-any-file: classic/forge/**

documentation:
  - changed-files:
      - any-glob-to-any-file: docs/**

platform/frontend:
  - changed-files:
      - any-glob-to-any-file: autogpt_platform/frontend/**

platform/backend:
  - changed-files:
      - any-glob-to-any-file: autogpt_platform/backend/**
      - all-globs-to-all-files: '!autogpt_platform/backend/backend/blocks/**'

platform/blocks:
  - changed-files:
      - any-glob-to-any-file: autogpt_platform/backend/backend/blocks/**
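The labeler config above maps changed file paths to PR labels via `dir/**` globs. A hedged sketch of that mapping, not the actions/labeler implementation (the helper and the directory-prefix simplification are illustrative; the real action uses minimatch-style globs):

```python
# Map changed paths to labels, treating 'dir/**' as a directory-prefix match.
def labels_for(changed_paths):
    rules = {
        "documentation": "docs/",
        "platform/blocks": "autogpt_platform/backend/backend/blocks/",
        "platform/backend": "autogpt_platform/backend/",
        "Forge": "classic/forge/",
    }
    labels = set()
    for label, prefix in rules.items():
        if any(path.startswith(prefix) for path in changed_paths):
            labels.add(label)
    return labels

print(sorted(labels_for(["docs/index.md", "classic/forge/forge/agent.py"])))
# ['Forge', 'documentation']
```

Note that a change under `blocks/` matches both `platform/blocks` and `platform/backend` prefixes; the config's `all-globs-to-all-files: '!...blocks/**'` rule exists precisely to suppress `platform/backend` when *only* block files changed, which this simplified sketch omits.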
@@ -1,138 +0,0 @@
name: Classic - AutoGPT CI

on:
  push:
    branches: [ master, dev, ci-test* ]
    paths:
      - '.github/workflows/classic-autogpt-ci.yml'
      - 'classic/original_autogpt/**'
  pull_request:
    branches: [ master, dev, release-* ]
    paths:
      - '.github/workflows/classic-autogpt-ci.yml'
      - 'classic/original_autogpt/**'

concurrency:
  group: ${{ format('classic-autogpt-ci-{0}', github.head_ref && format('{0}-{1}', github.event_name, github.event.pull_request.number) || github.sha) }}
  cancel-in-progress: ${{ startsWith(github.event_name, 'pull_request') }}

defaults:
  run:
    shell: bash
    working-directory: classic/original_autogpt

jobs:
  test:
    permissions:
      contents: read
    timeout-minutes: 30
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.10"]
        platform-os: [ubuntu, macos, macos-arm64, windows]
    runs-on: ${{ matrix.platform-os != 'macos-arm64' && format('{0}-latest', matrix.platform-os) || 'macos-14' }}

    steps:
      # Quite slow on macOS (2~4 minutes to set up Docker)
      # - name: Set up Docker (macOS)
      #   if: runner.os == 'macOS'
      #   uses: crazy-max/ghaction-setup-docker@v3

      - name: Start MinIO service (Linux)
        if: runner.os == 'Linux'
        working-directory: '.'
        run: |
          docker pull minio/minio:edge-cicd
          docker run -d -p 9000:9000 minio/minio:edge-cicd

      - name: Start MinIO service (macOS)
        if: runner.os == 'macOS'
        working-directory: ${{ runner.temp }}
        run: |
          brew install minio/stable/minio
          mkdir data
          minio server ./data &

      # No MinIO on Windows:
      # - Windows doesn't support running Linux Docker containers
      # - It doesn't seem possible to start background processes on Windows. They are
      #   killed after the step returns.
      #   See: https://github.com/actions/runner/issues/598#issuecomment-2011890429

      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          submodules: true

      - name: Configure git user Auto-GPT-Bot
        run: |
          git config --global user.name "Auto-GPT-Bot"
          git config --global user.email "github-bot@agpt.co"

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

      - id: get_date
        name: Get date
        run: echo "date=$(date +'%Y-%m-%d')" >> $GITHUB_OUTPUT

      - name: Set up Python dependency cache
        # On Windows, unpacking cached dependencies takes longer than just installing them
        if: runner.os != 'Windows'
        uses: actions/cache@v4
        with:
          path: ${{ runner.os == 'macOS' && '~/Library/Caches/pypoetry' || '~/.cache/pypoetry' }}
          key: poetry-${{ runner.os }}-${{ hashFiles('classic/original_autogpt/poetry.lock') }}

      - name: Install Poetry (Unix)
        if: runner.os != 'Windows'
        run: |
          curl -sSL https://install.python-poetry.org | python3 -

          if [ "${{ runner.os }}" = "macOS" ]; then
            PATH="$HOME/.local/bin:$PATH"
            echo "$HOME/.local/bin" >> $GITHUB_PATH
          fi

      - name: Install Poetry (Windows)
        if: runner.os == 'Windows'
        shell: pwsh
        run: |
          (Invoke-WebRequest -Uri https://install.python-poetry.org -UseBasicParsing).Content | python -

          $env:PATH += ";$env:APPDATA\Python\Scripts"
          echo "$env:APPDATA\Python\Scripts" >> $env:GITHUB_PATH

      - name: Install Python dependencies
        run: poetry install

      - name: Run pytest with coverage
        run: |
          poetry run pytest -vv \
            --cov=autogpt --cov-branch --cov-report term-missing --cov-report xml \
            --numprocesses=logical --durations=10 \
            tests/unit tests/integration
        env:
          CI: true
          PLAIN_OUTPUT: True
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
          S3_ENDPOINT_URL: ${{ runner.os != 'Windows' && 'http://127.0.0.1:9000' || '' }}
          AWS_ACCESS_KEY_ID: minioadmin
          AWS_SECRET_ACCESS_KEY: minioadmin

      - name: Upload coverage reports to Codecov
        uses: codecov/codecov-action@v4
        with:
          token: ${{ secrets.CODECOV_TOKEN }}
          flags: autogpt-agent,${{ runner.os }}

      - name: Upload logs to artifact
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: test-logs
          path: classic/original_autogpt/logs/
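GitHub Actions expressions have no ternary operator, so the `concurrency.group` in the workflow above uses the `cond && if_true || if_false` idiom. A rough Python equivalent of that expression (function name and arguments are illustrative; note the idiom misbehaves in Actions when `if_true` is itself a falsy value):

```python
# Python rendering of:
#   format('classic-autogpt-ci-{0}',
#          head_ref && format('{0}-{1}', event_name, pr_number) || sha)
def concurrency_group(head_ref, event_name, pr_number, sha):
    # head_ref is only set for pull_request events
    if head_ref:
        return f"classic-autogpt-ci-{event_name}-{pr_number}"
    return f"classic-autogpt-ci-{sha}"

print(concurrency_group("fix/bug", "pull_request", 123, "deadbeef"))
# classic-autogpt-ci-pull_request-123
print(concurrency_group("", "push", None, "deadbeef"))
# classic-autogpt-ci-deadbeef
```

Grouping PR runs by event name and PR number lets `cancel-in-progress` supersede stale runs of the same PR, while pushes (keyed by commit SHA) never cancel each other.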
@@ -1,60 +0,0 @@
name: Classic - Purge Auto-GPT Docker CI cache

on:
  schedule:
    - cron: 20 4 * * 1,4

env:
  BASE_BRANCH: dev
  IMAGE_NAME: auto-gpt

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        build-type: [release, dev]
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - id: build
        name: Build image
        uses: docker/build-push-action@v6
        with:
          context: classic/
          file: classic/Dockerfile.autogpt
          build-args: BUILD_TYPE=${{ matrix.build-type }}
          load: true # save to docker images
          # use GHA cache as read-only
          cache-to: type=gha,scope=autogpt-docker-${{ matrix.build-type }},mode=max

      - name: Generate build report
        env:
          event_name: ${{ github.event_name }}
          event_ref: ${{ github.event.schedule }}

          build_type: ${{ matrix.build-type }}

          prod_branch: master
          dev_branch: dev
          repository: ${{ github.repository }}
          base_branch: ${{ github.ref_name != 'master' && github.ref_name != 'dev' && 'dev' || 'master' }}

          current_ref: ${{ github.ref_name }}
          commit_hash: ${{ github.sha }}
          source_url: ${{ format('{0}/tree/{1}', github.event.repository.url, github.sha) }}
          push_forced_label:

          new_commits_json: ${{ null }}
          compare_url_template: ${{ format('/{0}/compare/{{base}}...{{head}}', github.repository) }}

          github_context_json: ${{ toJSON(github) }}
          job_env_json: ${{ toJSON(env) }}
          vars_json: ${{ toJSON(vars) }}

        run: .github/workflows/scripts/docker-ci-summary.sh >> $GITHUB_STEP_SUMMARY
        continue-on-error: true
@@ -1,166 +0,0 @@
name: Classic - AutoGPT Docker CI

on:
  push:
    branches: [master, dev]
    paths:
      - '.github/workflows/classic-autogpt-docker-ci.yml'
      - 'classic/original_autogpt/**'
      - 'classic/forge/**'
  pull_request:
    branches: [ master, dev, release-* ]
    paths:
      - '.github/workflows/classic-autogpt-docker-ci.yml'
      - 'classic/original_autogpt/**'
      - 'classic/forge/**'

concurrency:
  group: ${{ format('classic-autogpt-docker-ci-{0}', github.head_ref && format('pr-{0}', github.event.pull_request.number) || github.sha) }}
  cancel-in-progress: ${{ github.event_name == 'pull_request' }}

defaults:
  run:
    working-directory: classic/original_autogpt

env:
  IMAGE_NAME: auto-gpt
  DEPLOY_IMAGE_NAME: ${{ secrets.DOCKER_USER && format('{0}/', secrets.DOCKER_USER) || '' }}auto-gpt
  DEV_IMAGE_TAG: latest-dev

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        build-type: [release, dev]
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - if: runner.debug
        run: |
          ls -al
          du -hs *

      - id: build
        name: Build image
        uses: docker/build-push-action@v6
        with:
          context: classic/
          file: classic/Dockerfile.autogpt
          build-args: BUILD_TYPE=${{ matrix.build-type }}
          tags: ${{ env.IMAGE_NAME }}
          labels: GIT_REVISION=${{ github.sha }}
          load: true # save to docker images
          # cache layers in GitHub Actions cache to speed up builds
          cache-from: type=gha,scope=autogpt-docker-${{ matrix.build-type }}
          cache-to: type=gha,scope=autogpt-docker-${{ matrix.build-type }},mode=max

      - name: Generate build report
        env:
          event_name: ${{ github.event_name }}
          event_ref: ${{ github.event.ref }}
          event_ref_type: ${{ github.event.ref_type }}

          build_type: ${{ matrix.build-type }}

          prod_branch: master
          dev_branch: dev
          repository: ${{ github.repository }}
          base_branch: ${{ github.ref_name != 'master' && github.ref_name != 'dev' && 'dev' || 'master' }}

          current_ref: ${{ github.ref_name }}
          commit_hash: ${{ github.event.after }}
          source_url: ${{ format('{0}/tree/{1}', github.event.repository.url, github.event.release && github.event.release.tag_name || github.sha) }}
          push_forced_label: ${{ github.event.forced && '☢️ forced' || '' }}

          new_commits_json: ${{ toJSON(github.event.commits) }}
          compare_url_template: ${{ format('/{0}/compare/{{base}}...{{head}}', github.repository) }}

          github_context_json: ${{ toJSON(github) }}
          job_env_json: ${{ toJSON(env) }}
          vars_json: ${{ toJSON(vars) }}

        run: .github/workflows/scripts/docker-ci-summary.sh >> $GITHUB_STEP_SUMMARY
        continue-on-error: true

  test:
    runs-on: ubuntu-latest
    timeout-minutes: 10

    services:
      minio:
        image: minio/minio:edge-cicd
        options: >
          --name=minio
          --health-interval=10s --health-timeout=5s --health-retries=3
          --health-cmd="curl -f http://localhost:9000/minio/health/live"

    steps:
      - name: Check out repository
        uses: actions/checkout@v4
        with:
          submodules: true

      - if: github.event_name == 'push'
        name: Log in to Docker hub
        uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKER_USER }}
          password: ${{ secrets.DOCKER_PASSWORD }}

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - id: build
        name: Build image
        uses: docker/build-push-action@v6
        with:
          context: classic/
          file: classic/Dockerfile.autogpt
          build-args: BUILD_TYPE=dev # include pytest
          tags: >
            ${{ env.IMAGE_NAME }},
            ${{ env.DEPLOY_IMAGE_NAME }}:${{ env.DEV_IMAGE_TAG }}
          labels: GIT_REVISION=${{ github.sha }}
          load: true # save to docker images
          # cache layers in GitHub Actions cache to speed up builds
          cache-from: type=gha,scope=autogpt-docker-dev
          cache-to: type=gha,scope=autogpt-docker-dev,mode=max

      - id: test
        name: Run tests
        env:
          CI: true
          PLAIN_OUTPUT: True
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
          S3_ENDPOINT_URL: http://minio:9000
          AWS_ACCESS_KEY_ID: minioadmin
          AWS_SECRET_ACCESS_KEY: minioadmin
        run: |
          set +e
          docker run --env CI --env OPENAI_API_KEY \
            --network container:minio \
            --env S3_ENDPOINT_URL --env AWS_ACCESS_KEY_ID --env AWS_SECRET_ACCESS_KEY \
            --entrypoint poetry ${{ env.IMAGE_NAME }} run \
            pytest -v --cov=autogpt --cov-branch --cov-report term-missing \
            --numprocesses=4 --durations=10 \
            tests/unit tests/integration 2>&1 | tee test_output.txt

          test_failure=${PIPESTATUS[0]}

          cat << $EOF >> $GITHUB_STEP_SUMMARY
          # Tests $([ $test_failure = 0 ] && echo '✅' || echo '❌')
          \`\`\`
          $(cat test_output.txt)
          \`\`\`
          $EOF

          exit $test_failure

      - if: github.event_name == 'push' && github.ref_name == 'master'
        name: Push image to Docker Hub
        run: docker push ${{ env.DEPLOY_IMAGE_NAME }}:${{ env.DEV_IMAGE_TAG }}
@ -1,87 +0,0 @@
name: Classic - AutoGPT Docker Release

on:
  release:
    types: [published, edited]

  workflow_dispatch:
    inputs:
      no_cache:
        type: boolean
        description: 'Build from scratch, without using cached layers'

env:
  IMAGE_NAME: auto-gpt
  DEPLOY_IMAGE_NAME: ${{ secrets.DOCKER_USER }}/auto-gpt

jobs:
  build:
    if: startsWith(github.ref, 'refs/tags/autogpt-')
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Log in to Docker hub
        uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKER_USER }}
          password: ${{ secrets.DOCKER_PASSWORD }}

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      # slashes are not allowed in image tags, but can appear in git branch or tag names
      - id: sanitize_tag
        name: Sanitize image tag
        run: |
          tag=${raw_tag//\//-}
          echo tag=${tag#autogpt-} >> $GITHUB_OUTPUT
        env:
          raw_tag: ${{ github.ref_name }}

      - id: build
        name: Build image
        uses: docker/build-push-action@v6
        with:
          context: classic/
          file: Dockerfile.autogpt
          build-args: BUILD_TYPE=release
          load: true # save to docker images
          # push: true # TODO: uncomment when this issue is fixed: https://github.com/moby/buildkit/issues/1555
          tags: >
            ${{ env.IMAGE_NAME }},
            ${{ env.DEPLOY_IMAGE_NAME }}:latest,
            ${{ env.DEPLOY_IMAGE_NAME }}:${{ steps.sanitize_tag.outputs.tag }}
          labels: GIT_REVISION=${{ github.sha }}

          # cache layers in GitHub Actions cache to speed up builds
          cache-from: ${{ !inputs.no_cache && 'type=gha' || '' }},scope=autogpt-docker-release
          cache-to: type=gha,scope=autogpt-docker-release,mode=max

      - name: Push image to Docker Hub
        run: docker push --all-tags ${{ env.DEPLOY_IMAGE_NAME }}

      - name: Generate build report
        env:
          event_name: ${{ github.event_name }}
          event_ref: ${{ github.event.ref }}
          event_ref_type: ${{ github.event.ref_type }}
          inputs_no_cache: ${{ inputs.no_cache }}

          prod_branch: master
          dev_branch: dev
          repository: ${{ github.repository }}
          base_branch: ${{ github.ref_name != 'master' && github.ref_name != 'dev' && 'dev' || 'master' }}

          ref_type: ${{ github.ref_type }}
          current_ref: ${{ github.ref_name }}
          commit_hash: ${{ github.sha }}
          source_url: ${{ format('{0}/tree/{1}', github.event.repository.url, github.event.release && github.event.release.tag_name || github.sha) }}

          github_context_json: ${{ toJSON(github) }}
          job_env_json: ${{ toJSON(env) }}
          vars_json: ${{ toJSON(vars) }}

        run: .github/workflows/scripts/docker-release-summary.sh >> $GITHUB_STEP_SUMMARY
        continue-on-error: true
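The `sanitize_tag` step relies on two bash parameter expansions: `${raw_tag//\//-}` replaces every slash (invalid in Docker image tags) with a dash, and `${tag#autogpt-}` strips the release-tag prefix. A standalone sketch with a made-up tag value standing in for `${{ github.ref_name }}`:

```shell
# Made-up tag name containing a slash, standing in for github.ref_name.
raw_tag="autogpt-release/v0.5.0"

tag=${raw_tag//\//-}   # replace every "/" with "-": autogpt-release-v0.5.0
tag=${tag#autogpt-}    # strip the "autogpt-" prefix: release-v0.5.0

echo "tag=$tag"
```

The resulting value is what the build step later reads back as `steps.sanitize_tag.outputs.tag`.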
@ -1,76 +0,0 @@
name: Classic - Agent smoke tests

on:
  workflow_dispatch:
  schedule:
    - cron: '0 8 * * *'
  push:
    branches: [ master, dev, ci-test* ]
    paths:
      - '.github/workflows/classic-autogpts-ci.yml'
      - 'classic/original_autogpt/**'
      - 'classic/forge/**'
      - 'classic/benchmark/**'
      - 'classic/run'
      - 'classic/cli.py'
      - 'classic/setup.py'
      - '!**/*.md'
  pull_request:
    branches: [ master, dev, release-* ]
    paths:
      - '.github/workflows/classic-autogpts-ci.yml'
      - 'classic/original_autogpt/**'
      - 'classic/forge/**'
      - 'classic/benchmark/**'
      - 'classic/run'
      - 'classic/cli.py'
      - 'classic/setup.py'
      - '!**/*.md'

defaults:
  run:
    shell: bash
    working-directory: classic

jobs:
  serve-agent-protocol:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        agent-name: [ original_autogpt ]
      fail-fast: false
    timeout-minutes: 20
    env:
      min-python-version: '3.10'
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          submodules: true

      - name: Set up Python ${{ env.min-python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ env.min-python-version }}

      - name: Install Poetry
        working-directory: ./classic/${{ matrix.agent-name }}/
        run: |
          curl -sSL https://install.python-poetry.org | python -

      - name: Run regression tests
        run: |
          ./run agent start ${{ matrix.agent-name }}
          cd ${{ matrix.agent-name }}
          poetry run agbenchmark --mock --test=BasicRetrieval --test=Battleship --test=WebArenaTask_0
          poetry run agbenchmark --test=WriteFile
        env:
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
          AGENT_NAME: ${{ matrix.agent-name }}
          REQUESTS_CA_BUNDLE: /etc/ssl/certs/ca-certificates.crt
          HELICONE_CACHE_ENABLED: false
          HELICONE_PROPERTY_AGENT: ${{ matrix.agent-name }}
          REPORTS_FOLDER: ${{ format('../../reports/{0}', matrix.agent-name) }}
          TELEMETRY_ENVIRONMENT: autogpt-ci
          TELEMETRY_OPT_IN: ${{ github.ref_name == 'master' }}
@ -1,169 +0,0 @@
name: Classic - AGBenchmark CI

on:
  push:
    branches: [ master, dev, ci-test* ]
    paths:
      - 'classic/benchmark/**'
      - '!classic/benchmark/reports/**'
      - .github/workflows/classic-benchmark-ci.yml
  pull_request:
    branches: [ master, dev, release-* ]
    paths:
      - 'classic/benchmark/**'
      - '!classic/benchmark/reports/**'
      - .github/workflows/classic-benchmark-ci.yml

concurrency:
  group: ${{ format('benchmark-ci-{0}', github.head_ref && format('{0}-{1}', github.event_name, github.event.pull_request.number) || github.sha) }}
  cancel-in-progress: ${{ startsWith(github.event_name, 'pull_request') }}

defaults:
  run:
    shell: bash

env:
  min-python-version: '3.10'

jobs:
  test:
    permissions:
      contents: read
    timeout-minutes: 30
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.10"]
        platform-os: [ubuntu, macos, macos-arm64, windows]
    runs-on: ${{ matrix.platform-os != 'macos-arm64' && format('{0}-latest', matrix.platform-os) || 'macos-14' }}
    defaults:
      run:
        shell: bash
        working-directory: classic/benchmark
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          submodules: true

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

      - name: Set up Python dependency cache
        # On Windows, unpacking cached dependencies takes longer than just installing them
        if: runner.os != 'Windows'
        uses: actions/cache@v4
        with:
          path: ${{ runner.os == 'macOS' && '~/Library/Caches/pypoetry' || '~/.cache/pypoetry' }}
          key: poetry-${{ runner.os }}-${{ hashFiles('classic/benchmark/poetry.lock') }}

      - name: Install Poetry (Unix)
        if: runner.os != 'Windows'
        run: |
          curl -sSL https://install.python-poetry.org | python3 -

          if [ "${{ runner.os }}" = "macOS" ]; then
            PATH="$HOME/.local/bin:$PATH"
            echo "$HOME/.local/bin" >> $GITHUB_PATH
          fi

      - name: Install Poetry (Windows)
        if: runner.os == 'Windows'
        shell: pwsh
        run: |
          (Invoke-WebRequest -Uri https://install.python-poetry.org -UseBasicParsing).Content | python -

          $env:PATH += ";$env:APPDATA\Python\Scripts"
          echo "$env:APPDATA\Python\Scripts" >> $env:GITHUB_PATH

      - name: Install Python dependencies
        run: poetry install

      - name: Run pytest with coverage
        run: |
          poetry run pytest -vv \
            --cov=agbenchmark --cov-branch --cov-report term-missing --cov-report xml \
            --durations=10 \
            tests
        env:
          CI: true
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}

      - name: Upload coverage reports to Codecov
        uses: codecov/codecov-action@v4
        with:
          token: ${{ secrets.CODECOV_TOKEN }}
          flags: agbenchmark,${{ runner.os }}

  self-test-with-agent:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        agent-name: [forge]
      fail-fast: false
    timeout-minutes: 20
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          submodules: true

      - name: Set up Python ${{ env.min-python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ env.min-python-version }}

      - name: Install Poetry
        run: |
          curl -sSL https://install.python-poetry.org | python -

      - name: Run regression tests
        working-directory: classic
        run: |
          ./run agent start ${{ matrix.agent-name }}
          cd ${{ matrix.agent-name }}

          set +e # Ignore non-zero exit codes and continue execution
          echo "Running the following command: poetry run agbenchmark --maintain --mock"
          poetry run agbenchmark --maintain --mock
          EXIT_CODE=$?
          set -e # Stop ignoring non-zero exit codes
          # Check if the exit code was 5, and if so, exit with 0 instead
          if [ $EXIT_CODE -eq 5 ]; then
            echo "regression_tests.json is empty."
          fi

          echo "Running the following command: poetry run agbenchmark --mock"
          poetry run agbenchmark --mock

          echo "Running the following command: poetry run agbenchmark --mock --category=data"
          poetry run agbenchmark --mock --category=data

          echo "Running the following command: poetry run agbenchmark --mock --category=coding"
          poetry run agbenchmark --mock --category=coding

          # echo "Running the following command: poetry run agbenchmark --test=WriteFile"
          # poetry run agbenchmark --test=WriteFile
          cd ../benchmark
          poetry install
          echo "Adding the BUILD_SKILL_TREE environment variable. This will attempt to add new elements in the skill tree. If new elements are added, the CI fails because they should have been pushed"
          export BUILD_SKILL_TREE=true

          # poetry run agbenchmark --mock

          # CHANGED=$(git diff --name-only | grep -E '(agbenchmark/challenges)|(../classic/frontend/assets)') || echo "No diffs"
          # if [ ! -z "$CHANGED" ]; then
          #   echo "There are unstaged changes please run agbenchmark and commit those changes since they are needed."
          #   echo "$CHANGED"
          #   exit 1
          # else
          #   echo "No unstaged changes."
          # fi
        env:
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
          TELEMETRY_ENVIRONMENT: autogpt-benchmark-ci
          TELEMETRY_OPT_IN: ${{ github.ref_name == 'master' }}
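The `--maintain` run above tolerates agbenchmark's exit code 5, which signals that no regression tests were collected. A standalone bash sketch of that pattern, with a stub function standing in for the real `poetry run agbenchmark` call (the explicit reset to 0 is an assumption of this sketch; the workflow simply continues):

```shell
# Stub standing in for `poetry run agbenchmark --maintain --mock`;
# exit code 5 means "no regression tests to run".
run_benchmark() { return 5; }

set +e          # ignore non-zero exit codes and continue execution
run_benchmark
EXIT_CODE=$?
set -e          # stop ignoring non-zero exit codes

if [ $EXIT_CODE -eq 5 ]; then
  echo "regression_tests.json is empty."
  EXIT_CODE=0   # treat "nothing to run" as success
fi
echo "final_status=$EXIT_CODE"
```

Without the `set +e` guard, a `set -e` shell would abort the whole step on the first non-zero status before the exit code could be inspected.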
@ -1,55 +0,0 @@
name: Classic - Publish to PyPI

on:
  workflow_dispatch:

jobs:
  deploy:
    runs-on: ubuntu-latest
    permissions:
      contents: write
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          submodules: true
          fetch-depth: 0

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: 3.8

      - name: Install Poetry
        working-directory: ./classic/benchmark/
        run: |
          curl -sSL https://install.python-poetry.org | python3 -
          echo "$HOME/.poetry/bin" >> $GITHUB_PATH

      - name: Build project for distribution
        working-directory: ./classic/benchmark/
        run: poetry build

      - name: Install dependencies
        working-directory: ./classic/benchmark/
        run: poetry install

      - name: Check Version
        working-directory: ./classic/benchmark/
        id: check-version
        run: |
          echo version=$(poetry version --short) >> $GITHUB_OUTPUT

      - name: Create Release
        uses: ncipollo/release-action@v1
        with:
          artifacts: "classic/benchmark/dist/*"
          token: ${{ secrets.GITHUB_TOKEN }}
          draft: false
          generateReleaseNotes: false
          tag: agbenchmark-v${{ steps.check-version.outputs.version }}
          commit: master

      - name: Build and publish
        working-directory: ./classic/benchmark/
        run: poetry publish -u __token__ -p ${{ secrets.PYPI_API_TOKEN }}
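The Check Version step above uses the standard mechanism for passing a value between steps: append `key=value` to the file named by `GITHUB_OUTPUT`, then read it back as `steps.<id>.outputs.<key>`. A sketch with a stubbed `poetry` and a temp file standing in for the runner-provided output file (the version number is made up):

```shell
# Stub `poetry` so the sketch runs without a Poetry project installed;
# it ignores its arguments and prints a made-up version.
poetry() { echo "0.3.3"; }

# Temp file standing in for the GITHUB_OUTPUT file the runner provides.
GITHUB_OUTPUT=$(mktemp)

echo version=$(poetry version --short) >> $GITHUB_OUTPUT

grep "^version=" "$GITHUB_OUTPUT"
```

The release step then interpolates that value into the tag name, e.g. `agbenchmark-v0.3.3`.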
@ -1,236 +0,0 @@
name: Classic - Forge CI

on:
  push:
    branches: [ master, dev, ci-test* ]
    paths:
      - '.github/workflows/classic-forge-ci.yml'
      - 'classic/forge/**'
      - '!classic/forge/tests/vcr_cassettes'
  pull_request:
    branches: [ master, dev, release-* ]
    paths:
      - '.github/workflows/classic-forge-ci.yml'
      - 'classic/forge/**'
      - '!classic/forge/tests/vcr_cassettes'

concurrency:
  group: ${{ format('forge-ci-{0}', github.head_ref && format('{0}-{1}', github.event_name, github.event.pull_request.number) || github.sha) }}
  cancel-in-progress: ${{ startsWith(github.event_name, 'pull_request') }}

defaults:
  run:
    shell: bash
    working-directory: classic/forge

jobs:
  test:
    permissions:
      contents: read
    timeout-minutes: 30
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.10"]
        platform-os: [ubuntu, macos, macos-arm64, windows]
    runs-on: ${{ matrix.platform-os != 'macos-arm64' && format('{0}-latest', matrix.platform-os) || 'macos-14' }}

    steps:
      # Quite slow on macOS (2~4 minutes to set up Docker)
      # - name: Set up Docker (macOS)
      #   if: runner.os == 'macOS'
      #   uses: crazy-max/ghaction-setup-docker@v3

      - name: Start MinIO service (Linux)
        if: runner.os == 'Linux'
        working-directory: '.'
        run: |
          docker pull minio/minio:edge-cicd
          docker run -d -p 9000:9000 minio/minio:edge-cicd

      - name: Start MinIO service (macOS)
        if: runner.os == 'macOS'
        working-directory: ${{ runner.temp }}
        run: |
          brew install minio/stable/minio
          mkdir data
          minio server ./data &

      # No MinIO on Windows:
      # - Windows doesn't support running Linux Docker containers
      # - It doesn't seem possible to start background processes on Windows. They are
      #   killed after the step returns.
      # See: https://github.com/actions/runner/issues/598#issuecomment-2011890429

      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          submodules: true

      - name: Checkout cassettes
        if: ${{ startsWith(github.event_name, 'pull_request') }}
        env:
          PR_BASE: ${{ github.event.pull_request.base.ref }}
          PR_BRANCH: ${{ github.event.pull_request.head.ref }}
          PR_AUTHOR: ${{ github.event.pull_request.user.login }}
        run: |
          cassette_branch="${PR_AUTHOR}-${PR_BRANCH}"
          cassette_base_branch="${PR_BASE}"
          cd tests/vcr_cassettes

          if ! git ls-remote --exit-code --heads origin $cassette_base_branch ; then
            cassette_base_branch="master"
          fi

          if git ls-remote --exit-code --heads origin $cassette_branch ; then
            git fetch origin $cassette_branch
            git fetch origin $cassette_base_branch

            git checkout $cassette_branch

            # Pick non-conflicting cassette updates from the base branch
            git merge --no-commit --strategy-option=ours origin/$cassette_base_branch
            echo "Using cassettes from mirror branch '$cassette_branch'," \
              "synced to upstream branch '$cassette_base_branch'."
          else
            git checkout -b $cassette_branch
            echo "Branch '$cassette_branch' does not exist in cassette submodule." \
              "Using cassettes from '$cassette_base_branch'."
          fi

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

      - name: Set up Python dependency cache
        # On Windows, unpacking cached dependencies takes longer than just installing them
        if: runner.os != 'Windows'
        uses: actions/cache@v4
        with:
          path: ${{ runner.os == 'macOS' && '~/Library/Caches/pypoetry' || '~/.cache/pypoetry' }}
          key: poetry-${{ runner.os }}-${{ hashFiles('classic/forge/poetry.lock') }}

      - name: Install Poetry (Unix)
        if: runner.os != 'Windows'
        run: |
          curl -sSL https://install.python-poetry.org | python3 -

          if [ "${{ runner.os }}" = "macOS" ]; then
            PATH="$HOME/.local/bin:$PATH"
            echo "$HOME/.local/bin" >> $GITHUB_PATH
          fi

      - name: Install Poetry (Windows)
        if: runner.os == 'Windows'
        shell: pwsh
        run: |
          (Invoke-WebRequest -Uri https://install.python-poetry.org -UseBasicParsing).Content | python -

          $env:PATH += ";$env:APPDATA\Python\Scripts"
          echo "$env:APPDATA\Python\Scripts" >> $env:GITHUB_PATH

      - name: Install Python dependencies
        run: poetry install

      - name: Run pytest with coverage
        run: |
          poetry run pytest -vv \
            --cov=forge --cov-branch --cov-report term-missing --cov-report xml \
            --durations=10 \
            forge
        env:
          CI: true
          PLAIN_OUTPUT: True
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
          S3_ENDPOINT_URL: ${{ runner.os != 'Windows' && 'http://127.0.0.1:9000' || '' }}
          AWS_ACCESS_KEY_ID: minioadmin
          AWS_SECRET_ACCESS_KEY: minioadmin

      - name: Upload coverage reports to Codecov
        uses: codecov/codecov-action@v4
        with:
          token: ${{ secrets.CODECOV_TOKEN }}
          flags: forge,${{ runner.os }}

      - id: setup_git_auth
        name: Set up git token authentication
        # Cassettes may be pushed even when tests fail
        if: success() || failure()
        run: |
          config_key="http.${{ github.server_url }}/.extraheader"
          if [ "${{ runner.os }}" = 'macOS' ]; then
            base64_pat=$(echo -n "pat:${{ secrets.PAT_REVIEW }}" | base64)
          else
            base64_pat=$(echo -n "pat:${{ secrets.PAT_REVIEW }}" | base64 -w0)
          fi

          git config "$config_key" \
            "Authorization: Basic $base64_pat"

          cd tests/vcr_cassettes
          git config "$config_key" \
            "Authorization: Basic $base64_pat"

          echo "config_key=$config_key" >> $GITHUB_OUTPUT

      - id: push_cassettes
        name: Push updated cassettes
        # For pull requests, push updated cassettes even when tests fail
        if: github.event_name == 'push' || (! github.event.pull_request.head.repo.fork && (success() || failure()))
        env:
          PR_BRANCH: ${{ github.event.pull_request.head.ref }}
          PR_AUTHOR: ${{ github.event.pull_request.user.login }}
        run: |
          if [ "${{ startsWith(github.event_name, 'pull_request') }}" = "true" ]; then
            is_pull_request=true
            cassette_branch="${PR_AUTHOR}-${PR_BRANCH}"
          else
            cassette_branch="${{ github.ref_name }}"
          fi

          cd tests/vcr_cassettes
          # Commit & push changes to cassettes if any
          if ! git diff --quiet; then
            git add .
            git commit -m "Auto-update cassettes"
            git push origin HEAD:$cassette_branch
            if [ ! $is_pull_request ]; then
              cd ../..
              git add tests/vcr_cassettes
              git commit -m "Update cassette submodule"
              git push origin HEAD:$cassette_branch
            fi
            echo "updated=true" >> $GITHUB_OUTPUT
          else
            echo "updated=false" >> $GITHUB_OUTPUT
            echo "No cassette changes to commit"
          fi

      - name: Post Set up git token auth
        if: steps.setup_git_auth.outcome == 'success'
        run: |
          git config --unset-all '${{ steps.setup_git_auth.outputs.config_key }}'
          git submodule foreach git config --unset-all '${{ steps.setup_git_auth.outputs.config_key }}'

      - name: Apply "behaviour change" label and comment on PR
        if: ${{ startsWith(github.event_name, 'pull_request') }}
        run: |
          PR_NUMBER="${{ github.event.pull_request.number }}"
          TOKEN="${{ secrets.PAT_REVIEW }}"
          REPO="${{ github.repository }}"

          if [[ "${{ steps.push_cassettes.outputs.updated }}" == "true" ]]; then
            echo "Adding label and comment..."
            echo $TOKEN | gh auth login --with-token
            gh issue edit $PR_NUMBER --add-label "behaviour change"
            gh issue comment $PR_NUMBER --body "You changed AutoGPT's behaviour on ${{ runner.os }}. The cassettes have been updated and will be merged to the submodule when this Pull Request gets merged."
          fi

      - name: Upload logs to artifact
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: test-logs
          path: classic/forge/logs/
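The `setup_git_auth` step encodes `pat:<token>` as the value of a Basic `Authorization` extraheader (the macOS/Linux split exists only because BSD `base64` lacks `-w0` for disabling line wrapping). A portable standalone sketch using a dummy token; `tr -d '\n'` stands in for `-w0` on both platforms:

```shell
# Dummy placeholder, not a real credential.
token="dummy-token"

# Basic-auth material is "user:password"; here the user is the literal "pat".
# Strip the newline that some base64 implementations append.
base64_pat=$(printf '%s' "pat:$token" | base64 | tr -d '\n')

echo "Authorization: Basic $base64_pat"
```

`git config http.<server>/.extraheader` then attaches that header to every request, which is how `actions/checkout` itself authenticates.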
@ -1,60 +0,0 @@
name: Classic - Frontend CI/CD

on:
  push:
    branches:
      - master
      - dev
      - 'ci-test*' # This will match any branch that starts with "ci-test"
    paths:
      - 'classic/frontend/**'
      - '.github/workflows/classic-frontend-ci.yml'
  pull_request:
    paths:
      - 'classic/frontend/**'
      - '.github/workflows/classic-frontend-ci.yml'

jobs:
  build:
    permissions:
      contents: write
      pull-requests: write
    runs-on: ubuntu-latest
    env:
      BUILD_BRANCH: ${{ format('classic-frontend-build/{0}', github.ref_name) }}

    steps:
      - name: Checkout Repo
        uses: actions/checkout@v4

      - name: Setup Flutter
        uses: subosito/flutter-action@v2
        with:
          flutter-version: '3.13.2'

      - name: Build Flutter to Web
        run: |
          cd classic/frontend
          flutter build web --base-href /app/

      # - name: Commit and Push to ${{ env.BUILD_BRANCH }}
      #   if: github.event_name == 'push'
      #   run: |
      #     git config --local user.email "action@github.com"
      #     git config --local user.name "GitHub Action"
      #     git add classic/frontend/build/web
      #     git checkout -B ${{ env.BUILD_BRANCH }}
      #     git commit -m "Update frontend build to ${GITHUB_SHA:0:7}" -a
      #     git push -f origin ${{ env.BUILD_BRANCH }}

      - name: Create PR ${{ env.BUILD_BRANCH }} -> ${{ github.ref_name }}
        if: github.event_name == 'push'
        uses: peter-evans/create-pull-request@v7
        with:
          add-paths: classic/frontend/build/web
          base: ${{ github.ref_name }}
          branch: ${{ env.BUILD_BRANCH }}
          delete-branch: true
          title: "Update frontend build in `${{ github.ref_name }}`"
          body: "This PR updates the frontend build based on commit ${{ github.sha }}."
          commit-message: "Update frontend build based on commit ${{ github.sha }}"
@ -1,151 +0,0 @@
name: Classic - Python checks

on:
  push:
    branches: [ master, dev, ci-test* ]
    paths:
      - '.github/workflows/classic-python-checks-ci.yml'
      - 'classic/original_autogpt/**'
      - 'classic/forge/**'
      - 'classic/benchmark/**'
      - '**.py'
      - '!classic/forge/tests/vcr_cassettes'
  pull_request:
    branches: [ master, dev, release-* ]
    paths:
      - '.github/workflows/classic-python-checks-ci.yml'
      - 'classic/original_autogpt/**'
      - 'classic/forge/**'
      - 'classic/benchmark/**'
      - '**.py'
      - '!classic/forge/tests/vcr_cassettes'

concurrency:
  group: ${{ format('classic-python-checks-ci-{0}', github.head_ref && format('{0}-{1}', github.event_name, github.event.pull_request.number) || github.sha) }}
  cancel-in-progress: ${{ startsWith(github.event_name, 'pull_request') }}

defaults:
  run:
    shell: bash

jobs:
  get-changed-parts:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - id: changes-in
        name: Determine affected subprojects
        uses: dorny/paths-filter@v3
        with:
          filters: |
            original_autogpt:
              - classic/original_autogpt/autogpt/**
              - classic/original_autogpt/tests/**
              - classic/original_autogpt/poetry.lock
            forge:
              - classic/forge/forge/**
              - classic/forge/tests/**
              - classic/forge/poetry.lock
            benchmark:
              - classic/benchmark/agbenchmark/**
              - classic/benchmark/tests/**
              - classic/benchmark/poetry.lock
    outputs:
      changed-parts: ${{ steps.changes-in.outputs.changes }}

  lint:
    needs: get-changed-parts
    runs-on: ubuntu-latest
    env:
      min-python-version: "3.10"

    strategy:
      matrix:
        sub-package: ${{ fromJson(needs.get-changed-parts.outputs.changed-parts) }}
      fail-fast: false

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Set up Python ${{ env.min-python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ env.min-python-version }}

      - name: Set up Python dependency cache
        uses: actions/cache@v4
        with:
          path: ~/.cache/pypoetry
          key: ${{ runner.os }}-poetry-${{ hashFiles(format('{0}/poetry.lock', matrix.sub-package)) }}

      - name: Install Poetry
        run: curl -sSL https://install.python-poetry.org | python3 -

      # Install dependencies

      - name: Install Python dependencies
        run: poetry -C classic/${{ matrix.sub-package }} install

      # Lint

      - name: Lint (isort)
        run: poetry run isort --check .
        working-directory: classic/${{ matrix.sub-package }}

      - name: Lint (Black)
        if: success() || failure()
        run: poetry run black --check .
        working-directory: classic/${{ matrix.sub-package }}

      - name: Lint (Flake8)
        if: success() || failure()
        run: poetry run flake8 .
        working-directory: classic/${{ matrix.sub-package }}

  types:
    needs: get-changed-parts
    runs-on: ubuntu-latest
    env:
      min-python-version: "3.10"

    strategy:
      matrix:
        sub-package: ${{ fromJson(needs.get-changed-parts.outputs.changed-parts) }}
      fail-fast: false

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Set up Python ${{ env.min-python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ env.min-python-version }}

      - name: Set up Python dependency cache
        uses: actions/cache@v4
        with:
          path: ~/.cache/pypoetry
          key: ${{ runner.os }}-poetry-${{ hashFiles(format('{0}/poetry.lock', matrix.sub-package)) }}

      - name: Install Poetry
        run: curl -sSL https://install.python-poetry.org | python3 -

      # Install dependencies

      - name: Install Python dependencies
        run: poetry -C classic/${{ matrix.sub-package }} install

      # Typecheck

      - name: Typecheck
        if: success() || failure()
        run: poetry run pyright
        working-directory: classic/${{ matrix.sub-package }}
@ -1,98 +0,0 @@
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: "CodeQL"

on:
  push:
    branches: [ "master", "release-*", "dev" ]
  pull_request:
    branches: [ "master", "release-*", "dev" ]
  merge_group:
  schedule:
    - cron: '15 4 * * 0'

jobs:
  analyze:
    name: Analyze (${{ matrix.language }})
    # Runner size impacts CodeQL analysis time. To learn more, please see:
    #   - https://gh.io/recommended-hardware-resources-for-running-codeql
    #   - https://gh.io/supported-runners-and-hardware-resources
    #   - https://gh.io/using-larger-runners (GitHub.com only)
    # Consider using larger runners or machines with greater resources for possible analysis time improvements.
    runs-on: ${{ (matrix.language == 'swift' && 'macos-latest') || 'ubuntu-latest' }}
    permissions:
      # required for all workflows
      security-events: write

      # required to fetch internal or private CodeQL packs
      packages: read

      # only required for workflows in private repositories
      actions: read
      contents: read

    strategy:
      fail-fast: false
      matrix:
        include:
          - language: typescript
            build-mode: none
          - language: python
            build-mode: none
        # CodeQL supports the following values for 'language': 'c-cpp', 'csharp', 'go', 'java-kotlin', 'javascript-typescript', 'python', 'ruby', 'swift'
        # Use 'c-cpp' to analyze code written in C, C++ or both
        # Use 'java-kotlin' to analyze code written in Java, Kotlin or both
        # Use 'javascript-typescript' to analyze code written in JavaScript, TypeScript or both
        # To learn more about changing the languages that are analyzed or customizing the build mode for your analysis,
        # see https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/customizing-your-advanced-setup-for-code-scanning.
        # If you are analyzing a compiled language, you can modify the 'build-mode' for that language to customize how
        # your codebase is analyzed, see https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/codeql-code-scanning-for-compiled-languages
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      # Initializes the CodeQL tools for scanning.
      - name: Initialize CodeQL
        uses: github/codeql-action/init@v3
        with:
          languages: ${{ matrix.language }}
          build-mode: ${{ matrix.build-mode }}
          # If you wish to specify custom queries, you can do so here or in a config file.
          # By default, queries listed here will override any specified in a config file.
          # Prefix the list here with "+" to use these queries and those in the config file.
          config: |
            paths-ignore:
              - classic/frontend/build/**

          # For more details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
          # queries: security-extended,security-and-quality

      # If the analyze step fails for one of the languages you are analyzing with
      # "We were unable to automatically build your code", modify the matrix above
      # to set the build mode to "manual" for that language. Then modify this step
      # to build your code.
      # ℹ️ Command-line programs to run using the OS shell.
      # 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
      - if: matrix.build-mode == 'manual'
        shell: bash
        run: |
          echo 'If you are using a "manual" build mode for one or more of the' \
            'languages you are analyzing, replace this with the commands to build' \
            'your code, for example:'
          echo '  make bootstrap'
          echo '  make release'
          exit 1

      - name: Perform CodeQL Analysis
        uses: github/codeql-action/analyze@v3
        with:
          category: "/language:${{matrix.language}}"
@ -1,49 +0,0 @@
name: AutoGPT Platform - Deploy Prod Environment

on:
  release:
    types: [published]

permissions:
  contents: 'read'
  id-token: 'write'

jobs:
  migrate:
    environment: production
    name: Run migrations for AutoGPT Platform
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install Python dependencies
        run: |
          python -m pip install --upgrade pip
          pip install prisma

      - name: Run Backend Migrations
        working-directory: ./autogpt_platform/backend
        run: |
          python -m prisma migrate deploy
        env:
          DATABASE_URL: ${{ secrets.BACKEND_DATABASE_URL }}

  trigger:
    needs: migrate
    runs-on: ubuntu-latest
    steps:
      - name: Trigger deploy workflow
        uses: peter-evans/repository-dispatch@v3
        with:
          token: ${{ secrets.DEPLOY_TOKEN }}
          repository: Significant-Gravitas/AutoGPT_cloud_infrastructure
          event-type: build_deploy_prod
          client-payload: '{"ref": "${{ github.ref }}", "sha": "${{ github.sha }}", "repository": "${{ github.repository }}"}'
@ -1,50 +0,0 @@
name: AutoGPT Platform - Deploy Dev Environment

on:
  push:
    branches: [ dev ]
    paths:
      - 'autogpt_platform/**'

permissions:
  contents: 'read'
  id-token: 'write'

jobs:
  migrate:
    environment: develop
    name: Run migrations for AutoGPT Platform
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install Python dependencies
        run: |
          python -m pip install --upgrade pip
          pip install prisma

      - name: Run Backend Migrations
        working-directory: ./autogpt_platform/backend
        run: |
          python -m prisma migrate deploy
        env:
          DATABASE_URL: ${{ secrets.BACKEND_DATABASE_URL }}

  trigger:
    needs: migrate
    runs-on: ubuntu-latest
    steps:
      - name: Trigger deploy workflow
        uses: peter-evans/repository-dispatch@v3
        with:
          token: ${{ secrets.DEPLOY_TOKEN }}
          repository: Significant-Gravitas/AutoGPT_cloud_infrastructure
          event-type: build_deploy_dev
          client-payload: '{"ref": "${{ github.ref }}", "sha": "${{ github.sha }}", "repository": "${{ github.repository }}"}'
@ -1,147 +0,0 @@
name: AutoGPT Platform - Backend CI

on:
  push:
    branches: [master, dev, ci-test*]
    paths:
      - ".github/workflows/platform-backend-ci.yml"
      - "autogpt_platform/backend/**"
      - "autogpt_platform/autogpt_libs/**"
  pull_request:
    branches: [master, dev, release-*]
    paths:
      - ".github/workflows/platform-backend-ci.yml"
      - "autogpt_platform/backend/**"
      - "autogpt_platform/autogpt_libs/**"
  merge_group:

concurrency:
  group: ${{ format('backend-ci-{0}', github.head_ref && format('{0}-{1}', github.event_name, github.event.pull_request.number) || github.sha) }}
  cancel-in-progress: ${{ startsWith(github.event_name, 'pull_request') }}

defaults:
  run:
    shell: bash
    working-directory: autogpt_platform/backend

jobs:
  test:
    permissions:
      contents: read
    timeout-minutes: 30
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.10"]
    runs-on: ubuntu-latest

    services:
      redis:
        image: bitnami/redis:6.2
        env:
          REDIS_PASSWORD: testpassword
        ports:
          - 6379:6379

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
          submodules: true

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

      - name: Setup Supabase
        uses: supabase/setup-cli@v1
        with:
          version: latest

      - id: get_date
        name: Get date
        run: echo "date=$(date +'%Y-%m-%d')" >> $GITHUB_OUTPUT

      - name: Set up Python dependency cache
        uses: actions/cache@v4
        with:
          path: ~/.cache/pypoetry
          key: poetry-${{ runner.os }}-${{ hashFiles('autogpt_platform/backend/poetry.lock') }}

      - name: Install Poetry (Unix)
        run: |
          curl -sSL https://install.python-poetry.org | python3 -

          if [ "${{ runner.os }}" = "macOS" ]; then
            PATH="$HOME/.local/bin:$PATH"
            echo "$HOME/.local/bin" >> $GITHUB_PATH
          fi

      - name: Check poetry.lock
        run: |
          poetry lock

          if ! git diff --quiet poetry.lock; then
            echo "Error: poetry.lock not up to date."
            echo
            git diff poetry.lock
            exit 1
          fi

      - name: Install Python dependencies
        run: poetry install

      - name: Generate Prisma Client
        run: poetry run prisma generate

      - id: supabase
        name: Start Supabase
        working-directory: .
        run: |
          supabase init
          supabase start --exclude postgres-meta,realtime,storage-api,imgproxy,inbucket,studio,edge-runtime,logflare,vector,supavisor
          supabase status -o env | sed 's/="/=/; s/"$//' >> $GITHUB_OUTPUT
          # outputs:
          # DB_URL, API_URL, GRAPHQL_URL, ANON_KEY, SERVICE_ROLE_KEY, JWT_SECRET

      - name: Run Database Migrations
        run: poetry run prisma migrate dev --name updates
        env:
          DATABASE_URL: ${{ steps.supabase.outputs.DB_URL }}

      - id: lint
        name: Run Linter
        run: poetry run lint

      - name: Run pytest with coverage
        run: |
          if [[ "${{ runner.debug }}" == "1" ]]; then
            poetry run pytest -s -vv -o log_cli=true -o log_cli_level=DEBUG test
          else
            poetry run pytest -s -vv test
          fi
        if: success() || (failure() && steps.lint.outcome == 'failure')
        env:
          LOG_LEVEL: ${{ runner.debug && 'DEBUG' || 'INFO' }}
          DATABASE_URL: ${{ steps.supabase.outputs.DB_URL }}
          SUPABASE_URL: ${{ steps.supabase.outputs.API_URL }}
          SUPABASE_SERVICE_ROLE_KEY: ${{ steps.supabase.outputs.SERVICE_ROLE_KEY }}
          SUPABASE_JWT_SECRET: ${{ steps.supabase.outputs.JWT_SECRET }}
          REDIS_HOST: 'localhost'
          REDIS_PORT: '6379'
          REDIS_PASSWORD: 'testpassword'

    env:
      CI: true
      PLAIN_OUTPUT: True
      RUN_ENV: local
      PORT: 8080
      OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}

    # - name: Upload coverage reports to Codecov
    #   uses: codecov/codecov-action@v4
    #   with:
    #     token: ${{ secrets.CODECOV_TOKEN }}
    #     flags: backend,${{ runner.os }}
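The `concurrency.group` expression above emulates a ternary with GitHub's `&&`/`||` operators: on pull-request-like events (where `github.head_ref` is non-empty) the group is keyed by event name and PR number, otherwise by the commit SHA. A minimal Python sketch of that selection logic (the function name `backend_ci_group` is illustrative, not part of the repo):

```python
def backend_ci_group(head_ref: str, event_name: str, pr_number, sha: str) -> str:
    """Mirror the Actions expression:
    format('backend-ci-{0}', head_ref && format('{0}-{1}', event_name, pr_number) || sha)

    head_ref is empty on push events, so the `&& ... ||` chain falls
    through to the SHA; on PR events it yields "<event>-<pr number>".
    """
    key = f"{event_name}-{pr_number}" if head_ref else sha
    return f"backend-ci-{key}"


if __name__ == "__main__":
    # PR events cancel/queue per PR; pushes get a unique group per commit.
    print(backend_ci_group("feat/x", "pull_request", 123, "abcdef0"))
    print(backend_ci_group("", "push", None, "abcdef0"))
```

This is why `cancel-in-progress` is scoped to `pull_request*` events: PR groups collide on purpose, push groups never do.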
@ -1,101 +0,0 @@
name: AutoGPT Platform - Frontend CI

on:
  push:
    branches: [master, dev]
    paths:
      - ".github/workflows/platform-frontend-ci.yml"
      - "autogpt_platform/frontend/**"
  pull_request:
    paths:
      - ".github/workflows/platform-frontend-ci.yml"
      - "autogpt_platform/frontend/**"
  merge_group:

defaults:
  run:
    shell: bash
    working-directory: autogpt_platform/frontend

jobs:
  lint:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "21"

      - name: Install dependencies
        run: |
          yarn install --frozen-lockfile

      - name: Run lint
        run: |
          yarn lint

  test:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        browser: [chromium, webkit]

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          submodules: recursive

      - name: Set up Node.js
        uses: actions/setup-node@v4
        with:
          node-version: "21"

      - name: Free Disk Space (Ubuntu)
        uses: jlumbroso/free-disk-space@main
        with:
          large-packages: false # slow
          docker-images: false # limited benefit

      - name: Copy default supabase .env
        run: |
          cp ../supabase/docker/.env.example ../.env

      - name: Copy backend .env
        run: |
          cp ../backend/.env.example ../backend/.env

      - name: Run docker compose
        run: |
          docker compose -f ../docker-compose.yml up -d

      - name: Install dependencies
        run: |
          yarn install --frozen-lockfile

      - name: Setup Builder .env
        run: |
          cp .env.example .env

      - name: Install Browser '${{ matrix.browser }}'
        run: yarn playwright install --with-deps ${{ matrix.browser }}

      - name: Run tests
        run: |
          yarn test --project=${{ matrix.browser }}

      - name: Print Docker Compose logs in debug mode
        if: runner.debug
        run: |
          docker compose -f ../docker-compose.yml logs

      - uses: actions/upload-artifact@v4
        if: ${{ !cancelled() }}
        with:
          name: playwright-report-${{ matrix.browser }}
          path: playwright-report/
          retention-days: 30
@ -1,34 +0,0 @@
name: Repo - Close stale issues
on:
  schedule:
    - cron: '30 1 * * *'
  workflow_dispatch:

permissions:
  issues: write

jobs:
  stale:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/stale@v9
        with:
          # operations-per-run: 5000
          stale-issue-message: >
            This issue has automatically been marked as _stale_ because it has not had
            any activity in the last 50 days. You can _unstale_ it by commenting or
            removing the label. Otherwise, this issue will be closed in 10 days.
          stale-pr-message: >
            This pull request has automatically been marked as _stale_ because it has
            not had any activity in the last 50 days. You can _unstale_ it by commenting
            or removing the label.
          close-issue-message: >
            This issue was closed automatically because it has been stale for 10 days
            with no activity.
          days-before-stale: 50
          days-before-close: 10
          # Do not touch meta issues:
          exempt-issue-labels: meta,fridge,project management
          # Do not affect pull requests:
          days-before-pr-stale: -1
          days-before-pr-close: -1
@ -1,21 +0,0 @@
name: Repo - Enforce dev as base branch
on:
  pull_request_target:
    branches: [ master ]
    types: [ opened ]

jobs:
  check_pr_target:
    runs-on: ubuntu-latest
    permissions:
      pull-requests: write
    steps:
      - name: Check if PR is from dev or hotfix
        if: ${{ !(startsWith(github.event.pull_request.head.ref, 'hotfix/') || github.event.pull_request.head.ref == 'dev') }}
        run: |
          gh pr comment ${{ github.event.number }} --repo "$REPO" \
            --body $'This PR targets the `master` branch but does not come from `dev` or a `hotfix/*` branch.\n\nAutomatically setting the base branch to `dev`.'
          gh pr edit ${{ github.event.number }} --base dev --repo "$REPO"
        env:
          GITHUB_TOKEN: ${{ github.token }}
          REPO: ${{ github.repository }}
@ -1,66 +0,0 @@
name: Repo - Pull Request auto-label

on:
  # So that PRs touching the same files as the push are updated
  push:
    branches: [ master, dev, release-* ]
    paths-ignore:
      - 'classic/forge/tests/vcr_cassettes'
      - 'classic/benchmark/reports/**'
  # So that the `dirtyLabel` is removed if conflicts are resolved.
  # We recommend `pull_request_target` so that github secrets are available.
  # In `pull_request` we wouldn't be able to change labels of fork PRs
  pull_request_target:
    types: [ opened, synchronize ]

concurrency:
  group: ${{ format('pr-label-{0}', github.event.pull_request.number || github.sha) }}
  cancel-in-progress: true

jobs:
  conflicts:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: write
    steps:
      - name: Update PRs with conflict labels
        uses: eps1lon/actions-label-merge-conflict@releases/2.x
        with:
          dirtyLabel: "conflicts"
          #removeOnDirtyLabel: "PR: ready to ship"
          repoToken: "${{ secrets.GITHUB_TOKEN }}"
          commentOnDirty: "This pull request has conflicts with the base branch, please resolve those so we can evaluate the pull request."
          commentOnClean: "Conflicts have been resolved! 🎉 A maintainer will review the pull request shortly."

  size:
    if: ${{ github.event_name == 'pull_request_target' }}
    permissions:
      issues: write
      pull-requests: write
    runs-on: ubuntu-latest
    steps:
      - uses: codelytv/pr-size-labeler@v1
        with:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          xs_label: 'size/xs'
          xs_max_size: 2
          s_label: 'size/s'
          s_max_size: 10
          m_label: 'size/m'
          m_max_size: 100
          l_label: 'size/l'
          l_max_size: 500
          xl_label: 'size/xl'
          message_if_xl:

  scope:
    if: ${{ github.event_name == 'pull_request_target' }}
    permissions:
      contents: read
      pull-requests: write
    runs-on: ubuntu-latest
    steps:
      - uses: actions/labeler@v5
        with:
          sync-labels: true
@ -1,20 +0,0 @@
name: Repo - Github Stats

on:
  schedule:
    # Run this once per day, towards the end of the day, to keep the most
    # recent data point meaningful (hours are interpreted in UTC).
    - cron: "0 23 * * *"
  workflow_dispatch: # Allow for running this manually.

jobs:
  j1:
    name: github-repo-stats
    runs-on: ubuntu-latest
    steps:
      - name: run-ghrs
        # Use latest release.
        uses: jgehrcke/github-repo-stats@HEAD
        with:
          ghtoken: ${{ secrets.ghrs_github_api_token }}
@ -1,32 +0,0 @@
name: Repo - PR Status Checker
on:
  pull_request:
    types: [opened, synchronize, reopened]
  merge_group:

jobs:
  status-check:
    name: Check PR Status
    runs-on: ubuntu-latest
    steps:
      # - name: Wait some time for all actions to start
      #   run: sleep 30
      - uses: actions/checkout@v4
      #   with:
      #     fetch-depth: 0
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.10"
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install requests
      - name: Check PR Status
        run: |
          echo "Current directory before running Python script:"
          pwd
          echo "Attempting to run Python script:"
          python .github/workflows/scripts/check_actions_status.py
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
@ -1,116 +0,0 @@
import json
import os
import sys
import time
from typing import Dict, List, Tuple

import requests

CHECK_INTERVAL = 30


def get_environment_variables() -> Tuple[str, str, str, str, str]:
    """Retrieve and return necessary environment variables."""
    try:
        with open(os.environ["GITHUB_EVENT_PATH"]) as f:
            event = json.load(f)

        # Handle both PR and merge group events
        if "pull_request" in event:
            sha = event["pull_request"]["head"]["sha"]
        else:
            sha = os.environ["GITHUB_SHA"]

        return (
            os.environ["GITHUB_API_URL"],
            os.environ["GITHUB_REPOSITORY"],
            sha,
            os.environ["GITHUB_TOKEN"],
            os.environ["GITHUB_RUN_ID"],
        )
    except KeyError as e:
        print(f"Error: Missing required environment variable or event data: {e}")
        sys.exit(1)


def make_api_request(url: str, headers: Dict[str, str]) -> Dict:
    """Make an API request and return the JSON response."""
    try:
        print("Making API request to:", url)
        response = requests.get(url, headers=headers, timeout=10)
        response.raise_for_status()
        return response.json()
    except requests.RequestException as e:
        print(f"Error: API request failed. {e}")
        sys.exit(1)


def process_check_runs(check_runs: List[Dict]) -> Tuple[bool, bool]:
    """Process check runs and return their status."""
    runs_in_progress = False
    all_others_passed = True

    for run in check_runs:
        if str(run["name"]) != "Check PR Status":
            status = run["status"]
            conclusion = run["conclusion"]

            if status == "completed":
                if conclusion not in ["success", "skipped", "neutral"]:
                    all_others_passed = False
                    print(
                        f"Check run {run['name']} (ID: {run['id']}) has conclusion: {conclusion}"
                    )
            else:
                runs_in_progress = True
                print(f"Check run {run['name']} (ID: {run['id']}) is still {status}.")
                all_others_passed = False
        else:
            print(
                f"Skipping check run {run['name']} (ID: {run['id']}) as it is the current run."
            )

    return runs_in_progress, all_others_passed


def main():
    api_url, repo, sha, github_token, current_run_id = get_environment_variables()

    endpoint = f"{api_url}/repos/{repo}/commits/{sha}/check-runs"
    headers = {
        "Accept": "application/vnd.github.v3+json",
    }
    if github_token:
        headers["Authorization"] = f"token {github_token}"

    print(f"Current run ID: {current_run_id}")

    while True:
        data = make_api_request(endpoint, headers)

        check_runs = data["check_runs"]

        print("Processing check runs...")
        print(check_runs)

        runs_in_progress, all_others_passed = process_check_runs(check_runs)

        if not runs_in_progress:
            break

        print(
            "Some check runs are still in progress. "
            f"Waiting {CHECK_INTERVAL} seconds before checking again..."
        )
        time.sleep(CHECK_INTERVAL)

    if all_others_passed:
        print("All other completed check runs have passed. This check passes.")
        sys.exit(0)
    else:
        print("Some check runs have failed or have not completed. This check fails.")
        sys.exit(1)


if __name__ == "__main__":
    main()
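The core of the status checker is the aggregation in `process_check_runs`: skip the current run, treat any non-completed run as "still in progress", and count `success`, `skipped`, and `neutral` conclusions as passing. A minimal, side-effect-free sketch of that same logic (the name `summarize_check_runs` is illustrative; it drops the logging so it can be exercised standalone):

```python
from typing import Dict, List, Tuple


def summarize_check_runs(
    check_runs: List[Dict], current_name: str = "Check PR Status"
) -> Tuple[bool, bool]:
    """Return (any_in_progress, all_others_passed), mirroring the
    aggregation in process_check_runs above.

    - The run named `current_name` is ignored (it is this workflow itself).
    - A run that is not "completed" marks the commit as in progress and
      also prevents the overall pass.
    - Completed runs pass only with conclusion success/skipped/neutral.
    """
    in_progress = False
    all_passed = True
    for run in check_runs:
        if run["name"] == current_name:
            continue  # don't wait on ourselves
        if run["status"] != "completed":
            in_progress = True
            all_passed = False
        elif run["conclusion"] not in ("success", "skipped", "neutral"):
            all_passed = False
    return in_progress, all_passed
```

With hypothetical sample data, `[{"name": "lint", "status": "completed", "conclusion": "failure"}]` yields `(False, False)`: nothing left to wait for, but the gate fails.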
@ -1,98 +0,0 @@
#!/bin/bash
meta=$(docker image inspect "$IMAGE_NAME" | jq '.[0]')
head_compare_url=$(sed "s/{base}/$base_branch/; s/{head}/$current_ref/" <<< $compare_url_template)
ref_compare_url=$(sed "s/{base}/$base_branch/; s/{head}/$commit_hash/" <<< $compare_url_template)

EOF=$(dd if=/dev/urandom bs=15 count=1 status=none | base64)

cat << $EOF
# Docker Build summary 🔨

**Source:** branch \`$current_ref\` -> [$repository@\`${commit_hash:0:7}\`]($source_url)

**Build type:** \`$build_type\`

**Image size:** $((`jq -r .Size <<< $meta` / 10**6))MB

## Image details

**Tags:**
$(jq -r '.RepoTags | map("* `\(.)`") | join("\n")' <<< $meta)

<details>
<summary><h3>Layers</h3></summary>

| Age | Size | Created by instruction |
| --------- | ------ | ---------------------- |
$(docker history --no-trunc --format "{{.CreatedSince}}\t{{.Size}}\t\`{{.CreatedBy}}\`\t{{.Comment}}" $IMAGE_NAME \
| grep 'buildkit.dockerfile' `# filter for layers created in this build process`\
| cut -f-3 `# yeet Comment column`\
| sed 's/ ago//' `# fix Layer age`\
| sed 's/ # buildkit//' `# remove buildkit comment from instructions`\
| sed 's/\$/\\$/g' `# escape variable and shell expansions`\
| sed 's/|/\\|/g' `# escape pipes so they don't interfere with column separators`\
| column -t -s$'\t' -o' | ' `# align columns and add separator`\
| sed 's/^/| /; s/$/ |/' `# add table row start and end pipes`)
</details>

<details>
<summary><h3>ENV</h3></summary>

| Variable | Value |
| -------- | -------- |
$(jq -r \
  '.Config.Env
  | map(
    split("=")
    | "\(.[0]) | `\(.[1] | gsub("\\s+"; " "))`"
  )
  | map("| \(.) |")
  | .[]' <<< $meta
)
</details>

<details>
<summary>Raw metadata</summary>

\`\`\`JSON
$meta
\`\`\`
</details>

## Build details
**Build trigger:** $push_forced_label $event_name \`$event_ref\`

<details>
<summary><code>github</code> context</summary>

\`\`\`JSON
$github_context_json
\`\`\`
</details>

### Source
**HEAD:** [$repository@\`${commit_hash:0:7}\`]($source_url) on branch [$current_ref]($ref_compare_url)

**Diff with previous HEAD:** $head_compare_url

#### New commits
$(jq -r 'map([
  "**Commit [`\(.id[0:7])`](\(.url)) by \(if .author.username then "@"+.author.username else .author.name end):**",
  .message,
  (if .committer.name != .author.name then "\n> <sub>**Committer:** \(.committer.name) <\(.committer.email)></sub>" else "" end),
  "<sub>**Timestamp:** \(.timestamp)</sub>"
] | map("> \(.)\n") | join("")) | join("\n")' <<< $new_commits_json)

### Job environment

#### \`vars\` context:
\`\`\`JSON
$vars_json
\`\`\`

#### \`env\` context:
\`\`\`JSON
$job_env_json
\`\`\`

$EOF
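The image-size line above, `$(( $(jq -r .Size <<< $meta) / 10**6 ))MB`, reports decimal megabytes with integer (truncating) division over the byte count from `docker image inspect`. The same arithmetic as a Python one-liner (the helper name `image_size_mb` is illustrative):

```python
def image_size_mb(size_bytes: int) -> str:
    """Format an image size the same way the summary script does:
    decimal MB (10**6 bytes), truncated by integer division."""
    return f"{size_bytes // 10**6}MB"


if __name__ == "__main__":
    # e.g. a 123,456,789-byte image is reported as "123MB"
    print(image_size_mb(123_456_789))
```

Note this is MB, not MiB, and anything under 10^6 bytes truncates to "0MB".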
@ -1,85 +0,0 @@
#!/bin/bash
meta=$(docker image inspect "$IMAGE_NAME" | jq '.[0]')

EOF=$(dd if=/dev/urandom bs=15 count=1 status=none | base64)

cat << $EOF
# Docker Release Build summary 🚀🔨

**Source:** $ref_type \`$current_ref\` -> [$repository@\`${commit_hash:0:7}\`]($source_url)

**Image size:** $((`jq -r .Size <<< $meta` / 10**6))MB

## Image details

**Tags:**
$(jq -r '.RepoTags | map("* `\(.)`") | join("\n")' <<< $meta)

<details>
<summary><h3>Layers</h3></summary>

| Age | Size | Created by instruction |
| --------- | ------ | ---------------------- |
$(docker history --no-trunc --format "{{.CreatedSince}}\t{{.Size}}\t\`{{.CreatedBy}}\`\t{{.Comment}}" $IMAGE_NAME \
| grep 'buildkit.dockerfile' `# filter for layers created in this build process`\
| cut -f-3 `# yeet Comment column`\
| sed 's/ ago//' `# fix Layer age`\
| sed 's/ # buildkit//' `# remove buildkit comment from instructions`\
| sed 's/\$/\\$/g' `# escape variable and shell expansions`\
| sed 's/|/\\|/g' `# escape pipes so they don't interfere with column separators`\
| column -t -s$'\t' -o' | ' `# align columns and add separator`\
| sed 's/^/| /; s/$/ |/' `# add table row start and end pipes`)
</details>

<details>
<summary><h3>ENV</h3></summary>

| Variable | Value |
| -------- | -------- |
$(jq -r \
  '.Config.Env
  | map(
    split("=")
    | "\(.[0]) | `\(.[1] | gsub("\\s+"; " "))`"
  )
  | map("| \(.) |")
  | .[]' <<< $meta
)
</details>

<details>
<summary>Raw metadata</summary>

\`\`\`JSON
$meta
\`\`\`
</details>

## Build details
**Build trigger:** $event_name \`$current_ref\`

| Parameter | Value |
| -------------- | ------------ |
| \`no_cache\` | \`$inputs_no_cache\` |

<details>
<summary><code>github</code> context</summary>

\`\`\`JSON
$github_context_json
\`\`\`
</details>

### Job environment

#### \`vars\` context:
\`\`\`JSON
$vars_json
\`\`\`

#### \`env\` context:
\`\`\`JSON
$job_env_json
\`\`\`

$EOF
@ -1,178 +0,0 @@
## Original ignores
.github_access_token
classic/original_autogpt/keys.py
classic/original_autogpt/*.json
auto_gpt_workspace/*
*.mpeg
.env
azure.yaml
.vscode
.idea/*
auto-gpt.json
log.txt
log-ingestion.txt
/logs
*.log
*.mp3
mem.sqlite3
venvAutoGPT

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/
site/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
.python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.direnv/
.env
.venv
env/
venv*/
ENV/
env.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/
llama-*
vicuna-*

# mac
.DS_Store

openai/

# news
CURRENT_BULLETIN.md

# AgBenchmark
classic/benchmark/agbenchmark/reports/

# Nodejs
package-lock.json


# Allow for locally private items
# private
pri*
# ignore
ig*
.github_access_token
LICENSE.rtf
autogpt_platform/backend/settings.py
/.auth
/autogpt_platform/frontend/.auth

*.ign.*
.test-contents
@ -1,6 +0,0 @@
[submodule "classic/forge/tests/vcr_cassettes"]
	path = classic/forge/tests/vcr_cassettes
	url = https://github.com/Significant-Gravitas/Auto-GPT-test-cassettes
[submodule "autogpt_platform/supabase"]
	path = autogpt_platform/supabase
	url = https://github.com/supabase/supabase.git
@ -1,6 +0,0 @@
[pr_reviewer]
num_code_suggestions=0

[pr_code_suggestions]
commitable_code_suggestions=false
num_code_suggestions=0
@ -1,258 +0,0 @@
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: check-added-large-files
        args: ["--maxkb=500"]
      - id: fix-byte-order-marker
      - id: check-case-conflict
      - id: check-merge-conflict
      - id: check-symlinks
      - id: debug-statements

  - repo: https://github.com/Yelp/detect-secrets
    rev: v1.5.0
    hooks:
      - id: detect-secrets
        name: Detect secrets
        description: Detects high entropy strings that are likely to be passwords.
        files: ^autogpt_platform/
        stages: [push]

  - repo: local
    # For proper type checking, all dependencies need to be up-to-date.
    # It's also a good idea to check that poetry.lock is consistent with pyproject.toml.
    hooks:
      - id: poetry-install
        name: Check & Install dependencies - AutoGPT Platform - Backend
        alias: poetry-install-platform-backend
        entry: poetry -C autogpt_platform/backend install
        # include autogpt_libs source (since it's a path dependency)
        files: ^autogpt_platform/(backend|autogpt_libs)/poetry\.lock$
        types: [file]
        language: system
        pass_filenames: false

      - id: poetry-install
        name: Check & Install dependencies - AutoGPT Platform - Libs
        alias: poetry-install-platform-libs
        entry: poetry -C autogpt_platform/autogpt_libs install
        files: ^autogpt_platform/autogpt_libs/poetry\.lock$
        types: [file]
        language: system
        pass_filenames: false

      - id: poetry-install
        name: Check & Install dependencies - Classic - AutoGPT
        alias: poetry-install-classic-autogpt
        entry: poetry -C classic/original_autogpt install
        # include forge source (since it's a path dependency)
        files: ^classic/(original_autogpt|forge)/poetry\.lock$
        types: [file]
        language: system
        pass_filenames: false

      - id: poetry-install
        name: Check & Install dependencies - Classic - Forge
        alias: poetry-install-classic-forge
        entry: poetry -C classic/forge install
        files: ^classic/forge/poetry\.lock$
        types: [file]
        language: system
        pass_filenames: false

      - id: poetry-install
        name: Check & Install dependencies - Classic - Benchmark
        alias: poetry-install-classic-benchmark
        entry: poetry -C classic/benchmark install
        files: ^classic/benchmark/poetry\.lock$
        types: [file]
        language: system
        pass_filenames: false

  - repo: local
    # For proper type checking, Prisma client must be up-to-date.
    hooks:
      - id: prisma-generate
        name: Prisma Generate - AutoGPT Platform - Backend
        alias: prisma-generate-platform-backend
        entry: bash -c 'cd autogpt_platform/backend && poetry run prisma generate'
        # include everything that triggers poetry install + the prisma schema
        files: ^autogpt_platform/((backend|autogpt_libs)/poetry\.lock|backend/schema.prisma)$
        types: [file]
        language: system
        pass_filenames: false

  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.7.2
    hooks:
      - id: ruff
        name: Lint (Ruff) - AutoGPT Platform - Backend
        alias: ruff-lint-platform-backend
        files: ^autogpt_platform/backend/
        args: [--fix]

      - id: ruff
        name: Lint (Ruff) - AutoGPT Platform - Libs
        alias: ruff-lint-platform-libs
        files: ^autogpt_platform/autogpt_libs/
        args: [--fix]

      - id: ruff-format
        name: Format (Ruff) - AutoGPT Platform - Libs
        alias: ruff-lint-platform-libs
        files: ^autogpt_platform/autogpt_libs/

  - repo: local
    # isort needs the context of which packages are installed to function, so we
    # can't use a vendored isort pre-commit hook (which runs in its own isolated venv).
    hooks:
      - id: isort
        name: Lint (isort) - AutoGPT Platform - Backend
        alias: isort-platform-backend
        entry: poetry -P autogpt_platform/backend run isort -p backend
        files: ^autogpt_platform/backend/
        types: [file, python]
        language: system

      - id: isort
        name: Lint (isort) - Classic - AutoGPT
        alias: isort-classic-autogpt
        entry: poetry -P classic/original_autogpt run isort -p autogpt
        files: ^classic/original_autogpt/
        types: [file, python]
        language: system

      - id: isort
        name: Lint (isort) - Classic - Forge
        alias: isort-classic-forge
        entry: poetry -P classic/forge run isort -p forge
        files: ^classic/forge/
        types: [file, python]
        language: system

      - id: isort
        name: Lint (isort) - Classic - Benchmark
        alias: isort-classic-benchmark
        entry: poetry -P classic/benchmark run isort -p agbenchmark
|
||||
files: ^classic/benchmark/
|
||||
types: [file, python]
|
||||
language: system
|
||||
|
||||
- repo: https://github.com/psf/black
|
||||
rev: 23.12.1
|
||||
# Black has sensible defaults, doesn't need package context, and ignores
|
||||
# everything in .gitignore, so it works fine without any config or arguments.
|
||||
hooks:
|
||||
- id: black
|
||||
name: Format (Black)
|
||||
|
||||
- repo: https://github.com/PyCQA/flake8
|
||||
rev: 7.0.0
|
||||
# To have flake8 load the config of the individual subprojects, we have to call
|
||||
# them separately.
|
||||
hooks:
|
||||
- id: flake8
|
||||
name: Lint (Flake8) - Classic - AutoGPT
|
||||
alias: flake8-classic-autogpt
|
||||
files: ^classic/original_autogpt/(autogpt|scripts|tests)/
|
||||
args: [--config=classic/original_autogpt/.flake8]
|
||||
|
||||
- id: flake8
|
||||
name: Lint (Flake8) - Classic - Forge
|
||||
alias: flake8-classic-forge
|
||||
files: ^classic/forge/(forge|tests)/
|
||||
args: [--config=classic/forge/.flake8]
|
||||
|
||||
- id: flake8
|
||||
name: Lint (Flake8) - Classic - Benchmark
|
||||
alias: flake8-classic-benchmark
|
||||
files: ^classic/benchmark/(agbenchmark|tests)/((?!reports).)*[/.]
|
||||
args: [--config=classic/benchmark/.flake8]
|
||||
|
||||
- repo: local
|
||||
# To have watertight type checking, we check *all* the files in an affected
|
||||
# project. To trigger on poetry.lock we also reset the file `types` filter.
|
||||
hooks:
|
||||
- id: pyright
|
||||
name: Typecheck - AutoGPT Platform - Backend
|
||||
alias: pyright-platform-backend
|
||||
entry: poetry -C autogpt_platform/backend run pyright
|
||||
# include forge source (since it's a path dependency) but exclude *_test.py files:
|
||||
files: ^autogpt_platform/(backend/((backend|test)/|(\w+\.py|poetry\.lock)$)|autogpt_libs/(autogpt_libs/.*(?<!_test)\.py|poetry\.lock)$)
|
||||
types: [file]
|
||||
language: system
|
||||
pass_filenames: false
|
||||
|
||||
- id: pyright
|
||||
name: Typecheck - AutoGPT Platform - Libs
|
||||
alias: pyright-platform-libs
|
||||
entry: poetry -C autogpt_platform/autogpt_libs run pyright
|
||||
files: ^autogpt_platform/autogpt_libs/(autogpt_libs/|poetry\.lock$)
|
||||
types: [file]
|
||||
language: system
|
||||
pass_filenames: false
|
||||
|
||||
- id: pyright
|
||||
name: Typecheck - Classic - AutoGPT
|
||||
alias: pyright-classic-autogpt
|
||||
entry: poetry -C classic/original_autogpt run pyright
|
||||
# include forge source (since it's a path dependency) but exclude *_test.py files:
|
||||
files: ^(classic/original_autogpt/((autogpt|scripts|tests)/|poetry\.lock$)|classic/forge/(forge/.*(?<!_test)\.py|poetry\.lock)$)
|
||||
types: [file]
|
||||
language: system
|
||||
pass_filenames: false
|
||||
|
||||
- id: pyright
|
||||
name: Typecheck - Classic - Forge
|
||||
alias: pyright-classic-forge
|
||||
entry: poetry -C classic/forge run pyright
|
||||
files: ^classic/forge/(forge/|poetry\.lock$)
|
||||
types: [file]
|
||||
language: system
|
||||
pass_filenames: false
|
||||
|
||||
- id: pyright
|
||||
name: Typecheck - Classic - Benchmark
|
||||
alias: pyright-classic-benchmark
|
||||
entry: poetry -C classic/benchmark run pyright
|
||||
files: ^classic/benchmark/(agbenchmark/|tests/|poetry\.lock$)
|
||||
types: [file]
|
||||
language: system
|
||||
pass_filenames: false
|
||||
|
||||
- repo: local
|
||||
hooks:
|
||||
- id: pytest
|
||||
name: Run tests - AutoGPT Platform - Backend
|
||||
alias: pytest-platform-backend
|
||||
entry: bash -c 'cd autogpt_platform/backend && poetry run pytest'
|
||||
# include autogpt_libs source (since it's a path dependency) but exclude *_test.py files:
|
||||
files: ^autogpt_platform/(backend/((backend|test)/|poetry\.lock$)|autogpt_libs/(autogpt_libs/.*(?<!_test)\.py|poetry\.lock)$)
|
||||
language: system
|
||||
pass_filenames: false
|
||||
|
||||
- id: pytest
|
||||
name: Run tests - Classic - AutoGPT (excl. slow tests)
|
||||
alias: pytest-classic-autogpt
|
||||
entry: bash -c 'cd classic/original_autogpt && poetry run pytest --cov=autogpt -m "not slow" tests/unit tests/integration'
|
||||
# include forge source (since it's a path dependency) but exclude *_test.py files:
|
||||
files: ^(classic/original_autogpt/((autogpt|tests)/|poetry\.lock$)|classic/forge/(forge/.*(?<!_test)\.py|poetry\.lock)$)
|
||||
language: system
|
||||
pass_filenames: false
|
||||
|
||||
- id: pytest
|
||||
name: Run tests - Classic - Forge (excl. slow tests)
|
||||
alias: pytest-classic-forge
|
||||
entry: bash -c 'cd classic/forge && poetry run pytest --cov=forge -m "not slow"'
|
||||
files: ^classic/forge/(forge/|tests/|poetry\.lock$)
|
||||
language: system
|
||||
pass_filenames: false
|
||||
|
||||
- id: pytest
|
||||
name: Run tests - Classic - Benchmark
|
||||
alias: pytest-classic-benchmark
|
||||
entry: bash -c 'cd classic/benchmark && poetry run pytest --cov=benchmark'
|
||||
files: ^classic/benchmark/(agbenchmark/|tests/|poetry\.lock$)
|
||||
language: system
|
||||
pass_filenames: false
|
|
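The `files:` keys in this config are Python regular expressions (pre-commit compiles them with Python's `re` module), and the `(?<!_test)` negative lookbehind is what keeps `*_test.py` modules in path dependencies from triggering the heavier hooks. A minimal sketch checking the platform-backend pytest pattern against a few illustrative file paths (the specific file names are hypothetical, not from the repo):

```python
import re

# The `files:` pattern of the platform-backend pytest hook; pre-commit treats
# these as Python regexes, so `re` reproduces its matching behavior.
pattern = re.compile(
    r"^autogpt_platform/(backend/((backend|test)/|poetry\.lock$)"
    r"|autogpt_libs/(autogpt_libs/.*(?<!_test)\.py|poetry\.lock)$)"
)

# Regular sources and lockfiles match...
assert pattern.match("autogpt_platform/backend/backend/app.py")
assert pattern.match("autogpt_platform/autogpt_libs/poetry.lock")
# ...while test modules in the autogpt_libs path dependency are excluded
# by the negative lookbehind before `\.py`:
assert not pattern.match("autogpt_platform/autogpt_libs/autogpt_libs/utils_test.py")
```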
@ -1,62 +0,0 @@
{
  "folders": [
    {
      "name": "frontend",
      "path": "../autogpt_platform/frontend"
    },
    {
      "name": "backend",
      "path": "../autogpt_platform/backend"
    },
    {
      "name": "market",
      "path": "../autogpt_platform/market"
    },
    {
      "name": "lib",
      "path": "../autogpt_platform/autogpt_libs"
    },
    {
      "name": "infra",
      "path": "../autogpt_platform/infra"
    },
    {
      "name": "docs",
      "path": "../docs"
    },

    {
      "name": "classic - autogpt",
      "path": "../classic/original_autogpt"
    },
    {
      "name": "classic - benchmark",
      "path": "../classic/benchmark"
    },
    {
      "name": "classic - forge",
      "path": "../classic/forge"
    },
    {
      "name": "classic - frontend",
      "path": "../classic/frontend"
    },
    {
      "name": "[root]",
      "path": ".."
    }
  ],
  "settings": {
    "python.analysis.typeCheckingMode": "basic"
  },
  "extensions": {
    "recommendations": [
      "charliermarsh.ruff",
      "dart-code.flutter",
      "ms-python.black-formatter",
      "ms-python.vscode-pylance",
      "prisma.prisma",
      "qwtel.sqlite-viewer"
    ]
  }
}
@ -1,67 +0,0 @@
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Frontend: Server Side",
      "type": "node-terminal",
      "request": "launch",
      "cwd": "${workspaceFolder}/autogpt_platform/frontend",
      "command": "yarn dev"
    },
    {
      "name": "Frontend: Client Side",
      "type": "msedge",
      "request": "launch",
      "url": "http://localhost:3000"
    },
    {
      "name": "Frontend: Full Stack",
      "type": "node-terminal",
      "request": "launch",
      "command": "yarn dev",
      "cwd": "${workspaceFolder}/autogpt_platform/frontend",
      "serverReadyAction": {
        "pattern": "- Local:.+(https?://.+)",
        "uriFormat": "%s",
        "action": "debugWithEdge"
      }
    },
    {
      "name": "Backend",
      "type": "debugpy",
      "request": "launch",
      "module": "backend.app",
      // "env": {
      //   "ENV": "dev"
      // },
      "envFile": "${workspaceFolder}/backend/.env",
      "justMyCode": false,
      "cwd": "${workspaceFolder}/autogpt_platform/backend"
    },
    {
      "name": "Marketplace",
      "type": "debugpy",
      "request": "launch",
      "module": "autogpt_platform.market.main",
      "env": {
        "ENV": "dev"
      },
      "envFile": "${workspaceFolder}/market/.env",
      "justMyCode": false,
      "cwd": "${workspaceFolder}/market"
    }
  ],
  "compounds": [
    {
      "name": "Everything",
      "configurations": ["Backend", "Frontend: Full Stack"],
      // "preLaunchTask": "${defaultBuildTask}",
      "stopAll": true,
      "presentation": {
        "hidden": false,
        "order": 0
      }
    }
  ]
}
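In the "Frontend: Full Stack" configuration, `serverReadyAction` watches the dev server's terminal output for its `pattern` and opens the first capture group as the URL to debug in Edge. A quick check of that pattern against a typical `yarn dev` startup line (the log line itself is illustrative, not taken from the repo):

```python
import re

# Pattern from the launch config; capture group 1 becomes the URL that
# VS Code hands to the "debugWithEdge" action via uriFormat "%s".
pattern = re.compile(r"- Local:.+(https?://.+)")

# A typical Next.js dev-server startup line (illustrative):
line = "   - Local:        http://localhost:3000"
match = pattern.search(line)
assert match is not None
assert match.group(1) == "http://localhost:3000"
```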
21 CITATION.cff
@ -1,21 +0,0 @@
# This CITATION.cff file was generated with cffinit.
# Visit https://bit.ly/cffinit to generate yours today!

cff-version: 1.2.0
title: AutoGPT
message: >-
  If you use this software, please cite it using the
  metadata from this file.
type: software
authors:
  - name: Significant Gravitas
    website: 'https://agpt.co'
repository-code: 'https://github.com/Significant-Gravitas/AutoGPT'
url: 'https://agpt.co'
abstract: >-
  A collection of tools and experimental open-source attempts to make GPT-4 fully
  autonomous.
keywords:
  - AI
  - Agent
license: MIT
@ -1,40 +0,0 @@
# Code of Conduct for AutoGPT

## 1. Purpose

The purpose of this Code of Conduct is to provide guidelines for contributors to the AutoGPT projects on GitHub. We aim to create a positive and inclusive environment where all participants can contribute and collaborate effectively. By participating in this project, you agree to abide by this Code of Conduct.

## 2. Scope

This Code of Conduct applies to all contributors, maintainers, and users of the AutoGPT project. It extends to all project spaces, including but not limited to issues, pull requests, code reviews, comments, and other forms of communication within the project.

## 3. Our Standards

We encourage the following behavior:

* Being respectful and considerate to others
* Actively seeking diverse perspectives
* Providing constructive feedback and assistance
* Demonstrating empathy and understanding

We discourage the following behavior:

* Harassment or discrimination of any kind
* Disrespectful, offensive, or inappropriate language or content
* Personal attacks or insults
* Unwarranted criticism or negativity

## 4. Reporting and Enforcement

If you witness or experience any violations of this Code of Conduct, please report them to the project maintainers by email or other appropriate means. The maintainers will investigate and take appropriate action, which may include warnings, temporary or permanent bans, or other measures as necessary.

Maintainers are responsible for ensuring compliance with this Code of Conduct and may take action to address any violations.

## 5. Acknowledgements

This Code of Conduct is adapted from the [Contributor Covenant](https://www.contributor-covenant.org/version/2/0/code_of_conduct.html).

## 6. Contact

If you have any questions or concerns, please contact the project maintainers on Discord:
https://discord.gg/autogpt
@ -1,41 +0,0 @@
# AutoGPT Contribution Guide
If you are reading this, you are probably looking for the full **[contribution guide]**,
which is part of our [wiki].

Also check out our [🚀 Roadmap][roadmap] for information about our priorities and associated tasks.
<!-- You can find our immediate priorities and their progress on our public [kanban board]. -->

[contribution guide]: https://github.com/Significant-Gravitas/AutoGPT/wiki/Contributing
[wiki]: https://github.com/Significant-Gravitas/AutoGPT/wiki
[roadmap]: https://github.com/Significant-Gravitas/AutoGPT/discussions/6971
[kanban board]: https://github.com/orgs/Significant-Gravitas/projects/1

## Contributing to the AutoGPT Platform Folder
All contributions to [the autogpt_platform folder](https://github.com/Significant-Gravitas/AutoGPT/blob/master/autogpt_platform) will be under our [Contribution License Agreement](https://github.com/Significant-Gravitas/AutoGPT/blob/master/autogpt_platform/Contributor%20License%20Agreement%20(CLA).md). By making a pull request contributing to this folder, you agree to the terms of our CLA for your contribution. All contributions to other folders will be under the MIT license.

## In short
1. Avoid duplicate work, issues, PRs etc.
2. We encourage you to collaborate with fellow community members on some of our bigger
   [todo's][roadmap]!
   * We highly recommend posting your idea and discussing it in the [dev channel].
3. Create a draft PR when starting work on bigger changes.
4. Adhere to the [Code Guidelines].
5. Clearly explain your changes when submitting a PR.
6. Don't submit broken code: test/validate your changes.
7. Avoid making unnecessary changes, especially if they're purely based on your personal
   preferences. Doing so is the maintainers' job. ;-)
8. Please also consider contributing something other than code; see the
   [contribution guide] for options.

[dev channel]: https://discord.com/channels/1092243196446249134/1095817829405704305
[code guidelines]: https://github.com/Significant-Gravitas/AutoGPT/wiki/Contributing#code-guidelines

If you wish to get involved with the project (beyond just contributing PRs), please read the
wiki page about [Catalyzing](https://github.com/Significant-Gravitas/AutoGPT/wiki/Catalyzing).

In fact, why not just look through the whole wiki (it's only a few pages) and
hop on our Discord. See you there! :-)

❤️ & 🔆
The team @ AutoGPT
https://discord.gg/autogpt
29 LICENSE
@ -1,29 +0,0 @@
All portions of this repository are under one of two licenses. The majority of the AutoGPT repository is under the MIT License below. The autogpt_platform folder is under the
Polyform Shield License.

MIT License

Copyright (c) 2023 Toran Bruce Richards

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
179 README.md
@ -1,179 +0,0 @@
# AutoGPT: Build, Deploy, and Run AI Agents

[](https://discord.gg/autogpt)
[](https://twitter.com/Auto_GPT)
[](https://opensource.org/licenses/MIT)

**AutoGPT** is a powerful platform that allows you to create, deploy, and manage continuous AI agents that automate complex workflows.

## Hosting Options
- Download to self-host
- [Join the Waitlist](https://bit.ly/3ZDijAI) for the cloud-hosted beta

## How to Setup for Self-Hosting
> [!NOTE]
> Setting up and hosting the AutoGPT Platform yourself is a technical process.
> If you'd rather something that just works, we recommend [joining the waitlist](https://bit.ly/3ZDijAI) for the cloud-hosted beta.

https://github.com/user-attachments/assets/d04273a5-b36a-4a37-818e-f631ce72d603

This tutorial assumes you have Docker, VSCode, git and npm installed.

### 🧱 AutoGPT Frontend

The AutoGPT frontend is where users interact with our powerful AI automation platform. It offers multiple ways to engage with and leverage our AI agents. This is the interface where you'll bring your AI automation ideas to life:

**Agent Builder:** For those who want to customize, our intuitive, low-code interface allows you to design and configure your own AI agents.

**Workflow Management:** Build, modify, and optimize your automation workflows with ease. You build your agent by connecting blocks, where each block performs a single action.

**Deployment Controls:** Manage the lifecycle of your agents, from testing to production.

**Ready-to-Use Agents:** Don't want to build? Simply select from our library of pre-configured agents and put them to work immediately.

**Agent Interaction:** Whether you've built your own or are using pre-configured agents, easily run and interact with them through our user-friendly interface.

**Monitoring and Analytics:** Keep track of your agents' performance and gain insights to continually improve your automation processes.

[Read this guide](https://docs.agpt.co/platform/new_blocks/) to learn how to build your own custom blocks.

### 💽 AutoGPT Server

The AutoGPT Server is the powerhouse of our platform. This is where your agents run. Once deployed, agents can be triggered by external sources and can operate continuously. It contains all the essential components that make AutoGPT run smoothly.

**Source Code:** The core logic that drives our agents and automation processes.

**Infrastructure:** Robust systems that ensure reliable and scalable performance.

**Marketplace:** A comprehensive marketplace where you can find and deploy a wide range of pre-built agents.

### 🐙 Example Agents

Here are two examples of what you can do with AutoGPT:

1. **Generate Viral Videos from Trending Topics**
   - This agent reads topics on Reddit.
   - It identifies trending topics.
   - It then automatically creates a short-form video based on the content.

2. **Identify Top Quotes from Videos for Social Media**
   - This agent subscribes to your YouTube channel.
   - When you post a new video, it transcribes it.
   - It uses AI to identify the most impactful quotes to generate a summary.
   - Then, it writes a post to automatically publish to your social media.

These examples show just a glimpse of what you can achieve with AutoGPT! You can create customized workflows to build agents for any use case.

---
### Mission and Licensing
Our mission is to provide the tools, so that you can focus on what matters:

- 🏗️ **Building** - Lay the foundation for something amazing.
- 🧪 **Testing** - Fine-tune your agent to perfection.
- 🤝 **Delegating** - Let AI work for you, and have your ideas come to life.

Be part of the revolution! **AutoGPT** is here to stay, at the forefront of AI innovation.

**📖 [Documentation](https://docs.agpt.co)** | **🚀 [Contributing](CONTRIBUTING.md)**

**Licensing:**

MIT License: The majority of the AutoGPT repository is under the MIT License.

Polyform Shield License: This license applies to the autogpt_platform folder.

For more information, see https://agpt.co/blog/introducing-the-autogpt-platform

---
## 🤖 AutoGPT Classic
> Below is information about the classic version of AutoGPT.

**🛠️ [Build your own Agent - Quickstart](classic/FORGE-QUICKSTART.md)**

### 🏗️ Forge

**Forge your own agent!** – Forge is a ready-to-go toolkit to build your own agent application. It handles most of the boilerplate code, letting you channel all your creativity into the things that set *your* agent apart. All tutorials are located [here](https://medium.com/@aiedge/autogpt-forge-e3de53cc58ec). Components from [`forge`](/classic/forge/) can also be used individually to speed up development and reduce boilerplate in your agent project.

🚀 [**Getting Started with Forge**](https://github.com/Significant-Gravitas/AutoGPT/blob/master/classic/forge/tutorials/001_getting_started.md) –
This guide will walk you through the process of creating your own agent and using the benchmark and user interface.

📘 [Learn More](https://github.com/Significant-Gravitas/AutoGPT/tree/master/classic/forge) about Forge

### 🎯 Benchmark

**Measure your agent's performance!** The `agbenchmark` can be used with any agent that supports the agent protocol, and the integration with the project's [CLI] makes it even easier to use with AutoGPT and forge-based agents. The benchmark offers a stringent testing environment. Our framework allows for autonomous, objective performance evaluations, ensuring your agents are primed for real-world action.

<!-- TODO: insert visual demonstrating the benchmark -->

📦 [`agbenchmark`](https://pypi.org/project/agbenchmark/) on PyPI | 📘 [Learn More](https://github.com/Significant-Gravitas/AutoGPT/tree/master/classic/benchmark) about the Benchmark

### 💻 UI

**Makes agents easy to use!** The `frontend` gives you a user-friendly interface to control and monitor your agents. It connects to agents through the [agent protocol](#-agent-protocol), ensuring compatibility with many agents from both inside and outside of our ecosystem.

<!-- TODO: insert screenshot of front end -->

The frontend works out-of-the-box with all agents in the repo. Just use the [CLI] to run your agent of choice!

📘 [Learn More](https://github.com/Significant-Gravitas/AutoGPT/tree/master/classic/frontend) about the Frontend

### ⌨️ CLI

[CLI]: #-cli

To make it as easy as possible to use all of the tools offered by the repository, a CLI is included at the root of the repo:

```shell
$ ./run
Usage: cli.py [OPTIONS] COMMAND [ARGS]...

Options:
  --help  Show this message and exit.

Commands:
  agent      Commands to create, start and stop agents
  benchmark  Commands to start the benchmark and list tests and categories
  setup      Installs dependencies needed for your system.
```

Just clone the repo, install dependencies with `./run setup`, and you should be good to go!

## 🤔 Questions? Problems? Suggestions?

### Get help - [Discord 💬](https://discord.gg/autogpt)

[](https://discord.gg/autogpt)

To report a bug or request a feature, create a [GitHub Issue](https://github.com/Significant-Gravitas/AutoGPT/issues/new/choose). Please ensure someone else hasn't created an issue for the same topic.

## 🤝 Sister projects

### 🔄 Agent Protocol

To maintain a uniform standard and ensure seamless compatibility with many current and future applications, AutoGPT employs the [agent protocol](https://agentprotocol.ai/) standard by the AI Engineer Foundation. This standardizes the communication pathways from your agent to the frontend and benchmark.

---

## Stars stats

<p align="center">
  <a href="https://star-history.com/#Significant-Gravitas/AutoGPT">
    <picture>
      <source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/svg?repos=Significant-Gravitas/AutoGPT&type=Date&theme=dark" />
      <source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/svg?repos=Significant-Gravitas/AutoGPT&type=Date" />
      <img alt="Star History Chart" src="https://api.star-history.com/svg?repos=Significant-Gravitas/AutoGPT&type=Date" />
    </picture>
  </a>
</p>

## ⚡ Contributors

<a href="https://github.com/Significant-Gravitas/AutoGPT/graphs/contributors" alt="View Contributors">
  <img src="https://contrib.rocks/image?repo=Significant-Gravitas/AutoGPT&max=1000&columns=10" alt="Contributors" />
</a>
47 SECURITY.md
@ -1,47 +0,0 @@
# Security Policy

## Reporting Security Issues

We take the security of our project seriously. If you believe you have found a security vulnerability, please report it to us privately. **Please do not report security vulnerabilities through public GitHub issues, discussions, or pull requests.**

> **Important Note**: Any code within the `classic/` folder is considered legacy, unsupported, and out of scope for security reports. We will not address security vulnerabilities in this deprecated code.

Instead, please report them via:
- [GitHub Security Advisory](https://github.com/Significant-Gravitas/AutoGPT/security/advisories/new)
<!--- [Huntr.dev](https://huntr.com/repos/significant-gravitas/autogpt) - where you may be eligible for a bounty-->

### Reporting Process
1. **Submit Report**: Use one of the above channels to submit your report.
2. **Response Time**: Our team will acknowledge receipt of your report within 14 business days.
3. **Collaboration**: We will collaborate with you to understand and validate the issue.
4. **Resolution**: We will work on a fix and coordinate the release process.

### Disclosure Policy
- Please provide detailed reports with reproducible steps
- Include the version/commit hash where you discovered the vulnerability
- Allow us a 90-day security fix window before any public disclosure
- Share any potential mitigations or workarounds if known

## Supported Versions
Only the following versions are eligible for security updates:

| Version | Supported |
|---------|-----------|
| Latest release on master branch | ✅ |
| Development commits (pre-master) | ✅ |
| Classic folder (deprecated) | ❌ |
| All other versions | ❌ |

## Security Best Practices
When using this project:
1. Always use the latest stable version
2. Review security advisories before updating
3. Follow our security documentation and guidelines
4. Keep your dependencies up to date
5. Do not use code from the `classic/` folder as it is deprecated and unsupported

## Past Security Advisories
For a list of past security advisories, please visit our [Security Advisory Page](https://github.com/Significant-Gravitas/AutoGPT/security/advisories) and [Huntr Disclosures Page](https://huntr.com/repos/significant-gravitas/autogpt).

---
Last updated: November 2024
@ -0,0 +1,8 @@
## github-repo-stats for Significant-Gravitas/Auto-GPT

- statistics for repository https://github.com/Significant-Gravitas/Auto-GPT
- managed by GitHub Action: https://github.com/jgehrcke/github-repo-stats
- workflow that created this README: `github-repo-stats`

**Latest report PDF**: [report.pdf](https://github.com/Significant-Gravitas/Auto-GPT/raw/github-repo-stats/Significant-Gravitas/Auto-GPT/latest-report/report.pdf)
@ -0,0 +1,183 @@
time_iso8601,forks_cumulative
2023-03-16 00:00:00+00:00,1
2023-03-17 00:00:00+00:00,2
2023-03-18 00:00:00+00:00,4
2023-03-20 00:00:00+00:00,6
2023-03-28 00:00:00+00:00,8
2023-03-29 00:00:00+00:00,11
2023-03-30 00:00:00+00:00,14
2023-03-31 00:00:00+00:00,18
2023-04-01 00:00:00+00:00,24
2023-04-02 00:00:00+00:00,103
2023-04-03 00:00:00+00:00,373
2023-04-04 00:00:00+00:00,639
2023-04-05 00:00:00+00:00,940
2023-04-06 00:00:00+00:00,1267
2023-04-07 00:00:00+00:00,1610
2023-04-08 00:00:00+00:00,1911
2023-04-09 00:00:00+00:00,2192
2023-04-10 00:00:00+00:00,2544
2023-04-11 00:00:00+00:00,2997
2023-04-12 00:00:00+00:00,4027
2023-04-13 00:00:00+00:00,5875
2023-04-14 00:00:00+00:00,7962
2023-04-15 00:00:00+00:00,9402
2023-04-16 00:00:00+00:00,10594
2023-04-17 00:00:00+00:00,12071
2023-04-18 00:00:00+00:00,13082
2023-04-19 00:00:00+00:00,13834
2023-04-20 00:00:00+00:00,14487
2023-04-21 00:00:00+00:00,15039
2023-04-22 00:00:00+00:00,15504
2023-04-23 00:00:00+00:00,16203
2023-04-24 00:00:00+00:00,17691
2023-04-25 00:00:00+00:00,18780
2023-04-26 00:00:00+00:00,19479
2023-04-27 00:00:00+00:00,20403
2023-04-28 00:00:00+00:00,20967
2023-04-29 00:00:00+00:00,21318
2023-04-30 00:00:00+00:00,21620
2023-05-01 00:00:00+00:00,21925
2023-05-02 00:00:00+00:00,22229
2023-05-03 00:00:00+00:00,22578
2023-05-04 00:00:00+00:00,22948
2023-05-05 00:00:00+00:00,23266
2023-05-06 00:00:00+00:00,23590
2023-05-07 00:00:00+00:00,23840
2023-05-08 00:00:00+00:00,24098
2023-05-09 00:00:00+00:00,24314
2023-05-10 00:00:00+00:00,24527
2023-05-11 00:00:00+00:00,24704
2023-05-12 00:00:00+00:00,24865
2023-05-13 00:00:00+00:00,24994
2023-05-14 00:00:00+00:00,25143
2023-05-15 00:00:00+00:00,25312
2023-05-16 00:00:00+00:00,25460
2023-05-17 00:00:00+00:00,25609
2023-05-18 00:00:00+00:00,25751
2023-05-19 00:00:00+00:00,25883
2023-05-20 00:00:00+00:00,25993
2023-05-21 00:00:00+00:00,26084
2023-05-22 00:00:00+00:00,26268
2023-05-23 00:00:00+00:00,26399
2023-05-24 00:00:00+00:00,26526
2023-05-25 00:00:00+00:00,26664
2023-05-26 00:00:00+00:00,26794
2023-05-27 00:00:00+00:00,26889
2023-05-28 00:00:00+00:00,26970
2023-05-29 00:00:00+00:00,27072
2023-05-30 00:00:00+00:00,27164
2023-05-31 00:00:00+00:00,27258
2023-06-01 00:00:00+00:00,27318
2023-06-02 00:00:00+00:00,27402
2023-06-03 00:00:00+00:00,27483
2023-06-04 00:00:00+00:00,27544
2023-06-05 00:00:00+00:00,27653
2023-06-06 00:00:00+00:00,27743
2023-06-07 00:00:00+00:00,27818
2023-06-08 00:00:00+00:00,27902
2023-06-09 00:00:00+00:00,27966
2023-06-10 00:00:00+00:00,28017
2023-06-11 00:00:00+00:00,28077
2023-06-12 00:00:00+00:00,28149
2023-06-13 00:00:00+00:00,28241
2023-06-14 00:00:00+00:00,28306
2023-06-15 00:00:00+00:00,28370
2023-06-16 00:00:00+00:00,28424
2023-06-17 00:00:00+00:00,28470
2023-06-18 00:00:00+00:00,28519
2023-06-19 00:00:00+00:00,28583
2023-06-20 00:00:00+00:00,28651
2023-06-21 00:00:00+00:00,28718
2023-06-22 00:00:00+00:00,28761
2023-06-23 00:00:00+00:00,28798
2023-06-24 00:00:00+00:00,28839
2023-06-25 00:00:00+00:00,28901
2023-06-26 00:00:00+00:00,28966
2023-06-27 00:00:00+00:00,29011
2023-06-28 00:00:00+00:00,29072
2023-06-29 00:00:00+00:00,29115
2023-06-30 00:00:00+00:00,29156
2023-07-01 00:00:00+00:00,29195
2023-07-02 00:00:00+00:00,29242
2023-07-03 00:00:00+00:00,29297
2023-07-04 00:00:00+00:00,29340
2023-07-05 00:00:00+00:00,29394
2023-07-06 00:00:00+00:00,29460
2023-07-07 00:00:00+00:00,29505
2023-07-08 00:00:00+00:00,29536
|
||||
2023-07-09 00:00:00+00:00,29582
|
||||
2023-07-10 00:00:00+00:00,29635
|
||||
2023-07-11 00:00:00+00:00,29681
|
||||
2023-07-12 00:00:00+00:00,29721
|
||||
2023-07-13 00:00:00+00:00,29756
|
||||
2023-07-14 00:00:00+00:00,29792
|
||||
2023-07-15 00:00:00+00:00,29823
|
||||
2023-07-16 00:00:00+00:00,29857
|
||||
2023-07-17 00:00:00+00:00,29896
|
||||
2023-07-18 00:00:00+00:00,29942
|
||||
2023-07-19 00:00:00+00:00,29981
|
||||
2023-07-20 00:00:00+00:00,30015
|
||||
2023-07-21 00:00:00+00:00,30055
|
||||
2023-07-22 00:00:00+00:00,30072
|
||||
2023-07-23 00:00:00+00:00,30092
|
||||
2023-07-24 00:00:00+00:00,30124
|
||||
2023-07-25 00:00:00+00:00,30152
|
||||
2023-07-26 00:00:00+00:00,30194
|
||||
2023-07-27 00:00:00+00:00,30218
|
||||
2023-07-28 00:00:00+00:00,30245
|
||||
2023-07-29 00:00:00+00:00,30270
|
||||
2023-07-30 00:00:00+00:00,30291
|
||||
2023-07-31 00:00:00+00:00,30323
|
||||
2023-08-01 00:00:00+00:00,30360
|
||||
2023-08-02 00:00:00+00:00,30385
|
||||
2023-08-03 00:00:00+00:00,30425
|
||||
2023-08-04 00:00:00+00:00,30452
|
||||
2023-08-05 00:00:00+00:00,30473
|
||||
2023-08-06 00:00:00+00:00,30497
|
||||
2023-08-07 00:00:00+00:00,30518
|
||||
2023-08-08 00:00:00+00:00,30548
|
||||
2023-08-09 00:00:00+00:00,30571
|
||||
2023-08-10 00:00:00+00:00,30590
|
||||
2023-08-11 00:00:00+00:00,30618
|
||||
2023-08-12 00:00:00+00:00,30633
|
||||
2023-08-13 00:00:00+00:00,30651
|
||||
2023-08-14 00:00:00+00:00,30666
|
||||
2023-08-15 00:00:00+00:00,30694
|
||||
2023-08-16 00:00:00+00:00,30719
|
||||
2023-08-17 00:00:00+00:00,30756
|
||||
2023-08-18 00:00:00+00:00,30773
|
||||
2023-08-19 00:00:00+00:00,30785
|
||||
2023-08-20 00:00:00+00:00,30806
|
||||
2023-08-21 00:00:00+00:00,30826
|
||||
2023-08-22 00:00:00+00:00,30845
|
||||
2023-08-23 00:00:00+00:00,30869
|
||||
2023-08-24 00:00:00+00:00,30882
|
||||
2023-08-25 00:00:00+00:00,30899
|
||||
2023-08-26 00:00:00+00:00,30921
|
||||
2023-08-27 00:00:00+00:00,30933
|
||||
2023-08-28 00:00:00+00:00,30965
|
||||
2023-08-29 00:00:00+00:00,30988
|
||||
2023-08-30 00:00:00+00:00,31008
|
||||
2023-08-31 00:00:00+00:00,31028
|
||||
2023-09-01 00:00:00+00:00,31049
|
||||
2023-09-02 00:00:00+00:00,31068
|
||||
2023-09-03 00:00:00+00:00,31080
|
||||
2023-09-04 00:00:00+00:00,31097
|
||||
2023-09-05 00:00:00+00:00,31113
|
||||
2023-09-06 00:00:00+00:00,31133
|
||||
2023-09-07 00:00:00+00:00,31152
|
||||
2023-09-08 00:00:00+00:00,31171
|
||||
2023-09-09 00:00:00+00:00,31190
|
||||
2023-09-10 00:00:00+00:00,31208
|
||||
2023-09-11 00:00:00+00:00,31224
|
||||
2023-09-12 00:00:00+00:00,31251
|
||||
2023-09-13 00:00:00+00:00,31276
|
||||
2023-09-14 00:00:00+00:00,31298
|
||||
2023-09-15 00:00:00+00:00,31318
|
||||
2023-09-16 00:00:00+00:00,31336
|
||||
2023-09-17 00:00:00+00:00,31355
|
||||
2023-09-18 00:00:00+00:00,31393
|
||||
2023-09-19 00:00:00+00:00,31440
|
||||
2023-09-20 00:00:00+00:00,31486
|
||||
2023-09-21 00:00:00+00:00,31527
|
|
|
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/Auto-GPT,59136,30954
/Significant-Gravitas/Auto-GPT/blob/master/autogpts/autogpt/README.md,15015,4373
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt,8305,2095
/Significant-Gravitas/Auto-GPT/releases/tag/v0.4.7,6014,3947
/Significant-Gravitas/Auto-GPT/tree/master/frontend,5086,1541
/Significant-Gravitas/Auto-GPT/tree/master/autogpts,4330,1261
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt/autogpt,3598,795
/Significant-Gravitas/Auto-GPT/releases,2578,1276
/Significant-Gravitas/Auto-GPT/tree/master/docs,2418,829
/Significant-Gravitas/Auto-GPT/tree/master/benchmark,2296,611
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,70162,22628
github.com,17692,3982
docs.agpt.co,7088,1302
Bing,5819,1652
youtube.com,4080,1117
news.agpt.co,3615,1151
DuckDuckGo,1740,566
lablab.ai,1556,520
link.zhihu.com,955,359
lilianweng.github.io,906,297
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/Auto-GPT,59189,30795
/Significant-Gravitas/Auto-GPT/blob/master/autogpts/autogpt/README.md,17070,4847
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt,8254,2101
/Significant-Gravitas/Auto-GPT/tree/master/frontend,5631,1704
/Significant-Gravitas/Auto-GPT/releases/tag/v0.4.7,5326,3444
/Significant-Gravitas/Auto-GPT/tree/master/autogpts,5083,1474
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt/autogpt,3537,777
/Significant-Gravitas/Auto-GPT/releases,2479,1304
/Significant-Gravitas/Auto-GPT/tree/master/docs,2447,835
/Significant-Gravitas/Auto-GPT/blob/master/README.md,2400,729
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,71514,22667
github.com,17881,3996
docs.agpt.co,7098,1282
Bing,5748,1636
youtube.com,4305,1118
news.agpt.co,3660,1173
DuckDuckGo,1805,574
lablab.ai,1545,514
link.zhihu.com,1083,358
lilianweng.github.io,864,281
@@ -0,0 +1,28 @@
time_iso8601,stars_cumulative
2023-03-16 00:00:00+00:00,2
2023-03-17 00:00:00+00:00,10
2023-03-18 00:00:00+00:00,12
2023-03-19 00:00:00+00:00,16
2023-03-20 00:00:00+00:00,18
2023-03-21 00:00:00+00:00,20
2023-03-22 00:00:00+00:00,23
2023-03-25 00:00:00+00:00,25
2023-03-26 00:00:00+00:00,26
2023-03-27 00:00:00+00:00,29
2023-03-28 00:00:00+00:00,45
2023-03-29 00:00:00+00:00,68
2023-03-30 00:00:00+00:00,96
2023-03-31 00:00:00+00:00,105
2023-04-01 00:00:00+00:00,191
2023-04-02 00:00:00+00:00,1651
2023-04-03 00:00:00+00:00,4772
2023-04-04 00:00:00+00:00,7452
2023-04-05 00:00:00+00:00,9841
2023-04-06 00:00:00+00:00,12575
2023-04-07 00:00:00+00:00,15058
2023-04-08 00:00:00+00:00,16774
2023-04-09 00:00:00+00:00,18487
2023-04-10 00:00:00+00:00,21031
2023-04-11 00:00:00+00:00,25076
2023-04-12 00:00:00+00:00,33027
2023-04-13 00:00:00+00:00,40000
@@ -0,0 +1,17 @@
time_iso8601,clones_total,clones_unique,views_total,views_unique
2023-09-06 00:00:00+00:00,1140,204,7021,2545
2023-09-07 00:00:00+00:00,1048,349,14118,4805
2023-09-08 00:00:00+00:00,1277,346,13009,4242
2023-09-09 00:00:00+00:00,705,290,8951,3144
2023-09-10 00:00:00+00:00,730,326,9224,3380
2023-09-11 00:00:00+00:00,797,376,13188,4957
2023-09-12 00:00:00+00:00,1367,453,18644,5326
2023-09-13 00:00:00+00:00,966,393,21977,5489
2023-09-14 00:00:00+00:00,1301,392,20522,5157
2023-09-15 00:00:00+00:00,1694,377,17672,4644
2023-09-16 00:00:00+00:00,1372,367,13370,3112
2023-09-17 00:00:00+00:00,1185,358,17007,3549
2023-09-18 00:00:00+00:00,1435,477,22457,5186
2023-09-19 00:00:00+00:00,1099,456,20116,5239
2023-09-20 00:00:00+00:00,1487,463,20147,4816
2023-09-21 00:00:00+00:00,1450,397,17793,4513
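The snapshot files above are plain comma-separated text with an ISO 8601 timestamp column. A minimal sketch of reading and aggregating the clone/view series, using only Python's standard library (the `aggregate` helper and the two sample rows are illustrative, not part of the repository's tooling):

```python
import csv
import io
from datetime import datetime

# Two sample rows in the column layout shown above.
SAMPLE = """time_iso8601,clones_total,clones_unique,views_total,views_unique
2023-09-06 00:00:00+00:00,1140,204,7021,2545
2023-09-07 00:00:00+00:00,1048,349,14118,4805
"""

def aggregate(csv_text):
    """Sum the four counter columns across all rows."""
    totals = {"clones_total": 0, "clones_unique": 0,
              "views_total": 0, "views_unique": 0}
    for row in csv.DictReader(io.StringIO(csv_text)):
        # The timestamps carry an explicit "+00:00" UTC offset,
        # so datetime.fromisoformat parses them directly.
        datetime.fromisoformat(row["time_iso8601"])
        for key in totals:
            totals[key] += int(row[key])
    return totals

print(aggregate(SAMPLE))
```

For a real snapshot file, replace `SAMPLE` with the file's contents (e.g. `open(path).read()`).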
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
Binary file not shown.
File diff suppressed because one or more lines are too long
@@ -0,0 +1,985 @@
.markdown-body .octicon {
  display: inline-block;
  fill: currentColor;
  vertical-align: text-bottom;
}

.markdown-body .anchor {
  float: left;
  line-height: 1;
  margin-left: -20px;
  padding-right: 4px;
}

.markdown-body .anchor:focus {
  outline: none;
}

.markdown-body h1 .octicon-link,
.markdown-body h2 .octicon-link,
.markdown-body h3 .octicon-link,
.markdown-body h4 .octicon-link,
.markdown-body h5 .octicon-link,
.markdown-body h6 .octicon-link {
  color: #1b1f23;
  vertical-align: middle;
  visibility: hidden;
}

.markdown-body h1:hover .anchor,
.markdown-body h2:hover .anchor,
.markdown-body h3:hover .anchor,
.markdown-body h4:hover .anchor,
.markdown-body h5:hover .anchor,
.markdown-body h6:hover .anchor {
  text-decoration: none;
}

.markdown-body h1:hover .anchor .octicon-link,
.markdown-body h2:hover .anchor .octicon-link,
.markdown-body h3:hover .anchor .octicon-link,
.markdown-body h4:hover .anchor .octicon-link,
.markdown-body h5:hover .anchor .octicon-link,
.markdown-body h6:hover .anchor .octicon-link {
  visibility: visible;
}

.markdown-body h1:hover .anchor .octicon-link:before,
.markdown-body h2:hover .anchor .octicon-link:before,
.markdown-body h3:hover .anchor .octicon-link:before,
.markdown-body h4:hover .anchor .octicon-link:before,
.markdown-body h5:hover .anchor .octicon-link:before,
.markdown-body h6:hover .anchor .octicon-link:before {
  width: 16px;
  height: 16px;
  content: ' ';
  display: inline-block;
  background-image: url("data:image/svg+xml,%3Csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' version='1.1' width='16' height='16' aria-hidden='true'%3E%3Cpath fill-rule='evenodd' d='M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z'%3E%3C/path%3E%3C/svg%3E");
}

.markdown-body {
  -ms-text-size-adjust: 100%;
  -webkit-text-size-adjust: 100%;
  line-height: 1.5;
  color: #24292e;
  font-family: -apple-system,BlinkMacSystemFont,Segoe UI,Helvetica,Arial,sans-serif,Apple Color Emoji,Segoe UI Emoji;
  font-size: 16px;
  line-height: 1.5;
  word-wrap: break-word;
}

.markdown-body details {
  display: block;
}

.markdown-body summary {
  display: list-item;
}

.markdown-body a {
  background-color: initial;
}

.markdown-body a:active,
.markdown-body a:hover {
  outline-width: 0;
}

.markdown-body strong {
  font-weight: inherit;
  font-weight: bolder;
}

.markdown-body h1 {
  font-size: 2em;
  margin: .67em 0;
}

.markdown-body img {
  border-style: none;
}

.markdown-body code,
.markdown-body kbd,
.markdown-body pre {
  font-family: monospace,monospace;
  font-size: 1em;
}

.markdown-body hr {
  box-sizing: initial;
  height: 0;
  overflow: visible;
}

.markdown-body input {
  font: inherit;
  margin: 0;
}

.markdown-body input {
  overflow: visible;
}

.markdown-body [type=checkbox] {
  box-sizing: border-box;
  padding: 0;
}

.markdown-body * {
  box-sizing: border-box;
}

.markdown-body input {
  font-family: inherit;
  font-size: inherit;
  line-height: inherit;
}

.markdown-body a {
  color: #0366d6;
  text-decoration: none;
}

.markdown-body a:hover {
  text-decoration: underline;
}

.markdown-body strong {
  font-weight: 600;
}

.markdown-body hr {
  height: 0;
  margin: 15px 0;
  overflow: hidden;
  background: transparent;
  border: 0;
  border-bottom: 1px solid #dfe2e5;
}

.markdown-body hr:after,
.markdown-body hr:before {
  display: table;
  content: "";
}

.markdown-body hr:after {
  clear: both;
}

.markdown-body table {
  border-spacing: 0;
  border-collapse: collapse;
}

.markdown-body td,
.markdown-body th {
  padding: 0;
}

.markdown-body details summary {
  cursor: pointer;
}

.markdown-body kbd {
  display: inline-block;
  padding: 3px 5px;
  font: 11px SFMono-Regular,Consolas,Liberation Mono,Menlo,monospace;
  line-height: 10px;
  color: #444d56;
  vertical-align: middle;
  background-color: #fafbfc;
  border: 1px solid #d1d5da;
  border-radius: 3px;
  box-shadow: inset 0 -1px 0 #d1d5da;
}

.markdown-body h1,
.markdown-body h2,
.markdown-body h3,
.markdown-body h4,
.markdown-body h5,
.markdown-body h6 {
  margin-top: 0;
  margin-bottom: 0;
}

.markdown-body h1 {
  font-size: 32px;
}

.markdown-body h1,
.markdown-body h2 {
  font-weight: 600;
}

.markdown-body h2 {
  font-size: 24px;
}

.markdown-body h3 {
  font-size: 20px;
}

.markdown-body h3,
.markdown-body h4 {
  font-weight: 600;
}

.markdown-body h4 {
  font-size: 16px;
}

.markdown-body h5 {
  font-size: 14px;
}

.markdown-body h5,
.markdown-body h6 {
  font-weight: 600;
}

.markdown-body h6 {
  font-size: 12px;
}

.markdown-body p {
  margin-top: 0;
  margin-bottom: 10px;
}

.markdown-body blockquote {
  margin: 0;
}

.markdown-body ol,
.markdown-body ul {
  padding-left: 0;
  margin-top: 0;
  margin-bottom: 0;
}

.markdown-body ol ol,
.markdown-body ul ol {
  list-style-type: lower-roman;
}

.markdown-body ol ol ol,
.markdown-body ol ul ol,
.markdown-body ul ol ol,
.markdown-body ul ul ol {
  list-style-type: lower-alpha;
}

.markdown-body dd {
  margin-left: 0;
}

.markdown-body code,
.markdown-body pre {
  font-family: SFMono-Regular,Consolas,Liberation Mono,Menlo,monospace;
  font-size: 12px;
}

.markdown-body pre {
  margin-top: 0;
  margin-bottom: 0;
}

.markdown-body input::-webkit-inner-spin-button,
.markdown-body input::-webkit-outer-spin-button {
  margin: 0;
  -webkit-appearance: none;
  appearance: none;
}

.markdown-body :checked+.radio-label {
  position: relative;
  z-index: 1;
  border-color: #0366d6;
}

.markdown-body .border {
  border: 1px solid #e1e4e8!important;
}

.markdown-body .border-0 {
  border: 0!important;
}

.markdown-body .border-bottom {
  border-bottom: 1px solid #e1e4e8!important;
}

.markdown-body .rounded-1 {
  border-radius: 3px!important;
}

.markdown-body .bg-white {
  background-color: #fff!important;
}

.markdown-body .bg-gray-light {
  background-color: #fafbfc!important;
}

.markdown-body .text-gray-light {
  color: #6a737d!important;
}

.markdown-body .mb-0 {
  margin-bottom: 0!important;
}

.markdown-body .my-2 {
  margin-top: 8px!important;
  margin-bottom: 8px!important;
}

.markdown-body .pl-0 {
  padding-left: 0!important;
}

.markdown-body .py-0 {
  padding-top: 0!important;
  padding-bottom: 0!important;
}

.markdown-body .pl-1 {
  padding-left: 4px!important;
}

.markdown-body .pl-2 {
  padding-left: 8px!important;
}

.markdown-body .py-2 {
  padding-top: 8px!important;
  padding-bottom: 8px!important;
}

.markdown-body .pl-3,
.markdown-body .px-3 {
  padding-left: 16px!important;
}

.markdown-body .px-3 {
  padding-right: 16px!important;
}

.markdown-body .pl-4 {
  padding-left: 24px!important;
}

.markdown-body .pl-5 {
  padding-left: 32px!important;
}

.markdown-body .pl-6 {
  padding-left: 40px!important;
}

.markdown-body .f6 {
  font-size: 12px!important;
}

.markdown-body .lh-condensed {
  line-height: 1.25!important;
}

.markdown-body .text-bold {
  font-weight: 600!important;
}

.markdown-body .pl-c {
  color: #6a737d;
}

.markdown-body .pl-c1,
.markdown-body .pl-s .pl-v {
  color: #005cc5;
}

.markdown-body .pl-e,
.markdown-body .pl-en {
  color: #6f42c1;
}

.markdown-body .pl-s .pl-s1,
.markdown-body .pl-smi {
  color: #24292e;
}

.markdown-body .pl-ent {
  color: #22863a;
}

.markdown-body .pl-k {
  color: #d73a49;
}

.markdown-body .pl-pds,
.markdown-body .pl-s,
.markdown-body .pl-s .pl-pse .pl-s1,
.markdown-body .pl-sr,
.markdown-body .pl-sr .pl-cce,
.markdown-body .pl-sr .pl-sra,
.markdown-body .pl-sr .pl-sre {
  color: #032f62;
}

.markdown-body .pl-smw,
.markdown-body .pl-v {
  color: #e36209;
}

.markdown-body .pl-bu {
  color: #b31d28;
}

.markdown-body .pl-ii {
  color: #fafbfc;
  background-color: #b31d28;
}

.markdown-body .pl-c2 {
  color: #fafbfc;
  background-color: #d73a49;
}

.markdown-body .pl-c2:before {
  content: "^M";
}

.markdown-body .pl-sr .pl-cce {
  font-weight: 700;
  color: #22863a;
}

.markdown-body .pl-ml {
  color: #735c0f;
}

.markdown-body .pl-mh,
.markdown-body .pl-mh .pl-en,
.markdown-body .pl-ms {
  font-weight: 700;
  color: #005cc5;
}

.markdown-body .pl-mi {
  font-style: italic;
  color: #24292e;
}

.markdown-body .pl-mb {
  font-weight: 700;
  color: #24292e;
}

.markdown-body .pl-md {
  color: #b31d28;
  background-color: #ffeef0;
}

.markdown-body .pl-mi1 {
  color: #22863a;
  background-color: #f0fff4;
}

.markdown-body .pl-mc {
  color: #e36209;
  background-color: #ffebda;
}

.markdown-body .pl-mi2 {
  color: #f6f8fa;
  background-color: #005cc5;
}

.markdown-body .pl-mdr {
  font-weight: 700;
  color: #6f42c1;
}

.markdown-body .pl-ba {
  color: #586069;
}

.markdown-body .pl-sg {
  color: #959da5;
}

.markdown-body .pl-corl {
  text-decoration: underline;
  color: #032f62;
}

.markdown-body .mb-0 {
  margin-bottom: 0!important;
}

.markdown-body .my-2 {
  margin-bottom: 8px!important;
}

.markdown-body .my-2 {
  margin-top: 8px!important;
}

.markdown-body .pl-0 {
  padding-left: 0!important;
}

.markdown-body .py-0 {
  padding-top: 0!important;
  padding-bottom: 0!important;
}

.markdown-body .pl-1 {
  padding-left: 4px!important;
}

.markdown-body .pl-2 {
  padding-left: 8px!important;
}

.markdown-body .py-2 {
  padding-top: 8px!important;
  padding-bottom: 8px!important;
}

.markdown-body .pl-3 {
  padding-left: 16px!important;
}

.markdown-body .pl-4 {
  padding-left: 24px!important;
}

.markdown-body .pl-5 {
  padding-left: 32px!important;
}

.markdown-body .pl-6 {
  padding-left: 40px!important;
}

.markdown-body .pl-7 {
  padding-left: 48px!important;
}

.markdown-body .pl-8 {
  padding-left: 64px!important;
}

.markdown-body .pl-9 {
  padding-left: 80px!important;
}

.markdown-body .pl-10 {
  padding-left: 96px!important;
}

.markdown-body .pl-11 {
  padding-left: 112px!important;
}

.markdown-body .pl-12 {
  padding-left: 128px!important;
}

.markdown-body hr {
  border-bottom-color: #eee;
}

.markdown-body kbd {
  display: inline-block;
  padding: 3px 5px;
  font: 11px SFMono-Regular,Consolas,Liberation Mono,Menlo,monospace;
  line-height: 10px;
  color: #444d56;
  vertical-align: middle;
  background-color: #fafbfc;
  border: 1px solid #d1d5da;
  border-radius: 3px;
  box-shadow: inset 0 -1px 0 #d1d5da;
}

.markdown-body:after,
.markdown-body:before {
  display: table;
  content: "";
}

.markdown-body:after {
  clear: both;
}

.markdown-body>:first-child {
  margin-top: 0!important;
}

.markdown-body>:last-child {
  margin-bottom: 0!important;
}

.markdown-body a:not([href]) {
  color: inherit;
  text-decoration: none;
}

.markdown-body blockquote,
.markdown-body details,
.markdown-body dl,
.markdown-body ol,
.markdown-body p,
.markdown-body pre,
.markdown-body table,
.markdown-body ul {
  margin-top: 0;
  margin-bottom: 16px;
}

.markdown-body hr {
  height: .25em;
  padding: 0;
  margin: 24px 0;
  background-color: #e1e4e8;
  border: 0;
}

.markdown-body blockquote {
  padding: 0 1em;
  color: #6a737d;
  border-left: .25em solid #dfe2e5;
}

.markdown-body blockquote>:first-child {
  margin-top: 0;
}

.markdown-body blockquote>:last-child {
  margin-bottom: 0;
}

.markdown-body h1,
.markdown-body h2,
.markdown-body h3,
.markdown-body h4,
.markdown-body h5,
.markdown-body h6 {
  margin-top: 24px;
  margin-bottom: 16px;
  font-weight: 600;
  line-height: 1.25;
}

.markdown-body h1 {
  font-size: 2em;
}

.markdown-body h1,
.markdown-body h2 {
  padding-bottom: .3em;
  border-bottom: 1px solid #eaecef;
}

.markdown-body h2 {
  font-size: 1.5em;
}

.markdown-body h3 {
  font-size: 1.25em;
}

.markdown-body h4 {
  font-size: 1em;
}

.markdown-body h5 {
  font-size: .875em;
}

.markdown-body h6 {
  font-size: .85em;
  color: #6a737d;
}

.markdown-body ol,
.markdown-body ul {
  padding-left: 2em;
}

.markdown-body ol ol,
.markdown-body ol ul,
.markdown-body ul ol,
.markdown-body ul ul {
  margin-top: 0;
  margin-bottom: 0;
}

.markdown-body li {
  word-wrap: break-all;
}

.markdown-body li>p {
  margin-top: 16px;
}

.markdown-body li+li {
  margin-top: .25em;
}

.markdown-body dl {
  padding: 0;
}

.markdown-body dl dt {
  padding: 0;
  margin-top: 16px;
  font-size: 1em;
  font-style: italic;
  font-weight: 600;
}

.markdown-body dl dd {
  padding: 0 16px;
  margin-bottom: 16px;
}

.markdown-body table {
  display: block;
  width: 100%;
  overflow: auto;
}

.markdown-body table th {
  font-weight: 600;
}

.markdown-body table td,
.markdown-body table th {
  padding: 6px 13px;
  border: 1px solid #dfe2e5;
}

.markdown-body table tr {
  background-color: #fff;
  border-top: 1px solid #c6cbd1;
}

.markdown-body table tr:nth-child(2n) {
  background-color: #f6f8fa;
}

.markdown-body img {
  max-width: 100%;
  box-sizing: initial;
  background-color: #fff;
}

.markdown-body img[align=right] {
  padding-left: 20px;
}

.markdown-body img[align=left] {
  padding-right: 20px;
}

.markdown-body code {
  padding: .2em .4em;
  margin: 0;
  font-size: 85%;
  background-color: rgba(27,31,35,.05);
  border-radius: 3px;
}

.markdown-body pre {
  word-wrap: normal;
}

.markdown-body pre>code {
  padding: 0;
  margin: 0;
  font-size: 100%;
  word-break: normal;
  white-space: pre;
  background: transparent;
  border: 0;
}

.markdown-body .highlight {
  margin-bottom: 16px;
}

.markdown-body .highlight pre {
  margin-bottom: 0;
  word-break: normal;
}

.markdown-body .highlight pre,
.markdown-body pre {
  padding: 16px;
  overflow: auto;
  font-size: 85%;
  line-height: 1.45;
  background-color: #f6f8fa;
  border-radius: 3px;
}

.markdown-body pre code {
  display: inline;
  max-width: auto;
  padding: 0;
  margin: 0;
  overflow: visible;
  line-height: inherit;
  word-wrap: normal;
  background-color: initial;
  border: 0;
}

.markdown-body .commit-tease-sha {
  display: inline-block;
  font-family: SFMono-Regular,Consolas,Liberation Mono,Menlo,monospace;
  font-size: 90%;
  color: #444d56;
}

.markdown-body .full-commit .btn-outline:not(:disabled):hover {
  color: #005cc5;
  border-color: #005cc5;
}

.markdown-body .blob-wrapper {
  overflow-x: auto;
  overflow-y: hidden;
}

.markdown-body .blob-wrapper-embedded {
  max-height: 240px;
  overflow-y: auto;
}

.markdown-body .blob-num {
  width: 1%;
  min-width: 50px;
  padding-right: 10px;
  padding-left: 10px;
  font-family: SFMono-Regular,Consolas,Liberation Mono,Menlo,monospace;
  font-size: 12px;
  line-height: 20px;
  color: rgba(27,31,35,.3);
  text-align: right;
  white-space: nowrap;
  vertical-align: top;
  cursor: pointer;
  -webkit-user-select: none;
  -moz-user-select: none;
  -ms-user-select: none;
  user-select: none;
}

.markdown-body .blob-num:hover {
  color: rgba(27,31,35,.6);
|
||||
}
|
||||
|
||||
.markdown-body .blob-num:before {
|
||||
content: attr(data-line-number);
|
||||
}
|
||||
|
||||
.markdown-body .blob-code {
|
||||
position: relative;
|
||||
padding-right: 10px;
|
||||
padding-left: 10px;
|
||||
line-height: 20px;
|
||||
vertical-align: top;
|
||||
}
|
||||
|
||||
.markdown-body .blob-code-inner {
|
||||
overflow: visible;
|
||||
font-family: SFMono-Regular,Consolas,Liberation Mono,Menlo,monospace;
|
||||
font-size: 12px;
|
||||
color: #24292e;
|
||||
word-wrap: normal;
|
||||
white-space: pre;
|
||||
}
|
||||
|
||||
.markdown-body .pl-token.active,
|
||||
.markdown-body .pl-token:hover {
|
||||
cursor: pointer;
|
||||
background: #ffea7f;
|
||||
}
|
||||
|
||||
.markdown-body .tab-size[data-tab-size="1"] {
|
||||
-moz-tab-size: 1;
|
||||
tab-size: 1;
|
||||
}
|
||||
|
||||
.markdown-body .tab-size[data-tab-size="2"] {
|
||||
-moz-tab-size: 2;
|
||||
tab-size: 2;
|
||||
}
|
||||
|
||||
.markdown-body .tab-size[data-tab-size="3"] {
|
||||
-moz-tab-size: 3;
|
||||
tab-size: 3;
|
||||
}
|
||||
|
||||
.markdown-body .tab-size[data-tab-size="4"] {
|
||||
-moz-tab-size: 4;
|
||||
tab-size: 4;
|
||||
}
|
||||
|
||||
.markdown-body .tab-size[data-tab-size="5"] {
|
||||
-moz-tab-size: 5;
|
||||
tab-size: 5;
|
||||
}
|
||||
|
||||
.markdown-body .tab-size[data-tab-size="6"] {
|
||||
-moz-tab-size: 6;
|
||||
tab-size: 6;
|
||||
}
|
||||
|
||||
.markdown-body .tab-size[data-tab-size="7"] {
|
||||
-moz-tab-size: 7;
|
||||
tab-size: 7;
|
||||
}
|
||||
|
||||
.markdown-body .tab-size[data-tab-size="8"] {
|
||||
-moz-tab-size: 8;
|
||||
tab-size: 8;
|
||||
}
|
||||
|
||||
.markdown-body .tab-size[data-tab-size="9"] {
|
||||
-moz-tab-size: 9;
|
||||
tab-size: 9;
|
||||
}
|
||||
|
||||
.markdown-body .tab-size[data-tab-size="10"] {
|
||||
-moz-tab-size: 10;
|
||||
tab-size: 10;
|
||||
}
|
||||
|
||||
.markdown-body .tab-size[data-tab-size="11"] {
|
||||
-moz-tab-size: 11;
|
||||
tab-size: 11;
|
||||
}
|
||||
|
||||
.markdown-body .tab-size[data-tab-size="12"] {
|
||||
-moz-tab-size: 12;
|
||||
tab-size: 12;
|
||||
}
|
||||
|
||||
.markdown-body .task-list-item {
|
||||
list-style-type: none;
|
||||
}
|
||||
|
||||
.markdown-body .task-list-item+.task-list-item {
|
||||
margin-top: 3px;
|
||||
}
|
||||
|
||||
.markdown-body .task-list-item input {
|
||||
margin: 0 .2em .25em -1.6em;
|
||||
vertical-align: middle;
|
||||
}
|
|
@@ -0,0 +1,8 @@

## github-repo-stats for Significant-Gravitas/AutoGPT

- statistics for repository https://github.com/Significant-Gravitas/AutoGPT
- managed by GitHub Action: https://github.com/jgehrcke/github-repo-stats
- workflow that created this README: `Repo - Github Stats`

**Latest report PDF**: [GitHub-rendered](https://github.com/Significant-Gravitas/AutoGPT/blob/github-repo-stats/Significant-Gravitas/AutoGPT/latest-report/report.pdf), [raw](https://github.com/Significant-Gravitas/AutoGPT/raw/github-repo-stats/Significant-Gravitas/AutoGPT/latest-report/report.pdf)
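The data files added on this branch are plain two-column CSV time series (`time_iso8601,forks_cumulative`). A minimal sketch of how such a file might be consumed with only the standard library; the `load_forks` helper is ours, and the embedded sample rows are copied from the fork-count data below:

```python
import csv
import io

# Three sample rows copied from the forks_cumulative CSV on this branch;
# in practice the full file would be read from disk instead.
SAMPLE = """time_iso8601,forks_cumulative
2023-03-16 00:00:00+00:00,1
2023-04-02 00:00:00+00:00,102
2025-01-11 00:00:00+00:00,45194
"""

def load_forks(text):
    """Parse the CSV text into (timestamp string, cumulative fork count) tuples."""
    reader = csv.DictReader(io.StringIO(text))
    return [(row["time_iso8601"], int(row["forks_cumulative"])) for row in reader]

series = load_forks(SAMPLE)
print(series[-1])  # most recent sample point
```

Because the counts are cumulative, daily deltas (if wanted) would come from differencing consecutive rows.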
@@ -0,0 +1,661 @@

time_iso8601,forks_cumulative
2023-03-16 00:00:00+00:00,1
2023-03-17 00:00:00+00:00,2
2023-03-18 00:00:00+00:00,4
2023-03-20 00:00:00+00:00,6
2023-03-28 00:00:00+00:00,8
2023-03-29 00:00:00+00:00,11
2023-03-30 00:00:00+00:00,14
2023-03-31 00:00:00+00:00,18
2023-04-01 00:00:00+00:00,24
2023-04-02 00:00:00+00:00,102
2023-04-03 00:00:00+00:00,361
2023-04-04 00:00:00+00:00,606
2023-04-05 00:00:00+00:00,881
2023-04-06 00:00:00+00:00,1181
2023-04-07 00:00:00+00:00,1506
2023-04-08 00:00:00+00:00,1795
2023-04-09 00:00:00+00:00,2058
2023-04-10 00:00:00+00:00,2391
2023-04-11 00:00:00+00:00,2822
2023-04-12 00:00:00+00:00,3812
2023-04-13 00:00:00+00:00,5569
2023-04-14 00:00:00+00:00,7564
2023-04-15 00:00:00+00:00,8925
2023-04-16 00:00:00+00:00,10042
2023-04-17 00:00:00+00:00,11446
2023-04-18 00:00:00+00:00,12408
2023-04-19 00:00:00+00:00,13119
2023-04-20 00:00:00+00:00,13742
2023-04-21 00:00:00+00:00,14261
2023-04-22 00:00:00+00:00,14698
2023-04-23 00:00:00+00:00,15344
2023-04-24 00:00:00+00:00,16733
2023-04-25 00:00:00+00:00,17759
2023-04-26 00:00:00+00:00,18411
2023-04-27 00:00:00+00:00,19285
2023-04-28 00:00:00+00:00,19820
2023-04-29 00:00:00+00:00,20143
2023-04-30 00:00:00+00:00,20427
2023-05-01 00:00:00+00:00,20712
2023-05-02 00:00:00+00:00,21001
2023-05-03 00:00:00+00:00,21328
2023-05-04 00:00:00+00:00,21678
2023-05-05 00:00:00+00:00,21982
2023-05-06 00:00:00+00:00,22292
2023-05-07 00:00:00+00:00,22534
2023-05-08 00:00:00+00:00,22776
2023-05-09 00:00:00+00:00,22978
2023-05-10 00:00:00+00:00,23182
2023-05-11 00:00:00+00:00,23347
2023-05-12 00:00:00+00:00,23497
2023-05-13 00:00:00+00:00,23620
2023-05-14 00:00:00+00:00,23758
2023-05-15 00:00:00+00:00,23916
2023-05-16 00:00:00+00:00,24056
2023-05-17 00:00:00+00:00,24191
2023-05-18 00:00:00+00:00,24328
2023-05-19 00:00:00+00:00,24450
2023-05-20 00:00:00+00:00,24553
2023-05-21 00:00:00+00:00,24635
2023-05-22 00:00:00+00:00,24810
2023-05-23 00:00:00+00:00,24931
2023-05-24 00:00:00+00:00,25048
2023-05-25 00:00:00+00:00,25176
2023-05-26 00:00:00+00:00,25296
2023-05-27 00:00:00+00:00,25383
2023-05-28 00:00:00+00:00,25456
2023-05-29 00:00:00+00:00,25544
2023-05-30 00:00:00+00:00,25634
2023-05-31 00:00:00+00:00,25721
2023-06-01 00:00:00+00:00,25778
2023-06-02 00:00:00+00:00,25855
2023-06-03 00:00:00+00:00,25932
2023-06-04 00:00:00+00:00,25989
2023-06-05 00:00:00+00:00,26090
2023-06-06 00:00:00+00:00,26176
2023-06-07 00:00:00+00:00,26247
2023-06-08 00:00:00+00:00,26328
2023-06-09 00:00:00+00:00,26390
2023-06-10 00:00:00+00:00,26438
2023-06-11 00:00:00+00:00,26496
2023-06-12 00:00:00+00:00,26559
2023-06-13 00:00:00+00:00,26647
2023-06-14 00:00:00+00:00,26708
2023-06-15 00:00:00+00:00,26769
2023-06-16 00:00:00+00:00,26816
2023-06-17 00:00:00+00:00,26859
2023-06-18 00:00:00+00:00,26904
2023-06-19 00:00:00+00:00,26963
2023-06-20 00:00:00+00:00,27022
2023-06-21 00:00:00+00:00,27088
2023-06-22 00:00:00+00:00,27128
2023-06-23 00:00:00+00:00,27163
2023-06-24 00:00:00+00:00,27202
2023-06-25 00:00:00+00:00,27261
2023-06-26 00:00:00+00:00,27323
2023-06-27 00:00:00+00:00,27365
2023-06-28 00:00:00+00:00,27425
2023-06-29 00:00:00+00:00,27464
2023-06-30 00:00:00+00:00,27505
2023-07-01 00:00:00+00:00,27542
2023-07-02 00:00:00+00:00,27584
2023-07-03 00:00:00+00:00,27638
2023-07-04 00:00:00+00:00,27677
2023-07-05 00:00:00+00:00,27729
2023-07-06 00:00:00+00:00,27790
2023-07-07 00:00:00+00:00,27834
2023-07-08 00:00:00+00:00,27862
2023-07-09 00:00:00+00:00,27906
2023-07-10 00:00:00+00:00,27955
2023-07-11 00:00:00+00:00,27994
2023-07-12 00:00:00+00:00,28032
2023-07-13 00:00:00+00:00,28065
2023-07-14 00:00:00+00:00,28101
2023-07-15 00:00:00+00:00,28130
2023-07-16 00:00:00+00:00,28164
2023-07-17 00:00:00+00:00,28201
2023-07-18 00:00:00+00:00,28244
2023-07-19 00:00:00+00:00,28281
2023-07-20 00:00:00+00:00,28309
2023-07-21 00:00:00+00:00,28346
2023-07-22 00:00:00+00:00,28362
2023-07-23 00:00:00+00:00,28382
2023-07-24 00:00:00+00:00,28412
2023-07-25 00:00:00+00:00,28438
2023-07-26 00:00:00+00:00,28476
2023-07-27 00:00:00+00:00,28500
2023-07-28 00:00:00+00:00,28525
2023-07-29 00:00:00+00:00,28549
2023-07-30 00:00:00+00:00,28569
2023-07-31 00:00:00+00:00,28600
2023-08-01 00:00:00+00:00,28634
2023-08-02 00:00:00+00:00,28658
2023-08-03 00:00:00+00:00,28696
2023-08-04 00:00:00+00:00,28720
2023-08-05 00:00:00+00:00,28740
2023-08-06 00:00:00+00:00,28759
2023-08-07 00:00:00+00:00,28780
2023-08-08 00:00:00+00:00,28806
2023-08-09 00:00:00+00:00,28826
2023-08-10 00:00:00+00:00,28843
2023-08-11 00:00:00+00:00,28870
2023-08-12 00:00:00+00:00,28885
2023-08-13 00:00:00+00:00,28900
2023-08-14 00:00:00+00:00,28913
2023-08-15 00:00:00+00:00,28936
2023-08-16 00:00:00+00:00,28960
2023-08-17 00:00:00+00:00,28993
2023-08-18 00:00:00+00:00,29008
2023-08-19 00:00:00+00:00,29020
2023-08-20 00:00:00+00:00,29037
2023-08-21 00:00:00+00:00,29054
2023-08-22 00:00:00+00:00,29070
2023-08-23 00:00:00+00:00,29090
2023-08-24 00:00:00+00:00,29103
2023-08-25 00:00:00+00:00,29121
2023-08-26 00:00:00+00:00,29139
2023-08-27 00:00:00+00:00,29150
2023-08-28 00:00:00+00:00,29178
2023-08-29 00:00:00+00:00,29200
2023-08-30 00:00:00+00:00,29221
2023-08-31 00:00:00+00:00,29241
2023-09-01 00:00:00+00:00,29260
2023-09-02 00:00:00+00:00,29278
2023-09-03 00:00:00+00:00,29287
2023-09-04 00:00:00+00:00,29303
2023-09-05 00:00:00+00:00,29318
2023-09-06 00:00:00+00:00,29337
2023-09-07 00:00:00+00:00,29355
2023-09-08 00:00:00+00:00,29373
2023-09-09 00:00:00+00:00,29391
2023-09-10 00:00:00+00:00,29408
2023-09-11 00:00:00+00:00,29422
2023-09-12 00:00:00+00:00,29446
2023-09-13 00:00:00+00:00,29469
2023-09-14 00:00:00+00:00,29489
2023-09-15 00:00:00+00:00,29506
2023-09-16 00:00:00+00:00,29520
2023-09-17 00:00:00+00:00,29539
2023-09-18 00:00:00+00:00,29570
2023-09-19 00:00:00+00:00,29608
2023-09-20 00:00:00+00:00,29649
2023-09-21 00:00:00+00:00,29685
2023-09-22 00:00:00+00:00,29715
2023-09-23 00:00:00+00:00,29752
2023-09-24 00:00:00+00:00,29787
2023-09-25 00:00:00+00:00,29846
2023-09-26 00:00:00+00:00,29923
2023-09-27 00:00:00+00:00,30013
2023-09-28 00:00:00+00:00,30104
2023-09-29 00:00:00+00:00,30201
2023-09-30 00:00:00+00:00,30270
2023-10-01 00:00:00+00:00,30342
2023-10-02 00:00:00+00:00,30419
2023-10-03 00:00:00+00:00,30504
2023-10-04 00:00:00+00:00,30585
2023-10-05 00:00:00+00:00,30660
2023-10-06 00:00:00+00:00,30734
2023-10-07 00:00:00+00:00,30810
2023-10-08 00:00:00+00:00,30885
2023-10-09 00:00:00+00:00,30985
2023-10-10 00:00:00+00:00,31074
2023-10-11 00:00:00+00:00,31172
2023-10-12 00:00:00+00:00,31248
2023-10-13 00:00:00+00:00,31322
2023-10-14 00:00:00+00:00,31386
2023-10-15 00:00:00+00:00,31439
2023-10-16 00:00:00+00:00,31525
2023-10-17 00:00:00+00:00,31580
2023-10-18 00:00:00+00:00,31664
2023-10-19 00:00:00+00:00,31757
2023-10-20 00:00:00+00:00,31840
2023-10-21 00:00:00+00:00,31898
2023-10-22 00:00:00+00:00,31959
2023-10-23 00:00:00+00:00,32070
2023-10-24 00:00:00+00:00,32167
2023-10-25 00:00:00+00:00,32261
2023-10-26 00:00:00+00:00,32367
2023-10-27 00:00:00+00:00,32447
2023-10-28 00:00:00+00:00,32508
2023-10-29 00:00:00+00:00,32556
2023-10-30 00:00:00+00:00,32636
2023-10-31 00:00:00+00:00,32710
2023-11-01 00:00:00+00:00,32790
2023-11-02 00:00:00+00:00,32886
2023-11-03 00:00:00+00:00,32964
2023-11-04 00:00:00+00:00,33023
2023-11-05 00:00:00+00:00,33087
2023-11-06 00:00:00+00:00,33155
2023-11-07 00:00:00+00:00,33242
2023-11-08 00:00:00+00:00,33315
2023-11-09 00:00:00+00:00,33373
2023-11-10 00:00:00+00:00,33425
2023-11-11 00:00:00+00:00,33480
2023-11-12 00:00:00+00:00,33531
2023-11-13 00:00:00+00:00,33592
2023-11-14 00:00:00+00:00,33667
2023-11-15 00:00:00+00:00,33726
2023-11-16 00:00:00+00:00,33787
2023-11-17 00:00:00+00:00,33840
2023-11-18 00:00:00+00:00,33889
2023-11-19 00:00:00+00:00,33944
2023-11-20 00:00:00+00:00,34027
2023-11-21 00:00:00+00:00,34102
2023-11-22 00:00:00+00:00,34168
2023-11-23 00:00:00+00:00,34236
2023-11-24 00:00:00+00:00,34285
2023-11-25 00:00:00+00:00,34321
2023-11-26 00:00:00+00:00,34364
2023-11-27 00:00:00+00:00,34424
2023-11-28 00:00:00+00:00,34477
2023-11-29 00:00:00+00:00,34535
2023-11-30 00:00:00+00:00,34585
2023-12-01 00:00:00+00:00,34631
2023-12-02 00:00:00+00:00,34685
2023-12-03 00:00:00+00:00,34735
2023-12-04 00:00:00+00:00,34784
2023-12-05 00:00:00+00:00,34842
2023-12-06 00:00:00+00:00,34897
2023-12-07 00:00:00+00:00,34960
2023-12-08 00:00:00+00:00,35001
2023-12-09 00:00:00+00:00,35043
2023-12-10 00:00:00+00:00,35092
2023-12-11 00:00:00+00:00,35142
2023-12-12 00:00:00+00:00,35183
2023-12-13 00:00:00+00:00,35245
2023-12-14 00:00:00+00:00,35291
2023-12-15 00:00:00+00:00,35349
2023-12-16 00:00:00+00:00,35385
2023-12-17 00:00:00+00:00,35418
2023-12-18 00:00:00+00:00,35472
2023-12-19 00:00:00+00:00,35517
2023-12-20 00:00:00+00:00,35558
2023-12-21 00:00:00+00:00,35605
2023-12-22 00:00:00+00:00,35640
2023-12-23 00:00:00+00:00,35673
2023-12-24 00:00:00+00:00,35703
2023-12-25 00:00:00+00:00,35739
2023-12-26 00:00:00+00:00,35790
2023-12-27 00:00:00+00:00,35846
2023-12-28 00:00:00+00:00,35883
2023-12-29 00:00:00+00:00,35913
2023-12-30 00:00:00+00:00,35938
2023-12-31 00:00:00+00:00,35957
2024-01-01 00:00:00+00:00,35983
2024-01-02 00:00:00+00:00,36035
2024-01-03 00:00:00+00:00,36087
2024-01-04 00:00:00+00:00,36133
2024-01-05 00:00:00+00:00,36169
2024-01-06 00:00:00+00:00,36201
2024-01-07 00:00:00+00:00,36231
2024-01-08 00:00:00+00:00,36261
2024-01-09 00:00:00+00:00,36300
2024-01-10 00:00:00+00:00,36338
2024-01-11 00:00:00+00:00,36375
2024-01-12 00:00:00+00:00,36410
2024-01-13 00:00:00+00:00,36442
2024-01-14 00:00:00+00:00,36481
2024-01-15 00:00:00+00:00,36527
2024-01-16 00:00:00+00:00,36565
2024-01-17 00:00:00+00:00,36600
2024-01-18 00:00:00+00:00,36643
2024-01-19 00:00:00+00:00,36683
2024-01-20 00:00:00+00:00,36709
2024-01-21 00:00:00+00:00,36742
2024-01-22 00:00:00+00:00,36772
2024-01-23 00:00:00+00:00,36823
2024-01-24 00:00:00+00:00,36863
2024-01-25 00:00:00+00:00,36897
2024-01-26 00:00:00+00:00,36926
2024-01-27 00:00:00+00:00,36954
2024-01-28 00:00:00+00:00,36988
2024-01-29 00:00:00+00:00,37022
2024-01-30 00:00:00+00:00,37066
2024-01-31 00:00:00+00:00,37098
2024-02-01 00:00:00+00:00,37131
2024-02-02 00:00:00+00:00,37156
2024-02-03 00:00:00+00:00,37185
2024-02-04 00:00:00+00:00,37223
2024-02-05 00:00:00+00:00,37265
2024-02-06 00:00:00+00:00,37305
2024-02-07 00:00:00+00:00,37341
2024-02-08 00:00:00+00:00,37369
2024-02-09 00:00:00+00:00,37401
2024-02-10 00:00:00+00:00,37423
2024-02-11 00:00:00+00:00,37448
2024-02-12 00:00:00+00:00,37472
2024-02-13 00:00:00+00:00,37497
2024-02-14 00:00:00+00:00,37526
2024-02-15 00:00:00+00:00,37558
2024-02-16 00:00:00+00:00,37591
2024-02-17 00:00:00+00:00,37620
2024-02-18 00:00:00+00:00,37649
2024-02-19 00:00:00+00:00,37692
2024-02-20 00:00:00+00:00,37730
2024-02-21 00:00:00+00:00,37783
2024-02-22 00:00:00+00:00,37826
2024-02-23 00:00:00+00:00,37873
2024-02-24 00:00:00+00:00,37918
2024-02-25 00:00:00+00:00,37966
2024-02-26 00:00:00+00:00,38016
2024-02-27 00:00:00+00:00,38071
2024-02-28 00:00:00+00:00,38126
2024-02-29 00:00:00+00:00,38178
2024-03-01 00:00:00+00:00,38231
2024-03-02 00:00:00+00:00,38263
2024-03-03 00:00:00+00:00,38291
2024-03-04 00:00:00+00:00,38341
2024-03-05 00:00:00+00:00,38383
2024-03-06 00:00:00+00:00,38433
2024-03-07 00:00:00+00:00,38470
2024-03-08 00:00:00+00:00,38498
2024-03-09 00:00:00+00:00,38529
2024-03-10 00:00:00+00:00,38563
2024-03-11 00:00:00+00:00,38604
2024-03-12 00:00:00+00:00,38641
2024-03-13 00:00:00+00:00,38689
2024-03-14 00:00:00+00:00,38760
2024-03-15 00:00:00+00:00,38814
2024-03-16 00:00:00+00:00,38848
2024-03-17 00:00:00+00:00,38884
2024-03-18 00:00:00+00:00,38928
2024-03-19 00:00:00+00:00,38984
2024-03-20 00:00:00+00:00,39033
2024-03-21 00:00:00+00:00,39077
2024-03-22 00:00:00+00:00,39110
2024-03-23 00:00:00+00:00,39147
2024-03-24 00:00:00+00:00,39176
2024-03-25 00:00:00+00:00,39224
2024-03-26 00:00:00+00:00,39269
2024-03-27 00:00:00+00:00,39315
2024-03-28 00:00:00+00:00,39362
2024-03-29 00:00:00+00:00,39407
2024-03-30 00:00:00+00:00,39446
2024-03-31 00:00:00+00:00,39478
2024-04-01 00:00:00+00:00,39525
2024-04-02 00:00:00+00:00,39568
2024-04-03 00:00:00+00:00,39616
2024-04-04 00:00:00+00:00,39660
2024-04-05 00:00:00+00:00,39686
2024-04-06 00:00:00+00:00,39714
2024-04-07 00:00:00+00:00,39750
2024-04-08 00:00:00+00:00,39788
2024-04-09 00:00:00+00:00,39813
2024-04-10 00:00:00+00:00,39851
2024-04-11 00:00:00+00:00,39879
2024-04-12 00:00:00+00:00,39909
2024-04-13 00:00:00+00:00,39935
2024-04-14 00:00:00+00:00,39959
2024-04-15 00:00:00+00:00,40000
2024-04-16 00:00:00+00:00,40057
2024-04-17 00:00:00+00:00,40107
2024-04-18 00:00:00+00:00,40164
2024-04-19 00:00:00+00:00,40196
2024-04-20 00:00:00+00:00,40221
2024-04-21 00:00:00+00:00,40255
2024-04-22 00:00:00+00:00,40297
2024-04-23 00:00:00+00:00,40355
2024-04-24 00:00:00+00:00,40396
2024-04-25 00:00:00+00:00,40423
2024-04-26 00:00:00+00:00,40444
2024-04-27 00:00:00+00:00,40474
2024-04-28 00:00:00+00:00,40502
2024-04-29 00:00:00+00:00,40538
2024-04-30 00:00:00+00:00,40573
2024-05-01 00:00:00+00:00,40603
2024-05-02 00:00:00+00:00,40631
2024-05-03 00:00:00+00:00,40653
2024-05-04 00:00:00+00:00,40680
2024-05-05 00:00:00+00:00,40702
2024-05-06 00:00:00+00:00,40727
2024-05-07 00:00:00+00:00,40766
2024-05-08 00:00:00+00:00,40797
2024-05-09 00:00:00+00:00,40822
2024-05-10 00:00:00+00:00,40846
2024-05-11 00:00:00+00:00,40875
2024-05-12 00:00:00+00:00,40898
2024-05-13 00:00:00+00:00,40926
2024-05-14 00:00:00+00:00,40959
2024-05-15 00:00:00+00:00,41006
2024-05-16 00:00:00+00:00,41056
2024-05-17 00:00:00+00:00,41085
2024-05-18 00:00:00+00:00,41111
2024-05-19 00:00:00+00:00,41151
2024-05-20 00:00:00+00:00,41184
2024-05-21 00:00:00+00:00,41225
2024-05-22 00:00:00+00:00,41267
2024-05-23 00:00:00+00:00,41296
2024-05-24 00:00:00+00:00,41329
2024-05-25 00:00:00+00:00,41354
2024-05-26 00:00:00+00:00,41385
2024-05-27 00:00:00+00:00,41412
2024-05-28 00:00:00+00:00,41445
2024-05-29 00:00:00+00:00,41480
2024-05-30 00:00:00+00:00,41511
2024-05-31 00:00:00+00:00,41532
2024-06-01 00:00:00+00:00,41562
2024-06-02 00:00:00+00:00,41583
2024-06-03 00:00:00+00:00,41612
2024-06-04 00:00:00+00:00,41644
2024-06-05 00:00:00+00:00,41688
2024-06-06 00:00:00+00:00,41726
2024-06-07 00:00:00+00:00,41749
2024-06-08 00:00:00+00:00,41773
2024-06-09 00:00:00+00:00,41809
2024-06-10 00:00:00+00:00,41835
2024-06-11 00:00:00+00:00,41869
2024-06-12 00:00:00+00:00,41902
2024-06-13 00:00:00+00:00,41920
2024-06-14 00:00:00+00:00,41940
2024-06-15 00:00:00+00:00,41948
2024-06-16 00:00:00+00:00,41957
2024-06-17 00:00:00+00:00,41983
2024-06-18 00:00:00+00:00,42010
2024-06-19 00:00:00+00:00,42033
2024-06-20 00:00:00+00:00,42059
2024-06-21 00:00:00+00:00,42086
2024-06-22 00:00:00+00:00,42098
2024-06-23 00:00:00+00:00,42113
2024-06-24 00:00:00+00:00,42129
2024-06-25 00:00:00+00:00,42160
2024-06-26 00:00:00+00:00,42172
2024-06-27 00:00:00+00:00,42188
2024-06-28 00:00:00+00:00,42203
2024-06-29 00:00:00+00:00,42222
2024-06-30 00:00:00+00:00,42231
2024-07-01 00:00:00+00:00,42253
2024-07-02 00:00:00+00:00,42288
2024-07-03 00:00:00+00:00,42311
2024-07-04 00:00:00+00:00,42331
2024-07-05 00:00:00+00:00,42349
2024-07-06 00:00:00+00:00,42370
2024-07-07 00:00:00+00:00,42384
2024-07-08 00:00:00+00:00,42407
2024-07-09 00:00:00+00:00,42444
2024-07-10 00:00:00+00:00,42481
2024-07-11 00:00:00+00:00,42514
2024-07-12 00:00:00+00:00,42547
2024-07-13 00:00:00+00:00,42573
2024-07-14 00:00:00+00:00,42613
2024-07-15 00:00:00+00:00,42643
2024-07-16 00:00:00+00:00,42676
2024-07-17 00:00:00+00:00,42702
2024-07-18 00:00:00+00:00,42749
2024-07-19 00:00:00+00:00,42779
2024-07-20 00:00:00+00:00,42791
2024-07-21 00:00:00+00:00,42815
2024-07-22 00:00:00+00:00,42828
2024-07-23 00:00:00+00:00,42845
2024-07-24 00:00:00+00:00,42871
2024-07-25 00:00:00+00:00,42887
2024-07-26 00:00:00+00:00,42905
2024-07-27 00:00:00+00:00,42921
2024-07-28 00:00:00+00:00,42932
2024-07-29 00:00:00+00:00,42953
2024-07-30 00:00:00+00:00,42976
2024-07-31 00:00:00+00:00,42990
2024-08-01 00:00:00+00:00,43013
2024-08-02 00:00:00+00:00,43035
2024-08-03 00:00:00+00:00,43043
2024-08-04 00:00:00+00:00,43057
2024-08-05 00:00:00+00:00,43071
2024-08-06 00:00:00+00:00,43086
2024-08-07 00:00:00+00:00,43106
2024-08-08 00:00:00+00:00,43125
2024-08-09 00:00:00+00:00,43139
2024-08-10 00:00:00+00:00,43150
2024-08-11 00:00:00+00:00,43160
2024-08-12 00:00:00+00:00,43171
2024-08-13 00:00:00+00:00,43188
2024-08-14 00:00:00+00:00,43217
2024-08-15 00:00:00+00:00,43239
2024-08-16 00:00:00+00:00,43258
2024-08-17 00:00:00+00:00,43265
2024-08-18 00:00:00+00:00,43280
2024-08-19 00:00:00+00:00,43289
2024-08-20 00:00:00+00:00,43302
2024-08-21 00:00:00+00:00,43314
2024-08-22 00:00:00+00:00,43329
2024-08-23 00:00:00+00:00,43338
2024-08-24 00:00:00+00:00,43345
2024-08-25 00:00:00+00:00,43357
2024-08-26 00:00:00+00:00,43372
2024-08-27 00:00:00+00:00,43383
2024-08-28 00:00:00+00:00,43399
2024-08-29 00:00:00+00:00,43414
2024-08-30 00:00:00+00:00,43422
2024-08-31 00:00:00+00:00,43431
2024-09-01 00:00:00+00:00,43439
2024-09-02 00:00:00+00:00,43456
2024-09-03 00:00:00+00:00,43471
2024-09-04 00:00:00+00:00,43487
2024-09-05 00:00:00+00:00,43492
2024-09-06 00:00:00+00:00,43506
2024-09-07 00:00:00+00:00,43520
2024-09-08 00:00:00+00:00,43531
2024-09-09 00:00:00+00:00,43542
2024-09-10 00:00:00+00:00,43548
2024-09-11 00:00:00+00:00,43556
2024-09-12 00:00:00+00:00,43572
2024-09-13 00:00:00+00:00,43587
2024-09-14 00:00:00+00:00,43598
2024-09-15 00:00:00+00:00,43621
2024-09-16 00:00:00+00:00,43636
2024-09-17 00:00:00+00:00,43649
2024-09-18 00:00:00+00:00,43674
2024-09-19 00:00:00+00:00,43705
2024-09-20 00:00:00+00:00,43727
2024-09-21 00:00:00+00:00,43738
2024-09-22 00:00:00+00:00,43760
2024-09-23 00:00:00+00:00,43784
2024-09-24 00:00:00+00:00,43803
2024-09-25 00:00:00+00:00,43823
2024-09-26 00:00:00+00:00,43833
2024-09-27 00:00:00+00:00,43843
2024-09-28 00:00:00+00:00,43852
2024-09-29 00:00:00+00:00,43864
2024-09-30 00:00:00+00:00,43876
2024-10-01 00:00:00+00:00,43885
2024-10-02 00:00:00+00:00,43901
2024-10-03 00:00:00+00:00,43922
2024-10-04 00:00:00+00:00,43930
2024-10-05 00:00:00+00:00,43939
2024-10-06 00:00:00+00:00,43953
2024-10-07 00:00:00+00:00,43966
2024-10-08 00:00:00+00:00,43979
2024-10-09 00:00:00+00:00,43991
2024-10-10 00:00:00+00:00,44003
2024-10-11 00:00:00+00:00,44012
2024-10-12 00:00:00+00:00,44028
2024-10-13 00:00:00+00:00,44031
2024-10-14 00:00:00+00:00,44047
2024-10-15 00:00:00+00:00,44061
2024-10-16 00:00:00+00:00,44079
2024-10-17 00:00:00+00:00,44085
2024-10-18 00:00:00+00:00,44095
2024-10-19 00:00:00+00:00,44104
2024-10-20 00:00:00+00:00,44114
2024-10-21 00:00:00+00:00,44126
2024-10-22 00:00:00+00:00,44136
2024-10-23 00:00:00+00:00,44144
2024-10-24 00:00:00+00:00,44163
2024-10-25 00:00:00+00:00,44179
2024-10-26 00:00:00+00:00,44191
2024-10-27 00:00:00+00:00,44205
2024-10-28 00:00:00+00:00,44218
2024-10-29 00:00:00+00:00,44228
2024-10-30 00:00:00+00:00,44239
2024-10-31 00:00:00+00:00,44250
2024-11-01 00:00:00+00:00,44261
2024-11-02 00:00:00+00:00,44274
2024-11-03 00:00:00+00:00,44279
2024-11-04 00:00:00+00:00,44285
2024-11-05 00:00:00+00:00,44297
2024-11-06 00:00:00+00:00,44305
2024-11-07 00:00:00+00:00,44322
2024-11-08 00:00:00+00:00,44329
2024-11-09 00:00:00+00:00,44344
2024-11-10 00:00:00+00:00,44353
2024-11-11 00:00:00+00:00,44366
2024-11-12 00:00:00+00:00,44385
2024-11-13 00:00:00+00:00,44399
2024-11-14 00:00:00+00:00,44411
2024-11-15 00:00:00+00:00,44420
2024-11-16 00:00:00+00:00,44427
2024-11-17 00:00:00+00:00,44440
2024-11-18 00:00:00+00:00,44451
2024-11-19 00:00:00+00:00,44465
2024-11-20 00:00:00+00:00,44485
2024-11-21 00:00:00+00:00,44502
2024-11-22 00:00:00+00:00,44513
2024-11-23 00:00:00+00:00,44528
2024-11-24 00:00:00+00:00,44539
2024-11-25 00:00:00+00:00,44554
2024-11-26 00:00:00+00:00,44564
2024-11-27 00:00:00+00:00,44575
2024-11-28 00:00:00+00:00,44588
2024-11-29 00:00:00+00:00,44594
2024-11-30 00:00:00+00:00,44599
2024-12-01 00:00:00+00:00,44608
2024-12-02 00:00:00+00:00,44618
2024-12-03 00:00:00+00:00,44637
2024-12-04 00:00:00+00:00,44654
2024-12-05 00:00:00+00:00,44670
2024-12-06 00:00:00+00:00,44685
2024-12-07 00:00:00+00:00,44693
2024-12-08 00:00:00+00:00,44706
2024-12-09 00:00:00+00:00,44721
2024-12-10 00:00:00+00:00,44731
2024-12-11 00:00:00+00:00,44741
2024-12-12 00:00:00+00:00,44751
2024-12-13 00:00:00+00:00,44761
2024-12-14 00:00:00+00:00,44776
2024-12-15 00:00:00+00:00,44788
2024-12-16 00:00:00+00:00,44798
2024-12-17 00:00:00+00:00,44817
2024-12-18 00:00:00+00:00,44830
2024-12-19 00:00:00+00:00,44843
2024-12-20 00:00:00+00:00,44855
2024-12-21 00:00:00+00:00,44860
2024-12-22 00:00:00+00:00,44873
2024-12-23 00:00:00+00:00,44892
2024-12-24 00:00:00+00:00,44918
2024-12-25 00:00:00+00:00,44942
2024-12-26 00:00:00+00:00,44958
2024-12-27 00:00:00+00:00,44978
2024-12-28 00:00:00+00:00,44995
2024-12-29 00:00:00+00:00,45008
2024-12-30 00:00:00+00:00,45017
2024-12-31 00:00:00+00:00,45030
2025-01-01 00:00:00+00:00,45034
2025-01-02 00:00:00+00:00,45052
2025-01-03 00:00:00+00:00,45071
2025-01-04 00:00:00+00:00,45082
2025-01-05 00:00:00+00:00,45097
2025-01-06 00:00:00+00:00,45119
2025-01-07 00:00:00+00:00,45135
2025-01-08 00:00:00+00:00,45146
2025-01-09 00:00:00+00:00,45165
2025-01-10 00:00:00+00:00,45181
2025-01-11 00:00:00+00:00,45194
@ -0,0 +1,11 @@
|
|||
url_path,views_total,views_unique
|
||||
/Significant-Gravitas/Auto-GPT,59316,31101
|
||||
/Significant-Gravitas/Auto-GPT/blob/master/autogpts/autogpt/README.md,19205,5306
|
||||
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt,8307,2126
|
||||
/Significant-Gravitas/Auto-GPT/tree/master/frontend,6004,1798
|
||||
/Significant-Gravitas/Auto-GPT/tree/master/autogpts,5763,1630
|
||||
/Significant-Gravitas/Auto-GPT/releases/tag/v0.4.7,4663,2979
|
||||
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt/autogpt,3603,792
|
||||
/Significant-Gravitas/Auto-GPT/releases/tag/agbenchmark-v0.0.10,2727,1607
|
||||
/Significant-Gravitas/Auto-GPT/tree/master/docs,2508,856
|
||||
/Significant-Gravitas/Auto-GPT/blob/master/README.md,2496,789
|
|
|
@ -0,0 +1,11 @@
|
|||
referrer,views_total,views_unique
|
||||
Google,73683,23326
|
||||
github.com,18487,4024
|
||||
docs.agpt.co,7266,1259
|
||||
Bing,5666,1641
|
||||
youtube.com,4404,1118
|
||||
news.agpt.co,3849,1223
|
||||
DuckDuckGo,1969,603
|
||||
lablab.ai,1560,513
|
||||
link.zhihu.com,1160,368
|
||||
lilianweng.github.io,880,286
|
|
|
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/Auto-GPT,60364,31483
/Significant-Gravitas/Auto-GPT/blob/master/autogpts/autogpt/README.md,21011,5764
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt,8402,2186
/Significant-Gravitas/Auto-GPT/tree/master/frontend,6429,1899
/Significant-Gravitas/Auto-GPT/tree/master/autogpts,6384,1798
/Significant-Gravitas/Auto-GPT/releases/tag/v0.4.7,4129,2655
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt/autogpt,3694,820
/Significant-Gravitas/Auto-GPT/releases/tag/agbenchmark-v0.0.10,3124,1852
/Significant-Gravitas/Auto-GPT/tree/master/benchmark,2592,715
/Significant-Gravitas/Auto-GPT/tree/master/docs,2571,874
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,75816,23877
github.com,19135,4097
docs.agpt.co,7633,1265
Bing,5818,1690
youtube.com,4489,1125
news.agpt.co,4222,1308
DuckDuckGo,2071,618
lablab.ai,1620,518
link.zhihu.com,1252,386
lilianweng.github.io,914,298
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/Auto-GPT,57165,29805
/Significant-Gravitas/Auto-GPT/blob/master/autogpts/autogpt/README.md,21037,5764
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt,7960,2073
/Significant-Gravitas/Auto-GPT/tree/master/autogpts,6352,1781
/Significant-Gravitas/Auto-GPT/tree/master/frontend,6333,1887
/Significant-Gravitas/Auto-GPT/releases/tag/v0.4.7,3537,2334
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt/autogpt,3507,779
/Significant-Gravitas/AutoGPT,3285,2129
/Significant-Gravitas/Auto-GPT/releases/tag/agbenchmark-v0.0.10,3124,1852
/Significant-Gravitas/Auto-GPT/tree/master/benchmark,2558,704
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,75933,23417
github.com,19875,4203
docs.agpt.co,7724,1286
Bing,5666,1665
youtube.com,4658,1157
news.agpt.co,4426,1362
DuckDuckGo,2126,639
lablab.ai,1637,511
link.zhihu.com,1246,382
lilianweng.github.io,867,294
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/Auto-GPT,47387,24804
/Significant-Gravitas/Auto-GPT/blob/master/autogpts/autogpt/README.md,19458,5304
/Significant-Gravitas/AutoGPT,11368,6959
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt,6650,1772
/Significant-Gravitas/Auto-GPT/tree/master/autogpts,6271,1748
/Significant-Gravitas/Auto-GPT/tree/master/frontend,5710,1716
/Significant-Gravitas/AutoGPT/blob/master/autogpts/autogpt/README.md,5031,1389
/Significant-Gravitas/Auto-GPT/releases/tag/agbenchmark-v0.0.10,3124,1852
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt/autogpt,2961,668
/Significant-Gravitas/Auto-GPT/releases/tag/v0.4.7,2277,1503
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,74290,22351
github.com,20867,4357
docs.agpt.co,7911,1230
Bing,5505,1513
news.agpt.co,4806,1397
youtube.com,4693,1136
DuckDuckGo,2303,637
lablab.ai,1681,500
link.zhihu.com,1214,363
lilianweng.github.io,934,301
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/Auto-GPT,42297,22351
/Significant-Gravitas/Auto-GPT/blob/master/autogpts/autogpt/README.md,17545,4784
/Significant-Gravitas/AutoGPT,16428,9745
/Significant-Gravitas/Auto-GPT/tree/master/autogpts,6158,1702
/Significant-Gravitas/AutoGPT/blob/master/autogpts/autogpt/README.md,6157,1656
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt,5614,1560
/Significant-Gravitas/Auto-GPT/tree/master/frontend,5037,1536
/Significant-Gravitas/Auto-GPT/releases/tag/agbenchmark-v0.0.10,3124,1852
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt/autogpt,2558,578
/Significant-Gravitas/AutoGPT/tree/master/autogpts,2526,747
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,73222,22067
github.com,21264,4390
docs.agpt.co,7666,1205
Bing,5172,1472
news.agpt.co,4756,1453
youtube.com,4740,1143
DuckDuckGo,2331,624
lablab.ai,1467,475
link.zhihu.com,1215,356
lilianweng.github.io,955,302
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/Auto-GPT,37413,20093
/Significant-Gravitas/AutoGPT,21088,12488
/Significant-Gravitas/Auto-GPT/blob/master/autogpts/autogpt/README.md,15510,4234
/Significant-Gravitas/AutoGPT/blob/master/autogpts/autogpt/README.md,7162,1927
/Significant-Gravitas/Auto-GPT/tree/master/autogpts,5722,1588
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt,4724,1343
/Significant-Gravitas/Auto-GPT/tree/master/frontend,4374,1344
/Significant-Gravitas/AutoGPT/blob/master/autogpts/forge/tutorials/001_getting_started.md,3491,1076
/Significant-Gravitas/Auto-GPT/releases/tag/agbenchmark-v0.0.10,3124,1852
/Significant-Gravitas/AutoGPT/tree/master/autogpts,3117,919
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,72580,21797
github.com,21256,4382
docs.agpt.co,7482,1162
Bing,4927,1423
youtube.com,4864,1122
news.agpt.co,4859,1459
DuckDuckGo,2312,625
lablab.ai,1460,456
link.zhihu.com,1213,352
lilianweng.github.io,883,291
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/Auto-GPT,32939,17832
/Significant-Gravitas/AutoGPT,25637,14847
/Significant-Gravitas/Auto-GPT/blob/master/autogpts/autogpt/README.md,13812,3853
/Significant-Gravitas/AutoGPT/blob/master/autogpts/autogpt/README.md,8293,2257
/Significant-Gravitas/AutoGPT/blob/master/autogpts/forge/tutorials/001_getting_started.md,4987,1482
/Significant-Gravitas/Auto-GPT/tree/master/autogpts,4946,1395
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt,4108,1181
/Significant-Gravitas/AutoGPT/tree/master/autogpts,3803,1106
/Significant-Gravitas/Auto-GPT/tree/master/frontend,3787,1165
/Significant-Gravitas/AutoGPT/tree/master/autogpts/autogpt,3190,937
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,72856,21590
github.com,21517,4504
docs.agpt.co,7436,1167
news.agpt.co,5027,1486
youtube.com,5015,1130
Bing,4985,1427
DuckDuckGo,2285,632
lablab.ai,1358,421
link.zhihu.com,1239,359
lilianweng.github.io,886,284
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/AutoGPT,29561,16895
/Significant-Gravitas/Auto-GPT,29442,15704
/Significant-Gravitas/Auto-GPT/blob/master/autogpts/autogpt/README.md,12369,3433
/Significant-Gravitas/AutoGPT/blob/master/autogpts/autogpt/README.md,9179,2521
/Significant-Gravitas/AutoGPT/blob/master/autogpts/forge/tutorials/001_getting_started.md,6416,1820
/Significant-Gravitas/Auto-GPT/tree/master/autogpts,4405,1269
/Significant-Gravitas/AutoGPT/tree/master/autogpts,4304,1252
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt,3687,1067
/Significant-Gravitas/AutoGPT/tree/master/autogpts/autogpt,3610,1065
/Significant-Gravitas/Auto-GPT/tree/master/frontend,3386,1030
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,74644,21958
github.com,21520,4442
docs.agpt.co,7475,1160
news.agpt.co,5104,1503
youtube.com,5084,1127
Bing,4885,1421
DuckDuckGo,2354,635
lablab.ai,1255,403
link.zhihu.com,1254,352
search.brave.com,892,262
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/AutoGPT,32723,18519
/Significant-Gravitas/Auto-GPT,25424,13784
/Significant-Gravitas/Auto-GPT/blob/master/autogpts/autogpt/README.md,10493,2908
/Significant-Gravitas/AutoGPT/blob/master/autogpts/autogpt/README.md,9928,2730
/Significant-Gravitas/AutoGPT/blob/master/autogpts/forge/tutorials/001_getting_started.md,7614,2158
/Significant-Gravitas/AutoGPT/tree/master/autogpts,4852,1356
/Significant-Gravitas/AutoGPT/tree/master/autogpts/autogpt,4100,1174
/Significant-Gravitas/Auto-GPT/tree/master/autogpts,3764,1094
/Significant-Gravitas/AutoGPT/tree/master/frontend,3474,1192
/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt,3047,913
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,73302,21934
github.com,21483,4365
docs.agpt.co,7422,1141
news.agpt.co,5154,1515
youtube.com,4898,1141
Bing,4687,1379
DuckDuckGo,2326,635
lablab.ai,1251,385
link.zhihu.com,1248,350
search.brave.com,866,270
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/AutoGPT,35824,19996
/Significant-Gravitas/Auto-GPT,20044,11106
/Significant-Gravitas/AutoGPT/blob/master/autogpts/autogpt/README.md,10794,2937
/Significant-Gravitas/AutoGPT/blob/master/autogpts/forge/tutorials/001_getting_started.md,8630,2378
/Significant-Gravitas/Auto-GPT/blob/master/autogpts/autogpt/README.md,8181,2248
/Significant-Gravitas/AutoGPT/tree/master/autogpts,5374,1476
/Significant-Gravitas/AutoGPT/tree/master/autogpts/autogpt,4525,1278
/Significant-Gravitas/AutoGPT/tree/master/frontend,3839,1282
/Significant-Gravitas/AutoGPT/releases/tag/agbenchmark-v0.0.10,2984,1827
/Significant-Gravitas/Auto-GPT/tree/master/autogpts,2930,871
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,69420,20846
github.com,21255,4172
docs.agpt.co,7192,1081
news.agpt.co,5108,1503
youtube.com,4891,1137
Bing,4732,1341
DuckDuckGo,2297,622
link.zhihu.com,1158,324
lablab.ai,1126,346
search.brave.com,885,266
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/AutoGPT,39530,22086
/Significant-Gravitas/Auto-GPT,15042,8354
/Significant-Gravitas/AutoGPT/blob/master/autogpts/autogpt/README.md,11665,3149
/Significant-Gravitas/AutoGPT/blob/master/autogpts/forge/tutorials/001_getting_started.md,9766,2722
/Significant-Gravitas/Auto-GPT/blob/master/autogpts/autogpt/README.md,6022,1697
/Significant-Gravitas/AutoGPT/tree/master/autogpts,5886,1610
/Significant-Gravitas/AutoGPT/tree/master/autogpts/autogpt,4986,1403
/Significant-Gravitas/AutoGPT/tree/master/frontend,4158,1370
/Significant-Gravitas/AutoGPT/releases/tag/agbenchmark-v0.0.10,3234,1998
/Significant-Gravitas/AutoGPT/tree/master/autogpts/forge/tutorials,2369,688
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,67619,20379
github.com,21191,3975
docs.agpt.co,7075,1065
news.agpt.co,5022,1511
Bing,4714,1301
youtube.com,4628,1120
DuckDuckGo,2311,595
link.zhihu.com,1088,309
lablab.ai,1002,327
search.brave.com,894,260
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/AutoGPT,47790,26350
/Significant-Gravitas/AutoGPT/blob/master/autogpts/autogpt/README.md,13479,3600
/Significant-Gravitas/AutoGPT/blob/master/autogpts/forge/tutorials/001_getting_started.md,12369,3455
/Significant-Gravitas/AutoGPT/tree/master/autogpts,7035,1914
/Significant-Gravitas/AutoGPT/tree/master/autogpts/autogpt,5928,1656
/Significant-Gravitas/Auto-GPT,5752,3309
/Significant-Gravitas/AutoGPT/tree/master/frontend,4908,1654
/Significant-Gravitas/AutoGPT/releases/tag/agbenchmark-v0.0.10,3837,2347
/Significant-Gravitas/AutoGPT/tree/master/autogpts/forge/tutorials,2933,865
/Significant-Gravitas/AutoGPT/blob/master/QUICKSTART.md,2647,744
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,65290,20562
github.com,20699,3782
docs.agpt.co,6958,1006
news.agpt.co,5013,1511
youtube.com,4733,1105
Bing,4679,1236
DuckDuckGo,2125,561
lablab.ai,969,310
link.zhihu.com,914,270
search.brave.com,902,274
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/AutoGPT,51881,28043
/Significant-Gravitas/AutoGPT/blob/master/autogpts/autogpt/README.md,14214,3801
/Significant-Gravitas/AutoGPT/blob/master/autogpts/forge/tutorials/001_getting_started.md,13478,3738
/Significant-Gravitas/AutoGPT/tree/master/autogpts,7618,2065
/Significant-Gravitas/AutoGPT/tree/master/autogpts/autogpt,6438,1779
/Significant-Gravitas/AutoGPT/tree/master/frontend,5251,1755
/Significant-Gravitas/AutoGPT/releases/tag/agbenchmark-v0.0.10,4103,2497
/Significant-Gravitas/AutoGPT/tree/master/autogpts/forge/tutorials,3137,922
/Significant-Gravitas/AutoGPT/fork,2818,977
/Significant-Gravitas/AutoGPT/tree/master/autogpts/autogpt/autogpt,2811,683
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,64925,20431
github.com,20639,3795
docs.agpt.co,6737,996
news.agpt.co,5005,1492
youtube.com,4688,1084
Bing,4645,1199
DuckDuckGo,2073,553
lablab.ai,924,305
search.brave.com,877,272
lilianweng.github.io,834,250
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/AutoGPT,52629,28292
/Significant-Gravitas/AutoGPT/blob/master/autogpts/forge/tutorials/001_getting_started.md,14534,4031
/Significant-Gravitas/AutoGPT/blob/master/autogpts/autogpt/README.md,13615,3618
/Significant-Gravitas/AutoGPT/tree/master/autogpts,7615,2084
/Significant-Gravitas/AutoGPT/tree/master/autogpts/autogpt,6495,1777
/Significant-Gravitas/AutoGPT/tree/master/frontend,5225,1760
/Significant-Gravitas/AutoGPT/releases/tag/agbenchmark-v0.0.10,4049,2440
/Significant-Gravitas/AutoGPT/tree/master/autogpts/forge/tutorials,3311,988
/Significant-Gravitas/AutoGPT/fork,2900,1034
/Significant-Gravitas/AutoGPT/blob/master/QUICKSTART.md,2843,801
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,66207,20857
github.com,20247,3860
docs.agpt.co,6610,991
news.agpt.co,4977,1494
Bing,4877,1223
youtube.com,4729,1081
DuckDuckGo,2068,546
lablab.ai,934,302
lilianweng.github.io,888,256
search.brave.com,878,274
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/AutoGPT,52827,28832
/Significant-Gravitas/AutoGPT/blob/master/autogpts/forge/tutorials/001_getting_started.md,15827,4323
/Significant-Gravitas/AutoGPT/blob/master/autogpts/autogpt/README.md,12852,3471
/Significant-Gravitas/AutoGPT/tree/master/autogpts,7522,2046
/Significant-Gravitas/AutoGPT/tree/master/autogpts/autogpt,6383,1711
/Significant-Gravitas/AutoGPT/tree/master/frontend,5264,1748
/Significant-Gravitas/AutoGPT/releases/tag/agbenchmark-v0.0.10,3901,2316
/Significant-Gravitas/AutoGPT/tree/master/autogpts/forge/tutorials,3542,1044
/Significant-Gravitas/AutoGPT/fork,3028,1069
/Significant-Gravitas/AutoGPT/tree/master/autogpts/autogpt/autogpt,2791,656
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,66193,20756
github.com,20499,3913
docs.agpt.co,6589,1011
Bing,4896,1205
news.agpt.co,4831,1474
youtube.com,4763,1067
DuckDuckGo,1958,532
lilianweng.github.io,933,259
lablab.ai,900,289
search.brave.com,846,268
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/AutoGPT,51752,28722
/Significant-Gravitas/AutoGPT/blob/master/autogpts/forge/tutorials/001_getting_started.md,16767,4541
/Significant-Gravitas/AutoGPT/blob/master/autogpts/autogpt/README.md,11759,3163
/Significant-Gravitas/AutoGPT/tree/master/autogpts,7301,1974
/Significant-Gravitas/AutoGPT/tree/master/autogpts/autogpt,6258,1672
/Significant-Gravitas/AutoGPT/tree/master/frontend,5003,1638
/Significant-Gravitas/AutoGPT/releases/tag/agbenchmark-v0.0.10,3746,2207
/Significant-Gravitas/AutoGPT/tree/master/autogpts/forge/tutorials,3645,1098
/Significant-Gravitas/AutoGPT/fork,3095,1077
/Significant-Gravitas/AutoGPT/blob/master/autogpts/forge/tutorials/002_blueprint_of_an_agent.md,2798,873
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,63773,19767
github.com,20173,3769
docs.agpt.co,6255,978
youtube.com,4999,1077
news.agpt.co,4803,1431
Bing,4736,1158
DuckDuckGo,1790,496
link.zhihu.com,921,290
search.brave.com,878,272
lablab.ai,853,279
@@ -0,0 +1,11 @@
url_path,views_total,views_unique
/Significant-Gravitas/AutoGPT,51134,28014
/Significant-Gravitas/AutoGPT/blob/master/autogpts/forge/tutorials/001_getting_started.md,16638,4523
/Significant-Gravitas/AutoGPT/blob/master/autogpts/autogpt/README.md,11567,3130
/Significant-Gravitas/AutoGPT/tree/master/autogpts,7200,1982
/Significant-Gravitas/AutoGPT/tree/master/autogpts/autogpt,6224,1693
/Significant-Gravitas/AutoGPT/tree/master/frontend,4975,1627
/Significant-Gravitas/AutoGPT/releases/tag/agbenchmark-v0.0.10,3688,2201
/Significant-Gravitas/AutoGPT/tree/master/autogpts/forge/tutorials,3373,1012
/Significant-Gravitas/AutoGPT/fork,3121,1110
/Significant-Gravitas/AutoGPT/blob/master/autogpts/forge/tutorials/002_blueprint_of_an_agent.md,2710,841
@@ -0,0 +1,11 @@
referrer,views_total,views_unique
Google,63064,19999
github.com,19705,3726
docs.agpt.co,6057,929
youtube.com,4907,1076
news.agpt.co,4855,1409
Bing,4647,1108
DuckDuckGo,1599,472
link.zhihu.com,949,305
lablab.ai,859,279
search.brave.com,842,262
Some files were not shown because too many files have changed in this diff Show More