rename dev_requirements.txt -> dev-requirements.txt to match dbt-core #344

Merged · 2 commits · May 4, 2022
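The change itself is mechanical: rename the file, then update every reference to the old name in the CI workflows and tox config, as the diffs below show. (Several -/+ pairs below look identical; those hunks appear to strip only trailing whitespace.) A minimal sketch of how the rename could be reproduced locally; these exact commands are illustrative, not part of the PR:

```bash
# Rename the file while keeping the change tracked as a rename by git
git mv dev_requirements.txt dev-requirements.txt

# List remaining references to the old name (workflows, tox.ini, ...)
grep -rn --exclude-dir=.git 'dev_requirements\.txt' .

# Rewrite those references in place (GNU sed; BSD/macOS sed needs -i '')
grep -rl --exclude-dir=.git 'dev_requirements\.txt' . \
  | xargs sed -i 's/dev_requirements\.txt/dev-requirements.txt/g'
```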
34 changes: 17 additions & 17 deletions .github/workflows/release.yml
@@ -3,28 +3,28 @@ name: Build and Release

 on:
   workflow_dispatch:

 # Release version number that must be updated for each release
 env:
   version_number: '0.20.0rc2'

-jobs:
+jobs:
   Test:
     runs-on: ubuntu-latest
     steps:
       - name: Setup Python
         uses: actions/setup-python@v2.2.2
-        with:
+        with:
           python-version: '3.8'

       - uses: actions/checkout@v2

-      - name: Test release
+      - name: Test release
         run: |
           python3 -m venv env
           source env/bin/activate
           sudo apt-get install libsasl2-dev
-          pip install -r dev_requirements.txt
+          pip install -r dev-requirements.txt
           pip install twine wheel setuptools
           python setup.py sdist bdist_wheel
           pip install dist/dbt-spark-*.tar.gz
@@ -38,17 +38,17 @@ jobs:
     steps:
       - name: Setup Python
         uses: actions/setup-python@v2.2.2
-        with:
+        with:
           python-version: '3.8'

       - uses: actions/checkout@v2

       - name: Bumping version
         run: |
           python3 -m venv env
           source env/bin/activate
           sudo apt-get install libsasl2-dev
-          pip install -r dev_requirements.txt
+          pip install -r dev-requirements.txt
           bumpversion --config-file .bumpversion-dbt.cfg patch --new-version ${{env.version_number}}
           bumpversion --config-file .bumpversion.cfg patch --new-version ${{env.version_number}} --allow-dirty
           git status
@@ -60,7 +60,7 @@ jobs:
           author_email: '[email protected]'
           message: 'Bumping version to ${{env.version_number}}'
           tag: v${{env.version_number}}
-
+
       # Need to set an output variable because env variables can't be taken as input
       # This is needed for the next step with releasing to GitHub
       - name: Find release type
@@ -69,7 +69,7 @@ jobs:
           IS_PRERELEASE: ${{ contains(env.version_number, 'rc') || contains(env.version_number, 'b') }}
         run: |
           echo ::set-output name=isPrerelease::$IS_PRERELEASE
-
+
       - name: Create GitHub release
         uses: actions/create-release@v1
         env:
@@ -88,7 +88,7 @@ jobs:
             # or
             $ pip install "dbt-spark[PyHive]==${{env.version_number}}"
             ```
-
+
   PypiRelease:
     name: Pypi release
     runs-on: ubuntu-latest
@@ -97,13 +97,13 @@ jobs:
     steps:
       - name: Setup Python
         uses: actions/setup-python@v2.2.2
-        with:
+        with:
           python-version: '3.8'

       - uses: actions/checkout@v2
         with:
           ref: v${{env.version_number}}

       - name: Release to pypi
         env:
           TWINE_USERNAME: ${{ secrets.PYPI_USERNAME }}
@@ -112,8 +112,8 @@ jobs:
           python3 -m venv env
           source env/bin/activate
           sudo apt-get install libsasl2-dev
-          pip install -r dev_requirements.txt
+          pip install -r dev-requirements.txt
           pip install twine wheel setuptools
           python setup.py sdist bdist_wheel
           twine upload --non-interactive dist/dbt_spark-${{env.version_number}}-py3-none-any.whl dist/dbt-spark-${{env.version_number}}.tar.gz
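One non-rename detail worth a gloss: the "Find release type" step above re-exports an `env` value as a step output because, as its inline comment notes, env variables can't be taken as inputs to the next step. Reduced to the shell that actually runs on the runner (a sketch; `::set-output` is the legacy workflow command used here, since superseded by writing to `$GITHUB_OUTPUT`):

```bash
# Actions evaluates ${{ contains(...) }} first and injects the result into
# the step's environment; by then it is an ordinary shell variable.
IS_PRERELEASE=true   # true when version_number contains 'rc' or 'b'

# Legacy workflow command; later steps read the value back through
# ${{ steps.<step-id>.outputs.isPrerelease }}
echo "::set-output name=isPrerelease::$IS_PRERELEASE"
```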

20 changes: 10 additions & 10 deletions .github/workflows/version-bump.yml
@@ -1,16 +1,16 @@
 # **what?**
 # This workflow will take a version number and a dry run flag. With that
-# it will run versionbump to update the version number everywhere in the
+# it will run versionbump to update the version number everywhere in the
 # code base and then generate an update Docker requirements file. If this
 # is a dry run, a draft PR will open with the changes. If this isn't a dry
 # run, the changes will be committed to the branch this is run on.

 # **why?**
-# This is to aid in releasing dbt and making sure we have updated
+# This is to aid in releasing dbt and making sure we have updated
 # the versions and Docker requirements in all places.

 # **when?**
-# This is triggered either manually OR
+# This is triggered either manually OR
 # from the repository_dispatch event "version-bump" which is sent from
 # the dbt-release repo Action
@@ -25,11 +25,11 @@ on:
       is_dry_run:
         description: 'Creates a draft PR to allow testing instead of committing to a branch'
         required: true
-        default: 'true'
+        default: 'true'
   repository_dispatch:
     types: [version-bump]

-jobs:
+jobs:
   bump:
     runs-on: ubuntu-latest
     steps:
@@ -58,19 +58,19 @@ jobs:
           sudo apt-get install libsasl2-dev
           python3 -m venv env
           source env/bin/activate
-          pip install --upgrade pip
+          pip install --upgrade pip

       - name: Create PR branch
         if: ${{ steps.variables.outputs.IS_DRY_RUN == 'true' }}
         run: |
           git checkout -b bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID
           git push origin bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID
           git branch --set-upstream-to=origin/bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID

       - name: Bumping version
         run: |
           source env/bin/activate
-          pip install -r dev_requirements.txt
+          pip install -r dev-requirements.txt
           env/bin/bumpversion --allow-dirty --new-version ${{steps.variables.outputs.VERSION_NUMBER}} major
           git status
@@ -100,4 +100,4 @@ jobs:
           draft: true
           base: ${{github.ref}}
           title: 'Bumping version to ${{steps.variables.outputs.VERSION_NUMBER}}'
-          branch: 'bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_${{GITHUB.RUN_ID}}'
+          branch: 'bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_${{GITHUB.RUN_ID}}'
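As the **when?** comment at the top of this file says, the workflow is triggered manually or by a `repository_dispatch` from the dbt-release repo. A sketch of a manual trigger with the GitHub CLI; `is_dry_run` comes from the hunk above, but the version input's name is not visible in this diff, so `version_number` here is an assumption:

```bash
# Dispatch the version-bump workflow by hand (the input name version_number
# is assumed; is_dry_run appears in the diff above).
gh workflow run version-bump.yml \
  --repo dbt-labs/dbt-spark \
  -f version_number=1.2.0b1 \
  -f is_dry_run=true
```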
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -2,6 +2,7 @@

 ### Features
 - Add session connection method ([#272](https://github.com/dbt-labs/dbt-spark/issues/272), [#279](https://github.com/dbt-labs/dbt-spark/pull/279))
+- rename file to match reference to dbt-core ([#344](https://github.com/dbt-labs/dbt-spark/pull/344))

 ### Under the hood
 - Use dbt.tests.adapter.basic in test suite ([#298](https://github.com/dbt-labs/dbt-spark/issues/298), [#299](https://github.com/dbt-labs/dbt-spark/pull/299))
dev_requirements.txt → dev-requirements.txt
File renamed without changes.
14 changes: 7 additions & 7 deletions tox.ini
@@ -8,23 +8,23 @@ basepython = python3.8
 commands = /bin/bash -c '$(which flake8) --max-line-length 99 --select=E,W,F --ignore=W504 dbt/'
 passenv = DBT_* PYTEST_ADDOPTS
 deps =
-    -r{toxinidir}/dev_requirements.txt
+    -r{toxinidir}/dev-requirements.txt

 [testenv:unit]
 basepython = python3.8
 commands = /bin/bash -c '{envpython} -m pytest -v {posargs} tests/unit'
 passenv = DBT_* PYTEST_ADDOPTS
 deps =
     -r{toxinidir}/requirements.txt
-    -r{toxinidir}/dev_requirements.txt
+    -r{toxinidir}/dev-requirements.txt

 [testenv:integration-spark-databricks-http]
 basepython = python3.8
 commands = /bin/bash -c '{envpython} -m pytest -v --profile databricks_http_cluster {posargs} -n4 tests/functional/adapter/*'
 passenv = DBT_* PYTEST_ADDOPTS
 deps =
     -r{toxinidir}/requirements.txt
-    -r{toxinidir}/dev_requirements.txt
+    -r{toxinidir}/dev-requirements.txt
     -e.

 [testenv:integration-spark-databricks-odbc-cluster]
@@ -34,7 +34,7 @@ commands = /bin/bash -c '{envpython} -m pytest -v --profile databricks_cluster {posargs} -n4 tests/functional/adapter/*'
 passenv = DBT_* PYTEST_ADDOPTS ODBC_DRIVER
 deps =
     -r{toxinidir}/requirements.txt
-    -r{toxinidir}/dev_requirements.txt
+    -r{toxinidir}/dev-requirements.txt
     -e.

 [testenv:integration-spark-databricks-odbc-sql-endpoint]
@@ -44,7 +44,7 @@ commands = /bin/bash -c '{envpython} -m pytest -v --profile databricks_sql_endpoint {posargs} -n4 tests/functional/adapter/*'
 passenv = DBT_* PYTEST_ADDOPTS ODBC_DRIVER
 deps =
     -r{toxinidir}/requirements.txt
-    -r{toxinidir}/dev_requirements.txt
+    -r{toxinidir}/dev-requirements.txt
     -e.

@@ -55,7 +55,7 @@ commands = /bin/bash -c '{envpython} -m pytest -v --profile apache_spark {posargs} -n4 tests/functional/adapter/*'
 passenv = DBT_* PYTEST_ADDOPTS
 deps =
     -r{toxinidir}/requirements.txt
-    -r{toxinidir}/dev_requirements.txt
+    -r{toxinidir}/dev-requirements.txt
     -e.

 [testenv:integration-spark-session]
@@ -67,5 +67,5 @@ passenv =
     PIP_CACHE_DIR
 deps =
     -r{toxinidir}/requirements.txt
-    -r{toxinidir}/dev_requirements.txt
+    -r{toxinidir}/dev-requirements.txt
     -e.[session]