tianlh / LightGBM-DCU · Commits

Commit 226e7f7d (unverified), authored Jan 20, 2025 by Nikita Titov, committed via GitHub on Jan 20, 2025. Parent: 7679c735

[ci] fix errors about indentation in yaml files (Part 2) (#6789)

Showing 8 changed files, with 529 additions and 530 deletions (+529, -530):
.github/workflows/cuda.yml                 +6    -6
.github/workflows/python_package.yml       +6    -6
.github/workflows/r_package.yml            +6    -6
.github/workflows/static_analysis.yml      +6    -6
.github/workflows/triggering_comments.yml  +19   -19
.vsts-ci.yml                               +440  -440
.yamllint.yml                              +0    -1
R-package/pkgdown/_pkgdown.yml             +46   -46
.github/workflows/cuda.yml

@@ -3,10 +3,10 @@ name: CUDA Version

on:
  push:
    branches:
      - master
  pull_request:
    branches:
      - master
  # Run manually by clicking a button in the UI
  workflow_dispatch:
    inputs:

@@ -130,7 +130,7 @@ jobs:

    runs-on: ubuntu-latest
    needs: [test]
    steps:
      - name: Note that all tests succeeded
        uses: re-actors/alls-green@v1.2.2
        with:
          jobs: ${{ toJSON(needs) }}
.github/workflows/python_package.yml

@@ -3,10 +3,10 @@ name: Python-package

on:
  push:
    branches:
      - master
  pull_request:
    branches:
      - master
  # automatically cancel in-progress builds if another commit is pushed
  concurrency:

@@ -147,7 +147,7 @@ jobs:

    runs-on: ubuntu-latest
    needs: [test, test-latest-versions, test-oldest-versions]
    steps:
      - name: Note that all tests succeeded
        uses: re-actors/alls-green@v1.2.2
        with:
          jobs: ${{ toJSON(needs) }}
.github/workflows/r_package.yml

@@ -3,10 +3,10 @@ name: R-package

on:
  push:
    branches:
      - master
  pull_request:
    branches:
      - master
  # automatically cancel in-progress builds if another commit is pushed
  concurrency:

@@ -358,7 +358,7 @@ jobs:

    runs-on: ubuntu-latest
    needs: [test, test-r-sanitizers, test-r-extra-checks]
    steps:
      - name: Note that all tests succeeded
        uses: re-actors/alls-green@v1.2.2
        with:
          jobs: ${{ toJSON(needs) }}
.github/workflows/static_analysis.yml

@@ -5,10 +5,10 @@ name: Static Analysis

on:
  push:
    branches:
      - master
  pull_request:
    branches:
      - master
  # automatically cancel in-progress builds if another commit is pushed
  concurrency:

@@ -88,7 +88,7 @@ jobs:

    runs-on: ubuntu-latest
    needs: [test, r-check-docs]
    steps:
      - name: Note that all tests succeeded
        uses: re-actors/alls-green@v1.2.2
        with:
          jobs: ${{ toJSON(needs) }}
.github/workflows/triggering_comments.yml

@@ -11,24 +11,24 @@ jobs:

    env:
      SECRETS_WORKFLOW: ${{ secrets.WORKFLOW }}
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 5
          submodules: false
      - name: Trigger R valgrind tests
        if: github.event.comment.body == '/gha run r-valgrind'
        run: |
          $GITHUB_WORKSPACE/.ci/trigger-dispatch-run.sh \
            "${{ github.event.issue.pull_request.url }}" \
            "${{ github.event.comment.id }}" \
            "gha_run_r_valgrind"
      - name: Trigger update R configure
        if: github.event.comment.body == '/gha run r-configure'
        run: |
          $GITHUB_WORKSPACE/.ci/trigger-dispatch-run.sh \
            "${{ github.event.issue.pull_request.url }}" \
            "${{ github.event.comment.id }}" \
            "gha_run_r_configure"
.vsts-ci.yml

trigger:
  branches:
    include:
      - master
  tags:
    include:
      - v*
pr:
  - master
variables:
  AZURE: 'true'
  CMAKE_BUILD_PARALLEL_LEVEL: 4
@@ -26,458 +26,458 @@ resources:

  # to minimize the risk of side effects from one run affecting future runs.
  # ref: https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/resources-containers-container
  containers:
    - container: linux-artifact-builder
      image: lightgbm/vsts-agent:manylinux_2_28_x86_64
      mountReadOnly:
        work: false
        externals: true
        tools: true
        tasks: true
    - container: ubuntu-latest
      image: 'ubuntu:22.04'
      options: "--name ci-container -v /usr/bin/docker:/tmp/docker:ro"
      mountReadOnly:
        work: false
        externals: true
        tools: true
        tasks: true
    - container: rbase
      image: wch1/r-debug
      mountReadOnly:
        work: false
        externals: true
        tools: true
        tasks: true
jobs:
  ###########################################
  - job: Maintenance
    ###########################################
    pool: mariner-20240410-0
    container: ubuntu-latest
    # routine maintenance (like periodically deleting old files),
    # to be run on 1 random CI runner in the self-hosted pool each run
    steps:
      - script: |
          print-diagnostics(){
            echo "---- df -h -m ----"
            df -h -m
            echo "---- docker system df ----"
            /tmp/docker system df
            echo "---- docker images ----"
            /tmp/docker images
          }
          # check disk usage
          print-diagnostics
          # remove old containers, container images, volumes
          # ref: https://stackoverflow.com/a/32723127/3986677
          # ref: https://depot.dev/blog/docker-clear-cache#removing-everything-with-docker-system-prune
          echo "---- running 'docker system prune' ----"
          /tmp/docker system prune \
            --all \
            --force \
            --volumes \
            --filter until=720h
          # check disk usage again
          print-diagnostics
        displayName: Clean
  ###########################################
  - job: Linux
    ###########################################
    variables:
      COMPILER: gcc
      SETUP_CONDA: 'false'
      OS_NAME: 'linux'
      PRODUCES_ARTIFACTS: 'true'
    pool: mariner-20240410-0
    container: linux-artifact-builder
    strategy:
      matrix:
        regular:
          TASK: regular
          PYTHON_VERSION: '3.10'
        sdist:
          TASK: sdist
          PYTHON_VERSION: '3.8'
        bdist:
          TASK: bdist
          PYTHON_VERSION: '3.9'
        inference:
          TASK: if-else
        mpi_source:
          TASK: mpi
          METHOD: source
          PYTHON_VERSION: '3.9'
        gpu_source:
          TASK: gpu
          METHOD: source
        swig:
          TASK: swig
    steps:
      - script: |
          echo "##vso[task.setvariable variable=BUILD_DIRECTORY]$BUILD_SOURCESDIRECTORY"
          echo "##vso[task.prependpath]/usr/lib64/openmpi/bin"
          echo "##vso[task.prependpath]$CONDA/bin"
        displayName: 'Set variables'
      - script: |
          git clean -d -f -x
        displayName: 'Clean source directory'
      - script: |
          echo '$(Build.SourceVersion)' > '$(Build.ArtifactStagingDirectory)/commit.txt'
        displayName: 'Add commit hash to artifacts archive'
      - task: Bash@3
        displayName: Setup
        inputs:
          filePath: $(Build.SourcesDirectory)/.ci/setup.sh
          targetType: filePath
      - task: Bash@3
        displayName: Test
        inputs:
          filePath: $(Build.SourcesDirectory)/.ci/test.sh
          targetType: filePath
      - task: PublishBuildArtifacts@1
        condition: and(succeeded(), in(variables['TASK'], 'regular', 'sdist', 'bdist', 'swig'), not(startsWith(variables['Build.SourceBranch'], 'refs/pull/')))
        inputs:
          pathtoPublish: '$(Build.ArtifactStagingDirectory)'
          artifactName: PackageAssets
          artifactType: container
  ###########################################
  - job: Linux_latest
    ###########################################
    variables:
      COMPILER: clang-17
      DEBIAN_FRONTEND: 'noninteractive'
      IN_UBUNTU_BASE_CONTAINER: 'true'
      OS_NAME: 'linux'
      SETUP_CONDA: 'true'
    pool: mariner-20240410-0
    container: ubuntu-latest
    strategy:
      matrix:
        regular:
          TASK: regular
        sdist:
          TASK: sdist
        bdist:
          TASK: bdist
          PYTHON_VERSION: '3.10'
        inference:
          TASK: if-else
        mpi_source:
          TASK: mpi
          METHOD: source
        mpi_pip:
          TASK: mpi
          METHOD: pip
          PYTHON_VERSION: '3.11'
        mpi_wheel:
          TASK: mpi
          METHOD: wheel
          PYTHON_VERSION: '3.9'
        gpu_source:
          TASK: gpu
          METHOD: source
          PYTHON_VERSION: '3.11'
        gpu_pip:
          TASK: gpu
          METHOD: pip
          PYTHON_VERSION: '3.10'
        gpu_wheel:
          TASK: gpu
          METHOD: wheel
          PYTHON_VERSION: '3.9'
        cpp_tests:
          TASK: cpp-tests
          METHOD: with-sanitizers
    steps:
      - script: |
          echo "##vso[task.setvariable variable=BUILD_DIRECTORY]$BUILD_SOURCESDIRECTORY"
          CONDA=$HOME/miniforge
          echo "##vso[task.setvariable variable=CONDA]$CONDA"
          echo "##vso[task.prependpath]$CONDA/bin"
        displayName: 'Set variables'
      # https://github.com/microsoft/azure-pipelines-agent/issues/2043#issuecomment-687983301
      - script: |
          /tmp/docker exec -t -u 0 ci-container \
            sh -c "apt-get update && apt-get -o Dpkg::Options::="--force-confold" -y install sudo"
        displayName: 'Install sudo'
      - script: |
          sudo apt-get update
          sudo apt-get install -y --no-install-recommends git
          git clean -d -f -x
        displayName: 'Clean source directory'
      - task: Bash@3
        displayName: Setup
        inputs:
          filePath: $(Build.SourcesDirectory)/.ci/setup.sh
          targetType: 'filePath'
      - task: Bash@3
        displayName: Test
        inputs:
          filePath: $(Build.SourcesDirectory)/.ci/test.sh
          targetType: 'filePath'
  ###########################################
  - job: QEMU_multiarch
    ###########################################
    variables:
      BUILD_DIRECTORY: /LightGBM
      COMPILER: gcc
      PRODUCES_ARTIFACTS: 'true'
    pool:
      vmImage: ubuntu-22.04
    timeoutInMinutes: 180
    strategy:
      matrix:
        bdist:
          TASK: bdist
          ARCH: aarch64
    steps:
      - script: |
          sudo apt-get update
          sudo apt-get install --no-install-recommends -y \
            binfmt-support \
            qemu \
            qemu-user \
            qemu-user-static
        displayName: 'Install QEMU'
      - script: |
          docker run --rm --privileged multiarch/qemu-user-static --reset -p yes
        displayName: 'Enable Docker multi-architecture support'
      - script: |
          git clean -d -f -x
        displayName: 'Clean source directory'
      - script: |
          cat > docker-script.sh <<EOF
          export CONDA=\$HOME/miniforge
          export PATH=\$CONDA/bin:/opt/rh/llvm-toolset-7.0/root/usr/bin:\$PATH
          export LD_LIBRARY_PATH=/opt/rh/llvm-toolset-7.0/root/usr/lib64:\$LD_LIBRARY_PATH
          \$BUILD_DIRECTORY/.ci/setup.sh || exit 1
          \$BUILD_DIRECTORY/.ci/test.sh || exit 1
          EOF
          IMAGE_URI="lightgbm/vsts-agent:manylinux2014_aarch64"
          docker pull "${IMAGE_URI}" || exit 1
          PLATFORM=$(docker inspect --format='{{.Os}}/{{.Architecture}}' "${IMAGE_URI}") || exit 1
          echo "detected image platform: ${PLATFORM}"
          docker run \
            --platform "${PLATFORM}" \
            --rm \
            --env AZURE=true \
            --env BUILD_ARTIFACTSTAGINGDIRECTORY=$BUILD_ARTIFACTSTAGINGDIRECTORY \
            --env BUILD_DIRECTORY=$BUILD_DIRECTORY \
            --env COMPILER=$COMPILER \
            --env METHOD=$METHOD \
            --env OS_NAME=linux \
            --env PRODUCES_ARTIFACTS=$PRODUCES_ARTIFACTS \
            --env PYTHON_VERSION=$PYTHON_VERSION \
            --env TASK=$TASK \
            -v "$(Build.SourcesDirectory)":"$BUILD_DIRECTORY" \
            -v "$(Build.ArtifactStagingDirectory)":"$(Build.ArtifactStagingDirectory)" \
            "${IMAGE_URI}" \
            /bin/bash $BUILD_DIRECTORY/docker-script.sh
        displayName: 'Setup and run tests'
      - task: PublishBuildArtifacts@1
        condition: and(succeeded(), in(variables['TASK'], 'bdist'), not(startsWith(variables['Build.SourceBranch'], 'refs/pull/')))
        inputs:
          pathtoPublish: '$(Build.ArtifactStagingDirectory)'
          artifactName: PackageAssets
          artifactType: container
  ###########################################
  - job: macOS
    ###########################################
    variables:
      COMPILER: clang
      OS_NAME: 'macos'
      PRODUCES_ARTIFACTS: 'true'
    pool:
      vmImage: 'macOS-13'
    strategy:
      matrix:
        regular:
          TASK: regular
          PYTHON_VERSION: '3.10'
        sdist:
          TASK: sdist
          PYTHON_VERSION: '3.9'
        bdist:
          TASK: bdist
        swig:
          TASK: swig
        cpp_tests:
          TASK: cpp-tests
          METHOD: with-sanitizers
          SANITIZERS: "address;undefined"
    steps:
      - script: |
          echo "##vso[task.setvariable variable=BUILD_DIRECTORY]$BUILD_SOURCESDIRECTORY"
          CONDA=$AGENT_HOMEDIRECTORY/miniforge
          echo "##vso[task.setvariable variable=CONDA]$CONDA"
          echo "##vso[task.prependpath]$CONDA/bin"
          echo "##vso[task.setvariable variable=JAVA_HOME]$JAVA_HOME_8_X64"
        displayName: 'Set variables'
      - script: |
          git clean -d -f -x
        displayName: 'Clean source directory'
      - task: Bash@3
        displayName: Setup
        inputs:
          filePath: $(Build.SourcesDirectory)/.ci/setup.sh
          targetType: filePath
      - task: Bash@3
        displayName: Test
        inputs:
          filePath: $(Build.SourcesDirectory)/.ci/test.sh
          targetType: filePath
      - task: PublishBuildArtifacts@1
        condition: and(succeeded(), in(variables['TASK'], 'regular', 'bdist', 'swig'), not(startsWith(variables['Build.SourceBranch'], 'refs/pull/')))
        inputs:
          pathtoPublish: '$(Build.ArtifactStagingDirectory)'
          artifactName: PackageAssets
          artifactType: container
  ###########################################
  - job: Windows
    ###########################################
    pool:
      vmImage: 'windows-2019'
    strategy:
      matrix:
        regular:
          TASK: regular
          PYTHON_VERSION: '3.10'
        sdist:
          TASK: sdist
          PYTHON_VERSION: '3.9'
        bdist:
          TASK: bdist
        swig:
          TASK: swig
        cpp_tests:
          TASK: cpp-tests
    steps:
      - powershell: |
          Write-Host "##vso[task.prependpath]$env:CONDA\Scripts"
        displayName: 'Set Variables'
      - script: |
          git clean -d -f -x
        displayName: 'Clean source directory'
      - script: |
          cmd /c "powershell -ExecutionPolicy Bypass -File %BUILD_SOURCESDIRECTORY%/.ci/install-opencl.ps1"
        condition: eq(variables['TASK'], 'bdist')
        displayName: 'Install OpenCL'
      - script: |
          cmd /c "conda config --remove channels defaults"
          cmd /c "conda config --add channels nodefaults"
          cmd /c "conda config --add channels conda-forge"
          cmd /c "conda config --set channel_priority strict"
          cmd /c "conda init powershell"
          cmd /c "powershell -ExecutionPolicy Bypass -File %BUILD_SOURCESDIRECTORY%/.ci/test-windows.ps1"
        displayName: Test
      - task: PublishBuildArtifacts@1
        condition: and(succeeded(), in(variables['TASK'], 'regular', 'bdist', 'swig'), not(startsWith(variables['Build.SourceBranch'], 'refs/pull/')))
        inputs:
          pathtoPublish: '$(Build.ArtifactStagingDirectory)'
          artifactName: PackageAssets
          artifactType: container
  ###########################################
  - job: R_artifact
    ###########################################
    condition: not(startsWith(variables['Build.SourceBranch'], 'refs/pull/'))
    pool:
      vmImage: 'ubuntu-22.04'
    container: rbase
    steps:
      - script: |
          git clean -d -f -x
        displayName: 'Clean source directory'
      - script: |
          LGB_VER=$(head -n 1 VERSION.txt | sed "s/rc/-/g")
          R_LIB_PATH=~/Rlib
          export R_LIBS=${R_LIB_PATH}
          mkdir -p ${R_LIB_PATH}
          RDscript -e "install.packages(c('R6', 'data.table', 'jsonlite', 'knitr', 'markdown', 'Matrix', 'RhpcBLASctl'), lib = '${R_LIB_PATH}', dependencies = c('Depends', 'Imports', 'LinkingTo'), repos = 'https://cran.rstudio.com', Ncpus = parallel::detectCores())" || exit 1
          sh build-cran-package.sh --r-executable=RD || exit 1
          mv lightgbm_${LGB_VER}.tar.gz $(Build.ArtifactStagingDirectory)/lightgbm-${LGB_VER}-r-cran.tar.gz
        displayName: 'Build CRAN R-package'
      - task: PublishBuildArtifacts@1
        condition: succeeded()
        inputs:
          pathtoPublish: $(Build.ArtifactStagingDirectory)
          artifactName: R-package
          artifactType: container
  ###########################################
  - job: Package
    ###########################################
    dependsOn:
      - Linux
      - Linux_latest
      - QEMU_multiarch
      - macOS
      - Windows
      - R_artifact
    condition: and(succeeded(), not(startsWith(variables['Build.SourceBranch'], 'refs/pull/')))
    pool:
      vmImage: 'ubuntu-22.04'
    steps:
      # Create archives with complete source code included (with git submodules)
      - task: ArchiveFiles@2
        displayName: Create zip archive
        condition: and(succeeded(), startsWith(variables['Build.SourceBranch'], 'refs/tags/v'))
        inputs:
          rootFolderOrFile: $(Build.SourcesDirectory)
          includeRootFolder: false
          archiveType: zip
          archiveFile: '$(Build.ArtifactStagingDirectory)/archives/LightGBM-complete_source_code_zip.zip'
          replaceExistingArchive: true
      - task: ArchiveFiles@2
        displayName: Create tar.gz archive
        condition: and(succeeded(), startsWith(variables['Build.SourceBranch'], 'refs/tags/v'))
        inputs:
          rootFolderOrFile: $(Build.SourcesDirectory)
          includeRootFolder: false
          archiveType: tar
          tarCompression: gz
          archiveFile: '$(Build.ArtifactStagingDirectory)/archives/LightGBM-complete_source_code_tar_gz.tar.gz'
          replaceExistingArchive: true
      # Download all agent packages from all previous phases
      - task: DownloadBuildArtifacts@0
        displayName: Download package assets
        inputs:
          artifactName: PackageAssets
          downloadPath: $(Build.SourcesDirectory)/binaries
      - task: DownloadBuildArtifacts@0
        displayName: Download R-package
        condition: and(succeeded(), startsWith(variables['Build.SourceBranch'], 'refs/tags/v'))
        inputs:
          artifactName: R-package
          downloadPath: $(Build.SourcesDirectory)/R
      - script: |
          python "$(Build.SourcesDirectory)/.ci/create-nuget.py" "$(Build.SourcesDirectory)/binaries/PackageAssets"
        displayName: 'Create NuGet configuration files'
      - task: NuGetCommand@2
        inputs:
          command: pack
          packagesToPack: '$(Build.SourcesDirectory)/.ci/nuget/*.nuspec'
          packDestination: '$(Build.ArtifactStagingDirectory)/nuget'
      - task: PublishBuildArtifacts@1
        inputs:
          pathtoPublish: '$(Build.ArtifactStagingDirectory)/nuget'
          artifactName: NuGet
          artifactType: container
      - task: GitHubRelease@0
        displayName: 'Create GitHub Release'
        condition: and(succeeded(), startsWith(variables['Build.SourceBranch'], 'refs/tags/v'))
        inputs:
          gitHubConnection: guolinke
          repositoryName: '$(Build.Repository.Name)'
          action: 'create'
          target: '$(Build.SourceVersion)'
          tagSource: 'auto'
          title: '$(Build.SourceBranchName)'
          assets: |
            $(Build.SourcesDirectory)/binaries/PackageAssets/*
            $(Build.SourcesDirectory)/R/R-package/*
            $(Build.ArtifactStagingDirectory)/nuget/*.nupkg
            $(Build.ArtifactStagingDirectory)/archives/*
          assetUploadMode: 'delete'
          isDraft: true
          isPreRelease: false
          addChangeLog: false
.yamllint.yml

@@ -10,5 +10,4 @@ rules:

     check-keys: false
   # temporarily disabled rules
-  indentation: disable
   comments-indentation: disable
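The yamllint `indentation` rule referenced in the `.yamllint.yml` hunk above checks, among other things, that sibling sequence items under the same parent key share one indentation level — exactly the kind of inconsistency this commit fixes in the workflow files. As a rough illustration (a minimal sketch, not yamllint itself; the function name and return format are invented, and nested sequences are ignored), the core of that check amounts to:

```python
def check_sequence_indent(yaml_text: str):
    """Flag flat sequence items whose indent differs from the first sibling.

    Returns a list of (line_number, found_indent, expected_indent) tuples.
    Comments and blank lines are skipped; any non-item line resets the
    expected indent, since a new parent key starts a new sequence.
    """
    problems = []
    expected = None  # indent of the first "- " item in the current sequence
    for lineno, line in enumerate(yaml_text.splitlines(), start=1):
        stripped = line.lstrip()
        indent = len(line) - len(stripped)
        if not stripped or stripped.startswith("#"):
            continue
        if stripped.startswith("- "):
            if expected is None:
                expected = indent
            elif indent != expected:
                problems.append((lineno, indent, expected))
        else:
            expected = None
    return problems


# Consistent items pass; a mis-indented sibling is reported.
good = "branches:\n  - master\n  - dev\n"
bad = "branches:\n  - master\n    - dev\n"
print(check_sequence_indent(good))  # → []
print(check_sequence_indent(bad))   # → [(3, 4, 2)]
```

A real linter also tracks nesting depth and mapping indents; this sketch only captures the sibling-consistency idea.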
R-package/pkgdown/_pkgdown.yml

@@ -37,70 +37,70 @@ navbar:

  title: LightGBM
  type: default
  left:
    - icon: fa-reply fa-lg
      href: ../
    - icon: fa-home fa-lg
      href: index.html
    - text: Articles
      href: articles/index.html
    - text: Reference
      href: reference/index.html
  right:
    - icon: fa-github fa-lg
      href: https://github.com/microsoft/LightGBM/tree/master/R-package

reference:
  - title: Datasets
    desc: Datasets included with the R-package
    contents:
      - '`agaricus.train`'
      - '`agaricus.test`'
      - '`bank`'
  - title: Data Input / Output
    desc: Data I/O required for LightGBM
    contents:
      - '`dim.lgb.Dataset`'
      - '`dimnames.lgb.Dataset`'
      - '`get_field`'
      - '`set_field`'
      - '`lgb.Dataset`'
      - '`lgb.Dataset.construct`'
      - '`lgb.Dataset.create.valid`'
      - '`lgb.Dataset.save`'
      - '`lgb.Dataset.set.categorical`'
      - '`lgb.Dataset.set.reference`'
      - '`lgb.convert_with_rules`'
      - '`lgb.slice.Dataset`'
  - title: Machine Learning
    desc: Train models with LightGBM and then use them to make predictions on new data
    contents:
      - '`lightgbm`'
      - '`lgb.train`'
      - '`predict.lgb.Booster`'
      - '`lgb.cv`'
      - '`lgb.configure_fast_predict`'
  - title: Saving / Loading Models
    desc: Save and load LightGBM models
    contents:
      - '`lgb.dump`'
      - '`lgb.save`'
      - '`lgb.load`'
      - '`lgb.model.dt.tree`'
      - '`lgb.drop_serialized`'
      - '`lgb.make_serializable`'
      - '`lgb.restore_handle`'
  - title: Model Interpretation
    desc: Analyze your models
    contents:
      - '`lgb.get.eval.result`'
      - '`lgb.importance`'
      - '`lgb.interprete`'
      - '`lgb.plot.importance`'
      - '`lgb.plot.interpretation`'
      - '`print.lgb.Booster`'
      - '`summary.lgb.Booster`'
  - title: Multithreading Control
    desc: Manage degree of parallelism used by LightGBM
    contents:
      - '`getLGBMThreads`'
      - '`setLGBMThreads`'