Commit b6ea1da
Author: GitHub Actions
Matthias Feurer: Fix documentation links (#1048)
1 parent: 8eb2a24

File tree: 211 files changed (+2181, -1471 lines)


develop/_downloads/006bd0fc770c8918e54252890f1e023e/study_tutorial.py

Lines changed: 1 addition & 0 deletions
@@ -25,6 +25,7 @@
 # connects to the test server at test.openml.org before doing so.
 # This prevents the crowding of the main server with example datasets,
 # tasks, runs, and so on.
+#
 ############################################################################
 
 
develop/_downloads/0a5da6cf0947c30e6ebb0b171dfc1b5a/configure_logging.py

Lines changed: 1 addition & 3 deletions
@@ -6,8 +6,6 @@
 Explains openml-python logging, and shows how to configure it.
 """
 ##################################################################################
-# Logging
-# ^^^^^^^
 # Openml-python uses the `Python logging module <https://docs.python.org/3/library/logging.html>`_
 # to provide users with log messages. Each log message is assigned a level of importance, see
 # the table in Python's logging tutorial
@@ -16,7 +14,7 @@
 # By default, openml-python will print log messages of level `WARNING` and above to console.
 # All log messages (including `DEBUG` and `INFO`) are also saved in a file, which can be
 # found in your cache directory (see also the
-# `introduction tutorial <../20_basic/introduction_tutorial.html>`_).
+# :ref:`sphx_glr_examples_20_basic_introduction_tutorial.py`).
 # These file logs are automatically deleted if needed, and use at most 2MB of space.
 #
 # It is possible to configure what log levels to send to console and file.
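The hunk above only changes a cross-reference, but the behavior it documents (a strict console threshold alongside a verbose file log) can be sketched with the standard library alone. A minimal, hypothetical sketch assuming openml-python logs under the logger name `openml`; the handler setup here is illustrative, not openml-python's own configuration code:

```python
import logging

# Grab the library's logger and let all records through to the handlers;
# each handler then applies its own threshold.
logger = logging.getLogger("openml")
logger.setLevel(logging.DEBUG)

# Console: only WARNING and above, mirroring the documented default.
console = logging.StreamHandler()
console.setLevel(logging.WARNING)
logger.addHandler(console)

# File: everything, including DEBUG and INFO, like the cache-directory log.
file_handler = logging.FileHandler("openml.log")
file_handler.setLevel(logging.DEBUG)
logger.addHandler(file_handler)
```

Per-handler levels are what let one logger feed two destinations with different verbosity, which is the configuration the tutorial describes.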

develop/_downloads/0d1d1c06933bd6d32bda534cb0aa0e53/create_upload_tutorial.py

Lines changed: 7 additions & 7 deletions
@@ -67,7 +67,7 @@
 "Robert Tibshirani (2004) (Least Angle Regression) "
 "Annals of Statistics (with discussion), 407-499"
 )
-paper_url = "http://web.stanford.edu/~hastie/Papers/LARS/LeastAngle_2002.pdf"
+paper_url = "https://web.stanford.edu/~hastie/Papers/LARS/LeastAngle_2002.pdf"
 
 ############################################################################
 # Create the dataset object
@@ -110,7 +110,7 @@
 data=data,
 # A version label which is provided by the user.
 version_label="test",
-original_data_url="http://www4.stat.ncsu.edu/~boos/var.select/diabetes.html",
+original_data_url="https://www4.stat.ncsu.edu/~boos/var.select/diabetes.html",
 paper_url=paper_url,
 )
 
@@ -126,7 +126,7 @@
 # OrderedDicts in the case of sparse data.
 #
 # Weather dataset:
-# http://storm.cis.fordham.edu/~gweiss/data-mining/datasets.html
+# https://storm.cis.fordham.edu/~gweiss/data-mining/datasets.html
 
 data = [
 ["sunny", 85, 85, "FALSE", "no"],
@@ -200,8 +200,8 @@
 # storing the type of data for each column as well as the attribute names.
 # Therefore, when providing a Pandas DataFrame, OpenML can infer this
 # information without needing to explicitly provide it when calling the
-# function :func:`create_dataset`. In this regard, you only need to pass
-# ``'auto'`` to the ``attributes`` parameter.
+# function :func:`openml.datasets.create_dataset`. In this regard, you only
+# need to pass ``'auto'`` to the ``attributes`` parameter.
 
 df = pd.DataFrame(data, columns=[col_name for col_name, _ in attribute_names])
 # enforce the categorical column to have a categorical dtype
@@ -214,8 +214,8 @@
 # We enforce the column 'outlook' and 'play' to be a categorical
 # dtype while the column 'windy' is kept as a boolean column. 'temperature'
 # and 'humidity' are kept as numeric columns. Then, we can
-# call :func:`create_dataset` by passing the dataframe and fixing the parameter
-# ``attributes`` to ``'auto'``.
+# call :func:`openml.datasets.create_dataset` by passing the dataframe and
+# fixing the parameter ``attributes`` to ``'auto'``.
 
 weather_dataset = create_dataset(
 name="Weather",
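Beyond the renamed function reference, the pattern these hunks describe — preparing a DataFrame whose dtypes carry the attribute information, then passing ``'auto'`` as ``attributes`` — can be sketched as follows. A minimal sketch assuming pandas; the rows mirror the tutorial's weather data, and the final ``openml.datasets.create_dataset`` call is left as a comment since it needs the openml package and a server:

```python
import pandas as pd

# Rows follow the tutorial's weather layout:
# outlook, temperature, humidity, windy, play
data = [
    ["sunny", 85, 85, "FALSE", "no"],
    ["overcast", 83, 86, "FALSE", "yes"],
    ["rainy", 70, 96, "FALSE", "yes"],
]
df = pd.DataFrame(data, columns=["outlook", "temperature", "humidity", "windy", "play"])

# Enforce dtypes so attribute inference sees the intended types:
# nominal columns become categorical, 'windy' becomes a real boolean,
# 'temperature' and 'humidity' stay numeric.
df["outlook"] = df["outlook"].astype("category")
df["play"] = df["play"].astype("category")
df["windy"] = df["windy"].map({"TRUE": True, "FALSE": False}).astype(bool)

# With the dtypes in place, the tutorial's upload reduces to (not run here):
# weather_dataset = openml.datasets.create_dataset(
#     name="Weather", data=df, attributes="auto", ...)
```

The point of the dtype enforcement is that ``attributes="auto"`` reads the column types straight off the DataFrame, so nothing needs to be declared twice.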

develop/_downloads/1ad3e73844b6bba64db63f52045fc1ae/suites_tutorial.py

Lines changed: 2 additions & 1 deletion
@@ -6,7 +6,7 @@
 How to list, download and upload benchmark suites.
 
 If you want to learn more about benchmark suites, check out our
-`brief introductory tutorial <../20_basic/simple_suites_tutorial.html>`_ or the
+brief introductory tutorial :ref:`sphx_glr_examples_20_basic_simple_suites_tutorial.py` or the
 `OpenML benchmark docs <https://docs.openml.org/benchmark/#benchmarking-suites>`_.
 """
 ############################################################################
@@ -24,6 +24,7 @@
 # connects to the test server at test.openml.org before doing so.
 # This prevents the main server from crowding with example datasets,
 # tasks, runs, and so on.
+#
 ############################################################################
 
 

develop/_downloads/25a00d3d6385de3b0fbf8dd033ff9db0/simple_suites_tutorial.py

Lines changed: 3 additions & 4 deletions
@@ -62,7 +62,6 @@
 # Further examples
 # ================
 #
-# * `Advanced benchmarking suites tutorial <../30_extended/suites_tutorial.html>`_
-# * `Benchmarking studies tutorial <../30_extended/study_tutorial.html>`_
-# * `Using studies to compare linear and non-linear classifiers
-# <../40_paper/2018_ida_strang_example.html>`_
+# * :ref:`sphx_glr_examples_30_extended_suites_tutorial.py`
+# * :ref:`sphx_glr_examples_30_extended_study_tutorial.py`
+# * :ref:`sphx_glr_examples_40_paper_2018_ida_strang_example.py`
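This hunk shows the substitution made throughout the commit: relative HTML links are replaced with the anchor labels that sphinx-gallery generates from each example's path, so Sphinx resolves the target wherever the page ends up being rendered. Schematically (the label is derived from the path ``examples/30_extended/suites_tutorial.py``):

```rst
.. Before: a relative HTML link, which breaks when the output layout changes.

`Advanced benchmarking suites tutorial <../30_extended/suites_tutorial.html>`_

.. After: sphinx-gallery's auto-generated label, resolved by Sphinx itself.

:ref:`sphx_glr_examples_30_extended_suites_tutorial.py`
```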

develop/_downloads/27f49b0e36fba2fe65360adcf060e098/2015_neurips_feurer_example.ipynb

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"\n# Feurer et al. (2015)\n\nA tutorial on how to get the datasets used in the paper introducing *Auto-sklearn* by Feurer et al..\n\nAuto-sklearn website: https://automl.github.io/auto-sklearn/master/\n\n## Publication\n\n| Efficient and Robust Automated Machine Learning\n| Matthias Feurer, Aaron Klein, Katharina Eggensperger, Jost Springenberg, Manuel Blum and Frank Hutter\n| In *Advances in Neural Information Processing Systems 28*, 2015\n| Available at http://papers.nips.cc/paper/5872-efficient-and-robust-automated-machine-learning.pdf\n"
+"\n# Feurer et al. (2015)\n\nA tutorial on how to get the datasets used in the paper introducing *Auto-sklearn* by Feurer et al..\n\nAuto-sklearn website: https://automl.github.io/auto-sklearn/master/\n\n## Publication\n\n| Efficient and Robust Automated Machine Learning\n| Matthias Feurer, Aaron Klein, Katharina Eggensperger, Jost Springenberg, Manuel Blum and Frank Hutter\n| In *Advances in Neural Information Processing Systems 28*, 2015\n| Available at https://papers.nips.cc/paper/5872-efficient-and-robust-automated-machine-learning.pdf\n"
 ]
 },
 {

develop/_downloads/296bc5731c400ca6e06e54ecb9b84b5c/configure_logging.ipynb

Lines changed: 1 addition & 1 deletion
@@ -22,7 +22,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Logging\nOpenml-python uses the `Python logging module <https://docs.python.org/3/library/logging.html>`_\nto provide users with log messages. Each log message is assigned a level of importance, see\nthe table in Python's logging tutorial\n`here <https://docs.python.org/3/howto/logging.html#when-to-use-logging>`_.\n\nBy default, openml-python will print log messages of level `WARNING` and above to console.\nAll log messages (including `DEBUG` and `INFO`) are also saved in a file, which can be\nfound in your cache directory (see also the\n`introduction tutorial <../20_basic/introduction_tutorial.html>`_).\nThese file logs are automatically deleted if needed, and use at most 2MB of space.\n\nIt is possible to configure what log levels to send to console and file.\nWhen downloading a dataset from OpenML, a `DEBUG`-level message is written:\n\n"
+"Openml-python uses the `Python logging module <https://docs.python.org/3/library/logging.html>`_\nto provide users with log messages. Each log message is assigned a level of importance, see\nthe table in Python's logging tutorial\n`here <https://docs.python.org/3/howto/logging.html#when-to-use-logging>`_.\n\nBy default, openml-python will print log messages of level `WARNING` and above to console.\nAll log messages (including `DEBUG` and `INFO`) are also saved in a file, which can be\nfound in your cache directory (see also the\n`sphx_glr_examples_20_basic_introduction_tutorial.py`).\nThese file logs are automatically deleted if needed, and use at most 2MB of space.\n\nIt is possible to configure what log levels to send to console and file.\nWhen downloading a dataset from OpenML, a `DEBUG`-level message is written:\n\n"
 ]
 },
 {

develop/_downloads/2fc23bfc18345b110ab68bc5f3939dc8/2018_neurips_perrone_example.py

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@
 | Scalable Hyperparameter Transfer Learning
 | Valerio Perrone and Rodolphe Jenatton and Matthias Seeger and Cedric Archambeau
 | In *Advances in Neural Information Processing Systems 31*, 2018
-| Available at http://papers.nips.cc/paper/7917-scalable-hyperparameter-transfer-learning.pdf
+| Available at https://papers.nips.cc/paper/7917-scalable-hyperparameter-transfer-learning.pdf
 
 This example demonstrates how OpenML runs can be used to construct a surrogate model.

develop/_downloads/4076733b22158deda2a79e57d217b001/2018_kdd_rijn_example.ipynb

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"\n# van Rijn and Hutter (2018)\n\nA tutorial on how to reproduce the paper *Hyperparameter Importance Across Datasets*.\n\nThis is a Unix-only tutorial, as the requirements can not be satisfied on a Windows machine (Untested on other\nsystems).\n\n## Publication\n\n| Hyperparameter importance across datasets\n| Jan N. van Rijn and Frank Hutter\n| In *Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining*, 2018\n| Available at https://dl.acm.org/citation.cfm?id=3220058\n"
+"\n# van Rijn and Hutter (2018)\n\nA tutorial on how to reproduce the paper *Hyperparameter Importance Across Datasets*.\n\nThis is a Unix-only tutorial, as the requirements can not be satisfied on a Windows machine (Untested on other\nsystems).\n\n## Publication\n\n| Hyperparameter importance across datasets\n| Jan N. van Rijn and Frank Hutter\n| In *Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining*, 2018\n| Available at https://dl.acm.org/doi/10.1145/3219819.3220058\n"
 ]
 },
 {

develop/_downloads/42ecf9b9ca30a385452934aeb1a420d5/2018_neurips_perrone_example.ipynb

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"\n# Perrone et al. (2018)\n\nA tutorial on how to build a surrogate model based on OpenML data as done for *Scalable\nHyperparameter Transfer Learning* by Perrone et al..\n\n## Publication\n\n| Scalable Hyperparameter Transfer Learning\n| Valerio Perrone and Rodolphe Jenatton and Matthias Seeger and Cedric Archambeau\n| In *Advances in Neural Information Processing Systems 31*, 2018\n| Available at http://papers.nips.cc/paper/7917-scalable-hyperparameter-transfer-learning.pdf\n\nThis example demonstrates how OpenML runs can be used to construct a surrogate model.\n\nIn the following section, we shall do the following:\n\n* Retrieve tasks and flows as used in the experiments by Perrone et al. (2018).\n* Build a tabular data by fetching the evaluations uploaded to OpenML.\n* Impute missing values and handle categorical data before building a Random Forest model that\n  maps hyperparameter values to the area under curve score.\n"
+"\n# Perrone et al. (2018)\n\nA tutorial on how to build a surrogate model based on OpenML data as done for *Scalable\nHyperparameter Transfer Learning* by Perrone et al..\n\n## Publication\n\n| Scalable Hyperparameter Transfer Learning\n| Valerio Perrone and Rodolphe Jenatton and Matthias Seeger and Cedric Archambeau\n| In *Advances in Neural Information Processing Systems 31*, 2018\n| Available at https://papers.nips.cc/paper/7917-scalable-hyperparameter-transfer-learning.pdf\n\nThis example demonstrates how OpenML runs can be used to construct a surrogate model.\n\nIn the following section, we shall do the following:\n\n* Retrieve tasks and flows as used in the experiments by Perrone et al. (2018).\n* Build a tabular data by fetching the evaluations uploaded to OpenML.\n* Impute missing values and handle categorical data before building a Random Forest model that\n  maps hyperparameter values to the area under curve score.\n"
 ]
 },
 {
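The notebook summary in this file lists the modeling steps: impute missing values, encode categorical hyperparameters, and fit a Random Forest that maps hyperparameter values to AUC. A minimal sketch of that pipeline, assuming scikit-learn and pandas, with a synthetic stand-in table (the column names and values are illustrative, not fetched from OpenML):

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestRegressor
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# One row per run: hyperparameter settings plus the observed AUC.
runs = pd.DataFrame({
    "C": [0.1, 1.0, None, 0.5],           # missing value to impute
    "gamma": [0.01, 0.1, 1.0, 0.1],
    "kernel": ["rbf", "linear", "rbf", "poly"],
})
auc = [0.81, 0.74, 0.79, 0.88]

# Numeric columns get mean imputation; the categorical column is one-hot
# encoded; a Random Forest regresses AUC on the transformed features.
surrogate = Pipeline([
    ("prep", ColumnTransformer([
        ("num", SimpleImputer(strategy="mean"), ["C", "gamma"]),
        ("cat", OneHotEncoder(handle_unknown="ignore"), ["kernel"]),
    ])),
    ("rf", RandomForestRegressor(n_estimators=50, random_state=0)),
])
surrogate.fit(runs, auc)
predicted_auc = surrogate.predict(runs)
```

Because the forest averages observed targets, the surrogate's predictions stay within the range of the AUC values it was trained on; the real tutorial feeds it evaluations downloaded from OpenML instead of this toy table.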
