
Commit 5a23d1b

small edits (#18748)

1 parent 189d96c commit 5a23d1b

4 files changed

Lines changed: 45 additions & 10 deletions


tutorials/ai-core-data/ai-core-data.md

Lines changed: 38 additions & 5 deletions
@@ -28,7 +28,7 @@ By the end of the tutorial you will have two models trained on two different dat
[ACCORDION-BEGIN [Step 1: ](Modify AI code)]

- Create a new directory named `hello-aicore-data`.
+ Create a new directory named `hello-aicore-data`. The code differs from the [previous tutorial](https://developers.sap.com/tutorials/ai-core-code.html/#) in that it reads data from a folder (a volume, or virtual storage space) whose contents are loaded dynamically during workflow execution.

Create a file named `main.py`, and paste the following snippet there:

@@ -72,6 +72,8 @@ pickle.dump(clf, open(MODEL_PATH, 'wb'))
Your code reads the data file `train.csv` from the location `/app/data`, which will be prepared in a later step. It also reads the hyper-parameter `DT_MAX_DEPTH` from an environment variable. When generated, your model will be stored in the location `/app/model/`. You will also learn how to transport this code from SAP AI Core to your own cloud storage.

+ > **Recommendation**: Although the dataset file `train.csv` is not present yet, it will be copied dynamically during execution to the volume mounted at `/app/data`. It's recommended to pass the filename (`train.csv`) to your code through an environment variable, so that if the dataset filename changes you can set it dynamically.
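Put together, a training script along these lines would match that description (a sketch only: the column name `target` and the `DATA_FILENAME` environment variable are illustrative, not the tutorial's exact code):

```python
import os
import pickle

import pandas as pd
from sklearn.tree import DecisionTreeRegressor

DATA_DIR = "/app/data"               # volume prepared by SAP AI Core at execution
MODEL_PATH = "/app/model/model.pkl"  # the generated model is stored here

def train(data_dir=DATA_DIR, model_path=MODEL_PATH):
    # Filename and hyper-parameter both arrive through environment variables
    filename = os.environ.get("DATA_FILENAME", "train.csv")
    max_depth = int(os.environ.get("DT_MAX_DEPTH", "3"))
    df = pd.read_csv(os.path.join(data_dir, filename))
    X, y = df.drop(columns=["target"]), df["target"]  # "target" column is assumed
    clf = DecisionTreeRegressor(max_depth=max_depth)
    clf.fit(X, y)
    pickle.dump(clf, open(model_path, "wb"))
    return clf
```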
!![image](img/code-main.png)

Create a file named `requirements.txt` as shown below. Here, if you don't specify a particular version, as shown for `pandas`, then the latest version of the package is fetched automatically.
@@ -164,14 +166,19 @@ spec:
  - "python /app/src/main.py"
```

- ### Understanding your workflow.
+ ### Understanding changes in your workflow
+
+ This change to your workflow creates a placeholder through which you can specify a data path (volume) to the container (the Docker image in execution).

!![image](img/pipeline.png)

1. A placeholder named `housedataset` is created.
2. You specify the **kind of artifact** that the placeholder can accept. **Artifacts** are covered in detail later in this tutorial.
3. You use the placeholder to specify the **path that you created in your Dockerfile**, which is where you will copy files to your Docker image.

+ #### Why do we need to create placeholders?
+ SAP AI Core uses your workflows only as an interface, so it is unaware of the volumes or attachments specified in your Docker image. The data path is specified in your Dockerfile, given a placeholder in your workflow, and the data is then supplied to the Docker image at execution time.

[DONE]
[ACCORDION-END]

@@ -210,6 +217,7 @@ spec:
  - - name: mypredictor
      template: mycodeblock1
  - name: mycodeblock1
+   # Add your resource plan here. The annotation should follow metadata > labels > ai.sap.com/resourcePlan: <plan>
    inputs:
      artifacts: # placeholder for cloud storage attachments
        - name: housedataset # a name for the placeholder
@@ -228,7 +236,7 @@ The following shows the new important lines in the workflows.
!![image](img/pipeline2.png)

- ##*#Understanding these changes
+ #### Understanding these changes

1. A placeholder named `DT_MAX_DEPTH` is created locally in the workflow. It accepts numeric content of input type `string`. The input is then type cast to an integer elsewhere in your code, so it must be a string containing an integer. For example, ```"4"``` is acceptable because it is a string whose content can be type cast to an integer.
2. You create an input `env` (environment) variable for your Docker image, named `DT_MAX_DEPTH`. The value of this variable is fed in from `workflow.parameters.DT_MAX_DEPTH`, the local name from the previous point.
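Inside the container, that value arrives as a plain string in the environment and is cast in the Python code, roughly:

```python
import os

# In a real execution the workflow sets this; here it is set by hand for illustration
os.environ["DT_MAX_DEPTH"] = "4"

max_depth = int(os.environ["DT_MAX_DEPTH"])  # the string "4" becomes the integer 4
```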
@@ -239,6 +247,29 @@ Commit the changes in the GitHub.
[ACCORDION-END]


[ACCORDION-BEGIN [Step 3: ](Set resource plan)]

Add the following snippet to your workflow to specify a resource plan. The resource plan specifies the computing resources required to run your Docker image, which include GPU, RAM, and processor. If not specified, the resource plan defaults to `starter`, the entry-level [resource plan](https://help.sap.com/docs/AI_CORE/2d6c5984063c40a59eda62f4a9135bee/57f4f19d9b3b46208ee1d72017d0eab6.html?locale=en-US).

```BASH[6-8]
spec:
  ...
  templates:
    ...
    - name: mycodeblock1
      metadata:
        labels:
          ai.sap.com/resourcePlan: starter
      ...
```

**INFORMATION**: You can always verify the computing resources allocated by running `echo $(lscpu)` within your Docker image; `lscpu` is the Linux shell command that prints the system configuration.
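If your code is Python, a rough equivalent of inspecting `lscpu` from inside the running image is the following standard-library check (the exact figures depend on the resource plan; GPU details are not visible this way):

```python
import os
import platform

# Rough, portable view of the allocated compute
print("architecture:", platform.machine())
print("logical CPUs:", os.cpu_count())
```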
[DONE]
[ACCORDION-END]

[ACCORDION-BEGIN [Step 4: ](Observe your scenario and placeholder)]
[OPTION BEGIN [SAP AI Launchpad]]
@@ -358,7 +389,7 @@ You need to create AWS S3 object store, using one of the following links:
Download and install the [AWS Command Line Interface (CLI)](https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html).

- Open your terminal and run:
+ To configure settings for your AWS CLI, open your terminal and run:

```BASH
aws configure
```
@@ -368,6 +399,8 @@ aws configure
Enter your AWS credentials. Note that the appearance of the screen will not change as you type. You can leave the `Default output format` entry blank. Press **enter** to submit your credentials.

+ Your credentials are stored on your system and used by the AWS CLI to interact with AWS. For more information, see [Configuring the AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html).

[DONE]
[ACCORDION-END]
@@ -506,7 +539,7 @@ With your object store secret created, you can now reference any sub-folders to
[OPTION BEGIN [Postman]]

- Create an artifact for train.csv that we uploaded to jan folder, by clicking **POST Register artifact** and using the **body** underneath.
+ Create an artifact for the `train.csv` file that we uploaded to the `jan` folder, by clicking **POST Register artifact** and using the body underneath.

!![image](img/postman/artifact.png)
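As a sketch, the request body pairs an `ai://` URL (object-store secret name plus sub-folder) with artifact metadata. The secret name `mystorage`, the scenario ID, and the other values below are placeholders, so substitute your own:

```python
import json

# Illustrative body for the "POST Register artifact" call; all values are placeholders
payload = {
    "name": "housepricedataset",         # a name for the artifact
    "kind": "dataset",                   # the kind of artifact
    "url": "ai://mystorage/jan",         # object-store secret + folder holding train.csv
    "scenarioId": "learning-datalines",  # your scenario's ID
    "description": "Train dataset for house price prediction",
}
print(json.dumps(payload, indent=2))
```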

tutorials/ai-core-deploy/ai-core-deploy.md

Lines changed: 3 additions & 1 deletion
@@ -102,11 +102,13 @@ if __name__ == "__main__":
#### Understanding your code

Where should you load your model from?

- Your code reads files from the folder `/mnt/models`. This folder path is hard-coded in SAP AI Core and cannot be modified.
- Later, you will dynamically place your model file in the path `/mnt/models`.
- You may place multiple files inside `/mnt/models` as part of your model. These files may have multiple formats, such as `.py` or `.pickle`; however, you should not create sub-directories within it.
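A minimal loader along these lines matches the points above (the filename `model.pkl` is illustrative; load whichever files your training step produced):

```python
import os
import pickle

MODEL_DIR = "/mnt/models"  # hard-coded serving path in SAP AI Core

def load_model(model_dir=MODEL_DIR, filename="model.pkl"):
    # No sub-directories: model files sit directly inside /mnt/models
    with open(os.path.join(model_dir, filename), "rb") as f:
        return pickle.load(f)
```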
Which serving engine to use?

- Your code uses Flask to create a server; however, you may use another Python library if you prefer.
- The format of your prediction REST calls will depend on the implementation of this deployment server.
- You implement the endpoint `/v2/predict` to make predictions. You may modify the endpoint name and format, but each endpoint must have the prefix `/v<NUMBER>`. For example, if you want to create an endpoint to greet your server, the endpoint implementation should be `/v2/greet` or `/v1/greet`.
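A bare-bones Flask server honouring those constraints might look like this (a sketch: the request format, model filename, and dummy fallback prediction are assumptions, not the tutorial's exact code):

```python
import os
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

MODEL_PATH = "/mnt/models/model.pkl"  # filename is illustrative
model = pickle.load(open(MODEL_PATH, "rb")) if os.path.exists(MODEL_PATH) else None

@app.route("/v2/predict", methods=["POST"])  # endpoint carries the required /v<NUMBER> prefix
def predict():
    features = request.get_json()["features"]
    # Dummy fallback when no model is mounted, so the sketch stays self-contained
    prediction = model.predict([features]).tolist() if model else [0]
    return jsonify({"prediction": prediction})

# In the container you would start the server, e.g.:
# app.run(host="0.0.0.0", port=9001)
```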
@@ -442,7 +444,7 @@ Switching between deployed models means that you can update the model used in yo
[OPTION BEGIN [SAP AI Launchpad]]

- To create a new configuration, click **ML Operations > Configuration > Create**. Enter the following details amd click **Next**.
+ To create a new configuration, click **ML Operations > Configuration > Create**. Enter the following details and click **Next**.

!![configuration create](img/ail/config21.jpg)

tutorials/ai-core-helloworld/ai-core-helloworld.md

Lines changed: 1 addition & 1 deletion
@@ -217,7 +217,7 @@ The code first takes takes a public [docker image of python](https://hub.docker.
!![image](img/explain_3.png)

> ### What is a Docker Image?
- > A Docker Image is a portable Linux environment, similar to a virtual machine. Docker images are layered environments, which means you may just have Linux OS (for example Distrom) as one Docker image or another Docker image which has python layered on top of that Linux.
+ > A Docker Image is a portable Linux environment, similar to a virtual machine. Docker images are layered environments, which means you may just have Linux OS (for example `Distrom`) as one Docker image or another Docker image which has python layered on top of that Linux.
>
> While the code in this tutorial is written directly in the workflow, in actual production you will store the code scripts within your Docker Image. The number of code files and programming language are your preferences.

tutorials/ai-core-metrics/ai-core-metrics.md

Lines changed: 3 additions & 3 deletions
@@ -240,7 +240,7 @@ After execution, you can see this in SAP AI Launchpad.
[DONE]
[ACCORDION-END]

- [ACCORDION-BEGIN [Step 6: ](Add tags for execution meta after training)]
+ [ACCORDION-BEGIN [Step 7: ](Add tags for execution meta after training)]

Add the following snippet to tag your execution. The `tags` are customizable key-values.

@@ -260,7 +260,7 @@ After execution, you can see this in SAP AI Launchpad.
[DONE]
[ACCORDION-END]

- [ACCORDION-BEGIN [Step 7: ](Complete files)]
+ [ACCORDION-BEGIN [Step 8: ](Complete files)]

Check your modified `main.py` by comparing it with the following expected `main.py`.

@@ -371,7 +371,7 @@ Check your modified `requirements.txt` by comparing it with the following expect
```TEXT
sklearn==0.0
pandas
- ai-core-sdk>=1.12.0
+ ai-core-sdk>=1.15.1
```

Create a file called `Dockerfile` with the following snippet. This file must not have a file extension or alternative name.
