
Commit 96add91

committed
mlb - tf tutorial updates
1 parent 47c145d commit 96add91

7 files changed

Lines changed: 314 additions & 237 deletions

File tree

tutorials/mlb-hxe-setup-tensorflow/mlb-hxe-setup-tensorflow.md

Lines changed: 88 additions & 50 deletions
Original file line number · Diff line number · Diff line change
@@ -22,7 +22,7 @@ Finally, you will learn how to configure your SAP HANA, express edition instance
2222
## Details
2323

2424
### Time to Complete
25-
**120 Min**.
25+
**20 to 120 Min**.
2626

2727
[ACCORDION-BEGIN [Info: ](SAP HANA External Machine Learning Library)]
2828

@@ -151,7 +151,7 @@ sudo su -l tmsadm
151151
[DONE]
152152
[ACCORDION-END]
153153

154-
[ACCORDION-BEGIN [Step 3: ](Install Required System Packages)]
154+
[ACCORDION-BEGIN [Step 3: ](Install TensorFlow Serving dependencies)]
155155

156156
In order to successfully compile TensorFlow Serving `ModelServer`, you will need to install a C/C++ compiler.
157157

@@ -163,6 +163,22 @@ By default, SAP HANA, express edition setup a Python 2.7 version, but this one d
163163

164164
Therefore, you will add these missing packages too.
165165

166+
#### For Debian or Ubuntu system:
167+
168+
If you are planning on running TensorFlow Serving on a `Debian` or `Ubuntu` system, you can simply follow the **Packages** section details from the [TensorFlow Serving setup instructions](https://www.tensorflow.org/serving/setup#packages).
169+
170+
Then, install the `virtualenv` package using the following command:
171+
172+
```shell
173+
sudo apt-get install virtualenv
174+
```
175+
176+
Then, make sure that you have a Python 2.7 installation with the required development packages:
177+
178+
```shell
179+
sudo apt-get install python-numpy python-dev python-pip python-wheel
180+
```
181+
166182
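As an optional sanity check (not part of the original instructions), you can report which interpreter and pip are on the path after the installs above:

```shell
# Optional sanity check: report the interpreter and pip versions if present
for tool in python pip; do
  command -v "$tool" >/dev/null 2>&1 && "$tool" --version || echo "$tool not found"
done
```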
#### For SUSE Linux Enterprise Server (including the SAP HANA, express edition VM):
167183

168184
First, you need to add the required software repositories and then install the package group.
@@ -311,15 +327,15 @@ sudo yum install \
311327
[DONE]
312328
[ACCORDION-END]
313329

314-
[ACCORDION-BEGIN [Step 4: ](Install TensorFlow ModelServer Pre-Requisites)]
330+
[ACCORDION-BEGIN [Step 4: ](Install TensorFlow Serving Python Pre-Requisites)]
315331

316332
Before installing the TensorFlow Serving `ModelServer`, you will need to install a set of pre-requisites.
317333

318334
First, you have to create and activate a Python Virtual Environment (named `tms`) using the following commands:
319335

320336
```shell
321337
cd ~/
322-
virtualenv --system-site-packages ~/tms
338+
virtualenv --python=python2.7 --system-site-packages ~/tms
323339
source ~/tms/bin/activate
324340
```
325341

@@ -329,40 +345,6 @@ Your terminal prompt should now look like the following:
329345
(tms) tmsadm@hxehost:~>
330346
```
331347

332-
#### Bazel:
333-
334-
`Bazel` is an open-source build and test tool similar to `Make`, `Maven`, and `Gradle`. It uses a human-readable, high-level build language.
335-
`Bazel` supports projects in multiple languages and builds outputs for multiple platforms.
336-
337-
TensorFlow uses `Bazel` for its compilation. You can find the `Bazel` installation instructions [online](https://docs.bazel.build/versions/master/install.html).
338-
339-
You can install `Bazel` 0.11.1 in a *user* mode using the following commands:
340-
341-
```shell
342-
cd ~/
343-
curl -L https://github.com/bazelbuild/bazel/releases/download/0.11.1/bazel-0.11.1-installer-linux-x86_64.sh -o ~/bazel-0.11.1-installer-linux-x86_64.sh
344-
chmod +x ~/bazel-0.11.1-installer-linux-x86_64.sh
345-
./bazel-0.11.1-installer-linux-x86_64.sh --user
346-
export PATH="$PATH:$HOME/bin"
347-
rm ~/bazel-0.11.1-installer-linux-x86_64.sh
348-
```
349-
350-
You can now check that `Bazel` 0.11.1 was properly installed using the following command:
351-
352-
```shell
353-
bazel version
354-
```
355-
356-
The output should look like the following:
357-
358-
```
359-
Build label: 0.11.1
360-
Build target: bazel-out/k8-opt/bin/src/main/java/com/google/devtools/build/lib/bazel/BazelServer_deploy.jar
361-
Build time: ...
362-
Build timestamp: ...
363-
Build timestamp as int: ...
364-
```
365-
366348
#### Pip:
367349

368350
Now, you need to check the Pip version using the following command:
@@ -413,6 +395,8 @@ exit()
413395

414396
A **Hello, TensorFlow!** message should be printed out.
415397

398+
#### TensorFlow Serving API:
399+
416400
And finally, the TensorFlow Serving API:
417401

418402
```shell
@@ -422,11 +406,50 @@ pip install tensorflow-serving-api
422406
[DONE]
423407
[ACCORDION-END]
424408

425-
[ACCORDION-BEGIN [Step 5: ](Compile TensorFlow ModelServer)]
409+
[ACCORDION-BEGIN [Step 5: ](Install TensorFlow Serving)]
410+
411+
As stated previously, TensorFlow Serving `ModelServer` installable binaries are only available for the `Debian` and `Ubuntu` Linux distributions.
412+
413+
Therefore, on SUSE Linux Enterprise Server or Red Hat Enterprise Linux, you will need to compile the binary locally as detailed in this step.
414+
426415

427-
As stated previously, TensorFlow Serving `ModelServer` binaries are only available for `Debian` Linux distribution.
416+
If you are planning on running TensorFlow Serving on a `Debian` or `Ubuntu` system, you can simply follow the [TensorFlow Serving setup instructions](https://www.tensorflow.org/serving/setup#installing_using_apt-get) and move on to the next step.
428417

429-
Therefore, we will need to compile the binary locally for SUSE Linux Enterprise Server and Red Hat Enterprise Linux.
418+
#### Bazel:
419+
420+
`Bazel` is an open-source build and test tool similar to `Make`, `Maven`, and `Gradle`. It uses a human-readable, high-level build language.
421+
`Bazel` supports projects in multiple languages and builds outputs for multiple platforms.
422+
423+
TensorFlow uses `Bazel` for its compilation. You can find the `Bazel` installation instructions [online](https://docs.bazel.build/versions/master/install.html).
424+
425+
You can install `Bazel` 0.11.1 in a *user* mode using the following commands:
426+
427+
```shell
428+
cd ~/
429+
curl -L https://github.com/bazelbuild/bazel/releases/download/0.11.1/bazel-0.11.1-installer-linux-x86_64.sh -o ~/bazel-0.11.1-installer-linux-x86_64.sh
430+
chmod +x ~/bazel-0.11.1-installer-linux-x86_64.sh
431+
./bazel-0.11.1-installer-linux-x86_64.sh --user
432+
export PATH="$PATH:$HOME/bin"
433+
rm ~/bazel-0.11.1-installer-linux-x86_64.sh
434+
```
435+
436+
You can now check that `Bazel` 0.11.1 was properly installed using the following command:
437+
438+
```shell
439+
bazel version
440+
```
441+
442+
The output should look like the following:
443+
444+
```
445+
Build label: 0.11.1
446+
Build target: bazel-out/k8-opt/bin/src/main/java/com/google/devtools/build/lib/bazel/BazelServer_deploy.jar
447+
Build time: ...
448+
Build timestamp: ...
449+
Build timestamp as int: ...
450+
```
451+
452+
#### Clone TensorFlow Serving Git repository:
430453

431454
The first step is to clone the TensorFlow Serving `ModelServer` locally using the following commands:
432455

@@ -436,6 +459,8 @@ git clone --recurse-submodules https://github.com/tensorflow/serving
436459
cd ~/serving
437460
```
438461

462+
#### Compile TensorFlow Serving:
463+
439464
Then you will compile the source code with `Bazel` using the following command:
440465

441466
```shell
@@ -471,40 +496,53 @@ cd ~/serving
471496
bazel build -c opt //tensorflow_serving/model_servers:tensorflow_model_server
472497
```
473498

499+
#### Add TensorFlow Serving to the path:
500+
501+
In order to permanently add the TensorFlow Serving `ModelServer` executable to your user path, you will add the compiled binary directory path to your profile file:
502+
503+
```shell
504+
cd ~/
505+
echo 'export PATH=$PATH:/home/tmsadm/serving/bazel-bin/tensorflow_serving/model_servers/' >> ~/.profile
506+
source .profile
507+
```
508+
509+
Now you can call the TensorFlow Serving `ModelServer` executable directly, without prefixing it with its path.
510+
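The mechanism behind this step can be sketched in isolation. The snippet below uses a throwaway profile file (`/tmp/demo_profile`, a placeholder) so it can be run safely; the tutorial appends to `~/.profile` instead:

```shell
# Sketch of the PATH update using a throwaway profile file.
# BIN_DIR is the Bazel output directory from the previous step.
BIN_DIR=/home/tmsadm/serving/bazel-bin/tensorflow_serving/model_servers
PROFILE=/tmp/demo_profile
# \$PATH is escaped so the literal variable reference is written to the file
echo "export PATH=\$PATH:$BIN_DIR" > "$PROFILE"
. "$PROFILE"
# Confirm the directory is now part of PATH
case ":$PATH:" in
  *":$BIN_DIR:"*) echo "PATH updated" ;;
esac
```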
474511
[DONE]
475512
[ACCORDION-END]
476513

477-
[ACCORDION-BEGIN [Step 6: ](Start TensorFlow ModelServer)]
514+
[ACCORDION-BEGIN [Step 6: ](Start TensorFlow Serving)]
478515

479-
Now that your compilation is completed, we can start the TensorFlow Serving `ModelServer`.
516+
Create a model export directory where you will store your TensorFlow Serving configuration and exported models:
517+
518+
```shell
519+
mkdir -p ~/export
520+
```
480521

481522
First, create the model configuration file `/home/tmsadm/export/config.cnf` with the following content (an empty model list for now):
482523

483524
```js
484525
model_config_list: {
485526
}
486527
```
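Once you export a model, each served model gets its own `config` entry inside `model_config_list`. The entry below is a hypothetical illustration (the name and path are placeholders, not values from this tutorial):

```js
model_config_list: {
  config: {
    name: "my_model",
    base_path: "/home/tmsadm/export/my_model",
    model_platform: "tensorflow"
  }
}
```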
487-
488-
You can now start the TensorFlow Serving `ModelServer` using the following command:
528+
Now that your installation and configuration are complete, you can start TensorFlow Serving using the following command:
489529

490530
```shell
491-
cd ~/serving
492-
~/serving/bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=8500 --model_config_file=/home/tmsadm/export/config.cnf
531+
tensorflow_model_server --port=8500 --model_config_file=/home/tmsadm/export/config.cnf
493532
```
494533

495534
You can use the following command if you prefer to run it as a background process with all outputs redirected:
496535

497536
```shell
498-
cd ~/serving
499-
nohup ~/serving/bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server -port=8500 --model_config_file=/home/tmsadm/export/config.cnf > /home/tmsadm/export/tensorflow_model_server.log 2>&1 </dev/null &
537+
nohup tensorflow_model_server --port=8500 --model_config_file=/home/tmsadm/export/config.cnf > /home/tmsadm/export/tensorflow_model_server.log 2>&1 </dev/null &
500538
```
501539
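When running the server in the background, you can inspect the redirected log and stop the process manually. This is a sketch (the log path is the one used above; `pgrep` and GNU `xargs -r` are assumed to be available on your distribution):

```shell
LOG=/home/tmsadm/export/tensorflow_model_server.log
# Inspect the last lines of the redirected server log, if present
if [ -f "$LOG" ]; then
  tail -n 20 "$LOG"
fi
# Stop the background ModelServer manually; "xargs -r" is a GNU extension
# that skips running kill when pgrep finds no matching process
if command -v pgrep >/dev/null 2>&1; then
  pgrep -f tensorflow_model_server | xargs -r kill
fi
```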

502540
[DONE]
503541
[ACCORDION-END]
504542

505543
[ACCORDION-BEGIN [Step 7: ](Configure SAP HANA External Machine Learning)]
506544

507-
Now that the TensorFlow Serving `ModelServer` is up and running, you will need to add its configuration to your SAP HANA, express edition instance.
545+
Now that the TensorFlow Serving `ModelServer` is up and running, you will need to add its configuration to your SAP HANA, express edition instance.
508546

509547
Before moving forward with the EML configuration, you need to grant the proper role to the `ML_USER` created during the [Prepare your SAP HANA, express edition instance for Machine Learning](https://www.sap.com/developer/tutorials/mlb-hxe-setup-basic.html) tutorial.
510548

tutorials/mlb-hxe-tensorflow-image-retraining/mlb-hxe-tensorflow-image-retraining.md

Lines changed: 3 additions & 7 deletions
Original file line number · Diff line number · Diff line change
@@ -17,7 +17,7 @@ As part of the [TensorFlow Hub](https://www.tensorflow.org/hub/), a library to f
1717

1818
You will reuse a pre-trained image recognition model from the [How to Retrain an Image Classifier for New Categories](https://www.tensorflow.org/tutorials/image_retraining) tutorial from the TensorFlow website.
1919

20-
However, this script currently includes a step to export the model using a 3 dimensions shape as the input which is not supported by the SAP HANA External Machine Learning integration .
20+
However, this script currently includes a step to export the model using a 3-dimensional shape as input, which is not supported by the SAP HANA External Machine Learning integration.
2121

2222
Therefore, you will use a script that will save the retrained model with the raw image blob as input.
2323

@@ -87,7 +87,6 @@ The scripts that you will leverage requires the TensorFlow Hub Python package, t
8787

8888
```shell
8989
pip install tensorflow_hub
90-
pip install tensorflow-serving-api
9190
```
9291

9392
Next, you can download the retrain script:
@@ -255,7 +254,6 @@ def main(unused_argv):
255254
legacy_init_op = legacy_init_op,
256255
)
257256
builder.save(as_text=False)
258-
session.close()
259257

260258
if __name__ == '__main__':
261259
tf.app.run()
@@ -444,15 +442,13 @@ You can now start the TensorFlow Serving `ModelServer` using the following comma
444442
> **Note:** As of the publication of this tutorial, there is no ***graceful*** shutdown command for the TensorFlow Serving `ModelServer`. Therefore you will need to kill the process manually.
445443
446444
```shell
447-
cd ~
448-
~/serving/bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --model_config_file=./export/config.cnf
445+
tensorflow_model_server --model_config_file=./export/config.cnf
449446
```
450447

451448
You can use the following command if you prefer to run it as a background process with all outputs redirected to a log file:
452449

453450
```shell
454-
cd ~/
455-
nohup ~/serving/bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --model_config_file=./export/config.cnf > ./tensorflow_model_server.log 2>&1 </dev/null &
451+
nohup tensorflow_model_server --model_config_file=./export/config.cnf > ./tensorflow_model_server.log 2>&1 </dev/null &
456452
```
457453

458454
[DONE]
