tutorials/hana-cloud-dl-clients-dot-net/hana-cloud-dl-clients-dot-net.md
2 additions & 0 deletions
@@ -81,6 +81,8 @@ In order for the shell to recognize that the .NET SDK is installed and for any `
 </Reference>
 </ItemGroup>
 ```
+
+Note that if the developer licensed version of the data lake client was installed for Linux, the path might be similar to `/home/dan/sap/hdlclient/sdk/dotnet/Sap.Data.SQLAnywhere.Core.v2.1.dll`.
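Because the DLL can live under either install root, a small shell helper can probe for it before the path is written into a project file. This is an illustrative sketch, not part of the tutorial; the root paths and the DLL name are taken from the note above and may differ on your system.

```shell
# Hypothetical helper: given a data lake client install root, print the
# .NET driver DLL path if it exists (DLL name taken from the note above).
find_dotnet_driver() {
  dll="$1/sdk/dotnet/Sap.Data.SQLAnywhere.Core.v2.1.dll"
  [ -e "$dll" ] && echo "$dll"
}

# e.g. for a developer licensed install under the home directory:
# find_dotnet_driver "$HOME/sap/hdlclient"
```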
tutorials/hana-cloud-dl-clients-golang/hana-cloud-dl-clients-golang.md
11 additions & 6 deletions
@@ -30,7 +30,7 @@ go version
 
-If Go is installed, then it will return the currently installed version, such as 1.21.5
+If Go is installed, then it will return the currently installed version, such as 1.23.4
 
 If it is not installed, download it from [Download Go](https://golang.org/dl/), run the installer, follow the provided instructions, and ensure that Go is in your path.
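The version check above can be wrapped so it degrades gracefully when the toolchain is absent. A minimal sketch (the helper name `go_status` is made up for illustration):

```shell
# Report the Go toolchain status; prints the installed version when
# `go` is on the PATH, otherwise a short notice.
go_status() {
  if command -v go >/dev/null 2>&1; then
    go version
  else
    echo "Go not installed"
  fi
}
go_status
```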
@@ -86,20 +86,23 @@ The Go driver loads the SQLDBC library named `libdbcapiHDB` using [cgo](https:/
 >It is also possible on Microsoft Windows to set this using the SETX command from a shell.
 
-On Linux, open the '.bash_profile' and add the following lines.
-
+On Linux, open the '.bashrc' or '.bash_profile' and first check for, and if needed add, the following lines. Note that the path may be different depending on the data lake client install used.
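The "check first, then add" advice above amounts to an idempotent append. A sketch of the pattern, using a temp file as a stand-in for `~/.bashrc` and an assumed install path in the appended line; adapt both to your setup:

```shell
# Sketch of "check first, add only if missing" for a shell startup file.
# A temp file stands in for ~/.bashrc; the sourced path is an assumption.
rcfile="$(mktemp)"
line='source $HOME/sap/hdlclient/bin64/hdlclienv.sh'

append_once() {
  grep -qxF "$1" "$2" || echo "$1" >> "$2"
}

append_once "$line" "$rcfile"
append_once "$line" "$rcfile"   # second call is a no-op
```

Using `grep -qxF` keeps repeated shell sessions from stacking up duplicate lines in the startup file.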
tutorials/hana-cloud-dl-clients-odbc/hana-cloud-dl-clients-odbc.md
2 additions & 1 deletion
@@ -88,11 +88,12 @@ For additional details see [Connection Properties](https://help.sap.com/viewer/a
 pico .odbc.ini
 ```
 
-4. Configure the values of `driver` and `host` so that they conform with your setup.
+4. Configure the values of `driver` and `host` so that they conform with your setup. Note that with the developer licensed version of the data lake client, the driver path below is slightly different.
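To make the step concrete, here is a hypothetical `.odbc.ini` data source written from the shell. Every value is a placeholder: the driver file name follows SQL Anywhere's naming convention and is an assumption, so substitute the library path from your own install and your instance's SQL endpoint.

```shell
# Write a minimal example data source definition (all values are
# placeholders; adjust driver and host to your setup).
f="$(mktemp)"
cat > "$f" <<'EOF'
[hdl]
driver=/home/user/sap/hdlclient/lib64/libdbodbc17.so
host=<instance id>.hdl.prod-<region>.hanacloud.ondemand.com:443
EOF
grep '^driver=' "$f"
```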
tutorials/hana-cloud-dl-clients-overview/hana-cloud-dl-clients-overview.md
99 additions & 4 deletions
@@ -32,16 +32,16 @@ SAP HANA Cloud is composed of multiple components.
 * SAP HANA is an in-memory, multi-model, column-based, relational database. For further details see [Introduction to SAP HANA Cloud](https://help.sap.com/docs/hana-cloud/sap-hana-cloud-getting-started-guide/introduction-to-sap-hana-cloud) and the tutorial mission [Use Clients to Query an SAP HANA Database](mission.hana-cloud-clients).
 
-* SAP HANA Cloud, data lake is composed of two components: data lake Relational Engine and data lake Files.
+* SAP HANA Cloud, data lake is composed of two components: a data lake Relational Engine and data lake Files.
 
-[Data Lake Relational Engine](https://help.sap.com/docs/hana-cloud-data-lake/welcome-guide/data-lake-relational-engine) is a disk-based, column-oriented relational database for storing and analyzing high volumes of infrequently updated data. It descends from [SAP IQ](https://help.sap.com/docs/SAP_IQ), which was previously named Sybase IQ. Because of its heritage, there are commonalities with other Sybase products. Some of the client interface drivers are shared with [SAP SQL Anywhere](https://help.sap.com/docs/SAP_SQL_Anywhere) and SAP Adaptive Server Enterprise.
+[Data Lake Relational Engine](https://help.sap.com/docs/hana-cloud-data-lake/welcome-guide/data-lake-relational-engine) is a disk-based, column-oriented relational database for storing and analyzing large amounts of infrequently updated data. It descends from [SAP IQ](https://help.sap.com/docs/SAP_IQ), which was previously named Sybase IQ. Because of its heritage, there are commonalities with other Sybase products. Some of the client interface drivers are shared with [SAP SQL Anywhere](https://help.sap.com/docs/SAP_SQL_Anywhere) and SAP Adaptive Server Enterprise.
 
-[Data Lake Files](https://help.sap.com/docs/hana-cloud-data-lake/welcome-guide/data-lake-files) can be used to store and access unstructured data such as trace files and structured files like CSV, Parquet, or ORC. Structured files can use [SQL on Files](https://help.sap.com/docs/hana-cloud-data-lake/welcome-guide/sql-on-files), which enables SQL queries to be performed on them.
+[Data Lake Files](https://help.sap.com/docs/hana-cloud-data-lake/welcome-guide/data-lake-files) can be used to store and access unstructured data such as trace files and structured files like CSV, Parquet, or delta tables. Structured files can use [SQL on Files](https://help.sap.com/docs/hana-cloud-database/sap-hana-cloud-sap-hana-database-sql-on-files-guide/sap-hana-cloud-sap-hana-database-sql-on-files-guide), which enables SQL queries to be performed on them.
 
 >Note that the data lake Files component is currently not available in free tier or trial accounts.
 
 ### Choose where to deploy the database instances
-The SAP BTP platform provides multiple runtime environments such as Cloud Foundry and Kyma. When a HANA Cloud or data lake instance is created, it can be created in a BTP subaccount or in a Cloud Foundry space. SAP HANA Cloud Central can be used to provision and manage instances in the BTP subaccount or in a Cloud Foundry space. In the screenshot below, there is an instance of a data lake that was provisioned in the BTP subaccount (Other Environments) and one that was provisioned into Cloud Foundry.
+The SAP BTP platform provides multiple runtime environments such as Cloud Foundry and Kyma. When an SAP HANA Cloud or data lake instance is created, it can be created in a BTP subaccount or in a Cloud Foundry space. SAP HANA Cloud Central can be used to provision and manage instances in the BTP subaccount or in a Cloud Foundry space. In the screenshot below, there is an instance of a data lake that was provisioned in the BTP subaccount (Other Environments) and one that was provisioned into Cloud Foundry.
@@ -286,7 +286,70 @@ In this step, a sample HOTEL dataset will be created comprising tables, a view,
 For additional details on the SAP HANA database explorer, see the tutorial [Get Started with the SAP HANA Database Explorer](group.hana-cloud-get-started), which showcases many of its features.
 
+### Install the developer licensed version of the data lake client
+This version of the data lake client does not include cryptographic libraries, as it makes use of the libraries that are available on the operating system, such as OpenSSL. It is available for download after accepting the SAP Developer License agreement. Currently, this version is only available for Linux. Either client can be used to complete the steps shown in this tutorial group.
+
+1. Open the HANA tab of [SAP Development Tools](https://tools.hana.ondemand.com/#hanatools).
+
+2. Download the SAP HANA Data Lake Client 1.0.
+
+3. Extract the archive.
+
+```Shell (Linux)
+tar -zxvf hdlclient-latest-linux-x64.tar.gz
+```
+
+4. Start the installer.
+
+```Shell (Linux)
+cd ebf*
+./hdbinst
+```
+
+![run the installer](hdbinst.png)
+
+5. Configure the environment variables. This can be done by calling `hdlclienv.sh` manually, or it can be added to the Bash shell by referencing it in `.bashrc`.
+
+Open the `.bashrc`.
+
+```Shell (Linux)
+pico ~/.bashrc
+```
+
+>This tutorial uses notepad and `pico` as default text editors, but any text editor will do.
+>`Pico` can be installed on SUSE Linux with
+
+>```Shell (Linux SUSE)
+>sudo zypper install pico
+>```
+
+Add the following line to point to the location where the SAP data lake client is installed.
+
+```Shell (Linux) .bashrc
+source ~/sap/hdlclient/bin64/hdlclienv.sh
+```
+
+Test the change by running:
+
+```Shell (Linux)
+source ~/.bashrc
+```
+
+The following command should display the install location of the data lake client.
+
+```Shell (Linux)
+echo $IQDIR17
+```
+
+>In the case that the data lake client needs to be uninstalled, run the `hdbuninst` file located in the directory `~/sap/hdlclient/install`.
+
+---
+
 ### Install the data lake client
+This version of the data lake client is available from the SAP Software Download Center, which requires an S-user ID and only shows software that you have purchased. Additional details can be found at [Software Downloads FAQ](https://support.sap.com/content/dam/support/en_us/library/ssp/my-support/help-for-sap-support-applications/online_help-software_downloads.html#faq). Either client can be used to complete the steps shown in this tutorial group.
 
 1. Open [SAP for me](https://me.sap.com/softwarecenter) and navigate to **Support Packages & Patches** | **By Alphabetical Index (A-Z)**.
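The environment check in the developer-install steps can be wrapped in a small guard. `IQDIR17` is the variable the tutorial says `hdlclienv.sh` sets; the function itself is an illustrative sketch, not part of the client:

```shell
# Print the install location only when IQDIR17 is set and points at an
# existing directory, as it should after sourcing hdlclienv.sh.
check_hdl_env() {
  if [ -n "$IQDIR17" ] && [ -d "$IQDIR17" ]; then
    echo "data lake client at: $IQDIR17"
  else
    echo "IQDIR17 not set; source hdlclienv.sh from your .bashrc" >&2
    return 1
  fi
}
```

Calling `check_hdl_env` after `source ~/.bashrc` gives a quick pass/fail signal instead of a silent empty `echo $IQDIR17`.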
@@ -391,6 +454,38 @@ The data lake client install includes [dbisql Interactive SQL Utility](https://h
 1. Start the GUI version of DBISQL by searching from the Microsoft Windows Start menu. It can also be accessed by entering `dbisql` in the command prompt.
 
+If the error "This file could not be found: java" occurs, follow the instructions below.
+
+* Download and install [SAP JVM 8.0](https://tools.hana.ondemand.com/#cloud)
+
+```Shell (Linux)
+unzip sapjvm-8.1.102-linux-x64.zip
+mv ./sapjvm_8 ~/
+```
+
+* Add the following to your `.bashrc`.
+
+```Shell (Linux)
+export JAVA_HOME=~/sapjvm_8
+export PATH=$PATH:$JAVA_HOME/bin
+```
+
+* Apply and test the changes.
+
+```Shell (Linux)
+source ~/.bashrc
+java -version
+dbisql
+```
+
+If a further error occurs such as `libXext.so.6: cannot open shared object file`, install the missing components.
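The JVM fix above amounts to putting a JDK's `bin` directory on the PATH. A sketch of just the environment part (the `sapjvm_8` location is the tutorial's example; it is not verified here, so no `java -version` call is made):

```shell
# Point JAVA_HOME at the extracted SAP JVM and put its bin dir on PATH
# (path follows the tutorial's example; adjust to your extract location).
export JAVA_HOME="$HOME/sapjvm_8"
export PATH="$PATH:$JAVA_HOME/bin"
echo "$PATH" | tr ':' '\n' | tail -n 1
```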
tutorials/hana-cloud-dl-clients-python/hana-cloud-dl-clients-python.md
15 additions & 11 deletions
@@ -1,4 +1,4 @@
----
+---
 parser: v2
 auto_validation: true
 time: 15
@@ -37,13 +37,18 @@ The first step is to check if Python and pip are installed.
 >In some Linux distributions, 'python' refers to Python 2, while 'python3' refers to Python 3. However, as Python 2 is now obsolete, 'python' may refer to Python 3 instead.
 
-
 If Python is not installed, it can be downloaded from [Python downloads](https://www.python.org/downloads/).
 
 On Microsoft Windows, check the box that says **Add Python 3.x to PATH** as shown below to ensure that the interpreter will be placed in your path. The Microsoft Windows command prompt or shell will need to be reopened after Python is installed to pick up the path to python.
 
+On OpenSUSE Tumbleweed, YaST can be used to install python313. Once it has been installed, its version can be seen with the command below.
+
+```Shell (Linux)
+python3.13 --version
+```
+
 2. Enter the commands below.
@@ -86,9 +91,9 @@ The `sqlanydb` package is the python driver for the data lake Relational Engine
 ```
 
-On Linux the rest of the steps will be executed in a virtual enviroment.
+On Linux, the rest of the steps will be executed in a virtual environment.
 
-First make a project folder, and create a virtual environment inside it. To do so, open the terminal app, write the following command, and hit return, here pyvenv is the name of the folder that you wish to create the virtual enviroment in.
+First make a project folder and create a virtual environment inside it. To do so, open the terminal app, enter the following command, and hit return. Here, pyvenv is the name of the folder that the virtual environment will be created in.
 
 ```Shell (Linux)
 mkdir $HOME/pyvenv
@@ -109,21 +114,20 @@ The `sqlanydb` package is the python driver for the data lake Relational Engine
 
 
-Now execute :-
+Depending on which install of the data lake client was used, execute:
 
 ```Shell (Linux)
 cd $IQDIR17/sdk/python
 python3 setup.py install
 ```
+
+or
+
+```Shell (Linux)
+cd $IQDIR17/sdk/python
+pip install sqlanydb-1.0.14.tar.gz
+```
 
 2. On Microsoft Windows, create a user environment variable named `SQLANY_API_DLL` and set it to `%IQDIR17%\Bin64\dbcapi.dll`.
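The Linux flow above — create a virtual environment, activate it, then install the driver — can be sketched end to end. The archive name and `$IQDIR17` come from the tutorial; a temp directory stands in for `$HOME/pyvenv`, and the driver install line is commented out because it needs an actual data lake client install:

```shell
# Create and activate a virtual environment, then (on a machine with the
# data lake client installed) install the Python driver from its sdk dir.
venvdir="$(mktemp -d)/pyvenv"        # the tutorial uses $HOME/pyvenv
python3 -m venv "$venvdir"
. "$venvdir/bin/activate"
python -c 'import sys; print(sys.prefix)'
# pip install "$IQDIR17/sdk/python/sqlanydb-1.0.14.tar.gz"
```

Activating the environment makes `python` and `pip` resolve to the venv's own copies, so the driver installs without touching the system Python.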