docs/src/main/paradox/reference.md (11 additions, 3 deletions)
@@ -192,7 +192,7 @@ Parameters `tile_columns` and `tile_rows` are literals, not column expressions.
 
     Tile rf_array_to_tile(Array arrayCol, Int numCols, Int numRows)
 
-Python only. Create a `tile` from a Spark SQL [Array](http://spark.apache.org/docs/2.3.2/api/python/pyspark.sql.html#pyspark.sql.types.ArrayType), filling values in row-major order.
+Python only. Create a `tile` from a Spark SQL [Array][Array], filling values in row-major order.
 
 ### rf_assemble_tile
 
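The row-major fill described in the hunk above can be sketched in plain Python, with no Spark required. The helper name `array_to_tile` is hypothetical, used only to illustrate the fill order `rf_array_to_tile` documents.

```python
def array_to_tile(values, num_cols, num_rows):
    """Arrange a flat sequence into num_rows x num_cols in row-major
    order, mirroring how a tile is filled from a Spark SQL Array."""
    if len(values) != num_cols * num_rows:
        raise ValueError("array length must equal num_cols * num_rows")
    # Each row takes the next num_cols consecutive values.
    return [list(values[r * num_cols:(r + 1) * num_cols])
            for r in range(num_rows)]

# A 6-element array becomes a 2-row x 3-column tile, rows filled first.
tile = array_to_tile([1, 2, 3, 4, 5, 6], num_cols=3, num_rows=2)
print(tile)  # [[1, 2, 3], [4, 5, 6]]
```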
@@ -383,6 +383,13 @@ Returns a `tile` column containing the element-wise equality of `tile1` and `rhs`.
 
 Returns a `tile` column containing the element-wise inequality of `tile1` and `rhs`.
 
+### rf_local_is_in
+
+    Tile rf_local_is_in(Tile tile, Array array)
+    Tile rf_local_is_in(Tile tile, list l)
+
+Returns a `tile` column with cell values of 1 where the `tile` cell value is in the provided array or list. The `array` is a Spark SQL [Array][Array]. A Python `list` of numeric values can also be passed.
+
 ### rf_round
 
     Tile rf_round(Tile tile)
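The element-wise membership test that `rf_local_is_in` describes can be sketched without Spark; `local_is_in` below is a hypothetical pure-Python stand-in showing the 1/0 semantics on a tile represented as nested lists.

```python
def local_is_in(tile, members):
    """Element-wise membership test: 1 where the cell value is in
    `members`, 0 otherwise, matching the documented semantics."""
    members = set(members)  # set lookup keeps each cell test O(1)
    return [[1 if cell in members else 0 for cell in row] for row in tile]

tile = [[1, 2, 3],
        [4, 5, 6]]
print(local_is_in(tile, [2, 5, 6]))  # [[0, 1, 0], [0, 1, 1]]
```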
@@ -630,13 +637,13 @@ Python only. As with @ref:[`rf_explode_tiles`](reference.md#rf-explode-tiles), b
 
     Array rf_tile_to_array_int(Tile tile)
 
-Convert Tile column to Spark SQL [Array](http://spark.apache.org/docs/2.3.2/api/python/pyspark.sql.html#pyspark.sql.types.ArrayType), in row-major order. Float cell types will be coerced to integral type by flooring.
+Convert Tile column to Spark SQL [Array][Array], in row-major order. Float cell types will be coerced to integral type by flooring.
 
 ### rf_tile_to_array_double
 
     Array rf_tile_to_arry_double(Tile tile)
 
-Convert tile column to Spark [Array](http://spark.apache.org/docs/2.3.2/api/python/pyspark.sql.html#pyspark.sql.types.ArrayType), in row-major order. Integral cell types will be coerced to floats.
+Convert tile column to Spark [Array][Array], in row-major order. Integral cell types will be coerced to floats.
 
 ### rf_render_ascii
 
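The two coercions documented in the hunk above (flooring to integers, widening to floats, both in row-major order) can be sketched in plain Python; the function names here are hypothetical illustrations, not the Spark API.

```python
import math

def tile_to_array_int(tile):
    """Flatten a tile row-major; float cells are coerced to
    integral values by flooring, per rf_tile_to_array_int's docs."""
    return [int(math.floor(cell)) for row in tile for cell in row]

def tile_to_array_double(tile):
    """Flatten a tile row-major; integral cells are coerced to
    floats, per rf_tile_to_array_double's docs."""
    return [float(cell) for row in tile for cell in row]

print(tile_to_array_int([[1.9, 2.1], [3.0, 4.7]]))   # [1, 2, 3, 4]
print(tile_to_array_double([[1, 2], [3, 4]]))        # [1.0, 2.0, 3.0, 4.0]
```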
@@ -666,3 +673,4 @@ Runs [`rf_rgb_composite`](reference.md#rf-rgb-composite) on the given tile colum
 Because there is not a NoData already defined, we will choose one. In this particular example, the minimum value is greater than zero, so we can use 0 as the NoData value.
 
+We can verify that the number of NoData cells in the resulting `blue_masked` column matches the total of the boolean `mask` _tile_ to ensure our logic is correct.
 
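The verification step described above can be sketched in plain Python. This is a hypothetical illustration, not the RasterFrames API: it assumes a mask value of 1 marks cells to be set to the chosen NoData value, and checks that the NoData count in the masked result equals the mask total.

```python
NODATA = 0  # chosen NoData: valid here because all data cells are > 0

blue = [[10, 20, 30],
        [40, 50, 60]]
mask = [[1, 0, 1],
        [0, 0, 1]]  # assumption: 1 marks a cell to mask out

# Replace masked cells with the NoData value.
blue_masked = [[NODATA if m else v for v, m in zip(vrow, mrow)]
               for vrow, mrow in zip(blue, mask)]

# The NoData count in the result should equal the mask's total.
nodata_count = sum(cell == NODATA for row in blue_masked for cell in row)
mask_total = sum(cell for row in mask for cell in row)
assert nodata_count == mask_total
```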