pyspark.sql.functions.sec
pyspark.sql.functions.sec(col)
Computes secant of the input column.
New in version 3.3.0.
Changed in version 3.4.0: Supports Spark Connect.
Parameters
col : Column or str
    Angle in radians.
Returns
Column
    Secant of the angle.
Examples
Example 1: Compute the secant
>>> from pyspark.sql import functions as sf
>>> spark.sql(
...     "SELECT * FROM VALUES (PI() / 4), (PI() / 16) AS TAB(value)"
... ).select("*", sf.sec("value")).show()
+-------------------+------------------+
|              value|        SEC(value)|
+-------------------+------------------+
| 0.7853981633974...| 1.414213562373...|
|0.19634954084936...|1.0195911582083...|
+-------------------+------------------+
Example 2: Compute the secant of invalid values
>>> from pyspark.sql import functions as sf
>>> spark.sql(
...     "SELECT * FROM VALUES (FLOAT('NAN')), (NULL) AS TAB(value)"
... ).select("*", sf.sec("value")).show()
+-----+----------+
|value|SEC(value)|
+-----+----------+
|  NaN|       NaN|
| NULL|      NULL|
+-----+----------+
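As a cross-check, secant is the reciprocal of cosine, so the finite values in Example 1 can be reproduced in plain Python without a Spark session. This is a minimal sketch using only the standard-library `math` module (the `sec` helper below is illustrative, not part of any library):

```python
import math

def sec(x: float) -> float:
    """Secant as the reciprocal of cosine: sec(x) = 1 / cos(x)."""
    return 1.0 / math.cos(x)

print(sec(math.pi / 4))   # ~1.4142135623730951, i.e. sqrt(2)
print(sec(math.pi / 16))  # ~1.0195911582..., matching the table above
```

Note that `1 / math.cos(float("nan"))` yields `nan`, which matches the Spark behavior shown in Example 2 for invalid input.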