pyspark.sql.Column.alias
Column.alias(*alias, **kwargs)
Returns this column aliased with a new name or names (in the case of expressions that return more than one column, such as explode).

New in version 1.3.0.

Parameters
    alias : str
        desired column names (collects all positional arguments passed)
Other Parameters
    metadata : dict
        a dict of information to be stored in the metadata attribute of the corresponding StructField (optional, keyword-only argument)

        Changed in version 2.2.0: Added the optional metadata argument.
Examples

>>> df.select(df.age.alias("age2")).collect()
[Row(age2=2), Row(age2=5)]
>>> df.select(df.age.alias("age3", metadata={'max': 99})).schema['age3'].metadata['max']
99