Packages

  • package root
    Definition Classes
    root
  • package org
    Definition Classes
    root
  • package apache
    Definition Classes
    org
  • package spark
    Definition Classes
    apache
  • package sql
    Definition Classes
    spark
  • package catalyst

    Catalyst is a library for manipulating relational query plans. All classes in catalyst are considered an internal API to Spark SQL and are subject to change between minor releases.

    Definition Classes
    sql
  • package analysis

    Provides a logical query plan Analyzer and supporting classes for performing analysis. Analysis consists of translating UnresolvedAttributes and UnresolvedRelations into fully typed objects using information in a schema Catalog.

    Definition Classes
    catalyst
  • package catalog
    Definition Classes
    catalyst
  • package csv
    Definition Classes
    catalyst
  • package dsl

    A collection of implicit conversions that create a DSL for constructing catalyst data structures.

    scala> import org.apache.spark.sql.catalyst.dsl.expressions._
    
    // Standard operators are added to expressions.
    scala> import org.apache.spark.sql.catalyst.expressions.Literal
    scala> Literal(1) + Literal(1)
    res0: org.apache.spark.sql.catalyst.expressions.Add = (1 + 1)
    
    // There is a conversion from 'symbols to unresolved attributes.
    scala> 'a.attr
    res1: org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute = 'a
    
    // These unresolved attributes can be used to create more complicated expressions.
    scala> 'a === 'b
    res2: org.apache.spark.sql.catalyst.expressions.EqualTo = ('a = 'b)
    
    // SQL verbs can be used to construct logical query plans.
    scala> import org.apache.spark.sql.catalyst.plans.logical._
    scala> import org.apache.spark.sql.catalyst.dsl.plans._
    scala> LocalRelation('key.int, 'value.string).where('key === 1).select('value).analyze
    res3: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan =
    Project [value#3]
     Filter (key#2 = 1)
      LocalRelation [key#2,value#3], []
    Definition Classes
    catalyst
  • package encoders
    Definition Classes
    catalyst
  • package errors

    Functions for attaching and retrieving trees that are associated with errors.

    Definition Classes
    catalyst
  • package expressions

    A set of classes that can be used to represent trees of relational expressions. A key goal of the expression library is to hide the details of naming and scoping from developers who want to manipulate trees of relational operators. As such, the library defines a special type of expression, a NamedExpression, in addition to the standard collection of expressions.

    Standard Expressions

    A library of standard expressions (e.g., Add, EqualTo), aggregates (e.g., SUM, COUNT), and other computations (e.g. UDFs). Each expression type is capable of determining its output schema as a function of its children's output schema.

    Named Expressions

    Some expressions are named and thus can be referenced by later operators in the dataflow graph. The two types of named expressions are AttributeReferences and Aliases. AttributeReferences refer to attributes of the input tuple for a given operator and form the leaves of some expression trees. Aliases assign a name to intermediate computations. For example, in the SQL statement SELECT a+b AS c FROM ..., the expressions a and b would be represented by AttributeReferences and c would be represented by an Alias.

    During analysis, all named expressions are assigned a globally unique expression id, which can be used for equality comparisons. While the original names are kept around for debugging purposes, they should never be used to check if two attributes refer to the same value, as plan transformations can result in the introduction of naming ambiguity. For example, consider a plan that contains subqueries, both of which are reading from the same table. If an optimization removes the subqueries, scoping information would be destroyed, eliminating the ability to reason about which subquery produced a given attribute.

    Evaluation

    The result of an expression can be computed with the Expression.eval(InternalRow) method.
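    As a brief illustration of the points above, expression trees can be constructed and evaluated directly. A minimal sketch, assuming the catalyst classes are on the classpath:

```scala
import org.apache.spark.sql.catalyst.expressions.{Add, Literal}

// Build the expression tree (1 + 2). Its output type (IntegerType)
// is determined from the children's output types.
val expr = Add(Literal(1), Literal(2))

// A literal-only tree needs no input row, so eval() can be called
// with its default (null) input.
val result = expr.eval()
```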

    Definition Classes
    catalyst
  • package json
    Definition Classes
    catalyst
  • package optimizer
    Definition Classes
    catalyst
  • package parser
    Definition Classes
    catalyst
  • package planning

    Contains classes for enumerating possible physical plans for a given logical query plan.

    Definition Classes
    catalyst
  • package plans

    A collection of common abstractions for query plans as well as a base logical plan representation.

    Definition Classes
    catalyst
  • package rules

    A framework for applying batches of rewrite rules to trees, possibly to a fixed point.
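    A minimal sketch of what such a rewrite rule can look like. FoldIntAdds is a hypothetical rule, not part of Spark; it assumes the Rule and transformAllExpressions APIs behave as described:

```scala
import org.apache.spark.sql.catalyst.expressions.{Add, Alias, Literal}
import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, OneRowRelation, Project}
import org.apache.spark.sql.catalyst.rules.Rule

// A simplified constant-folding rule: collapse additions of two
// integer literals anywhere in the plan's expressions.
object FoldIntAdds extends Rule[LogicalPlan] {
  def apply(plan: LogicalPlan): LogicalPlan = plan transformAllExpressions {
    case Add(Literal(a: Int, _), Literal(b: Int, _)) => Literal(a + b)
  }
}

// Apply it once to a tiny plan equivalent to SELECT 1 + 2 AS s.
val plan = Project(Seq(Alias(Add(Literal(1), Literal(2)), "s")()), OneRowRelation())
val folded = FoldIntAdds(plan)
```

A RuleExecutor would run a batch of such rules, optionally repeating until the plan stops changing (a fixed point).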

    Definition Classes
    catalyst
  • package trees

    A library for easily manipulating trees of operators. Operators that extend TreeNode are granted the following interface:

    • Scala collection-like methods (foreach, map, flatMap, collect, etc.)

    • transform - accepts a partial function that is used to generate a new tree. When the partial function can be applied to a given tree segment, that segment is replaced with the result. After attempting to apply the partial function to a given node, the transform function recursively attempts to apply the function to that node's children.

    • debugging support - pretty printing, easy splicing of trees, etc.
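    For instance, a transform call on a small expression tree might look like this (a sketch, assuming the catalyst expression classes):

```scala
import org.apache.spark.sql.catalyst.expressions.{Add, Expression, Literal}

// A small tree: (1 + (2 + 3)).
val tree: Expression = Add(Literal(1), Add(Literal(2), Literal(3)))

// Wherever the partial function matches, the node is replaced, and
// transform then recurses into the children of the result.
val incremented = tree transform {
  case Literal(v: Int, _) => Literal(v + 1)
}
// incremented is (2 + (3 + 4))
```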
    Definition Classes
    catalyst
  • package util
    Definition Classes
    catalyst
  • AliasIdentifier
  • CatalystTypeConverters
  • DefinedByConstructorParams
  • DeserializerBuildHelper
  • FunctionIdentifier
  • IdentifierWithDatabase
  • InternalRow
  • JavaTypeInference
  • QualifiedTableName
  • QueryPlanningTracker
  • ScalaReflection
  • SerializerBuildHelper
  • TableIdentifier
  • WalkedTypePath

object ScalaReflection extends ScalaReflection

A default version of ScalaReflection that uses the runtime universe.

Linear Supertypes
ScalaReflection, Logging, AnyRef, Any

Type Members

  1. case class Schema(dataType: DataType, nullable: Boolean) extends Product with Serializable

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def attributesFor[T](implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Seq[Attribute]

    Returns a Sequence of attributes for the given case class type.

  6. def cleanUpReflectionObjects[T](func: ⇒ T): T

    Any code calling scala.reflect.api.Types.TypeApi.<:< should be wrapped by this method to clean up the Scala reflection garbage automatically. Otherwise, it will leak some objects to scala.reflect.runtime.JavaUniverse.undoLog.

    Definition Classes
    ScalaReflection
    See also

    https://github.com/scala/bug/issues/8302
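    For example, wrapping a subtype check (a usage sketch; localTypeOf is another member of this object):

```scala
import org.apache.spark.sql.catalyst.ScalaReflection

// Wrap the <:< check so the undo-log entries it creates in the
// runtime universe are discarded when the block returns.
val isSeq: Boolean = ScalaReflection.cleanUpReflectionObjects {
  ScalaReflection.localTypeOf[List[Int]] <:< ScalaReflection.localTypeOf[Seq[_]]
}
```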

  7. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  8. def constructParams(tpe: scala.reflect.api.JavaUniverse.Type): Seq[scala.reflect.api.JavaUniverse.Symbol]
    Attributes
    protected
    Definition Classes
    ScalaReflection
  9. def dataTypeFor[T](implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): DataType

    Returns the Spark SQL DataType for a given Scala type. Where this is not an exact mapping to a native type, an ObjectType is returned. Special handling is also used for Arrays, including those that hold primitive types.

    Unlike schemaFor, this function doesn't do any massaging of types into the Spark SQL type system. As a result, ObjectType will be returned for things like boxed Integers.
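    A sketch of the distinction described above:

```scala
import org.apache.spark.sql.catalyst.ScalaReflection
import org.apache.spark.sql.types.{IntegerType, ObjectType}

// A primitive Int maps exactly to a native Catalyst type...
val intType = ScalaReflection.dataTypeFor[Int]

// ...while a boxed Integer has no exact native mapping, so an
// ObjectType wrapping the Java class is returned instead.
val boxedType = ScalaReflection.dataTypeFor[java.lang.Integer]
```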

  10. def dataTypeJavaClass(dt: DataType): Class[_]
  11. def definedByConstructorParams(tpe: scala.reflect.api.JavaUniverse.Type): Boolean

    Returns whether the fields of the given type are defined entirely by its constructor parameters.

  12. def deserializerForType(tpe: scala.reflect.api.JavaUniverse.Type): Expression

    Returns an expression that can be used to deserialize a Spark SQL representation to an object of type T with a compatible schema. The Spark SQL representation is located at ordinal 0 of a row, i.e., GetColumnByOrdinal(0, _). Nested classes will have their fields accessed using UnresolvedExtractValue.

    The returned expression is used by ExpressionEncoder. The encoder will resolve and bind this deserializer expression when using it.

  13. def encodeFieldNameToIdentifier(fieldName: String): String
  14. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  15. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  16. def expressionJavaClasses(arguments: Seq[Expression]): Seq[Class[_]]
  17. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  18. def findConstructor[T](cls: Class[T], paramTypes: Seq[Class[_]]): Option[(Seq[AnyRef]) ⇒ T]

    Finds an accessible constructor with compatible parameters. This is a more flexible search than the exact matching algorithm in Class.getConstructor. The first assignment-compatible matching constructor is returned if it exists. Otherwise, we check for additional compatible constructors defined in the companion object as apply methods. Otherwise, it returns None.

  19. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  20. def getClassFromType(tpe: scala.reflect.api.JavaUniverse.Type): Class[_]
  21. def getClassNameFromType(tpe: scala.reflect.api.JavaUniverse.Type): String

    Returns the full class name for a type. The returned name is the canonical Scala name, where each component is separated by a period. It is NOT the Java-equivalent runtime name (no dollar signs).

    In simple cases, both the Scala and Java names are the same; however, when Scala generates constructs that do not map to a Java equivalent, such as singleton objects or nested classes in package objects, it uses the dollar sign ($) to create synthetic classes, emulating behaviour in Java bytecode.

  22. def getConstructorParameterNames(cls: Class[_]): Seq[String]

    Returns the parameter names for the primary constructor of this class.

    Logically we should call getConstructorParameters and throw away the parameter types to get the parameter names; however, there are some weird Scala reflection problems, and this method is a workaround that avoids getting the parameter types.
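    A usage sketch with a hypothetical case class:

```scala
import org.apache.spark.sql.catalyst.ScalaReflection

// A hypothetical case class used only for illustration.
case class Point(x: Double, y: Double)

// The primary constructor's parameter names, in declaration order.
val names = ScalaReflection.getConstructorParameterNames(classOf[Point])
```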

  23. def getConstructorParameterValues(obj: DefinedByConstructorParams): Seq[AnyRef]

    Returns the parameter values for the primary constructor of this class.

  24. def getConstructorParameters(cls: Class[_]): Seq[(String, scala.reflect.api.JavaUniverse.Type)]

    Returns the parameter names and types for the primary constructor of this class.

    Note that it only works for Scala classes with a primary constructor, and currently doesn't support inner classes.

  25. def getConstructorParameters(tpe: scala.reflect.api.JavaUniverse.Type): Seq[(String, scala.reflect.api.JavaUniverse.Type)]

    Returns the parameter names and types for the primary constructor of this type.

    Note that it only works for Scala classes with a primary constructor, and currently doesn't support inner classes.

    Definition Classes
    ScalaReflection
  26. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  27. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  28. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  29. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  30. def isNativeType(dt: DataType): Boolean

    Returns true if values of this data type have the same representation internally and externally.

  31. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  32. def javaBoxedType(dt: DataType): Class[_]
  33. def localTypeOf[T](implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): scala.reflect.api.JavaUniverse.Type

    Return the Scala Type for T in the current classloader mirror.

    Use this method instead of the convenience method universe.typeOf, which assumes that all types can be found in the classloader that loaded scala-reflect classes. That's not necessarily the case when running using Eclipse launchers or even sbt console or test (without fork := true).

    Definition Classes
    ScalaReflection
    See also

    SPARK-5281

  34. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  35. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  36. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  37. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  38. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  39. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  40. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  41. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  42. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  43. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  44. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  45. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  46. def mirror: Mirror

    The mirror used to access types in the universe.

    Definition Classes
    ScalaReflection → ScalaReflection
  47. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  48. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  49. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  50. def optionOfProductType(tpe: scala.reflect.api.JavaUniverse.Type): Boolean

    Returns true if the given type is an Option of a product type, e.g. Option[Tuple2]. Note that we also treat DefinedByConstructorParams as a product type.
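    A sketch of both cases:

```scala
import org.apache.spark.sql.catalyst.ScalaReflection

// Option of a Product (a tuple) qualifies...
val ofTuple = ScalaReflection.optionOfProductType(
  ScalaReflection.localTypeOf[Option[(Int, String)]])

// ...but Option of a plain Int does not.
val ofInt = ScalaReflection.optionOfProductType(
  ScalaReflection.localTypeOf[Option[Int]])
```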

  51. def schemaFor(tpe: scala.reflect.api.JavaUniverse.Type): Schema

    Returns a catalyst DataType and its nullability for the given Scala Type using reflection.

  52. def schemaFor[T](implicit arg0: scala.reflect.api.JavaUniverse.TypeTag[T]): Schema

    Returns a catalyst DataType and its nullability for the given Scala Type using reflection.
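    For example, for a hypothetical case class (a sketch, assuming a compiled TypeTag is available for it):

```scala
import org.apache.spark.sql.catalyst.ScalaReflection
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

// A hypothetical case class used only for illustration.
case class Person(name: String, age: Int)

// A case class maps to a StructType; primitive fields like Int are
// non-nullable, while reference fields like String are nullable.
val schema = ScalaReflection.schemaFor[Person]
```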

  53. def serializerForType(tpe: scala.reflect.api.JavaUniverse.Type): Expression

    Returns an expression for serializing an object of type T to a Spark SQL representation. The input object is located at ordinal 0 of a row, i.e., BoundReference(0, _).

    If the given type is not supported, i.e. no encoder can be built for it, an UnsupportedOperationException will be thrown with a detailed error message explaining the type path walked so far and which class is not supported. There are 4 kinds of type path:

    • the root type: root class: "abc.xyz.MyClass"
    • the value type of Option: option value class: "abc.xyz.MyClass"
    • the element type of Array or Seq: array element class: "abc.xyz.MyClass"
    • the field of Product: field (class: "abc.xyz.MyClass", name: "myField")

  54. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  55. def toString(): String
    Definition Classes
    AnyRef → Any
  56. val typeBoxedJavaMapping: Map[DataType, Class[_]]
  57. val typeJavaMapping: Map[DataType, Class[_]]
  58. val universe: scala.reflect.runtime.universe.type

    The universe we work in (runtime or macro).

    Definition Classes
    ScalaReflection → ScalaReflection
  59. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  60. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  61. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()

Inherited from ScalaReflection

Inherited from Logging

Inherited from AnyRef

Inherited from Any
