package statsEstimation
Type Members
- case class ColumnStatsMap(originalMap: AttributeMap[ColumnStat]) extends Product with Serializable
This class contains the original column stats from the child and maintains the updated column stats. We update the corresponding ColumnStat for a column after applying a predicate condition. For example, suppose column c has a [min, max] range of [0, 100]. Given a range condition such as (c > 40 AND c <= 50), we narrow the column's [min, max] to [40, 100] after evaluating the first condition, c > 40, and then to [40, 50] after evaluating the second condition, c <= 50.
- originalMap
Original column stats from child.
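The narrowing described above can be sketched as follows. This is a simplified, hypothetical model of how range predicates tighten a column's [min, max]; the names (ColRange, applyGreaterThan, applyLessOrEqual) are illustrative and not Spark's internal API.

```scala
// Hypothetical stand-in for a column's [min, max] statistics.
final case class ColRange(min: Double, max: Double)

// c > v raises the lower bound; c <= v lowers the upper bound.
def applyGreaterThan(r: ColRange, v: Double): ColRange = ColRange(math.max(r.min, v), r.max)
def applyLessOrEqual(r: ColRange, v: Double): ColRange = ColRange(r.min, math.min(r.max, v))

// Column c starts with [0, 100].
val start = ColRange(0, 100)
// After c > 40 the range is [40, 100]; after c <= 50 it is [40, 50].
val afterGt = applyGreaterThan(start, 40)
val afterLe = applyLessOrEqual(afterGt, 50)
```

Applying the two conditions of (c > 40 AND c <= 50) in sequence reproduces the [40, 100] and [40, 50] intermediate ranges from the example above.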
- class DefaultValueInterval extends ValueInterval
This version of Spark does not maintain min/max statistics for binary/string types, so this class defines their default behavior.
- case class FilterEstimation(plan: Filter) extends Logging with Product with Serializable
- case class JoinEstimation(join: Join) extends Logging with Product with Serializable
- trait LogicalPlanStats extends AnyRef
A trait to add statistics propagation to LogicalPlan.
- class NullValueInterval extends ValueInterval
This is for columns with only null values.
- case class NumericValueInterval(min: Double, max: Double) extends ValueInterval with Product with Serializable
For simplicity, we use Double to unify operations on numeric intervals.
- trait ValueInterval extends AnyRef
Value range of a column.
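A minimal sketch of the value-interval idea: numeric columns carry a [min, max] pair of Doubles, while all-null columns carry an interval that is always empty. The names below (Interval, NumericInterval, NullInterval, intersect) mirror the classes listed here but are simplified placeholders, not Spark's actual definitions.

```scala
// Simplified model of a column's value range.
sealed trait Interval { def isEmpty: Boolean }

// A column containing only null values has no usable range.
case object NullInterval extends Interval { val isEmpty = true }

// Numeric ranges are unified as Doubles for simplicity.
final case class NumericInterval(min: Double, max: Double) extends Interval {
  def isEmpty: Boolean = min > max
  // Intersection of two ranges, e.g. when estimating join-key overlap.
  def intersect(other: NumericInterval): NumericInterval =
    NumericInterval(math.max(min, other.min), math.min(max, other.max))
}

val a = NumericInterval(0, 100)
val b = NumericInterval(40, 150)
val c = a.intersect(b) // the overlapping range [40, 100]
```

Representing emptiness as min > max keeps intersection total: two disjoint ranges simply produce an empty interval instead of an error.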
Value Members
- object AggregateEstimation
- object BasicStatsPlanVisitor extends LogicalPlanVisitor[Statistics]
A LogicalPlanVisitor that computes the statistics for the cost-based optimizer.
- object EstimationUtils
- object ProjectEstimation
- object SizeInBytesOnlyStatsPlanVisitor extends LogicalPlanVisitor[Statistics]
A LogicalPlanVisitor that computes a single dimension for plan stats: size in bytes.
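The visitor pattern behind these objects can be sketched as below: each logical-plan node is dispatched to a visit method that returns a Statistics value, and a size-only visitor propagates just a byte count up the tree. The node and visitor names here (Plan, Leaf, Project, SizeOnlyVisitor) are simplified placeholders, not Spark's LogicalPlan API.

```scala
// A single-dimension statistic, analogous to size-in-bytes-only estimation.
final case class Stats(sizeInBytes: Long)

// A toy logical plan with two node types.
sealed trait Plan
final case class Leaf(sizeInBytes: Long) extends Plan
final case class Project(child: Plan, keepRatio: Double) extends Plan

// Generic visitor: dispatch each node type to its own visit method.
trait PlanVisitor[T] {
  def visit(p: Plan): T = p match {
    case l: Leaf     => visitLeaf(l)
    case pr: Project => visitProject(pr)
  }
  def visitLeaf(l: Leaf): T
  def visitProject(p: Project): T
}

// Size-only estimation: scale the child's size by the projection's keep ratio.
object SizeOnlyVisitor extends PlanVisitor[Stats] {
  def visitLeaf(l: Leaf): Stats = Stats(l.sizeInBytes)
  def visitProject(p: Project): Stats =
    Stats((visit(p.child).sizeInBytes * p.keepRatio).toLong)
}

val stats = SizeOnlyVisitor.visit(Project(Leaf(1000), 0.5))
```

Separating traversal (the visitor) from the plan nodes is what lets a cheap size-only estimator and a full cost-based estimator coexist over the same plan tree.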
- object ValueInterval