class SQLConf extends Serializable with Logging

A class that enables the setting and getting of mutable config parameters/hints.

In the presence of a SQLContext, these can be set and queried by passing SET commands to Spark SQL's query functions (e.g. sql()). Otherwise, users of this class can modify the hints programmatically through its setters and getters.

SQLConf is thread-safe (internally synchronized, so safe to be used in multiple threads).
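
The synchronized get/set pattern this description implies can be sketched in a minimal, self-contained form. Note that MiniConf below is a hypothetical stand-in for illustration only, not Spark's actual SQLConf source; the method names mirror the members documented below.

```scala
// Minimal, hypothetical sketch of a thread-safe mutable conf holder in the
// spirit of the description above; MiniConf is NOT Spark's actual SQLConf.
import scala.collection.mutable

class MiniConf {
  private val settings = mutable.Map[String, String]()

  // Set a property, mirroring setConfString(key, value).
  def setConfString(key: String, value: String): Unit = synchronized {
    settings(key) = value
  }

  // Get a property with a fallback, mirroring getConfString(key, defaultValue).
  def getConfString(key: String, defaultValue: String): String = synchronized {
    settings.getOrElse(key, defaultValue)
  }

  // Mirror contains(key).
  def contains(key: String): Boolean = synchronized {
    settings.contains(key)
  }
}

val conf = new MiniConf
conf.setConfString("spark.sql.shuffle.partitions", "8")
println(conf.getConfString("spark.sql.shuffle.partitions", "200")) // prints 8
println(conf.contains("spark.sql.ansi.enabled"))                   // prints false
```

In a live session, the same effect is typically achieved by issuing a SET command through sql() or by calling the setters directly, as the description above notes.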

Linear Supertypes
Logging, Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new SQLConf()

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def adaptiveExecutionEnabled: Boolean
  5. def adaptiveExecutionLogLevel: String
  6. def addSingleFileInAddFile: Boolean
  7. def advancedPartitionPredicatePushdownEnabled: Boolean
  8. def allowNegativeScaleOfDecimalEnabled: Boolean
  9. def analyzerMaxIterations: Int

  10. def ansiEnabled: Boolean
  11. def arrowMaxRecordsPerBatch: Int
  12. def arrowPySparkEnabled: Boolean
  13. def arrowPySparkFallbackEnabled: Boolean
  14. def arrowSafeTypeConversion: Boolean
  15. def arrowSparkREnabled: Boolean
  16. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  17. def autoBroadcastJoinThreshold: Long
  18. def autoSizeUpdateEnabled: Boolean
  19. def avroCompressionCodec: String
  20. def avroDeflateLevel: Int
  21. def broadcastTimeout: Long
  22. def bucketingEnabled: Boolean
  23. def bucketingMaxBuckets: Int
  24. def cacheVectorizedReaderEnabled: Boolean
  25. def cartesianProductExecBufferInMemoryThreshold: Int
  26. def cartesianProductExecBufferSpillThreshold: Int
  27. def caseSensitiveAnalysis: Boolean
  28. def caseSensitiveInferenceMode: SQLConf.HiveCaseSensitiveInferenceMode.Value
  29. def castDatetimeToString: Boolean
  30. def cboEnabled: Boolean
  31. def checkpointLocation: Option[String]
  32. def clear(): Unit
  33. def clone(): SQLConf
    Definition Classes
    SQLConf → AnyRef
  34. def coalesceShufflePartitionsEnabled: Boolean
  35. def codegenCacheMaxEntries: Int
  36. def codegenComments: Boolean
  37. def codegenFallback: Boolean
  38. def codegenSplitAggregateFunc: Boolean
  39. def columnBatchSize: Int
  40. def columnNameOfCorruptRecord: String
  41. def concatBinaryAsString: Boolean
  42. def constraintPropagationEnabled: Boolean
  43. def contains(key: String): Boolean

    Return whether a given key is set in this SQLConf.

  44. def continuousStreamingEpochBacklogQueueSize: Int
  45. def continuousStreamingExecutorPollIntervalMs: Long
  46. def continuousStreamingExecutorQueueSize: Int
  47. def convertCTAS: Boolean
  48. def copy(entries: (ConfigEntry[_], Any)*): SQLConf
  49. def crossJoinEnabled: Boolean
  50. def csvColumnPruning: Boolean
  51. def csvFilterPushDown: Boolean
  52. def dataFramePivotMaxValues: Int
  53. def dataFrameRetainGroupColumns: Boolean
  54. def dataFrameSelfJoinAutoResolveAmbiguity: Boolean
  55. def datetimeJava8ApiEnabled: Boolean
  56. def decimalOperationsAllowPrecisionLoss: Boolean
  57. def defaultDataSourceName: String
  58. def defaultNumShufflePartitions: Int
  59. def defaultSizeInBytes: Long
  60. def disabledV2StreamingMicroBatchReaders: String
  61. def disabledV2StreamingWriters: String
  62. def dynamicPartitionPruningEnabled: Boolean
  63. def dynamicPartitionPruningFallbackFilterRatio: Double
  64. def dynamicPartitionPruningReuseBroadcastOnly: Boolean
  65. def dynamicPartitionPruningUseStats: Boolean
  66. def eltOutputAsString: Boolean
  67. def enableRadixSort: Boolean
  68. def enableTwoLevelAggMap: Boolean
  69. def enableVectorizedHashMap: Boolean
  70. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  71. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  72. def escapedStringLiterals: Boolean
  73. def exchangeReuseEnabled: Boolean
  74. def exponentLiteralAsDecimalEnabled: Boolean
  75. def fallBackToHdfsForStatsEnabled: Boolean
  76. def fastHashAggregateRowMaxCapacityBit: Int
  77. def fetchShuffleBlocksInBatch: Boolean
  78. def fileCommitProtocolClass: String
  79. def fileCompressionFactor: Double
  80. def fileSinkLogCleanupDelay: Long
  81. def fileSinkLogCompactInterval: Int
  82. def fileSinkLogDeletion: Boolean
  83. def fileSourceLogCleanupDelay: Long
  84. def fileSourceLogCompactInterval: Int
  85. def fileSourceLogDeletion: Boolean
  86. def filesMaxPartitionBytes: Long
  87. def filesOpenCostInBytes: Long
  88. def filesourcePartitionFileCacheSize: Long
  89. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  90. def gatherFastStats: Boolean
  91. def getAllConfs: Map[String, String]

    Return all the configuration properties that have been set (i.e. not the default). This creates a new copy of the config properties in the form of a Map.

  92. def getAllDefinedConfs: Seq[(String, String, String, String)]

    Return all the configuration definitions that have been defined in SQLConf. Each definition contains key, defaultValue and doc.

  93. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  94. def getConf[T](entry: OptionalConfigEntry[T]): Option[T]

    Return the value of an optional Spark SQL configuration property for the given key. If the key is not set yet, returns None.

  95. def getConf[T](entry: ConfigEntry[T]): T

    Return the value of Spark SQL configuration property for the given key. If the key is not set yet, return defaultValue in ConfigEntry.

  96. def getConf[T](entry: ConfigEntry[T], defaultValue: T): T

    Return the value of Spark SQL configuration property for the given key. If the key is not set yet, return defaultValue. This is useful when defaultValue in ConfigEntry is not the desired one.

  97. def getConfString(key: String, defaultValue: String): String

    Return the string value of Spark SQL configuration property for the given key. If the key is not set yet, return defaultValue.

  98. def getConfString(key: String): String

    Return the value of Spark SQL configuration property for the given key.

    Annotations
    @throws( "if key is not set" )
  99. def groupByAliases: Boolean
  100. def groupByOrdinal: Boolean
  101. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  102. def hintErrorHandler: HintErrorHandler

    Returns the error handler for handling hint errors.

  103. def histogramEnabled: Boolean
  104. def histogramNumBins: Int
  105. def hiveThriftServerSingleSession: Boolean
  106. def hugeMethodLimit: Int
  107. def ignoreCorruptFiles: Boolean
  108. def ignoreDataLocality: Boolean
  109. def ignoreMissingFiles: Boolean
  110. def inMemoryPartitionPruning: Boolean
  111. def inMemoryTableScanStatisticsEnabled: Boolean
  112. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  113. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  114. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  115. def isModifiable(key: String): Boolean
  116. def isOrcSchemaMergingEnabled: Boolean
  117. def isParquetBinaryAsString: Boolean
  118. def isParquetINT96AsTimestamp: Boolean
  119. def isParquetINT96TimestampConversion: Boolean
  120. def isParquetSchemaMergingEnabled: Boolean
  121. def isParquetSchemaRespectSummaries: Boolean
  122. def isReplEagerEvalEnabled: Boolean
  123. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  124. def isUnsupportedOperationCheckEnabled: Boolean
  125. def joinReorderCardWeight: Double
  126. def joinReorderDPStarFilter: Boolean
  127. def joinReorderDPThreshold: Int
  128. def joinReorderEnabled: Boolean
  129. def jsonGeneratorIgnoreNullFields: Boolean
  130. def legacyMsSqlServerNumericMappingEnabled: Boolean
  131. def legacySizeOfNull: Boolean
  132. def legacyTimeParserPolicy: SQLConf.LegacyBehaviorPolicy.Value
  133. def limitScaleUpFactor: Int
  134. def literalPickMinimumPrecision: Boolean
  135. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  136. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  137. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  138. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  139. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  140. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  141. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  142. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  143. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  144. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  145. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  146. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  147. def loggingMaxLinesForCodegen: Int
  148. def manageFilesourcePartitions: Boolean
  149. def maxBatchesToRetainInMemory: Int
  150. def maxNestedViewDepth: Int
  151. def maxPlanStringLength: Int
  152. def maxRecordsPerFile: Long
  153. def maxToStringFields: Int
  154. def metastorePartitionPruning: Boolean
  155. def methodSplitThreshold: Int
  156. def minBatchesToRetain: Int
  157. def nameNonStructGroupingKeyAsValue: Boolean
  158. def ndvMaxError: Double
  159. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  160. def nestedPruningOnExpressions: Boolean
  161. def nestedSchemaPruningEnabled: Boolean
  162. def nonEmptyPartitionRatioForBroadcastJoin: Double
  163. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  164. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  165. def numShufflePartitions: Int
  166. def objectAggSortBasedFallbackThreshold: Int
  167. def offHeapColumnVectorEnabled: Boolean
  168. def optimizerExcludedRules: Option[String]
  169. def optimizerInSetConversionThreshold: Int
  170. def optimizerInSetSwitchThreshold: Int
  171. def optimizerMaxIterations: Int
  172. def optimizerMetadataOnly: Boolean
  173. def optimizerPlanChangeBatches: Option[String]
  174. def optimizerPlanChangeLogLevel: String
  175. def optimizerPlanChangeRules: Option[String]
  176. def orcCompressionCodec: String
  177. def orcFilterPushDown: Boolean
  178. def orcVectorizedReaderBatchSize: Int
  179. def orcVectorizedReaderEnabled: Boolean
  180. def orderByOrdinal: Boolean
  181. def pandasGroupedMapAssignColumnsByName: Boolean
  182. def pandasUDFBufferSize: Int
  183. def parallelFileListingInStatsComputation: Boolean
  184. def parallelPartitionDiscoveryParallelism: Int
  185. def parallelPartitionDiscoveryThreshold: Int
  186. def parquetCompressionCodec: String
  187. def parquetFilterPushDown: Boolean
  188. def parquetFilterPushDownDate: Boolean
  189. def parquetFilterPushDownDecimal: Boolean
  190. def parquetFilterPushDownInFilterThreshold: Int
  191. def parquetFilterPushDownStringStartWith: Boolean
  192. def parquetFilterPushDownTimestamp: Boolean
  193. def parquetOutputCommitterClass: String
  194. def parquetOutputTimestampType: SQLConf.ParquetOutputTimestampType.Value
  195. def parquetRecordFilterEnabled: Boolean
  196. def parquetVectorizedReaderBatchSize: Int
  197. def parquetVectorizedReaderEnabled: Boolean
  198. def partitionColumnTypeInferenceEnabled: Boolean
  199. def partitionOverwriteMode: SQLConf.PartitionOverwriteMode.Value
  200. def percentileAccuracy: Int
  201. def planStatsEnabled: Boolean
  202. def preferSortMergeJoin: Boolean
  203. def pysparkJVMStacktraceEnabled: Boolean
  204. def rangeExchangeSampleSizePerPartition: Int
  205. val reader: ConfigReader
    Attributes
    protected
  206. def redactOptions[K, V](options: Map[K, V]): Map[K, V]

    Redacts the given option map according to the description of SQL_OPTIONS_REDACTION_PATTERN.

  207. def replEagerEvalMaxNumRows: Int
  208. def replEagerEvalTruncate: Int
  209. def replaceDatabricksSparkAvroEnabled: Boolean
  210. def replaceExceptWithFilter: Boolean
  211. def resolver: Resolver

    Returns the Resolver for the current configuration, which can be used to determine if two identifiers are equal.

  212. def runSQLonFile: Boolean
  213. def serializerNestedSchemaPruningEnabled: Boolean
  214. def sessionLocalTimeZone: String
  215. def setCommandRejectsSparkCoreConfs: Boolean
  216. def setConf[T](entry: ConfigEntry[T], value: T): Unit

    Set the given Spark SQL configuration property.

  217. def setConf(props: Properties): Unit

    Set Spark SQL configuration properties.

  218. def setConfString(key: String, value: String): Unit

    Set the given Spark SQL configuration property using a string value.

  219. def setConfWithCheck(key: String, value: String): Unit
    Attributes
    protected
  220. def setOpsPrecedenceEnforced: Boolean
  221. val settings: Map[String, String]

    Only a low degree of contention is expected for conf, thus NOT using ConcurrentHashMap.

    Attributes
    protected[spark]
  222. def sortBeforeRepartition: Boolean
  223. def sortMergeJoinExecBufferInMemoryThreshold: Int
  224. def sortMergeJoinExecBufferSpillThreshold: Int
  225. def starSchemaDetection: Boolean
  226. def starSchemaFTRatio: Double
  227. def stateStoreMinDeltasForSnapshot: Int
  228. def stateStoreProviderClass: String
  229. def storeAssignmentPolicy: SQLConf.StoreAssignmentPolicy.Value
  230. def streamingFileCommitProtocolClass: String
  231. def streamingMetricsEnabled: Boolean
  232. def streamingNoDataMicroBatchesEnabled: Boolean
  233. def streamingNoDataProgressEventInterval: Long
  234. def streamingPollingDelay: Long
  235. def streamingProgressRetention: Int
  236. def streamingSchemaInference: Boolean
  237. def stringRedactionPattern: Option[Regex]
  238. def subexpressionEliminationEnabled: Boolean
  239. def subqueryReuseEnabled: Boolean
  240. def supportQuotedRegexColumnName: Boolean
  241. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  242. def tableRelationCacheSize: Int
  243. def toString(): String
    Definition Classes
    AnyRef → Any
  244. def topKSortFallbackThreshold: Int
  245. def truncateTableIgnorePermissionAcl: Boolean
  246. def unsetConf(entry: ConfigEntry[_]): Unit
  247. def unsetConf(key: String): Unit
  248. def useCompression: Boolean
  249. def useObjectHashAggregation: Boolean
  250. def validatePartitionColumns: Boolean
  251. def variableSubstituteEnabled: Boolean
  252. def verifyPartitionPath: Boolean
  253. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  254. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  255. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  256. def warehousePath: String
  257. def wholeStageEnabled: Boolean
  258. def wholeStageMaxNumFields: Int
  259. def wholeStageSplitConsumeFuncByOperator: Boolean
  260. def wholeStageUseIdInClassName: Boolean
  261. def windowExecBufferInMemoryThreshold: Int
  262. def windowExecBufferSpillThreshold: Int
  263. def writeLegacyParquetFormat: Boolean
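
The three getConf lookup behaviours documented above (an optional entry yields None when unset, a regular entry falls back to its ConfigEntry default, and a caller-supplied default overrides that) can be sketched as follows. MiniSQLConf, ConfigEntry's shape, and getOptionalConf are simplified hypothetical stand-ins, not Spark's actual definitions.

```scala
// Hypothetical sketch of the getConf/setConf/unsetConf semantics documented
// above; this is NOT Spark's actual implementation.
import scala.collection.mutable

case class ConfigEntry(key: String, defaultValue: String)

class MiniSQLConf {
  private val settings = mutable.Map[String, String]()

  def setConf(entry: ConfigEntry, value: String): Unit = synchronized {
    settings(entry.key) = value
  }

  // getConf(entry): fall back to the entry's own default when the key is unset.
  def getConf(entry: ConfigEntry): String = synchronized {
    settings.getOrElse(entry.key, entry.defaultValue)
  }

  // getConf(entry, defaultValue): use the caller-supplied default instead.
  def getConf(entry: ConfigEntry, defaultValue: String): String = synchronized {
    settings.getOrElse(entry.key, defaultValue)
  }

  // Optional-entry lookup: None when the key is not set.
  def getOptionalConf(entry: ConfigEntry): Option[String] = synchronized {
    settings.get(entry.key)
  }

  def unsetConf(entry: ConfigEntry): Unit = synchronized {
    settings -= entry.key
  }
}

val entry = ConfigEntry("spark.sql.example.flag", "false")
val conf = new MiniSQLConf
println(conf.getConf(entry))          // prints false (entry default)
println(conf.getConf(entry, "true"))  // prints true (explicit default)
println(conf.getOptionalConf(entry))  // prints None
conf.setConf(entry, "true")
println(conf.getConf(entry))          // prints true
```

This mirrors the distinction the docs draw between getConf(entry), getConf(entry, defaultValue), and getConf on an OptionalConfigEntry.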
