org.locationtech.geomesa.accumulo.jobs.mapreduce
GeoMesaAccumuloInputFormat
Companion class GeoMesaAccumuloInputFormat
object GeoMesaAccumuloInputFormat extends LazyLogging
Linear Supertypes: LazyLogging, AnyRef, Any
Type Members
- class GeoMesaRecordReader extends RecordReader[Text, SimpleFeature]
Record reader that delegates to Accumulo record readers and transforms the key/values coming back into simple features.
- class GroupedSplit extends InputSplit with Writable
Input split that groups a series of RangeInputSplits. Has to implement Hadoop Writable, thus the vars and mutable state.
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- val SYS_PROP_SPARK_LOAD_CP: String
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @HotSpotIntrinsicCandidate() @native()
- def configure(conf: Configuration, params: Map[String, _], plan: AccumuloQueryPlan, auths: Option[Authorizations]): Unit
Configure the input format based on a query plan
- conf
configuration to update
- params
data store parameters
- plan
query plan
- auths
authorizations to use for scanning, if any
- def configure(conf: Configuration, params: Map[String, _], plan: AccumuloQueryPlan): Unit
Configure the input format based on a query plan
- conf
configuration to update
- params
data store parameters
- plan
query plan
- def configure(conf: Configuration, params: Map[String, _], query: Query): Unit
Configure the input format based on a query
- conf
configuration to update
- params
data store parameters
- query
query
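A minimal sketch of wiring this overload into a MapReduce job. The data store parameter keys and connection values below are placeholders (exact keys vary by GeoMesa version, e.g. `accumulo.instance.name` vs `instanceId`), and the `Query` import assumes a pre-GeoTools-30 package layout; adjust both for your environment:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io.Text
import org.apache.hadoop.mapreduce.Job
import org.geotools.data.Query
import org.geotools.filter.text.ecql.ECQL
import org.locationtech.geomesa.accumulo.jobs.mapreduce.GeoMesaAccumuloInputFormat
import org.opengis.feature.simple.SimpleFeature

// Placeholder connection parameters -- substitute values for your cluster
val params: Map[String, String] = Map(
  "accumulo.instance.name" -> "myInstance",
  "accumulo.zookeepers"    -> "zoo1,zoo2,zoo3",
  "accumulo.user"          -> "user",
  "accumulo.password"      -> "secret",
  "accumulo.catalog"       -> "geomesa.catalog"
)

val conf = new Configuration()

// CQL filter restricting the features that will be read
val query = new Query("mySchema", ECQL.toFilter("bbox(geom, -180, -90, 180, 90)"))

// Serializes the query into the Hadoop configuration for the input format
GeoMesaAccumuloInputFormat.configure(conf, params, query)

val job = Job.getInstance(conf, "geomesa-read")
job.setInputFormatClass(classOf[GeoMesaAccumuloInputFormat])
// Keys are Text and values are SimpleFeatures (see GeoMesaRecordReader above)
job.setMapOutputKeyClass(classOf[Text])
job.setMapOutputValueClass(classOf[SimpleFeature])
```

The other two overloads accept a pre-computed `AccumuloQueryPlan` instead of a `Query`, which avoids re-planning when the plan has already been generated by the data store.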
- def ensureSparkClasspath(): Unit
This takes any jars that have been loaded by spark in the context classloader and makes them available to the general classloader. This is required as not all classes (even spark ones) check the context classloader.
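A hedged sketch of when this would typically be invoked: once per executor task, before touching classes that bypass the context classloader. The `rdd` value is a hypothetical existing Spark RDD, not part of this API:

```scala
import org.locationtech.geomesa.accumulo.jobs.mapreduce.GeoMesaAccumuloInputFormat

// `rdd` is assumed to be an existing org.apache.spark.rdd.RDD[_]
rdd.foreachPartition { _ =>
  // Make jars loaded by Spark's context classloader visible to the general
  // classloader before using classes that do not check the context classloader
  GeoMesaAccumuloInputFormat.ensureSparkClasspath()
  // ... per-partition work ...
}
```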
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @HotSpotIntrinsicCandidate() @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @HotSpotIntrinsicCandidate() @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- lazy val logger: Logger
- Attributes
- protected
- Definition Classes
- LazyLogging
- Annotations
- @transient()
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @HotSpotIntrinsicCandidate() @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @HotSpotIntrinsicCandidate() @native()
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
Deprecated Value Members
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable]) @Deprecated
- Deprecated
(Since version 9)