
Re: [geomesa-users] Problem with spark example

I've been working through the error and discovered it was due to conflicting versions of Joda-Time: my Spark job required 2.4, while GeoMesa requires 2.3. When I packaged my job into an uber jar, only 2.4 was included. This is an issue because Joda-Time 2.4 fails when a DateTime is initialized with Long.MinValue or Long.MaxValue (see https://github.com/JodaOrg/joda-time/issues/190), and that is exactly how a few DateTime objects are initialized in the org.locationtech.geomesa.core.index package object (lines 36/37).
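For anyone hitting the same conflict, here is a minimal reproduction sketch, assuming joda-time 2.4 is on the classpath (the object name is mine, just for the example):

import org.joda.time.DateTime

object JodaExtremeInstants extends App {
  // Under joda-time 2.4 these constructors overflow internally and throw
  // (joda-time issue #190); 2.3 accepted the extreme instants. GeoMesa's
  // core.index package object builds DateTimes the same way, so with 2.4
  // in the uber jar its static initialization fails.
  val minDate = new DateTime(Long.MinValue)
  val maxDate = new DateTime(Long.MaxValue)
  println(s"$minDate .. $maxDate")
}

One way to avoid the conflict is to pin the version GeoMesa expects when assembling the uber jar, e.g. dependencyOverrides += "joda-time" % "joda-time" % "2.3" in an sbt build (assuming sbt; a Maven build would pin it via dependencyManagement).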

Thanks
-- Adam
 

On Fri, Nov 21, 2014 at 9:17 PM, Adam Fraser <tigclem00@xxxxxxxxx> wrote:

I'm working with Spark and attempting to emulate the Spark example (http://www.geomesa.org/2014/08/05/spark/), but I'm having problems.

The example creates a dataStore using:

val ds = DataStoreFinder.getDataStore(params).asInstanceOf[AccumuloDataStore]
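
For reference, here is a sketch of the kind of params map being passed in; the keys follow the GeoMesa Accumulo data store convention, but the values here are placeholders for my environment:

import scala.collection.JavaConverters._
import org.geotools.data.DataStoreFinder
import org.locationtech.geomesa.core.data.AccumuloDataStore

// Placeholder Accumulo connection parameters (values are environment-specific)
val params = Map[String, java.io.Serializable](
  "instanceId" -> "myCloud",
  "zookeepers" -> "zoo1:2181,zoo2:2181,zoo3:2181",
  "user"       -> "user",
  "password"   -> "password",
  "tableName"  -> "geomesa_catalog"
).asJava

val ds = DataStoreFinder.getDataStore(params).asInstanceOf[AccumuloDataStore]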

It then sends ds into the init function found in compute.spark.GeoMesaSpark, and that init function calls ds.getSchema. This is where the problem occurs. I'm getting the following error:

Exception in thread "main" java.lang.ExceptionInInitializerError

at org.locationtech.geomesa.core.data.AccumuloDataStore$$anonfun$getSchema$2.apply(AccumuloDataStore.scala:712)

at org.locationtech.geomesa.core.data.AccumuloDataStore$$anonfun$getSchema$2.apply(AccumuloDataStore.scala:703)

at scala.Option.map(Option.scala:145)

at org.locationtech.geomesa.core.data.AccumuloDataStore.getSchema(AccumuloDataStore.scala:703)

at org.locationtech.geomesa.core.data.AccumuloDataStore.getSchema(AccumuloDataStore.scala:701)

After some debugging, I've pretty much determined that this error occurs because of the use of the core.index package object. If you look inside the getSchema function in AccumuloDataStore, you'll see the package object is used several times to retrieve a few strings and to call a function.

It's as if the core.index package object hasn't been initialized.
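
To illustrate, here's a toy sketch (hypothetical names) of how a val that throws during a package object's initialization surfaces as an ExceptionInInitializerError on first access:

package object demo {
  // If this initializer throws, the JVM wraps the failure in an
  // ExceptionInInitializerError when the package object is first touched.
  val BrokenVal: String = sys.error("fails during static initialization")
}

object Main extends App {
  // The first access triggers the package object's initializer and fails
  // with java.lang.ExceptionInInitializerError.
  println(demo.BrokenVal)
}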

Any ideas on how I can get getSchema to work? Also, would it be possible for you to post a complete solution to the Spark example on GitHub?

 

Thanks

-- Adam

 


Sent from my iPad

