
[geomesa-users] Problem with spark example

I'm working with Spark and attempting to reproduce the Spark example (http://www.geomesa.org/2014/08/05/spark/), but I'm running into a problem.

The example creates a DataStore using:

val ds = DataStoreFinder.getDataStore(params).asInstanceOf[AccumuloDataStore]
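
For context, here is roughly how I'm building params. The connection values below are placeholders for my cluster, and the keys are the ones I believe the Accumulo datastore factory expects:

    import scala.collection.JavaConverters._
    import org.geotools.data.DataStoreFinder
    import org.locationtech.geomesa.core.data.AccumuloDataStore

    // Placeholder connection values for my cluster
    val params = Map[String, java.io.Serializable](
      "instanceId" -> "myInstance",
      "zookeepers" -> "zoo1,zoo2,zoo3",
      "user"       -> "myUser",
      "password"   -> "myPassword",
      "tableName"  -> "geomesa_catalog"
    ).asJava

    val ds = DataStoreFinder.getDataStore(params).asInstanceOf[AccumuloDataStore]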

It then passes ds into the init function in compute.spark.GeoMesaSpark. That init function calls ds.getSchema, and this is where the problem occurs.
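
For reference, here is how I'm invoking it, modeled on the blog post (the master URL and app name are placeholders):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.locationtech.geomesa.compute.spark.GeoMesaSpark

    // init registers GeoMesa's serializers against the schemas in ds
    // and returns the configuration to build the SparkContext from;
    // ds.getSchema is called inside init.
    val sparkConf = GeoMesaSpark.init(new SparkConf(true), ds)
    val sc = new SparkContext("local[*]", "geomesa-spark-test", sparkConf)

I'm getting the following error: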

 

Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.locationtech.geomesa.core.data.AccumuloDataStore$$anonfun$getSchema$2.apply(AccumuloDataStore.scala:712)
    at org.locationtech.geomesa.core.data.AccumuloDataStore$$anonfun$getSchema$2.apply(AccumuloDataStore.scala:703)
    at scala.Option.map(Option.scala:145)
    at org.locationtech.geomesa.core.data.AccumuloDataStore.getSchema(AccumuloDataStore.scala:703)
    at org.locationtech.geomesa.core.data.AccumuloDataStore.getSchema(AccumuloDataStore.scala:701)

After some debugging, I've narrowed the error down to the use of the core.index package object. If you look inside the getSchema method in AccumuloDataStore, you'll see the package object is referenced several times to retrieve a few strings and to call a function.

It seems as though the core.index package object hasn't been initialized.
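
To illustrate what I mean, here is a minimal sketch (not GeoMesa code) of the failure mode I suspect. A Scala package object compiles to a class whose vals are assigned in a static initializer, so if anything in that initializer throws, the JVM reports ExceptionInInitializerError at the first line that touches the package object:

    // Hypothetical package object whose initializer can fail.
    package object demo {
      // Evaluated once, in the package object's static initializer.
      // If this throws (e.g. a missing property or classpath resource),
      // the initializer fails.
      val requiredSetting: String =
        sys.props.getOrElse("demo.setting", sys.error("demo.setting not set"))
    }

    object Main extends App {
      // The first access runs the package object's initializer; if it
      // throws, this line reports java.lang.ExceptionInInitializerError.
      println(demo.requiredSetting)
    }

That pattern is why I suspect something core.index needs isn't available when getSchema runs from my Spark job.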

Any ideas on how I can get getSchema to work? Also, would it be possible for you to post a complete, working version of the Spark example on GitHub?

 

Thanks

-- Adam

 


Sent from my iPad
