Hi Jim,
thanks a lot for the detailed answers.
To output the table created by Spark, are you using the 'save' function in GeoMesaSpark (1)? (I'm wondering if there might be a bug with this function.) Before calling 'save' or using a similar function on an RDD, I'd suggest making sure to create the DataStore and call createSchema in a single-threaded context. (Mainly, I want to make sure that there aren't issues writing the SimpleFeatureType/FeatureSource metadata to Accumulo.)
I actually don't use the save function. I followed the tutorial and do the following:
val store = DataStoreFinder.getDataStore(paramsInsert).asInstanceOf[AccumuloDataStore]
val attributes = Lists.newArrayList("analysis_id:String:index=full", "*position:Point:srid=4326", "grade:Integer")

// Get the featureType or create it from the above attributes
var featureType = store.getSchema(storageTypeName)
if (featureType == null) {
  val featureTypeAttr = String.join(", ", attributes)
  val simpleFeatureType = SimpleFeatureTypes.createType(storageTypeName, featureTypeAttr)
  store.createSchema(simpleFeatureType)
  featureType = store.getSchema(storageTypeName)
}

val sfBuilder = new SimpleFeatureBuilder(featureType)
val sfList = Lists.newArrayList[SimpleFeature]()

// Ingest just 5 entries, for testing (the save function makes an array
// from the attributes of the element); foreach instead of map, since we
// only want the side effect of adding to sfList
blocks.take(5).foreach(t => sfList.add(sfBuilder.buildFeature(t.analysis_id, t.save())))

val collection = new ListFeatureCollection(featureType, sfList)
val featureStore: SimpleFeatureStore = store.getFeatureSource(storageTypeName).asInstanceOf[SimpleFeatureStore]
featureStore.addFeatures(collection)
Regarding the other questions:
3) The error still happens with just that one line of data and the three attributes, so the data itself cannot be the problem.
However, I liked your suggestion of exporting and re-importing the table. When I tried exporting, I noticed that after the first time it does not export everything, only one or two records. I then checked geomesa.log and saw that each attempt failed with:

org.apache.accumulo.core.client.impl.AccumuloServerException: Error on server master1.gt:9997
    at org.apache.accumulo.core.client.impl.TabletServerBatchReaderIterator.doLookup(TabletServerBatchReaderIterator.java:695)
    at org.apache.accumulo.core.client.impl.TabletServerBatchReaderIterator$QueryTask.run(TabletServerBatchReaderIterator.java:349)
    at org.apache.htrace.wrappers.TraceRunnable.run(TraceRunnable.java:57)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at org.apache.accumulo.fate.util.LoggingRunnable.run(LoggingRunnable.java:35)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.thrift.TApplicationException: Internal error processing startMultiScan
    at org.apache.thrift.TApplicationException.read(TApplicationException.java:111)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:71)
    at org.apache.accumulo.core.tabletserver.thrift.TabletClientService$Client.recv_startMultiScan(TabletClientService.java:317)
    at org.apache.accumulo.core.tabletserver.thrift.TabletClientService$Client.startMultiScan(TabletClientService.java:297)
    at org.apache.accumulo.core.client.impl.TabletServerBatchReaderIterator.doLookup(TabletServerBatchReaderIterator.java:634)
    ... 6 more
I also tried creating an empty table using the geomesa-tools command from the GitHub page. If I create a layer on this empty table, I still see the same error (or might that be a problem of accessing an empty table?). So right now I really think it has to be something related to GeoServer or the adapter.
Regards, Nico
Hi Nico,
First, sorry for the trouble. I do have a few suggestions, and
I'll break them out in sections.
Versions: Which version of GeoMesa and GeoServer are you using?
For Spark:
To output the table created by Spark, are you using the 'save'
function in GeoMesaSpark (1)? (I'm wondering if there might be a
bug with this function.) Before calling 'save' or using a similar
function on an RDD, I'd suggest making sure to create the DataStore
and call createSchema in a single-threaded context. (Mainly, I want
to make sure that there aren't issues writing the
SimpleFeatureType/FeatureSource metadata to Accumulo.)
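The advice above amounts to: do the getSchema/createSchema check exactly once, single-threaded on the driver, before handing anything to the RDD. A minimal self-contained sketch of that check-then-create-once pattern (the `Store` trait and `FakeStore` here are illustrative stand-ins, not GeoMesa types):

```scala
// Stand-in for a DataStore: getSchema returns null until createSchema runs.
trait Store {
  def getSchema(name: String): String
  def createSchema(spec: String): Unit
}

class FakeStore extends Store {
  private var schemas = Map.empty[String, String]
  def getSchema(name: String): String = schemas.getOrElse(name, null)
  def createSchema(spec: String): Unit = {
    val name = spec.takeWhile(_ != '=')
    schemas += (name -> spec)
  }
}

// Run once, single-threaded, on the driver -- before any RDD work.
// Idempotent: a second call finds the existing schema and creates nothing.
def ensureSchema(store: Store, name: String, spec: String): String = {
  var featureType = store.getSchema(name)
  if (featureType == null) {
    store.createSchema(s"$name=$spec")
    featureType = store.getSchema(name)
  }
  featureType
}

val store = new FakeStore
val spec  = "analysis_id:String,*position:Point:srid=4326,grade:Integer"
val ft1   = ensureSchema(store, "blocks", spec)
val ft2   = ensureSchema(store, "blocks", spec) // no second createSchema
```

With the real AccumuloDataStore the same shape applies: resolve the schema on the driver first, then let the distributed writers assume it exists.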
1.
https://github.com/locationtech/geomesa/blob/master/geomesa-compute/src/main/scala/org/locationtech/geomesa/compute/spark/GeoMesaSpark.scala#L136
For GeoServer:
Since you are seeing that exception in the web layer, I'd expect to
see it in the GeoServer logs. Which J2EE container are you using?
JBoss, Tomcat? By chance, is there more of the stack trace in those
logs?
Generally:
Are any of the fields in the SimpleFeatures null or empty? Is there
any namespacing or anything interesting happening with the
SimpleFeatureType? (E.g., are there special characters in the
attribute names? A ':' would definitely throw things for a loop.)
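To illustrate why a ':' in an attribute name is trouble: the spec strings are colon-delimited ("name:Type:option"), so an extra colon shifts every field. A naive sketch of that failure mode (this is not GeoMesa's actual parser, just the same delimiting assumption):

```scala
// Naive colon-delimited attribute parsing: name, then type.
def parseAttribute(spec: String): (String, String) = {
  val parts = spec.split(":")
  (parts(0), parts(1))
}

val good = parseAttribute("grade:Integer")    // name and type as intended
val bad  = parseAttribute("my:grade:Integer") // ':' in the name shifts the fields
```

Here `bad` comes back as ("my", "grade"), and the real type "Integer" is silently lost.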
As a suggestion which is part work-around and part debugging, could
you try exporting the data using the tools and then immediately
re-importing the data into a new table? If something breaks during
re-ingest, a particular record may show the issue. If the entire
layer is still having issues in GeoServer, it may be the
SimpleFeatureType or other general metadata.
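A rough sketch of that export/re-import round trip with the command-line tools; the connection flags below are assumptions from memory and vary by version, so check `geomesa export --help` and `geomesa ingest --help` for your install:

```shell
# Export the suspect layer to CSV (user/instance/catalog names are placeholders)
geomesa export -u root -p secret -i myinstance -z zoo1,zoo2 \
  -c mycatalog -f myfeature -F csv > myfeature.csv

# Re-ingest into a fresh catalog table; if a particular record breaks the
# ingest, that record is the likely culprit
geomesa ingest -u root -p secret -i myinstance -z zoo1,zoo2 \
  -c mycatalog_copy \
  -s "analysis_id:String:index=full,*position:Point:srid=4326,grade:Integer" \
  myfeature.csv
```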
Overall, I'm guessing this is a serialization issue. Hopefully we
can track it down!
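For what it's worth, the "walked off the end of the attribute array" shape of a serialization mismatch can be seen in miniature below. This is purely illustrative plain Scala, not GeoMesa's deserializer, and reading the "7 7" as index 7 against a 7-slot array is speculation on my part:

```scala
// A "serialized" feature with 7 attribute slots (indices 0..6), read back
// by code that believes the schema has 8 attributes.
val serialized: Array[Any] = Array.fill[Any](7)("value")

// Touching slot 7 throws ArrayIndexOutOfBoundsException, whose message
// carries the offending index -- the kind of number surfacing in the
// WFS ServiceException.
val failure =
  try { serialized(7); None }
  catch { case e: ArrayIndexOutOfBoundsException => Some(e.getMessage) }
```

If the reader's idea of the schema disagrees with what the writer serialized, decoding fails in exactly this way, which is why comparing the stored SimpleFeatureType metadata against what Spark wrote is worth doing.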
Jim
On 3/26/2016 8:39 AM, Nico Kreiling wrote:
Hello,
I have a strange error using the GeoMesa GeoServer plugin. My setup works for some tables; however, for one table I created using some Spark commands, I always get the following error if I base a GeoServer layer on it and try to access its data using WFS:
<ServiceException>java.lang.ArrayIndexOutOfBoundsException: 7 7</ServiceException>
Neither the GeoServer logs nor the Accumulo monitor show any related errors.
Also, the content of the table looks quite fine. Using geomesa-tools, describe gives me:

analysis_id: String (Indexed)
position: Point (ST-Geo-index) (Indexed)
grade: Integer

And export gives:

analysis_id,position,grade
134715,POINT (8.385753609107171 48.99513421696649),1
(Of course the data I really want to save has more entries and more attributes, but even after breaking it down to this, the error still occurs.)
In the GeoServer layer settings I only set the necessary bounding box and left all other options at their defaults.
Any ideas what I might test? Any ideas where to find more information on the error, and what the "7 7" in the error description might mean?
Thanks for the help, Nico
_______________________________________________
geomesa-users mailing list
geomesa-users@xxxxxxxxxxxxxxxx
To change your delivery options, retrieve your password, or unsubscribe from this list, visit
https://www.locationtech.org/mailman/listinfo/geomesa-users