
Re: [geomesa-users] ingesting and exposing polygon data, is this supported?

Hi Diane,

I meant that the line numbers further down the stacktrace don't match up to the method calls in 1.2.3. If you ingested using 1.2.0, then that would explain the mismatch.

The bug you're hitting with the "can't handle binding" was recently fixed, but it exists in 1.2.3 and prevents ingest when running against an older schema. There's a work-around, which I will add at the bottom of this email.

As far as updating geomesa, you need to ensure that *every* jar interacting with your cluster is the same version. We haven't been explicit about updating ingest code, but that should also match versions with geoserver and with the accumulo cluster. (In some cases, such as if you are only ever appending data and never updating existing data, the old ingest code will continue to work, but we don't recommend running this way.) If you create a schema using a more recent version of geomesa and then try to write (ingest) with an older version, that will not work, as the old code doesn't contain the new logic.

You don't need to run any kind of migration when updating versions, but you will continue to use the old code paths. For example, the bug you're hitting with the polygon inserts is in our geohash index, which has recently been replaced with a z-index. Since your schema was created with an older version, it's still using the old (buggy) code, even after updating. If you create a new schema, your data will be written in the new format and you won't hit the bug, which is why using a new table works. If you want to migrate old data to the new index, you can run a migration job to do so.

Work around for "can't handle binding" bug:

Go into the accumulo shell and navigate to the catalog table for geomesa. You should see something like this:

    user@cloud geomesa_catalog > scan
    ~METADATA_AccumuloQuickStart attributes: [] Who:String:index=full,What:Long,When:Date,*Where:Point:srid=4326:index=full,Why:String

The key point is to remove the 'index' flags on your geometry attribute. You can do that with something like this:

    user@cloud geomesa_catalog > insert '~METADATA_AccumuloQuickStart' 'attributes' '' 'Who:String:index=full,What:Long,When:Date,*Where:Point:srid=4326,Why:String'

Your command will vary, of course. Note that you need an empty quoted string to account for the empty column qualifier. For the attributes, just copy the existing string, but remove any 'index' key pairs associated with the geometry.
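If you have many schemas to fix up, the spec-string editing can be sketched in plain Java. This is a hypothetical helper, not a GeoMesa API; the spec string in the example is the quick-start one from above, so adapt it to your own schema before pasting the result back into the shell:

```java
// Sketch of the workaround: given the attribute spec string from the
// catalog row, drop any ':index=...' hint on the default geometry
// attribute (the one prefixed with '*'), leaving other attributes alone.
public class StripGeomIndex {
    static String stripGeometryIndexHints(String spec) {
        StringBuilder out = new StringBuilder();
        for (String attr : spec.split(",")) {
            if (attr.startsWith("*")) {
                // rebuild the geometry attribute without index hints
                StringBuilder geom = new StringBuilder();
                for (String part : attr.split(":")) {
                    if (!part.startsWith("index=")) {
                        if (geom.length() > 0) geom.append(":");
                        geom.append(part);
                    }
                }
                attr = geom.toString();
            }
            if (out.length() > 0) out.append(",");
            out.append(attr);
        }
        return out.toString();
    }

    public static void main(String[] args) {
        String spec = "Who:String:index=full,What:Long,When:Date,"
                + "*Where:Point:srid=4326:index=full,Why:String";
        // prints the spec with the geometry's index hint removed
        System.out.println(stripGeometryIndexHints(spec));
    }
}
```

The output is what you would paste as the value in the accumulo shell insert command above.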

Thanks,

Emilio


On 07/06/2016 10:59 AM, Diane Griffith wrote:
Emilio,

The line in that stacktrace does exist in the 1.2.3 source code, and in the named method of that stacktrace in that source.  Were you asking how I even got to that section of the code?  It suggested I was ingesting against old code/geomesa jars.  Yes, the test ingest application was reverted to the geomesa 1.2.0 code base (b/c of that one cloudera manager instance that I have not gotten to upgrade happily yet).  Anyway, changing the test ingest application to use 1.2.3 then triggered a new error:

scala.NotImplementedError: Can't handle binding of type class com.vividsolutions.jts.geom.Polygon

at org.locationtech.geomesa.accumulo.data.stats.GeoMesaStats$.defaultPrecision(GeoMesaStats.scala:170)
at org.locationtech.geomesa.accumulo.data.stats.GeoMesaMetadataStats$$anonfun$26.apply(GeoMesaMetadataStats.scala:350)
...

This was against an existing tableName/schema for a simple polygon insert test that I had been trying to get to insert happily, which it never did (per the original stacktrace).

I also had set up a simple point ingest test off of the same test data, which had inserted the test point record fine yesterday.  (Yes, I have been happily ingesting points all along, but it was a complementary test set alongside the test to insert a polygon.)  So I tried to ingest that test point data into that same test point schema now that I had the ingest code upgraded to 1.2.3.  I then ran into the same error and the ingest failed:

scala.NotImplementedError: Can't handle binding of type class com.vividsolutions.jts.geom.Point

at org.locationtech.geomesa.accumulo.data.stats.GeoMesaStats$.defaultPrecision(GeoMesaStats.scala:170)
at org.locationtech.geomesa.accumulo.data.stats.GeoMesaMetadataStats$$anonfun$26.apply(GeoMesaMetadataStats.scala:350)
...

Anyway, with the test ingest application using the geomesa 1.2.3 code, I had to ingest into a new table name to get both the test point data and the test polygon data to ingest. Once I did that, they both inserted fine with no errors.

I thought I heard that as we upgrade geomesa we would not need to ingest into a new table/schema, and would not need to run a job to convert stats; that we would only need to do something proactively if we wanted to gain the benefit of new indexing enhancements.  I may have misunderstood the implications of upgrading.  But for sure, after I upgraded the accumulo instance to 1.2.3 but left the ingest application at geomesa 1.2.0, I could only ingest Point data.  Once I upgraded the test ingest application from 1.2.0 to 1.2.3, I could not ingest either Point or Polygon data to an existing table/schema.

This brings up the following questions:

1. Why is it a problem to ingest into an upgraded 1.2.3 instance with 1.2.0 jars?
2. What is the upgrade process?  What am I missing?
      a. Have an accumulo 1.6.x instance at geomesa 1.2.x and ingest point and polygon data using geomesa 1.2.x jars to testPoint and testPolygon (tables/schemas/etc.) fine.
      b. geomesa 1.2.y releases and one wants to upgrade to 1.2.y.  Install the upgraded jars on accumulo and geoserver respectively.
      c. One needs to keep ingesting new data daily into testPoint and testPolygon, so what does one do?  Upgrade their ingest code to use geomesa 1.2.y jars?  If so, do you need to convert the existing tables/schemas by running a job?

We currently do not use namespaces but I am not sure that would have solved whatever this issue is.

What should I have done differently to ingest data into that existing testPoint table/schema after the upgrade?  I'm pretty sure I am going to find, for my existing "real" data, that I can no longer ingest point data and will get the same error.  So I need to figure out if I can fix the existing tables, or if I need to drop and re-ingest the existing data.

Before I try the catalog delete, I guess what I'm saying is that I want to know what steps I should try in order to continue ingesting against the old schemas.  I'm unclear on what the valid upgrade steps are, or whether I skipped one.

Thanks,
Diane
-----Original Message-----
From: geomesa-users-bounces@xxxxxxxxxxxxxxxx [mailto:geomesa-users-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Emilio Lahr-Vivaz
Sent: Wednesday, July 06, 2016 8:51 AM
To: geomesa-users@xxxxxxxxxxxxxxxx
Subject: Re: [geomesa-users] ingesting and exposing polygon data, is this supported?

Hi Diane,

Those line numbers in the stack trace don't seem to match up with the
1.2.3 tag - are you sure that all your jars are updated and matching?
Additionally, the code in question (SpatioTemporalTable) shouldn't be used in 1.2.3 unless you originally called createSchema with an older version of geomesa.

Could you try deleting the catalog table (or manually deleting the rows with that simple feature type), double check your jars, and re-run your test?

We're still looking into the issue you're hitting, though, as it will likely affect users on older versions.

Thanks,

Emilio

On 07/06/2016 08:08 AM, Diane Griffith wrote:
I did forget to mention one more thing I tried: I had also set the schema to generic Geometry, but got the same error.

Diane

-----Original Message-----
From: geomesa-users-bounces@xxxxxxxxxxxxxxxx
[mailto:geomesa-users-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Chris
Eichelberger
Sent: Tuesday, July 05, 2016 4:55 PM
To: Geomesa User discussions
Subject: Re: [geomesa-users] ingesting and exposing polygon data, is this supported?

Diane,

Thank you for the very detailed report!  We will look into this on GeoMesa 1.2.3, and expect to write you back tomorrow morning.

Sincerely,
    -- Chris


On Tue, 2016-07-05 at 20:29 +0000, Diane Griffith wrote:
So I added in logging. If I set the schema as follows:

fooStr:String,fooDate:Date,barStr:String:index=full,fooDouble:Double,barDouble:Double,uuid:String,*fooPoly:Polygon:srid=4326

And then in the code I take a string of point pairs in the example format of:

POLYGON((-120 45, -120 50, -125 50, -125 45, -120 45))

(so I used the polygon from the test class you referenced). And then I printed out the attribute value of fooPoly, and it is set as the string above. And I printed out the value of fooDate, which was of a format similar to:

Mon Jul 04 06:30:27 GMT 2016

But when I try to insert the feature, I get a stack trace on 1.2.3 that looks similar to the following:
java.lang.NullPointerException
    at org.locationtech.geomesa.utils.geohash.GeohashUtils$.getInternationalDateLineSafeGeometry(GeohashUtils.scala:766)
    at org.locationtech.geomesa.utils.geohash.GeohashUtils$.decomposeGeometry(GeohashUtils.scala:790)
    at org.locationtech.geomesa.accumulo.index.STIndexEncoder.encode(STIndexEntry.scala:50)
    at org.locationtech.geomesa.accumulo.data.tables.SpatioTemporalTable$$anonfun$writer$1.apply(SpatioTemporalTable.scala:40)
…
Also, I had set the following things previously for my feature:

SimpleFeatureType.getUserData().put(Constants.SF_PROPERTY_START_TIME, "fooDate");
feature.getUserData().put(Hints.USE_PROVIDED_FID, true);
feature.getUserData().put(Hints.PROVIDED_FID, "uuidFieldValue");

If I changed it back to the schema inserting a Point field, i.e.:

fooStr:String,fooDate:Date,barStr:String:index=full,fooDouble:Double,barDouble:Double,uuid:String,*fooPoint:Point:srid=4326

And then in the code I take a lon and lat and generate a point:

POINT(-120 45)

Then it inserts fine. Any idea what may be causing that exception?

Thanks,
Diane
From: geomesa-users-bounces@xxxxxxxxxxxxxxxx [mailto:geomesa-users-bounces@xxxxxxxxxxxxxxxx] On Behalf Of Emilio Lahr-Vivaz
Sent: Friday, July 01, 2016 3:17 PM
To: geomesa-users@xxxxxxxxxxxxxxxx
Subject: Re: [geomesa-users] ingesting and exposing polygon data, is this supported?
Hi Diane,

What behavior are you observing?

What you're doing should work fine, and we have lots of unit tests
that use polygon shapes. They're written in scala, but for example:

https://github.com/locationtech/geomesa/blob/master/geomesa-accumulo/geomesa-accumulo-datastore/src/test/scala/org/locationtech/geomesa/accumulo/data/AccumuloDataStoreFilterTest.scala

(in this test the binding is a generic geometry - you should stick
with Polygon if that is your shape)

One way to check that the geometry is being set correctly is to print
out the attribute from the simple feature after setting it. Usually
geotools will convert it to the correct type, and if it can't do so
the attribute will be null, e.g.

SimpleFeature feature = featureBuilder.build();
System.out.println(feature.getAttribute(polyFieldName));

I also find this website useful for checking WKT syntax:

http://arthur-e.github.io/Wicket/sandbox-gmaps3.html

Thanks,

Emilio

On 07/01/2016 02:58 PM, Diane Griffith wrote:
I am trying to debug if I ingested polygon data correctly.  I’d like
to ingest a polygon and expose it via geomesa/geoserver.
Is there a trick to this? I am not sure it actually created the expected indexes.
I was going to have the regular point data and then have a separate
data set that had the associated polygons to turn them on and off.
I defined a field of *polygonField:Polygon:srid=4326 in the schema. I then created a polygon via geotools out of arrays of points to insert with the feature data. (i.e. one way I tried was using the following classes:

WKTReader reader = new WKTReader(geometryFactory);
String coordinates = "///arrays of point values///";
StringBuffer polyStringBuffer = new StringBuffer("POLYGON((").append(coordinates).append("))");
Polygon polygon = (Polygon) reader.read(polyStringBuffer.toString());
featureBuilder.set(polyFieldName, polygon);
…)

I had tried a coordinates array as well, and using a LinearRing and then using that to create the Polygon.
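For what it's worth, the WKT-string construction above can be sketched without any geotools dependency. This is a minimal sketch with a hypothetical helper name (polygonWkt); the one structural rule a WKT parser will enforce on the outer ring is that it must be closed, i.e. the last coordinate pair repeats the first:

```java
// Build a POLYGON WKT string from lon/lat pairs, closing the ring
// automatically if the caller didn't repeat the first point.
public class PolygonWkt {
    static String polygonWkt(double[][] coords) {
        StringBuilder sb = new StringBuilder("POLYGON((");
        for (int i = 0; i < coords.length; i++) {
            if (i > 0) sb.append(", ");
            sb.append(coords[i][0]).append(" ").append(coords[i][1]);
        }
        double[] first = coords[0];
        double[] last = coords[coords.length - 1];
        if (first[0] != last[0] || first[1] != last[1]) {
            // close the ring: repeat the first coordinate pair
            sb.append(", ").append(first[0]).append(" ").append(first[1]);
        }
        return sb.append("))").toString();
    }

    public static void main(String[] args) {
        // the test polygon from earlier in the thread (ring left open on
        // purpose; the helper closes it)
        double[][] ring = { {-120, 45}, {-120, 50}, {-125, 50}, {-125, 45} };
        System.out.println(polygonWkt(ring));
        // prints POLYGON((-120.0 45.0, -120.0 50.0, -125.0 50.0, -125.0 45.0, -120.0 45.0))
    }
}
```

The resulting string is what would be handed to WKTReader.read(...) as in the snippet above; an unclosed ring is a common cause of parse or insert failures.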
Did I need to mess with any underlying indexes, or am I out of luck ingesting them via geomesa and exposing them through geoserver?
So in this case I don't put in an associated point with the polygon; is that a potential problem area? If there is a good example to better understand how to do something similar, I'd love to review it.
Thanks,
Diane



_______________________________________________
geomesa-users mailing list
geomesa-users@xxxxxxxxxxxxxxxx
To change your delivery options, retrieve your password, or
unsubscribe from this list, visit
https://www.locationtech.org/mailman/listinfo/geomesa-users


