Hi all,
I am working with a connector that fetches data from an external database and serializes it using a JsonBuilder.
The relevant excerpt of the Groovy script looks like this:

import groovy.json.JsonBuilder

try {
    List myList = new ArrayList()
    def builder = new JsonBuilder()
    String res = ""
    while (resultset.next()) {
        builder {
            id resultset.getObject(1).toString()
            name resultset.getObject(2).toString()
            surname resultset.getObject(3).toString()
            number resultset.getObject(4).toString()
        }
        res = res + builder.toString() + ", "
    }
    return res
} catch (Exception e) {
    // log or rethrow as appropriate
    throw e
}

Running it yields the following error:

2017-10-11 16:10:35.090 +0200 WARNING: org.hibernate.engine.jdbc.spi.SqlExceptionHelper SQL Error: 22001, SQLState: 22001
2017-10-11 16:10:35.119 +0200 SEVERE: org.hibernate.engine.jdbc.spi.SqlExceptionHelper Wert zu gross / lang für Feld "SHORTTEXTVALUE VARCHAR_IGNORECASE(255)": "CAST('[{""id"":""19"",""name"":""John"",""surname"":""Smith"",""number"":""123456789""}, {""id"":""20"",""name"":""Mark"",""surname"":""Suffer"",""numb... (294)"
Value too long for column "SHORTTEXTVALUE VARCHAR_IGNORECASE(255)": "CAST('[{""id"":""19"",""name"":""John"",""surname"":""Smith"",""number"":""123456789""}, {""id"":""20"",""name"":""Mark"",""surname"":""Suffer"",""numb... (294)"; SQL statement:
insert into arch_data_instance (name, description, transientData, className, containerId, containerType, archiveDate, sourceObjectId, shortTextValue, DISCRIMINANT, tenantId, id) values (?, ?, ?, ?, ?, ?, ?, ?, ?, 'SAShortTextDataInstanceImpl', ?, ?) [22001-175]
2017-10-11 16:10:35.241 +0200 SEVERE: org.hibernate.engine.transaction.synchronization.internal.SynchronizationCallbackCoordinatorNonTrackingImpl HHH000346: Error during managed flush [could not execute statement]

It seems to me that Bonita saves the data I fetch in its own "arch_data_instance" table, whose shortTextValue column is defined as VARCHAR(255). Therefore, when the fetched data is longer than 255 characters, the exception is raised.
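To illustrate (a self-contained sketch in plain Java, not the Bonita API, using hypothetical row values matching the log): each serialized row is about 65 characters, so only a handful of rows already pushes the concatenated string past the 255-character column limit.

```java
public class JsonLengthDemo {
    // Concatenate `rows` copies of a sample JSON row, mimicking the
    // `res = res + builder.toString() + ", "` loop in the Groovy script.
    static String concatRows(int rows) {
        String row = "{\"id\":\"19\",\"name\":\"John\",\"surname\":\"Smith\",\"number\":\"123456789\"}";
        StringBuilder res = new StringBuilder();
        for (int i = 0; i < rows; i++) {
            res.append(row).append(", ");
        }
        return res.toString();
    }

    public static void main(String[] args) {
        // Each row plus separator is 66 characters, so 4 rows give 264 > 255.
        System.out.println(concatRows(4).length()); // prints 264
    }
}
```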
Is my interpretation correct? Any workaround?

Thanks in advance.
