Parallel Multi-Instantiation & Boundary Events


We have a use case for parallel multi-instantiation with a variable number of tasks & possible actors (budgetary approval for a request)

One of the requirements is to let a user approve all of their possible line items from a single line item --- but they don't want all the possible line items batched together (see attached bonita1.png).

So, for non-multi-instantiated tasks we can handle this easily with a non-interrupting boundary timer on the human task that calls another process designed for this (see attached bonita3.png for a feel of what this does). The problem is that if a user says "auto approve" on the FIRST task while there are others in the list, the boundary event will never re-fire.

Any suggestions?
(Attached: bonita2.png)

Comments

Submitted by mrobinson_1362703 on Thu, 09/19/2019 - 22:31

Phew! I got it to work AND have it be a LOT less complicated than before.

1) Keep it as a regular parallel multi-instantiation (no need for a sub-process) -- this allows the documents to flow as expected.
2) Move the auto task completion to a Connector Out Groovy script (this never occurred to me -- thanks for the hint about that earlier) -- so there is no need to invoke the REST API at all to accomplish this!

/* Generate an arrayList of ALL notifications */

import java.time.LocalDateTime
import java.time.ZoneOffset

import org.bonitasoft.engine.api.ProcessAPI;
import org.bonitasoft.engine.api.IdentityAPI;
import org.bonitasoft.engine.bpm.flownode.HumanTaskInstance;
import org.bonitasoft.engine.bpm.flownode.HumanTaskInstanceSearchDescriptor
import org.bonitasoft.engine.bpm.parameter.ParameterCriterion
import org.bonitasoft.engine.bpm.parameter.ParameterInstance
import org.bonitasoft.engine.bpm.process.ProcessDefinition
import org.bonitasoft.engine.bpm.process.ProcessDeploymentInfo
import org.bonitasoft.engine.bpm.process.ProcessDeploymentInfoSearchDescriptor
import org.bonitasoft.engine.identity.User
import org.bonitasoft.engine.search.Order;
import org.bonitasoft.engine.search.SearchOptions;
import org.bonitasoft.engine.search.SearchOptionsBuilder;
import org.bonitasoft.engine.search.SearchResult;

import java.util.logging.Logger;

ProcessAPI processAPI = apiAccessor.getProcessAPI();
IdentityAPI identityAPI = apiAccessor.getIdentityAPI();
Logger logger=Logger.getLogger("org.bonitasoft");

/* Initialize the search builder for the humanTasks */
final SearchOptionsBuilder humanTaskSearchOptionsBuilder = new SearchOptionsBuilder(0, 5000);
humanTaskSearchOptionsBuilder.filter(HumanTaskInstanceSearchDescriptor.PROCESS_INSTANCE_ID,processInstanceId );
humanTaskSearchOptionsBuilder.or();
humanTaskSearchOptionsBuilder.filter(HumanTaskInstanceSearchDescriptor.PARENT_CONTAINER_ID, processInstanceId)
humanTaskSearchOptionsBuilder.or();
humanTaskSearchOptionsBuilder.filter(HumanTaskInstanceSearchDescriptor.ROOT_PROCESS_INSTANCE_ID, processInstanceId);
humanTaskSearchOptionsBuilder.sort(HumanTaskInstanceSearchDescriptor.DUE_DATE, Order.DESC);
SearchResult<HumanTaskInstance> allTasksResults = processAPI.searchAssignedAndPendingHumanTasks(humanTaskSearchOptionsBuilder.done());
List<HumanTaskInstance> allTasks = allTasksResults.getResult();
List<User> possible;
List<Long> possibleUsers;

for (HumanTaskInstance task : allTasks) {
    possibleUsers = new ArrayList<Long>();
    if (task.assigneeId == 0) {
        /* Unassigned task: collect every candidate user */
        possible = processAPI.getPossibleUsersOfPendingHumanTask(task.getId(), 0, 500);
        for (User user : possible) {
            possibleUsers.add(user.getId());
        }
    } else {
        /* Task already assigned: only consider the assignee */
        possibleUsers.add(task.assigneeId);
    }

    for (long userId : possibleUsers) {
        if (autoApprove.contains(userId)) {
            /* Build the contract inputs expected by the task, then execute it */
            HashMap<String, Serializable> inputs = new HashMap<String, Serializable>();
            HashMap<String, String> packet = new HashMap<String, String>();
            List<String> emptyList = new ArrayList<String>();
            packet.put("approvalStatus", "Approved");
            packet.put("comment", "Auto-Approved " + currentStep);
            inputs.put("myAdComp", packet);
            inputs.put("supportingDocumentation", emptyList);
            logger.warning("Attempting to execute a task for " + userId);
            logger.warning(inputs.toString());
            processAPI.assignAndExecuteUserTask(userId, task.getId(), inputs);
        }
    }
}
return true;

Submitted by Pierre-yves Monnet on Sat, 09/21/2019 - 03:18

Awesome!

That's why we have a Professional Service Team at Bonitasoft, to speed up your development!
Have a good weekend,
Regards

2 answers


Hello Mrobinson.
It is an interesting use case.
First, a question: when a student checks "Auto approve my Subsequent task", you want to approve all the multi-instantiated tasks. But after these multi-instantiated tasks, do you have any other human task that you wish to approve automatically as well? If yes, you have to save this auto-approve information in your process.

Back to the auto-approve iteration.

Doing that via a Boundary event does not solve your issue. The boundary event will fire only ONE task, and in your situation, you have MULTIPLE tasks.

The best approach is to use the "Early completion condition" on the multi-instantiated task. This condition is visible at the bottom of the panel.
With this condition, all remaining instances are stopped and the iteration is finished. Saying that, I am assuming that all the tasks of the multi-instantiation are for the student.
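
For illustration, a minimal sketch of such an early completion condition (a Groovy expression on the multi-instantiated task), assuming a hypothetical Boolean process variable autoApproveRequested set from the form; the numberOf* counters are the ones provided by the multi-instantiation context:

// Early completion condition sketch -- "autoApproveRequested" is a hypothetical
// Boolean process variable set when the student ticks the auto-approve option.
// numberOfCompletedInstances / numberOfInstances are provided by the engine.
return autoApproveRequested || numberOfCompletedInstances == numberOfInstances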

If some tasks of the multi-instantiation are for the student and some others are for a teacher, and you want to approve only the student tasks, then you can manage that with a Groovy Connector OUT on the multi-instantiated task. In this Groovy Connector OUT, select all the different student tasks of the multi-instantiation and execute them via the API.

Note: in the first process, we can see a task "Login to Rest Service" and another task "ExecuteRestApi". Is the Login related to the Execute?
If yes, you have to take different aspects into account:

  • On a cluster, "Login To Rest Service" may be executed on node 1, then "Execute Rest API" on node 2, and then the Execute can't run.
    ==> For the moment, the Bonita Engine executes all the automatic tasks on the same node, but this may change in order to share the load between multiple nodes.

  • Execution is split into multiple "work units". A work unit is executed by "workers" (threads). That means the login may be done at 10:01 in the morning by a worker; if your server is under heavy load, the second work unit (Execute Rest API) may only be executed at 10:15... and then the connection faces a timeout.

  • If an Execute Rest API call fails (the external service is down), the task moves to "failed task". The administrator can then re-execute it, but a lot of time may pass between the two steps, and the REST connection must be created again.

So, I advise creating "self-contained" calls. If your REST service needs that first login connection, I recommend:
- creating your own connector and executing the two REST calls in the same connector (see the sketch below)
- if you want to save the connection and reuse it (like a pool of REST connections), doing that in Java, via a Bonita Command for example.
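
To illustrate the first option, here is a minimal sketch of a self-contained Groovy connector that performs the login and the business call in the same script, so both always run in the same work unit. The URLs, the payloads and the access_token field name are placeholders, not your actual service:

import groovy.json.JsonSlurper

// 1. Login call: fetch the security token (placeholder URL and credentials)
def login = new URL("https://example.org/api/login").openConnection()
login.requestMethod = "POST"
login.doOutput = true
login.setRequestProperty("Content-Type", "application/json")
login.outputStream.withWriter { it << '{"user":"svc_account","password":"***"}' }
def token = new JsonSlurper().parse(login.inputStream).access_token

// 2. Business call: reuse the token immediately, in the same connector execution
def call = new URL("https://example.org/api/approvals").openConnection()
call.requestMethod = "POST"
call.doOutput = true
call.setRequestProperty("Authorization", "Bearer " + token)
call.setRequestProperty("Content-Type", "application/json")
call.outputStream.withWriter { it << '{"approvalStatus":"Approved"}' }
return call.responseCode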

Hope this helps,

Comments

Submitted by mrobinson_1362703 on Thu, 09/19/2019 - 19:50

Thank you Pierre-Yves, very insightful comments.

The early completion scenario won't work here because we could have x different tasks created, but the initial approver may only be able to work one or two of these. I got this to MOSTLY work using a separate process call for the multi-instantiated attributes. My problem NOW is that I cannot fetch the user-uploaded documents from the subprocess and move them into the parent process. I'm thinking this is a bug that I cannot work around.

When I have the following code as an Operation to set the process's documents, it outputs 0 results for both Open & Archived documents -- but if I execute this as another step after the multi-instantiated Call activity, it will return the Archived documents I want -- however, I lose the submitter, which is less than ideal. It's too bad the Call activity doesn't support sending/receiving Documents (I'm about to try setting the documents to a List pool variable in the subprocess and retrieving the data that way...).

import java.util.logging.Logger
import org.bonitasoft.engine.api.DocumentAPI
import org.bonitasoft.engine.bpm.document.ArchivedDocument
import org.bonitasoft.engine.bpm.document.ArchivedDocumentsSearchDescriptor
import org.bonitasoft.engine.bpm.document.Document
import org.bonitasoft.engine.bpm.document.DocumentValue
import org.bonitasoft.engine.bpm.document.DocumentsSearchDescriptor
import org.bonitasoft.engine.search.SearchOptionsBuilder
import org.bonitasoft.engine.search.SearchResult
Logger logger = Logger.getLogger("org.bonitasoft");
def processAPI = apiAccessor.getProcessAPI()
List newDocValues = new ArrayList();
List newDocs = new ArrayList();

logger.info("Look for OPEN Documents in Case #"+subCaseId.toString());
SearchOptionsBuilder sob = new SearchOptionsBuilder(0, 100);
sob.filter(DocumentsSearchDescriptor.PROCESSINSTANCE_ID,subCaseId);
List documents = processAPI.searchDocuments(sob.done()).getResult();
logger.info("Got "+documents.size().toString()+" documents to process!");

for (Document doc : documents) {
    newDocs.add(doc);
    logger.info(doc.toString());
    def docContent = processAPI.getDocumentContent(doc.contentStorageId)
    DocumentValue dv = new DocumentValue(docContent, doc.getContentMimeType(), doc.getContentFileName())
    newDocValues.add(dv);
}

logger.info("Look for ARCHIVED Documents in Case #"+subCaseId.toString());
sob = new SearchOptionsBuilder(0, 100);
sob.filter(ArchivedDocumentsSearchDescriptor.PROCESSINSTANCE_ID,subCaseId);
List archDocuments = processAPI.searchArchivedDocuments(sob.done()).getResult();
logger.info("Got "+archDocuments.size().toString()+" documents to process!");

for (ArchivedDocument doc : archDocuments) {
    newDocs.add(doc);
    logger.info(doc.toString());
    def docContent = processAPI.getDocumentContent(doc.contentStorageId)
    DocumentValue dv = new DocumentValue(docContent, doc.getContentMimeType(), doc.getContentFileName())
    newDocValues.add(dv);
}
return newDocs;

As for the subprocess that handles the REST API call, the "login to api" step simply fetches the security token and parses it for the "Execute" step to use -- I'm guessing this would be safe even in a clustered environment (which ours isn't).

Thanks!

Submitted by Pierre-yves Monnet on Thu, 09/19/2019 - 22:19

For the sub-process documents, the point is that a document is a process variable.
That's why, when you search for the documents using the PARENT process ID, you don't find them. A sub-process has an "internal process ID" that you never see in the Portal; that child process ID can be used to retrieve the documents attached to the child process.
Have a look at Longboard to understand the internal structure.

So, to collect the sub-process documents, you have to proceed like this:

  • from the child, you have a document. To simplify, you can create a List process variable and collect the document IDs inside it

  • from the parent, access that List, OR, via the API, access the child process and collect its list of document IDs

  • then, in the parent, read / copy the documents into the parent process. You'll duplicate the documents then (a sketch of this copy step follows below)
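
For illustration, a minimal sketch of that copy step, run as an operation or a Groovy Connector OUT in the parent process. It assumes the child case stored its document IDs in a list that the parent has already retrieved into a hypothetical variable childDocumentIds:

import org.bonitasoft.engine.api.ProcessAPI
import org.bonitasoft.engine.bpm.document.Document

ProcessAPI processAPI = apiAccessor.getProcessAPI()

// "childDocumentIds" (List<Long>) is a hypothetical variable filled from the child case
for (long docId : childDocumentIds) {
    Document doc = processAPI.getDocument(docId)
    byte[] content = processAPI.getDocumentContent(doc.getContentStorageId())
    // attachDocument duplicates the content under the parent case
    processAPI.attachDocument(processInstanceId, doc.getName(),
            doc.getContentFileName(), doc.getContentMimeType(), content)
}
return true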

Alternatively, you can decide to keep the documents in the child process. Then you need to develop a REST API Extension to get access to the child sub-process documents (by default, you can only access the documents of the same process).


Maybe you just need to create a loop over each task and verify the 'auto approval': if it's OK, the task will be approved automatically; otherwise, the task needs to be approved by the user. And then, go on to the next task.

Comments

Submitted by mrobinson_1362703 on Thu, 09/19/2019 - 19:57

Hi Bastien,

I think if I do it that way then it will not allow the tasks to be executed in parallel unless I draw out each task individually. Please correct me if I'm wrong.

Thank you!
