Keeping track of Emails with EWS and K2

I haven’t posted anything in a while as I have been working on another project, which I am hoping to unveil very soon. In the meantime, though, I wanted to talk about K2 and Exchange. We all know that K2 can talk to Exchange out of the box: it can send emails and receive replies back in the form of SmartActions.

But if we want to keep track of the emails sent from a K2 app, things get a bit tricky. We could save the message in a database using a SmartObject event and then use the email event to send the email. That is an OK approach, but I think it can be done better, without this two-step/two-event approach.

So let’s have a think about what I want the assembly to do:

  1. Send an email
  2. View the mailbox
  3. View an email

We could modify the existing email event to do what I am suggesting below, but that would be a pain: we would need to do it every time we use the email event, and it would also require the person building the workflow to be able to write code. The approach I am going to go through allows anyone to build a workflow that tracks the emails being sent without having to write code and, more importantly, every app will be able to see the emails it has sent out.

We are going to create an Email Endpoint Assembly that will allow a workflow to send an email tagged with a primary key, SN, process instance Id or application type (see framework), and to view its mailbox by the same type of information.

Getting Started

We will need the following

  1. Visual Studio 2015+
  2. The Microsoft Exchange Web Services (EWS) URL
  3. The Exchange version
  4. A user account set up specifically to be used for the K2 mailbox (I normally create a service account that just has a mailbox)
  5. The user account’s email address
  6. Microsoft.Exchange.WebServices.dll

To do this we need the Microsoft.Exchange.WebServices.dll assembly, which you can get from here.

Once we have the above we can start building the new email endpoint assembly.

EWS Code

To set up the connection to the Exchange server, it is important to identify which version of Exchange we are talking to.

ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2010_SP2);

Once we have created an instance of the Exchange service, we give the instance the Exchange Web Services URL, along with credentials and the mailbox to impersonate.

service.Url = new Uri("Web service URL");

ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2010_SP2);
service.Credentials = new WebCredentials("Username", "Password");

service.Url = new Uri("Web service");
service.ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SmtpAddress, "Email Address");
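
The later code samples call a ConnectToExchange() helper that the post doesn’t show; it is assumed to simply wrap the connection code above in a reusable method, something like this minimal sketch:

public static ExchangeService ConnectToExchange()
{
    // In practice these values would come from configuration rather than being hard coded
    ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2010_SP2);
    service.Credentials = new WebCredentials("Username", "Password");

    // EWS endpoint, e.g. https://mail.yourdomain.com/EWS/Exchange.asmx
    service.Url = new Uri("Web service URL");

    // Send and read as the dedicated K2 mailbox
    service.ImpersonatedUserId = new ImpersonatedUserId(ConnectingIdType.SmtpAddress, "Email Address");

    return service;
}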

We now have a connection to the Exchange server via its web service and we can do a number of different things, such as:

  1. Send an email
  2. View mailbox contents, such as the Inbox or Sent Items
  3. View an email
  4. Other things, such as creating meeting requests

We will look at the basic code for sending an email

Sending an Email

To send an email we need to create an instance of the EmailMessage object and pass in the Exchange Service Object.

EmailMessage email = new EmailMessage(service);

Once we have done that we can access the properties and methods of EmailMessage object.

So we can give our email a subject: email.Subject = "Subject";

We can also give the email a body and decide whether we want to send a plain text or an HTML message.

email.Body = new MessageBody(BodyType.HTML, Body);

EmailMessage email = new EmailMessage(service);
email.Subject = "Subject";
email.Body = new MessageBody(BodyType.HTML, Body);

To add recipients (To, Cc, Bcc) we just need to add the following code:

  • email.ToRecipients.Add("address");
  • email.CcRecipients.Add("address");
  • email.BccRecipients.Add("address");

If you have more than one email address for ‘To’, ‘Cc’ or ‘Bcc’, we can simply split the parameter on the delimiter and loop through the addresses, as in the example below.

if (To.Contains(";"))
{
    string[] to = To.Split(';');
    foreach (var address in to)
    {
        email.ToRecipients.Add(address);
    }
}
else
{
    email.ToRecipients.Add(To);
}

To send the email and keep a copy in the Sent Items folder, we use the SendAndSaveCopy() method (Send() would send without saving a copy):

 email.SendAndSaveCopy();

Now we can send a basic email. So let us have a look at how we can extend this email so it contains some additional properties that relate to the workflow it is being sent from.

The EmailMessage object allows us to add extended properties, and they are really simple to create. The only thing you need to remember is that the GUID used to identify the property must be the same every time an email is sent, and the same again when we retrieve the mailbox.

So in this example I am going to bind the process instance Id to the email message. We will then be able to search the Sent Items folder and retrieve all the messages that relate to that process instance Id.

Creating extended properties

This is the most important part: extended properties are what allow us to group emails by the process instance Id, business key and so on.

Create a Guid called ‘ProcessInstanceId_PropertySetId’ and assign it a GUID value:

Guid ProcessInstanceId_PropertySetId = Guid.Parse("fc0a27be-f463-472e-bea8-648e62d1d7dc");

We then define the extended property by giving it a name, in this case ‘ProcessInstanceId’, and declaring the data type of the property as a string.

 ExtendedPropertyDefinition ProcessInstanceId_ExtendedPropertyDefinition = new ExtendedPropertyDefinition(ProcessInstanceId_PropertySetId, "ProcessInstanceId", MapiPropertyType.String);

 

Now that we have defined the property, we can populate the email with the process instance Id. In the code example below I check whether ‘ProcessInstanceId’ has a value and is greater than 0; if it does, the property is assigned the value of ‘ProcessInstanceId’, otherwise it is assigned 0.

email.SetExtendedProperty(ProcessInstanceId_ExtendedPropertyDefinition, (ProcessInstanceId != null && ProcessInstanceId > 0 ? ProcessInstanceId.ToString() : "0"));

 

Now every time we send an email, it will contain the process instance Id. In the complete code example of the ‘SendEmail’ method below, I have also added some additional properties to contain the following:

  1. Primary key of the main business data
  2. ProcessTypeId (framework, see here)
  3. Folio of the process instance
  4. MessageId, so we can identify each email
public static string SendEmail(string Subject,string Body, string To, string Cc,string Bcc,int importance, string sn,string Folio, int? ProcessInstanceId, string ProcessTypeId, string BusinessKey)
 {
 string result = string.Empty;
 ExchangeService service = ConnectToExchange();
 try
 {
 if (!string.IsNullOrEmpty(To))
 {
 EmailMessage email = new EmailMessage(service);
 email.Subject = Subject;
 email.Body = new MessageBody(BodyType.HTML, Body);

Guid SN_PropertySetId = Guid.Parse("fc0a27be-f463-472e-bea8-648e62d1d7dc");
 ExtendedPropertyDefinition SN_ExtendedPropertyDefinition = new ExtendedPropertyDefinition(SN_PropertySetId, "SN", MapiPropertyType.String);
 email.SetExtendedProperty(SN_ExtendedPropertyDefinition, (!String.IsNullOrEmpty(sn) ? sn : "0_0"));

Guid Folio_PropertySetId = Guid.Parse("fc0a27be-f463-472e-bea8-648e62d1d7dc");
ExtendedPropertyDefinition Folio_ExtendedPropertyDefinition = new ExtendedPropertyDefinition(Folio_PropertySetId, "Folio", MapiPropertyType.String);
 email.SetExtendedProperty(Folio_ExtendedPropertyDefinition, (!String.IsNullOrEmpty(Folio) ? Folio : "Email Message"));

Guid ProcessInstanceId_PropertySetId = Guid.Parse("fc0a27be-f463-472e-bea8-648e62d1d7dc");
 ExtendedPropertyDefinition ProcessInstanceId_ExtendedPropertyDefinition = new ExtendedPropertyDefinition(ProcessInstanceId_PropertySetId, "ProcessInstanceId", MapiPropertyType.String);
 email.SetExtendedProperty(ProcessInstanceId_ExtendedPropertyDefinition, (ProcessInstanceId != null && ProcessInstanceId > 0 ? ProcessInstanceId.ToString() : "0"));

Guid BusinessKey_PropertySetId = Guid.Parse("fc0a27be-f463-472e-bea8-648e62d1d7dc");
 ExtendedPropertyDefinition BusinessKey_ExtendedPropertyDefinition = new ExtendedPropertyDefinition(BusinessKey_PropertySetId, "BusinessKey", MapiPropertyType.String);
 email.SetExtendedProperty(BusinessKey_ExtendedPropertyDefinition, (!String.IsNullOrEmpty(BusinessKey) ? BusinessKey : "0"));

Guid ProcessTypeId_PropertySetId = Guid.Parse("d6520129-3c59-4191-b9d7-4f5160329e4f");
 ExtendedPropertyDefinition ProcessTypeId_ExtendedPropertyDefinition = new ExtendedPropertyDefinition(ProcessTypeId_PropertySetId, "ProcessTypeId", MapiPropertyType.String);
 email.SetExtendedProperty(ProcessTypeId_ExtendedPropertyDefinition, (!String.IsNullOrEmpty(ProcessTypeId) ? ProcessTypeId : "00000000-0000-0000-0000-000000000000"));

Guid MessageId_PropertySetId = Guid.Parse("6e997d14-d9b3-4516-8d14-0a10b0aa74aa");
 string MessageId = Guid.NewGuid().ToString();
 ExtendedPropertyDefinition MessageId_ExtendedPropertyDefinition = new ExtendedPropertyDefinition(MessageId_PropertySetId, "MessageId", MapiPropertyType.String);
 email.SetExtendedProperty(MessageId_ExtendedPropertyDefinition, MessageId);



if (To.Contains(";"))
 {
 String[] to = To.Split(';');
 foreach (var address in to)
 {
 email.ToRecipients.Add(address);
 }
 }
 else
 {
 email.ToRecipients.Add(To);
 }



if (!string.IsNullOrEmpty(Cc))
 {
 if (Cc.Contains(";"))
 {
 String[] to = Cc.Split(';');
 foreach( var address in to)
 {
 email.CcRecipients.Add(address);
 }
 }
 else
 {
 email.CcRecipients.Add(Cc);

}
 }

if (!string.IsNullOrEmpty(Bcc))
 {
 if (Bcc.Contains(";"))
 {
 String[] to = Bcc.Split(';');
 foreach (var address in to)
 {
 email.BccRecipients.Add(address);
 }
 }
 else
 {
 email.BccRecipients.Add(Bcc);

}
 }

if (importance > 0)
 {
 email.Importance = (importance == 1 ? Microsoft.Exchange.WebServices.Data.Importance.Normal : Importance.High);
 }

email.SendAndSaveCopy();

result = email.Id.ToString();
 }
 }
 catch(Exception ex)
 {
 result = "Error: " + ex.Message.ToString(); 
 }
 finally
 {

}
 return result;
 }

Retrieving an Exchange Mailbox

Now that we can send emails with K2-related data, we need to be able to retrieve those emails so we can view them in a SmartForm.

The first thing we need to do is recreate the same extended property definitions (using the same GUIDs we used when sending) so that Exchange returns them with each item.
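
The methods below also return an EmailBox object, a class the post doesn’t include. A minimal sketch of what it might look like, with the property names taken from the code that follows:

using System;

public class EmailBox
{
    public string MailBoxType { get; set; }
    public string Subject { get; set; }
    public string Body { get; set; }
    public string Importance { get; set; }
    public string Id { get; set; }
    public string Categories { get; set; }
    public DateTime DateTimeCreated { get; set; }
    public DateTime DateTimeReceived { get; set; }
    public DateTime DateTimeSent { get; set; }
    public string Cc { get; set; }
    public string To { get; set; }
    public string SN { get; set; }
    public string Folio { get; set; }
    public string ProcessInstanceId { get; set; }
    public string BusinessKey { get; set; }
    public string ProcessTypeId { get; set; }
    public string MessageId { get; set; }
}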

public static List<EmailBox> GetMailBox(string MailBoxType,int PageSize)
 {
 ItemView view = new ItemView(PageSize);
 List<EmailBox> list = new List<EmailBox>();

Guid SN_PropertySetId = Guid.Parse("fc0a27be-f463-472e-bea8-648e62d1d7dc");
 ExtendedPropertyDefinition SN_ExtendedPropertyDefinition = new ExtendedPropertyDefinition(SN_PropertySetId, "SN", MapiPropertyType.String);

Guid Folio_PropertySetId = Guid.Parse("fc0a27be-f463-472e-bea8-648e62d1d7dc");
 ExtendedPropertyDefinition Folio_ExtendedPropertyDefinition = new ExtendedPropertyDefinition(Folio_PropertySetId, "Folio", MapiPropertyType.String);

Guid ProcessInstanceId_PropertySetId = Guid.Parse("fc0a27be-f463-472e-bea8-648e62d1d7dc");
 ExtendedPropertyDefinition ProcessInstanceId_ExtendedPropertyDefinition = new ExtendedPropertyDefinition(ProcessInstanceId_PropertySetId, "ProcessInstanceId", MapiPropertyType.String);

Guid BusinessKey_PropertySetId = Guid.Parse("fc0a27be-f463-472e-bea8-648e62d1d7dc");
 ExtendedPropertyDefinition BusinessKey_ExtendedPropertyDefinition = new ExtendedPropertyDefinition(BusinessKey_PropertySetId, "BusinessKey", MapiPropertyType.String);

Guid ProcessTypeId_PropertySetId = Guid.Parse("d6520129-3c59-4191-b9d7-4f5160329e4f");
 ExtendedPropertyDefinition ProcessTypeId_ExtendedPropertyDefinition = new ExtendedPropertyDefinition(ProcessTypeId_PropertySetId, "ProcessTypeId", MapiPropertyType.String);

Guid MessageId_PropertySetId = Guid.Parse("6e997d14-d9b3-4516-8d14-0a10b0aa74aa");
 ExtendedPropertyDefinition MessageId_ExtendedPropertyDefinition = new ExtendedPropertyDefinition(MessageId_PropertySetId, "MessageId", MapiPropertyType.String);

ExchangeService service = ConnectToExchange();
 view.PropertySet = new PropertySet(BasePropertySet.IdOnly, ItemSchema.Subject, SN_ExtendedPropertyDefinition, Folio_ExtendedPropertyDefinition, ProcessInstanceId_ExtendedPropertyDefinition, BusinessKey_ExtendedPropertyDefinition, ProcessTypeId_ExtendedPropertyDefinition, MessageId_ExtendedPropertyDefinition);

FindItemsResults<Item> findResults = service.FindItems((MailBoxType == "Sent" ? WellKnownFolderName.SentItems : WellKnownFolderName.Inbox), view);
 foreach(Item email in findResults.Items)
 {
 Item mail = Item.Bind(service, email.Id);
 list.Add(new EmailBox
 {
 MailBoxType = MailBoxType,
 Subject = mail.Subject,
 Body = mail.Body,
 Importance = mail.Importance.ToString(),
 Id = mail.Id.ToString(),
 Categories = mail.Categories.ToString(),
 DateTimeCreated = mail.DateTimeCreated,
 DateTimeReceived = mail.DateTimeReceived,
 DateTimeSent = mail.DateTimeSent,
 Cc = mail.DisplayCc,
 To = mail.DisplayTo,
 SN = (email.ExtendedProperties.Count > 0 ? email.ExtendedProperties[0].Value.ToString() : string.Empty),
 Folio = (email.ExtendedProperties.Count > 1 ? email.ExtendedProperties[1].Value.ToString() : string.Empty),
 ProcessInstanceId = (email.ExtendedProperties.Count > 2 ? email.ExtendedProperties[2].Value.ToString() : string.Empty),
 BusinessKey = (email.ExtendedProperties.Count > 3 ? email.ExtendedProperties[3].Value.ToString() : string.Empty),
 ProcessTypeId = (email.ExtendedProperties.Count > 4 ? email.ExtendedProperties[4].Value.ToString() : string.Empty),
 MessageId = (email.ExtendedProperties.Count > 5 ? email.ExtendedProperties[5].Value.ToString() : string.Empty)

});

}
 return list;

}
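
GetMailBox pulls back everything in the chosen folder. To retrieve only the emails for a single process instance, as described earlier, we can pass FindItems a search filter on the extended property. A sketch, reusing the same property set GUID and definition (the method name is just for illustration):

public static FindItemsResults<Item> FindByProcessInstanceId(string processInstanceId, int pageSize)
{
    ExchangeService service = ConnectToExchange();

    // Must be the same GUID and name used when the email was sent
    Guid ProcessInstanceId_PropertySetId = Guid.Parse("fc0a27be-f463-472e-bea8-648e62d1d7dc");
    ExtendedPropertyDefinition ProcessInstanceId_ExtendedPropertyDefinition = new ExtendedPropertyDefinition(ProcessInstanceId_PropertySetId, "ProcessInstanceId", MapiPropertyType.String);

    // Ask Exchange to return the extended property with each item
    ItemView view = new ItemView(pageSize);
    view.PropertySet = new PropertySet(BasePropertySet.IdOnly, ItemSchema.Subject, ProcessInstanceId_ExtendedPropertyDefinition);

    // Only items whose ProcessInstanceId extended property matches are returned
    SearchFilter filter = new SearchFilter.IsEqualTo(ProcessInstanceId_ExtendedPropertyDefinition, processInstanceId);
    return service.FindItems(WellKnownFolderName.SentItems, filter, view);
}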


Retrieve an Email

Now that we can retrieve a list of emails from a mailbox, we also need to be able to retrieve a single email.

We can do this as follows.

public static EmailBox GetEmail(string Id)
 {
 EmailBox email = new EmailBox();
 ExchangeService service = ConnectToExchange();

try
 {
 Item mail = Item.Bind(service, (ItemId)Id);
 {
 email.Subject = mail.Subject;
 email.Body = mail.Body;
 email.Importance = mail.Importance.ToString();
 email.Id = mail.Id.ToString();
 email.Categories = mail.Categories.ToString() ;
 email.DateTimeCreated = mail.DateTimeCreated;
 email.DateTimeReceived = mail.DateTimeReceived;
 email.DateTimeSent = mail.DateTimeSent;
 email.Cc = mail.DisplayCc;
 email.To = mail.DisplayTo;
 email.SN = (mail.ExtendedProperties.Count > 0 ? mail.ExtendedProperties[0].Value.ToString(): string.Empty);
 email.Folio = (mail.ExtendedProperties.Count > 0 ? mail.ExtendedProperties[1].Value.ToString(): string.Empty);
 email.ProcessInstanceId = (mail.ExtendedProperties.Count > 0 ? mail.ExtendedProperties[2].Value.ToString(): string.Empty);
 email.BusinessKey = (mail.ExtendedProperties.Count > 0 ? mail.ExtendedProperties[3].Value.ToString(): string.Empty);
 email.ProcessTypeId = (mail.ExtendedProperties.Count > 0 ? mail.ExtendedProperties[4].Value.ToString(): string.Empty);
 email.MessageId = (mail.ExtendedProperties.Count > 0 ? mail.ExtendedProperties[5].Value.ToString(): string.Empty);
}
}
 catch (Exception)
 {
     // In production, log the error here rather than swallowing it
 }
 finally
 {

}
 return email;
 }
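
One thing to be aware of: Item.Bind only returns extended properties that are explicitly requested in a PropertySet, and reading them by position in the ExtendedProperties collection is fragile. A more defensive sketch for inside GetEmail, assuming the same ‘SN’ definition shown earlier:

// Request the extended property explicitly and read it by definition, not by index
Guid SN_PropertySetId = Guid.Parse("fc0a27be-f463-472e-bea8-648e62d1d7dc");
ExtendedPropertyDefinition SN_ExtendedPropertyDefinition = new ExtendedPropertyDefinition(SN_PropertySetId, "SN", MapiPropertyType.String);

PropertySet properties = new PropertySet(BasePropertySet.FirstClassProperties, SN_ExtendedPropertyDefinition);
Item mail = Item.Bind(service, new ItemId(Id), properties);

string sn;
if (!mail.TryGetProperty(SN_ExtendedPropertyDefinition, out sn))
{
    sn = string.Empty;
}
email.SN = sn;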

 

Now that we have these methods to send an email, retrieve a mailbox and retrieve a single email, we can register the library as an endpoint assembly.

We could extend this to add attachments, and we could also look at calendar meeting requests, doing the same with those and extending their properties.

We can then build a SmartObject around it and use it within our workflows and SmartForms. To make it even easier for people to use the new email SmartObject, we could wrap a SmartWizard around the methods.

The full solution can be downloaded from here 

 

Testing a workflow inside a Smartform

So I have posted a few articles on testing in K2: we have looked at unit testing, testing SmartForms and, just recently, more examples of testing a workflow. After writing that article I realised that there was something missing: what about building a workflow testing tool inside a SmartForm? It is all well and good that we can write unit tests for a workflow, but that relies on the workflow developer being able to write code, and the whole idea of K2 is to build low-code, no-code solutions. So there should be a way of doing it without the developer having to write code.

There are a number of tools out on the market and they do an absolutely fantastic job of testing workflows without having to write code. But I wanted to see if it could be done inside a SmartForm.

So my challenge was to build a SmartForm app that would test a workflow, without the tester having to write any code.

The app had to do the following

  1. Start any workflow and check that it has started
  2. Check that tasks have been created, and the correct number of them
  3. Action a task
  4. Check that the task has been completed
  5. Check that the workflow has finished
  6. Check that the workflow has gone down the correct path

The app should all be in Smartforms and should be easy to use.
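
Behind the scenes, test types like ‘Start Process’ and actioning a task can be driven by the standard K2 workflow client API (SourceCode.Workflow.Client). This is only a rough sketch of the kind of calls involved, not the tool’s actual implementation; the server name, workflow name, folio and action are all illustrative:

using SourceCode.Workflow.Client;

public static void StartAndApprove(string server, string workflowFullName, string folio, string action)
{
    Connection connection = new Connection();
    connection.Open(server);
    try
    {
        // Start the workflow under test and give it a folio we can find again
        ProcessInstance instance = connection.CreateProcessInstance(workflowFullName);
        instance.Folio = folio;
        connection.StartProcessInstance(instance);

        // Find the task it generated for the current user and action it
        Worklist worklist = connection.OpenWorklist();
        foreach (WorklistItem task in worklist)
        {
            if (task.ProcessInstance.Folio == folio)
            {
                task.Actions[action].Execute();
            }
        }
    }
    finally
    {
        connection.Close();
    }
}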

So let’s see how I got on. I am going to start with the finished tool and then take us through it.

Testing Dashboard

The dashboard shows all the current tests that have been performed and the overall percentage of passes and failures.

dashboard

From here you can also click on the ‘Create Test’ button to create a new test. Selecting an existing test and clicking on ‘Run Test’ will run it. Finally, double clicking on a test will open up the details about that test.

Test Details

So the test details shows the following information

test details

  1. Workflow being tested
  2. Whether it has already been tested
  3. When it was tested
  4. Overall Pass or Fail
  5. What percentage passed and failed
  6. What was tested and whether each test passed or failed.
  7. It also shows any other tests relating to the workflow being tested.

From the test details you also have the ability to retest the test as well.

Creating a new test

From the dashboard click on ‘Create Test’ button.

test builder.PNG

  1. In the new test form, give the test a name
  2. Select the workflow that you are going to test. Once selected, you will be able to see all the activities of that workflow beneath. test-builder1
  3. Select the activities that the workflow will go through. test-builder2
  4. The last section is where we build up the actual tests and what results we expect from them. test-builder3

Building a simple test script

The workflow we have selected is a very simple one. It has the following activities

  1. Setup
  2. Task
  3. Approve
  4. Reject
  5. End

test-workflow

‘Task’ is the activity with a client event; it has one slot and goes to the originator.

Now that we have the basic test details and the route we expect the test to take, we can build our test script.

  1. Select ‘Start Process’ from the test type drop down list
  2. You will notice that the other text boxes in the row will be pre-populated with information, and some will be disabled, depending on what information the particular test type requires.
  3. Now we choose the comparison sign we are going to use. It can be any of the following:
    1.  =
    2. >
    3. >=
    4. <
    5. <=
  4. We are going to choose >
  5. In the expected result / comparison box enter 0; if the workflow starts successfully we expect a value greater than 0.
  6. ‘Start Process’ needs to know the ‘Folio’ to give the workflow. You can either enter something, or leave it blank and it will take the name we gave the test earlier.
  7. The next parameter we need to give the test type is the process name. You can either copy the full workflow name and paste it into this box, or leave it and it will automatically get the workflow name.
  8. The next box is called milliseconds. You can either leave it at 0 or enter a value. This is useful because your workflow may take a while to perform an event, so you can add a delay to the test to compensate for this.
  9. Click on ‘Add’ again; we now want to check the status of the workflow, to make sure that once we have started it, it is active.
  10. So select ‘Get Workflow Status’ from the drop down list
  11. Choose ‘=’ from Sign
  12. Enter ‘Active’ in expected result
  13. We don’t need to enter in anything else
  14. Click on ‘Add’
  15. We now want to check to see if it has generated some tasks
  16. Select ‘Get Task Count’
  17. Select ‘=’ in Sign
  18. Expected Result should be 1, as we know there is only one slot and it goes to the originator.
  19. ProcessInstanceId: leave that as it is; the testing framework will replace it with the actual process instance Id at run time of the test
  20. Process name: either enter the full name or leave it and it will automatically be populated
  21. Next we can action the task
  22. Select ‘=’ for the sign
  23. In the expected result box enter ‘True’
  24. Where it says ‘Action’ we need to enter the action we expect the test to use, so let’s enter ‘Approve’
  25. Where it says ‘Serial Number’, leave it; like the process instance Id, it will be populated by the testing framework at run time.
  26. For Destination User, enter the destination user that the task is supposed to be assigned to.
  27. When this test type runs it will try to action the task based on the information we have given it (task action and destination user); if the test is successful, the actual result will be true.
  28. The last test we are going to do is to check that the workflow has gone down the correct path and the correct activities were used. The test will compare the path we expect, which was defined in the ‘Workflow Route’ section, to the actual path taken.
  29. Select ‘Compare Activities’ from ‘Test Type’
  30. Select ‘=’ from Sign
  31. Enter in ‘True’ in expected result
  32. Leave ProcessInstanceId and Activities how it is
  33. In the Milliseconds box enter ‘4000’; this will make the ‘Compare Activities’ test wait 4000 milliseconds before starting. This gives the workflow time to perform its other activities before completing. If this test fails, increase the time and it will then pass.
  34. The test script should now look like this: test-builder4
  35. Now click on ‘Save’
  36. You will now see a read-only version of the test
  37. with two buttons, ‘Edit Test’ and ‘Run Test’

Running a Test

To run a test, either select a test from the test list and click on ‘Run Test’, or click the ‘Run Test’ button from the test details.

Once the test has completed, you will see the results for each individual test and also for the overall test.

results

Each individual test will show whether it has passed or failed. If it has passed, the status will be true and shown in green; if it has failed, it will be false and shown in red.

For each test that we create we also get the overall test result.

A failed test is one where any of the individual tests has failed.

fail

It will show a pie chart of the percentage of passes and failures, with a big black cross above it.

pass

For an overall pass we get a tick and, of course, 100%.

These test results will remain until you click on ‘Run Test’ again, which will overwrite the previous test results with the new ones.


Using The New Rest Broker

Hooray, it’s finally here, after months and months of beta testing. In 4.7 we finally get to use REST services without having to write a load of code to wrap around the REST service and then access it via the Endpoint Assembly broker.

It’s really easy to set up and start using REST services.

The K2 Endpoint REST broker uses Swagger to define the Service Objects and methods. Swagger is becoming the open standard for describing REST APIs.

Getting access to a Swagger Editor

To get a Swagger definition of a REST service you need access to a Swagger editor that can see the REST service:

  1. http://editor.swagger.io/#/ – the online Swagger editor, if the REST service is accessible to the outside world
  2. https://github.com/swagger-api/swagger.io/blob/wordpress/tools/swagger-editor.md – you can download a copy of the editor to install on a server where the REST service is accessible

swagger

Creating a Swagger definition file

To create a Swagger definition file using the Swagger editor (the steps are the same for both the online and local versions), follow these steps:

  • Click on ‘File’
  • Click on ‘Import URL’

Swagger1.PNG

  • Enter in the URL
  • If it can’t find the URL, uncheck ‘Use CORS proxy’
  • Once it has found the REST service, click on ‘Import’
  • If there are no errors, click on ‘File’
  • Click on ‘Download JSON’

swagger2

  • Take the .JSON file and copy it to the K2 server
  • Create a folder on the C:\ drive called ‘Swagger’
  • Copy file into the folder
  • Open up the SmartObject Tester Tool
  • Expand ‘ServiceObject Explorer’
  • Right click on ‘Endpoints Rest’ and click on ‘Register Service Instance’
  • In the ‘Descriptor Location’ section, enter the full path of the Swagger file we copied into the Swagger folder

rest

  • Click on ‘Next’ and then click on ‘Add’
  • K2 will now go through the Swagger file and create service objects based on the definitions in the Swagger file
  • You can now create SmartObjects as normal
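
For reference, the file the broker consumes is a standard Swagger 2.0 JSON document. A minimal, made-up example (the host, path and definition are purely illustrative) looks like this:

{
  "swagger": "2.0",
  "info": { "title": "Customer API", "version": "1.0" },
  "host": "api.example.local",
  "basePath": "/v1",
  "schemes": [ "https" ],
  "paths": {
    "/customers/{id}": {
      "get": {
        "operationId": "GetCustomer",
        "parameters": [
          { "name": "id", "in": "path", "required": true, "type": "integer" }
        ],
        "responses": {
          "200": { "description": "OK", "schema": { "$ref": "#/definitions/Customer" } }
        }
      }
    }
  },
  "definitions": {
    "Customer": {
      "type": "object",
      "properties": {
        "id": { "type": "integer" },
        "name": { "type": "string" }
      }
    }
  }
}

The broker builds Service Objects from the operations it finds, so descriptive operationIds generally make the generated objects easier to work with.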

 

Using K2 Framework (K2F) with Swagger and the REST Broker

K2F provides support for managing the Swagger definitions used by the REST broker. It does this in two ways:

  1. There will be a Web API handler that accepts POSTs; the idea is that when the global registry changes, it will post the new host address and application name to the service. The service then finds the correct Swagger definition in the library and updates it with the new host address. It then exports the Swagger definition file to the Swagger folder on the server and updates K2 with the new definition. (ToDo – a rough sketch of what such a handler could look like follows this list.)
  2. In the portal you can view all the Swagger definitions; new ones can be added and old ones can be edited. Once created in the library, this will create the Swagger JSON file on the K2 server.
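
Since that handler is still marked ToDo, here is only a hypothetical sketch of the shape it could take, assuming ASP.NET Web API and Json.NET; the class names, route and folder path are illustrative, and refreshing the K2 service instance is left out:

using System.IO;
using System.Web.Http;
using Newtonsoft.Json.Linq;

public class SwaggerUpdate
{
    public string ApplicationName { get; set; }
    public string HostAddress { get; set; }
}

public class SwaggerDefinitionController : ApiController
{
    [HttpPost]
    public IHttpActionResult Post(SwaggerUpdate update)
    {
        // Find the exported definition for this application in the swagger folder
        string path = Path.Combine(@"C:\Swagger", update.ApplicationName + ".json");
        if (!File.Exists(path))
        {
            return NotFound();
        }

        // Update the host address and re-export the definition
        JObject definition = JObject.Parse(File.ReadAllText(path));
        definition["host"] = update.HostAddress;
        File.WriteAllText(path, definition.ToString());

        // Updating K2 with the new definition would follow here (not shown)
        return Ok();
    }
}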

swaggerdefintionlib

Add  a new Swagger file

demoswagger

  1. Click on ‘Add Swagger Definition’
  2. A form loads up where you can enter the definition name, host address and the definition itself
  3. Click on ‘Submit’
  4. This will save the Swagger definition in the library and also create its corresponding JSON file on the server.
  5. You will still need to create the instance manually using the Service Object Tester tool

savedswagger

Unchaining Workflows with Framework Part 1

Introduction

Building workflows in K2 is great: we can drag and drop events onto the canvas and click through wizards that tell us what to do. We can set up line rules, tasks and so on with just a mouse, which is fantastic as it’s quick and easy.

We can also deploy a workflow by following a simple wizard.

But when we want to change something in an email, or change the form that a task is pointing to, it involves opening up the solution, making those changes, redeploying, and then only new instances of that workflow pick up those changes.

So what do we do? Well, we do what I tell all my clients: we make everything in the workflow that we can dynamic. We store the URL of a form in a data store somewhere, and the same for email content or business rules like escalation values. We basically unchain the workflow from the data. We free it so that we can make these types of changes from outside the workflow, without having to go back into the workflow, make the change and redeploy. We just go to a simple form, make the change, and any workflow which points to this value automatically receives the change. Sounds brilliant, doesn’t it?

So over the years my colleagues and I have been slowly putting a framework/pattern together.

It doesn’t have a fancy name, so throughout this article I am just going to call it ‘framework’.

Framework

The framework is designed to make workflows adaptable to change without having to redeploy the workflow each time. It’s also designed to make building a workflow quicker and to give everyone involved, from developers to administrators and users, a complete overview of what is happening.

Some of the features of framework are

  1. Headless forms: with framework, we can make task forms dynamic, so at run time we can choose which form the task will get.
  2. Externalized process data:
    1. External data fields are created outside of the workflow, allowing workflows to be more efficient and allowing the data to be deleted after the instance has finished instead of persisting.
    2. Process data: key data for the process type is externalized, allowing the data to be changed without redeployment of the workflow.
  3. Audit trail: every workflow has an audit trail allowing users to keep track of important changes within the flow of an instance of a workflow.
  4. Workflow registration: every workflow is registered, allowing users, developers and administrators to keep track of which workflows are running, whether they are related to another workflow and how complete an instance of a workflow type is.

Components of framework

  1. Process Management
    1. Externalized Data fields
    2. Configuration data
    3. Time slots
  2. Swagger Definition Library
  3. Script Library
  4. Form Library

 

To use the framework within your workflow, we need to set up a few things inside the workflow and also outside of it.

 

Setting up a workflow to use the framework

Create Key Data Fields

First thing we need to do is create the three main data fields that every workflow using the framework should have.

Data field name | Type
DataFieldId | String
K2FID | String
ProcessTypeId | String

datafields

  1. DataFieldId will hold the external data field key
  2. K2FID will hold the key value for the registered process instance
  3. ProcessTypeId holds the key for the type of process

There will probably be a fourth data field, not mentioned above, which will be the primary key for the main business data.

Create References

To make life easier when it comes to designing and developing workflows, the framework recommends we use ‘References’ to create shortcuts to a SmartObject method and the data properties it returns.

To create a reference, go to the data tab (the third tab), right click on ‘References’ and follow the instructions.

The references that we need to create are as follows

Reference Name | SmartObject | Method | Parameter
Configuration Data | K2F.SMO.ProCon | GetProcessConfig | ProcessTypeId (data field)
External Datafields | K2F.SMO.Datafields | Get Datafields | DataFieldId (data field)

Configuration Data contains values for the process type.

External Data fields will contain data for the instance of the process

references

Like the data fields, you may also have a third reference for your main business data.

 

Setup Activity

Every workflow that uses the framework must have an activity called ‘Setup’. This activity has a number of different things to do:

  1. It needs to register the instance of itself, to help keep track of the instance
  2. Create the data field key, so we can use external data fields
  3. Start auditing the key stages of the workflow

1. Within your workflow, create an activity called ‘Setup’.

setupact

Getting the Process Type Id

In the ‘Setup’ Activity, drag a ‘Data Event’ into the activity, give it the name ‘Get Process TypeId’

Reference Name | SmartObject | Method | Parameter | Return Property
Source | K2F.SMO.ProcType | GetTypeByProcessName | Process Name | ProcessTypeId
Destination | DataField: ProcessTypeId

Now that we have the process type Id, the workflow can get all the related framework information for this workflow type.

Creating a Data Field Key

In the ‘Setup’ Activity drag the ‘Data Event’ into the activity and give it the name ‘Create DataField Key’

Reference Name | SmartObject | Method | Return Property
Source | K2F.SMO.DataFields | Create Key | DataFieldKey
Destination | DataField: DataFieldId

DataFieldId is what we will use to group the process instance data fields together.

Creating a Folio

In the ‘Setup’ Activity drag the ‘Data Event’ into the activity and give it the name ‘Set Folio’

Reference Name | Value
Source | Free text
Destination | [Context Browser].[Workflow Context Browser].[Process Instance].Folio

Registering the instance of a workflow

Registering the instance against the workflow type allows the framework to keep track of how many instances of a particular type are running and to make sure the correct details and other key values are being used.

Create a SmartObject event inside the ‘Setup’ activity and name it ‘Record Workflow’

SmartObject: K2C.SMO.ProcInst
Method: Register Process Instance
Return: ID, assigned to the K2FID data field

Parameters (Name | Type | Value | Notes):
pParentProcessId | Number | 0 or the process instance Id of a parent process | Default is 0 unless this workflow is being called via IPC or a case action, in which case the parent process instance Id is required
pProcessInstanceId | Number | ProcessId | [Context Browser].[Workflow Context Browser].[Process Instance]
pStatus | Text | Active
pViewFlow | Memo | View Flow | [Context Browser].[Workflow Context Browser].[Process Instance]
pStartDate | Date | ProcessStartDate | [Context Browser].[Workflow Context Browser].[Process Instance]
pBusinessKey | Text | 0 unless the primary key of the main business data is known
pProcessTypeId | GUID | ProcessTypeId | Data field
pProcessId | Number | 0
pFolio | Text | ProcessFolio | [Context Browser].[Workflow Context Browser].[Process Instance]
pOriginator | Text | ProcessOriginatorFQN | [Context Browser].[Workflow Context Browser].[Process Instance]
pDataFieldKey | GUID | DataFieldId | Data field

Creating an Audit entry

Now that the instance is registered against the process type, we need to make an entry in the audit log to record that the instance has started. Create a SmartObject event inside the ‘Setup’ activity and call it ‘Add to Audit’.

SmartObject: K2F.SMO.Audit
Method: Add Audit

Parameters (Name | Type | Value | Notes):
pDetails | Text | Starting [name of workflow] | Free text, whatever is appropriate
pProConfigId | Number | 0
pProcessInstanceId | Number | ProcessId | [Context Browser].[Workflow Context Browser].[Process Instance]
pCreatedDate | Date | ProcessStartDate | [Context Browser].[Workflow Context Browser].[Process Instance]
pCreatedBy | Text | ProcessOriginatorFQN | [Context Browser].[Workflow Context Browser].[Process Instance]

The ‘Add to Audit’ event can be copied and pasted into other activities in the workflow; you just need to change pDetails and pCreatedDate.

The ‘Setup’ activity should now look like this:

setup1

The workflow is now set up to use the framework.

 

 Client Events with the framework

With framework, the form’s URL is dynamic and stored externally, allowing the form to be changed at any point without the need to go back into the workflow.

To set this up in the workflow, carry out the following steps:

  1. Drag a ‘Default Client Event’ on to the canvas or into an Activity
  2. Give your event a name
  3. Check ‘Task Item URL’
  4. Add the following SmartObject Method

 

 

SmartObject: K2F.SMO.Client
Method: ClientEventURL
Return: FormUrl

Parameters (Name | Type | Value | Notes):
pClientEventName | Text | ActivityInstanceName | [Context Browser].[Workflow Context Browser].[Activity Instance]
pProcessTypeId | GUID | ProcessTypeId

 

defaultclientevent

 

  1. Make a note of the name of the activity where you have put the client event.
  2. Go to the Form Library and register the form’s URL against that client event name (see ‘Adding a Form to a Task’ below).

Updating the framework with a workflow instance’s status

The framework keeps track of the status of each registered workflow. This allows us to do reporting and to offer additional functionality based on its status.

To update the framework when the status changes, follow these steps:

  1. Create a SmartObject Event
  2. Select ‘K2F.SMO.ProcInst’ SmartObject
  3. Select the ‘Update Status’ method
  4. For the input mapping ‘pStatus’, enter free text describing what the status is
  5. For ‘pId’ input parameter use the data field ‘K2FID’

updatestatus

Like the audit event, this event can be copied into other activities where the status needs to be updated.

 

Dynamic Escalation and Start Rules

When there needs to be an escalation on an ‘Activity’ or ‘Process’, we can use K2CF to make the time frames dynamic.

Setting up the escalation

  1. Click on the clock icon on the Activity
  2. Click ‘Next’ and give the escalation a name
  3. Select a ‘Rule Template’; in this case I am going to select ‘Escalate After’
  4. Click ‘Next’
  5. Click on the ellipsis in ‘Days’ to bring up the context browser.
  6. Add the following SmartObject Method

emptyescalation

 

SmartObject: K2C.CMN.SMO.Time
Method: Read

Parameter (Name | Type | Value):
pTimeId | GUID | ProcessConfigValue from the ‘Configuration Data’ reference, filtered on ProcessConfigName = TaskTime

It doesn’t have to be ‘TaskTime’; it can be whatever you have called the reference value.

 

Days

 

 

finishedescalation

  1. Repeat steps 5 and 6 for Hours, Minutes and Seconds, changing the return value to the correct return property for each section. For example, if you are on Hours, the return would be Hours.
  2. Make a note of the name you passed into ProcessConfigName, for example ‘TaskTime’.
  3. You can also do the same for a ‘Start Rule’.

 

End Activity

Every workflow must have an ‘End’ activity. In this activity we delete all external data fields relating to the instance of the workflow, update the status of the process instance and update the last entry in the audit table for the instance.

Deleting External Data Fields

  1. Create a SmartObject event
  2. Select the ‘K2F.SMO.DataField’ SmartObject
  3. Select the method ‘Delete Data Fields’
  4. It will ask for ‘pDataFieldKey’, so assign it the ‘DataFieldId’ data field
  5. Complete the event. Now, when this event is executed, it will delete all the data fields that were created for this instance, making the workflow efficient in cleaning up data it no longer needs.

externaldatafield4

Audit Trail

The End activity will also need an entry added to the audit to say that the workflow has finished.

Updating the framework on process status

The framework process instance status will also need to be updated to ‘Complete’

Creating an external Data Field

To create an external data field

  1. Drag a SmartObject event onto the canvas
  2. Call the event ‘Create DataField’ or something similar
  3. Find the K2F.SMO.DataField SmartObject in the context browser
  4. Select the ‘Create Data Field’ method
  5. Now we need to supply the method with some additional information:
    1. pDataFieldDescription, a description of the data field we are creating
    2. pDataFieldKey, the data field key that we created earlier
    3. pDataFieldTitle, the name we are giving the data field
    4. pDataFieldValue, the value of the data field
    5. pProcessInstanceId, the Id of the process instance
  6. Click ‘Next’ and ‘Finish’
  7. We have now created an external data field
  8. Repeat steps 1 to 6 for additional external data fields if needed

externaldatafield

Using an external data field

To use an external data field in an event or a line rule:

externaldatafield1

  1. Open the context browser
  2. Go to the data tab (3rd tab)
  3. Expand references
  4. Expand ‘External Datafields’
  5. Drag ‘DataFieldValue’ into the section you want to get the data field value
  6. The filter window opens up, click on ‘Add’
  7. In ‘Left’ select ‘DataFieldTitle’ from the drop down list
  8. Operation should be ‘=’
  9. In ‘Right’ enter the name of the data field you want to get the value of.

externaldatafield2

10. Click ‘Next’

11. Make Sure ‘DataFieldValue’ is selected and click ‘Next’

12.  Make sure ‘Return a single item’ is checked and click ‘Finish’

13. We have now called the external data fields to get the value for a particular data field

externaldatafield3

Line Rules

We can also use the framework for line rules and other workflow logic.

  1. Right click on a line
  2. Give the line an appropriate name and prefix it with ‘ln’, for example ‘lnVotingAge’
  3. In the label name section, prefix it with ‘LR’ and then describe what the rule is doing
  4. Click on the line rule icon and click on ‘Add’
  5. Click on the ellipsis
  6. Expand ‘References’
  7. Expand ‘Configuration Data’
  8. Drag ‘ProcessConfigValue’ into the ‘Variable’ section
  9. In Filter click on ‘Add’
  10. Select ‘ProcessConfigName’
  11. Select ‘=’ operation
  12. In Right, enter the keyword for the value, for example ‘AgeLimit’
  13. Click ‘Next’ and ‘Next’ again
  14. Click ‘Finish’
  15. In the second variable put in your business data as normal.

linerules2

We have now set up a dynamic line rule where both the business data and the variable it is being compared against come from outside of the workflow.

 

 

 

Managing the framework

Adding a form to the form library

All forms, whether SmartForms or HTML5 forms, must be registered to allow the framework to use them in workflows.

formlib

  1. Go to the form K2F.ADM.SMF.ProcessManager
  2. Click on ‘Add’ in the Form Library
  3. Enter in the name of the form
  4. Complete URL of the form
  5. Tick the checkbox
  6. Click ‘Save’
  7. The form is now registered and can be used.

Adding A Form to A Task

For Default Client Events to get a form in a workflow, we need to link the form and the client event together.

  1. Click on ‘Client Event’
  2. Search for the desired form in the form library
  3. Give the ‘Client event name’ the same name as the activity that the client event sits in within the workflow.
  4. Click on Save
  5. We can use this form now in the workflow as described above

task

 

Swagger Definition Library

 

K2F provides support for managing the Swagger definitions used by the REST broker. It does this in two ways:

  1. There will be a Web API handler that accepts POSTs; the idea is that when the global registry changes, it will post the new host address and application name to the service. The service then finds the correct Swagger definition in the library and updates it with the new host address. It then exports the Swagger definition file to the Swagger folder on the server and updates K2 with the new definition. (ToDo)
  2. In the portal you can view all the Swagger definitions; new ones can be added and old ones can be edited. Once created in the library, this will create the Swagger JSON file on the K2 server.

swaggerdefintionlib

Add  a new Swagger file

  1. Click on ‘Add Swagger Definition’
  2. A form loads up where you can enter the definition name, host address and the definition itself
  3. demoswagger
  4. Click on ‘Submit’
  5. This will save the Swagger definition in the library and also create its corresponding JSON file on the server.
  6. You will still need to create the instance manually using the Service Object Tester tool

savedswagger

Script Library

Some SmartForms may require additional scripts, such as the K2C.CMN.VWI.Footer view. As with the workflows, instead of hard coding the script into the view and having to check the view out and in every time there is a change, the script library allows us to make the change from the management tool without the risk of breaking the view or form.

scriptlibrary

From the script library the following actions are possible

  1. Add
  2. Edit

script2

When you have finished adding and editing scripts, click on ‘Save’.

 

 

Registering a process

dash-1

  1. Deploy the process (this can be done at any time during the workflow setup, but must be done before we carry on with the following steps)
  2. Go to the form K2F.ADM.SMF.ProcessManager
  3. Click on ‘Register Process Type’
  4. A pop-up window will appear; enter the workflow name, a description and a display name
  5. Click on Submit. We have now told K2F about this workflow.
  6. We can now start to add configuration data about the workflow.

addproctypr

Process Configuration Data

processconfiguration-1

In the process configuration form, we can add and edit configuration data for a workflow.

Navigating to this form can be done either by clicking on the process type or by adding a new process type.

Adding  Time Configuration

To add time configuration that will be used for escalations and start rules, as mentioned above:

timeconfiglist

  1. Click on ‘Add Time’
  2. Time configuration window opens
  3. Enter in a name, description
  4. Select a value from the drop down list for each of the time sections
  5. Click on Submit
  6. Time is now added

So that we can use the time frame in a workflow, we now need to link the time frame to the ‘Configuration Data’.

Configuration Data

Configuration data is used to hold information that will be needed for every instance of a particular workflow.

configdata

  1. Click on ‘Add’
  2. Enter a description of the config data being added
  3. Enter the actual value
  4. Enter a name for the config data
  5. In this example we are going to add the link to the time frame we created earlier.

sample

 

 

As you can see, the framework allows workflows to be flexible and allows apps to be built quickly and easily.

Now that the framework has been introduced, next time we will look at what else it can do and build a workflow using it.