Creating a SharePoint Timer Job using SharePoint Designer 2013


What do I really mean?

Now that I have got your attention with that eye-catching title, let me expand and qualify it. So, what I will be doing exactly is:

  1. Inspecting a List (a Task List, actually) for Tasks that are due and overdue.
  2. Checking them nightly (every 24 hours, to be exact).
  3. If tasks are overdue, taking ‘some’ action on them. In this instance I will be changing a KPI field, but the action can be as involved as your particular use case demands; see the Use Case below for one I have in mind.
  4. Pausing the Workflow for 24 hours and repeating (the whole loop is sketched just below this list).
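
Here is the whole loop in miniature, as a hedged Python sketch of the logic only; the real implementation in this post is a no-code SharePoint Designer workflow, and the site URL, list name, and per-task action below are placeholders.

```python
import time
import requests

SITE_URL = "http://yoursite"   # placeholder site
LIST_NAME = "Tasks"            # placeholder Task list
HEADERS = {"Accept": "application/json; odata=verbose"}  # ask for JSON, not XML

def run_timer_job():
    """Nightly pass: fetch the tasks, act on the overdue ones, sleep 24 hours."""
    while True:
        url = f"{SITE_URL}/_api/web/lists/getbytitle('{LIST_NAME}')/items"
        tasks = requests.get(url, headers=HEADERS).json()["d"]["results"]
        for task in tasks:
            pass  # inspect DueDate and update a KPI field here (detailed later)
        time.sleep(24 * 60 * 60)  # pause for 24 hours, then repeat
```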

By the way, this CAN also be used in SharePoint Online.

Use Case

In this use case I have been charged with monitoring Task lists (multiple Task lists from various SharePoint Sites/Site Collections) and aggregating them. Once that is complete, we need to surface the overdue tasks in a KPI dashboard and email a link to the managers of the offending task owners.

For my POC demo here I will show how to query one Site Collection; to get multiple, all you have to do is make a separate REST call to each particular List’s API, and the rest is easy (see the sketch below).
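
For illustration, here is a minimal Python sketch of that aggregation, assuming placeholder site URLs and list names:

```python
import requests

# Each (site, list) pair gets its own REST call; results are appended into one
# collection. The site URLs and list names below are placeholders.
HEADERS = {"Accept": "application/json; odata=verbose"}
site_lists = [
    ("http://siteA", "Tasks"),
    ("http://siteB", "Project Tasks"),
]

all_tasks = []
for site, list_name in site_lists:
    url = f"{site}/_api/web/lists/getbytitle('{list_name}')/items"
    all_tasks += requests.get(url, headers=HEADERS).json()["d"]["results"]
```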

Pre-Requisites and Technologies Used

In this example I am using an on-premises SharePoint 2013 farm with Workflows enabled, and I am using SharePoint Designer 2013 as well; that is it for tooling, i.e. NO CODE. I will be using the SharePoint 2013 REST API to read the SharePoint Task Lists as my Entry Point URI. The generic URL is http://<sitename>/_api/web/lists/getbytitle('<listNameHere>')/items, which fetches List Items from a named list using OData. Now, by default this will be returned to you in XML, so in order to use it in SharePoint Designer, which requires JSON inside the Dictionary object (see my post here if you need a refresher on that), you will need to modify the headers to request JSON via the Accept header. The rest is a matter of using Loops to iterate through the List Items returned, and pausing for the time allotted.
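
To make the XML-versus-JSON point concrete, here is a minimal sketch, assuming a placeholder site and list name and leaving authentication to your HTTP session:

```python
import requests

# Placeholder site and list; authentication (e.g. NTLM on-premises) is assumed
# to be handled by your HTTP session and is omitted here.
url = "http://yoursite/_api/web/lists/getbytitle('Tasks')/items"

# With no Accept header, SharePoint answers with an ATOM/XML feed.
xml_feed = requests.get(url).text

# With the Accept header below, the same endpoint returns verbose OData JSON,
# the shape SharePoint Designer's Dictionary object expects.
json_feed = requests.get(
    url, headers={"Accept": "application/json; odata=verbose"}
).json()
```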

Approach

As usual, I put my URI inside a Variable to obscure the name and any API keys. This time there was no need, but it makes for better programming, especially if there are many of them and I want to swap them out from time to time without affecting the core program.

image

The full list of properties for the Call HTTP Web Service action is shown below, since it is truncated in the image above.

image

The Request Header is “Accept: application/json; odata=verbose”; you can place that in a Workflow Dictionary variable and pass it into the HTTP Web Service Properties.

image

Once you have done that, then as seen in the first image and in my previous blog post on the same topic, you parse the JSON results to get the part of the dataset that you want. In this case I want what is under the “d/results” node, and to do that I use the Get Item from Dictionary action to parse out what I need. Then I count the items for good measure, save the “Count” as a variable to be used as the upper bound of my Loop counter, and log it out.
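
Expressed as a hedged Python sketch (variable names mirror, rather than reproduce, the workflow’s; the site and list names are placeholders):

```python
import requests

# Hedged sketch of the parse-and-count step.
url = "http://yoursite/_api/web/lists/getbytitle('Tasks')/items"
payload = requests.get(
    url, headers={"Accept": "application/json; odata=verbose"}
).json()

results = payload["d"]["results"]  # the "d/results" node, via Get Item from Dictionary
count = len(results)               # saved as the "Count" variable
print("Items returned from REST call is:", count)  # logged to the history list in SPD
```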

Do Business Logic

After setting the stage (getting the URI, returning the JSON data, parsing it down to the node you want, and getting a Count), the next thing is the business logic.

What you will see in the image below is the Variable Count being used as the upper bound of our Loop, and the JSON data being read into local variables to be manipulated in the Workflow logic.
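
Before the screenshot, here is roughly the same loop stage as a hedged Python sketch; the URL, list, and the Title/DueDate columns are placeholders for your own site and fields:

```python
import requests

url = "http://yoursite/_api/web/lists/getbytitle('Tasks')/items"
headers = {"Accept": "application/json; odata=verbose"}
results = requests.get(url, headers=headers).json()["d"]["results"]
count = len(results)                    # the "Count" variable, our upper bound

indexer = 0
while indexer < count:                  # SPD: loop until the counter hits Count
    item = results[indexer]             # SPD: Get ([%Variable:indexer%])/Title etc.
    title = item["Title"]               # copy fields into local variables
    due_date = item["DueDate"]
    # conditional business logic on due_date goes here (sketched further down)
    indexer = indexer + 1               # SPD: increment the loop counter
```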

//MY SOAPBOX

So, I have been talking about this for a while; most people who have known me for a long time know me because of my work and efforts in SharePoint BCS. But I have actually shifted my thinking in some respects regarding what BCS allows (even in 2013) versus what Workflows give you in terms of interacting with external data. You see, my point is: if you do not need to look at that data (i.e. as an External List), or need to search on that data (i.e. an External Content Type as a Content Source), I THINK it is in your best interest, from a simplicity standpoint, a performance standpoint, a best-of-breed standpoint… to do this as a Workflow calling external web services, manipulating the data how you want, and then getting rid of it (i.e. the Variables) when done. That way the data is only used for its intended purpose, then goes away.

//END SOAPBOX

Back to our story: what we do next is inspect the Due Date and use conditional statements to effect changes to whichever fields you need. In this instance I am updating a KPI field based on how far along, or how overdue, a Due Date is on a Task List. Now, the beauty of this use case is that I can apply the same logic to any number of Lists, both on-premises and in Office 365; I have scale and scope down to a tee 🙂
image
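
As a rough Python rendering of that conditional, here is a hedged sketch; the thresholds and KPI labels are my own placeholders, not the workflow’s exact values, and the actual field write would be SPD’s Update List Item action:

```python
from datetime import datetime, timezone

def kpi_for(due_date_iso: str) -> str:
    """Map a task's DueDate to a KPI value. The thresholds and labels are
    illustrative placeholders, not the workflow's exact values."""
    due = datetime.fromisoformat(due_date_iso.rstrip("Z")).replace(tzinfo=timezone.utc)
    now = datetime.now(timezone.utc)
    if due < now:
        return "Red"                     # overdue
    if (due - now).days <= 3:
        return "Yellow"                  # due within three days
    return "Green"                       # on track

print(kpi_for("2014-05-01T00:00:00Z"))   # a long-past date maps to "Red"
```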
The logic loops through each item, does a check, and then increments the loop counter until it hits the “Count” variable value, then exits. Since this is a Timer Job, the next thing we need to do is pause it and wait; see below.

image

Now, this workflow was designed as a Site Workflow, so it can be run independently of any Library or List, and it can run forever.

Summary

So the next time someone asks you to do a Timer Job for them, especially if they say they want it on Office 365 (SharePoint Online), and it involves a use case similar to mine, or at the very least involves effecting changes to a List/Library, consider a Site Workflow with conditional logic and a Pause Duration. It really is that simple.

If this were being done in SharePoint Online, you would also need to capture and pass along the authentication cookies, FedAuth and rtFa, in your dictionary object.
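
Scripted against SharePoint Online, the idea looks roughly like this hedged sketch; how you acquire the FedAuth and rtFa cookie values (a federated sign-in flow) is out of scope here, so they appear as placeholders:

```python
import requests

# Placeholders: these cookie values come from a prior federated sign-in flow,
# which is out of scope here.
fed_auth = "<FedAuth cookie value>"
rt_fa = "<rtFa cookie value>"

response = requests.get(
    "https://yourtenant.sharepoint.com/_api/web/lists/getbytitle('Tasks')/items",
    headers={"Accept": "application/json; odata=verbose"},
    cookies={"FedAuth": fed_auth, "rtFa": rt_fa},
)
print(response.status_code)  # 200 if the cookies are still valid
```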

Cheers and Happy SharePointing



31 thoughts on “Creating a SharePoint Timer Job using SharePoint Designer 2013”

  • André

    Fabian,

    It is nice, but SharePoint Workflows aren’t the best performers. They consume CPU and memory, and when in a loop they will keep consuming resources. What are your thoughts on that?

    André

    • fabiangwilliams

      Hey Andre, great hearing from you again, mate; hope all is well, it’s been too long.
      So, to answer your questions and address your comments, let me make a few points, then I’ll summarize.
      1. Everything consumes some level of CPU and memory; it is how it is used and employed that is important.
      2. Loops are memory-intensive by nature; they have to keep track of items, counters, and iterations. Agreed.

      Notwithstanding that, I think your argument would be valid if we had our Workflow on our SharePoint Farm, as in the case of SP 2010; but if you structure your Azure Workflow Service in a proper farm, then it is only consuming the resources you deem fit, and you can also direct which node in the Workflow Farm these items run on. NOW, if you put this on BCS, you don’t have that isolation. That, and the reasons I outlined, is why I think WFs are better for this job. The real value is that this can be used in O365, and since MS isn’t allowing you to work in their Farm to do Timer Jobs, it’s a good route to go.

      • André

        Hi Fabian,

        Thanks for your reply! Yes, life is great and I’m having a good time… though with less time to program; I’m only doing IT Pro stuff at the moment. You’re doing great with all these new features you use.

        I agree with you that WF is a way to bypass O365 limitations. It is a cool feature which you can use. Of course, I always watch memory and CPU consumption with WF. And Azure Workflow is a lot better than SP2010 WF. I will keep in mind that there are multiple possibilities regarding Timer Jobs.

        What about reboots, patches, etc.? Will it resume after a one-time failure, or does it stop?

        Cheerz mate!

  • marcel meth

    My experience has been that I prefer to have the workflow run once to completion every night. Then I schedule the workflow to run every evening at the same time. The reasons I prefer this are:

    1. I do not like to have a workflow waiting for a long period. I have found issues with Workflows not waking up properly.
    2. If I have the workflow scheduled to run every night, then if I need to make a change to the workflow, the newer version of the workflow will run in the future, not the one I started a month ago.

    • fabiangwilliams

      Marcel, thanks for the comments and your point of view; for the most part (I guess 50%, if I had to quantify 🙂) I’d agree. I think your second point is very poignant and makes perfect sense in that scenario. However, as to your first, I think that was an issue when Workflow was coupled with SharePoint, i.e. pre-SP2013; but with the decoupling and the ability to build a Workflow Farm, or even dedicate a WF box, you can segment work that could be, hmmm, risky, or that needs more horsepower, away from other business processes. Regardless, I always welcome additional points of view; thanks for sharing. Cheers.

  • Joshua

    Hi, nice walkthrough.
    I’m getting an error at the Get ([%Variable:indexer%])/… stages. I’ve triple-checked myself and can’t figure out what the problem could be.
    The error is “Looking up a value using a key is not supported on an instance of ‘Microsoft.Activities.Dynamic.DynamicJsonArray'”, so it clearly doesn’t like trying to address the dictionary array as an array.
    I’m running on Office 365, nothing on premises. Can you offer any insight into what I may be missing?
    I get the item count just fine.

    Cheers!

    • fwadmin Post author

      I think the problem may be that you are either (a) not getting back the value in your JSON payload that you expect because the call errored out, or (b) getting it back in a format you don’t expect. Have you tried looking at Fiddler to see what your output is, and validating against that? I’ve gotten those before; I go the Fiddler route, or I log out all my HTTP responses to see what’s going on. You could be getting an unauthorized response, and because of that your indexer will fail, because there is nothing to index.

    • Nate

      I had the same problem until I realized I had accidentally left off the parens ( ) around the [%Variable:indexer%] piece. You access the JSON array results like (0)/GUID, (1)/GUID, etc., so the parens are required.

  • Chris Jones

    I use this same process in O365 for a site workflow with a pause, then a loop back to the beginning to repeat. After a few days (1 day = 1 iteration of the workflow) I get the following message:

    “RequestorId: da719592-d728-7f2a-0000-000000000000. Details: The workflow instance exceeded the CPU usage limit of throttle of 00:00:01.2000000 and could not be unloaded because it was not persistable.”

    When I resume, all is well for a few days, but then the same thing happens again. I can’t find much info on this error; can you help?

    • fwadmin Post author

      If this is an on-premises environment, dig into the ULS logs for more detail and set logging to Verbose. If it is SPO via Office 365, take a look at any long-running task you may be trying to do, and refactor.

  • Chen

    Hi Fabian, I am a beginner with SharePoint, trying to come up with something that can make my workflow run every 24 hours. I tried to understand your steps, but they are still way above my head. Do you have more detailed, step-by-step instructions geared toward beginners who are not very familiar with SharePoint or coding, please?

    Appreciate it very much!

    • fwadmin Post author

      There is absolutely nothing needed that is not covered in the post. I wanted to point out that the solution ‘as is’ can be run both on-premises and on SharePoint Online.

  • Leon Bryant

    Hi Fabian! This looks very interesting. Forgive my layperson brain, but does it still require a server-side timer job external to the workflow to kick off the workflow at a certain time?

  • Amit Loh

    Hi Fabian! This is what I am looking for!!
    Does this technique work on any SharePoint list in the same site (as a site workflow)? And does it work over HTTPS using REST? Are there any issues related to changed values, and workarounds? (I see posts using Designer workflows, but they have gotchas.)

  • Mike

    Hi Fabian,

    I’m just getting started with SharePoint solutions.

    In your blog post, in the “Approach” part: what exactly are the variables (in your blog post’s case) in:

    1. then Log “Items returned from REST call is: ” to the workflow history list

    2. then Log “REST Call made to SharePoint List – m…” to the workflow history list

    Thanks for your help!

    Regards,
    Mike

  • Euan

    After your “Job End and Pause for Next 24 hour” stage you loop back to “Run Timer Job” rather than “Make REST Call”. Presumably this means that the workflow never makes the REST call again? Wouldn’t you want to make sure you’re working with the latest data?

    I ask because I’m having an issue with a similar workflow that does loop back to the start and makes the REST call again. The first time round everything works as expected, but on the second call I get the following:

    Details: An unhandled exception occurred during the execution of the workflow instance. Exception details: System.InvalidOperationException: Looking up a value using a key is not supported on an instance of ‘Microsoft.Activities.Dynamic.DynamicPrimitive’. at Microsoft.Activities.Dynamic.DynamicItem.TryGetValue(String key, DynamicItem& value) at Microsoft.Activities.Dynamic.DynamicValueBuilder.PathSegmentFactory.ObjectPathSegment.Get(DynamicItem obj) at Microsoft.Activities.GetDynamicValueProperty`1.Execute(CodeActivityContext context) at System.Activities.CodeActivity`1.InternalExecute(ActivityInstance instance, ActivityExecutor executor, BookmarkManager bookmarkManager) at System.Activities.Runtime.ActivityExecutor.ExecuteActivityWorkItem.ExecuteBody(ActivityExecutor executor, BookmarkManager bookmarkManager, Location resultLocation)

    As far as I can tell, the call gets made, but the problem occurs when trying to assign the result to a variable. I even had the workflow email me the raw JSON, and it looks correct both times.

    Any thoughts on this?