We are looking to build a workflow to replace our company's manual work, such as:
1) Working from emails (system generated, coming from external parties, etc.)
2) Spreadsheets (mostly errors from systems/applications, custom queries, etc.)
3) Images (appeals, claims, faxes, etc.)
There are several business units and teams. Case workers can get work from multiple streams.
Each team has its own rules for segregating and managing work. Sometimes people are shared within a department based on workload. How do we give the business the capability to move people around across different streams of work?
What is the best way to implement this kind of requirement? We are looking for ideas on managing people and work. How should we use workgroups vs. workbaskets? One application vs. multiple applications? And how do we handle switching between applications?
This will help the company replace several small workflow tools.
There isn't really a one-size-fits-all answer to your question. You can definitely do it as one monolithic application with tasks broken up by workbaskets and roles. You can also build multiple applications on a common rulebase and grant the appropriate operators access to each application. I think a lot of it comes down to how much data needs to move from one task to the next. If you want the same object to go to different departments for some bit of work and then be passed somewhere else, you're better off with a shared codebase. If the work is atomic and could theoretically run on separate servers, then isolating the application keeps it small and lightweight in case you decide you want to move it later.
Another factor might be who the developers are. If you have multiple unrelated teams, sharing a common codebase will require coordination between them to ensure that what is being built works across the shared layers. Keeping applications separate avoids some of that, but at the cost of one team not being able to leverage another team's work as easily.