Hi Rebeccah,
1. What do you mean when you say you "deploy/promote" certain Globals, rather than creating them in multiple spots? Is deploying/promoting them different from importing them?
- Correct, sorry if the terminology was confusing. If a new Global Variable (or Condition, Connection, Notification etc.) is required for a new or existing job, it is created in Dev and then 'migrated' through the environments using selective export / import (in your case Dev, then QA, then Prod). This keeps the Object ID (GUID) of the objects consistent between the environments (there's a quick way to check that via the API, sketched at the end of this point). We don't do this for all jobs, but certainly for the larger / more complicated / critical ones, or the ones we want to work on multiple servers.
- Basically Create Once and Export/Import it wherever else you need it.
- Same thing with Users . . .
- Not a perfect way to do this . . . "measure 3 times, cut once" comes to mind . . .
- Working through the Client while connected to multiple servers also works well where appropriate, for Jobs for example.
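As an aside, one way we can sanity check that an object kept its GUID after an export / import is via the API (covered in point 2 below). This is only a rough PowerShell sketch: the assembly paths, server names, credentials and some member names (the Variables collection, the Id property) are assumptions from memory, so verify them against the samples in the API folder.

# Rough sketch only - paths, server names, credentials and some member names are
# assumptions; check the API samples shipped with VisualCron for the exact calls.
Add-Type -Path 'C:\Program Files (x86)\VisualCron\VisualCronAPI.dll'
Add-Type -Path 'C:\Program Files (x86)\VisualCron\VisualCron.dll'

function Get-VCVariableId([string]$address, [string]$variableName) {
    $client = New-Object VisualCronAPI.Client
    $conn   = New-Object VisualCronAPI.Connection
    $conn.Address  = $address        # may also need connection type / port settings for remote servers
    $conn.UserName = 'apiUser'       # placeholder credentials
    $conn.PassWord = 'apiPassword'
    $server = $client.Connect($conn, $true)
    try {
        # Find the Global Variable by name and return its Id - the GUID that should
        # stay the same when the variable is exported from Dev and imported into QA/Prod
        ($server.Variables.GetAll() | Where-Object { $_.Name -eq $variableName }).Id
    }
    finally {
        $client.Disconnect()
    }
}

$devId  = Get-VCVariableId 'vcron-dev'  'MyGlobalVariable'
$prodId = Get-VCVariableId 'vcron-prod' 'MyGlobalVariable'
if ($devId -eq $prodId) { 'GUIDs match' } else { 'GUIDs differ - re-import rather than re-create' }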
2. We don't use the API a lot; for us, parameter passing to PowerShell has come a long way. You will find more extensive API users in the forum, but we do use it a bit (from PowerShell). It allows us to extract various VisualCron Job and Task information, or Variables, via scripts or programs for use elsewhere. You will find other users making far more extensive changes to VisualCron via the API, particularly where VisualCron itself needs to change, and programmatically rather than via a user change.
So basically the API allows us to connect to a VisualCron server from a program or script and then use the object model to access / extract and update the Jobs, Tasks, and Global Objects (Variables etc.) - there's a rough sketch of the pattern we use at the end of this point.
In my example above, we wanted to make sure any code / command lines / parameters that may only exist in VisualCron are extracted and managed . . . as depending on your volume of Jobs you can only keep history (and Audit Trails for changes) around for so long . . .
VisualCron Help provides some more details on the API, and there is an API folder under the installation folder. If you are a .NET shop or have an interest then it might be worth a look. We are also aware of the Web API but have not progressed with that as yet, as the existing API is doing what we need.
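For what it's worth, the pattern we follow from PowerShell looks roughly like the sketch below: load the assemblies from the API folder, connect, then walk the Jobs and Tasks and push the details out to a file. Treat it as a starting point only - the output path and some member names (TaskType, for example) are assumptions from memory rather than exact code.

# Rough sketch of pulling a Job / Task inventory out of VisualCron for documentation.
# Member names are from memory - confirm them against the API help / samples.
Add-Type -Path 'C:\Program Files (x86)\VisualCron\VisualCronAPI.dll'
Add-Type -Path 'C:\Program Files (x86)\VisualCron\VisualCron.dll'

$client = New-Object VisualCronAPI.Client
$conn   = New-Object VisualCronAPI.Connection
$conn.Address = 'localhost'   # local server; remote servers need credentials / connection settings too
$server = $client.Connect($conn, $true)

try {
    # Walk every Job and its Tasks and flatten them into simple objects we can keep
    # outside VisualCron (command lines / parameters can be pulled the same way)
    $inventory = foreach ($job in $server.Jobs.GetAll()) {
        foreach ($task in $job.Tasks) {
            [pscustomobject]@{
                Job  = $job.Name
                Task = $task.Name
                Type = $task.TaskType
            }
        }
    }
    $inventory | Export-Csv -NoTypeInformation -Path 'C:\Temp\vcron-inventory.csv'   # placeholder path
}
finally {
    $client.Disconnect()
}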
3. Correct, we rely on *very* selective exports and imports to move these jobs/objects around when we need to. It's also a good way to move a job back to another <test> server to investigate issues, or to grab something out of a backup.
You may not be this disciplined about all your jobs / servers, but we certainly have more complicated / critical jobs where we are very structured about how they are handled. Good examples of this are any jobs interacting with Banks / Payment Providers . . . . or Jobs dealing internally with particular key systems / databases which may also have Dev / QA / Production setups.
For some jobs we also include a version number in the Job name if the change is major / critical . . . so v1.0 of a job might be live in Production (PayPal for example), and we may clone the job and have a changed v2.0 moving through Dev / QA / Production. This lets us manage the cutover, keep v1.0 disabled for a short time during transition, and support a clean reversion if you need one. This is very much the minority though! . . . but it often mirrors cases where the scripts or programs the Jobs and Tasks control are themselves versioned . . . and potentially quite different (the parameters passed, for example) between versions.
Happy OCD Day 😎
Hope this helps
Kevin