Quartz.NET Tutorial - Lesson 3: More About Jobs & JobDetails

Source: Internet
Author: User
Tags: serialization
Lesson 3: More About Jobs and JobDetails

As you saw in Lesson 2, jobs are rather easy to implement. There are just a few more things that you need to understand about the nature of jobs, about the Execute(..) method of the IJob interface, and about JobDetails.

While the job class you implement contains the code that knows how to do the actual work of that particular type of job, Quartz.NET needs to be informed about the various attributes you may wish an instance of that job to have. This is done via the JobDetail class, which was mentioned briefly in the previous section.

JobDetail instances are built using the JobBuilder class. JobBuilder allows you to describe your job's details using a fluent interface.

Let's take a moment now to discuss a bit about the "nature" of jobs and the life-cycle of job instances within Quartz.NET. First, let's take a look back at a snippet of code we saw in Lesson 1:

using Quartz;

// define the job and tie it to our HelloJob class
IJobDetail job = JobBuilder.Create<HelloJob>()
    .WithIdentity("myJob", "group1")
    .Build();
Notice that we give the scheduler an IJobDetail instance, and that it refers to the job to be executed simply by providing the job's class. Each (and every) time the scheduler executes the job, it creates a new instance of the class before calling its Execute(..) method. One ramification of this behavior is that jobs must have a no-argument constructor. Another ramification is that it does not make sense to define data fields on the job class, as their values would not be preserved between job executions.

You might now be asking: "How can I provide properties/configuration for a job instance?" and "How can I keep track of a job's state between executions?" The answer to both questions is the same: the key is the JobDataMap, which is part of the JobDetail object.

JobDataMap

The JobDataMap can be used to hold any number of (serializable) objects that you wish to have made available to the job instance when it executes. JobDataMap is an implementation of the IDictionary interface, and has some added convenience methods for storing and retrieving data of primitive types.

Here are some quick snippets of putting data into the JobDataMap prior to adding the job to the scheduler:

Setting Values in a JobDataMap

// define the job and tie it to our DumbJob class
IJobDetail job = JobBuilder.Create<DumbJob>()
    .WithIdentity("myJob", "group1")           // name "myJob", group "group1"
    .UsingJobData("jobSays", "Hello World!")
    .UsingJobData("myFloatValue", 3.141f)
    .Build();

Here's a quick example of getting data from the JobDataMap during the job's execution:

Getting Values from a JobDataMap

public class DumbJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        JobKey key = context.JobDetail.Key;

        JobDataMap dataMap = context.JobDetail.JobDataMap;

        string jobSays = dataMap.GetString("jobSays");
        float myFloatValue = dataMap.GetFloat("myFloatValue");

        Console.Error.WriteLine("Instance " + key + " of DumbJob says: " + jobSays + ", and val is: " + myFloatValue);
    }
}

If you use a persistent JobStore (discussed in the JobStore section of this tutorial), you should use some care in deciding what you place in the JobDataMap, because the objects in it will be serialized and therefore become prone to class-versioning problems. Obviously standard .NET types should be very safe, but beyond that, any time someone changes the definition of a class for which you have serialized instances, care has to be taken not to break compatibility.

Optionally, you can put AdoJobStore and JobDataMap into a mode where only primitives and strings can be stored in the map, thus eliminating any possibility of later serialization problems.
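As a rough sketch of what that mode looks like (assuming AdoJobStore is configured programmatically; the remaining data-source and connection settings are omitted here and must match your environment), the quartz.jobStore.useProperties setting restricts JobDataMap contents to string values:

using System.Collections.Specialized;
using Quartz;
using Quartz.Impl;

var properties = new NameValueCollection();
properties["quartz.jobStore.type"] = "Quartz.Impl.AdoJobStore.JobStoreTX, Quartz";
properties["quartz.jobStore.useProperties"] = "true";   // only strings allowed in JobDataMaps
// ... data source, provider and connection string settings would also be required ...

ISchedulerFactory factory = new StdSchedulerFactory(properties);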

If you add setter properties to your job class that correspond to the names of keys in the JobDataMap, then Quartz's default JobFactory implementation will automatically call those setters when the job is instantiated, thus removing the need to explicitly get the values out of the map within your Execute method.

Triggers can also have JobDataMaps associated with them. This can be useful in the case where you have a job that is stored in the scheduler for regular/repeated use by multiple triggers, yet with each independent triggering you want to supply the job with different data inputs.
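For instance, as a minimal sketch (the trigger key and the "triggerSays" map key below are purely illustrative), a trigger can carry its own JobDataMap entries via TriggerBuilder:

ITrigger trigger = TriggerBuilder.Create()
    .WithIdentity("myTrigger", "group1")        // illustrative trigger key
    .ForJob("myJob", "group1")                  // the stored JobDetail this trigger fires
    .UsingJobData("triggerSays", "Hi there!")   // data specific to this triggering
    .StartNow()
    .Build();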

The JobDataMap that is found on the JobExecutionContext during job execution serves as a convenience: it is a merge of the JobDataMap found on the JobDetail and the one found on the Trigger, with the values in the latter overriding any same-named values in the former.

Here's a quick example of getting data from the JobExecutionContext's merged JobDataMap during the job's execution:

public class DumbJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        JobKey key = context.JobDetail.Key;

        JobDataMap dataMap = context.MergedJobDataMap;   // note the difference from the previous example

        string jobSays = dataMap.GetString("jobSays");
        float myFloatValue = dataMap.GetFloat("myFloatValue");
        IList<DateTimeOffset> state = (IList<DateTimeOffset>)dataMap["myStateData"];
        state.Add(DateTimeOffset.UtcNow);

        Console.Error.WriteLine("Instance " + key + " of DumbJob says: " + jobSays + ", and val is: " + myFloatValue);
    }
}

Or if you wish to rely on the JobFactory "injecting" the data map values onto your class, it might look like this instead:

public class DumbJob : IJob
{
    public string JobSays { private get; set; }
    public float FloatValue { private get; set; }

    public void Execute(IJobExecutionContext context)
    {
        JobKey key = context.JobDetail.Key;

        JobDataMap dataMap = context.MergedJobDataMap;   // note the difference from the previous example

        IList<DateTimeOffset> state = (IList<DateTimeOffset>)dataMap["myStateData"];
        state.Add(DateTimeOffset.UtcNow);

        Console.Error.WriteLine("Instance " + key + " of DumbJob says: " + JobSays + ", and val is: " + FloatValue);
    }
}

You'll notice that the overall code of the class is longer, but the code in the Execute() method is cleaner. One could also argue that although the code is longer, it actually took less coding if the programmer's IDE was used to auto-generate the properties, rather than hand-coding the individual calls to retrieve the values from the JobDataMap. The choice is yours.

Job "Instances"
Job Instance

Many users spend time being confused about what exactly constitutes a "job instance". We'll try to clear that up here and in the section below about job state and concurrency.

You can create a single job class and store many "instance definitions" of it within the scheduler by creating multiple instances of JobDetails, each with its own set of properties and JobDataMap, and adding them all to the scheduler.

For example, you can create a class that implements the IJob interface called "SalesReportJob". The job might be coded to expect parameters sent to it (via the JobDataMap) to specify the name of the sales person that the sales report should be based on. You may then create multiple definitions (JobDetails) of the job, such as "SalesReportForJoe" and "SalesReportForMike", which have "joe" and "mike" specified in the corresponding JobDataMaps as input to the respective jobs.

When a trigger fires, the JobDetail (instance definition) it is associated with is loaded, and the job class it refers to is instantiated via the JobFactory configured on the scheduler. The default JobFactory simply calls the default constructor of the job class using Activator.CreateInstance, then attempts to set properties on the class that match the names of keys within the JobDataMap. You may want to create your own implementation of JobFactory to accomplish things such as having your application's IoC or DI container produce/initialize the job instance.
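A hedged sketch of what such a factory could look like, assuming the Quartz.NET 2.x shape of IJobFactory (the IMyContainer abstraction below is hypothetical and stands in for whatever container you actually use):

using System;
using Quartz;
using Quartz.Spi;

public interface IMyContainer   // hypothetical container abstraction
{
    object Resolve(Type type);
}

public class ContainerJobFactory : IJobFactory
{
    private readonly IMyContainer container;

    public ContainerJobFactory(IMyContainer container)
    {
        this.container = container;
    }

    public IJob NewJob(TriggerFiredBundle bundle, IScheduler scheduler)
    {
        // ask the container for the job type recorded in the JobDetail
        return (IJob)container.Resolve(bundle.JobDetail.JobType);
    }

    public void ReturnJob(IJob job)
    {
        // release the instance here if your container tracks lifetimes
    }
}

// wiring it up before the scheduler starts:
// scheduler.JobFactory = new ContainerJobFactory(myContainer);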

In "Quartz speak", we refer to each stored JobDetail as a "job definition" or "JobDetail instance", and we refer to each executing job as a "job instance" or "instance of a job definition". Usually if we just use the word "job" we are referring to a named definition, or JobDetail. When we are referring to the class implementing the IJob interface, we usually use the term "job type".

Job State and Concurrency

Now, some additional notes about a job's state data (aka JobDataMap) and concurrency. There are a couple of attributes that can be added to your job class that affect Quartz's behaviour with respect to these aspects.

DisallowConcurrentExecution is an attribute that can be added to the job class to tell Quartz not to execute multiple instances of a given job definition (one that refers to the given job class) concurrently. Notice the wording there, as it was chosen very carefully. In the example from the previous section, if "SalesReportJob" has this attribute, then only one instance of "SalesReportForJoe" can execute at a given time, but it can execute concurrently with an instance of "SalesReportForMike". The constraint is based upon an instance definition (JobDetail), not on instances of the job class. However, it was decided (during the design of Quartz) to have the attribute carried on the class itself, because it does often make a difference to how the class is coded.

PersistJobDataAfterExecution is an attribute that can be added to the job class to tell Quartz to update the stored copy of the JobDetail's JobDataMap after the Execute() method completes successfully (without throwing an exception), so that the next execution of the same job (JobDetail) receives the updated values rather than the originally stored values. Like the DisallowConcurrentExecution attribute, this applies to a job definition instance, not a job class instance, though it was decided to have the job class carry the attribute because it does often make a difference to how the class is coded (e.g. the "statefulness" will need to be explicitly "understood" by the code within the Execute method).

If you use the PersistJobDataAfterExecution attribute, you should strongly consider also using the DisallowConcurrentExecution attribute, in order to avoid possible confusion (race conditions) over what data is left stored when two instances of the same job (JobDetail) execute concurrently.
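A minimal sketch of a job class carrying both attributes (the "runCount" key is purely illustrative and would normally be seeded via UsingJobData when the JobDetail is built):

[DisallowConcurrentExecution]
[PersistJobDataAfterExecution]
public class SalesReportJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        JobDataMap dataMap = context.JobDetail.JobDataMap;

        // read, update and write back a counter; the updated map is persisted
        // after Execute returns because of PersistJobDataAfterExecution
        int runCount = dataMap.ContainsKey("runCount") ? dataMap.GetInt("runCount") : 0;
        dataMap.Put("runCount", runCount + 1);
    }
}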

Other Attributes of Jobs

Here's a quick summary of the other properties that can be defined for a job instance via the JobDetail object:

Durability - if a job is non-durable, it is automatically deleted from the scheduler once there are no longer any active triggers associated with it. In other words, non-durable jobs have a life span bounded by the existence of their triggers.

RequestsRecovery - if a job "requests recovery", and it is executing at the time of a "hard shutdown" of the scheduler (i.e. the process it is running within crashes, or the machine is shut off), then it is re-executed when the scheduler is started again. In that case, the JobExecutionContext.Recovering property will return true.
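Both settings can be expressed when building the JobDetail. A minimal sketch (the job class and key are the illustrative names used earlier):

IJobDetail job = JobBuilder.Create<SalesReportJob>()
    .WithIdentity("SalesReportForJoe", "reports")
    .StoreDurably()       // keep the JobDetail even when no triggers reference it
    .RequestRecovery()    // re-execute if the scheduler shuts down hard while the job runs
    .Build();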

JobExecutionException

Finally, we need to inform you of a few details of the IJob.Execute(..) method. The only type of exception that you should throw from the Execute method is a JobExecutionException. Because of this, you should generally wrap the entire contents of the Execute method in a try-catch block. You should also spend some time looking at the documentation for JobExecutionException, as your job can use it to provide the scheduler with various directives as to how you want the exception to be handled.
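A minimal sketch of that pattern (the job class name is illustrative; RefireImmediately is just one of the directives a JobExecutionException can carry):

public class SafeJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        try
        {
            // ... do the actual work of the job ...
        }
        catch (Exception e)
        {
            // wrap the failure and tell the scheduler how to react
            JobExecutionException jee = new JobExecutionException(e);
            jee.RefireImmediately = true;   // ask Quartz to run the job again right away
            throw jee;
        }
    }
}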

