Two web projects were built with Spring + Spring MVC + MyBatis + Kafka: one acts as the producer, the other as the consumer.
The JMeter test tool simulates 100 users concurrently accessing the producer project, POSTing JSON data to the producer's interface; the producer then sends the JSON data to the Kafka cluster.
The consumer receives the messages from the Kafka cluster, parses each JSON payload into an object, and stores it in the MySQL database.
The following is the JMeter thread-group setup simulating 100 concurrent users:
The request data being sent:
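The original screenshot of the request body is not reproduced here; a payload of the shape accepted by the producer interface would look like the following (field names taken from the TbPerson POJO shown later in this post, values hypothetical):

```json
{
    "id": 1,
    "name": "Alice",
    "age": 20
}
```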
The following is the JMeter aggregate report for 10,000 requests from 100 users:
The following shows the time at which the producer finished producing the 10,000 messages:
The following shows the time at which the consumer project finished consuming and storing the messages:
Comparing the two timestamps: the producer finished sending all 10,000 messages only a few dozen seconds after the requests completed, but the consumer needed considerably longer to finish storing them, so the gap between the end of production and the end of storage was about 10 minutes.
The following shows the MySQL database; all the data was stored successfully:
The following is the POJO for the message:
```java
package com.xuebusi.pojo;

public class TbPerson {

    private Long id;
    private String name;
    private Integer age;

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name == null ? null : name.trim();
    }

    public Integer getAge() {
        return age;
    }

    public void setAge(Integer age) {
        this.age = age;
    }

    @Override
    public String toString() {
        return "TbPerson [id=" + id + ", name=" + name + ", age=" + age + "]";
    }
}
```
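As a sanity check on the POJO's behavior, a minimal standalone run (with the relevant logic inlined so no fastjson dependency is needed) confirms that `setName` trims surrounding whitespace and that `toString` produces the logged format:

```java
public class TbPersonCheck {
    // Inlined copy of the POJO's relevant logic for a standalone check.
    static class TbPerson {
        private Long id;
        private String name;
        private Integer age;
        public void setId(Long id) { this.id = id; }
        public void setName(String name) { this.name = name == null ? null : name.trim(); }
        public void setAge(Integer age) { this.age = age; }
        @Override
        public String toString() {
            return "TbPerson [id=" + id + ", name=" + name + ", age=" + age + "]";
        }
    }

    public static void main(String[] args) {
        TbPerson p = new TbPerson();
        p.setId(1L);
        p.setName("  Alice  "); // surrounding whitespace should be trimmed
        p.setAge(20);
        System.out.println(p);  // TbPerson [id=1, name=Alice, age=20]
    }
}
```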
The following is the logic of the producer side:
```java
package com.xuebusi.controller;

import com.alibaba.fastjson.JSON;
import com.xuebusi.pojo.TbPerson;
import com.xuebusi.service.KafkaService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseBody;

import javax.annotation.Resource;

@Controller
@RequestMapping("/producer")
public class KafkaController {

    private static final Logger logger = LoggerFactory.getLogger(KafkaController.class);

    @Resource
    private KafkaService kafkaService;

    /**
     * Send a message to the "ssmk" topic
     * @param person
     * @return
     */
    @RequestMapping(value = "/person", method = RequestMethod.POST)
    @ResponseBody
    public String createPerson(@RequestBody TbPerson person) {
        if (person == null) {
            return "fail, data can not be null.";
        }
        String json = JSON.toJSONString(person);
        boolean result = kafkaService.sendInfo("ssmk", json);
        logger.info("producer sends message [" + result + "]: " + json);
        return "success";
    }
}
```
The following is the logic of the consumer side:
```java
package com.xuebusi.consumer;

import com.alibaba.fastjson.JSON;
import com.xuebusi.pojo.TbPerson;
import com.xuebusi.service.PersonService;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import java.util.List;
import java.util.Map;

@Service
public class KafkaConsumerService {

    private static final Logger logger = LoggerFactory.getLogger(KafkaConsumerService.class);

    @Autowired
    private PersonService personService;

    public void processMessage(Map<String, Map<Integer, String>> msgs) {
        for (Map.Entry<String, Map<Integer, String>> entry : msgs.entrySet()) {
            String topic = entry.getKey();
            Map<Integer, String> value = entry.getValue();
            for (Map.Entry<Integer, String> entrySet : value.entrySet()) {
                Integer partition = entrySet.getKey();
                String msg = entrySet.getValue();
                logger.info("consumed topic: " + topic + ", partition: " + partition + ", message: " + msg);
                // Note the brackets added around the payload: Spring separates
                // multiple JSON objects with commas, so without the brackets
                // parsing would throw a malformed-JSON exception.
                msg = "[" + msg + "]";
                logger.info("======= parsing JSON into objects =========");
                List<TbPerson> personList = JSON.parseArray(msg, TbPerson.class);
                // TbPerson person = JSON.parseObject(msg, TbPerson.class); // old single-object parsing
                if (personList != null && personList.size() > 0) {
                    logger.info("message contains [" + personList.size() + "] objects");
                    for (TbPerson person : personList) {
                        logger.info("======= object insert starting =========");
                        personService.insert(person);
                        logger.info("======= object insert succeeded =========");
                    }
                }
            }
        }
    }
}
```
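The bracket-wrapping trick above can be illustrated in isolation. The string below (hypothetical values) mimics two JSON objects joined by a comma, as a batched message would arrive; on its own it is not valid JSON, but surrounding it with `[` and `]` yields a JSON array that `JSON.parseArray` can handle:

```java
public class BatchWrapDemo {
    public static void main(String[] args) {
        // Two JSON objects joined by a comma, as delivered in one message:
        String msg = "{\"id\":1,\"name\":\"a\",\"age\":20},{\"id\":2,\"name\":\"b\",\"age\":21}";
        // Wrapping in brackets turns the payload into a valid JSON array.
        msg = "[" + msg + "]";
        System.out.println(msg);
    }
}
```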
Spring integrated Kafka project production and consumption test results record (i)