Here is the consumer side (in Reader.java). It creates the ThreadPoolExecutor, starts a daemon monitor thread, and then drains record batches from the queue, handing each record to the pool:

    // Creating a ThreadPoolExecutor with corePoolSize 50 (min threads),
    // maximumPoolSize 100 (max threads), a queue with maximum size 10,
    // and a custom ThreadFactory.
    // 1) If the number of threads is less than corePoolSize, create a new
    //    thread to run the new task.
    // 2) If the number of threads is equal to (or greater than) corePoolSize,
    //    put the task into the queue.
    // 3) If the queue is full, and the number of threads is less than
    //    maximumPoolSize, create a new thread to run the task.
    // 4) If the queue is full, and the number of threads is greater than or
    //    equal to maximumPoolSize, reject the task.
    private void threadPoolExecutorProcess() {
        ThreadPoolExecutor tpe = new ThreadPoolExecutor(50, 100, 10, TimeUnit.SECONDS,
                new LinkedBlockingQueue<Runnable>(10), new ThreadFact());
        Thread monitor = new Thread(new MonitorThread(tpe));
        monitor.setDaemon(true);
        monitor.start();
        try {
            while (busy) {
                p = q.poll();
                if (p == null) {
                    Thread.yield();
                    continue;
                }
                Iterator ir = p.iterator();
                while (ir.hasNext()) {
                    record = (String) ir.next();
                    ir.remove();
                    if (record == null) {
                        busy = false;
                        break;
                    } else {
                        // taking the data from the queue and giving it to the consumer thread pool
                        tpe.execute(new ConsumerThread("Data" + record));
                        total++;
                    }
                }
                Thread.sleep(2000);
                p = null;
            }
            log("Total records processed: " + total);
            Thread.sleep(3000);
            tpe.shutdown();
        } catch (RejectedExecutionException e) {
            // This exception comes up when the queue limit is reached.
            log("Unable to execute task");
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        while (!tpe.isTerminated()) { }
    }

    // This is the thread factory that is used to create consumer threads.
    class ThreadFact implements ThreadFactory {
        public Thread newThread(Runnable r) {
            return new Thread(r);
        }
    }

    // This is the actual thread that does the execution.
    class ConsumerThread implements Runnable {
        private String data;

        public ConsumerThread(String data) {
            this.data = data;
        }

        public synchronized void run() {
            log("Starting Thread with data: " + data);
            try {
                Thread.sleep(100);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }

    public void log(String str) {
        System.out.println(getDate() + ": " + str);
    }

    public String getDate() {
        return new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").format(new Date());
    }

    // This gives the statistics of the ThreadPoolExecutor so that we can see
    // the creation of new threads and the closing of idle threads.

The producer (Producer.java) reads the input file, batches the records into a list, and puts the list onto the LinkedBlockingQueue:

    class Producer implements Runnable {
        private LinkedBlockingQueue q;
        private String input_f;
        private BufferedReader br;

        public Producer(LinkedBlockingQueue _q, String file_name) throws IOException {
            q = _q;
            input_f = file_name;
            br = new BufferedReader(new FileReader(input_f));
        }

        public int getAnswer() {
            return total;
        }

        public synchronized void run() {
            System.out.println("Executing Producer");
            ArrayList tb = new ArrayList();
            // ... (read loop elided: each line s read from the file is added to tb) ...
            Thread.yield();
            tb.add(s);
            try {
                // q.offer(tb);
                q.put(tb);
            } catch (InterruptedException ex) {
                Logger.getLogger(Producer.class.getName()).log(Level.SEVERE, null, ex);
            }
        }
    }

And how can you tell whether all of the records have been processed or not?
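The four saturation rules described above can be seen in a small standalone sketch (this is not the original program; the pool and queue sizes are deliberately tiny so each rule is visible):

```java
import java.util.concurrent.*;

public class TpeRulesDemo {
    public static void main(String[] args) throws InterruptedException {
        // corePoolSize 2, maximumPoolSize 4, queue capacity 2
        ThreadPoolExecutor tpe = new ThreadPoolExecutor(
                2, 4, 60, TimeUnit.SECONDS,
                new LinkedBlockingQueue<Runnable>(2));
        int rejected = 0;
        for (int i = 0; i < 8; i++) {
            try {
                // Tasks 1-2: new core threads.  Tasks 3-4: queued.
                // Tasks 5-6: queue full, pool below max -> new threads.
                // Tasks 7-8: queue full, pool at max -> rejected (rule 4).
                tpe.execute(() -> {
                    try { Thread.sleep(500); } catch (InterruptedException ignored) {}
                });
            } catch (RejectedExecutionException e) {
                rejected++;
            }
        }
        System.out.println("rejected=" + rejected);
        tpe.shutdown();
        tpe.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```

Running it shows that with 4 threads busy and 2 tasks queued, the 7th and 8th submissions are rejected, which is exactly the RejectedExecutionException the original code catches when the queue limit is reached.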

Here is the high-level overview of the code that I have created. All of these things can be improved, but it's difficult to explain the how's if the end goal and constraints aren't known (for example, why do the lines need to be consumed sequentially?). So, before getting lost in the implementation swamp, can you give an idea of this application's end goal...

• Create an ExecutorService for the Producer to pick up the records using a LinkedBlockingQueue.

...if you were asked by a non-programmer, "what does this application do?" Best Regards, Karthik

Hi Karthik, sorry for so much confusion. The goal: get the list of files in the folder, where each file has 1 million records. Read each record in a particular file and send it to the database. Likewise, process all the records in a file and then move on to the next file. For the above approach I have designed it around the Producer/Consumer concept using the Concurrency API in Java. I'm no code gestapo, and we're all learners here in one area or another.

• Once the records have been taken up by the producer, it forms multiple ArrayLists, each holding 100,000 records. As I posted before, Reader.java acts as the class that reads the files; once all the records have been pushed (100,000 records in each index of the array) to the LinkedBlockingQueue, the ThreadPoolExecutor starts taking up the records and processing them. The issue I am currently facing: I have to wait for the producer to process the full 1 million records before the consumer starts processing. My question arose because in the original code listing, the consumer just prints the record to the console, which can't have been the end goal; hence I asked what the actual end goal is.
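One way to avoid waiting for the whole file before the consumers start is to put each batch on the queue as soon as it fills, rather than building all the lists first. A minimal sketch of that idea (this is an illustration, not the original Reader.java; the class name and batch size are made up):

```java
import java.io.*;
import java.util.*;
import java.util.concurrent.*;

public class BatchingProducer implements Runnable {
    private final BlockingQueue<List<String>> queue;
    private final BufferedReader reader;
    private final int batchSize;

    BatchingProducer(BlockingQueue<List<String>> queue, Reader source, int batchSize) {
        this.queue = queue;
        this.reader = new BufferedReader(source);
        this.batchSize = batchSize;
    }

    @Override
    public void run() {
        List<String> batch = new ArrayList<>(batchSize);
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                batch.add(line);
                if (batch.size() == batchSize) {
                    queue.put(batch);               // blocks when the queue is full (back-pressure)
                    batch = new ArrayList<>(batchSize);
                }
            }
            if (!batch.isEmpty()) queue.put(batch); // flush the final partial batch
        } catch (IOException | InterruptedException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) throws Exception {
        BlockingQueue<List<String>> q = new LinkedBlockingQueue<>(10);
        // 5 records with batch size 2 -> batches of 2, 2, and 1
        new Thread(new BatchingProducer(q, new StringReader("r1\nr2\nr3\nr4\nr5"), 2)).start();
        int total = 0;
        for (int i = 0; i < 3; i++) total += q.take().size();
        System.out.println("records=" + total);
    }
}
```

Because `queue.put()` blocks when the queue is full, the producer cannot race ahead of the consumers, and the consumers can start working as soon as the first batch is ready.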

All I can recommend at this point is that you simplify the program to the point that a second person could actually understand it.

Please let me know in case of any further clarifications. There aren't any comments in that program (except comments about the queuing), so I have no idea what gets run, when, and how.

No exception has been thrown, but in the middle of the processing my JVM is shutting down.

    import java.io.*;
    import java.util.*;
    import java.util.concurrent.*;

    // Monitor thread that prints the ThreadPoolExecutor statistics.
    class MonitorThread implements Runnable {
        private ThreadPoolExecutor tpe;

        public MonitorThread(ThreadPoolExecutor tpe) {
            this.tpe = tpe;
        }

        public void run() {
            while (true) {
                try {
                    System.out.println("Stats----Active Consumer Thread Count: " + tpe.getActiveCount()
                            + " Total Count: " + tpe.getPoolSize()
                            + " Consumer Queue Size: " + tpe.getQueue().size());
                    Thread.sleep(1000);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
        }
    } // end of class Reader

The tail of the Producer's run() method closes the reader:

            br.close();
        } catch (IOException e1) {
            e1.printStackTrace();
        }
    }

You say it "gets shutting down". (If the answer is in that code, I apologize, but there was really too much code for me to read.) I have declared the variable, but that line is not printing at all.

• Using the ThreadPoolExecutor, multiple consumer threads are triggered and they process the ArrayList of records sequentially. It will take a file and put it up in the LinkedBlockingQueue.
- Abruptly, my consumer shuts down while a large number of records are still left.

You've now clarified that the end goal is to transfer records from files to a DB, which helps a lot.

• Once all records have been processed, the loop continues to take up the upcoming records. Once the queue is getting filled, using the ThreadPoolExecutor I will create multiple consumers to take up the records for processing. It would be great if anyone could solve my critical issue.

Need to fix:
- Once the producer starts reading records, I need to start the ThreadPoolExecutor right away and notify the producer as well.
- If possible, we can increase to multiple producers as well.

What I still haven't understood is: why is it necessary to finish reading all 1 million records of a file before submitting them in one go, and only then move on to another file?
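On the "abrupt shutdown" and "how can you tell whether all records have been processed" points: the usual pattern is to call shutdown() (which rejects new tasks but lets already-submitted ones finish) and then block on awaitTermination(), rather than busy-waiting on isTerminated(). A minimal sketch (the task here is a stand-in, not the real consumer):

```java
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

public class ShutdownDemo {
    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        AtomicInteger processed = new AtomicInteger();
        for (int i = 0; i < 100; i++) {
            pool.execute(processed::incrementAndGet);   // stand-in for processing one record
        }
        pool.shutdown();                                // no new tasks; queued tasks still run
        if (!pool.awaitTermination(10, TimeUnit.SECONDS)) {
            pool.shutdownNow();                         // give up after the timeout
        }
        System.out.println("processed=" + processed.get());
    }
}
```

When awaitTermination() returns true, every submitted task has completed, so a counter incremented by the tasks gives an exact "records processed" total. If the pool shuts down while records remain, the likely cause is that shutdown() was called before all of them were submitted.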
Can you tell whether these 2 constraints - 1) read all records first, 2) finish one file before moving to another - are really necessary?

Issues I'm facing:
- If there are 300,000 records, the queue takes 100,000 records in each index (configurable); my consumer code takes only 100,000 records and then suddenly my TPE shuts down.
- Without the Concurrency API, I am able to do the processing of 30k records in 2 minutes 45 seconds; since concurrency is more advanced, I opted to develop this code.

If those are not necessary, then my keep-it-simple code would be (in pseudocode):

    class App {
        void process() {
            int threadPoolSize = 8;
            // Create executor - just 1 executor, not multiple.
            ...

Hi Harikumar, the things that make this hard to follow are:
- the design of this app,
- the way it's creating (multiple) thread pool executors,
- the seemingly unnecessary Thread.yield() calls (premature optimization?)...
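For what it's worth, here is one way the keep-it-simple pseudocode above could be fleshed out, assuming the end goal really is just "stream every line of every file to the DB". The saveToDatabase method is a hypothetical placeholder for the real JDBC insert:

```java
import java.io.*;
import java.nio.file.*;
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicLong;

public class App {
    void process(Path folder) throws IOException, InterruptedException {
        int threadPoolSize = 8;
        // Create executor - just 1 executor, not multiple.
        ExecutorService pool = Executors.newFixedThreadPool(threadPoolSize);
        AtomicLong total = new AtomicLong();
        try (DirectoryStream<Path> files = Files.newDirectoryStream(folder)) {
            for (Path file : files) {                         // one file at a time
                try (BufferedReader br = Files.newBufferedReader(file)) {
                    String record;
                    while ((record = br.readLine()) != null) { // submit as we read - no giant lists
                        String r = record;
                        pool.execute(() -> {
                            saveToDatabase(r);                 // hypothetical DB call
                            total.incrementAndGet();
                        });
                    }
                }
            }
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
        System.out.println("processed=" + total.get());
    }

    void saveToDatabase(String record) { /* JDBC insert would go here */ }

    public static void main(String[] args) throws Exception {
        // Tiny self-test: one temp file with three records.
        Path dir = Files.createTempDirectory("records");
        Files.write(dir.resolve("a.txt"), java.util.Arrays.asList("r1", "r2", "r3"));
        new App().process(dir);
    }
}
```

One caveat: Executors.newFixedThreadPool uses an unbounded queue, so if the producer outruns the database the submitted tasks pile up in memory; a bounded queue with ThreadPoolExecutor.CallerRunsPolicy would give natural back-pressure instead.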