1. Thread pools are useful when ____________
A. When we need to limit the number of threads running in the application at the same time
B. When we need to limit the number of threads running in the application as a whole
C. When we need to arrange the ordering of threads
D. None of the mentioned

2. Instead of starting a new thread for every task to execute concurrently, the task can be passed to a ____________
A. Process
B. Thread pool
C. Thread queue
D. None of the mentioned

3. In a multithreaded server, each connection arriving at the server via the network ____________
A. Is directly put into the blocking queue
B. Is wrapped as a task and passed on to a thread pool
C. Is kept in a normal queue and then sent to the blocking queue from where it is dequeued
D. None of the mentioned
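
The "wrapped as a task and passed on to a thread pool" model can be sketched with Python's standard-library `ThreadPoolExecutor`; `handle_connection` below is a hypothetical handler standing in for real per-connection work:

```python
# Sketch: each incoming connection becomes a task submitted to a pool
# (assumed handler name; connection IDs stand in for real sockets).
from concurrent.futures import ThreadPoolExecutor

def handle_connection(conn_id):
    # Placeholder for real per-connection work.
    return "served %d" % conn_id

with ThreadPoolExecutor(max_workers=4) as pool:
    # Wrap each "connection" as a task and hand it to the pool.
    futures = [pool.submit(handle_connection, i) for i in range(8)]
    results = [f.result() for f in futures]
```

Because `futures` preserves submission order, `results` comes back in the same order the connections arrived, even though up to four handlers run concurrently.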

4. What is a thread pool?
A. A number of threads are created at process startup and placed in a pool where they sit and wait for work
B. When a process begins, a pool of threads is chosen from the many existing and each thread is allotted an equal amount of work
C. All threads in a pool distribute the task equally among themselves
D. None of the mentioned
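
The "threads created at startup that sit in a pool and wait for work" model can be sketched in a few lines of Python; this is a minimal illustration, and the `ThreadPool` class name and its API are assumptions, not a standard-library type:

```python
# Minimal thread-pool sketch: workers start up front and block on a
# shared queue until work arrives (assumed class name and API).
import queue
import threading

class ThreadPool:
    def __init__(self, num_threads):
        self.tasks = queue.Queue()
        # Create all worker threads at startup; they wait for work.
        for _ in range(num_threads):
            threading.Thread(target=self._worker, daemon=True).start()

    def _worker(self):
        while True:
            func, args = self.tasks.get()   # blocks until a task arrives
            func(*args)
            self.tasks.task_done()

    def submit(self, func, *args):
        self.tasks.put((func, args))

    def wait(self):
        self.tasks.join()                   # block until all tasks finish
```

A real implementation would also need shutdown handling and exception safety; the point here is only that the threads exist before any task does.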

5. If the thread pool contains no available thread ____________
A. The server runs a new process
B. The server goes to another thread pool
C. The server demands the creation of a new pool
D. The server waits until one becomes free
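
The "wait until a thread becomes free" behavior can be sketched by bounding a pool with a semaphore, so that `submit` blocks while every worker is busy; the `BoundedPool` class is a hypothetical wrapper, not a standard-library type:

```python
# Sketch: submit() blocks when all workers are busy and resumes once
# one frees up (assumed class name; built on ThreadPoolExecutor).
import threading
from concurrent.futures import ThreadPoolExecutor

class BoundedPool:
    def __init__(self, workers):
        self._pool = ThreadPoolExecutor(max_workers=workers)
        self._slots = threading.Semaphore(workers)

    def submit(self, fn, *args):
        self._slots.acquire()            # wait until a worker is free
        fut = self._pool.submit(fn, *args)
        # Release the slot once the task completes.
        fut.add_done_callback(lambda _: self._slots.release())
        return fut
```

Note that a plain `ThreadPoolExecutor` would instead queue excess tasks without blocking the submitter; the semaphore is what makes the caller wait.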

6. What is one of the advantages of a thread pool?
A. Servicing multiple requests using one thread
B. Servicing a single request using multiple threads from the pool
C. Faster servicing of requests with an existing thread rather than waiting to create a new thread
D. None of the mentioned

7. By limiting the number of threads that can exist at any one time, a thread pool has the benefit of ____________
A. Not letting system resources such as CPU time and memory become exhausted
B. Helping a limited number of processes at a time
C. Not serving all requests and ignoring many
D. None of the mentioned

8. The number of threads in the pool can be set heuristically based on ____________
A. Number of CPUs in the system
B. Amount of physical memory
C. Expected number of concurrent client requests
D. All of the mentioned
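
A sizing heuristic combining all three factors can be sketched as follows; the function name, the "2 threads per core" multiplier, and the memory arithmetic are illustrative assumptions, not a prescribed formula:

```python
# Hypothetical sizing heuristic: cap the pool by CPU count, available
# memory, and the expected number of concurrent client requests.
import os

def pool_size(expected_requests, mem_per_thread_mb, free_mem_mb):
    by_cpu = 2 * (os.cpu_count() or 1)               # assumed 2 threads/core
    by_mem = max(1, free_mem_mb // mem_per_thread_mb)
    return max(1, min(by_cpu, by_mem, expected_requests))
```

For example, with 50 MB free and 10 MB per thread, memory caps the pool at 5 threads regardless of how many requests are expected.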