class ThreadPool
Declaration
class ThreadPool { /* full declaration omitted */ };
Description
A ThreadPool for asynchronous parallel execution on a defined number of threads. The pool keeps a vector of threads alive, waiting on a condition variable for some work to become available. It is possible to reuse one thread pool for different groups of tasks by grouping tasks using ThreadPoolTaskGroup. All tasks are processed using the same queue, but it is possible to wait only for a specific group of tasks to finish. It is also possible for worker threads to submit new tasks and wait for them. Note that this may result in a deadlock in cases such as when a task (directly or indirectly) tries to wait for its own completion, or when all available threads are used up by tasks waiting for a task that has no thread left to run on (this includes waiting on the returned future). It should be generally safe to wait() for a group as long as groups do not form a cycle.
Declared at: llvm/include/llvm/Support/ThreadPool.h:52
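Example
A minimal usage sketch, assuming the default hardware_concurrency() strategy; the lambda work and the use of std::atomic are illustrative, not part of this header.

#include "llvm/Support/ThreadPool.h"
#include <atomic>

void sumExample() {
  llvm::ThreadPool Pool; // default strategy: hardware_concurrency()
  std::atomic<int> Sum{0};
  for (int I = 0; I < 8; ++I)
    Pool.async([&Sum, I] { Sum += I; }); // returned future ignored; non-blocking on destruction
  Pool.wait(); // block until the queue is empty and every task has finished
}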
Member Variables
- private std::vector<llvm::thread> Threads
- Threads in flight
- private llvm::sys::RWMutex ThreadsLock
- Lock protecting access to the Threads vector.
- private std::deque<std::pair<std::function<void()>, ThreadPoolTaskGroup*>> Tasks
- Tasks waiting for execution in the pool.
- private std::mutex QueueLock
- Locking and signaling for accessing the Tasks queue.
- private std::condition_variable QueueCondition
- private std::condition_variable CompletionCondition
- Signaling for job completion (all tasks or all tasks in a group).
- private unsigned int ActiveThreads = 0
- Keeps track of the number of threads that are actually busy.
- private DenseMap<llvm::ThreadPoolTaskGroup*, unsigned int> ActiveGroups
- Number of threads active for tasks in the given group (only groups with a non-zero count are kept in the map).
- private bool EnableFlag = true
- Signal for the destruction of the pool, asking the threads to exit.
- private const llvm::ThreadPoolStrategy Strategy
- private const unsigned int MaxThreadCount
- Maximum number of threads to potentially grow this pool to.
Method Overview
- public ThreadPool(llvm::ThreadPoolStrategy S = hardware_concurrency())
- public template <typename Function, typename... Args> auto async(Function && F, Args &&... ArgList)
- public template <typename Function, typename... Args> auto async(llvm::ThreadPoolTaskGroup & Group, Function && F, Args &&... ArgList)
- public template <typename Func> auto async(Func && F) -> std::shared_future<decltype(F())>
- public template <typename Func> auto async(llvm::ThreadPoolTaskGroup & Group, Func && F) -> std::shared_future<decltype(F())>
- private template <typename ResTy> std::shared_future<ResTy> asyncImpl(std::function<ResTy ()> Task, llvm::ThreadPoolTaskGroup * Group)
- private static std::pair<std::function<void ()>, std::future<void>> createTaskAndFuture(std::function<void ()> Task)
- private template <typename ResTy> static std::pair<std::function<void ()>, std::future<ResTy>> createTaskAndFuture(std::function<ResTy ()> Task)
- public unsigned int getThreadCount() const
- private void grow(int requested)
- public bool isWorkerThread() const
- private void processTasks(llvm::ThreadPoolTaskGroup * WaitingForGroup)
- public void wait()
- public void wait(llvm::ThreadPoolTaskGroup & Group)
- private bool workCompletedUnlocked(llvm::ThreadPoolTaskGroup * Group) const
- public ~ThreadPool()
Methods
¶ThreadPool(llvm::ThreadPoolStrategy S =
hardware_concurrency())
ThreadPool(llvm::ThreadPoolStrategy S =
hardware_concurrency())
Description
Construct a pool using the hardware strategy S for mapping hardware execution resources (threads, cores, CPUs). Defaults to using the maximum execution resources in the system, while accounting for the affinity mask.
Declared at: llvm/include/llvm/Support/ThreadPool.h:58
Parameters
- llvm::ThreadPoolStrategy S = hardware_concurrency()
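Example
A sketch of requesting a bounded number of worker threads via llvm::hardware_concurrency(N) from llvm/Support/Threading.h; the count of 4 is arbitrary.

#include "llvm/Support/ThreadPool.h"
#include "llvm/Support/Threading.h"

void makeBoundedPool() {
  // Strategy requesting at most 4 worker threads.
  llvm::ThreadPool Pool(llvm::hardware_concurrency(4));
  // ... submit tasks ...
}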
¶template <typename Function, typename... Args>
auto async(Function&& F, Args&&... ArgList)
template <typename Function, typename... Args>
auto async(Function&& F, Args&&... ArgList)
Description
Asynchronous submission of a task to the pool. The returned future can be used to wait for the task to finish and is *non-blocking* on destruction.
Declared at: llvm/include/llvm/Support/ThreadPool.h:66
Templates
- Function
- Args
Parameters
- Function&& F
- Args&&... ArgList
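Example
A sketch of submitting a callable with arguments and consuming the returned future; the add function and its arguments are illustrative.

#include "llvm/Support/ThreadPool.h"
#include <future>

static int add(int A, int B) { return A + B; }

void futureExample(llvm::ThreadPool &Pool) {
  auto Future = Pool.async(add, 2, 3); // shared_future holding the task's result
  int Result = Future.get();           // blocks until the task has run; Result == 5
  (void)Result;
}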
¶template <typename Function, typename... Args>
auto async(llvm::ThreadPoolTaskGroup& Group,
Function&& F,
Args&&... ArgList)
template <typename Function, typename... Args>
auto async(llvm::ThreadPoolTaskGroup& Group,
Function&& F,
Args&&... ArgList)
Description
Overload; the task will be submitted as part of the given task group.
Declared at: llvm/include/llvm/Support/ThreadPool.h:74
Templates
- Function
- Args
Parameters
- llvm::ThreadPoolTaskGroup& Group
- Function&& F
- Args&&... ArgList
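Example
A sketch of submitting several tasks into one group and waiting only for that group; constructing the ThreadPoolTaskGroup from the pool follows the LLVM header and is an assumption here, since its constructor is not documented in this section.

#include "llvm/Support/ThreadPool.h"

void groupedWork() {
  llvm::ThreadPool Pool;
  llvm::ThreadPoolTaskGroup Group(Pool); // group bound to this pool (assumed constructor)
  for (int I = 0; I < 4; ++I)
    Pool.async(Group, [](int N) { (void)N; /* per-item work */ }, I);
  Pool.wait(Group); // waits only for the tasks submitted to Group
}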
¶template <typename Func>
auto async(Func&& F)
-> std::shared_future<decltype(F())>
template <typename Func>
auto async(Func&& F)
-> std::shared_future<decltype(F())>
Description
Asynchronous submission of a task to the pool. The returned future can be used to wait for the task to finish and is *non-blocking* on destruction.
Declared at: llvm/include/llvm/Support/ThreadPool.h:83
Templates
- Func
Parameters
- Func&& F
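Example
A sketch showing that the result type of the shared future is deduced from the callable's return type; the std::string payload is illustrative.

#include "llvm/Support/ThreadPool.h"
#include <future>
#include <string>

void deducedResult(llvm::ThreadPool &Pool) {
  std::shared_future<std::string> F =
      Pool.async([] { return std::string("done"); });
  std::string S = F.get(); // blocks until the lambda has run on a worker thread
  (void)S;
}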
¶template <typename Func>
auto async(llvm::ThreadPoolTaskGroup& Group,
Func&& F)
-> std::shared_future<decltype(F())>
template <typename Func>
auto async(llvm::ThreadPoolTaskGroup& Group,
Func&& F)
-> std::shared_future<decltype(F())>
Declared at: llvm/include/llvm/Support/ThreadPool.h:89
Templates
- Func
Parameters
- llvm::ThreadPoolTaskGroup& Group
- Func&& F
¶template <typename ResTy>
std::shared_future<ResTy> asyncImpl(
std::function<ResTy()> Task,
llvm::ThreadPoolTaskGroup* Group)
template <typename ResTy>
std::shared_future<ResTy> asyncImpl(
std::function<ResTy()> Task,
llvm::ThreadPoolTaskGroup* Group)
Description
Asynchronous submission of a task to the pool. The returned future can be used to wait for the task to finish and is *non-blocking* on destruction.
Declared at: llvm/include/llvm/Support/ThreadPool.h:148
Templates
- ResTy
Parameters
- std::function<ResTy()> Task
- llvm::ThreadPoolTaskGroup* Group
¶static std::pair<std::function<void()>,
std::future<void>>
createTaskAndFuture(std::function<void()> Task)
static std::pair<std::function<void()>,
std::future<void>>
createTaskAndFuture(std::function<void()> Task)
Declared at: llvm/include/llvm/Support/ThreadPool.h:130
Parameters
- std::function<void()> Task
¶template <typename ResTy>
static std::pair<std::function<void()>,
std::future<ResTy>>
createTaskAndFuture(std::function<ResTy()> Task)
template <typename ResTy>
static std::pair<std::function<void()>,
std::future<ResTy>>
createTaskAndFuture(std::function<ResTy()> Task)
Description
Helpers to create a promise and a callable wrapper of Task that sets the result of the promise. Returns the callable and a future to access the result.
Declared at: llvm/include/llvm/Support/ThreadPool.h:121
Templates
- ResTy
Parameters
- std::function<ResTy()> Task
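Example
A sketch of the underlying idea, not the actual implementation: wrap the task so its result is published through a std::promise and return the matching std::future (the helper name below is hypothetical).

#include <functional>
#include <future>
#include <memory>
#include <utility>

// Illustrative stand-in for the private helper.
template <typename ResTy>
std::pair<std::function<void()>, std::future<ResTy>>
makeTaskAndFuture(std::function<ResTy()> Task) {
  auto Promise = std::make_shared<std::promise<ResTy>>();
  std::future<ResTy> Future = Promise->get_future();
  // The callable runs the task and publishes its result through the promise.
  auto Wrapper = [Promise, Task] { Promise->set_value(Task()); };
  return {std::move(Wrapper), std::move(Future)};
}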
¶unsigned int getThreadCount() const
unsigned int getThreadCount() const
Declared at: llvm/include/llvm/Support/ThreadPool.h:110
¶void grow(int requested)
void grow(int requested)
Declared at: llvm/include/llvm/Support/ThreadPool.h:184
Parameters
- int requested
¶bool isWorkerThread() const
bool isWorkerThread() const
Description
Returns true if the current thread is a worker thread of this thread pool.
Declared at: llvm/include/llvm/Support/ThreadPool.h:113
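Example
A sketch of using isWorkerThread() to avoid a blocking wait from inside a pool task; the guard logic is illustrative.

#include "llvm/Support/ThreadPool.h"

void waitIfSafe(llvm::ThreadPool &Pool) {
  // Calling wait() from a task would deadlock waiting for itself, so only
  // block when the current thread is not one of the pool's own workers.
  if (!Pool.isWorkerThread())
    Pool.wait();
}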
¶void processTasks(
llvm::ThreadPoolTaskGroup* WaitingForGroup)
void processTasks(
llvm::ThreadPoolTaskGroup* WaitingForGroup)
Declared at: llvm/include/llvm/Support/ThreadPool.h:186
Parameters
- llvm::ThreadPoolTaskGroup* WaitingForGroup
¶void wait()
void wait()
Description
Blocking wait for all the threads to complete and the queue to be empty. It is an error to try to add new tasks while blocking on this call. Calling wait() from a task would deadlock waiting for itself.
Declared at: llvm/include/llvm/Support/ThreadPool.h:98
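Example
A sketch of the typical pattern: submit independent tasks that write distinct slots, then wait() before reading the results; the vector-of-squares work is illustrative.

#include "llvm/Support/ThreadPool.h"
#include <cstddef>
#include <vector>

void squares(llvm::ThreadPool &Pool, std::vector<int> &Out) {
  Out.resize(16);
  for (std::size_t I = 0; I < Out.size(); ++I)
    Pool.async([&Out, I] { Out[I] = static_cast<int>(I * I); }); // distinct slots, no race
  Pool.wait(); // all writes to Out have completed once this returns
}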
¶void wait(llvm::ThreadPoolTaskGroup& Group)
void wait(llvm::ThreadPoolTaskGroup& Group)
Description
Blocking wait for all tasks in the given group to complete; tasks outside the group are not waited for. It is possible to wait even inside a task, but waiting (directly or indirectly) on itself will deadlock. If called from a task running on a worker thread, the call may process pending tasks while waiting, in order not to waste the thread.
Declared at: llvm/include/llvm/Support/ThreadPool.h:105
Parameters
- llvm::ThreadPoolTaskGroup& Group
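Example
A sketch of waiting on a group from inside another task, which the description above allows; constructing the ThreadPoolTaskGroup from the pool follows the LLVM header and is an assumption here.

#include "llvm/Support/ThreadPool.h"

void nestedGroups(llvm::ThreadPool &Pool) {
  Pool.async([&Pool] {
    llvm::ThreadPoolTaskGroup Inner(Pool); // assumed constructor, as above
    for (int I = 0; I < 4; ++I)
      Pool.async(Inner, [I] { (void)I; /* sub-task work */ });
    // Safe as long as groups do not form a wait cycle; the worker thread may
    // process other pending tasks while it waits here.
    Pool.wait(Inner);
  });
  Pool.wait();
}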
¶bool workCompletedUnlocked(
llvm::ThreadPoolTaskGroup* Group) const
bool workCompletedUnlocked(
llvm::ThreadPoolTaskGroup* Group) const
Description
Returns true if all tasks in the given group have finished (nullptr means all tasks regardless of their group). QueueLock must be locked.
Declared at: llvm/include/llvm/Support/ThreadPool.h:143
Parameters
- llvm::ThreadPoolTaskGroup* Group
¶~ThreadPool()
~ThreadPool()
Description
Blocking destructor: the pool will wait for all the threads to complete.
Declared at: llvm/include/llvm/Support/ThreadPool.h:61
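Example
Because the destructor blocks, scoping the pool makes destruction double as a final wait; a sketch with illustrative work.

#include "llvm/Support/ThreadPool.h"

void scopedPool() {
  llvm::ThreadPool Pool;
  Pool.async([] { /* some work */ });
  // No explicit wait() needed: ~ThreadPool() blocks until all queued tasks
  // have completed and the worker threads have exited.
}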