
What are threads?


A thread is a sequential flow of control through a program. Multi-threaded programming is thus a form of parallel programming in which several threads of control execute concurrently in the program. All threads execute in the same memory space and can therefore work concurrently on shared data. Multi-threaded programming differs from using multiple Unix processes in that all threads share the same memory space (and a few other system resources, such as file descriptors), instead of each running in its own memory space as Unix processes do.

Threads are useful for several reasons. First, they allow a program to exploit multi-processor machines: the threads can run in parallel on several processors, allowing a single program to divide its work between several processors and thus run faster than a single-threaded program, which runs on only one processor at a time. Second, even on uniprocessor machines, threads provide a simple way to overlap I/O and computation.
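To make the shared-memory point concrete, here is a minimal sketch using POSIX threads (compile with cc -pthread); the array size, thread count, and function names are purely illustrative and not from the answer above:

/* Four threads sum disjoint slices of one shared array. The array is
 * visible to every thread because they all share one memory space. */
#include <pthread.h>
#include <stdio.h>

#define N 1000
#define NTHREADS 4

static int data[N];              /* shared by all threads */
static long partial[NTHREADS];   /* one slot per thread */

static void *sum_slice(void *arg) {
    long id = (long)arg;
    long lo = id * (N / NTHREADS), hi = lo + (N / NTHREADS);
    for (long i = lo; i < hi; i++)
        partial[id] += data[i];
    return NULL;
}

int main(void) {
    pthread_t tid[NTHREADS];
    for (int i = 0; i < N; i++) data[i] = 1;

    for (long i = 0; i < NTHREADS; i++)
        pthread_create(&tid[i], NULL, sum_slice, (void *)i);

    long total = 0;
    for (long i = 0; i < NTHREADS; i++) {
        pthread_join(tid[i], NULL);
        total += partial[i];
    }
    printf("total = %ld\n", total);   /* prints 1000 */
    return 0;
}

Each thread writes only its own slot of partial, so no locking is needed here; contrast this with the mutex sketch further down.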


A thread is an encapsulation of the flow of control in a program. Most people are used to writing single-threaded programs – that is, programs that only execute one path through their code “at a time”. Multithreaded programs may have several threads running through different code paths “simultaneously”. Why are some phrases above in quotes? In a typical process in which multiple threads exist, zero or more threads may actually be running at any one time. This depends on the number of CPUs in the computer on which the process is running, and also on how the thread system is implemented. A machine with n CPUs can, intuitively enough, run no more than n threads in parallel, but it may give the appearance of running many more than n “simultaneously” by sharing the CPUs among threads.
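As a rough illustration of running more threads than CPUs, the sketch below queries the number of online processors and deliberately starts four times as many threads. It assumes a POSIX system where sysconf(_SC_NPROCESSORS_ONLN) is available; the oversubscription factor and function names are made up for illustration:

#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

static void *work(void *arg) {
    printf("thread %ld running\n", (long)arg);
    return NULL;
}

int main(void) {
    long ncpu = sysconf(_SC_NPROCESSORS_ONLN);
    if (ncpu < 1) ncpu = 1;
    long nthreads = ncpu * 4;            /* deliberately more threads than CPUs */
    printf("%ld CPUs online, starting %ld threads\n", ncpu, nthreads);

    pthread_t *tid = malloc(nthreads * sizeof *tid);
    for (long i = 0; i < nthreads; i++)
        pthread_create(&tid[i], NULL, work, (void *)i);
    for (long i = 0; i < nthreads; i++)
        pthread_join(tid[i], NULL);       /* the OS time-slices them across the CPUs */
    free(tid);
    return 0;
}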


• (ADB) In short, threads are “lightweight” processes (see the definition of a process in the Linux Kernel book) that can run in parallel. This is quite different from standard UNIX processes, which are scheduled and run in a sequential fashion. More information on threads can be found here or in the excellent Linux Parallel Processing HOWTO by Professor Hank Dietz.
• (REG) When people talk about threads, they usually mean kernel threads, i.e. threads that can be scheduled by the kernel. On SMP hardware, threads allow you to do things truly concurrently (this is particularly useful for large computations). However, even without SMP hardware, using threads can be good. It can allow you to divide your problem into more logical units and have each of those run separately. For example, you could have one thread blocking on a socket read while another reads something from disk; neither operation has to delay the other (see the sketch below). Read “Threads Primer” by Bill Lewis and Daniel J. Berg (Prentice Hall).
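The socket-versus-disk example in the second bullet can be sketched roughly as follows; a pipe stands in for the socket, /etc/hostname is just a placeholder file, and the function names are invented for illustration:

#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

static int pipefd[2];

static void *wait_for_socket(void *arg) {
    (void)arg;
    char buf[64];
    ssize_t n = read(pipefd[0], buf, sizeof buf - 1);   /* blocks until data arrives */
    if (n > 0) { buf[n] = '\0'; printf("pipe delivered: %s\n", buf); }
    return NULL;
}

static void *read_from_disk(void *arg) {
    (void)arg;
    FILE *f = fopen("/etc/hostname", "r");
    char line[128];
    if (f && fgets(line, sizeof line, f))
        printf("disk delivered: %s", line);
    if (f) fclose(f);
    return NULL;
}

int main(void) {
    pthread_t a, b;
    pipe(pipefd);
    pthread_create(&a, NULL, wait_for_socket, NULL);
    pthread_create(&b, NULL, read_from_disk, NULL);

    pthread_join(b, NULL);            /* the disk read finishes on its own */
    write(pipefd[1], "hello", 5);     /* now unblock the waiting thread */
    pthread_join(a, NULL);
    return 0;
}

The thread blocked in read() does not hold up the thread reading from disk, which is the point the bullet is making.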


Message threads are basically “conversations” that take place based on linkages between messages. Once a message is replied to, the original message, the reply, and all subsequent replies become part of the same thread. To view forum messages as threads, select the “Thread” message view by clicking “Threaded View”. (Note that this option is not available if you are already in Threaded View; in that case, “Threaded View” is replaced by “Flat View”.)


Threads are lightweight processes that exist within a larger process. Threads share the same code and data segments, but have their own program counters, machine registers, and stack. Global and static variables are common to all threads, and a mutual exclusivity mechanism may be required to manage access to these variables from multiple threads within an application. Once spawned, threads run asynchronously to one another. They can access common data elements and make OCI calls in any order. Because of this shared access to data elements, a mechanism is required to maintain the integrity of data being accessed by multiple threads. The mechanism to manage data access takes the form of mutexes (mutual exclusivity locks), which ensure that no conflicts arise between multiple threads that are accessing shared resources within an application. In Oracle8i OCI, mutexes are granted on a per-environment-handle basis.
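A generic sketch of the mutex idea described above, using plain pthreads rather than the Oracle OCI API (the counter, loop counts, and names are only illustrative):

/* Two threads increment one shared counter; the mutex keeps the
 * increments from interleaving and corrupting the result. */
#include <pthread.h>
#include <stdio.h>

static long counter = 0;                           /* shared/global data */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *bump(void *arg) {
    (void)arg;
    for (int i = 0; i < 100000; i++) {
        pthread_mutex_lock(&lock);                 /* enter critical section */
        counter++;
        pthread_mutex_unlock(&lock);               /* leave critical section */
    }
    return NULL;
}

int main(void) {
    pthread_t t1, t2;
    pthread_create(&t1, NULL, bump, NULL);
    pthread_create(&t2, NULL, bump, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    printf("counter = %ld\n", counter);            /* 200000 with the lock */
    return 0;
}

Without the lock/unlock pair the two threads would race on counter and the final value would vary from run to run.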
