Very bad. Think again. You cannot guarantee that something always finishes in 60 seconds, or in any other hard-coded amount of time. And if you allow a generous amount of time, the waiting thread will probably wait far too long. This is absurd. Why use threads which never actually work in parallel, and which, when they do, lead to trouble? Just to be able to say "I develop multithreaded code"? :-)
So, the first thing you should think about is why you are using those threads at all. Even if they do work in parallel, what's the benefit of it?
There is another problem with your list. While you are populating the list, some other application can change the content of the disk. And I don't know what to advise until you tell us your ultimate goals. This is just something to think about.
Now, the problem of two threads working on one list is pretty trivial; it is solved by thread-synchronization primitives, such as lock. You can populate a collection with one thread and read the elements as they appear with another thread, without using any CPU time while a thread is waiting for data. This is the classic pattern called producer-consumer (http://en.wikipedia.org/wiki/Producer-consumer). But the adequate data structure here is a queue, not a list. And you don't even have to implement it yourself: since .NET v.4.0, you can use the synchronized collection System.Collections.Concurrent.BlockingCollection<>: http://msdn.microsoft.com/en-us/library/dd267312%28v=vs.110%29.aspx.
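To make the idea concrete, here is a minimal sketch of the producer-consumer pattern using BlockingCollection<T>; the item type, the capacity, and the loop counts are illustrative only:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class ProducerConsumerDemo
{
    static void Main()
    {
        // Bounded collection: Add blocks when it is full, Take blocks when it is empty.
        var queue = new BlockingCollection<string>(boundedCapacity: 100);

        // Producer thread: in your case this could be the thread enumerating the disk.
        var producer = Task.Run(() =>
        {
            for (int i = 0; i < 5; i++)
                queue.Add("item " + i);
            queue.CompleteAdding(); // tells the consumer no more items will arrive
        });

        // Consumer: blocks on each item without spinning or burning CPU;
        // the loop ends automatically after CompleteAdding is called.
        foreach (string item in queue.GetConsumingEnumerable())
            Console.WriteLine(item);

        producer.Wait();
    }
}
```

Note that no explicit lock appears here; BlockingCollection<T> does the synchronization and the blocking wait for you.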
You can understand how such things work from my article, complete with full source code and usage examples explained in detail: "Simple Blocking Queue for Thread Communication and Inter-thread Invocation".
Besides, you can use my BlockingQueue class if you are using a .NET version earlier than 4.0.
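For the curious, the core idea behind such a blocking queue can be sketched with Monitor.Wait and Monitor.Pulse under a single lock. This is an illustrative sketch only, not the actual class from the article:

```csharp
using System.Collections.Generic;
using System.Threading;

// Hypothetical minimal blocking queue: consumers sleep (no CPU use)
// until a producer enqueues an item and pulses the monitor.
public class SimpleBlockingQueue<T>
{
    readonly Queue<T> queue = new Queue<T>();
    readonly object sync = new object();

    public void Enqueue(T item)
    {
        lock (sync)
        {
            queue.Enqueue(item);
            Monitor.Pulse(sync); // wake one thread waiting in Dequeue
        }
    }

    public T Dequeue()
    {
        lock (sync)
        {
            // The while loop guards against spurious wake-ups:
            // re-check the condition after every Wait.
            while (queue.Count == 0)
                Monitor.Wait(sync); // releases the lock and sleeps
            return queue.Dequeue();
        }
    }
}
```

The essential point is that Monitor.Wait releases the lock while sleeping, so the producer can get in to enqueue an item, and the consumer thread consumes zero CPU until data actually appears.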
—SA