This is down for the moment.  Finally got a multi-core processor, so hopefully in a few weeks I will have this running again.  [Too busy to work on this now]

I found out over my break from this project that you can have thread-local variables which are inherently thread-safe (e.g. the [ThreadStatic] attribute in .NET). Each time you access a variable marked with that attribute, you get a copy specific to the current thread. Hopefully this will give me the performance I need. It's an alternative to designing thread-safe collections for reference counting, which I admit did not work out.
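Since the Java port is listed below as unfinished, here is a rough sketch of the same idea in Java, where the equivalent of a [ThreadStatic] field is `ThreadLocal<T>`. The class and method names are mine, invented for illustration, not part of this project:

```java
public class ThreadLocalCounter {
    // Each thread that reads this variable gets its own private copy, so no
    // locking is needed to update it. The .NET equivalent is a field marked
    // [ThreadStatic], or ThreadLocal(Of T).
    private static final ThreadLocal<Integer> refCount =
            ThreadLocal.withInitial(() -> 0);

    public static int bumpAndGet(int times) {
        for (int i = 0; i < times; i++) {
            refCount.set(refCount.get() + 1);
        }
        return refCount.get();
    }
}
```

Two threads calling bumpAndGet concurrently never see each other's counts, which is why this can replace a locked shared counter for per-thread reference counting.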

New check-in, but it still fails the unit tests.  It locks up sometimes (some operations are not atomic enough), but at least the demo kind of runs.  Still pre-alpha.  May 1st, 2011.

Bringing simplicity to a multi-threaded world

This Reader-Writer Locker is designed so that upgrading from read locks to write locks is intuitive, highly nested locking is encouraged, and deadlocks cannot happen.

Forked from jpmikkers' ReaderWriterLockAlt located at http://readerwriterlockalt.codeplex.com/.  As such, this also supports .NET 2.0+.  It no longer uses its SyncLock/PulseAll functionality, but if you would like to learn how to design your own, or you would like to help with this project, please start there.

[Screenshot: Progress Bar Race demo]

Features
  • No deadlocks when passing "ReadWriteLocker.Behavior.NoDeadLock" to the constructor.
  • No manual upgrade locks. Upgrading a lock is as simple as nesting a write lock inside a read lock.
  • Compatibility with ReaderWriterLock and ReaderWriterLockSlim through similarly named wrapper classes.
  • Designed with .NET 4.0+ in mind.  This is not a project that only targets legacy .NET frameworks.
  • Speed is comparable to ReaderWriterLockSlim.  It is a little slower, but the code using it is many times more manageable.
  • Aborting a thread, or forgetting to dispose the locker in the middle of a lock, shouldn't cause stability issues (work in progress).
  • Portability. The ideas behind this locking mechanism can be applied to many different languages and platforms (Java and C++ not finished).
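To see why "no manual upgrade locks" matters, compare with a stock reader-writer lock. Java's `ReentrantReadWriteLock` (like .NET's ReaderWriterLockSlim without an upgradeable lock) simply cannot upgrade a read lock to a write lock: a blocking attempt deadlocks the thread against itself. This small sketch (names are mine, for illustration) shows the upgrade attempt failing:

```java
import java.util.concurrent.locks.ReentrantReadWriteLock;

public class UpgradeDemo {
    // Try to "upgrade" a held read lock to a write lock on a stock
    // reader-writer lock. The attempt always fails, because the write
    // lock cannot be granted while any reader (even this thread) holds
    // the read lock.
    public static boolean tryUpgrade() {
        ReentrantReadWriteLock rw = new ReentrantReadWriteLock();
        rw.readLock().lock();
        try {
            // A blocking writeLock().lock() here would deadlock this thread
            // against itself; tryLock() shows the acquisition simply fails.
            boolean upgraded = rw.writeLock().tryLock();
            if (upgraded) rw.writeLock().unlock();
            return upgraded;
        } finally {
            rw.readLock().unlock();
        }
    }
}
```

This locker's approach, nesting GetWriteLock() inside GetReadLock(), is meant to make that upgrade work transparently instead.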
Problem Solved

The typical deadlock: one thread holds lock A and waits for lock B, while another thread holds lock B and waits for lock A. I detect when this happens, and let one thread "run its course" to resolve it. Doing so removes the "no preemption" condition, one of the four Coffman conditions that must all hold before a deadlock can form.

A lot of people say it isn't possible to prevent all deadlocks caused by locking alone.  So you have two paths.  Both use preemption (giving one deadlocked thread priority over the others).

1.  Correct course:  After giving one of the deadlocked threads priority, throw an exception within that thread.  If another thread is still deadlocked, throw another exception in it, and so on, until the deadlock no longer exists.
2.  Incorrect course, but potentially useful:  Allow the chosen thread to run.  Only one thread accesses the data at a time (so it doesn't physically get corrupted), but threads could enter in the wrong order if the locking is too finely grained, causing logical errors/exceptions.

#1 is a lot better than simply letting your threads halt on each other: an exception can be detected immediately. And for performance, #1 could be switched off in release mode, meaning zero overhead.
#2 would be useful only at specific locations which you know would not be affected by incorrect locking order.  That way, if a deadlock happens between a preemptable lock location and a non-preemptable lock location, the preemptable location would be given priority, and its threads would be allowed to run their course.
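Path #1 can be approximated in any language that offers timed lock acquisition. Here is a rough Java sketch (class and method names are mine, not this project's API) that forces a circular wait between two threads, then breaks it by treating a lock timeout as a detected deadlock and throwing an exception:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.ReentrantLock;

public class TimedLockDemo {
    public static final ReentrantLock lockA = new ReentrantLock();
    public static final ReentrantLock lockB = new ReentrantLock();

    // Acquire `second` with a timeout while holding `first`. On timeout,
    // assume a deadlock and surface it as an exception (path #1 above)
    // instead of letting the thread block forever.
    public static void withBoth(ReentrantLock first, ReentrantLock second,
                                CountDownLatch bothHeld) throws InterruptedException {
        first.lock();
        try {
            bothHeld.countDown();
            bothHeld.await(); // both threads now hold their first lock: circular wait
            if (!second.tryLock(200, TimeUnit.MILLISECONDS)) {
                throw new IllegalStateException("deadlock broken by timeout");
            }
            second.unlock();
        } finally {
            first.unlock();
        }
    }
}
```

A real implementation (like this locker's) detects the cycle directly rather than guessing from a timeout, but the outcome is the same: at least one contender gets an exception, releases what it holds, and the remaining threads can proceed.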

Issues Remaining

As the number of locks increases, so does the complexity of preventing deadlocks between a subset of locks.

This locker does not implement distributed deadlock prevention for subsets of locks.  That doesn't mean permanent deadlocks are possible.  But it does mean that temporary deadlocks can occur for short periods while other threads are locking, degrading parallelism.

In the future, I will need to design a heuristic algorithm to detect and resolve some of these non-permanent deadlocks so more parallelism can occur.  This heuristic will probably be based on the number of processors a machine has.

Also, the data-consistency contract may be violated when deadlocks are resolved, because resolving a deadlock can change the order in which threads run across a piece of data.  This is the same issue as preemption between processes/programs accessing the same resource.  But with Policy Based Locking (work in progress) and by checking IsSuperThreadRunning(), you can determine where this preemption occurs and write code so the preemption will not cause any issues.

Policy Based Locking will either deadlock or throw an exception when a deadlock would otherwise occur, depending on the policy you choose for a specific lock.  IsSuperThreadRunning lets you detect that preemption may be occurring and run alternative code that won't have issues with threads switching at that location, as the example below shows.

Example of solving this with IsSuperThreadRunning:

void Foo1()
{
    using (lock1.GetWriteLock())
    {
        using (lock1.GetReadLock())
        {
            if (!lock1.IsSuperThreadRunning)
            {
                // Perform code here.
            }
            else
            {
                // Throw an exception here, or do something else.
            }
        }
    }
}

void Foo2()
{
    using (lock2.GetWriteLock())
    {
        using (lock2.GetReadLock())
        {
            if (!lock2.IsSuperThreadRunning)
            {
                // Perform code here.
            }
            else
            {
                // Throw an exception here, or do something else.
            }
        }
    }
}
I hope that being able to prevent deadlocks, and to detect them immediately when they happen, makes code many times easier to debug than using ReaderWriterLockSlim and guessing when to break into the application to see whether it is deadlocked.

ReadWriteLocker Lock = new ReadWriteLocker(true);

using (IDisposable ReadLock = Lock.GetReadLock())
{
    // Perform any reading you might need, like iterating through a List.

    using (IDisposable WriteLock = Lock.GetWriteLock())
    {
        // Perform any writing you may need, like adding items to a List.
    }
}

Imports System.Threading
Imports NoDeadLockReadWriteLocker
Imports ThreadSafeClasses

Public Module MainProgram

    ' List which has thread safety turned on, initialized from an array of integers.
    Dim List As New ThreadSafeList(Of Integer)(True, New Integer() {0, 5, 0, 4, 0, 3, 0, 2, 0, 1})
    Dim IsBusy As Integer = 0

    Public Sub Main()
        InitializeThreads()
    End Sub

    Public Sub InitializeThreads()
        For Index As Integer = 0 To 10
            Interlocked.Increment(IsBusy)
            ThreadPool.QueueUserWorkItem(AddressOf FunctionToRunInMultipleThreads, Index)
        Next

        Dim VolatileIsBusy As Integer

        While True
            ' Same as using the volatile keyword in C#, or Thread.VolatileRead(IsBusy)
            ' without the function call.
            VolatileIsBusy = IsBusy
            Thread.MemoryBarrier()

            If VolatileIsBusy <= 0 Then
                Exit While
            End If

            Thread.Sleep(500)
        End While
    End Sub

    Public Sub FunctionToRunInMultipleThreads(threadContext As Object)
        Dim threadIndex As Integer = CInt(threadContext)
        Console.WriteLine("thread {0} started...", threadIndex)

        ' Thread-safety does not need to be turned on just to modify while enumerating.
        For Each Item As Integer In List
            ' This normally results in an enumeration error!
            List.Add(Item)

            Console.WriteLine("Value {0} found...note that items added in this For Each won't appear yet.", Item)
        Next

        For Each Item As Integer In List
            Console.WriteLine("Value {0} found...note that items added in the previous For Each appear!", Item)
        Next

        ' Due to a pitfall in caching modifications, it is advised to never manually loop through items.
        ' Please use For Each loops when enumerating through items. Also make sure enumerators get disposed!

        ' This is not thread-safe! It needs a read lock wrapped around it.
        While List.Count > 17
            ' This can throw an error if List.Count is grabbed, then an item is removed
            ' by another thread before RemoveAt is called. That position would be gone.
            List.RemoveAt(List.Count - 1)
        End While

        ' This is thread-safe. Notice the read lock wrapped around it.
        '
        ' But this will loop forever, because removing an item will be cached
        ' and the count will never decrease.
        Using ReadLock As IDisposable = List.GetReadLock()
            While List.Count > 14
                List.RemoveAt(List.Count - 1)
            End While
        End Using

        ' This is correctly thread-safe. It, however, isn't very intuitive as to why.
        While List.Count > 10
            Using ReadLock As IDisposable = List.GetReadLock()
                List.RemoveAt(List.Count - 1)
            End Using
        End While

        Interlocked.Decrement(IsBusy)
    End Sub

End Module
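The ThreadSafeList behavior in the example above (items added inside a For Each don't appear until the next pass, and no enumeration error is thrown) is snapshot iteration. Java's standard library has the same semantics in `CopyOnWriteArrayList`, so the following sketch (class and method names are mine) shows the equivalent behavior there:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

public class SnapshotIteration {
    // Append a copy of every element while iterating. With an ordinary
    // ArrayList this would throw ConcurrentModificationException; with a
    // snapshot-iterating list it works, and the newly added items do not
    // appear in the current pass -- just like the ThreadSafeList above.
    public static List<Integer> doubleUp(List<Integer> seed) {
        CopyOnWriteArrayList<Integer> list = new CopyOnWriteArrayList<>(seed);
        for (int item : list) {
            list.add(item);
        }
        return new ArrayList<>(list);
    }
}
```

Starting from [1, 2, 3], the loop sees only the original three items, and the final list is [1, 2, 3, 1, 2, 3].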

 
License

This project uses a new tri-license: your choice of MIT, L-GPL (v2 or v3), or MS-PL.  Or you can keep the tri-license and let downstream users make their own choice.  Under the tri-license, MIT applies to the code and L-GPL v2 to patents (so the algorithms remain open source).  Only when MS-PL is explicitly chosen does it apply (dropping MIT and L-GPL).

By setting ReadWriteLocker.UseThisThreadAsPriorityThread = True in a specific thread, you force the ReadWriteLocker to choose that thread first when preventing deadlocks (preemption).  If, for instance, you do this for the UI thread, you get an always-responsive UI, but at the expense of other threads possibly becoming temporarily deadlocked, with their performance drastically reduced when deadlocks are found.

If you would like regular locking (i.e. SyncLock/lock) that cannot deadlock, simply use GetSyncLock() or GetLock(), according to your programming language/preference.


Last edited Nov 9, 2011 at 4:45 PM by TamusJRoyce, version 115