C# Memory Leaks

C# Memory Mishaps: Forgotten Objects and Resource Hogs

In C#, your program might mistakenly hoard memory by creating objects it doesn’t clean up later. This gradually increases the application’s memory usage, potentially leading to sluggish performance or crashes if the system runs out of memory.
Memory leaks can be tricky to find and fix because they often happen subtly.
Being able to identify, resolve, and prevent memory leaks is a valuable skill. There are well-established practices for detecting leaks in your application, pinpointing the culprit, and applying a solution.
With a garbage collector (GC) in play, the term “memory leak” might seem odd. How can leaks occur when the GC is there to free up unused memory?
There are two main culprits. The first involves objects that are still referenced even though they’re no longer needed. Since they’re referenced, the GC won’t remove them, leaving them to occupy memory indefinitely. This can happen, for instance, when you subscribe to an event but forget to unsubscribe.
The second culprit is when you allocate memory that isn’t managed by the GC (unmanaged memory) and neglect to release it. This is easier to do than you might think. Many .NET classes allocate unmanaged memory under the hood, including anything related to threading, graphics, the file system, or network calls. These classes typically provide a Dispose method to free that memory. You can also allocate unmanaged memory directly, for example through the Marshal class or P/Invoke calls.

Here’s a simple example to illustrate a memory leak:

public class MyClass
{
    public void WriteToFile(string fileName, string content)
    {
        FileStream fs = new FileStream(fileName, FileMode.OpenOrCreate); // Open the file
        StreamWriter sw = new StreamWriter(fs); // Write content
        sw.WriteLine(content);

        // **Leak! fs and sw are not disposed of**
    }
}

In this example, the WriteToFile method opens a FileStream (fs) and a StreamWriter (sw) to write to a file, but it never disposes of them. The unmanaged resources behind these objects, the file handle and the writer’s buffers, stay held after the method returns until their finalizers eventually run, so calling the method repeatedly lets undisposed resources pile up, a leak in practice.

To fix the leak, we need to release the unmanaged resources by disposing of them properly:

public class MyClass
{
    public void WriteToFile(string fileName, string content)
    {
        using (FileStream fs = new FileStream(fileName, FileMode.OpenOrCreate)) // Use using statement
        {
            using (StreamWriter sw = new StreamWriter(fs)) // Use using statement
            {
                sw.WriteLine(content);
            }
        } // fs and sw are disposed of automatically here
    }
}

The using statement ensures that fs and sw are disposed of (their Dispose() methods are called) when the code block within the using exits, even if an exception occurs. This guarantees proper resource management and prevents memory leaks.

Detecting Memory Leaks Is Important!

Memory leaks can cripple your application! Let’s explore a handy technique to identify them. Have you ever dismissed the “Diagnostic Tools” window that pops up when you start debugging in Visual Studio? Well, it’s time to give it a second look!
This window offers a valuable service: pinpointing memory leaks and garbage collector strain (GC Pressure). Accessing it is simple: just navigate to Debug > Windows > Show Diagnostic Tools.
Once open, watch the memory graph while you debug. Yellow markers indicate garbage collections, i.e. the GC actively working to free up memory. A steadily rising memory usage, however, signals a potential memory leak.
Understanding GC Pressure: This occurs when you create and discard objects so rapidly that the garbage collector struggles to keep pace.
While this method doesn’t pinpoint specific leaks, it effectively highlights a potential memory leak issue – a crucial first step. For more granular leak detection, Visual Studio Enterprise offers a built-in memory profiler within the Diagnostic Tools window.
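To make GC Pressure concrete, a loop like the following (a contrived sketch, where ProcessChunk is a hypothetical stand-in for real work) churns through allocations fast enough to keep the collector busy without leaking anything:

for (int i = 0; i < 1_000_000; i++)
{
    byte[] buffer = new byte[64 * 1024]; // a fresh allocation on every iteration...
    ProcessChunk(buffer);                // ...used once (hypothetical method)...
}                                        // ...and discarded, leaving the GC to clean up constantly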

Task Manager, Process Explorer or PerfMon – Also Help With Detecting Memory Leaks

Another simple way to identify potential memory leaks is by using Task Manager or Process Explorer (a tool from SysInternals). These applications show how much memory your program is using. If that number keeps climbing, it might be a leak.

While a little trickier, Performance Monitor (PerfMon) offers a helpful graph of memory usage over time. It’s important to remember that this method isn’t foolproof. Sometimes memory rises simply because the garbage collector hasn’t cleaned things up yet. There’s also the complexity of shared versus private memory, which can make you miss a real leak or blame the wrong process. And you might confuse a memory leak with GC Pressure, which occurs when you create and destroy objects so rapidly that the garbage collector struggles to keep pace, even though there’s no actual leak.

Despite the limitations, we included this technique because it’s easy to use and might be the only tool readily available. It can also serve as a general indicator of something amiss, especially if the memory usage keeps rising over a very extended period.

Using a Memory Profiler to Detect Leaks

Just like a chef relies on a sharp knife, memory profilers are essential tools for battling memory leaks. While there might be simpler or cheaper alternatives (profilers can be costly), mastering at least one is crucial for effectively diagnosing and eliminating memory leaks.
Popular .NET profilers include dotMemory, SciTech Memory Profiler, and ANTS Memory Profiler. If you have Visual Studio Enterprise, there’s also a built-in “free” option.

All profilers share a similar approach. You can either connect to a running program or analyze a memory dump file. The profiler then captures a “snapshot” of your process’s memory heap at that specific moment. This snapshot allows for in-depth analysis using various features.
You can view details like the number of instances for each object type, their memory usage, and the chain of references leading back to a “GC Root.”

A GC Root is an object that the garbage collector can’t remove. Consequently, anything linked to a GC Root is also immune to deletion. Examples of GC Roots include static objects, local variables, and currently active threads.
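For example, a static field is a typical GC Root; anything reachable from it can never be collected (a hypothetical illustration):

public static class ViewModelRegistry
{
    // Static field = GC Root: every view model added here stays reachable,
    // so the GC can never collect it until it is explicitly removed.
    public static readonly List<object> OpenViewModels = new List<object>();
}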
The most efficient and informative profiling technique involves comparing two snapshots taken under specific conditions. The first snapshot is captured before a suspected memory leak-causing operation, and the second one is taken after. Here’s an example workflow:

  1. Begin with your application in an idle state, like the main menu.
  2. Use your memory profiler to capture a snapshot by attaching to the process or saving a dump.
  3. Execute the operation suspected of causing the leak. Once finished, return to the idle state.
  4. Capture a second snapshot using the profiler.
  5. Compare these snapshots within your profiler.
  6. Focus on the “New-Created-Instances” list, as they might be potential leaks. Analyze the “path to GC Root” to understand why these objects haven’t been released.

Identifying Memory Leaks With Object IDs

Do you suspect a particular class might be leaking memory? In other words, you think instances of this class stay referenced after a script runs, preventing garbage collection. Here’s how to verify if the garbage collector is doing its job:

  1. Set a Breakpoint: Place a breakpoint where your class instance is created.
  2. Inspect the Variable: Pause execution at the breakpoint, then hover over the variable to bring up the debugger tooltip. Right-click and choose “Make Object ID” (or similar functionality depending on your debugger).
  3. Verify Object ID: To confirm successful creation of the Object ID, you can type $1 (or the assigned name) in the immediate window of your debugger.
  4. Run Leak-Causing Script: Complete the script execution that you believe might be causing the memory leak, potentially leaving the instance referenced.
  5. Force Garbage Collection: Simulate a memory cleanup by invoking the following lines (these may vary slightly depending on your environment):

GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();

  6. Check for Collected Object: In the debugger’s immediate window, type $1 (or the assigned name) again. If the result is null, the garbage collector successfully collected your object, indicating no memory leak. If it returns a value, you’ve likely found a memory leak.

Use the Dispose Template to Prevent Unmanaged Memory Leaks

In the world of .NET, your applications often interact with resources that aren’t directly managed by the .NET system itself. These are called unmanaged resources. The .NET platform itself actually uses quite a bit of unmanaged code under the hood to make things run smoothly and efficiently. This unmanaged code might be used for things like threading, graphics, or even accessing parts of the Windows operating system.
When you’re working with .NET code that relies on unmanaged resources, you’ll often see a class that implements a special interface called IDisposable. This is because these resources need to be properly cleaned up when you’re done with them, and the Dispose method is where that happens. The key for you as a developer is to remember to call this Dispose method whenever you’re finished using the resource. An easy way to handle this is by using the using statement in your code.

public void Foo()
{
    using (var stream = new FileStream(@"C:\..\KoderShop.txt", FileMode.OpenOrCreate))
    {
        // do stuff

    } // stream.Dispose() will be called even if an exception occurs
}

The using statement acts like a behind-the-scenes helper, transforming your code into a try…finally block. Inside this block, the Dispose method gets called when the finally part executes.
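Conceptually, the compiler rewrites the method above into something like this (a simplified sketch; the real expansion also casts to IDisposable and guards against null):

public void Foo()
{
    var stream = new FileStream(@"C:\..\KoderShop.txt", FileMode.OpenOrCreate);
    try
    {
        // do stuff
    }
    finally
    {
        stream.Dispose(); // runs even if an exception was thrown inside the try block
    }
}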

Even without explicitly calling Dispose, those resources will eventually be released, because .NET classes follow the Dispose pattern: if Dispose hasn’t been called by the time the object becomes unreachable, the garbage collector invokes its finalizer, which releases the resource. The catch is that this only works if the object actually becomes unreachable; if something still references it (a leak), the finalizer never runs and the resource is never freed.

When you’re directly managing unmanaged resources (resources not handled by the garbage collector), using the Dispose pattern becomes essential.

Here’s an example:

public class DataHolder : IDisposable
{
    private IntPtr _dataPtr;
    public const int DATA_CHUNK_SIZE = 1048576; // 1 MB in bytes
    private bool _isReleased = false;

    public DataHolder()
    {
        _dataPtr = Marshal.AllocHGlobal(DATA_CHUNK_SIZE);
    }

    protected virtual void ReleaseResources(bool disposing)
    {
        if (_isReleased)
            return;

        if (disposing)
        {
            // Free any other managed objects here.
        }

        // Free any unmanaged objects here.
        Marshal.FreeHGlobal(_dataPtr);
        _isReleased = true;
    }

    public void Dispose()
    {
        ReleaseResources(true);
        GC.SuppressFinalize(this);
    }

    ~DataHolder()
    {
        ReleaseResources(false);
    }
}

This pattern lets you explicitly release resources when you’re done with them. It also provides a safety net: if you forget to call Dispose(), the garbage collector will still release the resources eventually through the finalizer (~DataHolder in the example above).
GC.SuppressFinalize(this) is crucial because it prevents the Finalizer from being called if the object has already been properly disposed of. Objects with Finalizers are handled differently by the garbage collector and are more costly to clean up. These objects are added to a special queue, allowing them to survive for a bit longer than usual during garbage collection. This can lead to additional complexities in your code.

Monitoring Your Application’s Memory Footprint

There are situations where tracking your application’s memory usage might be beneficial. Perhaps you suspect a memory leak on your production server. Maybe you want to trigger an action when memory consumption hits a specific threshold. Or, you might simply prioritize keeping an eye on memory usage as a good practice.
Fortunately, the application itself provides valuable insights. Retrieving the current memory usage is a straightforward process:

Process currentProcess = Process.GetCurrentProcess();
var bytesInUse = currentProcess.PrivateMemorySize64;
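
PrivateMemorySize64 reports the process’s private bytes, which include both managed and unmanaged memory. If you just want a rough figure for the managed heap, GC.GetTotalMemory is enough:

// Approximate number of bytes currently allocated on the managed heap.
// Passing true would force a full collection before measuring.
long managedBytes = GC.GetTotalMemory(false);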

For more detailed metrics, you can use the PerformanceCounter class, which reads the same counters PerfMon displays:

PerformanceCounter privateBytesCounter = new PerformanceCounter("Process", "Private Bytes", Process.GetCurrentProcess().ProcessName);
PerformanceCounter gen0CollectionsCounter = new PerformanceCounter(".NET CLR Memory", "# Gen 0 Collections", Process.GetCurrentProcess().ProcessName);
PerformanceCounter gen1CollectionsCounter = new PerformanceCounter(".NET CLR Memory", "# Gen 1 Collections", Process.GetCurrentProcess().ProcessName);
PerformanceCounter gen2CollectionsCounter = new PerformanceCounter(".NET CLR Memory", "# Gen 2 Collections", Process.GetCurrentProcess().ProcessName);
PerformanceCounter gen0HeapSizeCounter = new PerformanceCounter(".NET CLR Memory", "Gen 0 heap size", Process.GetCurrentProcess().ProcessName);

// ...

Debug.WriteLine("Private bytes = " + privateBytesCounter.NextValue());
Debug.WriteLine("# Gen 0 Collections = " + gen0CollectionsCounter.NextValue());
Debug.WriteLine("# Gen 1 Collections = " + gen1CollectionsCounter.NextValue());
Debug.WriteLine("# Gen 2 Collections = " + gen2CollectionsCounter.NextValue());
Debug.WriteLine("Gen 0 heap size = " + gen0HeapSizeCounter.NextValue());

While performance monitor counters provide valuable insights, they only scratch the surface.
For a deeper dive, consider CLR MD (Microsoft.Diagnostics.Runtime). It grants access to the inner workings of the heap, allowing you to extract a wealth of information. Imagine examining all the types currently loaded in memory, along with how many instances exist and how they’re being held in memory. With CLR MD, you can essentially build your own custom memory profiler.
For a practical example of CLR MD’s capabilities, explore Dudi Keleti’s DumpMiner tool.
This data can be saved to a file, but for better analysis, consider integrating it with a telemetry tool like Application Insights.
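
As a rough illustration of what CLR MD lets you do (a sketch assuming the ClrMD 2.x package, Microsoft.Diagnostics.Runtime; check its docs for the exact API on your version), you can attach to a process and group the heap’s objects by type:

using System;
using System.Linq;
using Microsoft.Diagnostics.Runtime;

public static class HeapStats
{
    public static void PrintTopTypes(int pid)
    {
        // Attach without suspending the target process
        using DataTarget dataTarget = DataTarget.AttachToProcess(pid, false);
        ClrRuntime runtime = dataTarget.ClrVersions[0].CreateRuntime();

        // Group every object on the managed heap by its type name
        var topTypes = runtime.Heap.EnumerateObjects()
            .Where(obj => obj.Type != null)
            .GroupBy(obj => obj.Type.Name)
            .Select(g => new { Type = g.Key, Count = g.Count(), Bytes = g.Sum(o => (long)o.Size) })
            .OrderByDescending(t => t.Bytes)
            .Take(20);

        foreach (var t in topTypes)
            Console.WriteLine($"{t.Type}: {t.Count} instances, {t.Bytes} bytes");
    }
}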

Uncovering Memory Issues: A Simple Approach

Catching memory leaks before they cause problems is crucial, and the good news is, it’s achievable! This template provides a handy starting point…

[Test]
public void MemoryLeakTest()
{
    var leakyObject = new MyLeakyClass();               // the object under suspicion (placeholder type)
    var weakReference = new WeakReference(leakyObject); // track it without keeping it alive

    // Run an operation with leakyObject, then drop our own reference
    leakyObject.DoSomething();
    leakyObject = null;

    GC.Collect();
    GC.WaitForPendingFinalizers();
    GC.Collect();

    // If anything still references the object, IsAlive stays true and the test fails
    Assert.IsFalse(weakReference.IsAlive);
}

For more in-depth testing, memory profilers such as SciTech’s .NET Memory Profiler and dotMemory provide a test API:

MemAssertion.NoInstances(typeof(MyLeakyClass));
MemAssertion.NoNewInstances(typeof(MyLeakyClass), lastSnapshot);
MemAssertion.MaxNewInstances(typeof(Bitmap), 10);

Steer Clear Of These Memory Leak Culprits

While we’ve covered detection methods, here are some coding practices to avoid altogether. Memory leaks aren’t inevitable, but certain patterns increase the risk. Be extra cautious with these and use the detection methods mentioned earlier to be proactive.

Common Memory Leak Traps:

  • .NET Events: Subscribing to events can lead to memory leaks if not handled carefully.

public class MyClass
{
    private MyOtherClass _otherClass;

    public MyClass()
    {
        _otherClass = new MyOtherClass(); // Create an instance of the other class
        _otherClass.MyEvent += OnEvent;   // Subscribe to the event of the other class
    }

    private void OnEvent(object sender, EventArgs e)
    {
        // Perform some action on the event
    }

    // Relying on the finalizer to unsubscribe is not safe: as long as a live publisher's
    // event still references this instance, the instance stays reachable and the
    // finalizer never runs.
    ~MyClass()
    {
        _otherClass.MyEvent -= OnEvent;
    }
}

In this example, MyClass subscribes to MyEvent on MyOtherClass. The subscription hands the publisher (MyOtherClass) a delegate that references the MyClass instance. If the publisher outlives the point at which the subscriber is no longer needed, for example because it is shared or long-lived, that delegate keeps the MyClass instance reachable and the garbage collector can never reclaim it. The reliable fix is to unsubscribe explicitly when the subscriber is done, rather than hoping the finalizer will do it.
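A minimal sketch of that fix, reusing the same hypothetical MyOtherClass and unsubscribing from a Dispose method:

public class MyClass : IDisposable
{
    private readonly MyOtherClass _otherClass;

    public MyClass()
    {
        _otherClass = new MyOtherClass();
        _otherClass.MyEvent += OnEvent;
    }

    private void OnEvent(object sender, EventArgs e)
    {
        // Perform some action on the event
    }

    public void Dispose()
    {
        // Removing the handler deletes the delegate that kept this instance reachable
        _otherClass.MyEvent -= OnEvent;
    }
}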

  • Static Variables, Collections, and Events: Treat static elements with suspicion, especially static events. Since the garbage collector (GC) considers them “roots,” they’re never collected.

public static class MyStaticClass
{
    public static event EventHandler MyStaticEvent;

    public static void TriggerEvent()
    {
        MyStaticEvent?.Invoke(null, EventArgs.Empty); // Raise the static event
    }
}

public class MyClass
{
    public MyClass()
    {
        MyStaticClass.MyStaticEvent += OnStaticEvent; // Subscribe to the static event
    }

    private void OnStaticEvent(object sender, EventArgs e)
    {
        // Perform some action on the static event
    }
}

Here, MyStaticEvent is a static event on MyStaticClass. Because it is static, its invocation list is a GC root and is never collected. Every MyClass instance that subscribes and never unsubscribes stays referenced by that invocation list, so it remains in memory for the lifetime of the application even when it is no longer in use. Unsubscribing (MyStaticClass.MyStaticEvent -= OnStaticEvent) when the instance is done with the event breaks the chain.
  • Caching: Caching mechanisms can be double-edged swords. While they improve performance, excessive caching can overflow memory and cause “OutOfMemory” exceptions. Consider strategies like deleting old items or setting cache limits.

public class MyCache
{
    private static Dictionary<string, object> _cache = new Dictionary<string, object>();

    public static object GetFromCache(string key)
    {
        if (_cache.ContainsKey(key))
        {
            return _cache[key];
        }
        return null;
    }

    public static void AddToCache(string key, object value)
    {
        _cache[key] = value; // overwrite existing entries instead of throwing on duplicate keys
    }
}

This code implements a simple in-memory cache using a static dictionary. If entries are never removed from the cache, it can grow indefinitely and lead to memory exhaustion. Consider implementing strategies like:

  • Least Recently Used (LRU): Evict the least recently used entries when the cache reaches a size limit.
  • Time-To-Live (TTL): Set an expiration time for each cache entry. Entries are automatically removed after the TTL expires.
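
For instance, Microsoft.Extensions.Caching.Memory combines both ideas, a size limit plus per-entry expiration. Here is a rough sketch of the cache above rewritten with it (the SizeLimit and expiration values are arbitrary):

using System;
using Microsoft.Extensions.Caching.Memory;

public class MyCache
{
    // Bounded cache: entries expire and the total size is capped (values are arbitrary)
    private static readonly MemoryCache _cache = new MemoryCache(new MemoryCacheOptions
    {
        SizeLimit = 1000 // new entries are rejected once the combined entry "Size" reaches this
    });

    public static void AddToCache(string key, object value)
    {
        _cache.Set(key, value, new MemoryCacheEntryOptions()
            .SetSize(1)                                      // each entry counts as 1 toward SizeLimit
            .SetSlidingExpiration(TimeSpan.FromMinutes(5))); // TTL: evicted 5 minutes after last access
    }

    public static object GetFromCache(string key)
    {
        return _cache.TryGetValue(key, out object value) ? value : null;
    }
}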
  • WPF Bindings: Be mindful of WPF bindings. Ideally, bind to a “DependencyObject” or something that implements “INotifyPropertyChanged.” Otherwise, WPF might create a strong reference to your binding source (like a ViewModel) using a static variable, leading to a leak.

public class MyViewModel
{
    public string MyProperty { get; set; }
}

public partial class MainWindow : Window
{
    private MyViewModel _viewModel;

    public MainWindow()
    {
        InitializeComponent();
        _viewModel = new MyViewModel();
        DataContext = _viewModel; // Set the DataContext to the ViewModel
    }

    // A plain CLR property: not a DependencyProperty, and its owner doesn't implement INotifyPropertyChanged
    public string MyNonBindableProperty { get; set; }

    private void ButtonClick(object sender, RoutedEventArgs e)
    {
        MyNonBindableProperty = _viewModel.MyProperty; // Bind to a non-suitable property
    }
}

In this WPF example, ButtonClick copies _viewModel.MyProperty into MyNonBindableProperty, a plain CLR property on the window. The danger comes when WPF data-binds to a property whose source object is neither a DependencyObject nor an implementation of INotifyPropertyChanged: to track changes, WPF falls back to a mechanism (PropertyDescriptor change notifications) that holds a strong, effectively global reference to the binding source, here the MyViewModel, which can keep it alive long after it is no longer needed.

  • Captured Members: Event handler methods obviously reference the object they belong to, but anonymous methods and lambdas that capture instance members do so as well. This can lead to memory leaks, as shown in the example below:

public class NetworkMonitor
{
    private int _signalStrengthChanges = 0;

    public NetworkMonitor(NetworkManager networkManager)
    {
        // The lambda captures "this" in order to increment _signalStrengthChanges,
        // so the NetworkManager's event now references this NetworkMonitor instance.
        networkManager.onSignalStrengthChange += (sender, args) => _signalStrengthChanges++;
    }
}

  • Threads that run forever, without ever stopping, can cause memory leaks. Each live thread’s stack is a GC root, so any object referenced by a thread’s stack variables stays alive for as long as the thread keeps running. The same goes for timers: if the timer’s callback is an instance method, the timer’s delegate references that instance, and the object can’t be collected while the timer exists.
    Let’s look at an example to illustrate this kind of memory leak…

public class MyClass
{
    public MyClass()
    {
        // HandleTick is an instance method, so the timer's delegate references this
        // MyClass instance and keeps it alive for as long as the timer keeps firing.
        Timer timerStart = new Timer(HandleTick);
        timerStart.Change(TimeSpan.FromSeconds(5), TimeSpan.FromSeconds(5));
    }

    private void HandleTick(object state)
    {
        // do something
    }
}