Is x500 debugging performance unusual?

I have a short program that takes x500 longer to run when debugging. Is this kind of performance cost unusual?

Recently I’ve only used F# once a year (for Advent of Code) so it may just be my memory that is a little rosy when I expect debugging to be no more than about x10 slower.

What I’m seeing is:

    fsi:       0.448s
    F5:      412.684s
    Ctrl+F5:   0.786s

My guess is that my ‘brute force’ code is being heavily optimized, so the difference reflects the degree of optimization rather than anything directly related to debugging. But I just wanted to check whether there are any common causes of significantly slower debugging performance? (e.g.: .NET 5 vs .NET 3, VS Code vs Visual Studio, use of some feature, …)

I’m just asking out of curiosity; it’s not a real problem.


I’m using .net 5 (tried 5.0.100-rc.2.20479.15 and 5.0.100) with VSCode on Windows, with “C# for Visual Studio Code (powered by OmniSharp)” v1.23.7.

I’m a bit surprised by that difference too. Maybe try changing these settings for Debug configuration to see what effect they might have. Also, conditional breakpoints or tracepoints will definitely slow things down a lot.
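As a sketch of where such overrides would go, assuming the settings in question are the usual MSBuild optimization-related properties (`Optimize`, `TieredCompilation`, and `DebugType` are real MSBuild properties, but the exact set being referred to here is an assumption), something like this in the `.fsproj`:

```xml
<!-- Hypothetical Debug-configuration overrides in the .fsproj -->
<PropertyGroup Condition="'$(Configuration)' == 'Debug'">
  <!-- Run the JIT optimizer even in Debug builds -->
  <Optimize>true</Optimize>
  <!-- Skip the quicker-but-slower-code tier-0 JIT pass -->
  <TieredCompilation>false</TieredCompilation>
  <!-- Portable PDBs are the modern default for .NET Core / .NET 5 -->
  <DebugType>portable</DebugType>
</PropertyGroup>
```

Note that enabling `Optimize` in Debug builds can make stepping and inspecting locals less reliable, so it is mainly useful as a diagnostic here.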


Thanks Scott,

I tried those flags (in the project property group, which I hope was the right place), and if they had an effect, it was only to slow things down by about 10%.

There aren’t any breakpoints - originally I just hit F5 by accident and thought the program had crashed.

Because there wasn’t an obvious cause, I did a bit of experimenting and got the same results with (Core 3.0, VS Code), (Core 3.1, Visual Studio), and (Framework 4.8, Visual Studio). So I think my code is just a bit of an edge case. If I get time I might refactor it to see if that closes the gap between the two times.

Thanks again