TensorBoard subsamples scalars when plotting

I have noticed for years that TensorBoard v2.x subsamples scalars when plotting. I never tried to reproduce the problem, but I noticed that on a longish training run, not every epoch's loss was displayed on TensorBoard. This produced a strange jagged effect in the plot that I never liked, especially given my signal processing background. It was also not obvious why or how it was subsampling. For all I knew, it wasn't even subsampling but doing some kind of filtering that resulted in scalar values being skipped on the plots.
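For reference, here is a minimal sketch of the kind of logging loop where I see the effect; the log directory, tag name, and step count are just illustrative, and it assumes the TensorFlow 2.x tf.summary API:

import tensorflow as tf

# Write one scalar per step; with TensorBoard's default settings, only a
# sampled subset of these points ends up on the scalars plot.
writer = tf.summary.create_file_writer("logs/demo")
with writer.as_default():
    for step in range(50000):
        tf.summary.scalar("loss", 1.0 / (step + 1), step=step)
writer.flush()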

Today, I finally figured out how to fix this. I was able to correct the issue by launching TensorBoard from the command prompt with:

start "tensorboard" "e:\python_3.8\scripts\tensorboard" --samples_per_plugin scalars=9999999 --logdir .

Adding --samples_per_plugin scalars=9999999 fixes the issue. Now all of the points I write out to TensorBoard are displayed.
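If you prefer to launch TensorBoard from Python rather than a shell command, the same flag can also be passed through tensorboard.program. This is just a sketch assuming the tensorboard package is installed, with the logdir path as a placeholder:

from tensorboard import program

# Configure and launch TensorBoard with the same --samples_per_plugin flag.
tb = program.TensorBoard()
tb.configure(argv=[
    None,  # placeholder for argv[0], ignored by the argument parser
    "--logdir", ".",
    "--samples_per_plugin", "scalars=9999999",
])
url = tb.launch()  # starts the server on a background thread
print("TensorBoard listening on", url)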

Ref: https://stackoverflow.com/questions/43763858/change-images-slider-step-in-tensorboard