{"id":4112,"date":"2018-07-13T10:47:53","date_gmt":"2018-07-13T15:47:53","guid":{"rendered":"http:\/\/hoolihan.net\/blog-tim\/?p=4112"},"modified":"2018-07-13T10:47:53","modified_gmt":"2018-07-13T15:47:53","slug":"using-tensorboard-with-multiple-model-runs","status":"publish","type":"post","link":"http:\/\/hoolihan.net\/blog-tim\/2018\/07\/13\/using-tensorboard-with-multiple-model-runs\/","title":{"rendered":"Using TensorBoard with Multiple Model Runs"},"content":{"rendered":"<p>I tend to use Keras when doing deep learning, with TensorFlow as the back-end. This allows use of TensorBoard, a web interface that charts loss and other metrics by training iteration, as well as visualizing the computation graph. I noticed TensorBoard has an area of the interface for comparing different runs, but my runs never appeared there. It turns out I was using it incorrectly: I had written every run&#8217;s logs to the same directory, when each run should get its own subdirectory. Here&#8217;s how I set it up to work for me:<\/p>\n<p>First, I needed a unique name for each run. I already had a function for naming logs; it captures the start time of the run when it is initialized. Here&#8217;s that code:<\/p>\n<script src=\"https:\/\/gist.github.com\/e5aaaedc68163c8e776dd3b5894a2e5e.js\"><\/script>\n<p>Then, I used that name to build a constant for the TensorBoard log directory:<\/p>\n<script src=\"https:\/\/gist.github.com\/6f0ec761402b94fa63fd0df151f9740d.js\"><\/script>\n<p>Finally, I run <strong>tensorboard on the parent directory<\/strong>, without the unique run name:<\/p>\n<script src=\"https:\/\/gist.github.com\/41d1e2f42c6a100953f8dcc0442b1f88.js\"><\/script>\n<p>If you&#8217;re wondering why I explicitly set the host parameter to listen on all interfaces, it&#8217;s so that TensorBoard is reachable when running on a cloud GPU server.
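<\/p>
<p>Since the gists above are embedded scripts, here is a minimal self-contained sketch of the same setup in Python. The names used here (run_name, TB_LOG_DIR, the &#8220;logs&#8221; parent directory) are my own placeholders, not necessarily what the gists use:<\/p>

```python
from datetime import datetime
import os

# Capture the start time once, so every log path for this run agrees.
START_TIME = datetime.now().strftime("%Y%m%d-%H%M%S")

def run_name(prefix="run"):
    """Return a unique name for this training run, e.g. 'run-20180713-104753'."""
    return "{}-{}".format(prefix, START_TIME)

# Each run writes to its own subdirectory of the shared parent "logs" directory.
TB_LOG_DIR = os.path.join("logs", run_name())

# With Keras, hand the per-run directory to the TensorBoard callback, e.g.:
#   from keras.callbacks import TensorBoard
#   model.fit(x, y, callbacks=[TensorBoard(log_dir=TB_LOG_DIR)])
```

<p>TensorBoard is then launched against the parent directory, e.g. <code>tensorboard --logdir=logs --host=0.0.0.0<\/code>, and each subdirectory shows up as a separate run.<\/p>
<p>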
<\/p>\n<p>You&#8217;ll now see each subdirectory as a unique run in the interface:<\/p>\n<figure id=\"attachment_4114\" aria-describedby=\"caption-attachment-4114\" style=\"width: 985px\" class=\"wp-caption alignnone\"><a href=\"http:\/\/hoolihan.net\/blog-tim\/wp-content\/uploads\/2018\/07\/Screenshot-2018-07-13-11.35.13.png\"><img loading=\"lazy\" decoding=\"async\" src=\"http:\/\/hoolihan.net\/blog-tim\/wp-content\/uploads\/2018\/07\/Screenshot-2018-07-13-11.35.13.png\" alt=\"Multiple runs in TensorBoard\" width=\"985\" height=\"724\" class=\"size-full wp-image-4114\" srcset=\"http:\/\/hoolihan.net\/blog-tim\/wp-content\/uploads\/2018\/07\/Screenshot-2018-07-13-11.35.13.png 985w, http:\/\/hoolihan.net\/blog-tim\/wp-content\/uploads\/2018\/07\/Screenshot-2018-07-13-11.35.13-300x221.png 300w, http:\/\/hoolihan.net\/blog-tim\/wp-content\/uploads\/2018\/07\/Screenshot-2018-07-13-11.35.13-768x564.png 768w\" sizes=\"auto, (max-width: 985px) 100vw, 985px\" \/><\/a><figcaption id=\"caption-attachment-4114\" class=\"wp-caption-text\">Multiple runs in TensorBoard<\/figcaption><\/figure>\n<p>That should do it. Comment if you have questions or feedback.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>I tend to use Keras when doing deep learning, with TensorFlow as the back-end. This allows use of TensorBoard, a web interface that will chart loss and other metrics by training iteration, as well as visualize the computation graph. 
I noticed TensorBoard has an area of the interface for showing different runs, but wasn&#8217;t able [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[306,289],"tags":[307,342,341,301,343,298],"class_list":["post-4112","post","type-post","status-publish","format-standard","hentry","category-data-science","category-machine-learning","tag-data-science","tag-deep-learning","tag-keras","tag-machine-learning","tag-tensorboard","tag-tensorflow"],"_links":{"self":[{"href":"http:\/\/hoolihan.net\/blog-tim\/wp-json\/wp\/v2\/posts\/4112","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/hoolihan.net\/blog-tim\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/hoolihan.net\/blog-tim\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/hoolihan.net\/blog-tim\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/hoolihan.net\/blog-tim\/wp-json\/wp\/v2\/comments?post=4112"}],"version-history":[{"count":0,"href":"http:\/\/hoolihan.net\/blog-tim\/wp-json\/wp\/v2\/posts\/4112\/revisions"}],"wp:attachment":[{"href":"http:\/\/hoolihan.net\/blog-tim\/wp-json\/wp\/v2\/media?parent=4112"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/hoolihan.net\/blog-tim\/wp-json\/wp\/v2\/categories?post=4112"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/hoolihan.net\/blog-tim\/wp-json\/wp\/v2\/tags?post=4112"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}