
Meaning of the `values` Parameter of TensorFlow's variable_scope

I am currently reading the source code of the slim library, which is based on TensorFlow, and it uses the `values` argument of the `variable_scope` method a lot, like here. From the API page I can see…

Solution 1:

The variable scope mechanism helps ensure the uniqueness of variables, and the reuse of variables where that is desired.

Yes, if you create two or more different computation graphs, they won't necessarily share the same variable scope; however, there are ways to share variables across graphs, so the option is there.
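As a concrete illustration of how `values` interacts with graphs: passing tensors via `values` lets the scope check that they all come from the same graph, and makes that graph the default inside the block. This is a minimal sketch using the TF1-compat API; the names `g`, `inputs`, and `layer` are just placeholders for this example.

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Build an input tensor in an explicitly constructed graph.
g = tf.Graph()
with g.as_default():
    inputs = tf.placeholder(tf.float32, [None, 8], name="inputs")

# Passing `values=[inputs]` ties the scope to the graph that owns
# `inputs`: ops and variables created inside land in that graph,
# even though it is not the ambient default graph here.
with tf.variable_scope("layer", values=[inputs]):
    w = tf.get_variable("w", shape=[8, 4])

assert w.graph is inputs.graph  # both live in `g`
```

If the tensors in `values` came from different graphs, entering the scope would raise an error instead, which is the accidental-mixing check the answer alludes to.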

Primary use cases for variable scope are RNNs, where many of the weights are tied and reused. That's one reason someone would need it. The other main reason it's there is to ensure that you reuse the same variables when you explicitly mean to, and not by accident. (In distributed settings this can become a real concern.)
