Commit 346400c1 authored by Olivier Delalleau

Fixed some typos

Parent d6934160
...
@@ -136,10 +136,10 @@ AddConfigVar('nocleanup',
         BoolParam(False),
         in_c_key=False)
-# This flag is used when we import Theano to init global variable.
-# So changing it after import won't change those global variable.
-# This could be changed... but for now I do the fast fix to disable the
-# change at run time.
+# This flag is used when we import Theano to initialize global variables.
+# So changing it after import will not modify these global variables.
+# This could be done differently... but for now we simply prevent it from being
+# changed at runtime.
 AddConfigVar('tensor.cmp_sloppy',
         "Relax tensor._allclose (0) not at all, (1) a bit, (2) more",
         IntParam(0, lambda i: i in (0,1,2), allow_override=False),
...
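The hunk above fixes comments on a config variable declared with `allow_override=False`: its value is read once at import time to set up global state, so later changes must be rejected. As a minimal sketch of that semantics (a hypothetical `IntParam` class, not Theano's actual `AddConfigVar` machinery):

```python
# Hypothetical sketch of a validated config parameter that refuses runtime
# overrides, mirroring the allow_override=False behavior described above.
class IntParam:
    def __init__(self, default, validate, allow_override=True):
        self.validate = validate
        self.allow_override = allow_override
        self._value = default

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new):
        # Reject any change when overrides are disabled: the default was
        # already consumed at import time to initialize global variables.
        if not self.allow_override:
            raise AttributeError(
                "This config parameter cannot be changed at runtime")
        if not self.validate(new):
            raise ValueError("invalid value: %r" % (new,))
        self._value = new

cmp_sloppy = IntParam(0, lambda i: i in (0, 1, 2), allow_override=False)
print(cmp_sloppy.value)   # 0
try:
    cmp_sloppy.value = 2  # raises AttributeError
except AttributeError as e:
    print(e)
```

Freezing the parameter is simpler than tracking every global that was derived from it, which is the trade-off the comment alludes to.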
...
@@ -1223,8 +1223,8 @@ class _tensor_py_operators:
     size = property(lambda self: prod(self.shape))
     # We can't implement __len__ to provide a better error message.
-    # Otherwise TensorVariable[:-1] don't work as Python 2.5.1 call
-    # __len__ before calling __getitem__. It also don't catch the raised
+    # Otherwise TensorVariable[:-1] does not work as Python 2.5.1 calls
+    # __len__ before calling __getitem__. It also does not catch the raised
     # Exception!
     # def __len__(self):
     #     # We can't implement __len__ as Python requests that this
...
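The comment in this hunk describes a Python 2.5.1 pitfall: slicing such as `TensorVariable[:-1]` invoked `__len__` before `__getitem__`, so a `__len__` that raised an informative error broke slicing. A minimal sketch (hypothetical `Tensorish` class, not Theano's `TensorVariable`) showing the interaction; note that on modern Python, slicing dispatches straight to `__getitem__` and never consults `__len__`:

```python
# Hypothetical class illustrating the __len__ vs. __getitem__ interaction
# described in the comment above.
class Tensorish:
    def __getitem__(self, key):
        # Slicing passes a slice object directly; __len__ is not consulted
        # on modern Python (it was under Python 2.5.1's slicing protocol).
        return ("got", key)

    def __len__(self):
        # The "better error message" the commented-out __len__ would provide:
        raise TypeError("Undefined length; use var.shape[0] instead")

x = Tensorish()
print(x[:-1])  # -> ('got', slice(None, -1, None))
try:
    len(x)     # any code that needs the length hits the informative error
except TypeError as e:
    print(e)
```

This is why the `__len__` definition stays commented out in the source: under the old protocol, defining it would have made negative-index slicing raise instead of reaching `__getitem__`.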