In the Op-wise summary, the execution times of all Apply nodes executing the same Op are grouped together, and the total execution time per Op is shown (so if you use dot twice, you will see only one entry there, corresponding to the sum of the time spent in each of them). If two Ops have different hash values, they are listed separately.
The type-Op-wise summary groups the results by type of Op, so even if two Ops have different hash values, they are merged.
There is a hack in the Op-wise summary. Go see it if you want to know more.
param: n_apply_to_print the number of Apply nodes to print. Default 15.
param: n_ops_to_print the number of Ops to print. Default 20.
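The difference between the two summaries can be sketched as follows. This is a hedged illustration, not the profiler's actual internals: `Dot` and `apply_time` are hypothetical stand-ins for a Theano Op class and the per-Apply timing data.

```python
from collections import defaultdict

# Two distinct Op instances of the same type: they hash differently,
# so the Op-wise summary keeps them apart, while the type-Op-wise
# summary merges them under type(op).
class Dot(object):
    pass

op_a, op_b = Dot(), Dot()
# fake (op, seconds) timing pairs, one per Apply node
apply_time = [(op_a, 0.5), (op_b, 0.25), (op_a, 0.25)]

op_time = defaultdict(float)    # Op-wise: keyed by the Op instance (its hash)
type_time = defaultdict(float)  # type-Op-wise: keyed by type of Op
for op, t in apply_time:
    op_time[op] += t
    type_time[type(op)] += t

# op_time has two entries (op_a: 0.75, op_b: 0.25);
# type_time has a single merged entry (Dot: 1.0).
```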
print ' ... (remaining %i Apply instances account for %.2f%%(%.2fs) of the runtime)'\
        % (max(0, len(atimes) - n_apply_to_print),
           sum(f for f, t, a in atimes[n_apply_to_print:]) * 100,
           sum(t for f, t, a in atimes[n_apply_to_print:]))
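As a sketch of the data layout the print above assumes, `atimes` can be read as a list of ``(fraction_of_runtime, seconds, apply_node)`` tuples sorted by decreasing time; the timing values below are made up for illustration.

```python
# Hypothetical per-Apply timings (seconds); local_time is their total.
times = {'apply1': 1.0, 'apply2': 0.6, 'apply3': 0.4}
local_time = 2.0

# Build (fraction, seconds, apply) tuples, biggest consumers first.
atimes = sorted(((t / local_time, t, a) for a, t in times.items()),
                reverse=True)

# Everything past the first n_apply_to_print entries is summarized
# in the "remaining ..." line.
n_apply_to_print = 2
remaining_pct = sum(f for f, t, a in atimes[n_apply_to_print:]) * 100
remaining_sec = sum(t for f, t, a in atimes[n_apply_to_print:])
```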
flops = False
flops_msg = ''
for a, t in op_time.items():
    if hasattr(a, 'flops'):
        flops = True
        flops_msg = ' <MFlops/s>'
        print '\nHACK WARNING: we print the flops for some Ops, but the logic doesn\'t always work. You need to know the internals of Theano to make it work correctly. Otherwise don\'t use it!'
        break
print '\nOp-wise summary: <%% of local_time spent on this kind of Op> <cumulative seconds> <self seconds>%s <Op name>' % (flops_msg)
# find the maximum value under unroll_batch that would work
new = self.unroll_batch
assert new >= 1
while self.bsize % new != 0:
    new -= 1
print "OPTIMISATION WARNING: in ConvOp.__init__() unroll_batch(%s) must be 0 or a divisor of bsize(%s). We revert it to %d. This won't change the result, but may make it slower." % (str(self.unroll_batch), str(self.bsize), new)
self.unroll_batch = new
if self.unroll_kern > 0 and self.nkern % self.unroll_kern != 0:
    if self.nkern <= self.unroll_kern:
        self.unroll_kern = self.nkern
    else:
        # find the maximum value under unroll_kern that would work
        new = self.unroll_kern
        assert new >= 1
        while self.nkern % new != 0:
            new -= 1
        print "OPTIMISATION WARNING: in ConvOp.__init__() unroll_kern(%s) should be 0 or a divisor of nkern(%s). We revert it to %d. This won't change the result, but may make it slower." % (str(self.unroll_kern), str(self.nkern), new)
        self.unroll_kern = new
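The divisor search used for both unroll_batch and unroll_kern can be read as one small helper. `largest_divisor_leq` is a hypothetical name for illustration, not a function in Theano:

```python
def largest_divisor_leq(n, k):
    """Return the largest d with 1 <= d <= k that divides n.

    Mirrors the `while n % new != 0: new -= 1` loops above: counting
    down from k (clamped to n) always terminates, since 1 divides n.
    """
    d = min(k, n)
    while n % d != 0:
        d -= 1
    return d
```

For example, ``largest_divisor_leq(10, 4)`` returns 2: neither 4 nor 3 divides 10, so the countdown stops at 2.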
"""Reorder the dimensions of this variable, optionally inserting broadcasted dimensions.
"""Reorder the dimensions of this variable, optionally inserting broadcasted dimensions.
:param pattern: list/tuple of int mixed with 'x' for broadcastable dimensions
For example, to create a 3D view of a [2D] matrix, call ``dimshuffle([0, 'x', 1])``. This
will create a 3D view such that the middle dimension is an implicit broadcasted
dimension. To do the same thing on the transpose of that matrix, call ``dimshuffle([1,
'x', 0])``.
This function supports the pattern passed as a tuple or as a variable-length argument list (e.g. ``a.dimshuffle(pattern)`` is equivalent to ``a.dimshuffle(*pattern)``, where ``pattern`` is a list/tuple of ints mixed with 'x' entries).
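The effect of a pattern on the shape can be sketched with a small pure-Python helper. `dimshuffle_shape` is illustrative only; it mimics how the pattern maps input axes and 'x' entries to output axes, not Theano's actual implementation:

```python
def dimshuffle_shape(shape, pattern):
    """Output shape of a dimshuffle: each int in the pattern picks an
    input axis; each 'x' inserts a new broadcastable (size-1) axis."""
    return tuple(1 if p == 'x' else shape[p] for p in pattern)
```

For a 2x3 matrix, ``dimshuffle_shape((2, 3), (0, 'x', 1))`` gives ``(2, 1, 3)`` (a broadcastable middle dimension), and the transpose pattern ``(1, 'x', 0)`` gives ``(3, 1, 2)``.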
raise TypeError("The number of dimensions and/or broadcastable pattern of the input is incorrect for this op. Expected %s, got %s." % (self.input_broadcastable, ib))