1. 29 Oct, 2012 1 commit
  2. 27 Oct, 2012 2 commits
  3. 26 Oct, 2012 6 commits
  4. 25 Oct, 2012 17 commits
  5. 24 Oct, 2012 7 commits
  6. 23 Oct, 2012 5 commits
  7. 18 Oct, 2012 2 commits
    • Merge pull request #1013 from pascanur/seq_bug_scan · 66126fc3
      Committed by lamblin
      bug scan
    • Re-add part of the dtype constraint on out grads · 3bd9ffde
      Committed by Pascal Lamblin
      In order to avoid expanding memory usage and computation in the part of
      the graph that computes gradients, I propose the following conventions,
      which reinstate some of the constraints that previously existed on the
      dtype of gradients:
      - When calling some_op.grad(inputs, output_grads), each variable in the
        "output_grads" list, if it is an actual numeric variable (and not,
        for instance, DisconnectedType or NullType), should have the same
        dtype as the corresponding output variable.
      - Moreover, if one of the output variables is of a discrete dtype (int
        or uint), then the corresponding output gradient (if not a special
        case like NullType) should be zeros.
      
      This is implemented in theano.grad, so the Op's grad method does not
      have to be changed, but now it can rely again on the fact that, if an
      output gradient has a dtype, that dtype will be the same as the
      corresponding output variable.
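      The convention above can be sketched in plain Python. This is an
      illustrative check only, not Theano's actual implementation; the
      function name `check_output_grads` and the use of NumPy arrays (with
      `None` standing in for special cases like DisconnectedType or
      NullType) are assumptions made for the example.

      ```python
      import numpy as np

      # Hypothetical sketch of the convention: each numeric output gradient
      # must share its output's dtype, and outputs with a discrete dtype
      # (int, uint, bool) get a gradient of zeros.
      DISCRETE_KINDS = ("i", "u", "b")

      def check_output_grads(outputs, output_grads):
          """Return output_grads adjusted to follow the dtype convention.

          `outputs` and `output_grads` are parallel lists of numpy arrays;
          None stands in for special cases (DisconnectedType, NullType).
          """
          checked = []
          for out, g in zip(outputs, output_grads):
              if g is None:
                  # Special case: leave untouched.
                  checked.append(g)
              elif out.dtype.kind in DISCRETE_KINDS:
                  # Discrete output: the gradient is a tensor of zeros.
                  checked.append(np.zeros_like(out, dtype=g.dtype))
              elif g.dtype != out.dtype:
                  raise TypeError(
                      "output grad dtype %s != output dtype %s"
                      % (g.dtype, out.dtype))
              else:
                  checked.append(g)
          return checked
      ```

      With this in place, an Op's grad method can again rely on any output
      gradient that has a dtype matching the corresponding output.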