Custom Query (1030 matches)
Results (301 - 303 of 1030)
| Ticket | Resolution | Summary | Owner | Reporter |
|---|---|---|---|---|
| #370 | fixed | AREF/ASET optimizations with undeclared rank | ||
| Description |
From gb:
The declaration in question says that A and B are SIMPLE-ARRAYs with unspecified dimensionality. We'll generally only try to open-code the AREF/CCL::ASET if the dimensionality of the array is specified in the declaration. In "reasonably safe" code, if the declared dimensionality of the array was unspecified (or specified as *), we could treat a single-subscript AREF as an access to a one-dimensional SIMPLE-ARRAY and typecheck that A is in fact a one-dimensional array of the declared type before doing anything with it. If A was in fact of the wrong dimensionality, we'd get a type error (A isn't a simple one-dimensional array of the specified type) rather than a wrong-number-of-subscripts error, but I don't think that AREF's error behavior is too rigidly specified. In unsafe code, we crash and burn if the actual dimensionality doesn't match, but win in more cases if it does. |
|||
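The "reasonably safe" strategy described above can be illustrated by analogy (this is a hypothetical Python sketch, not CCL's compiler code): model a rank-1 array as a flat list, and type-check the rank before the single-subscript access, so that a rank mismatch surfaces as a type error rather than a wrong-number-of-subscripts error.

```python
def safe_aref1(a, i):
    """Hypothetical analogue of the proposed safe AREF strategy:
    before a single-subscript access, verify that `a` really is
    one-dimensional (here: a flat list).  A rank mismatch is then
    reported as a type error, not a subscript-count error."""
    if not isinstance(a, list) or any(isinstance(x, list) for x in a):
        raise TypeError("not a one-dimensional array")
    return a[i]

print(safe_aref1([10, 20, 30], 1))      # ordinary rank-1 access -> 20
try:
    safe_aref1([[1, 2], [3, 4]], 0)     # rank-2 data: rejected up front
except TypeError as e:
    print("TypeError:", e)
```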
| #371 | fixed | permissions and GET-DESCRIPTOR-FOR | ||
| Description |
When the lisp's current directory is not writable, GET-DESCRIPTOR-FOR can fail. If OBJECT is a stream, then the function tries to create a temporary file in the current directory. It should create the temporary file in /tmp (or whatever). |
|||
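The fix the ticket asks for corresponds to the usual pattern of asking the OS for a scratch location instead of relying on the current directory. A minimal Python sketch of that pattern (not CCL's actual code):

```python
import os
import tempfile

# Create a scratch file in the system temp directory (TMPDIR, or /tmp),
# so an unwritable current working directory doesn't matter.
fd, path = tempfile.mkstemp(prefix="descriptor-")
print(path)  # something under tempfile.gettempdir(), independent of os.getcwd()
assert os.path.dirname(path) == tempfile.gettempdir()
os.close(fd)
os.remove(path)
```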
| #372 | invalid | Problem with basic arithmetic | ||
| Description |
Hard to believe, but this is what I'm getting:

```
? (= (+ .6 .8) 1.4)
NIL
? (+ .8 .6)
1.4000001
? (+ .6 .8)
1.4000001
? (+ .6 .8 .1)
1.5000001
? (- .8 .6)
0.19999999
```

I do not see this for other numbers, e.g.:

```
? (+ .6 .9)
1.5
? (+ .6 .7)
1.3
```

.6 and .8 somehow have to be involved, but this is not systematic; e.g.:

```
? (+ .1 .6 .8)
1.5
```

I tried other numbers but could not find similar behavior for addition. For subtraction I'm also getting:

```
? (- .9 .6)
0.29999995
? (- .6 .9)
-0.29999995
? (- .6 .5)
0.100000024
? (- .7 .8)
-0.100000024
```

I also seem to be getting similar problems with multiplication, but only with numbers 0 < x < 1 with one decimal. This is one particular terminal session:

```
$ ccl
Welcome to Clozure Common Lisp Version 1.2-r11241M (DarwinX8664)!
? (= (+ .6 .8) 1.4)
NIL
? (= (+ .6 .9) 1.5)
T
? (+ .6 .8)
1.4000001
? (+ .6 .9)
1.5
? (+ .6 .7)
1.3
? (+ .8 .6)
1.4000001
? (+ .5 .6)
1.1
? (+ .8 .6 .1)
1.5000001
? (- .6 .8)
-0.19999999
? (- .5 .8)
-0.3
? (- .6 .6)
0.0
? (- .6 .7)
-0.099999964
? (* .1 .1)
0.010000001
? (* 1 1)
1
? (* 2 .1)
0.2
? (* 1.1 1.1)
1.21
? (* 1.1 .1)
0.11000001
? (* .2 .2)
0.040000003
? (* .1 1)
0.1
? (* .1 .1111)
0.01111
? (* .1 .11)
0.011
? (* .1 .1)
0.010000001
? (* .11 .11)
0.0121
```

Platform: Mac OS X 10.5.5 on 64-bit Intel. What could be the problem here? |
|||
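The "invalid" resolution reflects that this is expected IEEE 754 behavior, not a CCL bug: the Lisp reader reads `.6` as a single-float, and neither 0.6 nor 0.8 is exactly representable in binary, so the rounded sum lands one unit in the last place away from the rounded 1.4. A Python sketch that simulates single precision with the `struct` module reproduces the reported values (Python's own floats are doubles, hence the explicit rounding helper):

```python
import struct

def f32(x):
    """Round a Python float (an IEEE 754 double) to single precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

a, b = f32(0.6), f32(0.8)
s = f32(a + b)        # single-precision sum, as CCL computes it
print(repr(a))        # 0.6000000238418579  (.6 is not exactly representable)
print(repr(s))        # 1.4000000953674316  (prints as 1.4000001 in CCL)
print(s == f32(1.4))  # False: one ulp away from single-precision 1.4
```

By contrast, `(+ .6 .9)` happens to round to exactly 1.5, which is why the reporter saw `T` for that case: the representation errors of .6 and .9 cancel.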
