[Cado-nfs-discuss] cado-nfs-2.1.1 crash during filtering

Greg Marks marks at member.ams.org
Sat Jun 13 01:23:53 CEST 2015


Dear Colleagues,

Many thanks to Paul Zimmermann and Pierrick Gaudry for their helpful
assistance with my 29 May 2015 problem.  A new problem has emerged with
the same factorization.

The program now terminates with the error message appended at the bottom
of this message.  The file $WORKDIR/c184.duplicates1//0/dup1.1.0000.gz
mentioned in the stderr below turns out to be a 20-byte file which, when
gunzipped, yields a zero-byte file.
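(For reference, a 20-byte .gz file is exactly what gzip emits for empty
input -- a 10-byte header, a 2-byte empty deflate stream, and the 8-byte
CRC/length trailer -- so the file can be checked quickly from the shell;
the path below is the one reported in the stderr log:)

```shell
# A gzip of empty input is always 20 bytes:
# 10-byte header + 2-byte empty deflate stream + 4-byte CRC32 + 4-byte length.
printf '' | gzip -c | wc -c        # prints 20

# Check the suspect file (path as reported in the stderr below):
f="$WORKDIR/c184.duplicates1//0/dup1.1.0000.gz"
if [ -e "$f" ]; then
  ls -l "$f"                       # expect a 20-byte file
  gzip -t "$f" && echo "structurally valid gzip"
  zcat "$f" | wc -c                # expect 0: no relations inside
fi
```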

I tried simply removing the file
$WORKDIR/c184.duplicates1//0/dup1.1.0000.gz and restarting cadofactor.py,
but this produced an error.  (As best I recall, the file was recorded in
a binary database file that I was unable to edit, and cadofactor.py
complained that the now-deleted file listed in the database could not be
found.)  So I restarted the entire factorization with a new, empty
working directory, importing the sieving relations files already
computed, and the program stopped with the exact same problem: a 20-byte
gzipped empty duplicates file.
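(In case it helps with locating the stale entry: the binary database that
cadofactor keeps appears to be an SQLite file in the working directory --
assumed to be c184.db here, which is worth double-checking -- so the rows
that still point at the dead file can at least be located with a generic
scan, without assuming anything about the table layout, before attempting
any surgery:)

```python
import sqlite3

def find_references(db_path, needle):
    """Scan every table of an SQLite database and return
    (table, rowid, column, value) for each text cell containing needle."""
    hits = []
    con = sqlite3.connect(db_path)
    tables = [row[0] for row in con.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    for table in tables:
        # PRAGMA table_info gives (cid, name, type, notnull, dflt, pk)
        cols = [row[1] for row in con.execute(f'PRAGMA table_info("{table}")')]
        for rowid, *values in con.execute(f'SELECT rowid, * FROM "{table}"'):
            for col, val in zip(cols, values):
                if isinstance(val, str) and needle in val:
                    hits.append((table, rowid, col, val))
    con.close()
    return hits

if __name__ == "__main__":
    # "c184.db" is an assumed filename; adjust to the actual database.
    for hit in find_references("c184.db", "dup1.1.0000.gz"):
        print(hit)
```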

Two questions:

1. Is there a way for me to remove whatever is corrupted and restart
the computation from here?

2. When restarting a computation with a new working directory, importing
polynomials and sieving relations previously computed, the program
spends significant time on the "Generate Free Relations" step between
polynomial selection and lattice sieving.  To speed things up, is there a
way to import the factor base and the free relations previously computed?

Sincerely,
Greg Marks

    ------------------------------------------------
   | Greg Marks                                     |
   | Department of Mathematics and Computer Science |
   | St. Louis University                           |
   | St. Louis, MO 63103-2007                       |
   | U.S.A.                                         |
   |                                                |
   | Phone: (314)977-7206        Fax: (314)977-1452 |
   | PGP encryption public key ID: 0x53F269E8       |
   | Web: http://gmarks.org                         |
    ------------------------------------------------

Info:Filtering - Duplicate Removal, removal pass: Starting
Warning:Command: Process with PID 707 finished with return code 1
Error:Filtering - Duplicate Removal, removal pass: Program run on server failed with exit code 1
Error:Filtering - Duplicate Removal, removal pass: Command line was: $HOME/local/cado-nfs-build/filter/dup2 -nrels 5855241 -poly $WORKDIR/c184.polyselect2.poly -renumber $WORKDIR/c184.freerel.renumber.gz $WORKDIR/c184.duplicates1//0/dup1.2.0000.gz $WORKDIR/c184.duplicates1//0/dup1.0.0000.gz $WORKDIR/c184.duplicates1//0/dup1.1.0000.gz > $WORKDIR/c184.duplicates2.dup2.slice0.stdout.3 2> $WORKDIR/c184.duplicates2.dup2.slice0.stderr.3
Error:Filtering - Duplicate Removal, removal pass: Stderr output follows (stored in file $WORKDIR/c184.duplicates2.dup2.slice0.stderr.3):
antebuffer set to $HOME/local/cado-nfs-build/utils/antebuffer
Opening $WORKDIR/c184.freerel.renumber.gz to read the renumbering table
Warning, computed value of nb_bits (0) is different from the read value of nb_bits (32).
Renumbering table: read 8388608 values from file in 3.4s -- 18.4 MB/s
Renumbering table: read 16777216 values from file in 6.5s -- 19.8 MB/s
Renumbering table: read 25165824 values from file in 10.3s -- 19.1 MB/s
Renumbering table: read 33554432 values from file in 13.9s -- 19.0 MB/s
Renumbering table: read 41943040 values from file in 18.0s -- 18.7 MB/s
Renumbering table: read 50331648 values from file in 21.8s -- 18.8 MB/s
Renumbering table: read 58720256 values from file in 25.7s -- 18.8 MB/s
Renumbering table: read 67108864 values from file in 29.5s -- 18.9 MB/s
Renumbering table: read 75497472 values from file in 33.1s -- 19.1 MB/s
Renumbering table: read 83886080 values from file in 37.1s -- 19.0 MB/s
Renumbering table: read 92274688 values from file in 40.9s -- 19.1 MB/s
Renumbering table: read 100663296 values from file in 45.1s -- 18.9 MB/s
Renumbering table: read 109051904 values from file in 49.0s -- 18.9 MB/s
Renumbering table: read 117440512 values from file in 53.0s -- 18.9 MB/s
Renumbering table: read 125829120 values from file in 56.9s -- 18.9 MB/s
Renumbering table: read 134217728 values from file in 61.0s -- 18.9 MB/s
Renumbering table: read 142606336 values from file in 65.0s -- 18.9 MB/s
Renumbering table: read 150994944 values from file in 68.9s -- 18.9 MB/s
Renumbering table: read 159383552 values from file in 71.6s -- 19.2 MB/s
Renumbering table: read 167772160 values from file in 74.9s -- 19.4 MB/s
Renumbering table: read 176160768 values from file in 77.2s -- 19.8 MB/s
Renumbering table: read 184549376 values from file in 79.2s -- 20.2 MB/s
Renumbering table: read 192937984 values from file in 81.3s -- 20.6 MB/s
Renumbering table: read 201326592 values from file in 83.3s -- 21.0 MB/s
Renumbering table: read 209715200 values from file in 85.4s -- 21.3 MB/s
Renumbering table: read 218103808 values from file in 87.4s -- 21.7 MB/s
Renumbering table: read 226492416 values from file in 89.3s -- 22.1 MB/s
Renumbering table: read 234881024 values from file in 91.4s -- 22.4 MB/s
Renumbering table: read 243269632 values from file in 93.4s -- 22.7 MB/s
Renumbering table: read 251658240 values from file in 95.4s -- 23.0 MB/s
Renumbering table: read 260046848 values from file in 97.6s -- 23.3 MB/s
Renumbering table: read 268435456 values from file in 99.6s -- 23.6 MB/s
Renumbering table: read 276824064 values from file in 101.7s -- 23.8 MB/s
Renumbering table: read 285212672 values from file in 103.7s -- 24.1 MB/s
Renumbering table: read 293601280 values from file in 105.8s -- 24.3 MB/s
Renumbering table: read 301989888 values from file in 108.1s -- 24.5 MB/s
Renumbering table: read 310378496 values from file in 109.5s -- 24.9 MB/s
Renumbering table: read 318767104 values from file in 110.9s -- 25.2 MB/s
Renumbering table: read 327155712 values from file in 112.1s -- 25.7 MB/s
Renumbering table: read 335544320 values from file in 113.4s -- 26.0 MB/s
Renumbering table: read 343932928 values from file in 114.5s -- 26.4 MB/s
Renumbering table: read 352321536 values from file in 115.7s -- 26.8 MB/s
Renumbering table: read 360710144 values from file in 116.9s -- 27.2 MB/s
Renumbering table: read 369098752 values from file in 118.1s -- 27.5 MB/s
Renumbering table: read 377487360 values from file in 119.4s -- 27.9 MB/s
Renumbering table: read 385875968 values from file in 120.6s -- 28.2 MB/s
Renumbering table: read 394264576 values from file in 121.9s -- 28.5 MB/s
Renumbering table: read 402653184 values from file in 123.0s -- 28.9 MB/s
Renumbering table: end of read. Read 406575736 values from file in 123.6s -- 29.0 MB/s
Renumbering struct: nb_bits=32, sizeof(*table)=4, rat=0 nb_badideals=0 add_full_col=0
Renumbering struct: nprimes=406575736
Renumbering struct: first_not_cached=164321
[checking true duplicates on sample of 70264 cells]
Allocated hash table of 7026389 entries (26Mb)
Constructing the two filelists...
Error while reading $WORKDIR/c184.duplicates1//0/dup1.1.0000.gz
Traceback (most recent call last):
  File "$HOME/local/cado-nfs-2.1.1/scripts/cadofactor/cadofactor.py", line 72, in <module>
    factors = factorjob.run()
  File "$HOME/local/cado-nfs-2.1.1/scripts/cadofactor/cadotask.py", line 4720, in run
    last_status, last_task = self.run_next_task()
  File "$HOME/local/cado-nfs-2.1.1/scripts/cadofactor/cadotask.py", line 4788, in run_next_task
    return [task.run(), task.title]
  File "$HOME/local/cado-nfs-2.1.1/scripts/cadofactor/cadotask.py", line 2968, in run
    raise Exception("Program failed")
Exception: Program failed