[Cado-nfs-discuss] [error] Failed on the filtering step.

canny georgina cannysiska at gmail.com
Thu Jul 26 06:56:08 CEST 2018


Previously, when factoring the c110 with an imported polynomial, the problem
with the large special-q was solved by adding the allow_largesq=true option
at the end of the command. After that, sieving completed (100%) and the job
moved on to the filtering step (duplicate removal, and so on).
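
For reference, the invocation looked roughly like this (the parameter names
tasks.polyselect.import and tasks.sieve.allow_largesq are written from memory
and may not match exactly; treat this as a sketch, not the literal command):

./cado-nfs.py <110-digit number> \
    tasks.polyselect.import=c110.poly \
    tasks.sieve.allow_largesq=true   # allow special-q above the large prime bound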

However, it stopped halfway through the filtering step with the following
output:
Info:Lattice Sieving: We want 4528085 relation(s)
Info:Lattice Sieving: Reached target of 4528085 relations, now have 4528093
Info:Lattice Sieving: Aggregate statistics:
Info:Lattice Sieving: Total number of relations: 4528093
Info:Lattice Sieving: Average J: 3794.19 for 110295 special-q, max bucket
fill -bkmult 1.0,1s:1.448270
Info:Lattice Sieving: Total time: 240010s
Info:Filtering - Duplicate Removal, splitting pass: Starting
Info:Filtering - Duplicate Removal, splitting pass: No new files to split
Info:Filtering - Duplicate Removal, splitting pass: Relations per slice: 0:
2264054, 1: 2264039
Info:Filtering - Duplicate Removal, splitting pass: Total cpu/real time for
dup1: 58.14/607.607
Info:Filtering - Duplicate Removal, splitting pass: Aggregate statistics:
Info:Filtering - Duplicate Removal, splitting pass: CPU time for dup1:
605.6s
Info:Filtering - Duplicate Removal, removal pass: Starting
Warning:Command: Process with PID 46510 finished with return code -6
Error:Filtering - Duplicate Removal, removal pass: Program run on server
failed with exit code -6
Error:Filtering - Duplicate Removal, removal pass: Command line was:
/home/chunnie/Desktop/Math3/cado-nfs/build/ubuntu/filter/dup2 -nrels
2264054 -renumber /tmp/cado.ky9t_u79/c110.renumber.gz
/tmp/cado.ky9t_u79/c110.dup1//0/dup1.0.0000.gz >
/tmp/cado.ky9t_u79/c110.dup2.slice0.stdout.2 2>
/tmp/cado.ky9t_u79/c110.dup2.slice0.stderr.2
Error:Filtering - Duplicate Removal, removal pass: Stderr output follows
(stored in file /tmp/cado.ky9t_u79/c110.dup2.slice0.stderr.2):
antebuffer set to /home/chunnie/Desktop/Math3/cado-nfs/build/ubuntu/utils/antebuffer
[checking true duplicates on sample of 27170 cells]
Allocated hash table of 2716969 entries (10Mb)
Constructing the two filelists...
1 files (1 new and 0 already renumbered)
Reading files already renumbered:
Reading new files (using 4 auxiliary threads for roots mod p):
Fatal error in renumber_get_first_index_from_p at /home/chunnie/Desktop/Math3/cado-nfs/utils/renumber.c:981
Ideal (p, side) = (0xce83705, 0) is bigger that large prime bound 2^25
Fatal error in renumber_get_first_index_from_p at /home/chunnie/Desktop/Math3/cado-nfs/utils/renumber.c:981
Ideal (p, side) = (0x48f7f67, 0) is bigger that large prime bound 2^25
Fatal error in renumber_get_first_index_from_p at /home/chunnie/Desktop/Math3/cado-nfs/utils/renumber.c:981
Ideal (p, side) = (0x409c88f, 1) is bigger that large prime bound 2^26
Fatal error in renumber_get_first_index_from_p at /home/chunnie/Desktop/Math3/cado-nfs/utils/renumber.c:981
Ideal (p, side) = (0x409c88f, 1) is bigger that large prime bound 2^26
Traceback (most recent call last):
  File "./cado-nfs.py", line 122, in <module>
    factors = factorjob.run()
  File "./scripts/cadofactor/cadotask.py", line 5754, in run
    last_status, last_task = self.run_next_task()
  File "./scripts/cadofactor/cadotask.py", line 5829, in run_next_task
    return [task.run(), task.title]
  File "./scripts/cadofactor/cadotask.py", line 3560, in run
    raise Exception("Program failed")
Exception: Program failed
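
If it helps, I double-checked that the primes in the stderr really are above
the large prime bounds, using shell arithmetic on the values quoted above:

$ printf '%d\n' 0xce83705 0x48f7f67 0x409c88f
216545029
76513127
67750031
$ echo $((1<<25)) $((1<<26))
33554432 67108864

So the side-0 ideals exceed 2^25 and the side-1 ideal exceeds 2^26, exactly
as dup2 complains. My guess (unconfirmed) is that these oversized primes come
from the special-q above the large prime bound that allow_largesq permitted
during sieving.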

Thank you