Hello everybody.

I have a job to automate that consists of generating multiple pieces of output from a single data file, with maximum ease of use for the end user. It uses CSV emulation with the following data headers:
Code:
PDF file path, Quantity, Initial code prefix, Initial code numerical value, Initial code suffix
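For illustration, a single record under those headers might look like the following (the path and code values here are hypothetical, not taken from my actual data):

```csv
C:\Jobs\background.pdf,3,ABC,000123,XZ
```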


I am using an overlay page with the following PressTalk code on the run page:
Code:
define(&LoopCount,integer,0)
define(&LoopPDF,integer,0)
define(&PDF,string,trim(@(1,1,500))) % PDF file path
define(&NoPages,integer,PdfPageCount(&PDF)) % number of pages in the PDF
define(&PDFCount,integer,strtoint(trim(@(2,1,7)))) % number of copies to create
define(&Counter,integer,length(inttostr(&PDFCount))) % digit count of the quantity, used in further processing outside the template
define(&TracerPre,string,trim(@(3,1,500))) % code prefix
define(&TracerNum,string,trim(@(4,1,500))) % code numeric part
define(&TracerNumLen,integer,length(trim(@(4,1,500)))) % numeric part character count, to keep the code length consistent
define(&TracerSuf,string,trim(@(5,1,500))) % code suffix

for(&LoopPDF,1,1,&PDFCount) % loop through copies of the document
   for(&LoopCount,1,1,&NoPages) % loop through the PDF pages
      &Tracer:=&TracerPre+&TracerNum+&TracerSuf
      &PDFPage:=&LoopCount
      $Page1
      showpage()
   endfor()
   %&TracerNum:=right('00000000000000000000'+inttostr(strtoint(&TracerNum)+1),&TracerNumLen) % increment the numeric part and re-pad it to the original length (currently commented out)
endfor()
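For clarity, the commented-out increment line performs a pad-and-truncate on the counter string. The equivalent logic, sketched here in Python rather than PressTalk (the function name `next_tracer_num` is mine, for illustration only), is:

```python
def next_tracer_num(tracer_num: str) -> str:
    """Increment the numeric part of a code and re-pad it to its original width.

    Mirrors the PressTalk line:
    &TracerNum := right('00000000000000000000' + inttostr(strtoint(&TracerNum) + 1), &TracerNumLen)
    """
    width = len(tracer_num)          # &TracerNumLen: original character count
    padded = "0" * 20 + str(int(tracer_num) + 1)
    return padded[-width:]           # right(s, n) keeps the last n characters
```

Note that, like the PressTalk original, this wraps around when the incremented value no longer fits in the original width (e.g. "999" becomes "000").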


Run on any machine, it is PAINFULLY slow at any number of records: processing a file with a quantity of 1 takes minutes, and the intended use is creating thousands of records. When the template is imported into Workflow and fed an actual data file (the same one used as sample data), no usable output is generated: the output file is only a few KB in size, and attempting to open it produces a "file corrupt" error message. The background image PDF is a little over 1.5 MB.

How can I improve performance and make sure I get the output I need? Thanks in advance.


Edited by puszczyk (07/15/16 05:50 AM)