How long a query takes to return results depends on how it is executed.
Obviously, the larger the dataset to be retrieved, the longer it takes to complete the query.
This is the first time I’ve heard of someone running a report and not getting a result after 12 hours.
How many GB of RAM does your installation have?
Most installations have a maximum limit set for the number of results that can be returned, to prevent the report from running unnecessarily without producing results or from saturating the system’s RAM.
My advice is to “split” the report into multiple reports by filtering, for example, by supplier groups or some other attribute relevant to your needs.
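To make the splitting idea concrete, here is a minimal sketch of running one smaller query per supplier group instead of a single query over everything. The table and column names (`parts`, `supplier_group`) are invented for the demo and are not the actual PLM schema; SQLite stands in for the real database.

```python
import sqlite3

# Hypothetical schema for illustration only; real table and column
# names will differ in your installation.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parts (part_number TEXT, supplier_group TEXT)")
conn.executemany(
    "INSERT INTO parts VALUES (?, ?)",
    [("P-100", "GROUP_A"), ("P-200", "GROUP_A"), ("P-300", "GROUP_B")],
)

# Instead of one report over all parts, run one smaller query per group.
groups = [g for (g,) in conn.execute(
    "SELECT DISTINCT supplier_group FROM parts ORDER BY supplier_group")]

results = {}
for group in groups:
    rows = conn.execute(
        "SELECT part_number FROM parts "
        "WHERE supplier_group = ? ORDER BY part_number",
        (group,),
    ).fetchall()
    # Each group's result set can be run, checked, and exported separately.
    results[group] = [r[0] for r in rows]
```

Each per-group run stays well under any result-count limit and can be retried on its own if it fails.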
Can you post your QML? Perhaps there is something inefficient in it. Your DBA should also be able to track long-running queries, likely including this one, and look into tuning it so that it uses an index instead of full table scans.
I have a QML that reports on Buy parts, part classifications, and the AML/AVL. It does not take too long to run (about 10K parts), but I have noticed the first execution fails or times out. If I repeat the query, the data may be cached and it succeeds.
For extremely large reports, I typically use a straight SQL call and SQLDeveloper or Toad for Oracle. They allow export straight to Excel for any analysis.
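As a rough illustration of the straight-SQL-to-spreadsheet route, this sketch runs a direct query and writes the result set as CSV, which Excel opens directly. SQLite and the `mfr_parts` table are stand-ins for a real Oracle connection and schema; SQLDeveloper and Toad do the equivalent export through their GUIs.

```python
import csv
import io
import sqlite3

# Invented demo table; substitute your real connection and query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mfr_parts (part_number TEXT, manufacturer TEXT)")
conn.executemany("INSERT INTO mfr_parts VALUES (?, ?)",
                 [("P-100", "Acme"), ("P-200", "Globex")])

# Write to an in-memory buffer here; use
# open("report.csv", "w", newline="") for a real file.
buf = io.StringIO()
writer = csv.writer(buf)
cur = conn.execute(
    "SELECT part_number, manufacturer FROM mfr_parts ORDER BY part_number")
writer.writerow([d[0] for d in cur.description])  # header row from cursor
writer.writerows(cur)                             # stream rows straight to CSV
```

Streaming rows straight from the cursor avoids holding the whole result set in application memory.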
This is almost certainly an issue with the report structure. Reports can run very quickly for large amounts of data if they're simple. Performance issues occur when the report is not simple.
Without seeing the report format directly, the general advice I can give is to avoid any "IF" logic. For example, this report will perform poorly at scale: return all manufacturer parts, and show "IF" each one has a particular type of document linked. You'd want to break that into 2 reports instead.
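One way the two-report split might look, sketched here against an invented schema (`mfr_parts`, `doc_links`, a `DATASHEET` document type) in SQLite: report 1 is a plain join for parts that have the document, report 2 is an anti-join for parts that don't, and neither needs per-row "IF" logic.

```python
import sqlite3

# Hypothetical tables invented for the demo, not the actual PLM data model.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE mfr_parts (part_number TEXT PRIMARY KEY);
CREATE TABLE doc_links (part_number TEXT, doc_type TEXT);
INSERT INTO mfr_parts VALUES ('P-100'), ('P-200'), ('P-300');
INSERT INTO doc_links VALUES ('P-100', 'DATASHEET');
""")

# Report 1: parts that DO have the document linked (simple join).
with_doc = [r[0] for r in conn.execute("""
    SELECT DISTINCT p.part_number
    FROM mfr_parts p
    JOIN doc_links d ON d.part_number = p.part_number
    WHERE d.doc_type = 'DATASHEET'
    ORDER BY p.part_number
""")]

# Report 2: parts that do NOT (anti-join via NOT EXISTS),
# instead of one report deciding "IF linked" per row.
without_doc = [r[0] for r in conn.execute("""
    SELECT p.part_number
    FROM mfr_parts p
    WHERE NOT EXISTS (
        SELECT 1 FROM doc_links d
        WHERE d.part_number = p.part_number
          AND d.doc_type = 'DATASHEET')
    ORDER BY p.part_number
""")]
```

Each report is a shape the database optimizer handles well (join or anti-join), rather than conditional logic evaluated row by row in the reporting layer.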
