Problems downloading

Problems downloading

Hi, I work for WWBIC, the LERC for West Wales. Since the new year I have tried downloading iRecord general records three or four times, and each time I have had an error message. I am using the LRC download (current format). The error is below.

Can anyone help?


{"error":"There was an SQL error: ERROR: canceling statement due to statement timeout -
SELECT distinct on ( as "occurrence_id",
'iBRC' || as "recordkey",
o.external_key as "external_key",
snf.website_title || ' | ' || snf.survey_title || coalesce(' | ' || snf.group_title, '') as "source",
case when onf.sensitivity_precision is null then o.sample_id else null end as "sample_id",
cttl.preferred_taxon as "taxon",
cttl.default_common_name as "common",
cttl.taxon_group as "taxon_group",
cttl.kingdom_taxon as "kingdom",
cttl.order_taxon as "order",
cttl.family_taxon as "family",
o.taxa_taxon_list_external_key as "taxonversionkey",
o.taxa_taxon_list_id as "taxa_taxon_list_id",
case when onf.sensitivity_precision is null and onf.privacy_precision is null then o.location_name else 'Sensitive. Lat long is approximate.' end as "location_name",
snf.public_entered_sref as "entered_sref",
substring(st_astext(st_transform(st_centroid(o.public_geom), 4326)) from E'POINT\\(.+ (.+)\\)') as "lat",
substring(st_astext(st_transform(st_centroid(o.public_geom), 4326)) from E'POINT\\((.+) ') as "long",
case snf.entered_sref_system when '4326' then 'WGS84' when '27700' then 'OSGB36' else upper(snf.entered_sref_system) end as "projection",
get_sref_precision(snf.public_entered_sref, snf.entered_sref_system, snf.attr_sref_precision) as "precision",
onf.output_sref as "output_sref",
case onf.output_sref_system when '4326' then 'WGS84' when '27700' then 'OSGB36' else upper(onf.output_sref_system) end as "output_sref_projection",
snf.attr_biotope as "attr_biotope",
(SELECT string_agg(vc.code, '; ') FROM locations vc WHERE vc.location_type_id=15 AND NOT LIKE '%+%' AND = ANY(o.location_ids)) as "vicecounty_number",
(SELECT string_agg(, '; ') FROM locations vc WHERE vc.location_type_id=15 AND NOT LIKE '%+%' AND = ANY(o.location_ids)) as "vicecounty",
null as date,
o.date_start as "date_start",
o.date_end as "date_end",
o.date_type as "date_type",
snf.attr_sample_method as "attr_sample_method",
rtrim(snf.recorders, ', ') as "recorder",
CASE WHEN onf.attr_det_full_name IS NULL THEN CASE WHEN onf.attr_det_last_name IS NULL THEN NULL ELSE onf.attr_det_last_name || COALESCE(', ' || onf.attr_det_first_name, '') END ELSE onf.attr_det_full_name END as "determiner",
CASE o.certainty WHEN 'C' THEN 'Certain' WHEN 'L' THEN 'Likely' WHEN 'U' THEN 'Uncertain' END as "certainty",
onf.attr_sex as "attr_sex",
onf.attr_stage as "attr_stage",
onf.attr_sex_stage_count as "attr_sex_stage_count",
upper(cast (o.zero_abundance as character)) as "zeroabundance",
onf.comment as "record_comment",
snf.comment as "sample_comment",
case when is null then null else 'http:///upload/' || replace(, ',', ', http:///upload/') end as "images",
o.created_on as "input_date",
o.updated_on as "last_edit_date",
CASE o.record_status WHEN 'V' THEN 'Accepted' WHEN 'C' THEN 'Unconfirmed' WHEN 'R' THEN 'Rejected' WHEN 'I' THEN 'Input still in progress' WHEN 'D' THEN 'Queried' WHEN 'S' THEN 'Awaiting check' ELSE o.record_status END as "record_status",
CASE o.record_status WHEN 'V' THEN CASE o.record_substatus WHEN 1 THEN 'correct' WHEN 2 THEN 'Considered correct' ELSE NULL END WHEN 'C' THEN CASE o.record_substatus WHEN 3 THEN 'Plausible' ELSE 'Not reviewed' END WHEN 'R' THEN CASE o.record_substatus WHEN 4 THEN 'Unable to verify' WHEN 5 THEN 'Incorrect' ELSE NULL END ELSE NULL END as "record_substatus",
case o.query when 'A' then 'Answered' when 'Q' then 'Queried' end as "query",
onf.verifier as "verifier",
o.verified_on as "verified_on",
onf.licence_code as "licence_code",
CASE WHEN o.data_cleaner_result='t' THEN 'pass' WHEN not o.verification_checks_enabled THEN 'checks disabled' WHEN o.data_cleaner_result IS NULL THEN 'pending' ELSE onf.data_cleaner_info END as "autochecks",
CASE CAST(o.created_by_id AS character varying) WHEN '#currentUser#' THEN true ELSE false END as "belongs_to_user",
as attr_id_sample_22,
sample22.text_value as attr_sample_22,
as attr_id_sample_36,
sample36.text_value as attr_sample_36,
as attr_id_sample_58,
sample58.text_value as attr_sample_58,
as attr_id_sample_127,
sample127.text_value as attr_sample_127,
as attr_id_sample_209,
sample209.int_value as attr_sample_209,
ltt209.term as attr_sample_term_209,
as attr_id_occurrence_18,
occurrence18.text_value as attr_occurrence_18,
as attr_id_occurrence_54,
occurrence54.int_value as attr_occurrence_54,
ltt54.term as attr_occurrence_term_54,
as attr_id_occurrence_93,
occurrence93.text_value as attr_occurrence_93,
as attr_id_occurrence_105,
occurrence105.int_value as attr_occurrence_105,
ltt105.term as attr_occurrence_term_105,
as attr_id_occurrence_106,
occurrence106.int_value as attr_occurrence_106,
ltt106.term as attr_occurrence_term_106
FROM cache_occurrences_functional o
JOIN cache_occurrences_nonfunctional onf on
JOIN cache_samples_nonfunctional snf on
JOIN cache_taxa_taxon_lists cttl on
JOIN websites w on and w.deleted=false
JOIN system sys ON
LEFT JOIN sample_attribute_values sample22 ON sample22.sample_id=o.sample_id AND sample22.sample_attribute_id=22 AND sample22.deleted=false
LEFT JOIN sample_attribute_values sample36 ON sample36.sample_id=o.sample_id AND sample36.sample_attribute_id=36 AND sample36.deleted=false
LEFT JOIN sample_attribute_values sample58 ON sample58.sample_id=o.sample_id AND sample58.sample_attribute_id=58 AND sample58.deleted=false
LEFT JOIN sample_attribute_values sample127 ON sample127.sample_id=o.sample_id AND sample127.sample_attribute_id=127 AND sample127.deleted=false
LEFT JOIN sample_attribute_values sample209 ON sample209.sample_id=o.sample_id AND sample209.sample_attribute_id=209 AND sample209.deleted=false
LEFT JOIN cache_termlists_terms ltt209 ON
LEFT JOIN occurrence_attribute_values occurrence18 ON AND occurrence18.occurrence_attribute_id=18 AND occurrence18.deleted=false
LEFT JOIN occurrence_attribute_values occurrence54 ON AND occurrence54.occurrence_attribute_id=54 AND occurrence54.deleted=false
LEFT JOIN cache_termlists_terms ltt54 ON
LEFT JOIN occurrence_attribute_values occurrence93 ON AND occurrence93.occurrence_attribute_id=93 AND occurrence93.deleted=false
LEFT JOIN occurrence_attribute_values occurrence105 ON AND occurrence105.occurrence_attribute_id=105 AND occurrence105.deleted=false
LEFT JOIN cache_termlists_terms ltt105 ON
LEFT JOIN occurrence_attribute_values occurrence106 ON AND occurrence106.occurrence_attribute_id=106 AND occurrence106.deleted=false
LEFT JOIN cache_termlists_terms ltt106 ON
WHERE AND (o.website_id in (23) OR o.created_by_id=1 OR o.blocked_sharing_tasks IS NULL OR NOT o.blocked_sharing_tasks @> ARRAY['data_flow'::character ])
AND o.release_status='R'
AND o.confidential='f'
AND o.survey_id in (42)
AND ('2010-01-01'='Click here' OR o.date_end >= CAST(COALESCE('2010-01-01','1500-01-01') as date))
AND ('2010-06-01'='Click here' OR o.date_start <= CAST(COALESCE('2010-06-01','1500-01-01') as date))
AND o.location_ids && ARRAY[1465]
ORDER BY DESC LIMIT 30001",
"code":44,"file":"D:\websites\warehouse\system\libraries\drivers\Database\Pgsql.php","line":342,"trace":[]}

You're getting error messages

You're getting error messages? I've been trying all day to download verified records, and whatever date range I try, I get "Contacting iRecord" at the bottom of the browser for a few minutes, which then goes away with nothing downloaded. No error message; it just silently fails.

Edit: and today it's working again.


Thanks for reporting

Thanks for reporting. We are aware of problems with large downloads and are implementing a fix, which should be available within the next month. In the meantime we suggest using filters (e.g. by date) to download in smaller chunks.
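The workaround above amounts to splitting a long date range into shorter windows and running one download per window, so each query stays under the database's statement timeout. A minimal sketch of that splitting logic (the `date_chunks` helper and the 90-day window size are illustrative assumptions, not part of iRecord):

```python
from datetime import date, timedelta

def date_chunks(start, end, days=90):
    """Split the inclusive range [start, end] into consecutive windows
    of at most `days` days, one smaller download per window."""
    chunks = []
    cursor = start
    while cursor <= end:
        window_end = min(cursor + timedelta(days=days - 1), end)
        chunks.append((cursor, window_end))
        cursor = window_end + timedelta(days=1)
    return chunks

# Example: break 2010-01-01..2010-06-01 (the range in the failing query)
# into roughly quarterly windows.
for first, last in date_chunks(date(2010, 1, 1), date(2010, 6, 1)):
    print(first.isoformat(), "->", last.isoformat())
# 2010-01-01 -> 2010-03-31
# 2010-04-01 -> 2010-06-01
```

Each pair can then be entered as the from/to date filter for a separate download.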

