When loading a JSON record into BigQuery via a load job, we see the following error:

BigQuery error in load operation: Error processing job 'job_d727bf8944884b20b709ded2887b7a13'. Failure details for the record:
- Unexpected. Please try again
- Row larger than the maximum allowed size
The record in question is well-formed JSON and well under the maximum allowed size. It conforms to the table's schema, which includes a nested record, and neither the data nor the schema has changed recently, so it is unclear why this error suddenly appeared.
Update:
We usually run the load operation with max_bad_records set so that the occasional malformed record that may exist doesn't fail the whole job. I tried loading the same file again, and this time it succeeded without any error, both with and without max_bad_records specified:
bq load --max_bad_records 20 --source_format NEWLINE_DELIMITED_JSON telemetry_data_2013_06_20 "gs://.../2013-06-12-01/ip-10-144-3-198.log"
The job IDs of the successful loads are:
job_5822a36c5c364117a6651f3e8b81b49f
job_ed4080f9f60c485bb265c09367902f00
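For reference, the same kind of load can also be issued programmatically. The following is a minimal sketch using the google-cloud-bigquery Python client; the dataset name and GCS URI are placeholders rather than our actual values, and the settings mirror the bq command above.

# Minimal sketch (assumed setup): the same load issued through the
# google-cloud-bigquery Python client instead of the bq CLI.
# The dataset name and GCS URI below are placeholders, not our real values.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    max_bad_records=20,  # skip up to 20 malformed rows, as in the bq command above
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/2013-06-12-01/ip-10-144-3-198.log",  # placeholder URI
    "example_dataset.telemetry_data_2013_06_20",               # placeholder dataset
    job_config=job_config,
)
load_job.result()  # block until the job finishes; raises on failure
print("Loaded", load_job.output_rows, "rows; job id:", load_job.job_id)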
Any idea why the load failed the first time but succeeded when we retried the same file?
According to the logs we have, it appears that your load job failed without any extra data about the cause.
It is possible that you are hitting a problem in BigQuery where, in some cases, a record size limit of 2 MB is applied instead of the 16 MB limit. Can you confirm that the rows for which the "Row larger than the maximum allowed size" error was returned are below 2 MB?
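One way to check this would be to scan a local copy of the newline-delimited JSON file and report any lines at or above 2 MB; a minimal sketch follows (the file path is a placeholder for a copy of the failing log downloaded from GCS):

# Minimal sketch: report any lines in a newline-delimited JSON file that are
# 2 MB or larger, to see whether any record actually crosses the lower limit.
# The path below is a placeholder for a local copy of the failing log file.
SIZE_LIMIT = 2 * 1024 * 1024  # 2 MB

with open("ip-10-144-3-198.log", "rb") as f:
    for line_no, line in enumerate(f, start=1):
        if len(line) >= SIZE_LIMIT:
            print("line %d: %d bytes (>= %d)" % (line_no, len(line), SIZE_LIMIT))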