
BigQuery Destination - Bulk Insert

Posted: Tue 17 Nov 2020 09:51
by GAReporting
Good morning,

I'm having some trouble setting up a BigQuery Destination using the Bulk Insert method. The process works if I turn Bulk Insert off (so the connection itself seems to be OK), but it's very slow.

The error message I get is pasted below.

Code:

Error: 0x0 at Upload Instruction Data, Devart BigQuery Destination: An exception has occurred during insert operation, the message returned from the provider is:
Insufficient Permission
Error: 0xC0047062 at Upload Instruction Data, Devart BigQuery Destination [2]: System.Exception: An exception has occurred during insert operation, the message returned from the provider is:
Insufficient Permission ---> Devart.Data.BigQuery.BigQueryException: Insufficient Permission
   at Devart.Data.t.a[a](Boolean A_0, Int32 A_1, Func`1 A_2)
   at Devart.Data.BigQuery.ai.a(String A_0)
   at Devart.Data.BigQuery.c.g9()
   at Devart.SSIS.SqlShim.DestinationComponent.ProcessInput(Int32 inputID, PipelineBuffer buffer)
   --- End of inner exception stack trace ---
   at Devart.SSIS.SqlShim.DestinationComponent.ProcessInput(Int32 inputID, PipelineBuffer buffer)
   at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper100 wrapper, Int32 inputID, IDTSBuffer100 pDTSBuffer, IntPtr bufferWirePacket)
Error: 0xC0047022 at Upload Instruction Data, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "Devart BigQuery Destination" (2) failed with error code 0x80131500 while processing input "input" (11). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.
Are there any logs created locally that would give more detailed information? I've had a look around the Google console and couldn't find any logs that shed any light on this, but I'm not an expert on the Google stack.

Any info / suggestions would be appreciated.

Re: BigQuery Destination - Bulk Insert

Posted: Wed 18 Nov 2020 19:03
by Shalex
GAReporting wrote: Tue 17 Nov 2020 09:51

Code:

Devart.Data.BigQuery.BigQueryException: Insufficient Permission
   at Devart.Data.t.a[a](Boolean A_0, Int32 A_1, Func`1 A_2)
   at Devart.Data.BigQuery.ai.a(String A_0)
   at Devart.Data.BigQuery.c.g9()
   at Devart.SSIS.SqlShim.DestinationComponent.ProcessInput(Int32 inputID, PipelineBuffer buffer)
The error occurs when our component talks to the Cloud Storage bucket. Make sure that you have permissions to work with it. Refer to https://stackoverflow.com/questions/272 ... ssion-from.
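
If it helps to narrow things down, you can verify the bucket permissions outside of SSIS with a small standalone check. The Python sketch below is only illustrative (the bucket name and key file path are placeholders, and it is not part of the Devart component): it tries to write and delete a test object with the same service account the connection is configured with, so a 403 here points to missing Cloud Storage permissions rather than an SSIS problem.

Code:

# Illustrative permission check; bucket name and key path are placeholders.
# Requires: pip install google-cloud-storage
from google.cloud import storage

# Use the same service account key that the Devart connection uses.
client = storage.Client.from_service_account_json("path/to/service-account.json")

bucket = client.bucket("your-staging-bucket")    # the bucket used for bulk uploads
blob = bucket.blob("permission-check.txt")

blob.upload_from_string("test")  # 403 here means storage.objects.create is missing
blob.delete()                    # 403 here means storage.objects.delete is missing
print("Bucket write/delete permissions look OK")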

Devart BigQuery Destination with BulkInsert=True loads data in the following way to achieve the highest performance:
* First, it writes data to a CSV file until the file reaches the size specified in the UploadBatchSize property. It then uploads the CSV file to Google Cloud Storage and starts writing data to a new CSV file.
* Once the total size of the uploaded files reaches the limit specified in the BatchSize property, Devart BigQuery Destination tells Google BigQuery to import the data from these CSV files. After the import, the uploaded files are deleted. A rough sketch of this flow is shown below.
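
To make the flow above more concrete, here is a rough Python sketch of the same staged-load pattern: write a CSV batch, upload it to Cloud Storage, run a BigQuery load job over the staged file(s), then delete them. It is not Devart's implementation; the bucket, table, and object names are placeholders.

Code:

# Rough sketch of the staged bulk-load pattern described above (not Devart's code).
# Requires: pip install google-cloud-storage google-cloud-bigquery
from google.cloud import bigquery, storage

BUCKET = "your-staging-bucket"                    # placeholder
TABLE = "your-project.your_dataset.your_table"    # placeholder

gcs = storage.Client()
bq = bigquery.Client()

# 1. Write a batch of rows as CSV and upload it to Cloud Storage.
csv_batch = "id,name\n1,alpha\n2,beta\n"
blob = gcs.bucket(BUCKET).blob("staging/batch_0001.csv")
blob.upload_from_string(csv_batch, content_type="text/csv")

# 2. Ask BigQuery to import the staged CSV file(s) in a single load job.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
)
load_job = bq.load_table_from_uri(
    f"gs://{BUCKET}/staging/batch_0001.csv", TABLE, job_config=job_config
)
load_job.result()  # wait for the import to complete

# 3. Remove the staged file once the import has finished.
blob.delete()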

Devart BigQuery Destination with BulkInsert=False uses regular INSERT statements to insert data.
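
For comparison, inserting via statements means each batch of rows goes through SQL rather than a load job, which is why that mode is much slower. A minimal illustration with the Python client (the table name is a placeholder; the Devart provider issues its own statements internally):

Code:

# Illustrative only: inserting rows via a DML statement instead of a load job.
# Requires: pip install google-cloud-bigquery
from google.cloud import bigquery

bq = bigquery.Client()
query = """
    INSERT INTO `your-project.your_dataset.your_table` (id, name)
    VALUES (1, 'alpha'), (2, 'beta')
"""
bq.query(query).result()  # each statement runs as a separate query job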
GAReporting wrote: Tue 17 Nov 2020 09:51
Are there any logs created locally that would give more detailed information?
BulkInsert doesn't create any logs; that's why it is fast.