Hi everyone,
I'm having an issue writing back to an on-premises Oracle database through the on-premises data gateway.
My use case is to trigger refreshes in Power BI via the REST API and write the status back to an on-premises Oracle data warehouse. This is up and running in Power Automate, but I now want to migrate all of these jobs into Data pipelines instead.
I can read from Oracle, but when I use a Copy activity to write from the API back to Oracle on-prem, it gets stuck in "Queued".
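As a debugging aid, it can help to confirm the source side works outside Fabric first. A minimal sketch in Python of the call the pipeline's RestSource issues (the endpoint shape follows the Power BI REST API; the group ID, dataset ID, and token are placeholders, not real values):

```python
# Sketch only: reproduces the RestSource request in plain Python so the
# source side can be verified independently of the pipeline.
# <groupid>, <datasetid>, and the bearer token are placeholders.

BASE = "https://api.powerbi.com/v1.0/myorg"

def refresh_history_url(group_id: str, dataset_id: str, top: int = 1) -> str:
    """Build the same relative URL the pipeline's RestResource uses."""
    return f"{BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes?$top={top}"

url = refresh_history_url("<groupid>", "<datasetid>")
print(url)

# With a valid access token, the call itself would look like:
# import requests
# r = requests.get(url, headers={"Authorization": f"Bearer {token}"})
# print(r.json()["value"][0]["status"])
```

If this call succeeds but the Copy activity still queues forever, the problem is likely on the gateway/sink side rather than the REST source.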
Here is the JSON for the pipeline:
{
  "name": "Write to Oracle",
  "objectId": "<objectId>",
  "properties": {
    "activities": [
      {
        "name": "Copy data1",
        "type": "Copy",
        "dependsOn": [],
        "policy": {
          "timeout": "0.12:00:00",
          "retry": 0,
          "retryIntervalInSeconds": 30,
          "secureOutput": false,
          "secureInput": false
        },
        "typeProperties": {
          "source": {
            "type": "RestSource",
            "httpRequestTimeout": "00:01:40",
            "requestInterval": "00.00:00:00.010",
            "requestMethod": "GET",
            "paginationRules": {
              "supportRFC5988": "true"
            },
            "datasetSettings": {
              "annotations": [],
              "type": "RestResource",
              "typeProperties": {
                "relativeUrl": "groups/<groupid>/datasets/<datasetid>/refreshes?$top=1"
              },
              "schema": [],
              "externalReferences": {
                "connection": "<connectionid>"
              }
            }
          },
          "sink": {
            "type": "OracleSink",
            "writeBatchSize": 10000,
            "datasetSettings": {
              "annotations": [],
              "type": "OracleTable",
              "schema": [],
              "typeProperties": {
                "schema": "<schemaname>",
                "table": "PBI_REFRESH_LOG_MSV"
              },
              "externalReferences": {
                "connection": "<connectionid>"
              }
            }
          },
          "enableStaging": false,
          "translator": {
            "type": "TabularTranslator",
            "mappings": [
              { "source": { "name": "value.requestId", "type": "String" }, "sink": { "name": "REQUESTID", "type": "String" } },
              { "source": { "name": "value.id", "type": "Int64" }, "sink": { "name": "ID", "type": "String" } },
              { "source": { "name": "value.refreshType", "type": "String" }, "sink": { "name": "REFRESHTYPE", "type": "String" } },
              { "source": { "name": "value.startTime", "type": "String" }, "sink": { "name": "STARTTIME", "type": "String" } },
              { "source": { "name": "value.endTime", "type": "String" }, "sink": { "name": "ENDTIME", "type": "String" } },
              { "source": { "name": "value.status", "type": "String" }, "sink": { "name": "STATUS", "type": "String" } },
              { "source": { "name": "value.extendedStatus", "type": "String" }, "sink": { "name": "EXTENDEDSTATUS", "type": "String" } }
            ],
            "typeConversion": true,
            "typeConversionSettings": {
              "allowDataTruncation": true,
              "treatBooleanAsNumber": false
            },
            "columnFlattenSettings": {
              "treatArrayAsString": false,
              "treatStructAsString": false,
              "flattenColumnDelimiter": "."
            }
          }
        }
      }
    ],
    "lastModifiedByObjectId": "<objectid>",
    "lastPublishTime": "2024-07-01T13:51:38Z"
  }
}
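For anyone checking whether the translator's column names are plausible: the mapping expects the API response flattened with "." as the delimiter (names like "value.requestId"). A small Python sketch of that flattening on a made-up sample payload (field values are invented for illustration, mirroring the refresh-history response shape):

```python
# Sketch only: illustrates the dot-delimited flattening the TabularTranslator
# mapping relies on ("flattenColumnDelimiter": "."). The sample values below
# are hypothetical, not real refresh data.

def flatten(record, prefix="", delimiter="."):
    """Flatten nested dicts into dot-delimited keys, e.g. value.requestId."""
    flat = {}
    for key, val in record.items():
        name = f"{prefix}{delimiter}{key}" if prefix else key
        if isinstance(val, dict):
            flat.update(flatten(val, name, delimiter))
        else:
            flat[name] = val
    return flat

# One record shaped like an element of the API's "value" array.
sample = {
    "value": {
        "requestId": "00000000-0000-0000-0000-000000000000",
        "id": 123456,
        "refreshType": "ViaApi",
        "startTime": "2024-07-01T13:00:00Z",
        "endTime": "2024-07-01T13:05:00Z",
        "status": "Completed",
    }
}

row = flatten(sample)
print(row["value.requestId"], row["value.status"])
```

If the flattened names line up like this, the mapping itself is unlikely to be what keeps the activity in "Queued"; a run stuck in that state usually points at the gateway or connection rather than the translator.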
I'd really appreciate any help, as I've struggled with this for several days.
Br
Marius 🙂
Hi,
Still hoping someone has done this and can tell me how 🙂
Marius
Has anyone got this working: writing back to an Oracle database through the Power BI Premium gateway? 🙂
Br
Marius
Hi @mariussve1 ,
The new connectors supported by the Copy activity still seem to be a preview feature, and that includes Oracle data sources.
Refer to the document below:
What's new and planned for Data Factory in Microsoft Fabric - Microsoft Fabric | Microsoft Learn
Best Regards,
Adamk Kong
If this post helps, then please consider accepting it as the solution so that other members can find it more quickly.
Hi @Anonymous and thank you for the answer 🙂
Even if the connector is in preview, shouldn't it still work? Or am I missing something here?
Marius