

This article describes External support in Guzzle and the external frameworks and tools that Guzzle integrates with.

Guzzle's External activity supports the following datastores:

  1. Databricks

  2. JDBC

  3. ADF

Guzzle provides the following properties for the External activity:

| Property | Description | Default Value | Required |
| --- | --- | --- | --- |
| Datastore | Choose any of the available datastores from the drop-down (as appropriate). If the connection is not available in the drop-down, you can create a new datastore. | None | Yes |
| Script (only applicable for JDBC) | Specifies the script to run as an external task. | None | Yes |
| Run Name | Specifies a name to identify the external run. | None | No |
| Timeout Seconds | Specifies how long Guzzle waits when connecting to the external tool before timing out. | None | No |
| Task | Specifies the external task type. Guzzle supports: 1) Notebook, 2) Spark Jar, 3) PySpark. | None | No |
| Notebook Path (only applicable if Task is Notebook) | Specifies the full path of the Databricks notebook. | None | Yes |
| Main Class (only applicable if Task is Spark Jar) | Specifies the main class name of the Spark JAR file. | None | Yes |
| Script Path (only applicable if Task is PySpark) | Specifies the path of the external script file. | None | Yes |
| Revision Timestamp | | None | No |
| Parameters | Specifies name/value parameter pairs that Guzzle passes when running the external library. | None | No |
| Libraries | Specifies external libraries. | None | No |
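To make the properties above concrete, the sketch below shows what a Notebook-type External run could look like as a request payload. The field names follow the Databricks Jobs `runs/submit` API; how Guzzle actually maps its properties to such a payload is an assumption for illustration, not something this article confirms.

```python
# Hypothetical payload for an External activity with Task = Notebook.
# Field names are modeled on the Databricks Jobs runs/submit API; the
# mapping from Guzzle's properties to these fields is an assumption.
external_run = {
    "run_name": "nightly-external-run",           # Run Name (optional)
    "timeout_seconds": 3600,                      # Timeout Seconds (optional)
    "notebook_task": {
        "notebook_path": "/Shared/etl/cleanup",   # Notebook Path (required for Notebook tasks)
        "base_parameters": {"env": "prod"},       # Parameters (name/value pairs)
    },
    "libraries": [                                # Libraries (optional)
        {"pypi": {"package": "requests"}},
    ],
}

print(sorted(external_run))
```

For a Spark Jar or PySpark task, the `notebook_task` block would be replaced by the corresponding task block carrying Main Class or Script Path.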

Libraries:

A Guzzle user can add a new library by clicking Click to add library. Guzzle supports the following library configs:

  1. DBFS

  2. PyPi

  3. Maven

  4. CRAN
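One entry per supported config type can be sketched as follows. The shapes below follow the Databricks REST API library specification; whether Guzzle uses the same shapes internally is an assumption, and the paths, package names, and coordinates are placeholders.

```python
# Hypothetical library entries, one per config type listed above.
# Shapes follow the Databricks library specification; the specific
# artifacts named here are illustrative placeholders only.
libraries = [
    {"jar": "dbfs:/FileStore/jars/my-udfs.jar"},                          # DBFS
    {"pypi": {"package": "simplejson==3.18.0"}},                          # PyPI
    {"maven": {"coordinates": "com.databricks:spark-xml_2.12:0.15.0"}},   # Maven
    {"cran": {"package": "ggplot2"}},                                     # CRAN
]

# Each entry is keyed by its library type.
print([next(iter(lib)) for lib in libraries])
```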

Dependency:

Guzzle tracks dependencies between lineages: when a user runs a job under a pipeline, Guzzle prepares a dependency stack and follows it.

Users can specify source and target dependencies by clicking on the Dependency tab.

The user has to specify:

Endpoints: select the dependent endpoint from the drop-downs.

Property: auto-populated based on the selected endpoints.

Value: specify the dependent value; it may be a table, a file, etc.
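The "dependency stack" behavior described above can be sketched as a topological sort over source/target pairs. The function below is an illustrative assumption (Kahn's algorithm), not Guzzle's actual implementation; the activity names are placeholders.

```python
from collections import defaultdict, deque

def build_dependency_stack(edges):
    """Order activities so every source runs before its target.

    `edges` is a list of (source, target) pairs. This is a sketch of how a
    pipeline engine could resolve dependencies, using Kahn's algorithm.
    """
    indegree = defaultdict(int)
    children = defaultdict(list)
    nodes = set()
    for src, dst in edges:
        nodes.update((src, dst))
        children[src].append(dst)
        indegree[dst] += 1
    # Start from activities that depend on nothing.
    queue = deque(sorted(n for n in nodes if indegree[n] == 0))
    order = []
    while queue:
        node = queue.popleft()
        order.append(node)
        for child in children[node]:
            indegree[child] -= 1
            if indegree[child] == 0:
                queue.append(child)
    if len(order) != len(nodes):
        raise ValueError("cyclic dependency detected")
    return order

# e.g. ingest must finish before transform, transform before the external job
print(build_dependency_stack([("ingest", "transform"), ("transform", "external_job")]))
# → ['ingest', 'transform', 'external_job']
```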