
Looker

There are two sources that provide integration with Looker:


looker

This plugin extracts the following:

  • Looker dashboards, dashboard elements (charts) and explores
  • Names, descriptions, URLs, chart types, input explores for the charts
  • Schemas and input views for explores
  • Owners of dashboards
note

To get complete Looker metadata integration (including Looker views and lineage to the underlying warehouse tables), you must ALSO use the lookml module.

Read more in the Module looker section below.

lookml

This plugin extracts the following:

  • LookML views from model files in a project
  • Name, upstream table names, metadata for dimensions, measures, and dimension groups attached as tags
  • If API integration is enabled (recommended), resolves table and view names by calling the Looker API, otherwise supports offline resolution of these names.
note

To get complete Looker metadata integration (including Looker dashboards and charts, and lineage to the underlying Looker views), you must ALSO use the looker source module.

Read more in the Module lookml section below.

Module looker

Certified

Important Capabilities

| Capability | Status / Notes |
|---|---|
| Dataset Usage | Enabled by default, configured using extract_usage_history |
| Descriptions | Enabled by default |
| Extract Ownership | Enabled by default, configured using extract_owners |
| Platform Instance | Not supported |

This plugin extracts the following:

  • Looker dashboards, dashboard elements (charts) and explores
  • Names, descriptions, URLs, chart types, input explores for the charts
  • Schemas and input views for explores
  • Owners of dashboards
note

To get complete Looker metadata integration (including Looker views and lineage to the underlying warehouse tables), you must ALSO use the lookml module.

Prerequisites

Set up the right permissions

You need to provide the following permissions for ingestion to work correctly.

access_data
explore
manage_models
see_datagroups
see_lookml
see_lookml_dashboards
see_looks
see_pdts
see_queries
see_schedules
see_sql
see_system_activity
see_user_dashboards
see_users

Here is an example permission set after configuration (image: Looker DataHub Permission Set).

Get an API key

You need to get an API key for the account with the above privileges to perform ingestion. See the Looker authentication docs for the steps to create a client ID and secret.
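
To sanity-check the key before running ingestion, you can request an API token directly from Looker's login endpoint (a minimal sketch; replace the hostname with your own instance):

curl -s -X POST "https://<company>.cloud.looker.com/api/4.0/login" \
  -d "client_id=${LOOKER_CLIENT_ID}" \
  -d "client_secret=${LOOKER_CLIENT_SECRET}"

A JSON response containing an access_token indicates the credentials are valid.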

Ingestion through UI

The following video shows you how to get started with ingesting Looker metadata through the UI.

note

You will need to run lookml ingestion through the CLI after you have ingested Looker metadata through the UI. Otherwise you will not be able to see Looker Views and their lineage to your warehouse tables.

CLI based Ingestion

Install the Plugin

pip install 'acryl-datahub[looker]'
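
To confirm the plugin was registered, recent versions of the DataHub CLI can list installed plugins (a quick sanity check; the exact output format may vary by version):

datahub check plugins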

Starter Recipe

Check out the following recipe to get started with ingestion! See below for full configuration options.

For general pointers on writing and running a recipe, see our main recipe guide.

source:
  type: "looker"
  config:
    # Coordinates
    base_url: "https://<company>.cloud.looker.com"

    # Credentials
    client_id: ${LOOKER_CLIENT_ID}
    client_secret: ${LOOKER_CLIENT_SECRET}

# sink configs

Config Details

Note that a . is used to denote nested fields in the YAML recipe.

| Field [Required] | Type | Description | Default |
|---|---|---|---|
| actor [✅] | string | This config is deprecated in favor of extract_owners. Previously, was the actor to use in ownership properties of ingested metadata. | None |
| base_url [✅] | string | Url to your Looker instance: https://company.looker.com:19999 or https://looker.company.com, or similar. Used for making API calls to Looker and constructing clickable dashboard and chart urls. | None |
| client_id [✅] | string | Looker API client id. | None |
| client_secret [✅] | string | Looker API client secret. | None |
| external_base_url [✅] | string | Optional URL to use when constructing external URLs to Looker if the base_url is not the correct one to use. For example, https://looker-public.company.com. If not provided, the external base URL will default to base_url. | None |
| extract_column_level_lineage [✅] | boolean | When enabled, extracts column-level lineage from Views and Explores | True |
| extract_embed_urls [✅] | boolean | Produce URLs used to render Looker Explores as Previews inside of DataHub UI. Embeds must be enabled inside of Looker to use this feature. | True |
| extract_owners [✅] | boolean | When enabled, extracts ownership from Looker directly. When disabled, ownership is left empty for dashboards and charts. | True |
| extract_usage_history [✅] | boolean | Whether to ingest usage statistics for dashboards. Setting this to True will query looker system activity explores to fetch historical dashboard usage. | True |
| extract_usage_history_for_interval [✅] | string | Used only if extract_usage_history is set to True. Interval to extract looker dashboard usage history for. See https://docs.looker.com/reference/filter-expressions#date_and_time. | 30 days |
| include_deleted [✅] | boolean | Whether to include deleted dashboards. | None |
| max_threads [✅] | integer | Max parallelism for Looker API calls. Defaults to cpuCount or 40 | 4 |
| platform_instance [✅] | string | The instance of the platform that all assets produced by this recipe belong to | None |
| platform_name [✅] | string | Default platform name. Don't change. | looker |
| skip_personal_folders [✅] | boolean | Whether to skip ingestion of dashboards in personal folders. Setting this to True will only ingest dashboards in the Shared folder space. | None |
| strip_user_ids_from_email [✅] | boolean | When enabled, converts Looker user emails of the form name@domain.com to urn:li:corpuser:name when assigning ownership | None |
| tag_measures_and_dimensions [✅] | boolean | When enabled, attaches tags to measures, dimensions and dimension groups to make them more discoverable. When disabled, adds this information to the description of the column. | True |
| env [✅] | string | The environment that all assets produced by this connector belong to | PROD |
| chart_pattern [✅] | AllowDenyPattern | Patterns for selecting chart ids that are to be included | {'allow': ['.*'], 'deny': [], 'ignoreCase': True} |
| chart_pattern.allow [❓ (required if chart_pattern is set)] | array(string) | | None |
| chart_pattern.deny [❓ (required if chart_pattern is set)] | array(string) | | None |
| chart_pattern.ignoreCase [❓ (required if chart_pattern is set)] | boolean | Whether to ignore case sensitivity during pattern matching. | True |
| dashboard_pattern [✅] | AllowDenyPattern | Patterns for selecting dashboard ids that are to be included | {'allow': ['.*'], 'deny': [], 'ignoreCase': True} |
| dashboard_pattern.allow [❓ (required if dashboard_pattern is set)] | array(string) | | None |
| dashboard_pattern.deny [❓ (required if dashboard_pattern is set)] | array(string) | | None |
| dashboard_pattern.ignoreCase [❓ (required if dashboard_pattern is set)] | boolean | Whether to ignore case sensitivity during pattern matching. | True |
| explore_browse_pattern [✅] | LookerNamingPattern | Pattern for providing browse paths to explores. Allowed variables are ['platform', 'env', 'project', 'model', 'name'] | {'pattern': '/{env}/{platform}/{project}/explores'} |
| explore_browse_pattern.pattern [❓ (required if explore_browse_pattern is set)] | string | | None |
| explore_naming_pattern [✅] | LookerNamingPattern | Pattern for providing dataset names to explores. Allowed variables are ['platform', 'env', 'project', 'model', 'name'] | {'pattern': '{model}.explore.{name}'} |
| explore_naming_pattern.pattern [❓ (required if explore_naming_pattern is set)] | string | | None |
| transport_options [✅] | TransportOptionsConfig | Populates the TransportOptions struct for looker client | None |
| transport_options.headers [❓ (required if transport_options is set)] | map(str,string) | | None |
| transport_options.timeout [❓ (required if transport_options is set)] | integer | | None |
| view_browse_pattern [✅] | LookerNamingPattern | Pattern for providing browse paths to views. Allowed variables are ['platform', 'env', 'project', 'model', 'name'] | {'pattern': '/{env}/{platform}/{project}/views'} |
| view_browse_pattern.pattern [❓ (required if view_browse_pattern is set)] | string | | None |
| view_naming_pattern [✅] | LookerNamingPattern | Pattern for providing dataset names to views. Allowed variables are ['platform', 'env', 'project', 'model', 'name'] | {'pattern': '{project}.view.{name}'} |
| view_naming_pattern.pattern [❓ (required if view_naming_pattern is set)] | string | | None |
| stateful_ingestion [✅] | StatefulStaleMetadataRemovalConfig | Base specialized config for Stateful Ingestion with stale metadata removal capability. | None |
| stateful_ingestion.enabled [❓ (required if stateful_ingestion is set)] | boolean | The type of the ingestion state provider registered with datahub. | None |
| stateful_ingestion.ignore_new_state [❓ (required if stateful_ingestion is set)] | boolean | If set to True, ignores the current checkpoint state. | None |
| stateful_ingestion.ignore_old_state [❓ (required if stateful_ingestion is set)] | boolean | If set to True, ignores the previous checkpoint state. | None |
| stateful_ingestion.remove_stale_metadata [❓ (required if stateful_ingestion is set)] | boolean | Soft-deletes the entities present in the last successful run but missing in the current run with stateful_ingestion enabled. | True |
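
As an illustration, the recipe below combines several of the options above to skip personal folders, keep usage extraction on, and deny one dashboard (a sketch; the denied dashboard id is a hypothetical placeholder):

source:
  type: "looker"
  config:
    base_url: "https://<company>.cloud.looker.com"
    client_id: ${LOOKER_CLIENT_ID}
    client_secret: ${LOOKER_CLIENT_SECRET}
    skip_personal_folders: true
    extract_usage_history: true
    extract_usage_history_for_interval: "30 days"
    dashboard_pattern:
      deny: ["^123$"] # hypothetical dashboard id to exclude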

Code Coordinates

  • Class Name: datahub.ingestion.source.looker.looker_source.LookerDashboardSource
  • Browse on GitHub

Module lookml

Certified

Important Capabilities

| Capability | Status / Notes |
|---|---|
| Column-level Lineage | Enabled by default, configured using extract_column_level_lineage |
| Platform Instance | Supported using the connection_to_platform_map |
| Table-Level Lineage | Supported by default |

This plugin extracts the following:

  • LookML views from model files in a project
  • Name, upstream table names, metadata for dimensions, measures, and dimension groups attached as tags
  • If API integration is enabled (recommended), resolves table and view names by calling the Looker API, otherwise supports offline resolution of these names.
note

To get complete Looker metadata integration (including Looker dashboards and charts, and lineage to the underlying Looker views), you must ALSO use the looker source module.

Prerequisites

To use LookML ingestion through the UI, or to automate GitHub checkout through the CLI, you must set up a GitHub deploy key for your Looker GitHub repository. Read this document for how to set up deploy keys for your Looker git repo.

In a nutshell, there are three steps:

  1. Generate a private-public ssh key pair, e.g. with the ssh-keygen invocation sketched after this list. This will typically generate two files, e.g. looker_datahub_deploy_key (the private key) and looker_datahub_deploy_key.pub (the public key).

  2. Add the public key to your Looker git repo as a deploy key with read access (no need to provision write access). Follow the guide here for that.

  3. Make note of the private key file; you will need to paste its contents into the GitHub Deploy Key field later while setting up ingestion using the UI.
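
One common way to generate the key pair (a sketch; any standard ssh-keygen invocation your git host accepts will do):

ssh-keygen -t rsa -b 4096 -N "" -C "looker-datahub-deploy-key" -f looker_datahub_deploy_key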

[Optional] Create an API key with admin privileges

See the Looker authentication docs for the steps to create a client ID and secret. You need to ensure that the API key is attached to a user that has Admin privileges.

If that is not possible, read the configuration section and provide an offline specification of the connection_to_platform_map and the project_name, as sketched below.
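
A minimal offline sketch (the project, connection, database, and schema names are placeholders):

source:
  type: lookml
  config:
    base_folder: /path/to/model/files
    project_name: my_project # placeholder
    connection_to_platform_map:
      my_connection: # placeholder Looker connection name
        platform: snowflake
        default_db: MY_DB
        default_schema: PUBLIC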

Ingestion Options

You have 3 options for controlling where your ingestion of LookML is run.

  • The DataHub UI (recommended for the easiest out-of-the-box experience)
  • As a GitHub Action (recommended to ensure that you have the freshest metadata pushed on change)
  • Using the CLI (scheduled via an orchestrator like Airflow)

Read on to learn more about these options.

To ingest LookML metadata through the UI, you must set up a GitHub deploy key using the instructions in the section above. Once that is complete, you can follow the on-screen instructions to set up a LookML source using the Ingestion page. The following video shows you how to ingest LookML metadata through the UI and find the relevant information from your Looker account.

You can set up ingestion using a GitHub Action to push metadata whenever your main Looker GitHub repo changes. The following sample GitHub action file can be modified to emit LookML metadata whenever there is a change to your repository. This ensures that metadata is already fresh and up to date.

Sample GitHub Action

Drop this file into your .github/workflows directory inside your Looker GitHub repo. You need to set up the following secrets in your GitHub repository to get this workflow to work:

  • DATAHUB_GMS_HOST: The endpoint where your DataHub host is running
  • DATAHUB_TOKEN: An authentication token provisioned for DataHub ingestion
  • LOOKER_BASE_URL: The base url where your Looker assets are hosted (e.g. https://acryl.cloud.looker.com)
  • LOOKER_CLIENT_ID: A provisioned Looker Client ID
  • LOOKER_CLIENT_SECRET: A provisioned Looker Client Secret
name: lookml metadata upload
on:
  # Note that this action only runs on pushes to your main branch. If you want to also
  # run on pull requests, we'd recommend running datahub ingest with the `--dry-run` flag.
  push:
    branches:
      - main
  release:
    types: [published, edited]
  workflow_dispatch:

jobs:
  lookml-metadata-upload:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: '3.10'
      - name: Run LookML ingestion
        run: |
          pip install 'acryl-datahub[lookml,datahub-rest]'
          cat << EOF > lookml_ingestion.yml
          # LookML ingestion configuration
          source:
            type: "lookml"
            config:
              base_folder: ${{ github.workspace }}
              parse_table_names_from_sql: true
              github_info:
                repo: ${{ github.repository }}
                branch: ${{ github.ref }}
              # Options
              # connection_to_platform_map:
              #   connection-name:
              #     platform: platform-name (e.g. snowflake)
              #     default_db: default-db-name (e.g. DEMO_PIPELINE)
              api:
                client_id: ${LOOKER_CLIENT_ID}
                client_secret: ${LOOKER_CLIENT_SECRET}
                base_url: ${LOOKER_BASE_URL}
          sink:
            type: datahub-rest
            config:
              server: ${DATAHUB_GMS_HOST}
              token: ${DATAHUB_TOKEN}
          EOF
          datahub ingest -c lookml_ingestion.yml
        env:
          DATAHUB_GMS_HOST: ${{ secrets.DATAHUB_GMS_HOST }}
          DATAHUB_TOKEN: ${{ secrets.DATAHUB_TOKEN }}
          LOOKER_BASE_URL: ${{ secrets.LOOKER_BASE_URL }}
          LOOKER_CLIENT_ID: ${{ secrets.LOOKER_CLIENT_ID }}
          LOOKER_CLIENT_SECRET: ${{ secrets.LOOKER_CLIENT_SECRET }}

If you want to ingest LookML using the DataHub CLI directly, read on for instructions and configuration details.

CLI based Ingestion

Install the Plugin

pip install 'acryl-datahub[lookml]'

Starter Recipe

Check out the following recipe to get started with ingestion! See below for full configuration options.

For general pointers on writing and running a recipe, see our main recipe guide.

source:
  type: "lookml"
  config:
    # GitHub Coordinates: Used to check out the repo locally and add github links on the dataset's entity page.
    github_info:
      repo: org/repo-name
      deploy_key_file: ${LOOKER_DEPLOY_KEY_FILE} # file containing the private ssh key for a deploy key for the looker git repo

    # Coordinates
    # base_folder: /path/to/model/files ## Optional if you are not able to provide a GitHub deploy key

    # Options
    api:
      # Coordinates for your looker instance
      base_url: "https://YOUR_INSTANCE.cloud.looker.com"

      # Credentials for your Looker connection (https://docs.looker.com/reference/api-and-integration/api-auth)
      client_id: ${LOOKER_CLIENT_ID}
      client_secret: ${LOOKER_CLIENT_SECRET}

    # Alternative to the api section above if you want a purely file-based ingestion with no api calls to Looker, or if you want to provide platform_instance ids for your connections
    # project_name: PROJECT_NAME # See (https://docs.looker.com/data-modeling/getting-started/how-project-works) to understand what your project name is
    # connection_to_platform_map:
    #   connection_name_1:
    #     platform: snowflake # bigquery, hive, etc
    #     default_db: DEFAULT_DATABASE # the default database configured for this connection
    #     default_schema: DEFAULT_SCHEMA # the default schema configured for this connection
    #     platform_instance: snow_warehouse # optional
    #     platform_env: PROD # optional
    #   connection_name_2:
    #     platform: bigquery # snowflake, hive, etc
    #     default_db: DEFAULT_DATABASE # the default database configured for this connection
    #     default_schema: DEFAULT_SCHEMA # the default schema configured for this connection
    #     platform_instance: bq_warehouse # optional
    #     platform_env: DEV # optional

# Default sink is datahub-rest and doesn't need to be configured
# See https://datahubproject.io/docs/metadata-ingestion/sink_docs/datahub for customization options


Config Details

Note that a . is used to denote nested fields in the YAML recipe.

| Field [Required] | Type | Description | Default |
|---|---|---|---|
| base_folder [✅] | string(directory-path) | Required if not providing github configuration and deploy keys. A pointer to a local directory (accessible to the ingestion system) where the root of the LookML repo has been checked out (typically via a git clone). This is typically the root folder where the *.model.lkml and *.view.lkml files are stored. e.g. If you have checked out your LookML repo under /Users/jdoe/workspace/my-lookml-repo, then set base_folder to /Users/jdoe/workspace/my-lookml-repo. | None |
| emit_reachable_views_only [✅] | boolean | When enabled, only views that are reachable from explores defined in the model files are emitted | True |
| extract_column_level_lineage [✅] | boolean | When enabled, extracts column-level lineage from Views and Explores | True |
| max_file_snippet_length [✅] | integer | When extracting the view definition from a lookml file, the maximum number of characters to extract. | 512000 |
| parse_table_names_from_sql [✅] | boolean | See note below. | None |
| platform_instance [✅] | string | The instance of the platform that all assets produced by this recipe belong to | None |
| platform_name [✅] | string | Default platform name. Don't change. | looker |
| populate_sql_logic_for_missing_descriptions [✅] | boolean | When enabled, field descriptions will include the sql logic for computed fields if descriptions are missing | None |
| process_isolation_for_sql_parsing [✅] | boolean | When enabled, sql parsing will be executed in a separate process to prevent memory leaks. | None |
| project_name [✅] | string | Required if you don't specify the api section. The project name within which all the model files live. See (https://docs.looker.com/data-modeling/getting-started/how-project-works) to understand what the Looker project name should be. The simplest way to see your projects is to click on Develop followed by Manage LookML Projects in the Looker application. | None |
| sql_parser [✅] | string | See note below. | datahub.utilities.sql_parser.DefaultSQLParser |
| tag_measures_and_dimensions [✅] | boolean | When enabled, attaches tags to measures, dimensions and dimension groups to make them more discoverable. When disabled, adds this information to the description of the column. | True |
| env [✅] | string | The environment that all assets produced by this connector belong to | PROD |
| api [✅] | LookerAPIConfig | | None |
| api.base_url [❓ (required if api is set)] | string | Url to your Looker instance: https://company.looker.com:19999 or https://looker.company.com, or similar. Used for making API calls to Looker and constructing clickable dashboard and chart urls. | None |
| api.client_id [❓ (required if api is set)] | string | Looker API client id. | None |
| api.client_secret [❓ (required if api is set)] | string | Looker API client secret. | None |
| api.transport_options [❓ (required if api is set)] | TransportOptionsConfig | Populates the TransportOptions struct for looker client | None |
| api.transport_options.headers [❓ (required if transport_options is set)] | map(str,string) | | None |
| api.transport_options.timeout [❓ (required if transport_options is set)] | integer | | None |
| connection_to_platform_map [✅] | map(str,LookerConnectionDefinition) | | None |
| connection_to_platform_map.key.platform [❓ (required if connection_to_platform_map is set)] | string | | None |
| connection_to_platform_map.key.default_db [❓ (required if connection_to_platform_map is set)] | string | | None |
| connection_to_platform_map.key.default_schema [❓ (required if connection_to_platform_map is set)] | string | | None |
| connection_to_platform_map.key.platform_env [❓ (required if connection_to_platform_map is set)] | string | The environment that the platform is located in. Leaving this empty will inherit defaults from the top level Looker configuration | None |
| connection_to_platform_map.key.platform_instance [❓ (required if connection_to_platform_map is set)] | string | | None |
| explore_browse_pattern [✅] | LookerNamingPattern | Pattern for providing browse paths to explores. Allowed variables are ['platform', 'env', 'project', 'model', 'name'] | {'pattern': '/{env}/{platform}/{project}/explores'} |
| explore_browse_pattern.pattern [❓ (required if explore_browse_pattern is set)] | string | | None |
| explore_naming_pattern [✅] | LookerNamingPattern | Pattern for providing dataset names to explores. Allowed variables are ['platform', 'env', 'project', 'model', 'name'] | {'pattern': '{model}.explore.{name}'} |
| explore_naming_pattern.pattern [❓ (required if explore_naming_pattern is set)] | string | | None |
| git_info [✅] | GitInfo | Reference to your git location. If present, supplies handy links to your lookml on the dataset entity page. | None |
| git_info.branch [❓ (required if git_info is set)] | string | Branch on which your files live by default. Typically main or master. This can also be a commit hash. | main |
| git_info.deploy_key [❓ (required if git_info is set)] | string(password) | A private key that contains an ssh key that has been configured as a deploy key for this repository. See deploy_key_file if you want to use a file that contains this key. | None |
| git_info.deploy_key_file [❓ (required if git_info is set)] | string(file-path) | A private key file that contains an ssh key that has been configured as a deploy key for this repository. Use a file where possible, else see deploy_key for a config field that accepts a raw string. | None |
| git_info.repo [❓ (required if git_info is set)] | string | Name of your Git repo e.g. https://github.com/datahub-project/datahub or https://gitlab.com/gitlab-org/gitlab. If organization/repo is provided, we assume it is a GitHub repo. | None |
| git_info.repo_ssh_locator [❓ (required if git_info is set)] | string | The url to call git clone on. We infer this for github and gitlab repos, but it is required for other hosts. | None |
| git_info.url_template [❓ (required if git_info is set)] | string | Template for generating a URL to a file in the repo e.g. '{repo_url}/blob/{branch}/{file_path}'. We can infer this for GitHub and GitLab repos, and it is otherwise required. It supports the following variables: {repo_url}, {branch}, {file_path} | None |
| model_pattern [✅] | AllowDenyPattern | List of regex patterns for LookML models to include in the extraction. | {'allow': ['.*'], 'deny': [], 'ignoreCase': True} |
| model_pattern.allow [❓ (required if model_pattern is set)] | array(string) | | None |
| model_pattern.deny [❓ (required if model_pattern is set)] | array(string) | | None |
| model_pattern.ignoreCase [❓ (required if model_pattern is set)] | boolean | Whether to ignore case sensitivity during pattern matching. | True |
| project_dependencies [✅] | UnionType | One of: map(str,union)(directory-path), map(str,union) | None |
| project_dependencies.key.repo [❓ (required if project_dependencies is set)] | string | Name of your Git repo e.g. https://github.com/datahub-project/datahub or https://gitlab.com/gitlab-org/gitlab. If organization/repo is provided, we assume it is a GitHub repo. | None |
| project_dependencies.key.branch [❓ (required if project_dependencies is set)] | string | Branch on which your files live by default. Typically main or master. This can also be a commit hash. | main |
| project_dependencies.key.deploy_key [❓ (required if project_dependencies is set)] | string(password) | A private key that contains an ssh key that has been configured as a deploy key for this repository. See deploy_key_file if you want to use a file that contains this key. | None |
| project_dependencies.key.deploy_key_file [❓ (required if project_dependencies is set)] | string(file-path) | A private key file that contains an ssh key that has been configured as a deploy key for this repository. Use a file where possible, else see deploy_key for a config field that accepts a raw string. | None |
| project_dependencies.key.repo_ssh_locator [❓ (required if project_dependencies is set)] | string | The url to call git clone on. We infer this for github and gitlab repos, but it is required for other hosts. | None |
| project_dependencies.key.url_template [❓ (required if project_dependencies is set)] | string | Template for generating a URL to a file in the repo e.g. '{repo_url}/blob/{branch}/{file_path}'. We can infer this for GitHub and GitLab repos, and it is otherwise required. It supports the following variables: {repo_url}, {branch}, {file_path} | None |
| transport_options [✅] | TransportOptionsConfig | Populates the TransportOptions struct for looker client | None |
| transport_options.headers [❓ (required if transport_options is set)] | map(str,string) | | None |
| transport_options.timeout [❓ (required if transport_options is set)] | integer | | None |
| view_browse_pattern [✅] | LookerNamingPattern | Pattern for providing browse paths to views. Allowed variables are ['platform', 'env', 'project', 'model', 'name'] | {'pattern': '/{env}/{platform}/{project}/views'} |
| view_browse_pattern.pattern [❓ (required if view_browse_pattern is set)] | string | | None |
| view_naming_pattern [✅] | LookerNamingPattern | Pattern for providing dataset names to views. Allowed variables are ['platform', 'env', 'project', 'model', 'name'] | {'pattern': '{project}.view.{name}'} |
| view_naming_pattern.pattern [❓ (required if view_naming_pattern is set)] | string | | None |
| view_pattern [✅] | AllowDenyPattern | List of regex patterns for LookML views to include in the extraction. | {'allow': ['.*'], 'deny': [], 'ignoreCase': True} |
| view_pattern.allow [❓ (required if view_pattern is set)] | array(string) | | None |
| view_pattern.deny [❓ (required if view_pattern is set)] | array(string) | | None |
| view_pattern.ignoreCase [❓ (required if view_pattern is set)] | boolean | Whether to ignore case sensitivity during pattern matching. | True |
| stateful_ingestion [✅] | StatefulStaleMetadataRemovalConfig | Base specialized config for Stateful Ingestion with stale metadata removal capability. | None |
| stateful_ingestion.enabled [❓ (required if stateful_ingestion is set)] | boolean | The type of the ingestion state provider registered with datahub. | None |
| stateful_ingestion.ignore_new_state [❓ (required if stateful_ingestion is set)] | boolean | If set to True, ignores the current checkpoint state. | None |
| stateful_ingestion.ignore_old_state [❓ (required if stateful_ingestion is set)] | boolean | If set to True, ignores the previous checkpoint state. | None |
| stateful_ingestion.remove_stale_metadata [❓ (required if stateful_ingestion is set)] | boolean | Soft-deletes the entities present in the last successful run but missing in the current run with stateful_ingestion enabled. | True |
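
As an illustration, the naming and browse patterns above can be overridden using the allowed variables from the table (a sketch; the custom view naming pattern shown is hypothetical):

source:
  type: lookml
  config:
    # ... other config variables
    view_naming_pattern:
      pattern: "{project}.{model}.view.{name}"
    view_browse_pattern:
      pattern: "/{env}/{platform}/{project}/views"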

Configuration Notes

note

The integration can use an SQL parser to try to parse the tables the views depend on.

This parsing is disabled by default, but can be enabled by setting parse_table_names_from_sql: True. The default parser is based on the sqllineage package. As this package doesn't officially support all the SQL dialects that Looker supports, the result might not be correct. You can, however, implement a custom parser and take it into use by setting the sql_parser configuration value. A custom SQL parser must inherit from datahub.utilities.sql_parser.SQLParser and must be made available to DataHub by, for example, installing it. The configuration then needs to be set to the module_name.ClassName of the parser.
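
For example (a sketch; my_package.custom_parser.MyDialectParser is a hypothetical class you would implement and install yourself):

source:
  type: lookml
  config:
    # ... other config variables
    parse_table_names_from_sql: true
    # Only needed if the default sqllineage-based parser mishandles your dialect:
    # sql_parser: my_package.custom_parser.MyDialectParser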

Multi-Project LookML (Advanced)

Looker projects support organization as multiple git repos, with remote includes that can refer to projects stored in a different repo. If your Looker implementation uses a multi-project setup, you can configure the LookML source to pull in metadata from your remote projects as well.

If you are using local or remote dependencies, you will see include directives in your lookml files that look like this:

include: "//e_flights/views/users.view.lkml"
include: "//e_commerce/public/orders.view.lkml"

Also, the projects being referred to will be listed in your manifest.lkml file, something like this:

project_name: this_project

local_dependency: {
  project: "my-remote-project"
}

remote_dependency: ga_360_block {
  url: "https://github.com/llooker/google_ga360"
  ref: "0bbbef5d8080e88ade2747230b7ed62418437c21"
}

To ingest Looker repositories that are including files defined in other projects, you will need to use the project_dependencies directive within the configuration section. Consider the following scenario:

  • Your primary project refers to a remote project called my_remote_project
  • The remote project is homed in the GitHub repo my_org/my_remote_project
  • You have provisioned a GitHub deploy key and stored the credential in the environment variable (or UI secret), ${MY_REMOTE_PROJECT_DEPLOY_KEY}

In this case, you can add this section to your recipe to activate multi-project LookML ingestion.

source:
  type: lookml
  config:
    # ... other config variables
    project_dependencies:
      my_remote_project:
        repo: my_org/my_remote_project
        deploy_key: ${MY_REMOTE_PROJECT_DEPLOY_KEY}

Under the hood, DataHub will check out your remote repository using the provisioned deploy key, and use it to navigate includes that you have in the model files from your primary project.

If you have the remote project checked out locally, and do not need DataHub to clone the project for you, you can provide DataHub directly with the path to the project like the config snippet below:

source:
  type: lookml
  config:
    # ... other config variables
    project_dependencies:
      my_remote_project: /path/to/local_git_clone_of_remote_project

note

This is not the same as ingesting the remote project as a primary Looker project because DataHub will not be processing the model files that might live in the remote project. If you want to additionally include the views accessible via the models in the remote project, create a second recipe where your remote project is the primary project.
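
Such a second recipe might look like this (a sketch reusing the deploy key from the scenario above):

source:
  type: lookml
  config:
    github_info:
      repo: my_org/my_remote_project
      deploy_key: ${MY_REMOTE_PROJECT_DEPLOY_KEY}
    api:
      base_url: "https://<company>.cloud.looker.com"
      client_id: ${LOOKER_CLIENT_ID}
      client_secret: ${LOOKER_CLIENT_SECRET}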

Code Coordinates

  • Class Name: datahub.ingestion.source.looker.lookml_source.LookMLSource
  • Browse on GitHub

Questions

If you've got any questions on configuring ingestion for Looker, feel free to ping us on our Slack.