Copy BigQuery Table Using Python

In this tutorial, we will learn how to copy a table from a source dataset to a destination dataset in BigQuery using Python. This is a common operation for data backup or environment management.

Python Code to Copy Table

We will use the Client.copy_table method from the google-cloud-bigquery library.

Python

# Import the packages
from dotenv import load_dotenv
import os
from google.oauth2 import service_account
from google.cloud import bigquery

def main():
    # Loads environment variables from a .env file
    load_dotenv()

    # Use environment variables from .env file
    credentials_path = os.getenv("GOOGLE_APPLICATION_CREDENTIALS")
    project_name = os.getenv("project_id")

    # Get the service account credentials
    credentials = service_account.Credentials.from_service_account_file(credentials_path)

    # Create the BigQuery client
    client = bigquery.Client(credentials=credentials, project=project_name)

    # Set source_table_id to the ID of the original table
    # source_table_id = "your-project.source_dataset.source_table"
    source_table_id = "ashishcoder.ds_from_tf.table_from_tf"

    # Set destination_table_id to the ID of the destination table
    # destination_table_id = "your-project.destination_dataset.destination_table"
    destination_table_id = "ashishcoder.Coding_Dataset.copied_table"

    # Start copy job
    job = client.copy_table(source_table_id, destination_table_id)

    # Wait for the job to complete
    job.result()

    print(f"A copy of the table was created at {destination_table_id}.")

if __name__ == "__main__":
    main()

Key Steps Explained

1. Load credentials: load_dotenv() reads GOOGLE_APPLICATION_CREDENTIALS and project_id from the .env file, and service_account.Credentials.from_service_account_file() builds a credentials object from the key file.
2. Create the client: bigquery.Client(credentials=..., project=...) authenticates the session against your project.
3. Define the table IDs: both the source and destination use the fully qualified project.dataset.table format.
4. Run the copy job: client.copy_table() starts an asynchronous copy job, and job.result() blocks until it completes.

Note: Ensure that the destination dataset already exists before you attempt to copy the table into it.