
Data Quality Checks and Processes

Validating Uploaded Data

For security reasons, and to prevent tampering with uploaded data, files ingested through the S3 drop-zone bucket are moved to a different location immediately after upload. Once data files are successfully uploaded to the ingest bucket in the step above, the data is moved to the raw submissions bucket, or “Data Lake.”

Background processes then move the data from the raw data bucket to the standardized data bucket, under a folder labelled with the date it was uploaded. As a data provider, you will have a folder in the Data Lake that contains all of the data you upload. The overall flow is shown below, followed by a sketch of the move step.

 'Drop-zone' -> 'Raw-data' -> 'Standardized-data'
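
The platform's background processes are internal to the SDC, but the move from the raw bucket into a date-labelled folder in the standardized bucket can be illustrated with a minimal boto3 sketch. The bucket names, the provider/date/file key layout, and the promote_to_standardized function are assumptions for illustration only, not the SDC implementation.

import datetime

import boto3

s3 = boto3.client("s3")

def promote_to_standardized(raw_bucket, standardized_bucket, provider, key):
    """Copy an object from the raw bucket into the standardized bucket
    under a date-labelled folder, then delete the raw copy.

    Bucket names and the <provider>/<date>/<file> layout are illustrative
    assumptions, not the platform's actual implementation.
    """
    upload_date = datetime.date.today().isoformat()
    dest_key = f"{provider}/{upload_date}/{key.rsplit('/', 1)[-1]}"
    s3.copy_object(
        Bucket=standardized_bucket,
        Key=dest_key,
        CopySource={"Bucket": raw_bucket, "Key": key},
    )
    s3.delete_object(Bucket=raw_bucket, Key=key)
    return dest_key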

You can verify a data upload by running the AWS CLI command below against the standardized data bucket to list the objects there.

aws s3 ls s3://prod.sdc.dot.gov.data-lake.standardized-data/<data-provider> --profile sdc
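
If the upload succeeded, the listing shows a date-labelled folder, or the objects inside it. The output below is illustrative only: the provider name example-provider, the file name, and the sizes and timestamps are hypothetical, though the layout matches what aws s3 ls prints.

aws s3 ls s3://prod.sdc.dot.gov.data-lake.standardized-data/example-provider/ --profile sdc
                           PRE 2021-06-01/

aws s3 ls s3://prod.sdc.dot.gov.data-lake.standardized-data/example-provider/2021-06-01/ --profile sdc
2021-06-01 09:15:02      10425 upload-1.json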

The Standardized data bucket name is provided in the table below. The “project name” and “data provider name” are provided in the welcome email.

Standardized data bucket name, per environment:

  DEV:
  TEST:
  PROD:

Configuration Files

When the uploaded data reaches the standardized data bucket, it is checked by the validation Lambda function. This function confirms that all required fields exist for each message and that the value in each field falls within a reasonable range (e.g., a variable speed limit between 0 and 254 MPH). The field and range checks are defined in a file called config.ini.
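
The contents of config.ini are specific to each data type and are not reproduced here, but a field-and-range check of this kind can be sketched as follows. The [vsl] section, its min/max options, and the validate_message helper are hypothetical examples, not the platform's actual configuration or code.

import configparser

# Hypothetical excerpt of a config.ini range check; the real file's
# sections and option names are defined by the SDC platform.
SAMPLE_CONFIG = """
[vsl]
required = true
min = 0
max = 254
"""

def validate_message(message, config):
    """Check that each configured field exists and falls in its range."""
    errors = []
    for field in config.sections():
        if field not in message:
            if config.getboolean(field, "required", fallback=False):
                errors.append(f"missing required field: {field}")
            continue
        value = float(message[field])
        lo = config.getfloat(field, "min", fallback=float("-inf"))
        hi = config.getfloat(field, "max", fallback=float("inf"))
        if not lo <= value <= hi:
            errors.append(f"{field}={value} outside [{lo}, {hi}]")
    return errors

config = configparser.ConfigParser()
config.read_string(SAMPLE_CONFIG)
print(validate_message({"vsl": 300}, config))  # ['vsl=300.0 outside [0.0, 254.0]']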
