VTCP Certification Criteria

Criteria for Certification

This page lists the criteria that both the tools and the vendors must meet in order to participate in the VTCP program.

Introduction to the Criteria

Welcome to the base criteria for submitting objects generated by your tool to the VTCP program.  Please note: this specification covers only the tooling itself.  Remember that certification of the tool also requires properly trained resources from the vendor.

We have selected Snowflake DB as the platform of choice on which to run our vendor tool certifications.  All DDL and all executing code must comply with Snowflake DB rules.  We do, however, request that the SQL in the views and the DDL (datatypes) adhere to the ANSI-SQL 99 standard to the extent possible.

NOTE: Please see the failure rules at the bottom of this article.  We reserve the right to change our certification execution platform at any time; should we do so, we will notify the vendors actively participating in, or in the process of submitting to, the VTCP program at that time.

Each ZIP file that is submitted is versioned and kept for a minimum of 2 years.

About the Source DDL

We provide you the SOURCE.DDL file, formatted as we require, complete with JSON comments, markers for business keys, primary and foreign key relationships, and more.  We do expect you to include the SOURCE.DDL in your ZIP file submission.  In this manner, we can correctly version the entire deployment package that has been submitted to us, which allows us to go back to any point in time and replay the entire submission from start to finish.
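As an illustration only, a JSON-commented table in that style might look like the sketch below.  The table, columns, and JSON keys shown here are hypothetical; the authoritative comment schema and markers are the ones defined in the SOURCE.DDL we supply.

```sql
-- Hypothetical sketch of the SOURCE.DDL comment style.  The real file we
-- provide defines the authoritative JSON keys and business-key markers.
CREATE TABLE source.customer (
    customer_id   INTEGER      NOT NULL
        COMMENT '{"business_key": true, "description": "Natural key of the customer"}',
    customer_name VARCHAR(100)
        COMMENT '{"description": "Full legal name of the customer"}',
    CONSTRAINT pk_customer PRIMARY KEY (customer_id)
)
COMMENT = '{"subject_area": "customer", "source_system": "crm"}';
```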


    DDL Objects Required for Submission

    Notes:

    The DDL Objects that are submitted must be independent files, each named accordingly.

    DDL Submission Rules

    1. DDL Objects must all contain baseline ANSI-SQL 99 compliant datatypes.  Exception: the HASH storage datatype, which must be Snowflake native.
    2. DDL Objects must all have JSON-styled and formatted comments at the table and column levels.
    3. DDL Objects must adhere to the naming conventions set forth in the uploaded JSON-formatted configuration file.
    4. DDL Objects must have primary key declarations.
    5. DDL for the Raw Vault, PITs, and Bridges must have foreign key declarations.
    Notes:

    DDL table comments and column comments MUST be JSON formatted.  Otherwise, our testing process cannot derive enough metadata to determine what the objects should be.

    We leverage JSON comments to stay within the ANSI-SQL compliance of cross-database testing, while allowing our routines to access enriched metadata that currently exists only within the vendor tools.
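A minimal sketch of a Raw Vault hub and satellite that would satisfy the rules above, assuming hypothetical names and JSON comment keys (the BINARY hash key is the one Snowflake-native exception to the ANSI datatype rule):

```sql
-- Illustrative only: entity names, columns, and JSON keys are assumptions,
-- not the required convention from the configuration file.
CREATE TABLE rdv.hub_customer (
    hub_customer_hkey BINARY(20)  NOT NULL COMMENT '{"role": "hash_key"}',
    customer_id       INTEGER     NOT NULL COMMENT '{"role": "business_key"}',
    load_dts          TIMESTAMP   NOT NULL COMMENT '{"role": "load_datetime"}',
    record_source     VARCHAR(50) NOT NULL COMMENT '{"role": "record_source"}',
    CONSTRAINT pk_hub_customer PRIMARY KEY (hub_customer_hkey)
)
COMMENT = '{"entity_type": "hub"}';

CREATE TABLE rdv.sat_customer (
    hub_customer_hkey BINARY(20)  NOT NULL COMMENT '{"role": "hash_key"}',
    load_dts          TIMESTAMP   NOT NULL COMMENT '{"role": "load_datetime"}',
    customer_name     VARCHAR(100)         COMMENT '{"role": "descriptive"}',
    CONSTRAINT pk_sat_customer PRIMARY KEY (hub_customer_hkey, load_dts),
    CONSTRAINT fk_sat_customer_hub FOREIGN KEY (hub_customer_hkey)
        REFERENCES rdv.hub_customer (hub_customer_hkey)
)
COMMENT = '{"entity_type": "satellite"}';
```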

    DML Objects Required for Submission

    Notes:

    Please stick to ANSI-99 standard view commands.  For the code that moves data, please stick to: INSERT..INTO … SELECT * … FROM…

    Enable the INSERT..INTO commands to run as a script file.

    DO NOT SUBMIT PYTHON, JAVA, POWERSHELL, VB, OR JAVASCRIPT CODE – or any other form of code, for that matter.  CODE SUBMITTED WILL BE REJECTED AS A FAILURE.
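The INSERT..INTO pattern described above can be sketched as a plain script file, one statement per movement; the object names here are illustrative only:

```sql
-- Hypothetical insert_stage1.sql: plain INSERT..INTO .. SELECT * .. FROM
-- statements, runnable top to bottom as a script.  No procedural code.
INSERT INTO stage1.customer_stg1
SELECT * FROM source.customer_vw;

INSERT INTO stage1.order_stg1
SELECT * FROM source.order_vw;
```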

    VTCP Architecture Flow

    Data will flow from left to right.  Our testing process will execute the INSERT… SQL files to move our data sets from one phase to the next.

    Schema Delineation

    Schemas are required.  Views must be created in the schema from which they pull data.  The INSERT scripts must select data based on the schema (dot) view name.
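A minimal sketch of that schema rule, using hypothetical names: the view lives in the same schema as the table it reads, and the downstream INSERT references it fully qualified as schema (dot) view name.

```sql
-- Illustrative only: the view is created in stage1, the schema it pulls
-- data from, and the INSERT selects via the qualified name stage1.customer_vw.
CREATE VIEW stage1.customer_vw AS
SELECT customer_id, customer_name, load_dts
FROM stage1.customer_stg1;

INSERT INTO stage2.customer_stg2
SELECT * FROM stage1.customer_vw;
```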

    Required File Directories and Locations

    Directory Path    File
    --------------    --------------------
    config/           config.json
    source/           source.ddl
    source/           source_views.sql
    stage1/           insert_stage1.sql
    stage1/           stage_level1.ddl
    stage1/           stage1_views.sql
    stage2/           insert_stage2.sql
    stage2/           stage_level2.ddl
    stage2/           stage2_views.sql
    rdv/              insert_rdv.sql
    rdv/              raw_dv.ddl
    rdv/              rdv_views.sql
    infomart/         insert_pitbridge.sql
    infomart/         pit_bridge.ddl
    infomart/         infomart_views.sql

    File Formatting Requirements

    Please pay attention to the file names.  These are the file names that will be expected inside the ZIP file.  Any file not found in the proper directory, or found to be missing, will be considered a failure and will halt the verification process immediately.

    Completed Submission File

    All artifacts are to be ZIPPED in the appropriate sub-folders (as indicated), and a single ZIP file is to be uploaded to the VTCP platform.

    Rules for Failures

    1. Any break in compliance will result in immediate failure of the submission.  Please ensure that proper governance and QA are put in place before assembling the ZIP file for submission.
    2. DO NOT SUBMIT DATA. SUBMITTING DATA WILL RESULT IN IMMEDIATE FAILURE AND REJECTION OF THE SUBMISSION.
    3. All submissions are loaded to Snowflake DB.  Any submission that fails due to a syntax error will be counted as a failure and immediately rejected.
    4. Any missing or empty files (within the ZIP file) will count as a failure and will be rejected.
    5. Any missing or empty directories (within the ZIP file) will count as a failure and will be rejected.
    6. Any break (non-compliance or non-conformance) that causes our testing process to fail will fail the submission, which will be rejected.
    7. A submission of a pre-built Snowflake DB that is shared with us will result in a FAILURE.  At this time, we must run our certification process completely separately from the shared instances that Snowflake DB offers.

    Termination from the VTCP Program

    Three (3) consecutive failures will result in immediate termination of participation in the VTCP program for the current year only. The vendor will then have to wait for the new year to reapply to the program.

    Three consecutive (year-over-year) failures will result in a 5-year ban from participation in the VTCP program.

    Frequently Asked Questions

    Yes, but only after the vendor has been accepted to participate in the current year's VTCP.

    No.  Instead, the example files are just that: examples.  The only exception to this is the CONFIG file, which will be a complete example.

    Yes.  Every certification will expire, and prior to expiration the vendor is required to resubmit for certification renewal.

    Vendors are constantly upgrading their tools, adding features, changing functionality.  We require recertification to ensure the ongoing consistency and commitment to quality, along with adherence to the latest standards.

    No.  The VTCP program is set up in 3 phases.  The first phase tests basic standards; the second phase augments that by adding a timeline to make changes and resubmit.  The third phase is also an iteration test that examines turnaround time in addition to upkeep of the standards.