VTCP Certification Criteria
Criteria for Certification
This page lists the criteria that both the tools and the participating vendors must meet in order to take part in the VTCP program.
Introduction to the Criteria
Welcome to the base criteria for submitting objects generated by your tool to the VTCP program. Please note: this specification covers only the tooling itself. Remember that certification of the tool also requires properly trained resources from the vendor.
We have selected Snowflake DB as our platform of choice for running our vendor tool certifications. All DDL and all executing code must comply with Snowflake DB rules. We do, however, request that the SQL in the views and the DDL (datatypes) adhere to the ANSI SQL-99 standard to the extent possible.
NOTE: Please see the failure rules at the bottom of this article. We reserve the right to change our certification execution platform at any time; should we do so, we will notify the vendors actively participating in, or in the process of submitting to, the VTCP program at that time.
Each ZIP file that is submitted is versioned and kept for a minimum of 2 years.
About the Source DDL
We provide you the SOURCE.DDL file, formatted as we require, complete with JSON comments, markers for business keys, primary and foreign key relationships, and more. We do expect you to include the SOURCE.DDL in your zip file submission. In this manner, we can correctly version the entire deployment package that has been submitted to us. This will allow us to go back to any point in time and replay the entire submission from start to finish.
DDL Objects Required for Submission
The DDL objects that are submitted must be independent files, named accordingly.
DDL Submission Rules
- DDL objects must all use baseline ANSI SQL-99 compliant datatypes. Exception: the HASH storage datatype, which must be Snowflake native.
- DDL objects must all have JSON styled and formatted comments at the table and column levels.
- DDL objects must adhere to the naming conventions set forth in the uploaded JSON formatted configuration file.
- DDL objects must have primary key declarations.
- DDL for the Raw Vault, PITs, and Bridges must have foreign key declarations.
DDL table comments and column comments MUST be JSON formatted; otherwise our testing process cannot derive enough metadata to determine what the objects should be.
We leverage JSON comments to stay within ANSI SQL compliance for cross-database testing, while allowing our routines to access enriched metadata that currently exists only within the vendor tools.
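As a sketch of what a compliant table definition might look like, the following shows a Raw Vault hub with JSON comments at both levels and a primary key declaration. The table name, column names, and JSON keys here are illustrative only; the authoritative naming conventions and comment schema come from the uploaded JSON configuration file.

```sql
-- Hypothetical hub table; names and JSON comment keys are illustrative only.
CREATE TABLE rv.HUB_CUSTOMER (
    HK_CUSTOMER   BINARY(20)   NOT NULL     -- Snowflake-native hash storage (the one allowed exception)
        COMMENT '{"type": "hash_key", "of": "CUSTOMER_ID"}',
    CUSTOMER_ID   VARCHAR(50)  NOT NULL
        COMMENT '{"type": "business_key"}',
    LOAD_DTS      TIMESTAMP    NOT NULL
        COMMENT '{"type": "load_date"}',
    RECORD_SRC    VARCHAR(100) NOT NULL
        COMMENT '{"type": "record_source"}',
    CONSTRAINT PK_HUB_CUSTOMER PRIMARY KEY (HK_CUSTOMER)
)
COMMENT = '{"entity": "customer", "layer": "raw_vault"}';
```

All datatypes other than the hash column stay within ANSI SQL-99, and both the table-level and column-level comments are parseable JSON.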
DML Objects Required for Submission
Please stick to ANSI SQL-99 standard view commands. For the code that moves data, please stick to the pattern: INSERT INTO … SELECT * … FROM …
Make sure the INSERT INTO commands can run as a script file.
VTCP Architecture Flow
Data will flow from left to right. Our testing process will execute the INSERT … SQL files to move our data sets from one phase to the next.
Schemas are required. Views must be created in the schema from which they pull data, and the INSERT scripts must select data using the fully qualified schema (dot) view name.
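The view-plus-INSERT pattern above can be sketched as follows; the schema and object names are hypothetical, not part of the specification:

```sql
-- Illustrative only: a view created in the schema it pulls data from,
-- and an INSERT script that selects via the fully qualified schema.view name.
CREATE VIEW stg.V_CUSTOMER AS
SELECT CUSTOMER_ID, LOAD_DTS, RECORD_SRC
FROM stg.CUSTOMER_LANDING;

INSERT INTO rv.CUSTOMER_STAGE
SELECT * FROM stg.V_CUSTOMER;
```

Each such INSERT statement should be runnable as part of a script file, since the testing process executes the INSERT scripts to move data from one phase to the next.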
Required File Directories and Locations
File Formatting Requirements
Please pay attention to the file names. These are the names that will be expected inside the ZIP file. Any file that is missing, or is not found in the proper directory, will be considered a failure and will halt the verification process immediately.
Completed Submission File
All artifacts are to be ZIPPED in appropriate sub-folders (as indicated), and a single zip file is to be uploaded to the VTCP platform.
Rules for Failures
- Any break in compliance will result in immediate failure of the submission. Please ensure proper governance and QA are put in place before assembling the ZIP file for submission.
- DO NOT SUBMIT DATA. SUBMITTING DATA WILL RESULT IN IMMEDIATE FAILURE AND REJECTION OF THE SUBMISSION.
- All submissions are loaded to Snowflake DB. Any submission that fails due to a syntax error will be counted as a failure and immediately rejected.
- Any missing or empty files (within the ZIP file) will count as a failure and will be rejected.
- Any missing or empty directories (within the ZIP file) will count as a failure and will be rejected.
- Any break (non-compliance or non-conformance) that causes our testing process to fail will fail the submission and will be rejected.
- Submission of a pre-built Snowflake DB that is shared with us will result in a FAILURE. At this time, we must run our certification process completely separately from the shared instances that Snowflake DB offers.
Termination from the VTCP Program
Three (3) consecutive failures will result in immediate termination of participation in the VTCP program for the current year only. The vendor must then wait until the new year to reapply for the program.
Three consecutive (year-over-year) failures will result in a five-year ban from participation in the VTCP program.
Frequently Asked Questions
Can I download an example ZIP file?
Yes, but only after the vendor has been accepted to participate in the current year's VTCP program.
Do the example files represent the actual test cases?
No. The example files are just that: examples. The only exception is the CONFIG file, which will be a complete example.
Do I have to resubmit for re-certification?
Yes. Every certification will expire, and prior to expiration the vendor is required to resubmit for certification renewal.
Why does the vendor have to resubmit?
Vendors are constantly upgrading their tools, adding features, changing functionality. We require recertification to ensure the ongoing consistency and commitment to quality, along with adherence to the latest standards.
Is there only one submission during the VTCP?
No. The VTCP program is set up in three phases. The first phase tests basic standards; the second augments that by adding a timeline in which to make changes and resubmit; the third is also an iteration test, examining turnaround time in addition to upkeep of the standards.