Data types

Understanding Prequel data types

Because data from any source can be synced to any destination, and because data types are not necessarily synonymous across data stores, a common mapping must be used to specify the source data type and predict the destination data type. As part of Prequel configuration, the expected data type must be defined ahead of time: for Export, this means specifying the data type the source will export; for Import, it means specifying the data type to enforce in the destination.
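
As an illustration of what declaring types ahead of time can look like, the sketch below describes a source table whose columns are annotated with the Prequel types they are expected to export (or, for Import, to enforce in the destination). It is a minimal, hypothetical sketch: the field names (`model_name`, `columns`, `data_type`) are illustrative placeholders, not the exact Prequel configuration schema.

```python
# Hypothetical sketch of declaring expected data types ahead of time.
# Field names below ("model_name", "columns", "data_type") are illustrative
# placeholders, not the exact Prequel configuration schema.
users_model = {
    "model_name": "users",
    "columns": [
        # For Export: the Prequel type each column is expected to export.
        # For Import: the Prequel type to enforce in the destination.
        {"name": "id",         "data_type": "string"},
        {"name": "created_at", "data_type": "timestamp"},
        {"name": "is_active",  "data_type": "boolean"},
        {"name": "metadata",   "data_type": "json"},
    ],
}
```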

Source type mapping

If you are configuring a source and need to decide which Prequel type to use, refer to the table below.

| Prequel Type | Supported | Athena | BigQuery | Clickhouse | Databricks | MongoDB | MySQL | Postgres | Redshift | Snowflake | SQL Server |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| bytes | ✓ | binary | bytes | blob, mediumblob, longblob | binary | binData | binary, varbinary, blob, mediumblob, longblob | bytea | VARBYTE, VARBINARY, BINARY VARYING | binary, varbinary | binary, image, varbinary |
| string | ✓ | char, varchar, string | string | string, fixedstring, text, mediumtext, longtext, varchar | string | String, ObjectId | string, char, text, longtext, varchar | char, character, character varying, text, uuid, varchar | CHAR, CHARACTER, NCHAR, BPCHAR, CHARACTER VARYING, NVARCHAR, TEXT | CHAR, CHARACTER, NCHAR, BPCHAR, CHARACTER VARYING, NVARCHAR, TEXT | char, varchar, text, nvarchar, ntext, nchar |
| boolean | ✓ | boolean | bool | bool, boolean | boolean | Boolean | boolean | boolean | boolean | boolean | bit |
| integer | ✓ | smallint, integer | smallint, int64, bigint | int, int8, int16, int32 | smallint, int | Int32 | smallint, mediumint, int | smallint, integer | smallint, integer | smallint, integer | smallint, int |
| bigint | ✓ | bigint | int64, bigint | int64, bigint | bigint | Int64, Long | bigint | bigint | bigint | bigint | bigint |
| decimal | ✓ | decimal | decimal, numeric | decimal, numeric | decimal, dec, numeric | Decimal128 | decimal, numeric | decimal, numeric | decimal, numeric | decimal, numeric | decimal, numeric |
| float | ✓ | real, float | float64 | float32, float64, double | float, double | Double | float, double | real, double precision | real, float4, float8, double precision | real, float4, float8, double precision | float, real |
| timestamp | ✓ | timestamp | timestamp | datetime, Datetime64 | timestamp | Date, Timestamp | timestamp | timestamp, timestamptz | timestamp, timestamptz | timestamp, timestamp_ntz, timestamp_tz, timestamp_ltz | datetime2 |
| date | ✓ | date | date | date | date | | date | date | date | date | date |
| json | ✓ | varchar, string | json | string* | string | Object | json | json, jsonb | super | varchar, variant | nvarchar(MAX) |
| time* | ✗ | varchar, string | time | time | string | | time | time | time | time | time |

* An asterisk indicates partial support. Ask us about any specific data type limitations.
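
To make the table concrete, the sketch below hard-codes a few entries from the Postgres column above as a lookup from native source types to the Prequel type you would declare for them. It is illustrative only and not part of the Prequel API.

```python
# Illustrative lookup from a few native Postgres source types to the Prequel
# type to declare for them, taken from the Postgres column of the table above.
# Not part of the Prequel API.
POSTGRES_TO_PREQUEL = {
    "bytea": "bytes",
    "text": "string",
    "uuid": "string",
    "boolean": "boolean",
    "integer": "integer",
    "bigint": "bigint",
    "numeric": "decimal",
    "double precision": "float",
    "timestamptz": "timestamp",
    "date": "date",
    "jsonb": "json",
}

def prequel_type_for(postgres_type: str) -> str:
    """Return the Prequel type to declare for a native Postgres column type."""
    return POSTGRES_TO_PREQUEL[postgres_type.lower()]

print(prequel_type_for("timestamptz"))  # -> timestamp
```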

Destination type mapping

If you are predicting which data type your Recipient's destination will receive, refer to the table below.

| Prequel Type | Supported | Athena | BigQuery | Clickhouse | Databricks | MySQL | Postgres | Redshift | Snowflake | SQL Server |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| bytes | ✓ | string | bytes | blob | binary | binary | bytea | VARBYTE(MAX)* | binary | blob |
| string | ✓ | string | string | string | string | text | text | varchar(MAX) | text | nvarchar(MAX) |
| boolean | ✓ | boolean | bool | boolean | boolean | boolean | boolean | boolean | boolean | bit |
| integer | ✓ | integer | int64 | integer | int | int | integer | bigint | integer | int |
| bigint | ✓ | bigint | bigint | bigint | bigint | bigint | bigint | bigint | bigint | bigint |
| decimal | ✓ | decimal | decimal | decimal | decimal | decimal | decimal | decimal | decimal | decimal |
| float | ✓ | float | float64 | double | double | double | double precision | double precision | float | float |
| timestamp | ✓ | timestamp | timestamp | Datetime64 | timestamp | timestamp | timestamptz | timestamptz | timestamp_tz | datetime2 |
| date | ✓ | date | date | date | date | date | date | date | date | date |
| json | ✓ | string | json | string* | string | json | jsonb | super | variant | nvarchar(MAX) |

* An asterisk indicates partial or incomplete support. Ask us about any specific data type limitations.
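
For example, reading down the Snowflake column above lets you predict the column types a Snowflake destination will receive for a given set of Prequel types. The sketch below is illustrative only and not part of the Prequel API.

```python
# Illustrative prediction of the column types a Snowflake destination receives,
# taken from the Snowflake column of the destination table above.
# Not part of the Prequel API.
PREQUEL_TO_SNOWFLAKE = {
    "bytes": "binary",
    "string": "text",
    "boolean": "boolean",
    "integer": "integer",
    "bigint": "bigint",
    "decimal": "decimal",
    "timestamp": "timestamp_tz",
    "date": "date",
    "json": "variant",
}

# Columns declared with Prequel types (see the sketch near the top of this page).
declared_columns = {"id": "string", "amount": "decimal", "created_at": "timestamp"}

predicted = {name: PREQUEL_TO_SNOWFLAKE[ptype] for name, ptype in declared_columns.items()}
print(predicted)  # -> {'id': 'text', 'amount': 'decimal', 'created_at': 'timestamp_tz'}
```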