Execute a database query and cache it locally as a Feather file and remotely as a Parquet file.
Cache storage locations are managed by the tessilake.depths options.
The database connection is defined by an ODBC profile whose name is set by the tessilake.tessitura option.
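A minimal configuration sketch, assuming the option names above; the ODBC profile name, cache paths, and the list shape of the tessilake.depths value are hypothetical, not documented behavior.

# Hypothetical setup: profile name and cache locations are illustrative only,
# and the list structure of tessilake.depths is an assumption.
options(
  tessilake.tessitura = "tessitura_prod",
  tessilake.depths = list(local = "~/tessilake_cache",   # Feather cache location (assumed)
                          remote = "/srv/tessilake")     # Parquet cache location (assumed)
)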
Usage
read_sql(
  query,
  name = digest::sha1(query),
  select = NULL,
  primary_keys = NULL,
  date_column = NULL,
  freshness = as.difftime(7, units = "days"),
  incremental = TRUE,
  ...
)

read_sql_table(
  table_name,
  schema = "dbo",
  select = NULL,
  primary_keys = NULL,
  date_column = NULL,
  freshness = as.difftime(7, units = "days"),
  incremental = TRUE,
  ...
)
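A brief usage sketch; the query, table, and column names below are hypothetical and only illustrate how the two calls are parameterized.

# Cache a query locally (Feather) and remotely (Parquet); repeat calls within
# `freshness` reuse the cached copy. Query and column names are hypothetical.
customers <- read_sql(
  "SELECT customer_no, last_update_dt FROM T_CUSTOMER",
  name = "customers",
  primary_keys = "customer_no",
  date_column = "last_update_dt",
  freshness = as.difftime(1, units = "days")
)

# Read a whole table; primary keys are identified from SQL metadata when possible.
orders <- read_sql_table("T_ORDER", schema = "dbo")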
Arguments
- query
character, query to run on the database.
- name
name of the query, defaults to the SHA1 hash of the query.
- select
character vector of column names to select from the database.
- primary_keys
character vector, primary keys of the table.
read_sql_table will attempt to identify the primary keys using SQL metadata.
- date_column
character, date column of the table recording when each row was last updated. Defaults to "last_update_dt" if it exists in the table.
- freshness
difftime, the returned data will be at least this fresh.
- incremental
logical, whether to load data incrementally. Default is TRUE.
- ...
Arguments passed on to write_cache.
- table_name
character, table name without schema.
- schema
character, database schema. Default is "dbo".
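To sketch how select, incremental, and freshness interact, a hypothetical full reload is shown below; the table and column names, and the use of a zero freshness to force a database hit, are assumptions rather than documented behavior.

# Hypothetical full (non-incremental) reload of two columns.
read_sql_table(
  "T_CUSTOMER",                                  # hypothetical table name
  select = c("customer_no", "last_update_dt"),   # hypothetical column names
  incremental = FALSE,                           # rebuild the cache rather than append
  freshness = as.difftime(0, units = "days")     # treat any cached copy as stale (assumed)
)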