commit_prepared_transaction(transaction_id, opts=OPTS)
Commit an existing prepared transaction with the given transaction
identifier string.
def commit_prepared_transaction(transaction_id, opts=OPTS)
run("COMMIT PREPARED #{literal(transaction_id)}", opts)
end
create_function(name, definition, opts=OPTS)
Creates the function in the database. Arguments:
- name : name of the function to create
- definition : string definition of the function, or object file for a dynamically loaded C function.
- opts : options hash:
  - :args : function arguments, can be either a symbol or string specifying a type or an array of 1-3 elements:
    - 1 : argument data type
    - 2 : argument name
    - 3 : argument mode (e.g. in, out, inout)
  - :behavior : Should be IMMUTABLE, STABLE, or VOLATILE. PostgreSQL assumes VOLATILE by default.
  - :cost : The estimated cost of the function, used by the query planner.
  - :language : The language the function uses. SQL is the default.
  - :link_symbol : For a dynamically loaded C function, the function's link symbol if different from the definition argument.
  - :returns : The data type returned by the function. If you are using OUT or INOUT argument modes, this is ignored. Otherwise, if this is not specified, void is used by default to specify that the function is not supposed to return a value.
  - :rows : The estimated number of rows the function will return. Only use if the function returns SETOF something.
  - :security_definer : Makes the privileges of the function the same as the privileges of the user who defined the function instead of the privileges of the user who runs the function. There are security implications when doing this, see the PostgreSQL documentation.
  - :set : Configuration variables to set while the function is being run, can be a hash or an array of two pairs. search_path is often used here if :security_definer is used.
  - :strict : Makes the function return NULL when any argument is NULL.
def create_function(name, definition, opts=OPTS)
self << create_function_sql(name, definition, opts)
end
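As an illustration, the SQL this method emits has roughly the following shape (a sketch only; the function name and body below are hypothetical, and quoting is handled by create_function_sql, shown later in this page):

```ruby
# Hypothetical example of the statement shape create_function produces
# for a plpgsql trigger function; not the adapter's exact output.
sql = <<~SQL
  CREATE OR REPLACE FUNCTION set_updated_at()
  RETURNS trigger
  LANGUAGE plpgsql
  AS 'BEGIN NEW.updated_at := now(); RETURN NEW; END;'
SQL
```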
create_language(name, opts=OPTS)
Create the procedural language in the database. Arguments:
- name : Name of the procedural language (e.g. plpgsql)
- opts : options hash:
  - :handler : The name of a previously registered function used as a call handler for this language.
  - :replace : Replace the installed language if it already exists (on PostgreSQL 9.0+).
  - :trusted : Marks the language being created as trusted, allowing unprivileged users to create functions using this language.
  - :validator : The name of a previously registered function used as a validator of functions defined in this language.
def create_language(name, opts=OPTS)
self << create_language_sql(name, opts)
end
create_schema(name, opts=OPTS)
Create a schema in the database. Arguments:
- name : Name of the schema (e.g. admin)
- opts : options hash:
  - :if_not_exists : Don't raise an error if the schema already exists (PostgreSQL 9.3+).
  - :owner : The owner to set for the schema (defaults to current user if not specified).
def create_schema(name, opts=OPTS)
self << create_schema_sql(name, opts)
end
create_trigger(table, name, function, opts=OPTS)
Create a trigger in the database. Arguments:
- table : the table on which this trigger operates
- name : the name of this trigger
- function : the function to call for this trigger, which should return type trigger.
- opts : options hash:
  - :after : Calls the trigger after execution instead of before.
  - :args : An argument or array of arguments to pass to the function.
  - :each_row : Calls the trigger for each row instead of for each statement.
  - :events : Can be :insert, :update, :delete, or an array of any of those. Calls the trigger whenever that type of statement is used. By default, the trigger is called for insert, update, or delete.
  - :when : A filter to use for the trigger.
def create_trigger(table, name, function, opts=OPTS)
self << create_trigger_sql(table, name, function, opts)
end
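A simplified mirror of how these options combine into the trigger statement (compare create_trigger_sql later in this page; identifier quoting and the :when filter are omitted, and the names below are hypothetical):

```ruby
# Simplified sketch of create_trigger_sql: events map to INSERT/UPDATE/DELETE
# joined with OR, :after picks AFTER over BEFORE, :each_row adds FOR EACH ROW.
def trigger_sql(table, name, function, events: [:insert, :update, :delete],
                after: false, each_row: false)
  whence = after ? 'AFTER' : 'BEFORE'
  ev = events.map { |e| e.to_s.upcase }.join(' OR ')
  "CREATE TRIGGER #{name} #{whence} #{ev} ON #{table}" \
    "#{' FOR EACH ROW' if each_row} EXECUTE PROCEDURE #{function}()"
end

trigger_sql(:items, :items_audit, :audit_fn, after: true, each_row: true)
# => "CREATE TRIGGER items_audit AFTER INSERT OR UPDATE OR DELETE ON items FOR EACH ROW EXECUTE PROCEDURE audit_fn()"
```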
database_type()
PostgreSQL uses the :postgres database type.
def database_type
:postgres
end
do(code, opts=OPTS)
Use PostgreSQL's DO syntax to execute an anonymous code block. The code should be the literal code string to use in the underlying procedural language. Options:
- :language : The procedural language the code is written in. The PostgreSQL default is plpgsql. Can be specified as a string or a symbol.
def do(code, opts=OPTS)
language = opts[:language]
run "DO #{"LANGUAGE #{literal(language.to_s)} " if language}#{literal(code)}"
end
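The interpolation in the source above can be sketched standalone (the real code quotes with literal(); single-quote doubling is used here as an approximation):

```ruby
# Simplified mirror of the string built by Database#do above.
def do_sql(code, language = nil)
  quoted = "'#{code.gsub("'", "''")}'"  # approximate literal() string quoting
  "DO #{"LANGUAGE '#{language}' " if language}#{quoted}"
end

do_sql("BEGIN PERFORM 1; END", :plpgsql)
# => "DO LANGUAGE 'plpgsql' 'BEGIN PERFORM 1; END'"
```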
drop_function(name, opts=OPTS)
Drops the function from the database. Arguments:
- name : name of the function to drop
- opts : options hash:
  - :args : The arguments for the function. See create_function_sql.
  - :cascade : Drop other objects depending on this function.
  - :if_exists : Don't raise an error if the function doesn't exist.
def drop_function(name, opts=OPTS)
self << drop_function_sql(name, opts)
end
drop_language(name, opts=OPTS)
Drops a procedural language from the database. Arguments:
- name : name of the procedural language to drop
- opts : options hash:
  - :cascade : Drop other objects depending on this language.
  - :if_exists : Don't raise an error if the language doesn't exist.
def drop_language(name, opts=OPTS)
self << drop_language_sql(name, opts)
end
drop_schema(name, opts=OPTS)
Drops a schema from the database. Arguments:
- name : name of the schema to drop
- opts : options hash:
  - :cascade : Drop all objects in this schema.
  - :if_exists : Don't raise an error if the schema doesn't exist.
def drop_schema(name, opts=OPTS)
self << drop_schema_sql(name, opts)
end
drop_trigger(table, name, opts=OPTS)
Drops a trigger from the database. Arguments:
- table : table from which to drop the trigger
- name : name of the trigger to drop
- opts : options hash:
  - :cascade : Drop other objects depending on this trigger.
  - :if_exists : Don't raise an error if the trigger doesn't exist.
def drop_trigger(table, name, opts=OPTS)
self << drop_trigger_sql(table, name, opts)
end
foreign_key_list(table, opts=OPTS)
Return full foreign key information using the pg system tables, including
:name, :on_delete, :on_update, and :deferrable entries in the hashes.
def foreign_key_list(table, opts=OPTS)
m = output_identifier_meth
schema, _ = opts.fetch(:schema, schema_and_table(table))
range = 0...32
base_ds = metadata_dataset.
from(:pg_constraint___co).
join(:pg_class___cl, :oid=>:conrelid).
where(:cl__relkind=>'r', :co__contype=>'f', :cl__oid=>regclass_oid(table))
ds = base_ds.
join(:pg_attribute___att, :attrelid=>:oid, :attnum=>SQL::Function.new(:ANY, :co__conkey)).
order(:co__conname, SQL::CaseExpression.new(range.map{|x| [SQL::Subscript.new(:co__conkey, [x]), x]}, 32, :att__attnum)).
select(:co__conname___name, :att__attname___column, :co__confupdtype___on_update, :co__confdeltype___on_delete,
SQL::BooleanExpression.new(:AND, :co__condeferrable, :co__condeferred).as(:deferrable))
ref_ds = base_ds.
join(:pg_class___cl2, :oid=>:co__confrelid).
join(:pg_attribute___att2, :attrelid=>:oid, :attnum=>SQL::Function.new(:ANY, :co__confkey)).
order(:co__conname, SQL::CaseExpression.new(range.map{|x| [SQL::Subscript.new(:co__conkey, [x]), x]}, 32, :att2__attnum)).
select(:co__conname___name, :cl2__relname___table, :att2__attname___refcolumn)
if schema
ref_ds = ref_ds.join(:pg_namespace___nsp2, :oid=>:cl2__relnamespace).
select_more(:nsp2__nspname___schema)
end
h = {}
fklod_map = FOREIGN_KEY_LIST_ON_DELETE_MAP
ds.each do |row|
if r = h[row[:name]]
r[:columns] << m.call(row[:column])
else
h[row[:name]] = {:name=>m.call(row[:name]), :columns=>[m.call(row[:column])], :on_update=>fklod_map[row[:on_update]], :on_delete=>fklod_map[row[:on_delete]], :deferrable=>row[:deferrable]}
end
end
ref_ds.each do |row|
r = h[row[:name]]
r[:table] ||= schema ? SQL::QualifiedIdentifier.new(m.call(row[:schema]), m.call(row[:table])) : m.call(row[:table])
r[:key] ||= []
r[:key] << m.call(row[:refcolumn])
end
h.values
end
indexes(table, opts=OPTS)
Use the pg_* system tables to determine indexes on a table
def indexes(table, opts=OPTS)
m = output_identifier_meth
range = 0...32
attnums = server_version >= 80100 ? SQL::Function.new(:ANY, :ind__indkey) : range.map{|x| SQL::Subscript.new(:ind__indkey, [x])}
ds = metadata_dataset.
from(:pg_class___tab).
join(:pg_index___ind, :indrelid=>:oid).
join(:pg_class___indc, :oid=>:indexrelid).
join(:pg_attribute___att, :attrelid=>:tab__oid, :attnum=>attnums).
left_join(:pg_constraint___con, :conname=>:indc__relname).
filter(:indc__relkind=>'i', :ind__indisprimary=>false, :indexprs=>nil, :indpred=>nil, :indisvalid=>true, :tab__oid=>regclass_oid(table, opts)).
order(:indc__relname, SQL::CaseExpression.new(range.map{|x| [SQL::Subscript.new(:ind__indkey, [x]), x]}, 32, :att__attnum)).
select(:indc__relname___name, :ind__indisunique___unique, :att__attname___column, :con__condeferrable___deferrable)
ds.filter!(:indisready=>true, :indcheckxmin=>false) if server_version >= 80300
indexes = {}
ds.each do |r|
i = indexes[m.call(r[:name])] ||= {:columns=>[], :unique=>r[:unique], :deferrable=>r[:deferrable]}
i[:columns] << m.call(r[:column])
end
indexes
end
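The grouping loop at the end of the method can be illustrated standalone with hypothetical row data: rows sharing an index name are merged into one entry whose :columns array preserves column order.

```ruby
# Standalone illustration of the grouping in #indexes (hypothetical rows).
rows = [
  {name: :items_n_idx, column: :name, unique: false, deferrable: nil},
  {name: :items_n_idx, column: :id,   unique: false, deferrable: nil}
]
indexes = {}
rows.each do |r|
  i = indexes[r[:name]] ||= {columns: [], unique: r[:unique], deferrable: r[:deferrable]}
  i[:columns] << r[:column]
end
indexes
# => {:items_n_idx=>{:columns=>[:name, :id], :unique=>false, :deferrable=>nil}}
```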
locks()
Dataset containing all current database locks
def locks
dataset.from(:pg_class).join(:pg_locks, :relation=>:relfilenode).select(:pg_class__relname, Sequel::SQL::ColumnAll.new(:pg_locks))
end
notify(channel, opts=OPTS)
Notifies the given channel. See the PostgreSQL NOTIFY documentation. Options:
- :payload : The payload string to use for the NOTIFY statement. Only supported in PostgreSQL 9.0+.
- :server : The server to which to send the NOTIFY statement, if the sharding support is being used.
def notify(channel, opts=OPTS)
sql = "NOTIFY "
dataset.send(:identifier_append, sql, channel)
if payload = opts[:payload]
sql << ", "
dataset.literal_append(sql, payload.to_s)
end
execute_ddl(sql, opts)
end
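The statement this builds can be sketched without the adapter's identifier quoting (channel and payload below are hypothetical):

```ruby
# Simplified mirror of the NOTIFY statement built above.
def notify_sql(channel, payload = nil)
  sql = "NOTIFY #{channel}"
  sql << ", '#{payload}'" if payload  # payload literal appended after a comma
  sql
end

notify_sql(:my_channel, "hello")
# => "NOTIFY my_channel, 'hello'"
```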
primary_key(table, opts=OPTS)
Return primary key for the given table.
def primary_key(table, opts=OPTS)
quoted_table = quote_schema_table(table)
Sequel.synchronize{return @primary_keys[quoted_table] if @primary_keys.has_key?(quoted_table)}
sql = "#{SELECT_PK_SQL} AND pg_class.oid = #{literal(regclass_oid(table, opts))}"
value = fetch(sql).single_value
Sequel.synchronize{@primary_keys[quoted_table] = value}
end
primary_key_sequence(table, opts=OPTS)
Return the sequence providing the default for the primary key for the given
table.
def primary_key_sequence(table, opts=OPTS)
quoted_table = quote_schema_table(table)
Sequel.synchronize{return @primary_key_sequences[quoted_table] if @primary_key_sequences.has_key?(quoted_table)}
sql = "#{SELECT_SERIAL_SEQUENCE_SQL} AND t.oid = #{literal(regclass_oid(table, opts))}"
if pks = fetch(sql).single_record
value = literal(SQL::QualifiedIdentifier.new(pks[:schema], pks[:sequence]))
Sequel.synchronize{@primary_key_sequences[quoted_table] = value}
else
sql = "#{SELECT_CUSTOM_SEQUENCE_SQL} AND t.oid = #{literal(regclass_oid(table, opts))}"
if pks = fetch(sql).single_record
value = literal(SQL::QualifiedIdentifier.new(pks[:schema], LiteralString.new(pks[:sequence])))
Sequel.synchronize{@primary_key_sequences[quoted_table] = value}
end
end
end
refresh_view(name, opts=OPTS)
Refresh the materialized view with the given name.
DB.refresh_view(:items_view)
DB.refresh_view(:items_view, :concurrently=>true)
def refresh_view(name, opts=OPTS)
run "REFRESH MATERIALIZED VIEW#{' CONCURRENTLY' if opts[:concurrently]} #{quote_schema_table(name)}"
end
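The statement run above, sketched without identifier quoting:

```ruby
# Sketch of the REFRESH MATERIALIZED VIEW statement #refresh_view runs.
def refresh_view_sql(name, concurrently: false)
  "REFRESH MATERIALIZED VIEW#{' CONCURRENTLY' if concurrently} #{name}"
end

refresh_view_sql(:items_view, concurrently: true)
# => "REFRESH MATERIALIZED VIEW CONCURRENTLY items_view"
```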
reset_conversion_procs()
Reset the database's conversion procs. Requires a server query if there are any named types.
def reset_conversion_procs
@conversion_procs = get_conversion_procs
conversion_procs_updated
@conversion_procs
end
reset_primary_key_sequence(table)
Reset the primary key sequence for the given table, basing it on the
maximum current value of the table's primary key.
def reset_primary_key_sequence(table)
return unless seq = primary_key_sequence(table)
pk = SQL::Identifier.new(primary_key(table))
db = self
seq_ds = db.from(LiteralString.new(seq))
s, t = schema_and_table(table)
table = Sequel.qualify(s, t) if s
get{setval(seq, db[table].select{coalesce(max(pk)+seq_ds.select{:increment_by}, seq_ds.select(:min_value))}, false)}
end
rollback_prepared_transaction(transaction_id, opts=OPTS)
Rollback an existing prepared transaction
with the given transaction identifier string.
def rollback_prepared_transaction(transaction_id, opts=OPTS)
run("ROLLBACK PREPARED #{literal(transaction_id)}", opts)
end
serial_primary_key_options()
PostgreSQL uses the SERIAL pseudo-type instead of AUTOINCREMENT for managing incrementing primary keys.
def serial_primary_key_options
{:primary_key => true, :serial => true, :type=>Integer}
end
server_version(server=nil)
The version of the PostgreSQL server, used for determining capability.
def server_version(server=nil)
return @server_version if @server_version
@server_version = synchronize(server) do |conn|
(conn.server_version rescue nil) if conn.respond_to?(:server_version)
end
unless @server_version
@server_version = if m = /PostgreSQL (\d+)\.(\d+)(?:(?:rc\d+)|\.(\d+))?/.match(fetch('SELECT version()').single_value)
(m[1].to_i * 10000) + (m[2].to_i * 100) + m[3].to_i
else
0
end
end
warn 'Sequel no longer supports PostgreSQL <8.2, some things may not work' if @server_version < 80200
@server_version
end
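The integer encoding used here is major*10000 + minor*100 + tiny, so version checks such as `server_version >= 90100` read as "9.1 or later". Shown standalone:

```ruby
# The version-integer encoding computed in #server_version above.
def encode_pg_version(major, minor, tiny = 0)
  major * 10_000 + minor * 100 + tiny
end

encode_pg_version(9, 2, 4)  # => 90204
encode_pg_version(9, 1)     # => 90100
```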
supports_create_table_if_not_exists?()
PostgreSQL supports CREATE TABLE IF NOT EXISTS on 9.1+
def supports_create_table_if_not_exists?
server_version >= 90100
end
supports_deferrable_constraints?()
PostgreSQL 9.0+ supports some types of deferrable constraints beyond
foreign key constraints.
def supports_deferrable_constraints?
server_version >= 90000
end
supports_deferrable_foreign_key_constraints?()
PostgreSQL supports deferrable foreign key constraints.
def supports_deferrable_foreign_key_constraints?
true
end
supports_drop_table_if_exists?()
PostgreSQL supports DROP TABLE IF EXISTS
def supports_drop_table_if_exists?
true
end
supports_partial_indexes?()
PostgreSQL supports partial indexes.
def supports_partial_indexes?
true
end
supports_prepared_transactions?()
PostgreSQL supports prepared transactions (two-phase commit) if
max_prepared_transactions is greater than 0.
def supports_prepared_transactions?
return @supports_prepared_transactions if defined?(@supports_prepared_transactions)
@supports_prepared_transactions = self['SHOW max_prepared_transactions'].get.to_i > 0
end
supports_savepoints?()
PostgreSQL supports savepoints
def supports_savepoints?
true
end
supports_transaction_isolation_levels?()
PostgreSQL supports transaction isolation levels
def supports_transaction_isolation_levels?
true
end
supports_transactional_ddl?()
PostgreSQL supports transactional DDL statements.
def supports_transactional_ddl?
true
end
supports_trigger_conditions?()
PostgreSQL 9.0+ supports trigger conditions.
def supports_trigger_conditions?
server_version >= 90000
end
tables(opts=OPTS, &block)
Array of symbols specifying table names in the current database. The dataset used is yielded to the block if one is provided, otherwise, an array of symbols of table names is returned. Options:
- :qualify : Return the tables as Sequel::SQL::QualifiedIdentifier instances, using the schema the table is located in as the qualifier.
- :schema : The schema to search
- :server : The server to use
def tables(opts=OPTS, &block)
pg_class_relname('r', opts, &block)
end
type_supported?(type)
Check whether the given type name string/symbol (e.g. :hstore) is supported
by the database.
def type_supported?(type)
@supported_types ||= {}
@supported_types.fetch(type){@supported_types[type] = (from(:pg_type).filter(:typtype=>'b', :typname=>type.to_s).count > 0)}
end
values(v)
Creates a dataset that uses the VALUES clause:

DB.values([[1, 2], [3, 4]])
# VALUES ((1, 2), (3, 4))

DB.values([[1, 2], [3, 4]]).order(:column2).limit(1, 1)
# VALUES ((1, 2), (3, 4)) ORDER BY column2 LIMIT 1 OFFSET 1
def values(v)
@default_dataset.clone(:values=>v)
end
views(opts=OPTS)
Array of symbols specifying view names in the current database. Options:
- :qualify : Return the views as Sequel::SQL::QualifiedIdentifier instances, using the schema the view is located in as the qualifier.
- :schema : The schema to search
- :server : The server to use
def views(opts=OPTS)
pg_class_relname('v', opts)
end
add_named_conversion_procs(procs, named_procs)
Do a type name-to-oid lookup using the database and update the procs with
the related proc if the database supports the type.
def add_named_conversion_procs(procs, named_procs)
unless (named_procs).empty?
convert_named_procs_to_procs(named_procs).each do |oid, pr|
procs[oid] ||= pr
end
conversion_procs_updated
end
end
alter_table_generator_class()
Use a PostgreSQL-specific alter table generator
def alter_table_generator_class
Postgres::AlterTableGenerator
end
alter_table_op_sql(table, op)
Handle :using option for set_column_type op, and the :validate_constraint
op.
Calls superclass method
def alter_table_op_sql(table, op)
case op[:op]
when :set_column_type
s = super
if using = op[:using]
using = Sequel::LiteralString.new(using) if using.is_a?(String)
s << ' USING '
s << literal(using)
end
s
when :validate_constraint
"VALIDATE CONSTRAINT #{quote_identifier(op[:name])}"
else
super
end
end
begin_new_transaction(conn, opts)
If the :synchronous option is given and non-nil, set synchronous_commit
appropriately. Valid values for the :synchronous option are true, :on,
false, :off, :local, and :remote_write.
Calls superclass method
def begin_new_transaction(conn, opts)
super
if opts.has_key?(:synchronous)
case sync = opts[:synchronous]
when true
sync = :on
when false
sync = :off
when nil
return
end
log_connection_execute(conn, "SET LOCAL synchronous_commit = #{sync}")
end
end
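The true/false to on/off mapping performed above can be shown standalone:

```ruby
# Standalone sketch of the :synchronous option mapping in
# begin_new_transaction: true => on, false => off, symbols pass through.
def synchronous_commit_sql(sync)
  sync = :on  if sync == true
  sync = :off if sync == false
  "SET LOCAL synchronous_commit = #{sync}"
end

synchronous_commit_sql(true)           # => "SET LOCAL synchronous_commit = on"
synchronous_commit_sql(:remote_write)  # => "SET LOCAL synchronous_commit = remote_write"
```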
begin_savepoint(conn, opts)
Set the READ ONLY transaction setting per savepoint, as PostgreSQL supports
that.
Calls superclass method
def begin_savepoint(conn, opts)
super
unless (read_only = opts[:read_only]).nil?
log_connection_execute(conn, "SET TRANSACTION READ #{read_only ? 'ONLY' : 'WRITE'}")
end
end
column_schema_normalize_default(default, type)
Handle PostgreSQL specific default format.
Calls superclass method
def column_schema_normalize_default(default, type)
if m = POSTGRES_DEFAULT_RE.match(default)
default = m[1] || m[2]
end
super(default, type)
end
combinable_alter_table_op?(op)
PostgreSQL can't combine rename_column operations, and it can combine
the custom validate_constraint operation.
Calls superclass method
def combinable_alter_table_op?(op)
(super || op[:op] == :validate_constraint) && op[:op] != :rename_column
end
commit_transaction(conn, opts=OPTS)
If the :prepare option is given and we aren't in a savepoint, prepare
the transaction for a two-phase commit.
Calls superclass method
def commit_transaction(conn, opts=OPTS)
if (s = opts[:prepare]) && savepoint_level(conn) <= 1
log_connection_execute(conn, "PREPARE TRANSACTION #{literal(s)}")
else
super
end
end
connection_configuration_sqls()
The SQL queries to execute when starting a new
connection.
def connection_configuration_sqls
sqls = []
sqls << "SET standard_conforming_strings = ON" if typecast_value_boolean(@opts.fetch(:force_standard_strings, Postgres.force_standard_strings))
if (cmm = @opts.fetch(:client_min_messages, Postgres.client_min_messages)) && !cmm.to_s.empty?
cmm = cmm.to_s.upcase.strip
unless VALID_CLIENT_MIN_MESSAGES.include?(cmm)
raise Error, "Unsupported client_min_messages setting: #{cmm}"
end
sqls << "SET client_min_messages = '#{cmm.to_s.upcase}'"
end
if search_path = @opts[:search_path]
case search_path
when String
search_path = search_path.split(",").map{|s| s.strip}
when Array
else
raise Error, "unrecognized value for :search_path option: #{search_path.inspect}"
end
sqls << "SET search_path = #{search_path.map{|s| "\"#{s.gsub('"', '""')}\""}.join(',')}"
end
sqls
end
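The search_path quoting at the end of the method, shown standalone: each schema name is double-quoted and any embedded double quotes are doubled, matching the source above.

```ruby
# Standalone version of the search_path handling in
# connection_configuration_sqls: a comma-separated string is split and
# stripped, then each schema is double-quoted with " doubled inside.
def search_path_sql(search_path)
  search_path = search_path.split(',').map(&:strip) if search_path.is_a?(String)
  "SET search_path = #{search_path.map { |s| "\"#{s.gsub('"', '""')}\"" }.join(',')}"
end

search_path_sql("admin, public")
# => "SET search_path = \"admin\",\"public\""
```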
constraint_definition_sql(constraint)
Handle exclusion constraints.
Calls superclass method
def constraint_definition_sql(constraint)
case constraint[:type]
when :exclude
elements = constraint[:elements].map{|c, op| "#{literal(c)} WITH #{op}"}.join(', ')
sql = "#{"CONSTRAINT #{quote_identifier(constraint[:name])} " if constraint[:name]}EXCLUDE USING #{constraint[:using]||'gist'} (#{elements})#{" WHERE #{filter_expr(constraint[:where])}" if constraint[:where]}"
constraint_deferrable_sql_append(sql, constraint[:deferrable])
sql
when :foreign_key, :check
sql = super
if constraint[:not_valid]
sql << " NOT VALID"
end
sql
else
super
end
end
conversion_procs_updated()
Callback used when conversion procs are updated.
def conversion_procs_updated
nil
end
convert_named_procs_to_procs(named_procs)
Convert the hash of named conversion procs into a hash of oid conversion procs.
def convert_named_procs_to_procs(named_procs)
h = {}
from(:pg_type).where(:typtype=>'b', :typname=>named_procs.keys.map{|t| t.to_s}).select_map([:oid, :typname]).each do |oid, name|
h[oid.to_i] = named_procs[name.untaint.to_sym]
end
h
end
copy_conversion_procs(oids)
Copy the conversion procs related to the given oids from PG_TYPES into the
conversion procs for this instance.
def copy_conversion_procs(oids)
procs = conversion_procs
oids.each do |oid|
procs[oid] = PG_TYPES[oid]
end
conversion_procs_updated
end
copy_into_sql(table, opts)
SQL for doing fast table insert from stdin.
def copy_into_sql(table, opts)
sql = "COPY #{literal(table)}"
if cols = opts[:columns]
sql << literal(Array(cols))
end
sql << " FROM STDIN"
if opts[:options] || opts[:format]
sql << " ("
sql << "FORMAT #{opts[:format]}" if opts[:format]
sql << "#{', ' if opts[:format]}#{opts[:options]}" if opts[:options]
sql << ')'
end
sql
end
copy_table_sql(table, opts)
SQL for doing fast table output to stdout.
def copy_table_sql(table, opts)
if table.is_a?(String)
table
else
if opts[:options] || opts[:format]
options = " ("
options << "FORMAT #{opts[:format]}" if opts[:format]
options << "#{', ' if opts[:format]}#{opts[:options]}" if opts[:options]
options << ')'
end
table = if table.is_a?(::Sequel::Dataset)
"(#{table.sql})"
else
literal(table)
end
"COPY #{table} TO STDOUT#{options}"
end
end
create_function_sql(name, definition, opts=OPTS)
SQL statement to create database function.
def create_function_sql(name, definition, opts=OPTS)
args = opts[:args]
if !opts[:args].is_a?(Array) || !opts[:args].any?{|a| Array(a).length == 3 and %w'OUT INOUT'.include?(a[2].to_s)}
returns = opts[:returns] || 'void'
end
language = opts[:language] || 'SQL'
<<-END
CREATE#{' OR REPLACE' if opts[:replace]} FUNCTION #{name}#{sql_function_args(args)}
#{"RETURNS #{returns}" if returns}
LANGUAGE #{language}
#{opts[:behavior].to_s.upcase if opts[:behavior]}
#{'STRICT' if opts[:strict]}
#{'SECURITY DEFINER' if opts[:security_definer]}
#{"COST #{opts[:cost]}" if opts[:cost]}
#{"ROWS #{opts[:rows]}" if opts[:rows]}
#{opts[:set].map{|k,v| " SET #{k} = #{v}"}.join("\n") if opts[:set]}
AS #{literal(definition.to_s)}#{", #{literal(opts[:link_symbol].to_s)}" if opts[:link_symbol]}
END
end
create_language_sql(name, opts=OPTS)
SQL for creating a procedural language.
def create_language_sql(name, opts=OPTS)
"CREATE#{' OR REPLACE' if opts[:replace] && server_version >= 90000}#{' TRUSTED' if opts[:trusted]} LANGUAGE #{name}#{" HANDLER #{opts[:handler]}" if opts[:handler]}#{" VALIDATOR #{opts[:validator]}" if opts[:validator]}"
end
create_schema_sql(name, opts=OPTS)
SQL for creating a schema.
def create_schema_sql(name, opts=OPTS)
"CREATE SCHEMA #{'IF NOT EXISTS ' if opts[:if_not_exists]}#{quote_identifier(name)}#{" AUTHORIZATION #{literal(opts[:owner])}" if opts[:owner]}"
end
create_table_as_sql(name, sql, options)
def create_table_as_sql(name, sql, options)
result = create_table_prefix_sql name, options
if on_commit = options[:on_commit]
result << " ON COMMIT #{ON_COMMIT[on_commit]}"
end
result << " AS #{sql}"
end
create_table_generator_class()
Use a PostgreSQL-specific create table generator
def create_table_generator_class
Postgres::CreateTableGenerator
end
create_table_prefix_sql(name, options)
DDL statement for creating a table with the given name, columns, and
options
def create_table_prefix_sql(name, options)
prefix_sql = if options[:temp]
raise(Error, "can't provide both :temp and :unlogged to create_table") if options[:unlogged]
raise(Error, "can't provide both :temp and :foreign to create_table") if options[:foreign]
temporary_table_sql
elsif options[:foreign]
raise(Error, "can't provide both :foreign and :unlogged to create_table") if options[:unlogged]
'FOREIGN '
elsif options[:unlogged]
UNLOGGED
end
"CREATE #{prefix_sql}TABLE#{' IF NOT EXISTS' if options[:if_not_exists]} #{options[:temp] ? quote_identifier(name) : quote_schema_table(name)}"
end
create_table_sql(name, generator, options)
Calls superclass method
def create_table_sql(name, generator, options)
sql = super
if inherits = options[:inherits]
sql << " INHERITS (#{Array(inherits).map{|t| quote_schema_table(t)}.join(', ')})"
end
if on_commit = options[:on_commit]
raise(Error, "can't provide :on_commit without :temp to create_table") unless options[:temp]
raise(Error, "unsupported on_commit option: #{on_commit.inspect}") unless ON_COMMIT.has_key?(on_commit)
sql << " ON COMMIT #{ON_COMMIT[on_commit]}"
end
if server = options[:foreign]
sql << " SERVER #{quote_identifier(server)}"
if foreign_opts = options[:options]
sql << " OPTIONS (#{foreign_opts.map{|k, v| "#{k} #{literal(v.to_s)}"}.join(', ')})"
end
end
sql
end
create_trigger_sql(table, name, function, opts=OPTS)
SQL for creating a database trigger.
def create_trigger_sql(table, name, function, opts=OPTS)
events = opts[:events] ? Array(opts[:events]) : [:insert, :update, :delete]
whence = opts[:after] ? 'AFTER' : 'BEFORE'
if filter = opts[:when]
raise Error, "Trigger conditions are not supported for this database" unless supports_trigger_conditions?
filter = " WHEN #{filter_expr(filter)}"
end
"CREATE TRIGGER #{name} #{whence} #{events.map{|e| e.to_s.upcase}.join(' OR ')} ON #{quote_schema_table(table)}#{' FOR EACH ROW' if opts[:each_row]}#{filter} EXECUTE PROCEDURE #{function}(#{Array(opts[:args]).map{|a| literal(a)}.join(', ')})"
end
create_view_prefix_sql(name, options)
DDL fragment for initial part of CREATE VIEW statement
def create_view_prefix_sql(name, options)
create_view_sql_append_columns("CREATE #{'OR REPLACE 'if options[:replace]}#{'TEMPORARY 'if options[:temp]}#{'RECURSIVE ' if options[:recursive]}#{'MATERIALIZED ' if options[:materialized]}VIEW #{quote_schema_table(name)}", options[:columns] || options[:recursive])
end
database_error_classes()
The error classes that the main adapters can raise; depends on the adapter being used.
def database_error_classes
CONVERTED_EXCEPTIONS
end
database_error_regexps()
def database_error_regexps
DATABASE_ERROR_REGEXPS
end
database_specific_error_class_from_sqlstate(sqlstate)
Calls superclass method
def database_specific_error_class_from_sqlstate(sqlstate)
if sqlstate == EXCLUSION_CONSTRAINT_SQL_STATE
ExclusionConstraintViolation
elsif sqlstate == DEADLOCK_SQL_STATE
SerializationFailure
else
super
end
end
drop_function_sql(name, opts=OPTS)
SQL for dropping a function from the database.
def drop_function_sql(name, opts=OPTS)
"DROP FUNCTION#{' IF EXISTS' if opts[:if_exists]} #{name}#{sql_function_args(opts[:args])}#{' CASCADE' if opts[:cascade]}"
end
drop_index_sql(table, op)
Support :if_exists, :cascade, and :concurrently options.
def drop_index_sql(table, op)
sch, _ = schema_and_table(table)
"DROP INDEX#{' CONCURRENTLY' if op[:concurrently]}#{' IF EXISTS' if op[:if_exists]} #{"#{quote_identifier(sch)}." if sch}#{quote_identifier(op[:name] || default_index_name(table, op[:columns]))}#{' CASCADE' if op[:cascade]}"
end
drop_language_sql(name, opts=OPTS)
SQL for dropping a procedural language from the
database.
def drop_language_sql(name, opts=OPTS)
"DROP LANGUAGE#{' IF EXISTS' if opts[:if_exists]} #{name}#{' CASCADE' if opts[:cascade]}"
end
drop_schema_sql(name, opts=OPTS)
SQL for dropping a schema from the database.
def drop_schema_sql(name, opts=OPTS)
"DROP SCHEMA#{' IF EXISTS' if opts[:if_exists]} #{quote_identifier(name)}#{' CASCADE' if opts[:cascade]}"
end
drop_table_sql(name, options)
Support :foreign tables
def drop_table_sql(name, options)
"DROP#{' FOREIGN' if options[:foreign]} TABLE#{' IF EXISTS' if options[:if_exists]} #{quote_schema_table(name)}#{' CASCADE' if options[:cascade]}"
end
drop_trigger_sql(table, name, opts=OPTS)
SQL for dropping a trigger from the database.
def drop_trigger_sql(table, name, opts=OPTS)
"DROP TRIGGER#{' IF EXISTS' if opts[:if_exists]} #{name} ON #{quote_schema_table(table)}#{' CASCADE' if opts[:cascade]}"
end
drop_view_sql(name, opts=OPTS)
SQL for dropping a view from the database.
def drop_view_sql(name, opts=OPTS)
"DROP #{'MATERIALIZED ' if opts[:materialized]}VIEW#{' IF EXISTS' if opts[:if_exists]} #{quote_schema_table(name)}#{' CASCADE' if opts[:cascade]}"
end
filter_schema(ds, opts)
If opts includes a :schema option, or a default schema is used, restrict
the dataset to that schema. Otherwise, just exclude the default PostgreSQL
schemas except for public.
def filter_schema(ds, opts)
expr = if schema = opts[:schema]
schema.to_s
else
Sequel.function(:any, Sequel.function(:current_schemas, false))
end
ds.where(:pg_namespace__nspname=>expr)
end
get_conversion_procs()
Return a hash with oid keys and callable values, used for converting types.
def get_conversion_procs
procs = PG_TYPES.dup
procs[1184] = procs[1114] = method(:to_application_timestamp)
add_named_conversion_procs(procs, PG_NAMED_TYPES)
procs
end
identifier_output_method_default()
PostgreSQL folds unquoted identifiers to lowercase, so it shouldn't
need to upcase identifiers on output.
def identifier_output_method_default
nil
end
index_definition_sql(table_name, index)
PostgreSQL specific index SQL.
def index_definition_sql(table_name, index)
cols = index[:columns]
index_name = index[:name] || default_index_name(table_name, cols)
expr = if o = index[:opclass]
"(#{Array(cols).map{|c| "#{literal(c)} #{o}"}.join(', ')})"
else
literal(Array(cols))
end
unique = "UNIQUE " if index[:unique]
index_type = index[:type]
filter = index[:where] || index[:filter]
filter = " WHERE #{filter_expr(filter)}" if filter
case index_type
when :full_text
expr = "(to_tsvector(#{literal(index[:language] || 'simple')}::regconfig, #{literal(dataset.send(:full_text_string_join, cols))}))"
index_type = index[:index_type] || :gin
when :spatial
index_type = :gist
end
"CREATE #{unique}INDEX#{' CONCURRENTLY' if index[:concurrently]} #{quote_identifier(index_name)} ON #{quote_schema_table(table_name)} #{"USING #{index_type} " if index_type}#{expr}#{filter}"
end
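The expression-building branch can be illustrated on its own. In this sketch plain column names stand in for the `literal` calls used above, so it only shows the shape of the output: with an :opclass option each column gets the operator class appended, otherwise the columns become a plain parenthesized list.

```ruby
# Sketch of index_definition_sql's expression step, with plain column
# names standing in for literalized identifiers.
def index_expr_sketch(cols, opclass = nil)
  if opclass
    # Each column gets the operator class appended.
    "(#{Array(cols).map { |c| "#{c} #{opclass}" }.join(', ')})"
  else
    # No opclass: a plain parenthesized column list.
    "(#{Array(cols).join(', ')})"
  end
end
```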
initialize_postgres_adapter()
Set up data structures shared by all postgres adapters.
def initialize_postgres_adapter
@primary_keys = {}
@primary_key_sequences = {}
@conversion_procs = PG_TYPES.dup
reset_conversion_procs
end
pg_class_relname(type, opts) { |ds| ... }
Backbone of the tables and views support.
def pg_class_relname(type, opts)
ds = metadata_dataset.from(:pg_class).filter(:relkind=>type).select(:relname).server(opts[:server]).join(:pg_namespace, :oid=>:relnamespace)
ds = filter_schema(ds, opts)
m = output_identifier_meth
if block_given?
yield(ds)
elsif opts[:qualify]
ds.select_append(:pg_namespace__nspname).map{|r| Sequel.qualify(m.call(r[:nspname]), m.call(r[:relname]))}
else
ds.map{|r| m.call(r[:relname])}
end
end
prepared_arg_placeholder()
Use a dollar sign instead of question mark for the argument placeholder.
def prepared_arg_placeholder
PREPARED_ARG_PLACEHOLDER
end
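Unlike adapters that use ? placeholders, PostgreSQL prepared statements take numbered $N arguments. A small sketch (the `placeholders` helper is hypothetical, for illustration only):

```ruby
# PostgreSQL numbers its prepared-statement arguments: $1, $2, ...
def placeholders(n)
  (1..n).map { |i| "$#{i}" }.join(', ')
end
```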
regclass_oid(expr, opts=OPTS)
Return an expression yielding the oid for the table expr. Used by the metadata
parsing code to disambiguate unqualified tables.
def regclass_oid(expr, opts=OPTS)
if expr.is_a?(String) && !expr.is_a?(LiteralString)
expr = Sequel.identifier(expr)
end
sch, table = schema_and_table(expr)
sch ||= opts[:schema]
if sch
expr = Sequel.qualify(sch, table)
end
expr = if ds = opts[:dataset]
ds.literal(expr)
else
literal(expr)
end
Sequel.cast(expr.to_s,:regclass).cast(:oid)
end
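The final cast chain renders as a double CAST in SQL, letting PostgreSQL itself resolve the table reference against the current search_path. This sketch only reproduces the shape of the generated fragment; the real literalization is done by Sequel's expression objects.

```ruby
# Sketch of the SQL produced by Sequel.cast(expr.to_s, :regclass).cast(:oid):
# the literalized table reference is cast to regclass, then to oid.
def regclass_oid_sketch(literal_table)
  "CAST(CAST('#{literal_table}' AS regclass) AS oid)"
end
```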
remove_cached_schema(table)
Remove the cached entries for primary keys and sequences when a table is
changed.
Calls superclass method
def remove_cached_schema(table)
tab = quote_schema_table(table)
Sequel.synchronize do
@primary_keys.delete(tab)
@primary_key_sequences.delete(tab)
end
super
end
rename_table_sql(name, new_name)
SQL DDL statement for renaming a table.
PostgreSQL doesn't allow you to change a table's schema in a rename
table operation, so specifying a new schema in new_name will not have an
effect.
def rename_table_sql(name, new_name)
"ALTER TABLE #{quote_schema_table(name)} RENAME TO #{quote_identifier(schema_and_table(new_name).last)}"
end
schema_autoincrementing_primary_key?(schema)
PostgreSQL's autoincrementing primary keys are integer or bigint columns
that use a nextval function call as their default.
Calls superclass method
def schema_autoincrementing_primary_key?(schema)
super && schema[:default] =~ /\Anextval/o
end
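The regex check distinguishes serial-style columns from ordinary defaults: a serial column's default is a call to nextval on its backing sequence, while a constant default does not match. A small sketch of just that check (`serial_default?` is a hypothetical helper name):

```ruby
# True only when the column default is a nextval(...) sequence call,
# as generated for serial/bigserial columns.
def serial_default?(default)
  !!(default =~ /\Anextval/)
end
```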
schema_column_type(db_type)
Recognize PostgreSQL interval type.
Calls superclass method
def schema_column_type(db_type)
case db_type
when /\Ainterval\z/o
:interval
when /\Acitext\z/o
:string
else
super
end
end
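The mapping can be sketched standalone. Here :other stands in for whatever the superclass's generic type recognition would return for unmatched db_type strings:

```ruby
# Sketch of schema_column_type's PostgreSQL-specific cases; anything
# else falls through to the generic logic (:other here).
def schema_column_type_sketch(db_type)
  case db_type
  when /\Ainterval\z/ then :interval
  when /\Acitext\z/   then :string
  else :other
  end
end
```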
schema_parse_table(table_name, opts)
The dataset used for parsing table schemas, using the pg_* system catalogs.
def schema_parse_table(table_name, opts)
m = output_identifier_meth(opts[:dataset])
ds = metadata_dataset.select(:pg_attribute__attname___name,
SQL::Cast.new(:pg_attribute__atttypid, :integer).as(:oid),
SQL::Cast.new(:basetype__oid, :integer).as(:base_oid),
SQL::Function.new(:format_type, :basetype__oid, :pg_type__typtypmod).as(:db_base_type),
SQL::Function.new(:format_type, :pg_type__oid, :pg_attribute__atttypmod).as(:db_type),
SQL::Function.new(:pg_get_expr, :pg_attrdef__adbin, :pg_class__oid).as(:default),
SQL::BooleanExpression.new(:NOT, :pg_attribute__attnotnull).as(:allow_null),
SQL::Function.new(:COALESCE, SQL::BooleanExpression.from_value_pairs(:pg_attribute__attnum => SQL::Function.new(:ANY, :pg_index__indkey)), false).as(:primary_key)).
from(:pg_class).
join(:pg_attribute, :attrelid=>:oid).
join(:pg_type, :oid=>:atttypid).
left_outer_join(:pg_type___basetype, :oid=>:typbasetype).
left_outer_join(:pg_attrdef, :adrelid=>:pg_class__oid, :adnum=>:pg_attribute__attnum).
left_outer_join(:pg_index, :indrelid=>:pg_class__oid, :indisprimary=>true).
filter(:pg_attribute__attisdropped=>false).
filter{|o| o.pg_attribute__attnum > 0}.
filter(:pg_class__oid=>regclass_oid(table_name, opts)).
order(:pg_attribute__attnum)
ds.map do |row|
row[:default] = nil if blank_object?(row[:default])
if row[:base_oid]
row[:domain_oid] = row[:oid]
row[:oid] = row.delete(:base_oid)
row[:db_domain_type] = row[:db_type]
row[:db_type] = row.delete(:db_base_type)
else
row.delete(:base_oid)
row.delete(:db_base_type)
end
row[:type] = schema_column_type(row[:db_type])
[m.call(row.delete(:name)), row]
end
end
set_transaction_isolation(conn, opts)
Set the transaction isolation level (and any READ ONLY/READ WRITE and
DEFERRABLE status) on the given connection.
def set_transaction_isolation(conn, opts)
level = opts.fetch(:isolation, transaction_isolation_level)
read_only = opts[:read_only]
deferrable = opts[:deferrable]
if level || !read_only.nil? || !deferrable.nil?
sql = "SET TRANSACTION"
sql << " ISOLATION LEVEL #{Sequel::Database::TRANSACTION_ISOLATION_LEVELS[level]}" if level
sql << " READ #{read_only ? 'ONLY' : 'WRITE'}" unless read_only.nil?
sql << " #{'NOT ' unless deferrable}DEFERRABLE" unless deferrable.nil?
log_connection_execute(conn, sql)
end
end
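The SQL assembly above can be sketched without a connection. The LEVELS hash below is a hypothetical stand-in for Sequel::Database::TRANSACTION_ISOLATION_LEVELS, and the sketch returns the SQL string instead of executing it:

```ruby
# Hypothetical stand-in for Sequel::Database::TRANSACTION_ISOLATION_LEVELS.
LEVELS = {uncommitted: 'READ UNCOMMITTED', committed: 'READ COMMITTED',
          repeatable: 'REPEATABLE READ', serializable: 'SERIALIZABLE'}

# Sketch of set_transaction_isolation's string building: each option
# appends its clause; nil options contribute nothing, and with no
# options at all there is no statement to run.
def transaction_sql(level: nil, read_only: nil, deferrable: nil)
  return nil unless level || !read_only.nil? || !deferrable.nil?
  sql = "SET TRANSACTION"
  sql << " ISOLATION LEVEL #{LEVELS[level]}" if level
  sql << " READ #{read_only ? 'ONLY' : 'WRITE'}" unless read_only.nil?
  sql << " #{'NOT ' unless deferrable}DEFERRABLE" unless deferrable.nil?
  sql
end
```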
sql_function_args(args)
Turns an array of argument specifiers into an SQL
fragment used for function arguments. See create_function_sql.
def sql_function_args(args)
"(#{Array(args).map{|a| Array(a).reverse.join(' ')}.join(', ')})"
end
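The reverse/join step turns the [type, name, mode] specifiers described under create_function into PostgreSQL's "mode name type" argument syntax; a bare symbol or string is passed through as just the type. A self-contained copy of the same logic:

```ruby
# Each specifier array is reversed, so [:integer, :a, :in]
# becomes "in a integer"; a bare type like :text stays "text".
def sql_function_args_sketch(args)
  "(#{Array(args).map { |a| Array(a).reverse.join(' ') }.join(', ')})"
end
```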
supports_combining_alter_table_ops?()
PostgreSQL can combine multiple alter table ops into a single query.
def supports_combining_alter_table_ops?
true
end
supports_create_or_replace_view?()
PostgreSQL supports CREATE OR REPLACE VIEW.
def supports_create_or_replace_view?
true
end
type_literal_generic_bignum(column)
Handle bigserial type if :serial option is present
Calls superclass method
def type_literal_generic_bignum(column)
column[:serial] ? :bigserial : super
end
type_literal_generic_file(column)
PostgreSQL uses the bytea data type for blobs
def type_literal_generic_file(column)
:bytea
end
type_literal_generic_integer(column)
Handle serial type if :serial option is present
Calls superclass method
def type_literal_generic_integer(column)
column[:serial] ? :serial : super
end
type_literal_generic_string(column)
PostgreSQL prefers the text datatype. If a fixed size is requested, the
char type is used. If the text type is specifically disallowed or there is
a size specified, use the varchar type. Otherwise use the text type.
def type_literal_generic_string(column)
if column[:fixed]
"char(#{column[:size]||255})"
elsif column[:text] == false or column[:size]
"varchar(#{column[:size]||255})"
else
:text
end
end
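The three branches can be exercised directly; this sketch copies the same selection logic with a plain options hash:

```ruby
# Sketch of the string-type selection: :fixed => char, an explicit
# :size or :text => false => varchar, otherwise plain text.
def string_type_sketch(column)
  if column[:fixed]
    "char(#{column[:size] || 255})"
  elsif column[:text] == false || column[:size]
    "varchar(#{column[:size] || 255})"
  else
    :text
  end
end
```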
view_with_check_option_support()
PostgreSQL 9.4+ supports views with check option.
def view_with_check_option_support
:local if server_version >= 90400
end