authorSławek Ehlert <slafs@op.pl>2015-01-27 22:04:38 +0100
committerSławek Ehlert <slafs@op.pl>2015-01-27 22:04:38 +0100
commit57b2bd5dcba6140b511c898c0f682234f13d5c51 (patch)
treea0899b2a35d27e177001b163054c3c9a8f7f1c06 /doc/build/core
parent6a1f16d09958e549502a0991890d64964c71b357 (diff)
parent8aaa8dd6bdfb85fa481efa3115b9080d935d344c (diff)
downloadsqlalchemy-pr/152.tar.gz
Merge branch 'master' into oracle-servicename-optionpr/152
Diffstat (limited to 'doc/build/core')
-rw-r--r--  doc/build/core/api_basics.rst           |  12
-rw-r--r--  doc/build/core/compiler.rst             |   2
-rw-r--r--  doc/build/core/connections.rst          |  37
-rw-r--r--  doc/build/core/constraints.rst          | 189
-rw-r--r--  doc/build/core/custom_types.rst         | 500
-rw-r--r--  doc/build/core/ddl.rst                  |  30
-rw-r--r--  doc/build/core/engines_connections.rst  |  11
-rw-r--r--  doc/build/core/exceptions.rst           |   2
-rw-r--r--  doc/build/core/expression_api.rst       |   4
-rw-r--r--  doc/build/core/functions.rst            |   3
-rw-r--r--  doc/build/core/index.rst                |  16
-rw-r--r--  doc/build/core/internals.rst            |   7
-rw-r--r--  doc/build/core/metadata.rst             |   1
-rw-r--r--  doc/build/core/schema.rst               |   4
-rw-r--r--  doc/build/core/selectable.rst           |   3
-rw-r--r--  doc/build/core/sqla_engine_arch.png     | bin 28189 -> 28190 bytes
-rw-r--r--  doc/build/core/tutorial.rst             |   4
-rw-r--r--  doc/build/core/type_api.rst             |  22
-rw-r--r--  doc/build/core/type_basics.rst          | 229
-rw-r--r--  doc/build/core/types.rst                | 745
20 files changed, 1011 insertions, 810 deletions
diff --git a/doc/build/core/api_basics.rst b/doc/build/core/api_basics.rst
new file mode 100644
index 000000000..e56a1117b
--- /dev/null
+++ b/doc/build/core/api_basics.rst
@@ -0,0 +1,12 @@
+=================
+Core API Basics
+=================
+
+.. toctree::
+    :maxdepth: 2
+
+    event
+    inspection
+    interfaces
+    exceptions
+    internals
diff --git a/doc/build/core/compiler.rst b/doc/build/core/compiler.rst
index 73c9e3995..202ef2b0e 100644
--- a/doc/build/core/compiler.rst
+++ b/doc/build/core/compiler.rst
@@ -4,4 +4,4 @@ Custom SQL Constructs and Compilation Extension
===============================================
.. automodule:: sqlalchemy.ext.compiler
- :members: \ No newline at end of file
+ :members:
diff --git a/doc/build/core/connections.rst b/doc/build/core/connections.rst
index 248309a2e..6d7e7622f 100644
--- a/doc/build/core/connections.rst
+++ b/doc/build/core/connections.rst
@@ -453,13 +453,36 @@ Working with Raw DBAPI Connections
There are some cases where SQLAlchemy does not provide a genericized way
of accessing some :term:`DBAPI` functions, such as calling stored procedures as well
as dealing with multiple result sets. In these cases, it's just as expedient
-to deal with the raw DBAPI connection directly. This is accessible from
-a :class:`.Engine` using the :meth:`.Engine.raw_connection` method::
+to deal with the raw DBAPI connection directly.
+
+The most common way to access the raw DBAPI connection is to get it
+from an already present :class:`.Connection` object directly. It is
+present using the :attr:`.Connection.connection` attribute::
+
+    connection = engine.connect()
+    dbapi_conn = connection.connection
+
+The DBAPI connection here is actually "proxied" in terms of the
+originating connection pool; however, this is an implementation detail
+that in most cases can be ignored. As this DBAPI connection is still
+contained within the scope of an owning :class:`.Connection` object, it is
+best to make use of the :class:`.Connection` object for most features such
+as transaction control as well as calling the :meth:`.Connection.close`
+method; if these operations are performed on the DBAPI connection directly,
+the owning :class:`.Connection` will not be aware of these changes in state.
+
+To overcome the limitations imposed by the DBAPI connection that is
+maintained by an owning :class:`.Connection`, a DBAPI connection is also
+available without the need to procure a
+:class:`.Connection` first, using the :meth:`.Engine.raw_connection` method
+of :class:`.Engine`::
dbapi_conn = engine.raw_connection()
-The instance returned is a "wrapped" form of DBAPI connection. When its
-``.close()`` method is called, the connection is :term:`released` back to the
+This DBAPI connection is again a "proxied" form as was the case before.
+The purpose of this proxying is now apparent, as when we call the ``.close()``
+method of this connection, the DBAPI connection is typically not actually
+closed, but instead :term:`released` back to the
engine's connection pool::
dbapi_conn.close()
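The raw-connection workflow above can be sketched end to end. This is an illustrative sketch only; the in-memory SQLite URL and the query are our own assumptions, not part of the original text:

```python
from sqlalchemy import create_engine

# an in-memory SQLite engine, purely for illustration
engine = create_engine("sqlite://")

# Engine.raw_connection() returns the proxied DBAPI connection
dbapi_conn = engine.raw_connection()
try:
    # cursor-level work happens directly against the DBAPI
    cursor = dbapi_conn.cursor()
    cursor.execute("SELECT 1")
    result = cursor.fetchone()[0]
    cursor.close()
finally:
    # "close" releases the connection back to the pool; the
    # underlying DBAPI connection is typically not actually closed
    dbapi_conn.close()

print(result)
```

Because the connection is released rather than discarded, a subsequent `raw_connection()` call will typically re-use the same pooled DBAPI connection.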
@@ -568,16 +591,16 @@ Connection / Engine API
.. autoclass:: Engine
:members:
-.. autoclass:: sqlalchemy.engine.ExceptionContext
+.. autoclass:: ExceptionContext
:members:
.. autoclass:: NestedTransaction
:members:
-.. autoclass:: sqlalchemy.engine.ResultProxy
+.. autoclass:: ResultProxy
:members:
-.. autoclass:: sqlalchemy.engine.RowProxy
+.. autoclass:: RowProxy
:members:
.. autoclass:: Transaction
diff --git a/doc/build/core/constraints.rst b/doc/build/core/constraints.rst
index 554d003bb..1f855c724 100644
--- a/doc/build/core/constraints.rst
+++ b/doc/build/core/constraints.rst
@@ -7,11 +7,11 @@
Defining Constraints and Indexes
=================================
-.. _metadata_foreignkeys:
-
This section will discuss SQL :term:`constraints` and indexes. In SQLAlchemy
the key classes include :class:`.ForeignKeyConstraint` and :class:`.Index`.
+.. _metadata_foreignkeys:
+
Defining Foreign Keys
---------------------
@@ -95,40 +95,175 @@ foreign key referencing two columns.
Creating/Dropping Foreign Key Constraints via ALTER
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-In all the above examples, the :class:`~sqlalchemy.schema.ForeignKey` object
-causes the "REFERENCES" keyword to be added inline to a column definition
-within a "CREATE TABLE" statement when
-:func:`~sqlalchemy.schema.MetaData.create_all` is issued, and
-:class:`~sqlalchemy.schema.ForeignKeyConstraint` invokes the "CONSTRAINT"
-keyword inline with "CREATE TABLE". There are some cases where this is
-undesirable, particularly when two tables reference each other mutually, each
-with a foreign key referencing the other. In such a situation at least one of
-the foreign key constraints must be generated after both tables have been
-built. To support such a scheme, :class:`~sqlalchemy.schema.ForeignKey` and
-:class:`~sqlalchemy.schema.ForeignKeyConstraint` offer the flag
-``use_alter=True``. When using this flag, the constraint will be generated
-using a definition similar to "ALTER TABLE <tablename> ADD CONSTRAINT <name>
-...". Since a name is required, the ``name`` attribute must also be specified.
-For example::
-
- node = Table('node', meta,
+The behavior we've seen in tutorials and elsewhere involving
+foreign keys with DDL illustrates that the constraints are typically
+rendered "inline" within the CREATE TABLE statement, such as:
+
+.. sourcecode:: sql
+
+    CREATE TABLE addresses (
+        id INTEGER NOT NULL,
+        user_id INTEGER,
+        email_address VARCHAR NOT NULL,
+        PRIMARY KEY (id),
+        CONSTRAINT user_id_fk FOREIGN KEY(user_id) REFERENCES users (id)
+    )
+
+The ``CONSTRAINT .. FOREIGN KEY`` directive is used to create the constraint
+in an "inline" fashion within the CREATE TABLE definition. The
+:meth:`.MetaData.create_all` and :meth:`.MetaData.drop_all` methods do
+this by default, using a topological sort of all the :class:`.Table` objects
+involved such that tables are created and dropped in order of their foreign
+key dependency (this sort is also available via the
+:attr:`.MetaData.sorted_tables` accessor).
+
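The dependency ordering described above can be seen directly; a brief sketch using hypothetical ``parent``/``child`` tables of our own (not part of the surrounding example):

```python
from sqlalchemy import MetaData, Table, Column, Integer, ForeignKey

metadata = MetaData()

# 'child' is declared first, but depends on 'parent' via its FK
child = Table('child', metadata,
              Column('id', Integer, primary_key=True),
              Column('parent_id', Integer, ForeignKey('parent.id')))
parent = Table('parent', metadata,
               Column('id', Integer, primary_key=True))

# sorted_tables yields tables in foreign key dependency order,
# parents before dependents, regardless of declaration order
order = [t.name for t in metadata.sorted_tables]
print(order)
```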
+This approach can't work when two or more foreign key constraints are
+involved in a "dependency cycle", where a set of tables
+are mutually dependent on each other, assuming the backend enforces foreign
+keys (always the case except on SQLite, MySQL/MyISAM). The methods will
+therefore break out constraints in such a cycle into separate ALTER
+statements, on all backends other than SQLite which does not support
+most forms of ALTER. Given a schema like::
+
+ node = Table(
+ 'node', metadata,
Column('node_id', Integer, primary_key=True),
- Column('primary_element', Integer,
- ForeignKey('element.element_id', use_alter=True, name='fk_node_element_id')
+ Column(
+ 'primary_element', Integer,
+ ForeignKey('element.element_id')
)
)
- element = Table('element', meta,
+ element = Table(
+ 'element', metadata,
Column('element_id', Integer, primary_key=True),
Column('parent_node_id', Integer),
ForeignKeyConstraint(
- ['parent_node_id'],
- ['node.node_id'],
- use_alter=True,
+ ['parent_node_id'], ['node.node_id'],
name='fk_element_parent_node_id'
)
)
+When we call upon :meth:`.MetaData.create_all` on a backend such as
+Postgresql, the cycle between these two tables is resolved and the
+constraints are created separately:
+
+.. sourcecode:: pycon+sql
+
+    >>> with engine.connect() as conn:
+    ...     metadata.create_all(conn, checkfirst=False)
+    {opensql}CREATE TABLE element (
+        element_id SERIAL NOT NULL,
+        parent_node_id INTEGER,
+        PRIMARY KEY (element_id)
+    )
+
+    CREATE TABLE node (
+        node_id SERIAL NOT NULL,
+        primary_element INTEGER,
+        PRIMARY KEY (node_id)
+    )
+
+    ALTER TABLE element ADD CONSTRAINT fk_element_parent_node_id
+        FOREIGN KEY(parent_node_id) REFERENCES node (node_id)
+    ALTER TABLE node ADD FOREIGN KEY(primary_element)
+        REFERENCES element (element_id)
+    {stop}
+
+In order to emit DROP for these tables, the same logic applies; note,
+however, that in SQL, emitting DROP CONSTRAINT requires that the
+constraint have a name.  In the case of the ``'node'`` table above, we haven't
+named this constraint; the system will therefore attempt to emit DROP for only
+those constraints that are named:
+
+.. sourcecode:: pycon+sql
+
+    >>> with engine.connect() as conn:
+    ...     metadata.drop_all(conn, checkfirst=False)
+    {opensql}ALTER TABLE element DROP CONSTRAINT fk_element_parent_node_id
+    DROP TABLE node
+    DROP TABLE element
+    {stop}
+
+
+In the case where the cycle cannot be resolved, such as if we hadn't applied
+a name to either constraint here, we will receive the following error::
+
+    sqlalchemy.exc.CircularDependencyError: Can't sort tables for DROP;
+    an unresolvable foreign key dependency exists between tables:
+    element, node.  Please ensure that the ForeignKey and ForeignKeyConstraint
+    objects involved in the cycle have names so that they can be dropped
+    using DROP CONSTRAINT.
+
+This error only applies to the DROP case as we can emit "ADD CONSTRAINT"
+in the CREATE case without a name; the database typically assigns one
+automatically.
+
+The :paramref:`.ForeignKeyConstraint.use_alter` and
+:paramref:`.ForeignKey.use_alter` keyword arguments can be used
+to manually resolve dependency cycles. We can add this flag only to
+the ``'element'`` table as follows::
+
+    element = Table(
+        'element', metadata,
+        Column('element_id', Integer, primary_key=True),
+        Column('parent_node_id', Integer),
+        ForeignKeyConstraint(
+            ['parent_node_id'], ['node.node_id'],
+            use_alter=True, name='fk_element_parent_node_id'
+        )
+    )
+
+In our CREATE DDL we will see the ALTER statement only for this constraint,
+and not the other one:
+
+.. sourcecode:: pycon+sql
+
+    >>> with engine.connect() as conn:
+    ...     metadata.create_all(conn, checkfirst=False)
+    {opensql}CREATE TABLE element (
+        element_id SERIAL NOT NULL,
+        parent_node_id INTEGER,
+        PRIMARY KEY (element_id)
+    )
+
+    CREATE TABLE node (
+        node_id SERIAL NOT NULL,
+        primary_element INTEGER,
+        PRIMARY KEY (node_id),
+        FOREIGN KEY(primary_element) REFERENCES element (element_id)
+    )
+
+    ALTER TABLE element ADD CONSTRAINT fk_element_parent_node_id
+        FOREIGN KEY(parent_node_id) REFERENCES node (node_id)
+    {stop}
+
+:paramref:`.ForeignKeyConstraint.use_alter` and
+:paramref:`.ForeignKey.use_alter`, when used in conjunction with a drop
+operation, will require that the constraint is named, else an error
+like the following is generated::
+
+    sqlalchemy.exc.CompileError: Can't emit DROP CONSTRAINT for constraint
+    ForeignKeyConstraint(...); it has no name
+
+.. versionchanged:: 1.0.0 - The DDL system invoked by
+    :meth:`.MetaData.create_all`
+    and :meth:`.MetaData.drop_all` will now automatically resolve mutually
+    dependent foreign keys between tables declared by
+    :class:`.ForeignKeyConstraint` and :class:`.ForeignKey` objects, without
+    the need to explicitly set the :paramref:`.ForeignKeyConstraint.use_alter`
+    flag.
+
+.. versionchanged:: 1.0.0 - The :paramref:`.ForeignKeyConstraint.use_alter`
+    flag can be used with an un-named constraint; only the DROP operation
+    will emit a specific error when actually called upon.
+
+.. seealso::
+
+    :ref:`constraint_naming_conventions`
+
+    :func:`.sort_tables_and_constraints`
+
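The :func:`.sort_tables_and_constraints` function referenced above can also be used directly. A sketch with the mutually-dependent tables from this section; we assume here (based on the documented behavior) that constraints broken out of the cycle are returned in a final ``(None, [...])`` entry:

```python
from sqlalchemy import (MetaData, Table, Column, Integer,
                        ForeignKeyConstraint)
from sqlalchemy.schema import sort_tables_and_constraints

metadata = MetaData()
node = Table('node', metadata,
             Column('node_id', Integer, primary_key=True),
             Column('primary_element', Integer),
             ForeignKeyConstraint(['primary_element'],
                                  ['element.element_id']))
element = Table('element', metadata,
                Column('element_id', Integer, primary_key=True),
                Column('parent_node_id', Integer),
                ForeignKeyConstraint(['parent_node_id'], ['node.node_id'],
                                     name='fk_element_parent_node_id'))

result = sort_tables_and_constraints([node, element])

# the final entry pairs None with the constraints that could not be
# rendered inline and must be emitted via ALTER
last_table, remaining_fkcs = result[-1]
print(last_table, len(remaining_fkcs))
```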
.. _on_update_on_delete:
ON UPDATE and ON DELETE
@@ -439,14 +574,10 @@ Constraints API
:members:
:inherited-members:
-.. autoclass:: ColumnCollectionConstraint
- :members:
-
.. autoclass:: ForeignKey
:members:
:inherited-members:
-
.. autoclass:: ForeignKeyConstraint
:members:
:inherited-members:
diff --git a/doc/build/core/custom_types.rst b/doc/build/core/custom_types.rst
new file mode 100644
index 000000000..8d0c42703
--- /dev/null
+++ b/doc/build/core/custom_types.rst
@@ -0,0 +1,500 @@
+.. module:: sqlalchemy.types
+
+.. _types_custom:
+
+Custom Types
+------------
+
+A variety of methods exist to redefine the behavior of existing types
+as well as to provide new ones.
+
+Overriding Type Compilation
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+A frequent need is to force the "string" version of a type, that is
+the one rendered in a CREATE TABLE statement or other SQL function
+like CAST, to be changed. For example, an application may want
+to force the rendering of ``BINARY`` for all platforms
+except for one, in which it wants ``BLOB`` to be rendered.  Usage
+of an existing generic type, in this case :class:`.LargeBinary`, is
+preferred for most use cases. But to control
+types more accurately, a compilation directive that is per-dialect
+can be associated with any type::
+
+    from sqlalchemy.ext.compiler import compiles
+    from sqlalchemy.types import BINARY
+
+    @compiles(BINARY, "sqlite")
+    def compile_binary_sqlite(type_, compiler, **kw):
+        return "BLOB"
+
+The above code allows the usage of :class:`.types.BINARY`, which
+will produce the string ``BINARY`` against all backends except SQLite,
+in which case it will produce ``BLOB``.
+
+See the section :ref:`type_compilation_extension`, a subsection of
+:ref:`sqlalchemy.ext.compiler_toplevel`, for additional examples.
+
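The effect of the directive can be seen by compiling the type against a specific dialect. A sketch, assuming the built-in SQLite dialect is available:

```python
from sqlalchemy.ext.compiler import compiles
from sqlalchemy.types import BINARY
from sqlalchemy.dialects import sqlite

@compiles(BINARY, "sqlite")
def compile_binary_sqlite(type_, compiler, **kw):
    return "BLOB"

# the SQLite dialect now renders BLOB for the BINARY type;
# all other dialects continue to render BINARY
ddl = BINARY().compile(dialect=sqlite.dialect())
print(ddl)
```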
+.. _types_typedecorator:
+
+Augmenting Existing Types
+~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The :class:`.TypeDecorator` allows the creation of custom types which
+add bind-parameter and result-processing behavior to an existing
+type object. It is used when additional in-Python marshaling of data
+to and from the database is required.
+
+.. note::
+
+    The bind- and result-processing of :class:`.TypeDecorator`
+    is *in addition* to the processing already performed by the hosted
+    type, which is customized by SQLAlchemy on a per-DBAPI basis to perform
+    processing specific to that DBAPI.  To change the DBAPI-level processing
+    for an existing type, see the section :ref:`replacing_processors`.
+
+.. autoclass:: TypeDecorator
+    :members:
+    :inherited-members:
+
+
+TypeDecorator Recipes
+~~~~~~~~~~~~~~~~~~~~~
+
+A few key :class:`.TypeDecorator` recipes follow.
+
+.. _coerce_to_unicode:
+
+Coercing Encoded Strings to Unicode
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+A common source of confusion regarding the :class:`.Unicode` type
+is that it is intended to deal *only* with Python ``unicode`` objects
+on the Python side, meaning values passed to it as bind parameters
+must be of the form ``u'some string'`` under Python 2 (under Python 3,
+all strings are inherently unicode).
+The encoding/decoding functions it performs are only to suit what the
+DBAPI in use requires, and are primarily a private implementation detail.
+
+A type that can safely receive Python bytestrings, that is, strings
+which contain non-ASCII characters and are not ``u''``
+objects in Python 2, can be built using a :class:`.TypeDecorator`
+which coerces as needed::
+
+    from sqlalchemy.types import TypeDecorator, Unicode
+
+    class CoerceUTF8(TypeDecorator):
+        """Safely coerce Python bytestrings to Unicode
+        before passing off to the database."""
+
+        impl = Unicode
+
+        def process_bind_param(self, value, dialect):
+            if isinstance(value, str):
+                value = value.decode('utf-8')
+            return value
+
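The recipe above is written for Python 2, where ``str`` is a bytestring. A Python 3-compatible sketch of the same idea, checking for ``bytes`` instead, is our own adaptation:

```python
from sqlalchemy.types import TypeDecorator, Unicode

class CoerceUTF8(TypeDecorator):
    """Coerce incoming bytestrings to unicode text (Python 3 variant)."""

    impl = Unicode

    def process_bind_param(self, value, dialect):
        # under Python 3, raw bytestrings arrive as ``bytes``
        if isinstance(value, bytes):
            value = value.decode('utf-8')
        return value

# direct demonstration of the coercion step; the dialect argument
# is unused by this recipe so None suffices here
coerced = CoerceUTF8().process_bind_param(b'caf\xc3\xa9', None)
print(coerced)
```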
+Rounding Numerics
+^^^^^^^^^^^^^^^^^
+
+Some database connectors, such as those for SQL Server, choke if a Decimal is passed with too
+many decimal places.  Here's a recipe that rounds them down::
+
+    from sqlalchemy.types import TypeDecorator, Numeric
+    from decimal import Decimal
+
+    class SafeNumeric(TypeDecorator):
+        """Adds quantization to Numeric."""
+
+        impl = Numeric
+
+        def __init__(self, *arg, **kw):
+            TypeDecorator.__init__(self, *arg, **kw)
+            self.quantize_int = -(self.impl.precision - self.impl.scale)
+            self.quantize = Decimal(10) ** self.quantize_int
+
+        def process_bind_param(self, value, dialect):
+            if isinstance(value, Decimal) and \
+                    value.as_tuple()[2] < self.quantize_int:
+                value = value.quantize(self.quantize)
+            return value
+
+.. _custom_guid_type:
+
+Backend-agnostic GUID Type
+^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Receives and returns Python uuid() objects. Uses the PG UUID type
+when using Postgresql, CHAR(32) on other backends, storing them
+in stringified hex format. Can be modified to store
+binary in CHAR(16) if desired::
+
+    from sqlalchemy.types import TypeDecorator, CHAR
+    from sqlalchemy.dialects.postgresql import UUID
+    import uuid
+
+    class GUID(TypeDecorator):
+        """Platform-independent GUID type.
+
+        Uses Postgresql's UUID type, otherwise uses
+        CHAR(32), storing as stringified hex values.
+
+        """
+        impl = CHAR
+
+        def load_dialect_impl(self, dialect):
+            if dialect.name == 'postgresql':
+                return dialect.type_descriptor(UUID())
+            else:
+                return dialect.type_descriptor(CHAR(32))
+
+        def process_bind_param(self, value, dialect):
+            if value is None:
+                return value
+            elif dialect.name == 'postgresql':
+                return str(value)
+            else:
+                if not isinstance(value, uuid.UUID):
+                    return "%.32x" % uuid.UUID(value).int
+                else:
+                    # value is already a uuid.UUID; render its
+                    # integer value as a hex string
+                    return "%.32x" % value.int
+
+        def process_result_value(self, value, dialect):
+            if value is None:
+                return value
+            else:
+                return uuid.UUID(value)
+
+Marshal JSON Strings
+^^^^^^^^^^^^^^^^^^^^^
+
+This type uses the builtin ``json`` module to marshal Python data structures
+to/from JSON.  Can be modified to use ``simplejson``, or any other encoder::
+
+    from sqlalchemy.types import TypeDecorator, VARCHAR
+    import json
+
+    class JSONEncodedDict(TypeDecorator):
+        """Represents an immutable structure as a json-encoded string.
+
+        Usage::
+
+            JSONEncodedDict(255)
+
+        """
+
+        impl = VARCHAR
+
+        def process_bind_param(self, value, dialect):
+            if value is not None:
+                value = json.dumps(value)
+
+            return value
+
+        def process_result_value(self, value, dialect):
+            if value is not None:
+                value = json.loads(value)
+            return value
+
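A condensed, self-contained round trip of this recipe against an in-memory SQLite database; the table name and values here are illustrative assumptions. The ``cache_ok`` attribute is an addition for compatibility with SQLAlchemy 1.4 and later, where its absence emits a warning; it is harmless on older versions:

```python
import json

from sqlalchemy import create_engine, MetaData, Table, Column, Integer
from sqlalchemy.types import TypeDecorator, VARCHAR

class JSONEncodedDict(TypeDecorator):
    """Store a structure as a JSON-encoded string."""

    impl = VARCHAR
    cache_ok = True  # assumption: SQLAlchemy 1.4+ caching hint

    def process_bind_param(self, value, dialect):
        return json.dumps(value) if value is not None else None

    def process_result_value(self, value, dialect):
        return json.loads(value) if value is not None else None

metadata = MetaData()
payload = Table('payload', metadata,
                Column('id', Integer, primary_key=True),
                Column('data', JSONEncodedDict(255)))

engine = create_engine("sqlite://")
with engine.begin() as conn:
    metadata.create_all(conn)
    # the dict is serialized on the way in, deserialized on the way out
    conn.execute(payload.insert(), {"id": 1, "data": {"k": [1, 2]}})
    row = conn.execute(payload.select()).fetchone()

print(row.data)
```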
+Note that the ORM by default will not detect "mutability" on such a type -
+meaning, in-place changes to values will not be detected and will not be
+flushed. Without further steps, you instead would need to replace the existing
+value with a new one on each parent object to detect changes. Note that
+there's nothing wrong with this, as many applications may not require that the
+values are ever mutated once created. For those which do have this requirement,
+support for mutability is best applied using the ``sqlalchemy.ext.mutable``
+extension - see the example in :ref:`mutable_toplevel`.
+
+.. _replacing_processors:
+
+Replacing the Bind/Result Processing of Existing Types
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Most augmentation of type behavior at the bind/result level
+is achieved using :class:`.TypeDecorator`. For the rare scenario
+where the specific processing applied by SQLAlchemy at the DBAPI
+level needs to be replaced, the SQLAlchemy type can be subclassed
+directly, and the ``bind_processor()`` or ``result_processor()``
+methods can be overridden. Doing so requires that the
+``adapt()`` method also be overridden. This method is the mechanism
+by which SQLAlchemy produces DBAPI-specific type behavior during
+statement execution. Overriding it allows a copy of the custom
+type to be used in lieu of a DBAPI-specific type. Below we subclass
+the :class:`.types.TIME` type to have custom result processing behavior.
+The ``process()`` function will receive ``value`` from the DBAPI
+cursor directly::
+
+    class MySpecialTime(TIME):
+        def __init__(self, special_argument):
+            super(MySpecialTime, self).__init__()
+            self.special_argument = special_argument
+
+        def result_processor(self, dialect, coltype):
+            import datetime
+            time = datetime.time
+
+            def process(value):
+                if value is not None:
+                    microseconds = value.microseconds
+                    seconds = value.seconds
+                    minutes = seconds // 60
+                    return time(
+                        minutes // 60,
+                        minutes % 60,
+                        seconds - minutes * 60,
+                        microseconds)
+                else:
+                    return None
+            return process
+
+        def adapt(self, impltype):
+            return MySpecialTime(self.special_argument)
+
+.. _types_sql_value_processing:
+
+Applying SQL-level Bind/Result Processing
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+As seen in the sections :ref:`types_typedecorator` and :ref:`replacing_processors`,
+SQLAlchemy allows Python functions to be invoked both when parameters are sent
+to a statement, as well as when result rows are loaded from the database, to apply
+transformations to the values as they are sent to or from the database. It is also
+possible to define SQL-level transformations as well.  This is useful when
+the relational database itself contains the series of functions that are necessary
+to coerce incoming and outgoing data between an application and persistence format.
+Examples include using database-defined encryption/decryption functions, as well
+as stored procedures that handle geographic data. The Postgis extension to Postgresql
+includes an extensive array of SQL functions that are necessary for coercing
+data into particular formats.
+
+Any :class:`.TypeEngine`, :class:`.UserDefinedType` or :class:`.TypeDecorator` subclass
+can include implementations of
+:meth:`.TypeEngine.bind_expression` and/or :meth:`.TypeEngine.column_expression`, which
+when defined to return a non-``None`` value should return a :class:`.ColumnElement`
+expression to be injected into the SQL statement, either surrounding
+bound parameters or a column expression. For example, to build a ``Geometry``
+type which will apply the Postgis function ``ST_GeomFromText`` to all outgoing
+values and the function ``ST_AsText`` to all incoming data, we can create
+our own subclass of :class:`.UserDefinedType` which provides these methods
+in conjunction with :data:`~.sqlalchemy.sql.expression.func`::
+
+    from sqlalchemy import func
+    from sqlalchemy.types import UserDefinedType
+
+    class Geometry(UserDefinedType):
+        def get_col_spec(self):
+            return "GEOMETRY"
+
+        def bind_expression(self, bindvalue):
+            return func.ST_GeomFromText(bindvalue, type_=self)
+
+        def column_expression(self, col):
+            return func.ST_AsText(col, type_=self)
+
+We can apply the ``Geometry`` type into :class:`.Table` metadata
+and use it in a :func:`.select` construct::
+
+    geometry = Table('geometry', metadata,
+        Column('geom_id', Integer, primary_key=True),
+        Column('geom_data', Geometry)
+    )
+
+    print select([geometry]).where(
+        geometry.c.geom_data == 'LINESTRING(189412 252431,189631 259122)')
+
+The resulting SQL embeds both functions as appropriate. ``ST_AsText``
+is applied to the columns clause so that the return value is run through
+the function before passing into a result set, and ``ST_GeomFromText``
+is run on the bound parameter so that the passed-in value is converted::
+
+    SELECT geometry.geom_id, ST_AsText(geometry.geom_data) AS geom_data_1
+    FROM geometry
+    WHERE geometry.geom_data = ST_GeomFromText(:geom_data_2)
+
+The :meth:`.TypeEngine.column_expression` method interacts with the
+mechanics of the compiler such that the SQL expression does not interfere
+with the labeling of the wrapped expression.  For example, if we render
+a :func:`.select` against a :func:`.label` of our expression, the string
+label is moved to the outside of the wrapped expression::
+
+    print select([geometry.c.geom_data.label('my_data')])
+
+Output::
+
+    SELECT ST_AsText(geometry.geom_data) AS my_data
+    FROM geometry
+
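The wrapping behavior can be verified without a database by compiling statements built against the type. A self-contained sketch re-stating the ``Geometry`` type from above (the WKT literal here is arbitrary):

```python
from sqlalchemy import MetaData, Table, Column, Integer, func
from sqlalchemy.types import UserDefinedType

class Geometry(UserDefinedType):
    def get_col_spec(self):
        return "GEOMETRY"

    def bind_expression(self, bindvalue):
        return func.ST_GeomFromText(bindvalue, type_=self)

    def column_expression(self, col):
        return func.ST_AsText(col, type_=self)

metadata = MetaData()
geometry = Table('geometry', metadata,
                 Column('geom_id', Integer, primary_key=True),
                 Column('geom_data', Geometry))

# compile a SELECT with a WHERE criterion against the typed column;
# column_expression wraps the columns clause, bind_expression the bind
stmt = geometry.select().where(geometry.c.geom_data == 'LINESTRING(0 0,1 1)')
sql = str(stmt)
print(sql)
```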
+For an example of subclassing a built-in type directly, we subclass
+:class:`.postgresql.BYTEA` to provide a ``PGPString``, which will make use of the
+Postgresql ``pgcrypto`` extension to encrypt/decrypt values
+transparently::
+
+    from sqlalchemy import create_engine, String, select, func, \
+        MetaData, Table, Column, type_coerce
+
+    from sqlalchemy.dialects.postgresql import BYTEA
+
+    class PGPString(BYTEA):
+        def __init__(self, passphrase, length=None):
+            super(PGPString, self).__init__(length)
+            self.passphrase = passphrase
+
+        def bind_expression(self, bindvalue):
+            # convert the bind's type from PGPString to
+            # String, so that it's passed to psycopg2 as is without
+            # a dbapi.Binary wrapper
+            bindvalue = type_coerce(bindvalue, String)
+            return func.pgp_sym_encrypt(bindvalue, self.passphrase)
+
+        def column_expression(self, col):
+            return func.pgp_sym_decrypt(col, self.passphrase)
+
+    metadata = MetaData()
+    message = Table('message', metadata,
+        Column('username', String(50)),
+        Column('message',
+            PGPString("this is my passphrase", length=1000)),
+    )
+
+    engine = create_engine("postgresql://scott:tiger@localhost/test", echo=True)
+    with engine.begin() as conn:
+        metadata.create_all(conn)
+
+        conn.execute(message.insert(), username="some user",
+                     message="this is my message")
+
+        print conn.scalar(
+            select([message.c.message]).\
+            where(message.c.username == "some user")
+        )
+
+The ``pgp_sym_encrypt`` and ``pgp_sym_decrypt`` functions are applied
+to the INSERT and SELECT statements::
+
+    INSERT INTO message (username, message)
+        VALUES (%(username)s, pgp_sym_encrypt(%(message)s, %(pgp_sym_encrypt_1)s))
+    {'username': 'some user', 'message': 'this is my message',
+     'pgp_sym_encrypt_1': 'this is my passphrase'}
+
+    SELECT pgp_sym_decrypt(message.message, %(pgp_sym_decrypt_1)s) AS message_1
+    FROM message
+    WHERE message.username = %(username_1)s
+    {'pgp_sym_decrypt_1': 'this is my passphrase', 'username_1': 'some user'}
+
+
+.. versionadded:: 0.8 Added the :meth:`.TypeEngine.bind_expression` and
+ :meth:`.TypeEngine.column_expression` methods.
+
+.. seealso::
+
+    :ref:`examples_postgis`
+
+.. _types_operators:
+
+Redefining and Creating New Operators
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+SQLAlchemy Core defines a fixed set of expression operators available to all column expressions.
+Some of these operations have the effect of overloading Python's built in operators;
+examples of such operators include
+:meth:`.ColumnOperators.__eq__` (``table.c.somecolumn == 'foo'``),
+:meth:`.ColumnOperators.__invert__` (``~table.c.flag``),
+and :meth:`.ColumnOperators.__add__` (``table.c.x + table.c.y``). Other operators are exposed as
+explicit methods on column expressions, such as
+:meth:`.ColumnOperators.in_` (``table.c.value.in_(['x', 'y'])``) and :meth:`.ColumnOperators.like`
+(``table.c.value.like('%ed%')``).
+
+The Core expression constructs in all cases consult the type of the expression in order to determine
+the behavior of existing operators, as well as to locate additional operators that aren't part of
+the built in set. The :class:`.TypeEngine` base class defines a root "comparison" implementation
+:class:`.TypeEngine.Comparator`, and many specific types provide their own sub-implementations of this
+class. User-defined :class:`.TypeEngine.Comparator` implementations can be built directly into a
+simple subclass of a particular type in order to override or define new operations. Below,
+we create a :class:`.Integer` subclass which overrides the :meth:`.ColumnOperators.__add__` operator::
+
+    from sqlalchemy import Integer
+
+    class MyInt(Integer):
+        class comparator_factory(Integer.Comparator):
+            def __add__(self, other):
+                return self.op("goofy")(other)
+
+The above configuration creates a new class ``MyInt``, which
+establishes the :attr:`.TypeEngine.comparator_factory` attribute as
+referring to a new class, subclassing the :class:`.TypeEngine.Comparator` class
+associated with the :class:`.Integer` type.
+
+Usage::
+
+    >>> sometable = Table("sometable", metadata, Column("data", MyInt))
+    >>> print sometable.c.data + 5
+    sometable.data goofy :data_1
+
+The implementation for :meth:`.ColumnOperators.__add__` is consulted
+by an owning SQL expression, by instantiating the :class:`.TypeEngine.Comparator` with
+itself as the ``expr`` attribute. The mechanics of the expression
+system are such that operations continue recursively until an
+expression object produces a new SQL expression construct. Above, we
+could just as well have said ``self.expr.op("goofy")(other)`` instead
+of ``self.op("goofy")(other)``.
+
+New methods added to a :class:`.TypeEngine.Comparator` are exposed on an
+owning SQL expression
+using a ``__getattr__`` scheme, which exposes methods added to
+:class:`.TypeEngine.Comparator` onto the owning :class:`.ColumnElement`.
+For example, to add a ``log()`` function
+to integers::
+
+    from sqlalchemy import Integer, func
+
+    class MyInt(Integer):
+        class comparator_factory(Integer.Comparator):
+            def log(self, other):
+                return func.log(self.expr, other)
+
+Using the above type::
+
+    >>> print sometable.c.data.log(5)
+    log(:log_1, :log_2)
+
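The ``log()`` comparator can be exercised end to end without a database by compiling the expression. A self-contained sketch re-stating the class; the rendered SQL uses the default dialect:

```python
from sqlalchemy import Integer, func
from sqlalchemy.sql import column

class MyInt(Integer):
    class comparator_factory(Integer.Comparator):
        def log(self, other):
            return func.log(self.expr, other)

# the .log() method is reachable from any column of type MyInt
expr = column('data', MyInt()).log(5)
sql = str(expr)
print(sql)
```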
+
+Unary operations
+are also possible. For example, to add an implementation of the
+Postgresql factorial operator, we combine the :class:`.UnaryExpression` construct
+along with a :class:`.custom_op` to produce the factorial expression::
+
+    from sqlalchemy import Integer
+    from sqlalchemy.sql.expression import UnaryExpression
+    from sqlalchemy.sql import operators
+
+    class MyInteger(Integer):
+        class comparator_factory(Integer.Comparator):
+            def factorial(self):
+                return UnaryExpression(self.expr,
+                                       modifier=operators.custom_op("!"),
+                                       type_=MyInteger)
+
+Using the above type::
+
+    >>> from sqlalchemy.sql import column
+    >>> print column('x', MyInteger).factorial()
+    x !
+
+.. seealso::
+
+    :attr:`.TypeEngine.comparator_factory`
+
+.. versionadded:: 0.8 The expression system was enhanced to support
+ customization of operators on a per-type level.
+
+
+Creating New Types
+~~~~~~~~~~~~~~~~~~
+
+The :class:`.UserDefinedType` class is provided as a simple base class
+for defining entirely new database types. Use this to represent native
+database types not known by SQLAlchemy. If only Python translation behavior
+is needed, use :class:`.TypeDecorator` instead.
+
+.. autoclass:: UserDefinedType
+    :members:
+
+
diff --git a/doc/build/core/ddl.rst b/doc/build/core/ddl.rst
index cee6f876e..0ba2f2806 100644
--- a/doc/build/core/ddl.rst
+++ b/doc/build/core/ddl.rst
@@ -220,68 +220,72 @@ details.
DDL Expression Constructs API
-----------------------------
+.. autofunction:: sort_tables
+
+.. autofunction:: sort_tables_and_constraints
+
.. autoclass:: DDLElement
:members:
:undoc-members:
-
+
.. autoclass:: DDL
:members:
:undoc-members:
-
+
.. autoclass:: CreateTable
:members:
:undoc-members:
-
+
.. autoclass:: DropTable
:members:
:undoc-members:
-
+
.. autoclass:: CreateColumn
:members:
:undoc-members:
-
+
.. autoclass:: CreateSequence
:members:
:undoc-members:
-
+
.. autoclass:: DropSequence
:members:
:undoc-members:
-
+
.. autoclass:: CreateIndex
:members:
:undoc-members:
-
+
.. autoclass:: DropIndex
:members:
:undoc-members:
-
+
.. autoclass:: AddConstraint
:members:
:undoc-members:
-
+
.. autoclass:: DropConstraint
:members:
:undoc-members:
-
+
.. autoclass:: CreateSchema
:members:
:undoc-members:
-
+
.. autoclass:: DropSchema
:members:
:undoc-members:
-
+
diff --git a/doc/build/core/engines_connections.rst b/doc/build/core/engines_connections.rst
new file mode 100644
index 000000000..f163a7629
--- /dev/null
+++ b/doc/build/core/engines_connections.rst
@@ -0,0 +1,11 @@
+=========================
+Engine and Connection Use
+=========================
+
+.. toctree::
+ :maxdepth: 2
+
+ engines
+ connections
+ pooling
+ events
diff --git a/doc/build/core/exceptions.rst b/doc/build/core/exceptions.rst
index 30270f8b0..63bbc1e15 100644
--- a/doc/build/core/exceptions.rst
+++ b/doc/build/core/exceptions.rst
@@ -2,4 +2,4 @@ Core Exceptions
===============
.. automodule:: sqlalchemy.exc
- :members: \ No newline at end of file
+ :members:
diff --git a/doc/build/core/expression_api.rst b/doc/build/core/expression_api.rst
index 99bb98881..b32fa0e23 100644
--- a/doc/build/core/expression_api.rst
+++ b/doc/build/core/expression_api.rst
@@ -16,5 +16,5 @@ see :ref:`sqlexpression_toplevel`.
selectable
dml
functions
- types
-
+ compiler
+ serializer
diff --git a/doc/build/core/functions.rst b/doc/build/core/functions.rst
index d284d125f..90164850d 100644
--- a/doc/build/core/functions.rst
+++ b/doc/build/core/functions.rst
@@ -22,6 +22,7 @@ return types are in use.
.. automodule:: sqlalchemy.sql.functions
:members:
:undoc-members:
-
+ :exclude-members: func
+
diff --git a/doc/build/core/index.rst b/doc/build/core/index.rst
index 210f28412..26c26af07 100644
--- a/doc/build/core/index.rst
+++ b/doc/build/core/index.rst
@@ -9,19 +9,11 @@ In contrast to the ORM’s domain-centric mode of usage, the SQL Expression
Language provides a schema-centric usage paradigm.
.. toctree::
- :maxdepth: 3
+ :maxdepth: 2
tutorial
expression_api
schema
- engines
- connections
- pooling
- event
- events
- compiler
- inspection
- serializer
- interfaces
- exceptions
- internals
+ types
+ engines_connections
+ api_basics
diff --git a/doc/build/core/internals.rst b/doc/build/core/internals.rst
index 1a85e9e6c..81b4f1a81 100644
--- a/doc/build/core/internals.rst
+++ b/doc/build/core/internals.rst
@@ -7,6 +7,9 @@ Some key internal constructs are listed here.
.. currentmodule: sqlalchemy
+.. autoclass:: sqlalchemy.schema.ColumnCollectionMixin
+ :members:
+
.. autoclass:: sqlalchemy.engine.interfaces.Compiled
:members:
@@ -29,6 +32,10 @@ Some key internal constructs are listed here.
:members:
+.. autoclass:: sqlalchemy.log.Identified
+ :members:
+
+
.. autoclass:: sqlalchemy.sql.compiler.IdentifierPreparer
:members:
diff --git a/doc/build/core/metadata.rst b/doc/build/core/metadata.rst
index d6fc8c6af..e46217c17 100644
--- a/doc/build/core/metadata.rst
+++ b/doc/build/core/metadata.rst
@@ -316,6 +316,7 @@ Column, Table, MetaData API
.. autoclass:: SchemaItem
:members:
+ :undoc-members:
.. autoclass:: Table
:members:
diff --git a/doc/build/core/schema.rst b/doc/build/core/schema.rst
index aeb04be18..8553ebcbf 100644
--- a/doc/build/core/schema.rst
+++ b/doc/build/core/schema.rst
@@ -33,7 +33,7 @@ real DDL. They are therefore most intuitive to those who have some background
in creating real schema generation scripts.
.. toctree::
- :maxdepth: 1
+ :maxdepth: 2
metadata
reflection
@@ -41,5 +41,3 @@ in creating real schema generation scripts.
constraints
ddl
-
-
diff --git a/doc/build/core/selectable.rst b/doc/build/core/selectable.rst
index 52acb28e5..03ebeb4ab 100644
--- a/doc/build/core/selectable.rst
+++ b/doc/build/core/selectable.rst
@@ -60,6 +60,9 @@ elements are themselves :class:`.ColumnElement` subclasses).
.. autoclass:: HasPrefixes
:members:
+.. autoclass:: HasSuffixes
+ :members:
+
.. autoclass:: Join
:members:
:inherited-members:
diff --git a/doc/build/core/sqla_engine_arch.png b/doc/build/core/sqla_engine_arch.png
index f54d105bd..f040a2cf3 100644
--- a/doc/build/core/sqla_engine_arch.png
+++ b/doc/build/core/sqla_engine_arch.png
Binary files differ
diff --git a/doc/build/core/tutorial.rst b/doc/build/core/tutorial.rst
index 04a25b174..e96217f79 100644
--- a/doc/build/core/tutorial.rst
+++ b/doc/build/core/tutorial.rst
@@ -307,6 +307,8 @@ them is different across different databases; each database's
determine the correct value (or values; note that ``inserted_primary_key``
returns a list so that it supports composite primary keys).
+.. _execute_multiple:
+
Executing Multiple Statements
==============================
@@ -368,7 +370,7 @@ Selecting
==========
We began with inserts just so that our test database had some data in it. The
-more interesting part of the data is selecting it ! We'll cover UPDATE and
+more interesting part of the data is selecting it! We'll cover UPDATE and
DELETE statements later. The primary construct used to generate SELECT
statements is the :func:`.select` function:
diff --git a/doc/build/core/type_api.rst b/doc/build/core/type_api.rst
new file mode 100644
index 000000000..88da4939e
--- /dev/null
+++ b/doc/build/core/type_api.rst
@@ -0,0 +1,22 @@
+.. module:: sqlalchemy.types
+
+.. _types_api:
+
+Base Type API
+--------------
+
+.. autoclass:: TypeEngine
+ :members:
+
+
+.. autoclass:: Concatenable
+ :members:
+ :inherited-members:
+
+
+.. autoclass:: NullType
+
+
+.. autoclass:: Variant
+   :members: with_variant, __init__
diff --git a/doc/build/core/type_basics.rst b/doc/build/core/type_basics.rst
new file mode 100644
index 000000000..1ff1baac2
--- /dev/null
+++ b/doc/build/core/type_basics.rst
@@ -0,0 +1,229 @@
+Column and Data Types
+=====================
+
+.. module:: sqlalchemy.types
+
+SQLAlchemy provides abstractions for most common database data types,
+and a mechanism for specifying your own custom data types.
+
+The methods and attributes of type objects are rarely used directly.
+Type objects are supplied to :class:`~sqlalchemy.schema.Table` definitions
+and can be supplied as type hints to functions for occasions where
+the database driver returns an incorrect type.
+
+.. code-block:: pycon
+
+ >>> users = Table('users', metadata,
+ ... Column('id', Integer, primary_key=True),
+ ... Column('login', String(32))
+ ... )
+
+
+SQLAlchemy will use the ``Integer`` and ``String(32)`` type
+information when issuing a ``CREATE TABLE`` statement and will use it
+again when reading back rows ``SELECTed`` from the database.
+Functions that accept a type (such as :func:`~sqlalchemy.schema.Column`) will
+typically accept a type class or instance; ``Integer`` is equivalent
+to ``Integer()`` with no construction arguments in this case.
+
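To make the class/instance equivalence concrete, here is a minimal sketch (table names are hypothetical) that compiles the same ``CREATE TABLE`` DDL from both spellings:

```python
from sqlalchemy import Column, Integer, MetaData, Table
from sqlalchemy.schema import CreateTable

metadata = MetaData()

# passing the Integer class and an Integer() instance are
# interchangeable when the type takes no construction arguments
t_class = Table('t_class', metadata,
                Column('id', Integer, primary_key=True))
t_instance = Table('t_instance', metadata,
                   Column('id', Integer(), primary_key=True))

ddl_class = str(CreateTable(t_class))
ddl_instance = str(CreateTable(t_instance))
print(ddl_class)
```

Both tables render an ``id INTEGER`` column with a primary key constraint.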
+.. _types_generic:
+
+Generic Types
+-------------
+
+Generic types specify a column that can read, write and store a
+particular type of Python data. SQLAlchemy will choose the best
+database column type available on the target database when issuing a
+``CREATE TABLE`` statement. For complete control over which column
+type is emitted in ``CREATE TABLE``, such as ``VARCHAR``, see `SQL
+Standard Types`_ and the other sections of this chapter.
+
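The backend-specific rendering of a generic type can be observed by compiling DDL against different dialects; a small sketch:

```python
from sqlalchemy import Column, LargeBinary, MetaData, Table
from sqlalchemy.schema import CreateTable
from sqlalchemy.dialects import postgresql, sqlite

metadata = MetaData()
blobs = Table('blobs', metadata, Column('data', LargeBinary))

# the generic LargeBinary type is rendered as the best
# available binary type for each target backend
pg_ddl = str(CreateTable(blobs).compile(dialect=postgresql.dialect()))
sqlite_ddl = str(CreateTable(blobs).compile(dialect=sqlite.dialect()))
print(pg_ddl)      # renders BYTEA on PostgreSQL
print(sqlite_ddl)  # renders BLOB on SQLite
```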
+.. autoclass:: BigInteger
+ :members:
+
+.. autoclass:: Boolean
+ :members:
+
+.. autoclass:: Date
+ :members:
+
+.. autoclass:: DateTime
+ :members:
+
+.. autoclass:: Enum
+ :members: __init__, create, drop
+
+.. autoclass:: Float
+ :members:
+
+.. autoclass:: Integer
+ :members:
+
+.. autoclass:: Interval
+ :members:
+
+.. autoclass:: LargeBinary
+ :members:
+
+.. autoclass:: MatchType
+ :members:
+
+.. autoclass:: Numeric
+ :members:
+
+.. autoclass:: PickleType
+ :members:
+
+.. autoclass:: SchemaType
+ :members:
+ :undoc-members:
+
+.. autoclass:: SmallInteger
+ :members:
+
+.. autoclass:: String
+ :members:
+
+.. autoclass:: Text
+ :members:
+
+.. autoclass:: Time
+ :members:
+
+.. autoclass:: Unicode
+ :members:
+
+.. autoclass:: UnicodeText
+ :members:
+
+.. _types_sqlstandard:
+
+SQL Standard Types
+------------------
+
+The SQL standard types always create database column types of the same
+name when ``CREATE TABLE`` is issued. Some types may not be supported
+on all databases.
+
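A short sketch of this behavior (table and column names hypothetical): the uppercase types pass through to the DDL exactly as named, with their arguments:

```python
from sqlalchemy import Column, MetaData, Table
from sqlalchemy.schema import CreateTable
from sqlalchemy.types import NUMERIC, VARCHAR

metadata = MetaData()
accounts = Table('accounts', metadata,
                 Column('name', VARCHAR(100)),
                 Column('balance', NUMERIC(10, 2)))

# uppercase types render verbatim rather than being
# adapted to a backend-specific equivalent
ddl = str(CreateTable(accounts))
print(ddl)
```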
+.. autoclass:: BIGINT
+
+
+.. autoclass:: BINARY
+
+
+.. autoclass:: BLOB
+
+
+.. autoclass:: BOOLEAN
+
+
+.. autoclass:: CHAR
+
+
+.. autoclass:: CLOB
+
+
+.. autoclass:: DATE
+
+
+.. autoclass:: DATETIME
+
+
+.. autoclass:: DECIMAL
+
+
+.. autoclass:: FLOAT
+
+
+.. autoclass:: INT
+
+
+.. autoclass:: sqlalchemy.types.INTEGER
+
+
+.. autoclass:: NCHAR
+
+
+.. autoclass:: NVARCHAR
+
+
+.. autoclass:: NUMERIC
+
+
+.. autoclass:: REAL
+
+
+.. autoclass:: SMALLINT
+
+
+.. autoclass:: TEXT
+
+
+.. autoclass:: TIME
+
+
+.. autoclass:: TIMESTAMP
+
+
+.. autoclass:: VARBINARY
+
+
+.. autoclass:: VARCHAR
+
+
+.. _types_vendor:
+
+Vendor-Specific Types
+---------------------
+
+Database-specific types are also available for import from each
+database's dialect module. See the :ref:`dialect_toplevel`
+reference for the database you're interested in.
+
+For example, MySQL has a ``BIGINT`` type and PostgreSQL has an
+``INET`` type. To use these, import them from the module explicitly::
+
+ from sqlalchemy.dialects import mysql
+
+ table = Table('foo', metadata,
+ Column('id', mysql.BIGINT),
+ Column('enumerates', mysql.ENUM('a', 'b', 'c'))
+ )
+
+Or some PostgreSQL types::
+
+ from sqlalchemy.dialects import postgresql
+
+ table = Table('foo', metadata,
+ Column('ipaddress', postgresql.INET),
+ Column('elements', postgresql.ARRAY(String))
+ )
+
+Each dialect provides the full set of type names supported by
+that backend within its ``__all__`` collection, so that a simple
+``import *`` or similar will import all supported types as
+implemented for that backend::
+
+ from sqlalchemy.dialects.postgresql import *
+
+ t = Table('mytable', metadata,
+ Column('id', INTEGER, primary_key=True),
+ Column('name', VARCHAR(300)),
+ Column('inetaddr', INET)
+ )
+
+Above, the ``INTEGER`` and ``VARCHAR`` types are ultimately those from
+``sqlalchemy.types``, and ``INET`` is specific to the PostgreSQL dialect.
+
+Some dialect-level types have the same name as the SQL standard type,
+but also provide additional arguments. For example, MySQL implements
+the full range of character and string types, including additional arguments
+such as ``collation`` and ``charset``::
+
+ from sqlalchemy.dialects.mysql import VARCHAR, TEXT
+
+ table = Table('foo', meta,
+ Column('col1', VARCHAR(200, collation='binary')),
+ Column('col2', TEXT(charset='latin1'))
+ )
+
diff --git a/doc/build/core/types.rst b/doc/build/core/types.rst
index 14e30e46d..ab761a1cb 100644
--- a/doc/build/core/types.rst
+++ b/doc/build/core/types.rst
@@ -3,744 +3,9 @@
Column and Data Types
=====================
-.. module:: sqlalchemy.types
+.. toctree::
+ :maxdepth: 2
-SQLAlchemy provides abstractions for most common database data types,
-and a mechanism for specifying your own custom data types.
-
-The methods and attributes of type objects are rarely used directly.
-Type objects are supplied to :class:`~sqlalchemy.schema.Table` definitions
-and can be supplied as type hints to `functions` for occasions where
-the database driver returns an incorrect type.
-
-.. code-block:: pycon
-
- >>> users = Table('users', metadata,
- ... Column('id', Integer, primary_key=True)
- ... Column('login', String(32))
- ... )
-
-
-SQLAlchemy will use the ``Integer`` and ``String(32)`` type
-information when issuing a ``CREATE TABLE`` statement and will use it
-again when reading back rows ``SELECTed`` from the database.
-Functions that accept a type (such as :func:`~sqlalchemy.schema.Column`) will
-typically accept a type class or instance; ``Integer`` is equivalent
-to ``Integer()`` with no construction arguments in this case.
-
-.. _types_generic:
-
-Generic Types
--------------
-
-Generic types specify a column that can read, write and store a
-particular type of Python data. SQLAlchemy will choose the best
-database column type available on the target database when issuing a
-``CREATE TABLE`` statement. For complete control over which column
-type is emitted in ``CREATE TABLE``, such as ``VARCHAR`` see `SQL
-Standard Types`_ and the other sections of this chapter.
-
-.. autoclass:: BigInteger
- :members:
-
-.. autoclass:: Boolean
- :members:
-
-.. autoclass:: Date
- :members:
-
-.. autoclass:: DateTime
- :members:
-
-.. autoclass:: Enum
- :members: __init__, create, drop
-
-.. autoclass:: Float
- :members:
-
-.. autoclass:: Integer
- :members:
-
-.. autoclass:: Interval
- :members:
-
-.. autoclass:: LargeBinary
- :members:
-
-.. autoclass:: Numeric
- :members:
-
-.. autoclass:: PickleType
- :members:
-
-.. autoclass:: SchemaType
- :members:
- :undoc-members:
-
-.. autoclass:: SmallInteger
- :members:
-
-.. autoclass:: String
- :members:
-
-.. autoclass:: Text
- :members:
-
-.. autoclass:: Time
- :members:
-
-.. autoclass:: Unicode
- :members:
-
-.. autoclass:: UnicodeText
- :members:
-
-.. _types_sqlstandard:
-
-SQL Standard Types
-------------------
-
-The SQL standard types always create database column types of the same
-name when ``CREATE TABLE`` is issued. Some types may not be supported
-on all databases.
-
-.. autoclass:: BIGINT
-
-
-.. autoclass:: BINARY
-
-
-.. autoclass:: BLOB
-
-
-.. autoclass:: BOOLEAN
-
-
-.. autoclass:: CHAR
-
-
-.. autoclass:: CLOB
-
-
-.. autoclass:: DATE
-
-
-.. autoclass:: DATETIME
-
-
-.. autoclass:: DECIMAL
-
-
-.. autoclass:: FLOAT
-
-
-.. autoclass:: INT
-
-
-.. autoclass:: sqlalchemy.types.INTEGER
-
-
-.. autoclass:: NCHAR
-
-
-.. autoclass:: NVARCHAR
-
-
-.. autoclass:: NUMERIC
-
-
-.. autoclass:: REAL
-
-
-.. autoclass:: SMALLINT
-
-
-.. autoclass:: TEXT
-
-
-.. autoclass:: TIME
-
-
-.. autoclass:: TIMESTAMP
-
-
-.. autoclass:: VARBINARY
-
-
-.. autoclass:: VARCHAR
-
-
-.. _types_vendor:
-
-Vendor-Specific Types
----------------------
-
-Database-specific types are also available for import from each
-database's dialect module. See the :ref:`dialect_toplevel`
-reference for the database you're interested in.
-
-For example, MySQL has a ``BIGINT`` type and PostgreSQL has an
-``INET`` type. To use these, import them from the module explicitly::
-
- from sqlalchemy.dialects import mysql
-
- table = Table('foo', metadata,
- Column('id', mysql.BIGINT),
- Column('enumerates', mysql.ENUM('a', 'b', 'c'))
- )
-
-Or some PostgreSQL types::
-
- from sqlalchemy.dialects import postgresql
-
- table = Table('foo', metadata,
- Column('ipaddress', postgresql.INET),
- Column('elements', postgresql.ARRAY(String))
- )
-
-Each dialect provides the full set of typenames supported by
-that backend within its `__all__` collection, so that a simple
-`import *` or similar will import all supported types as
-implemented for that backend::
-
- from sqlalchemy.dialects.postgresql import *
-
- t = Table('mytable', metadata,
- Column('id', INTEGER, primary_key=True),
- Column('name', VARCHAR(300)),
- Column('inetaddr', INET)
- )
-
-Where above, the INTEGER and VARCHAR types are ultimately from
-sqlalchemy.types, and INET is specific to the Postgresql dialect.
-
-Some dialect level types have the same name as the SQL standard type,
-but also provide additional arguments. For example, MySQL implements
-the full range of character and string types including additional arguments
-such as `collation` and `charset`::
-
- from sqlalchemy.dialects.mysql import VARCHAR, TEXT
-
- table = Table('foo', meta,
- Column('col1', VARCHAR(200, collation='binary')),
- Column('col2', TEXT(charset='latin1'))
- )
-
-.. _types_custom:
-
-Custom Types
-------------
-
-A variety of methods exist to redefine the behavior of existing types
-as well as to provide new ones.
-
-Overriding Type Compilation
-~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-A frequent need is to force the "string" version of a type, that is
-the one rendered in a CREATE TABLE statement or other SQL function
-like CAST, to be changed. For example, an application may want
-to force the rendering of ``BINARY`` for all platforms
-except for one, in which is wants ``BLOB`` to be rendered. Usage
-of an existing generic type, in this case :class:`.LargeBinary`, is
-preferred for most use cases. But to control
-types more accurately, a compilation directive that is per-dialect
-can be associated with any type::
-
- from sqlalchemy.ext.compiler import compiles
- from sqlalchemy.types import BINARY
-
- @compiles(BINARY, "sqlite")
- def compile_binary_sqlite(type_, compiler, **kw):
- return "BLOB"
-
-The above code allows the usage of :class:`.types.BINARY`, which
-will produce the string ``BINARY`` against all backends except SQLite,
-in which case it will produce ``BLOB``.
-
-See the section :ref:`type_compilation_extension`, a subsection of
-:ref:`sqlalchemy.ext.compiler_toplevel`, for additional examples.
-
-.. _types_typedecorator:
-
-Augmenting Existing Types
-~~~~~~~~~~~~~~~~~~~~~~~~~
-
-The :class:`.TypeDecorator` allows the creation of custom types which
-add bind-parameter and result-processing behavior to an existing
-type object. It is used when additional in-Python marshaling of data
-to and from the database is required.
-
-.. note::
-
- The bind- and result-processing of :class:`.TypeDecorator`
- is *in addition* to the processing already performed by the hosted
- type, which is customized by SQLAlchemy on a per-DBAPI basis to perform
- processing specific to that DBAPI. To change the DBAPI-level processing
- for an existing type, see the section :ref:`replacing_processors`.
-
-.. autoclass:: TypeDecorator
- :members:
- :inherited-members:
-
-
-TypeDecorator Recipes
-~~~~~~~~~~~~~~~~~~~~~
-A few key :class:`.TypeDecorator` recipes follow.
-
-.. _coerce_to_unicode:
-
-Coercing Encoded Strings to Unicode
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-A common source of confusion regarding the :class:`.Unicode` type
-is that it is intended to deal *only* with Python ``unicode`` objects
-on the Python side, meaning values passed to it as bind parameters
-must be of the form ``u'some string'`` if using Python 2 and not 3.
-The encoding/decoding functions it performs are only to suit what the
-DBAPI in use requires, and are primarily a private implementation detail.
-
-The use case of a type that can safely receive Python bytestrings,
-that is strings that contain non-ASCII characters and are not ``u''``
-objects in Python 2, can be achieved using a :class:`.TypeDecorator`
-which coerces as needed::
-
- from sqlalchemy.types import TypeDecorator, Unicode
-
- class CoerceUTF8(TypeDecorator):
- """Safely coerce Python bytestrings to Unicode
- before passing off to the database."""
-
- impl = Unicode
-
- def process_bind_param(self, value, dialect):
- if isinstance(value, str):
- value = value.decode('utf-8')
- return value
-
-Rounding Numerics
-^^^^^^^^^^^^^^^^^
-
-Some database connectors like those of SQL Server choke if a Decimal is passed with too
-many decimal places. Here's a recipe that rounds them down::
-
- from sqlalchemy.types import TypeDecorator, Numeric
- from decimal import Decimal
-
- class SafeNumeric(TypeDecorator):
- """Adds quantization to Numeric."""
-
- impl = Numeric
-
- def __init__(self, *arg, **kw):
- TypeDecorator.__init__(self, *arg, **kw)
- self.quantize_int = -(self.impl.precision - self.impl.scale)
- self.quantize = Decimal(10) ** self.quantize_int
-
- def process_bind_param(self, value, dialect):
- if isinstance(value, Decimal) and \
- value.as_tuple()[2] < self.quantize_int:
- value = value.quantize(self.quantize)
- return value
-
-.. _custom_guid_type:
-
-Backend-agnostic GUID Type
-^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-Receives and returns Python uuid() objects. Uses the PG UUID type
-when using Postgresql, CHAR(32) on other backends, storing them
-in stringified hex format. Can be modified to store
-binary in CHAR(16) if desired::
-
- from sqlalchemy.types import TypeDecorator, CHAR
- from sqlalchemy.dialects.postgresql import UUID
- import uuid
-
- class GUID(TypeDecorator):
- """Platform-independent GUID type.
-
- Uses Postgresql's UUID type, otherwise uses
- CHAR(32), storing as stringified hex values.
-
- """
- impl = CHAR
-
- def load_dialect_impl(self, dialect):
- if dialect.name == 'postgresql':
- return dialect.type_descriptor(UUID())
- else:
- return dialect.type_descriptor(CHAR(32))
-
- def process_bind_param(self, value, dialect):
- if value is None:
- return value
- elif dialect.name == 'postgresql':
- return str(value)
- else:
- if not isinstance(value, uuid.UUID):
- return "%.32x" % uuid.UUID(value)
- else:
- # hexstring
- return "%.32x" % value
-
- def process_result_value(self, value, dialect):
- if value is None:
- return value
- else:
- return uuid.UUID(value)
-
-Marshal JSON Strings
-^^^^^^^^^^^^^^^^^^^^^
-
-This type uses ``simplejson`` to marshal Python data structures
-to/from JSON. Can be modified to use Python's builtin json encoder::
-
- from sqlalchemy.types import TypeDecorator, VARCHAR
- import json
-
- class JSONEncodedDict(TypeDecorator):
- """Represents an immutable structure as a json-encoded string.
-
- Usage::
-
- JSONEncodedDict(255)
-
- """
-
- impl = VARCHAR
-
- def process_bind_param(self, value, dialect):
- if value is not None:
- value = json.dumps(value)
-
- return value
-
- def process_result_value(self, value, dialect):
- if value is not None:
- value = json.loads(value)
- return value
-
-Note that the ORM by default will not detect "mutability" on such a type -
-meaning, in-place changes to values will not be detected and will not be
-flushed. Without further steps, you instead would need to replace the existing
-value with a new one on each parent object to detect changes. Note that
-there's nothing wrong with this, as many applications may not require that the
-values are ever mutated once created. For those which do have this requirement,
-support for mutability is best applied using the ``sqlalchemy.ext.mutable``
-extension - see the example in :ref:`mutable_toplevel`.
-
-.. _replacing_processors:
-
-Replacing the Bind/Result Processing of Existing Types
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-Most augmentation of type behavior at the bind/result level
-is achieved using :class:`.TypeDecorator`. For the rare scenario
-where the specific processing applied by SQLAlchemy at the DBAPI
-level needs to be replaced, the SQLAlchemy type can be subclassed
-directly, and the ``bind_processor()`` or ``result_processor()``
-methods can be overridden. Doing so requires that the
-``adapt()`` method also be overridden. This method is the mechanism
-by which SQLAlchemy produces DBAPI-specific type behavior during
-statement execution. Overriding it allows a copy of the custom
-type to be used in lieu of a DBAPI-specific type. Below we subclass
-the :class:`.types.TIME` type to have custom result processing behavior.
-The ``process()`` function will receive ``value`` from the DBAPI
-cursor directly::
-
- class MySpecialTime(TIME):
- def __init__(self, special_argument):
- super(MySpecialTime, self).__init__()
- self.special_argument = special_argument
-
- def result_processor(self, dialect, coltype):
- import datetime
- time = datetime.time
- def process(value):
- if value is not None:
- microseconds = value.microseconds
- seconds = value.seconds
- minutes = seconds / 60
- return time(
- minutes / 60,
- minutes % 60,
- seconds - minutes * 60,
- microseconds)
- else:
- return None
- return process
-
- def adapt(self, impltype):
- return MySpecialTime(self.special_argument)
-
-.. _types_sql_value_processing:
-
-Applying SQL-level Bind/Result Processing
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-As seen in the sections :ref:`types_typedecorator` and :ref:`replacing_processors`,
-SQLAlchemy allows Python functions to be invoked both when parameters are sent
-to a statement, as well as when result rows are loaded from the database, to apply
-transformations to the values as they are sent to or from the database. It is also
-possible to define SQL-level transformations as well. The rationale here is when
-only the relational database contains a particular series of functions that are necessary
-to coerce incoming and outgoing data between an application and persistence format.
-Examples include using database-defined encryption/decryption functions, as well
-as stored procedures that handle geographic data. The Postgis extension to Postgresql
-includes an extensive array of SQL functions that are necessary for coercing
-data into particular formats.
-
-Any :class:`.TypeEngine`, :class:`.UserDefinedType` or :class:`.TypeDecorator` subclass
-can include implementations of
-:meth:`.TypeEngine.bind_expression` and/or :meth:`.TypeEngine.column_expression`, which
-when defined to return a non-``None`` value should return a :class:`.ColumnElement`
-expression to be injected into the SQL statement, either surrounding
-bound parameters or a column expression. For example, to build a ``Geometry``
-type which will apply the Postgis function ``ST_GeomFromText`` to all outgoing
-values and the function ``ST_AsText`` to all incoming data, we can create
-our own subclass of :class:`.UserDefinedType` which provides these methods
-in conjunction with :data:`~.sqlalchemy.sql.expression.func`::
-
- from sqlalchemy import func
- from sqlalchemy.types import UserDefinedType
-
- class Geometry(UserDefinedType):
- def get_col_spec(self):
- return "GEOMETRY"
-
- def bind_expression(self, bindvalue):
- return func.ST_GeomFromText(bindvalue, type_=self)
-
- def column_expression(self, col):
- return func.ST_AsText(col, type_=self)
-
-We can apply the ``Geometry`` type into :class:`.Table` metadata
-and use it in a :func:`.select` construct::
-
- geometry = Table('geometry', metadata,
- Column('geom_id', Integer, primary_key=True),
- Column('geom_data', Geometry)
- )
-
- print select([geometry]).where(
- geometry.c.geom_data == 'LINESTRING(189412 252431,189631 259122)')
-
-The resulting SQL embeds both functions as appropriate. ``ST_AsText``
-is applied to the columns clause so that the return value is run through
-the function before passing into a result set, and ``ST_GeomFromText``
-is run on the bound parameter so that the passed-in value is converted::
-
- SELECT geometry.geom_id, ST_AsText(geometry.geom_data) AS geom_data_1
- FROM geometry
- WHERE geometry.geom_data = ST_GeomFromText(:geom_data_2)
-
-The :meth:`.TypeEngine.column_expression` method interacts with the
-mechanics of the compiler such that the SQL expression does not interfere
-with the labeling of the wrapped expression. Such as, if we rendered
-a :func:`.select` against a :func:`.label` of our expression, the string
-label is moved to the outside of the wrapped expression::
-
- print select([geometry.c.geom_data.label('my_data')])
-
-Output::
-
- SELECT ST_AsText(geometry.geom_data) AS my_data
- FROM geometry
-
-For an example of subclassing a built in type directly, we subclass
-:class:`.postgresql.BYTEA` to provide a ``PGPString``, which will make use of the
-Postgresql ``pgcrypto`` extension to encrpyt/decrypt values
-transparently::
-
- from sqlalchemy import create_engine, String, select, func, \
- MetaData, Table, Column, type_coerce
-
- from sqlalchemy.dialects.postgresql import BYTEA
-
- class PGPString(BYTEA):
- def __init__(self, passphrase, length=None):
- super(PGPString, self).__init__(length)
- self.passphrase = passphrase
-
- def bind_expression(self, bindvalue):
- # convert the bind's type from PGPString to
- # String, so that it's passed to psycopg2 as is without
- # a dbapi.Binary wrapper
- bindvalue = type_coerce(bindvalue, String)
- return func.pgp_sym_encrypt(bindvalue, self.passphrase)
-
- def column_expression(self, col):
- return func.pgp_sym_decrypt(col, self.passphrase)
-
- metadata = MetaData()
- message = Table('message', metadata,
- Column('username', String(50)),
- Column('message',
- PGPString("this is my passphrase", length=1000)),
- )
-
- engine = create_engine("postgresql://scott:tiger@localhost/test", echo=True)
- with engine.begin() as conn:
- metadata.create_all(conn)
-
- conn.execute(message.insert(), username="some user",
- message="this is my message")
-
- print conn.scalar(
- select([message.c.message]).\
- where(message.c.username == "some user")
- )
-
-The ``pgp_sym_encrypt`` and ``pgp_sym_decrypt`` functions are applied
-to the INSERT and SELECT statements::
-
- INSERT INTO message (username, message)
- VALUES (%(username)s, pgp_sym_encrypt(%(message)s, %(pgp_sym_encrypt_1)s))
- {'username': 'some user', 'message': 'this is my message',
- 'pgp_sym_encrypt_1': 'this is my passphrase'}
-
- SELECT pgp_sym_decrypt(message.message, %(pgp_sym_decrypt_1)s) AS message_1
- FROM message
- WHERE message.username = %(username_1)s
- {'pgp_sym_decrypt_1': 'this is my passphrase', 'username_1': 'some user'}
-
-
-.. versionadded:: 0.8 Added the :meth:`.TypeEngine.bind_expression` and
- :meth:`.TypeEngine.column_expression` methods.
-
-See also:
-
-:ref:`examples_postgis`
-
-.. _types_operators:
-
-Redefining and Creating New Operators
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-SQLAlchemy Core defines a fixed set of expression operators available to all column expressions.
-Some of these operations have the effect of overloading Python's built in operators;
-examples of such operators include
-:meth:`.ColumnOperators.__eq__` (``table.c.somecolumn == 'foo'``),
-:meth:`.ColumnOperators.__invert__` (``~table.c.flag``),
-and :meth:`.ColumnOperators.__add__` (``table.c.x + table.c.y``). Other operators are exposed as
-explicit methods on column expressions, such as
-:meth:`.ColumnOperators.in_` (``table.c.value.in_(['x', 'y'])``) and :meth:`.ColumnOperators.like`
-(``table.c.value.like('%ed%')``).
-
-The Core expression constructs in all cases consult the type of the expression in order to determine
-the behavior of existing operators, as well as to locate additional operators that aren't part of
-the built in set. The :class:`.TypeEngine` base class defines a root "comparison" implementation
-:class:`.TypeEngine.Comparator`, and many specific types provide their own sub-implementations of this
-class. User-defined :class:`.TypeEngine.Comparator` implementations can be built directly into a
-simple subclass of a particular type in order to override or define new operations. Below,
-we create a :class:`.Integer` subclass which overrides the :meth:`.ColumnOperators.__add__` operator::
-
- from sqlalchemy import Integer
-
- class MyInt(Integer):
- class comparator_factory(Integer.Comparator):
- def __add__(self, other):
- return self.op("goofy")(other)
-
-The above configuration creates a new class ``MyInt``, which
-establishes the :attr:`.TypeEngine.comparator_factory` attribute as
-referring to a new class, subclassing the :class:`.TypeEngine.Comparator` class
-associated with the :class:`.Integer` type.
-
-Usage::
-
- >>> sometable = Table("sometable", metadata, Column("data", MyInt))
- >>> print sometable.c.data + 5
- sometable.data goofy :data_1
-
-The implementation for :meth:`.ColumnOperators.__add__` is consulted
-by an owning SQL expression, by instantiating the :class:`.TypeEngine.Comparator` with
-itself as the ``expr`` attribute. The mechanics of the expression
-system are such that operations continue recursively until an
-expression object produces a new SQL expression construct. Above, we
-could just as well have said ``self.expr.op("goofy")(other)`` instead
-of ``self.op("goofy")(other)``.
-
-New methods added to a :class:`.TypeEngine.Comparator` are exposed on an
-owning SQL expression
-using a ``__getattr__`` scheme, which exposes methods added to
-:class:`.TypeEngine.Comparator` onto the owning :class:`.ColumnElement`.
-For example, to add a ``log()`` function
-to integers::
-
- from sqlalchemy import Integer, func
-
- class MyInt(Integer):
- class comparator_factory(Integer.Comparator):
- def log(self, other):
- return func.log(self.expr, other)
-
-Using the above type::
-
- >>> print sometable.c.data.log(5)
- log(:log_1, :log_2)
-
-
-Unary operations
-are also possible. For example, to add an implementation of the
-Postgresql factorial operator, we combine the :class:`.UnaryExpression` construct
-along with a :class:`.custom_op` to produce the factorial expression::
-
- from sqlalchemy import Integer
- from sqlalchemy.sql.expression import UnaryExpression
- from sqlalchemy.sql import operators
-
- class MyInteger(Integer):
- class comparator_factory(Integer.Comparator):
- def factorial(self):
- return UnaryExpression(self.expr,
- modifier=operators.custom_op("!"),
- type_=MyInteger)
-
-Using the above type::
-
- >>> from sqlalchemy.sql import column
- >>> print column('x', MyInteger).factorial()
- x !
-
-See also:
-
-:attr:`.TypeEngine.comparator_factory`
-
-.. versionadded:: 0.8 The expression system was enhanced to support
- customization of operators on a per-type level.
-
-
-Creating New Types
-~~~~~~~~~~~~~~~~~~
-
-The :class:`.UserDefinedType` class is provided as a simple base class
-for defining entirely new database types. Use this to represent native
-database types not known by SQLAlchemy. If only Python translation behavior
-is needed, use :class:`.TypeDecorator` instead.
-
-.. autoclass:: UserDefinedType
- :members:
-
-
-.. _types_api:
-
-Base Type API
---------------
-
-.. autoclass:: TypeEngine
- :members:
-
-
-.. autoclass:: Concatenable
- :members:
- :inherited-members:
-
-
-.. autoclass:: NullType
-
-
-.. autoclass:: Variant
-
- :members: with_variant, __init__
+ type_basics
+ custom_types
+ type_api