Subpackages
- pydal.adapters package
- Submodules
- pydal.adapters.base module
- pydal.adapters.couchdb module
- pydal.adapters.cubrid module
- pydal.adapters.db2 module
- pydal.adapters.firebird module
- pydal.adapters.google_adapters module
- pydal.adapters.imap module
- pydal.adapters.informix module
- pydal.adapters.ingres module
- pydal.adapters.mongo module
- pydal.adapters.mssql module
- pydal.adapters.mysql module
- pydal.adapters.oracle module
- pydal.adapters.postgres module
- pydal.adapters.sapdb module
- pydal.adapters.sqlite module
- pydal.adapters.teradata module
- Module contents
- pydal.helpers package
Submodules
pydal.base module
This file contains the DAL support for many relational databases, including:
- SQLite & SpatiaLite
- MySQL
- Postgres
- Firebird
- Oracle
- MS SQL
- DB2
- Interbase
- Ingres
- Informix (9+ and SE)
- SapDB (experimental)
- Cubrid (experimental)
- CouchDB (experimental)
- MongoDB (in progress)
- Google:nosql
- Google:sql
- Teradata
- IMAP (experimental)
Example of usage:
>>> from pydal import DAL, Field
### create DAL connection (and create DB if it doesn't exist)
>>> db = DAL(('sqlite://storage.sqlite','mysql://a:b@localhost/x'),
... folder=None)
### define a table 'person' (create/alter as necessary)
>>> person = db.define_table('person',Field('name','string'))
### insert a record
>>> id = person.insert(name='James')
### retrieve it by id
>>> james = person(id)
### retrieve it by name
>>> james = person(name='James')
### retrieve it by arbitrary query
>>> query = (person.name=='James') & (person.name.startswith('J'))
>>> james = db(query).select(person.ALL)[0]
### update one record
>>> james.update_record(name='Jim')
<Row {'id': 1, 'name': 'Jim'}>
### update multiple records by query
>>> db(person.name.like('J%')).update(name='James')
1
### delete records by query
>>> db(person.name.lower() == 'jim').delete()
0
### retrieve multiple records (rows)
>>> people = db(person).select(orderby=person.name,
... groupby=person.name, limitby=(0,100))
### further filter them
>>> james = people.find(lambda row: row.name == 'James').first()
>>> print(james.id, james.name)
1 James
### check aggregates
>>> counter = person.id.count()
>>> print(db(person).select(counter).first()(counter))
1
### delete one record
>>> james.delete_record()
1
### delete (drop) entire database table
>>> person.drop()
Supported DAL URI strings:
'sqlite://test.db'
'spatialite://test.db'
'sqlite:memory'
'spatialite:memory'
'jdbc:sqlite://test.db'
'mysql://root:none@localhost/test'
'postgres://mdipierro:password@localhost/test'
'postgres:psycopg2://mdipierro:password@localhost/test'
'postgres:pg8000://mdipierro:password@localhost/test'
'jdbc:postgres://mdipierro:none@localhost/test'
'mssql://web2py:none@A64X2/web2py_test'
'mssql2://web2py:none@A64X2/web2py_test' # alternate mappings
'mssql3://web2py:none@A64X2/web2py_test' # better pagination (requires >= 2005)
'mssql4://web2py:none@A64X2/web2py_test' # best pagination (requires >= 2012)
'oracle://username:password@database'
'firebird://user:password@server:3050/database'
'db2:ibm_db_dbi://DSN=dsn;UID=user;PWD=pass'
'db2:pyodbc://driver=DB2;hostname=host;database=database;uid=user;pwd=password;port=port'
'firebird://username:password@hostname/database'
'firebird_embedded://username:password@c://path'
'informix://user:password@server:3050/database'
'informixu://user:password@server:3050/database' # unicode informix
'ingres://database' # or use an ODBC connection string, e.g. 'ingres://dsn=dsn_name'
'google:datastore' # for google app engine datastore (uses ndb by default)
'google:sql' # for google app engine with sql (mysql compatible)
'teradata://DSN=dsn;UID=user;PWD=pass;DATABASE=database' # experimental
'imap://user:password@server:port' # experimental
'mongodb://user:password@server:port/database' # experimental
For more info:
help(DAL)
help(Field)
class pydal.base.DAL(uri='sqlite://dummy.db', pool_size=0, folder=None, db_codec='UTF-8', check_reserved=None, migrate=True, fake_migrate=False, migrate_enabled=True, fake_migrate_all=False, decode_credentials=False, driver_args=None, adapter_args=None, attempts=5, auto_import=False, bigint_id=False, debug=False, lazy_tables=False, db_uid=None, do_connect=True, after_connection=None, tables=None, ignore_field_case=True, entity_quoting=True, table_hash=None)[source]
Bases: pydal.helpers.classes.Serializable, pydal.helpers.classes.BasicStorage
An instance of this class represents a database connection
Parameters: - uri (str) – contains information for connecting to a database. Defaults to ‘sqlite://dummy.db’
Note
Experimental: you can specify a dictionary as the uri parameter, i.e. with:
db = DAL({"uri": "sqlite://storage.sqlite", "tables": {...}, ...})
For an example of dict input you can check the output of the scaffolding db model with db.as_dict().
Note that for compatibility with Python older than version 2.6.5 you should cast your dict input keys to str due to a syntax limitation on kwarg names. For proper DAL dictionary input you can use one of:
obj = serializers.cast_keys(dict, [encoding="utf-8"])
# or else (for parsing json input)
obj = serializers.loads_json(data, unicode_keys=False)
- pool_size – How many open connections to make to the database object.
- folder – where .table files will be created. Automatically set within web2py. Use an explicit path when using DAL outside web2py
- db_codec – string encoding of the database (default: ‘UTF-8’)
- table_hash – database identifier used for the .table files. If your connection hash changes you can still use the old .table files if they have db_hash as prefix
- check_reserved – list of adapters to check table names and column names against sql/nosql reserved keywords. Defaults to None
- ‘common’ List of sql keywords that are common to all database types such as “SELECT, INSERT”. (recommended)
- ‘all’ Checks against all known SQL keywords
- ‘<adaptername>’ Checks against the specific adapter’s list of keywords
- ‘<adaptername>_nonreserved’ Checks against the specific adapter’s list of nonreserved keywords (if available)
- migrate – sets default migrate behavior for all tables
- fake_migrate – sets default fake_migrate behavior for all tables
- migrate_enabled – If set to False disables ALL migrations
- fake_migrate_all – If set to True fake migrates ALL tables
- attempts – Number of times to attempt connecting
- auto_import – If set to True, automatically tries to import table definitions from the databases folder (works only for simple models)
- bigint_id – If set, turns on bigint instead of int for id and reference fields
- lazy_tables – delays table definition until table access
- after_connection – a callable that will be executed right after the connection is established
Example
Use as:
db = DAL('sqlite://test.db')
or:
db = DAL(**{"uri": ..., "tables": [...], ...})  # experimental
db.define_table('tablename', Field('fieldname1'), Field('fieldname2'))
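A minimal sketch of a more fully configured connection, combining keyword arguments documented above (all values shown are illustrative, not recommended defaults):

from pydal import DAL, Field

db = DAL(
    'sqlite://storage.sqlite',   # connection uri
    pool_size=10,                # keep up to 10 open connections
    folder='databases',          # where .table migration files live
    check_reserved=['common'],   # reject common reserved SQL keywords
    migrate=True,                # allow create/alter of tables
    lazy_tables=True,            # delay table definition until first access
    attempts=5,                  # connection retries before giving up
)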
class Row(*args, **kwargs)
Bases: pydal.helpers.classes.BasicStorage
A dictionary that lets you do d['a'] as well as d.a. This is only used to store a Row.

as_dict(datetime_to_str=False, custom_types=None)

as_json(mode='object', default=None, colnames=None, serialize=True, **kwargs)
Serializes the row to a JSON object. kwargs are passed to the .as_dict method; only “object” mode is supported.
serialize=False is used by Rows.as_json.
TODO: return array mode with query column order; mode and colnames are not implemented.

as_xml(row_name='row', colnames=None, indent=' ')

get(key, default=None)
class
DAL.
Rows
(db=None, records=[], colnames=[], compact=True, rawrows=None)¶ Bases:
pydal.objects.BasicRows
A wrapper for the return value of a select. It basically represents a table. It has an iterator and each row is represented as a Row dictionary.
append(row)

column(column=None)

exclude(f)
Removes elements from the calling Rows object, filtered by the function f, and returns a new Rows object containing the removed elements.
find(f, limitby=None)
Returns a new Rows object, a subset of the original object, filtered by the function f.
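A short sketch of find and exclude together, assuming the person table from the usage example above:

rows = db(db.person).select()
# new Rows object with the first 10 matches; `rows` is unchanged
js = rows.find(lambda row: row.name.startswith('J'), limitby=(0, 10))
# removes matches from `rows` in place and returns them as a new Rows object
removed = rows.exclude(lambda row: row.name.startswith('J'))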
first()

group_by_value(*fields, **args)
Regroups the rows by one of the fields.
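For illustration, a sketch against the person table (the result maps each distinct field value to its rows):

rows = db(db.person).select()
groups = rows.group_by_value(db.person.name)
for name, group in groups.items():
    print(name, len(group))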
insert(position, row)

join(field, name=None, constraint=None, fields=[], orderby=None)
last()

render(i=None, fields=None)
Takes an index and returns a copy of the indexed row with values transformed via the “represent” attributes of the associated fields.
Parameters: - i – index. If not specified, a generator is returned for iteration over all the rows.
- fields – a list of fields to transform (if None, all fields with “represent” attributes will be transformed)
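A sketch, assuming name was defined with a represent attribute (e.g. Field('name', represent=lambda v, row: v.upper())):

rows = db(db.person).select()
first_rendered = rows.render(0)   # copy of row 0 with represented values
for row in rows.render():         # generator over all rows, represented
    print(row.name)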
setvirtualfields(**keyed_virtualfields)
For reference:

db.define_table('x', Field('number', 'integer'))
if db(db.x).isempty():
    [db.x.insert(number=i) for i in range(10)]

from gluon.dal import lazy_virtualfield

class MyVirtualFields(object):
    # normal virtual field (backward compatible, discouraged)
    def normal_shift(self):
        return self.x.number + 1
    # lazy virtual field (enabled by the @lazy_virtualfield decorator)
    @lazy_virtualfield
    def lazy_shift(instance, row, delta=4):
        return row.x.number + delta

db.x.virtualfields.append(MyVirtualFields())
for row in db(db.x).select():
    print(row.number, row.normal_shift, row.lazy_shift(delta=7))
sort(f, reverse=False)
Returns a list of sorted elements (not sorted in place).
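For example (a sketch; sorting returns a new list and leaves the Rows object untouched):

rows = db(db.person).select()
by_name = rows.sort(lambda row: row.name)
longest_first = rows.sort(lambda row: len(row.name), reverse=True)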
class DAL.Table(db, tablename, *fields, **args)
Bases: pydal.helpers.classes.Serializable, pydal.helpers.classes.BasicStorage
Represents a database table
Example
You can create a table as:
db = DAL(...)
db.define_table('users', Field('name'))
And then:
db.users.insert(name='me')  # print db.users._insert(...) to see the SQL
db.users.drop()
as_dict(flat=False, sanitize=True)

bulk_insert(items)
items is a list of dictionaries; one record is inserted per dictionary.
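A sketch (treating the return value as the list of new record ids is an assumption worth verifying on your adapter):

ids = db.person.bulk_insert([{'name': 'Alice'}, {'name': 'Bob'}])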
create_index(name, *fields, **kwargs)
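For example, creating and then dropping an index on person.name (the index name here is illustrative):

db.person.create_index('person_name_idx', db.person.name)
db.person.drop_index('person_name_idx')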
drop(mode='')

drop_index(name)

fields
import_from_csv_file(csvfile, id_map=None, null='<NULL>', unique='uuid', id_offset=None, transform=None, *args, **kwargs)
Imports records from a csv file. Column headers must have the same names as the table fields. The ‘id’ field is ignored. If column names are in ‘table.field’ format, the ‘table.’ prefix is ignored.
- ‘unique’ argument is a field which must be unique (typically a uuid field)
- ‘restore’ argument defaults to False; if set to True it removes old values from the table first
- ‘id_map’ if set to None will not map ids
The import will keep the id numbers in the restored table. This assumes that there is a field of type id that is an integer in incrementing order.
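A sketch of restoring from csv (the file name is illustrative):

with open('person.csv', 'r') as f:
    db.person.import_from_csv_file(f)
db.commit()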
insert(**fields)

on(query)

sqlsafe

sqlsafe_alias

truncate(mode='')

update(*args, **kwargs)

update_or_insert(_key=<function <lambda>>, **values)
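For example (a sketch: _key is a query identifying the record to update; if no record matches, a new one is inserted):

db.person.update_or_insert(db.person.name == 'Alice', name='Alice')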
validate_and_insert(**fields)

validate_and_update(_key=<function <lambda>>, **fields)

validate_and_update_or_insert(_key=<function <lambda>>, **fields)

with_alias(alias)
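A sketch of a self-join using with_alias; the manager reference field is hypothetical:

boss = db.person.with_alias('boss')
rows = db(db.person.manager == boss.id).select(db.person.name, boss.name)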
DAL.check_reserved_keyword(name)[source]
Validates name against SQL keywords, using self.check_reserved, the list of adapters to check against.
DAL.executesql(query, placeholders=None, as_dict=False, fields=None, colnames=None, as_ordered_dict=False)[source]
Executes an arbitrary query
Parameters: - query (str) – the query to submit to the backend
- placeholders – optional; defaults to None. When using raw SQL with placeholders, this may be a sequence of values to be substituted in or, if supported by the DB driver, a dictionary with keys matching named placeholders in your SQL.
- as_dict – defaults to False. When using raw SQL it can be set to True, and the results cursor returned by the DB driver will be converted to a sequence of dictionaries keyed with the db field names. Results returned with as_dict=True are the same as those returned when applying .to_list() to a DAL query. If as_ordered_dict=True the behaviour is the same as when as_dict=True, with the keys (field names) guaranteed to be in the same order as returned by the select executed on the database.
- fields – list of DAL Fields that match the fields returned from the DB. The Field objects should be part of one or more Table objects defined on the DAL object. The “fields” list can include one or more DAL Table objects in addition to or instead of Field objects, or it can be just a single table (not in a list). In that case, the Field objects will be extracted from the table(s).
Note
If either fields or colnames is provided, the results will be converted to a DAL Rows object using the db._adapter.parse() method.
- colnames – list of field names in tablename.fieldname format
Note
It is also possible to specify both “fields” and the associated “colnames”. In that case, “fields” can also include DAL Expression objects in addition to Field objects. For Field objects in “fields”, the associated “colnames” must still be in tablename.fieldname format. For Expression objects in “fields”, the associated “colnames” can be any arbitrary labels.
DAL Table objects referred to by “fields” or “colnames” can be dummy tables and do not have to represent any real tables in the database. Also, note that the “fields” and “colnames” must be in the same order as the fields in the results cursor returned from the DB.
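Two sketches of raw SQL usage (the placeholder style — ‘?’ below — depends on the DB driver):

# parse the results back into a Rows object via a single table
rows = db.executesql('SELECT id, name FROM person;', fields=db.person)
# get a list of dictionaries instead, substituting driver placeholders
people = db.executesql('SELECT * FROM person WHERE name = ?;',
                       placeholders=('Alice',), as_dict=True)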
DAL.execution_handlers = [<class 'pydal.helpers.classes.TimingHandler'>]
static DAL.get_instances()[source]
Returns a dictionary with uri as key, with timings and defined tables:

{'sqlite://storage.sqlite': {
    'dbstats': [('select auth_user.email from auth_user', 0.02009)],
    'dbtables': {
        'defined': ['auth_cas', 'auth_event', 'auth_group',
                    'auth_membership', 'auth_permission', 'auth_user'],
        'lazy': '[]'
    }
}}
DAL.import_from_csv_file(ifile, id_map=None, null='<NULL>', unique='uuid', map_tablenames=None, ignore_missing_tables=False, *args, **kwargs)[source]

DAL.logger = <logging.Logger object>

DAL.record_operators = {'update_record': <class 'pydal.helpers.classes.RecordUpdater'>, 'delete_record': <class 'pydal.helpers.classes.RecordDeleter'>}

DAL.representers = {}

DAL.serializers = None

DAL.tables

DAL.uuid(x)

DAL.validators = None

DAL.validators_method = None
pydal.connection module
class pydal.connection.ConnectionPool[source]
Bases: object

POOLS = {}

check_active_connection = True

static close_all_instances(action)[source]
Closes all database instances cleanly in a multithreaded environment.

connection

cursor

cursors
pydal.objects module
class pydal.objects.BasicRows[source]
Bases: object
Abstract class for Rows and IterRows
as_csv()
Serializes the table into a csv file.
as_dict(key='id', compact=True, storage_to_dict=True, datetime_to_str=False, custom_types=None)[source]
Returns the data as a dictionary of dictionaries (storage_to_dict=True) or records (False)
Parameters: - key – the name of the field to be used as dict key, normally the id
- compact – if True (default), uses the compact row representation, without the tablename level of nesting
- storage_to_dict – when True returns a dict, otherwise a list (default True)
- datetime_to_str – convert datetime fields to strings (default False)
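For instance, indexing a result set by id for direct lookup (a sketch against the person table):

rows = db(db.person).select()
by_id = rows.as_dict(key='id')
print(by_id[1]['name'])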
as_json(mode='object', default=None)[source]
Serializes the rows to a JSON list of objects. mode='object' is not implemented (it should return a nested object structure).
as_list(compact=True, storage_to_dict=True, datetime_to_str=False, custom_types=None)[source]
Returns the data as a list or dictionary.
Parameters: - storage_to_dict – when True returns a dict, otherwise a list
- datetime_to_str – convert datetime fields to strings
as_trees(parent_name='parent_id', children_name='children', render=False)[source]
Returns the data as a list of trees.
Parameters: - parent_name – the name of the field holding the reference to the parent (default parent_id)
- children_name – the name under which the children of each row will be stored as a list (default children)
- render – whether to render the fields using their represent attribute (default False); can be a list of fields to render, or True to render all
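A sketch with a self-referencing table (table and field names are illustrative):

db.define_table('category',
                Field('name'),
                Field('parent_id', 'reference category'))
rows = db(db.category).select()
trees = rows.as_trees()  # root rows, each with a 'children' list of sub-rows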
export_to_csv_file(ofile, null='<NULL>', *args, **kwargs)[source]
Exports data to csv; the first line contains the column names.
Parameters: - ofile – where the csv must be exported to
- null – how null values must be represented (default ‘<NULL>’)
- delimiter – delimiter to separate values (default ‘,’)
- quotechar – character to use to quote string values (default ‘”’)
- quoting – quote system, use csv.QUOTE_*** (default csv.QUOTE_MINIMAL)
- represent – use the fields’ .represent value (default False)
- colnames – list of column names to use (default self.colnames)
This only works when exporting Rows objects; do not confuse it with the DAL-level export_to_csv_file(), which exports the whole database.
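For example (a sketch; represent=True applies each field's represent attribute on export):

rows = db(db.person).select()
with open('person.csv', 'w') as f:
    rows.export_to_csv_file(f, represent=True)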
json(mode='object', default=None)
Serializes the rows to a JSON list of objects. mode='object' is not implemented (it should return a nested object structure).
class pydal.objects.Expression(db, op, first=None, second=None, type=None, **optional_args)[source]
Bases: object
belongs(*value, **kwattr)[source]
Accepts the following inputs:
field.belongs(1, 2)
field.belongs((1, 2))
field.belongs(query)
Does NOT accept:
field.belongs(1)
If the set you want back includes None values, you can do:
field.belongs((1, None), null=True)
class pydal.objects.Field(fieldname, type='string', length=None, default=<function <lambda>>, required=False, requires=<function <lambda>>, ondelete='CASCADE', notnull=False, unique=False, uploadfield=True, widget=None, label=None, comment=None, writable=True, readable=True, update=None, authorize=None, autodelete=False, represent=None, uploadfolder=None, uploadseparate=False, uploadfs=None, compute=None, custom_store=None, custom_retrieve=None, custom_retrieve_file_properties=None, custom_delete=None, filter_in=None, filter_out=None, custom_qualifier=None, map_none=None, rname=None)[source]
Bases: pydal.objects.Expression, pydal.helpers.classes.Serializable
Represents a database field
Example
Usage:

a = Field(name, 'string', length=32, default=None, required=False,
          requires=IS_NOT_EMPTY(), ondelete='CASCADE',
          notnull=False, unique=False,
          uploadfield=True,     # True means store on disk;
                                # 'a_field_name' means store in this field in db;
                                # False means file content will be discarded
          widget=None, label=None, comment=None,
          writable=True, readable=True, update=None, authorize=None,
          autodelete=False, represent=None, uploadfolder=None,
          uploadseparate=False, # upload to separate directories by uuid_keys
                                # first 2 characters and tablename.fieldname:
                                # False - old behavior;
                                # True - put uploaded file in
                                # <uploaddir>/<tablename>.<fieldname>/uuid_key[:2] directory
          uploadfs=None)        # a pyfilesystem where to store the upload

to be used as argument of DAL.define_table

Lazy
alias of FieldMethod
Method
alias of FieldMethod

Virtual
alias of FieldVirtual
retrieve(name, path=None, nameonly=False)[source]
If nameonly==True return (filename, fullfilename) instead of (filename, stream)

sqlsafe

sqlsafe_name
class pydal.objects.FieldVirtual(name, f=None, ftype='string', label=None, table_name=None)[source]
Bases: object
class pydal.objects.IterRows(db, sql, fields, colnames, blob_decode, cacheable)[source]
Bases: pydal.objects.BasicRows

next()
class pydal.objects.Query(db, op, first=None, second=None, ignore_common_filters=False, **optional_args)[source]
Bases: pydal.helpers.classes.Serializable
Necessary to define a set. It can be stored or can be passed to DAL.__call__() to obtain a Set
Example
Use as:
query = db.users.name == 'Max'
myset = db(query)
records = myset.select()
as_dict(flat=False, sanitize=True)[source]
Experimental. Returns a plain dictionary with the basic query representation. Can be used with json/xml services for client-side db I/O.
Example
Usage:
q = db.auth_user.id != 0
q.as_dict(flat=True)
{
  "op": "NE",
  "first": {"tablename": "auth_user", "fieldname": "id"},
  "second": 0
}
class pydal.objects.Row(*args, **kwargs)[source]
Bases: pydal.helpers.classes.BasicStorage
A dictionary that lets you do d['a'] as well as d.a. This is only used to store a Row.
as_json(mode='object', default=None, colnames=None, serialize=True, **kwargs)[source]
Serializes the row to a JSON object. kwargs are passed to the .as_dict method; only “object” mode is supported.
serialize=False is used by Rows.as_json.
TODO: return array mode with query column order; mode and colnames are not implemented.
class pydal.objects.Rows(db=None, records=[], colnames=[], compact=True, rawrows=None)[source]
Bases: pydal.objects.BasicRows
A wrapper for the return value of a select. It basically represents a table. It has an iterator and each row is represented as a Row dictionary.
exclude(f)[source]
Removes elements from the calling Rows object, filtered by the function f, and returns a new Rows object containing the removed elements.

find(f, limitby=None)[source]
Returns a new Rows object, a subset of the original object, filtered by the function f.
render(i=None, fields=None)[source]
Takes an index and returns a copy of the indexed row with values transformed via the “represent” attributes of the associated fields.
Parameters: - i – index. If not specified, a generator is returned for iteration over all the rows.
- fields – a list of fields to transform (if None, all fields with “represent” attributes will be transformed)
setvirtualfields(**keyed_virtualfields)[source]
For reference:

db.define_table('x', Field('number', 'integer'))
if db(db.x).isempty():
    [db.x.insert(number=i) for i in range(10)]

from gluon.dal import lazy_virtualfield

class MyVirtualFields(object):
    # normal virtual field (backward compatible, discouraged)
    def normal_shift(self):
        return self.x.number + 1
    # lazy virtual field (enabled by the @lazy_virtualfield decorator)
    @lazy_virtualfield
    def lazy_shift(instance, row, delta=4):
        return row.x.number + delta

db.x.virtualfields.append(MyVirtualFields())
for row in db(db.x).select():
    print(row.number, row.normal_shift, row.lazy_shift(delta=7))
class pydal.objects.Set(db, query, ignore_common_filters=None)[source]
Bases: pydal.helpers.classes.Serializable
Represents a set of records in the database. Records are identified by the query=Query(...) object. Normally the Set is generated by DAL.__call__(Query(...)).
Given a set, for example:
myset = db(db.users.name == 'Max')
you can:
myset.update(name='Massimo')
myset.delete()  # all elements in the set
myset.select(orderby=db.users.id, groupby=db.users.name, limitby=(0, 10))
and take subsets:
subset = myset(db.users.id < 5)
class pydal.objects.Table(db, tablename, *fields, **args)[source]
Bases: pydal.helpers.classes.Serializable, pydal.helpers.classes.BasicStorage
Represents a database table
Example
You can create a table as:
db = DAL(...)
db.define_table('users', Field('name'))
And then:
db.users.insert(name='me')  # print db.users._insert(...) to see the SQL
db.users.drop()
fields

import_from_csv_file(csvfile, id_map=None, null='<NULL>', unique='uuid', id_offset=None, transform=None, *args, **kwargs)[source]
Imports records from a csv file. Column headers must have the same names as the table fields. The ‘id’ field is ignored. If column names are in ‘table.field’ format, the ‘table.’ prefix is ignored.
- ‘unique’ argument is a field which must be unique (typically a uuid field)
- ‘restore’ argument defaults to False; if set to True it removes old values from the table first
- ‘id_map’ if set to None will not map ids
The import will keep the id numbers in the restored table. This assumes that there is a field of type id that is an integer in incrementing order.
sqlsafe

sqlsafe_alias