Document all the new datafy/nav stuff

Sean Corfield 2020-05-31 21:32:03 -07:00
parent d24dd892dd
commit cc3988e773
3 changed files with 28 additions and 8 deletions


@@ -3,7 +3,7 @@
 Only accretive/fixative changes will be made from now on.
 Changes made since release 1.0.445:
-* Addition of `next.jdbc.datafy` to provide more `datafy`/`nav` introspection (work in progress; documentation pending).
+* Addition of `next.jdbc.datafy` to provide more `datafy`/`nav` introspection (see the additional section in **datafy, nav, and :schema** for details).
 * Addition of `next.jdbc.result-set/metadata` to provide (datafied) result set metadata within `plan`.
 ## Stable Builds


@@ -4,9 +4,11 @@ Clojure 1.10 introduced a new namespace, [`clojure.datafy`](http://clojure.githu
 Shortly after REBL's release, I added experimental support to `clojure.java.jdbc` for `datafy` and `nav` that supported lazy navigation through result sets into foreign key relationships and connected rows and tables. `next.jdbc` bakes that support into result sets produced by `execute!` and `execute-one!`.
-## The `datafy`/`nav` Lifecycle
-Here's how the process works:
+In addition to the `datafy` and `nav` support in result sets, as of version 1.0.next, there is a `next.jdbc.datafy` namespace that can be required to extend these protocols to a number of JDBC object types. See **JDBC Datafication** near the end of this page for more details.
+## The `datafy`/`nav` Lifecycle on Result Sets
+Here's how the process works, for result sets produced by `next.jdbc`:
 * `execute!` and `execute-one!` produce result sets containing rows that are `Datafiable`,
 * Tools like REBL can call `datafy` on result sets to render them as "pure data" (which they already are, but this makes them also `Navigable`),
@@ -16,7 +18,7 @@ Here's how the process works:
 In addition to `execute!` and `execute-one!`, you can call `next.jdbc.result-set/datafiable-result-set` on any `ResultSet` object to produce a result set whose rows are `Datafiable`. Inside a reduction over the result of `plan`, you can call `next.jdbc.result-set/datafiable-row` on a row to produce a `Datafiable` row. That will realize the entire row, including generating column names using the row builder specified (or `as-maps` by default).
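As a minimal sketch of that second case -- assuming only a datasource `ds` and a `contact` table, neither of which is defined on this page -- each row produced by `plan` can be realized as a datafiable hash map like this:

```clojure
(require '[next.jdbc :as jdbc]
         '[next.jdbc.result-set :as rs])

;; inside a reduction over `plan`, realize each row as a datafiable hash map:
(into []
      (map #(rs/datafiable-row % ds {}))
      (jdbc/plan ds ["SELECT * FROM contact"]))
```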
-## Identifying Foreign Keys
+### Identifying Foreign Keys
 By default, `next.jdbc` assumes that a column named `<something>id` or `<something>_id` is a foreign key into a table called `<something>` with a primary key called `id`. As an example, if you have a table `address` which has columns `id` (the primary key), `name`, `email`, etc., and a table `contact` which has various columns including `addressid`, then if you retrieve a result set based on `contact`, call `datafy` on it and then "drill down" into the columns, when `(nav row :contact/addressid v)` is called (where `v` is the value of that column in that row), `next.jdbc`'s implementation of `nav` will fetch a single row from the `address` table, identified by `id` matching `v`.
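Put together, the default convention looks like this in code. This is an illustrative sketch only: it assumes a datasource `ds` and the `contact` / `address` tables described above, and it performs by hand the `datafy`-then-`nav` steps that a tool like REBL performs when you drill down:

```clojure
(require '[clojure.datafy :as d]
         '[next.jdbc :as jdbc])

(let [row  (jdbc/execute-one! ds ["SELECT * FROM contact WHERE id = ?" 1])
      row' (d/datafy row)]
  ;; follows the assumed foreign key: fetches the single `address` row
  ;; whose `id` matches the value of the `addressid` column in this row
  (d/nav row' :contact/addressid (:contact/addressid row')))
```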
@@ -45,7 +47,7 @@ When you indicate a `*-to-many` relationship, by wrapping the foreign table/key
 If you use foreign key constraints in your database, you could probably generate this `:schema` data structure automatically from the metadata in your database. Similarly, if you use a library that depends on an entity relationship map (such as [seql](https://exoscale.github.io/seql/) or [walkable](https://walkable.gitlab.io/)), then you could probably generate this `:schema` data structure from that entity map.
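For the `contact` / `address` example above, such a generated (or hand-written) `:schema` might look like the following sketch. The exact shape shown here is an assumption based on the description on this page -- qualified column keywords, with the foreign table/key wrapped in a vector for a `*-to-many` relationship -- and `ds` is again an assumed datasource:

```clojure
(require '[next.jdbc :as jdbc])

(jdbc/execute! ds ["SELECT * FROM address"]
               {:schema {:contact/addressid :address/id             ; *-to-one
                         ;; wrapping the foreign table/key in a vector
                         ;; indicates a *-to-many relationship:
                         :address/id        [:contact/addressid]}})
```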
-## Behind The Scenes
+### Behind The Scenes
 Making rows datafiable is implemented by adding metadata to each row with a key of `clojure.core.protocols/datafy` and a function as the value. That function closes over the connectable and options passed in to the `execute!` or `execute-one!` call that produced the result set containing those rows.
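You can see that metadata directly at the REPL. A small sketch, again assuming a datasource `ds` and a `contact` table:

```clojure
(require '[next.jdbc :as jdbc])

(def row (jdbc/execute-one! ds ["SELECT * FROM contact WHERE id = ?" 1]))

;; the row carries a metadata-based implementation of `datafy`,
;; keyed by the fully-qualified symbol of the protocol function:
(contains? (meta row) 'clojure.core.protocols/datafy)
;; => true
```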
@@ -57,4 +59,23 @@ The protocol `next.jdbc.result-set/DatafiableRow` has a default implementation o
 In addition, you can call `next.jdbc.result-set/datafiable-result-set` on any `ResultSet` object and get a fully realized, datafiable result set created using any of the result set builders.
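For example, here is a sketch that runs a query at the raw JDBC level and then datafies the `ResultSet` with a specific builder. It assumes only a datasource `ds` and a `contact` table:

```clojure
(require '[next.jdbc :as jdbc]
         '[next.jdbc.result-set :as rs])

(with-open [con  (jdbc/get-connection ds)
            stmt (.prepareStatement con "SELECT * FROM contact")
            rset (.executeQuery stmt)]
  ;; a fully-realized, datafiable result set, built with the builder of your choice:
  (rs/datafiable-result-set rset con {:builder-fn rs/as-unqualified-lower-maps}))
```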
## JDBC Datafication
If you require `next.jdbc.datafy`, the `Datafiable` protocol is extended to several JDBC object types, so that calling `datafy` will turn them into hash maps according to Java Bean introspection, similar to `clojure.core/bean`, although `next.jdbc` uses `clojure.java.data/from-java-shallow` (from [`org.clojure/java.data`](https://github.com/clojure/java.data)), with some additions as described below.
* `java.sql.Connection` -- datafies as a bean; the `:metaData` property is a `java.sql.DatabaseMetaData`, which is also datafiable.
* `DatabaseMetaData` -- datafies as a bean, with an additional `:all-tables` property (that is a dummy object); six properties are navigable to produce fully-realized datafiable result sets:
* `all-tables` -- produced from `(.getTables this nil nil nil nil)`, this is all the tables and views available from the connection that produced the database metadata,
* `catalogs` -- produced from `(.getCatalogs this)`,
* `clientInfoProperties` -- all the client properties that the database driver supports,
* `schemas` -- produced from `(.getSchemas this)`,
* `tableTypes` -- produced from `(.getTableTypes this)`,
* `typeInfo` -- produced from `(.getTypeInfo this)`.
* `ParameterMetaData` -- datafies as a vector of parameter descriptions; each parameter hash map has: `:class` (the name of the parameter class -- JVM), `:mode` (one of `:in`, `:in-out`, or `:out`), `:nullability` (one of: `:null`, `:not-null`, or `:unknown`), `:precision`, `:scale`, `:type` (the name of the parameter type -- SQL), and `:signed` (Boolean).
* `ResultSet` -- datafies as a bean; if the `ResultSet` has an associated `Statement` and that in turn has an associated `Connection` then an additional key of `:rows` is provided which is a datafied result set, from `next.jdbc.result-set/datafiable-result-set` with default options. This is provided as a convenience, purely for datafication of other JDBC data types -- in normal `next.jdbc` usage, result sets are datafied under full user control.
* `ResultSetMetaData` -- datafies as a vector of column descriptions; each column hash map has: `:auto-increment`, `:case-sensitive`, `:catalog`, `:class` (the name of the column class -- JVM), `:currency` (Boolean), `:definitely-writable`, `:display-size`, `:label`, `:name`, `:nullability`, `:precision`, `:read-only`, `:searchable`, `:signed`, `:scale`, `:schema`, `:table`, `:type`, and `:writable`.
* `Statement` -- datafies as a bean.
See the Java documentation for these JDBC types for further details on what all the properties from each of these classes mean and which are `int`, `String`, or some other JDBC object type.
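As a sketch of how this fits together (assuming only a datasource `ds`; everything else below follows from the descriptions above), requiring `next.jdbc.datafy` lets you datafy a live connection and walk into its metadata:

```clojure
(require '[clojure.datafy :as d]
         '[next.jdbc :as jdbc]
         '[next.jdbc.datafy]) ; extends Datafiable to the JDBC types listed above

(with-open [con (jdbc/get-connection ds)]
  (let [conn-data (d/datafy con)          ; Connection as a bean-like hash map
        db-meta   (:metaData conn-data)]  ; a java.sql.DatabaseMetaData object
    ;; DatabaseMetaData datafies as a bean with the extra :all-tables property;
    ;; properties such as :catalogs, :schemas, :tableTypes, :typeInfo, and
    ;; :clientInfoProperties are navigable to datafied result sets
    (keys (d/datafy db-meta))))
```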
[<: All The Options](/doc/all-the-options.md) | [Migration from `clojure.java.jdbc` :>](/doc/migration-from-clojure-java-jdbc.md)


@@ -4,9 +4,8 @@
  "This namespace provides datafication of several JDBC object types,
   all within the `java.sql` package:
-  * `Connection` -- datafies as a bean; `:metaData` is navigable
-    and produces `java.sql.DatabaseMetaData`.
-  * `DatabaseMetaData` -- datafies as a bean; five properties
+  * `Connection` -- datafies as a bean.
+  * `DatabaseMetaData` -- datafies as a bean; six properties
     are navigable to produce fully-realized datafiable result sets.
   * `ParameterMetaData` -- datafies as a vector of parameter descriptions.
   * `ResultSet` -- datafies as a bean; if the `ResultSet` has an associated